WorldWideScience

Sample records for test process efficiency

  1. Efficient separations and processing crosscutting program: Develop and test sorbents

    International Nuclear Information System (INIS)

    Bray, L.A.

    1995-09-01

    This report summarizes work performed during FY 1995 under the task "Develop and Test Sorbents," the purpose of which is to develop high-capacity, selective solid extractants to recover cesium, strontium, and technetium from nuclear wastes. This work is being done for the Efficient Separations and Processing Crosscutting Program (ESP), operated by the U.S. Department of Energy's Office of Environmental Management's Office of Technology Development. The task is under the direction of staff at Pacific Northwest Laboratory (PNL) with key participation from industrial and university staff at 3M, St. Paul, Minnesota; IBC Advanced Technologies, Inc., American Fork, Utah; AlliedSignal, Inc., Des Plaines, Illinois; and Texas A&M University, College Station, Texas. 3M and IBC are responsible for ligand and membrane technology development; AlliedSignal and Texas A&M are developing sodium titanate powders; and PNL is testing the materials developed by the industry/university team members. Major accomplishments for FY 1995 are summarized in this report.

  2. Hard rock tunnel boring machine penetration test as an indicator of chipping process efficiency

    Directory of Open Access Journals (Sweden)

    M.C. Villeneuve

    2017-08-01

    The transition from grinding to chipping can be observed in tunnel boring machine (TBM) penetration test data by plotting the penetration rate (distance/revolution) against the net cutter thrust (force per cutter) over the full range of penetration rates in the test. Correlating penetration test data to the geological and geomechanical characteristics of the rock masses through which a penetration test is conducted makes it possible to reveal the efficiency of the chipping process in response to changing geological conditions. Penetration test data can also be used to identify stress-induced tunnel face instability. This research shows that the strength of the rock is an important parameter controlling how much net cutter thrust is required to transition from grinding to chipping. It also shows that the geological characteristics of a rock determine how efficiently chipping occurs once it has begun. In particular, geological characteristics that lead to efficient fracture propagation, such as fabric and mica content, lead to efficient chipping. These findings will enable a better correlation between TBM performance and geological conditions for use in TBM design, as a basis for contractual payments where the penetration rate dominates the excavation cycle, and in further academic investigations into the TBM excavation process.

  3. A test of processing efficiency theory in a team sport context.

    Science.gov (United States)

    Smith, N C; Bellamy, M; Collins, D J; Newell, D

    2001-05-01

    In this study, we tested some key postulates of Eysenck and Calvo's processing efficiency theory in a team sport. The participants were 12 elite male volleyball players who were followed throughout the course of a competitive season. Self-report measures of pre-match and in-game cognitive anxiety and mental effort were collected in groups of players high and low in dispositional anxiety. Player performance was determined from the statistical analysis of match-play. Sets were classified according to the point spread separating the two teams into one of three levels of criticality. Game momentum was also analysed to determine its influence on in-game state anxiety. Significant differences in in-game cognitive anxiety were apparent between high and low trait anxiety groups. An interaction between anxiety grouping and momentum condition was also evident in cognitive anxiety. Differences in set criticality were reflected in significant elevations in mental effort, an effect more pronounced in dispositionally high anxious performers. Consistent with the predictions of processing efficiency theory, mental effort ratings were higher in high trait-anxious players in settings where their performance was equivalent to that of low trait-anxious performers. The usefulness of processing efficiency theory as an explanatory framework in sport anxiety research is discussed in the light of these findings.

  4. Efficient tests for equivalence of hidden Markov processes and quantum random walks

    NARCIS (Netherlands)

    U. Faigle; A. Schönhuth (Alexander)

    2011-01-01

    While two hidden Markov process (HMP) and quantum random walk (QRW) parametrizations can differ from one another, the stochastic processes arising from them can be equivalent. Here a polynomial-time algorithm is presented which can determine the equivalence of two HMP parametrizations.
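
    The abstract does not reproduce the polynomial-time algorithm, but the underlying notion of equivalence is easy to illustrate: two HMP parametrizations are equivalent precisely when they assign the same probability to every finite output word. A brute-force sketch (exponential in word length, unlike the paper's method; the matrices below are hypothetical):

```python
from itertools import product

def word_prob(pi, A, B, word):
    """P(word) under an HMP parametrization: pi = initial state distribution,
    A[i][j] = state transition probability, B[i][y] = emission probability."""
    n = len(pi)
    alpha = [pi[i] * B[i][word[0]] for i in range(n)]
    for y in word[1:]:
        alpha = [sum(alpha[i] * A[i][j] for i in range(n)) * B[j][y]
                 for j in range(n)]
    return sum(alpha)

def equivalent_up_to(hmp1, hmp2, n_symbols, max_len, tol=1e-12):
    """Brute-force equivalence check: compare probabilities of all words
    up to length max_len (exponential cost -- for illustration only)."""
    return all(abs(word_prob(*hmp1, w) - word_prob(*hmp2, w)) < tol
               for length in range(1, max_len + 1)
               for w in product(range(n_symbols), repeat=length))

# Two hypothetical 2-state parametrizations of the same fair coin-flip process:
hmp1 = ([1.0, 0.0], [[0.5, 0.5], [0.5, 0.5]], [[0.5, 0.5], [0.5, 0.5]])
hmp2 = ([0.3, 0.7], [[0.1, 0.9], [0.8, 0.2]], [[0.5, 0.5], [0.5, 0.5]])
print(equivalent_up_to(hmp1, hmp2, 2, 5))  # True: different matrices, same process
```

    Since both parametrizations emit each symbol with probability 0.5 regardless of state, every length-L word has probability 0.5**L under both, so the processes coincide even though the matrices differ.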

  5. Technology’s present situation and the development prospects of energy efficiency monitoring as well as performance testing & analysis for process flow compressors

    Science.gov (United States)

    Li, L.; Zhao, Y.; Wang, L.; Yang, Q.; Liu, G.; Tang, B.; Xiao, J.

    2017-08-01

    In this paper, the background of field performance testing of in-service process flow compressors is introduced, the main technical barriers faced in field testing are summarized, and the factors that cause the real efficiencies of most process flow compressors to fall below the values guaranteed by the manufacturer are analysed. The authors investigated the present operational situation of process flow compressors in China and found that low-efficiency operation arises because compressed gas is generally forced to flow back into the inlet pipe to accommodate variations in the process parameters; for example, the anti-surge valve of a centrifugal compressor is often kept open. To improve the operating efficiency of process compressors, energy efficiency monitoring technology is reviewed and some suggestions are proposed, which form the basis of research on the energy efficiency evaluation and/or labelling of process compressors.

  6. Testing for Stochastic Dominance Efficiency

    NARCIS (Netherlands)

    G.T. Post (Thierry); O. Linton; Y-J. Whang

    2005-01-01

    We propose a new test of the stochastic dominance efficiency of a given portfolio over a class of portfolios. We establish its null and alternative asymptotic properties, and define a method for consistently estimating critical values. We present some numerical evidence that our …

  7. Idaho Chemical Processing Plant Process Efficiency improvements

    International Nuclear Information System (INIS)

    Griebenow, B.

    1996-03-01

    In response to decreasing funding levels available to support activities at the Idaho Chemical Processing Plant (ICPP) and a desire to be cost competitive, the Department of Energy Idaho Operations Office (DOE-ID) and Lockheed Idaho Technologies Company have increased their emphasis on cost-saving measures. The ICPP Effectiveness Improvement Initiative involves many activities to improve cost effectiveness and competitiveness. This report documents the methodology and results of one of those cost-cutting measures, the Process Efficiency Improvement Activity. The Process Efficiency Improvement Activity performed a systematic review of major work processes at the ICPP to increase productivity and to identify nonvalue-added requirements. A two-phase approach was selected for the activity to allow for near-term implementation of relatively easy process modifications in the first phase while obtaining long-term continuous improvement in the second phase and beyond. Phase I of the initiative included a concentrated review of processes that had a high potential for cost savings with the intent of realizing savings in Fiscal Year 1996 (FY-96). Phase II consists of implementing long-term strategies too complex for Phase I implementation and evaluation of processes not targeted for Phase I review. The Phase II effort is targeted at realizing cost savings in FY-97 and beyond.

  8. Efficient separations & processing crosscutting program

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-08-01

    The Efficient Separations and Processing Crosscutting Program (ESP) was created in 1991 to identify, develop, and perfect chemical and physical separations technologies and chemical processes which treat wastes and address environmental problems throughout the DOE complex. The ESP funds several multiyear tasks that address high-priority waste remediation problems involving high-level, low-level, transuranic, hazardous, and mixed (radioactive and hazardous) wastes. The ESP supports applied research and development (R & D) leading to the demonstration or use of these separations technologies by other organizations within the Department of Energy (DOE), Office of Environmental Management.

  9. Efficiency of cellular information processing

    International Nuclear Information System (INIS)

    Barato, Andre C; Hartich, David; Seifert, Udo

    2014-01-01

    We show that a rate of conditional Shannon entropy reduction, characterizing the learning of an internal process about an external process, is bounded by the thermodynamic entropy production. This approach allows for the definition of an informational efficiency that can be used to study cellular information processing. We analyze three models of increasing complexity inspired by the Escherichia coli sensory network, where the external process is an external ligand concentration jumping between two values. We start with a simple model for which ATP must be consumed so that a protein inside the cell can learn about the external concentration. With a second model for a single receptor we show that the rate at which the receptor learns about the external environment can be nonzero even without any dissipation inside the cell since chemical work done by the external process compensates for this learning rate. The third model is more complete, also containing adaptation. For this model we show inter alia that a bacterium in an environment that changes at a very slow time-scale is quite inefficient, dissipating much more than it learns. Using the concept of a coarse-grained learning rate, we show for the model with adaptation that while the activity learns about the external signal the option of changing the methylation level increases the concentration range for which the learning rate is substantial. (paper)

  10. Use of automated test equipment and "paperless" process control to implement efficient production of SSC dipole magnets

    International Nuclear Information System (INIS)

    Tobin, T.; Fagan, R.; Mitchell, D.

    1994-01-01

    In an effort to minimize human error and maximize process control and test capabilities during Collider Dipole Magnet (CDM) production, General Dynamics is developing automated test and process control equipment, known as Test & Process Control Modules (TPCMs). When used along with software designed to create "paperless" process control documentation, the system becomes the Test & Process Control System (TPCS). This system simplifies business decisions and eliminates some problems normally associated with process control documentation, while reducing human errors during CDM production. It is also designed to reduce test operator errors normally incurred during test setup and data analysis. The authors present an overview of the TPCS hardware and software being developed at General Dynamics, along with the process control techniques included in TPCS.

  11. Business process model repositories : efficient process retrieval

    NARCIS (Netherlands)

    Yan, Z.

    2012-01-01

    As organizations increasingly work in a process-oriented manner, the number of business process models that they develop and have to maintain increases. As a consequence, it has become common for organizations to have collections of hundreds or even thousands of business process models. When a …

  12. Terrestrial photovoltaic cell process testing

    Science.gov (United States)

    Burger, D. R.

    1985-01-01

    The paper examines critical test parameters, criteria for selecting appropriate tests, and the use of statistical controls and test patterns to enhance PV-cell process test results. The coverage of critical test parameters is evaluated by examining available test methods and then screening these methods by considering the ability to measure those critical parameters which are most affected by the generic process, the cost of the test equipment and test performance, and the feasibility for process testing.

  13. Efficiency of composite tests in gastrointestinal cancer

    International Nuclear Information System (INIS)

    Christensen, M.; Jacobsen, P.M.

    1987-01-01

    The efficiency of composite tests (liver scintigraphy, serum alkaline phosphatase, and serum carcinoembryonic antigen) in finding or excluding liver metastases preoperatively was evaluated in 185 surgical patients with a high probability of gastrointestinal cancer, 142 with colorectal and 43 with gastric disorders. A pathoanatomic verification procedure showed liver metastases in 21 and 7 patients, respectively. For each test, two cut-off levels were defined in accordance with the operational purpose of the test: either to diagnose metastases (no false-positive test results) or to exclude metastases (no false-negative test results). Generally, composite tests increased overall efficiency. In the colorectal group, 39% of the patients were correctly classified by the combined triple test; in the gastric group, 94% were correctly classified. It is concluded that composite tests seem useful, and the operational approach described may be helpful in decision-making and test evaluation.
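
    The two operational cut-off levels described above can be sketched directly: a "diagnose" level set above the highest score observed in metastasis-free patients (no false positives) and an "exclude" level set below the lowest score in patients with metastases (no false negatives). A minimal illustration with hypothetical scores, not the study's data:

```python
def operational_cutoffs(healthy, diseased):
    """Two cut-offs per test (higher scores indicate metastases):
    'diagnose' level = above the highest healthy value -> no false positives;
    'exclude' level = below the lowest diseased value -> no false negatives."""
    return max(healthy), min(diseased)

def efficiency(healthy, diseased):
    """Fraction of patients correctly classified outside the indeterminate band."""
    diag, excl = operational_cutoffs(healthy, diseased)
    correct = (sum(1 for x in diseased if x > diag)
               + sum(1 for x in healthy if x < excl))
    return correct / (len(healthy) + len(diseased))

# Hypothetical marker scores (arbitrary units) for illustration only
healthy  = [80, 95, 110, 120, 130]
diseased = [105, 140, 180, 220]
diag, excl = operational_cutoffs(healthy, diseased)
print(efficiency(healthy, diseased))  # 5 of 9 patients fall outside the band
```

    Patients whose scores fall between the two cut-offs remain unclassified; combining several such tests, as in the abstract, shrinks that indeterminate band and raises overall efficiency.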

  14. Testing the Weak Form Efficiency of Karachi Stock Exchange

    Directory of Open Access Journals (Sweden)

    Muhammad Arshad Haroon

    2012-12-01

    In an efficient market, share prices reflect all available information. The study of the efficient market hypothesis helps in making sound investment decisions. In this research, the weak form efficiency of the Karachi Stock Exchange (KSE) was tested over the period from 2nd November 1991 to 2nd November 2011. Descriptive statistics indicated the absence of weak form efficiency, and the results of the non-parametric tests were consistent with this. The non-parametric tests employed were the KS goodness-of-fit test, the run test and the autocorrelation test, used to examine the serial independence of the data. The results show that the KSE is not weak-form efficient. This is because the KSE is an emerging market, where information has been observed to take time to be processed. Thus, technical analysis may be applied to gain abnormal returns.

  15. Nonparametric Efficiency Testing of Asian Stock Markets Using Weekly Data

    OpenAIRE

    CORNELIS A. LOS

    2004-01-01

    The efficiency of speculative markets, as represented by Fama's 1970 fair game model, is tested on weekly price index data of six Asian stock markets - Hong Kong, Indonesia, Malaysia, Singapore, Taiwan and Thailand - using Sherry's (1992) non-parametric methods. These scientific testing methods were originally developed to analyze the information processing efficiency of nervous systems. In particular, the stationarity and independence of the price innovations are tested over ten years, from ...

  16. Efficiency Tests in Foreign Exchange Market

    Directory of Open Access Journals (Sweden)

    Yi Hsien Lee

    2012-01-01

    The main purpose of this paper is to apply filter rules to examine the efficiency of the foreign exchange market. Three filter-rule strategies (buy long; sell short; buy long and sell short) are used to test trading performance for the EUR, JPY and GBP. The findings show that traders can obtain higher returns by taking the buy long/sell short filter-rule strategies when transaction costs are not considered. However, once transaction costs are taken into account, trading in these three exchange rates (EUR, JPY, GBP) is no longer profitable. The results imply that the foreign exchange market is efficient for the EUR, JPY and GBP.
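
    A basic x% filter rule of the kind applied in the paper can be sketched as follows (an illustrative implementation on a hypothetical price path; the paper's exact filter sizes and cost adjustments are not given in the abstract):

```python
def filter_rule_return(prices, x=0.05):
    """Alexander-style x% filter rule: go long after a rise of x from the running
    trough, go short after a fall of x from the running peak. Returns the gross
    return of the long/short strategy, ignoring transaction costs."""
    pos = 0                                    # +1 long, -1 short, 0 flat
    peak = trough = prices[0]
    wealth = 1.0
    for prev, p in zip(prices, prices[1:]):
        wealth *= (p / prev) ** pos            # return earned with current position
        peak, trough = max(peak, p), min(trough, p)
        if p >= trough * (1 + x) and pos != 1:
            pos, peak = 1, p                   # buy signal: measure next fall from here
        elif p <= peak * (1 - x) and pos != -1:
            pos, trough = -1, p                # sell-short signal
    return wealth - 1.0

# A hypothetical swinging price path: the filter captures part of each swing
print(filter_rule_return([100, 96, 103, 110, 104, 99]))
```

    Subtracting a per-trade cost from each position switch is what turns the apparent profits into losses, which is the sense in which the abstract concludes the market is efficient.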

  17. Efficiency Test Method for Electric Vehicle Chargers

    DEFF Research Database (Denmark)

    Kieldsen, Andreas; Thingvad, Andreas; Martinenas, Sergejus

    2016-01-01

    This paper investigates different methods for measuring the charger efficiency of mass-produced electric vehicles (EVs), in order to compare the different models. Consumers pay little attention to losses in the charger, even though the impact on the driving cost is high. It is not a high priority … different vehicles. A unified method for testing the efficiency of the charger in EVs, without direct access to the component, is presented. The method is validated through extensive tests of the models Renault Zoe, Nissan LEAF and Peugeot iOn. The results show a loss between 15% and 40%, which is far …

  18. Development and efficiency assessment of process lubrication for hot forging

    Science.gov (United States)

    Kargin, S.; Artyukh, Viktor; Ignatovich, I.; Dikareva, Varvara

    2017-10-01

    The article considers innovative technologies in the testing and production of process lubricants for hot bulk forging. New compositions of eco-friendly water-graphite process lubricants for hot extrusion and forging were developed. New approaches to the efficiency assessment of process lubricants are developed and described in this article. Laboratory and field results are presented.

  19. Flight Test Maneuvers for Efficient Aerodynamic Modeling

    Science.gov (United States)

    Morelli, Eugene A.

    2011-01-01

    Novel flight test maneuvers for efficient aerodynamic modeling were developed and demonstrated in flight. Orthogonal optimized multi-sine inputs were applied to aircraft control surfaces to excite aircraft dynamic response in all six degrees of freedom simultaneously while keeping the aircraft close to chosen reference flight conditions. Each maneuver was designed for a specific modeling task that cannot be adequately or efficiently accomplished using conventional flight test maneuvers. All of the new maneuvers were first described and explained, then demonstrated on a subscale jet transport aircraft in flight. Real-time and post-flight modeling results obtained using equation-error parameter estimation in the frequency domain were used to show the effectiveness and efficiency of the new maneuvers, as well as the quality of the aerodynamic models that can be identified from the resultant flight data.
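
    The key property of orthogonal multi-sine inputs is that signals built from disjoint integer harmonics of a common base period are mutually orthogonal, so several control surfaces can be excited simultaneously and their effects separated. A minimal numerical sketch (zero phases for simplicity; in practice the phases are optimized to reduce peak amplitude, and the surface assignments below are hypothetical):

```python
import math

def multisine(harmonics, period, phases=None, n=1000):
    """Sampled sum of sines at the given integer harmonics of 1/period.
    Phases would normally be optimized for low peak factor; zeros used here."""
    phases = phases or [0.0] * len(harmonics)
    dt = period / n
    return [sum(math.sin(2 * math.pi * k * (i * dt) / period + ph)
                for k, ph in zip(harmonics, phases))
            for i in range(n)]

# Disjoint harmonic sets for two (hypothetical) control surfaces
elevator = multisine([1, 4, 7], period=10.0)
aileron  = multisine([2, 5, 8], period=10.0)

# Inner product over one common period is ~0: the inputs excite
# independent directions, so responses can be attributed per surface
dot = sum(a * b for a, b in zip(elevator, aileron)) / len(elevator)
print(abs(dot) < 1e-9)  # True
```

    Because the harmonics never coincide, the discrete inner product of the two signals over the shared period vanishes up to rounding error, which is what allows all six degrees of freedom to be identified from one maneuver.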

  20. CPAS Preflight Drop Test Analysis Process

    Science.gov (United States)

    Englert, Megan E.; Bledsoe, Kristin J.; Romero, Leah M.

    2015-01-01

    Throughout the Capsule Parachute Assembly System (CPAS) drop test program, the CPAS Analysis Team has developed a simulation and analysis process to support drop test planning and execution. This process includes multiple phases focused on developing test simulations and communicating results to all groups involved in the drop test. CPAS Engineering Development Unit (EDU) series drop test planning begins with the development of a basic operational concept for each test. Trajectory simulation tools include the Flight Analysis and Simulation Tool (FAST) for single bodies, and the Automatic Dynamic Analysis of Mechanical Systems (ADAMS) simulation for the mated vehicle. Results are communicated to the team at the Test Configuration Review (TCR) and Test Readiness Review (TRR), as well as at Analysis Integrated Product Team (IPT) meetings in earlier and intermediate phases of the pre-test planning. The ability to plan and communicate efficiently with rapidly changing objectives and tight schedule constraints is a necessity for safe and successful drop tests.

  1. Improved energy efficiency in the process industries

    Energy Technology Data Exchange (ETDEWEB)

    Pilavachi, P.A. (Commission of the European Communities, Brussels, Belgium)

    1992-12-31

    The European Commission, through the JOULE Programme, is promoting energy efficient technologies in the process industries; the topics of the various R and D activities are: heat exchangers (enhanced evaporation, shell and tube heat exchangers including distribution of fluids, and fouling), low energy separation processes (adsorption, melt-crystallization and supercritical extraction), chemical reactors (methanol synthesis and reactors with integral heat exchangers), other unit operations (evaporators, glass-melting furnaces, cement kilns and baking ovens, dryers and packed columns and replacements for R12 in refrigeration), energy and system process models (batch processes, simulation and control of transients and energy synthesis), development of advanced sensors.

  2. Efficiency of European Dairy Processing Firms

    NARCIS (Netherlands)

    Soboh, R.A.M.E.; Oude Lansink, A.G.J.M.; Dijk, van G.

    2014-01-01

    This paper compares the technical efficiency and production frontiers of dairy processing cooperatives and investor-owned firms in six major dairy-producing European countries. Two parametric production frontiers are estimated, i.e. for cooperatives and investor-owned firms separately, which are …

  3. The Efficient Separations and Processing Integrated Program

    International Nuclear Information System (INIS)

    Kuhn, W.L.; Gephart, J.M.

    1994-08-01

    The Efficient Separations and Processing Integrated Program (ESPIP) was created in 1991 to identify, develop, and perfect separations technologies and processes to treat wastes and address environmental problems throughout the US Department of Energy (DOE) complex. The ESPIP funds several multiyear tasks that address high-priority waste remediation problems involving high-level, low-level, transuranic, hazardous, and mixed (radioactive and hazardous) wastes. The ESPIP supports applied R&D leading to demonstration or use of these separations technologies by other organizations within DOE's Office of Environmental Restoration and Waste Management. Examples of current ESPIP-funded separations technologies are described here.

  4. Software quality testing process analysis

    OpenAIRE

    Mera Paz, Julián

    2016-01-01

    Introduction: This article is the result of the reading, review and analysis of books, magazines and articles well known for their scientific and research quality, which have addressed the software quality testing process. The author, based on his work experience in software development companies, teaching and other areas, has compiled and selected information to argue and substantiate the importance of the software quality testing process. Methodology: The existing literature on the software quality …

  5. Efficient Separations and Processing Crosscutting Program: Develop and test sorbents. Fiscal year 1996 annual progress report, October 1, 1995--September 30, 1996

    International Nuclear Information System (INIS)

    Brown, G.N.

    1997-01-01

    Ion exchange removal of Cs, Sr, Tc, TRU, etc. has been proposed for minimizing the amount of HLW at Hanford. Purpose of this project is to test sequestering agents and substrates in representative physical/chemical/radiation environments. A small pilot-scale skid system was built. 7 ion exchange materials (CS-100, R-F, SuperLig 644, IE-911, TIE-96, NaTi) were evaluated for pretreatment of actual/simulated Hanford DSSF tank waste. An onsite technology demonstration was done at Hanford 100-N Area N-Springs. A second PADU test demonstrated the 3M web technology for radioactive Cs and Sr decontamination of 105-N-Reactor basin. Other collaborative efforts between PNNL and industry/university participants are reported

  6. Survival Processing Enhances Visual Search Efficiency.

    Science.gov (United States)

    Cho, Kit W

    2018-05-01

    Words rated for their survival relevance are remembered better than words rated using other well-known memory mnemonics. This finding, known as the survival advantage effect and replicated in many studies, suggests that our memory systems have been molded by natural selection pressures. In two experiments, the present study used a visual search task to examine whether there is likewise a survival advantage for our visual systems. Participants rated words for their survival relevance or for their pleasantness before locating the corresponding object's picture in a search array of 8 or 16 objects. Although there was no difference in search times between the two rating scenarios when the set size was 8, survival processing reduced visual search times when the set size was 16. These findings reflect a search efficiency effect and suggest that, like our memory systems, our visual systems are tuned toward self-preservation.

  7. Evaluation of economic efficiency of process improvement in food packaging

    Directory of Open Access Journals (Sweden)

    Jana Hron

    2012-01-01

    In general, we improve a process in three fundamental ways. First, we define or redefine the process in a strategic sense. Second, once it is defined or redefined, we commence process operations and use process control methods to target and stabilize the process. Third, we use process improvement methods, as described in this paper, along with process control to fully exploit our process management and/or technology. Process improvement is focused primarily on subprocesses and sub-subprocesses. Process leverage is the key to process improvement initiatives: small improvements to the basic manufacturing operations can have (assuming mass repetition of the operation) a large impact on the functioning of the whole production unit. The complexity of even small organizations, in people, products and processes, creates significant challenges in using these tools effectively and efficiently. In this paper we place process purposes in the foreground, with initiatives and tools in the background as facilitators that help accomplish the process purpose. Initiatives and tools are not the ends we seek; results in physical, economic, timeliness and customer service performance are what matter. In the paper, process boundaries (in a generic sense) are set by the process purpose and the process definition, and process improvement is initiated within the existing process boundaries. For example, in a fast-food restaurant, if we define our cooking process around a frying technology, then we make process improvements within our frying technology. If, on the other hand, we consider changing to a broiling technology, then we are likely faced with extensive change, impacting our external customers, and a process redefinition may be required. The results of the paper are based on the example of improving the quality of food packaging, specifically the integration of two approaches …

  8. Process-based organization design and hospital efficiency.

    Science.gov (United States)

    Vera, Antonio; Kuntz, Ludwig

    2007-01-01

    The central idea of process-based organization design is that organizing a firm around core business processes leads to cost reductions and quality improvements. We investigated theoretically and empirically whether the implementation of a process-based organization design is advisable in hospitals. The data came from a database compiled by the Statistical Office of the German federal state of Rheinland-Pfalz and from a written questionnaire, which was sent to the chief executive officers (CEOs) of all 92 hospitals in this federal state. We used data envelopment analysis (DEA) to measure hospital efficiency, and factor analysis and regression analysis to test our hypothesis. Our principal finding is that a high degree of process-based organization has a moderate but significant positive effect on the efficiency of hospitals. The main implication is that hospitals should implement a process-based organization to improve their efficiency. However, to actually achieve positive effects on efficiency, it is of paramount importance to observe some implementation rules, in particular to mobilize physician participation and to create an adequate organizational culture.

  9. Asymptotic efficiency of signed-rank symmetry tests under skew alternatives.

    OpenAIRE

    Alessandra Durio; Yakov Nikitin

    2002-01-01

    The efficiency of some known tests for symmetry, such as the sign test, the Wilcoxon signed-rank test or more general linear signed-rank tests, has been studied mainly under the classical location alternatives. However, it is interesting to compare the efficiencies of these tests under asymmetric alternatives, such as the skew alternative proposed in Azzalini (1985). We find and compare the local Bahadur efficiencies of linear signed-rank statistics for skew alternatives and also discuss the con…

  10. Testing market informational efficiency of Constanta port operators

    Science.gov (United States)

    Roşca, E.; Popa, M.; Ruscă, F.; Burciu, Ş.

    2015-11-01

    The Romanian capital market is still an emerging one. Following the mass-privatization process and subsequent private investment, three of the most important handling and storage companies operating in Constantza Port (OIL Terminal, Comvex and SOCEP) are listed on the Romanian Stock Exchange. The paper investigates their evolution on the market, identifying the expected rate of return and the components of share risk (specific and systematic). The price evolution is also analysed through informational efficiency, i.e. the degree to which prices instantly reflect relevant information. The Jarque-Bera normality test on the distribution of share return rates and the Fama test for informational efficiency are performed for each company. The market price model is used for price forecasting, computing the return-rate autocorrelations. The results are interpreted in the light of additional managerial and financial information on the companies' activity.
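
    The Jarque-Bera test mentioned above combines sample skewness S and kurtosis K into a single statistic, JB = n/6 * (S^2 + (K-3)^2/4), which is asymptotically chi-square with 2 degrees of freedom under normality. A minimal sketch on synthetic data (not the paper's share returns):

```python
import math
import random

def jarque_bera(x):
    """JB = n/6 * (S**2 + (K-3)**2 / 4); values well above ~5.99
    reject normality at the 5% level (chi-square, 2 df)."""
    n = len(x)
    mu = sum(x) / n
    m2 = sum((v - mu) ** 2 for v in x) / n
    m3 = sum((v - mu) ** 3 for v in x) / n
    m4 = sum((v - mu) ** 4 for v in x) / n
    skew = m3 / m2 ** 1.5
    kurt = m4 / m2 ** 2
    return n / 6 * (skew ** 2 + (kurt - 3) ** 2 / 4)

random.seed(0)
normal_sample = [random.gauss(0, 1) for _ in range(2000)]
heavy_tailed  = [random.gauss(0, 1) ** 3 for _ in range(2000)]  # excess kurtosis
print(jarque_bera(normal_sample), jarque_bera(heavy_tailed))
```

    Share returns typically resemble the heavy-tailed sample rather than the normal one, which is why the JB test is a routine first step before efficiency testing.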

  11. Efficient Bayesian inference for ARFIMA processes

    Science.gov (United States)

    Graves, T.; Gramacy, R. B.; Franzke, C. L. E.; Watkins, N. W.

    2015-03-01

    Many geophysical quantities, like atmospheric temperature, water levels in rivers, and wind speeds, have shown evidence of long-range dependence (LRD). LRD means that these quantities experience non-trivial temporal memory, which potentially enhances their predictability, but also hampers the detection of externally forced trends. Thus, it is important to reliably identify whether or not a system exhibits LRD. In this paper we present a modern and systematic approach to the inference of LRD. Rather than Mandelbrot's fractional Gaussian noise, we use the more flexible Autoregressive Fractional Integrated Moving Average (ARFIMA) model which is widely used in time series analysis, and of increasing interest in climate science. Unlike most previous work on the inference of LRD, which is frequentist in nature, we provide a systematic treatment of Bayesian inference. In particular, we provide a new approximate likelihood for efficient parameter inference, and show how nuisance parameters (e.g. short memory effects) can be integrated over in order to focus on long memory parameters, and hypothesis testing more directly. We illustrate our new methodology on the Nile water level data, with favorable comparison to the standard estimators.
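
    The long-memory part of an ARFIMA model enters through the fractional difference operator (1-B)^d, whose MA(infinity) weights obey the simple recursion psi_0 = 1, psi_k = psi_{k-1} * (k-1+d)/k. A minimal sketch that computes these weights and simulates an ARFIMA(0,d,0) path by truncating the expansion (illustrative only, not the authors' Bayesian machinery):

```python
import random

def fi_weights(d, n):
    """First n MA(infinity) weights of (1-B)**(-d):
    psi_0 = 1, psi_k = psi_{k-1} * (k - 1 + d) / k."""
    psi = [1.0]
    for k in range(1, n):
        psi.append(psi[-1] * (k - 1 + d) / k)
    return psi

def simulate_arfima0d0(d, n, seed=1):
    """ARFIMA(0,d,0) sample path via a truncated MA representation."""
    random.seed(seed)
    eps = [random.gauss(0, 1) for _ in range(2 * n)]
    psi = fi_weights(d, n)
    return [sum(psi[k] * eps[t - k] for k in range(n)) for t in range(n, 2 * n)]

# psi_k decays slowly (~k**(d-1)), the signature of long-range dependence
print(fi_weights(0.3, 5))
x = simulate_arfima0d0(0.3, 500)
```

    For 0 < d < 0.5 the weights are not summable fast enough for short memory, so the autocorrelations of the simulated series decay hyperbolically rather than exponentially; inferring d is the core of the LRD analysis described in the abstract.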

  12. Testing efficiency transfer codes for equivalence

    International Nuclear Information System (INIS)

    Vidmar, T.; Celik, N.; Cornejo Diaz, N.; Dlabac, A.; Ewa, I.O.B.; Carrazana Gonzalez, J.A.; Hult, M.; Jovanovic, S.; Lepy, M.-C.; Mihaljevic, N.; Sima, O.; Tzika, F.; Jurado Vargas, M.; Vasilopoulou, T.; Vidmar, G.

    2010-01-01

    Four general Monte Carlo codes (GEANT3, PENELOPE, MCNP and EGS4) and five dedicated packages for efficiency determination in gamma-ray spectrometry (ANGLE, DETEFF, GESPECOR, ETNA and EFFTRAN) were checked for equivalence by applying them to the calculation of efficiency transfer (ET) factors for a set of well-defined sample parameters, detector parameters and energies typically encountered in environmental radioactivity measurements. The differences between the results of the different codes never exceeded a few percent and were lower than 2% in the majority of cases.
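    The efficiency transfer principle these codes implement can be stated in one line: a full-energy-peak efficiency measured for a reference geometry is rescaled by the computed ratio between the sample and reference geometries. A minimal sketch with illustrative numbers (not values from the intercomparison):

```python
def efficiency_transfer(eff_ref_measured, eff_calc_sample, eff_calc_ref):
    """Efficiency transfer: scale a measured reference-geometry efficiency by
    the calculated sample/reference efficiency ratio at the same energy."""
    return eff_ref_measured * eff_calc_sample / eff_calc_ref
```

    For example, a measured reference efficiency of 0.05 combined with calculated efficiencies of 0.02 (sample) and 0.04 (reference) yields a transferred efficiency of 0.025.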

  13. LERF Basin 44 Process Test Plan

    International Nuclear Information System (INIS)

    LUECK, K.J.

    1999-01-01

    This document presents a plan to process a portion of the Liquid Effluent Retention Facility (LERF) Basin 44 wastewater through the 200 Area Effluent Treatment Facility (ETF). The objective of this process test is to determine the most effective/efficient method to treat the wastewater currently stored in LERF Basin 44. The process test will determine the operational parameters necessary to comply with facility effluent discharge permit limits (Ecology 1995) and the Environmental Restoration Disposal Facility (ERDF) acceptance criteria (BHI-00139), while achieving ALARA goals and maintaining the integrity of facility equipment. A major focus of the test plan centers on control of contamination due to leaks and/or facility maintenance. As a pre-startup item, all known leaks will be fixed before the start of the test. During the course of the test, a variety of contamination control measures will be evaluated for implementation during the treatment of the remaining Basin 44 inventory. Of special interest will be techniques and tools used to prevent contamination spread during sampling and when opening contaminated facility equipment/piping. At the conclusion of the test, a post ALARA review will be performed to identify lessons learned from the test run which can be applied to the treatment of the remaining Basin 44 inventory. The volume of wastewater to be treated during this test run is 500,000 gallons. This volume limit is necessary to maintain the ETF radiological inventory limits per the approved authorization basis. The duration of the process test is approximately 30 days

  14. IMPROVING TACONITE PROCESSING PLANT EFFICIENCY BY COMPUTER SIMULATION, Final Report

    Energy Technology Data Exchange (ETDEWEB)

    William M. Bond; Salih Ersayin

    2007-03-30

    This project involved industrial-scale testing of a mineral processing simulator to improve the efficiency of a taconite processing plant, namely the Minorca mine. The Concentrator Modeling Center at the Coleraine Minerals Research Laboratory, University of Minnesota Duluth, enhanced the capabilities of available software, Usim Pac, by developing mathematical models needed for accurate simulation of taconite plants. This project provided funding for this technology to prove itself in the industrial environment. As the first step, data representing existing plant conditions were collected by sampling and sample analysis. The data were then balanced and provided a basis for assessing the efficiency of individual devices and of the plant, and also for performing simulations aimed at improving plant efficiency. Performance evaluation served as a guide in developing alternative process strategies for more efficient production. A large number of computer simulations were then performed to quantify the benefits and effects of implementing these alternative schemes. Modification of the makeup ball size was selected as the most feasible option for the target performance improvement. This was combined with replacement of the existing hydrocyclones with more efficient ones. After plant implementation of these modifications, plant sampling surveys were carried out to validate the findings of the simulation-based study. Plant data showed very good agreement with the simulated data, confirming the results of the simulation. After the implementation of the modifications in the plant, several upstream bottlenecks became visible. Despite these bottlenecks limiting full capacity, a concentrator energy improvement of 7% was obtained. Further improvements in energy efficiency are expected in the near future. The success of this project demonstrated the feasibility of a simulation-based approach. Currently, the Center provides simulation-based services to all the iron ore mining companies operating in northern Minnesota.

  15. Annular centrifugal contactors for TRPO process test

    International Nuclear Information System (INIS)

    Duan, W.H.; Wang, J.C.; Chen, J.; Zhou, X.Z.; Zhou, J.Z.; Song, C.L.

    2005-01-01

    The TRPO process has been developed in China since the 1980s for removing TRU elements from high-level liquid waste (HLLW). Centrifugal contactors have several advantages, such as low hold-up volume, short residence time, low solvent degradation, small space requirements and short start-up time. They are therefore favored both for the reprocessing of spent fuel and for the treatment of HLLW. To support TRPO process testing, a series of annular centrifugal contactors has been developed at the Institute of Nuclear and New Energy Technology (INET), Tsinghua University, China. In particular, the 10-mm annular centrifugal contactor for laboratory-scale tests has been applied successfully in the cold and hot tests of the TRPO process. The 70-mm annular centrifugal contactor for industry-scale tests has two new design characteristics, namely a modular design and an overflow structure. The modular design allows the contactor to be disassembled and reassembled quickly by simply moving the modules up and down. With the overflow structure, even if one stage, or several non-adjacent stages, of a multi-stage cascade stops working during operation, the cascade can continue to operate. Both the hydraulic performance and the mass-transfer efficiency of these contactors are excellent, and the extraction stage efficiency is greater than 95% under suitable operating conditions.
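    The quoted extraction stage efficiency can be read as a Murphree-type efficiency: the actual concentration change across a stage divided by the change that would occur if the outlet stream reached equilibrium. A minimal sketch (the variable names and numbers are illustrative, not from the paper):

```python
def stage_efficiency(c_in, c_out, c_eq):
    """Murphree-type stage efficiency for one contactor stage:
    actual solute-concentration change / change needed to reach equilibrium."""
    return (c_in - c_out) / (c_in - c_eq)
```

    For example, a stage that takes the aqueous solute concentration from 100 to 5 units against an equilibrium value of 0 operates at 95 % stage efficiency.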

  16. Role of computational efficiency in process simulation

    Directory of Open Access Journals (Sweden)

    Kurt Strand

    1989-07-01

    It is demonstrated how efficient numerical algorithms may be combined to yield a powerful environment for analysing and simulating dynamic systems. The importance of using efficient numerical algorithms is emphasized and demonstrated through examples from the petrochemical industry.

  17. Testing Local Independence between Two Point Processes

    DEFF Research Database (Denmark)

    Allard, Denis; Brix, Anders; Chadæuf, Joël

    2001-01-01

    Independence test, Inhomogeneous point processes, Local test, Monte Carlo, Nonstationary, Rotations, Spatial pattern, Tiger bush

  18. High Efficiency Refrigeration Process, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — It has been proposed by NASA JSC studies that the most mass-efficient (non-nuclear) method of Lunar habitat cooling is via photovoltaic (PV) direct vapor...

  19. Current efficiency in the chlorate cell process

    Directory of Open Access Journals (Sweden)

    Spasojević Miroslav D.

    2014-01-01

    A mathematical model has been set up for current efficiency in a chlorate cell acting as an ideal electrochemical tubular reactor with a linear increase in hypochlorite concentration from the entrance to the exit. Good agreement was found between the results on current efficiency obtained experimentally under simulated industrial chlorate production conditions and the theoretical values provided by the mathematical model. [Project of the Ministry of Science of the Republic of Serbia, No. 172057 and No. 172062]

  20. Working memory capacity and redundant information processing efficiency.

    Science.gov (United States)

    Endres, Michael J; Houpt, Joseph W; Donkin, Chris; Finn, Peter R

    2015-01-01

    Working memory capacity (WMC) is typically measured by the amount of task-relevant information an individual can keep in mind while resisting distraction or interference from task-irrelevant information. The current research investigated the extent to which differences in WMC were associated with performance on a novel redundant memory probes (RMP) task that systematically varied the amount of to-be-remembered (targets) and to-be-ignored (distractor) information. The RMP task was designed to both facilitate and inhibit working memory search processes, as evidenced by differences in accuracy, response time, and Linear Ballistic Accumulator (LBA) model estimates of information processing efficiency. Participants (N = 170) completed standard intelligence tests and dual-span WMC tasks, along with the RMP task. As expected, accuracy, response-time, and LBA model results indicated memory search and retrieval processes were facilitated under redundant-target conditions, but also inhibited under mixed target/distractor and redundant-distractor conditions. Repeated measures analyses also indicated that, while individuals classified as high (n = 85) and low (n = 85) WMC did not differ in the magnitude of redundancy effects, groups did differ in the efficiency of memory search and retrieval processes overall. Results suggest that redundant information reliably facilitates and inhibits the efficiency or speed of working memory search, and these effects are independent of more general limits and individual differences in the capacity or space of working memory.

  1. Process improvement in healthcare: Overall resource efficiency

    NARCIS (Netherlands)

    de Mast, J.; Kemper, B.; Does, R.J.M.M.; Mandjes, M.; van der Bijl, Y.

    2011-01-01

    This paper aims to develop a unifying and quantitative conceptual framework for healthcare processes from the viewpoint of process improvement. The work adapts standard models from operation management to the specifics of healthcare processes. We propose concepts for organizational modeling of

  2. Designing reactive distillation processes with improved efficiency

    NARCIS (Netherlands)

    Almeida-Rivera, C.P.

    2005-01-01

    In this dissertation a life-span inspired perspective is taken on the conceptual design of grassroots reactive distillation processes. Attention was paid to the economic performance of the process and to potential losses of valuable resources over the process life span. The research was cast in a

  3. Efficient bootstrap with weakly dependent processes

    NARCIS (Netherlands)

    Bravo, Francesco; Crudu, Federico

    The efficient bootstrap methodology is developed for overidentified moment condition models with weakly dependent observations. The resulting bootstrap procedure is shown to be asymptotically valid and can be used to approximate the distributions of t-statistics and the J-statistic for overidentifying restrictions.

  4. Efficient Cross-Device Query Processing

    NARCIS (Netherlands)

    H. Pirk (Holger)

    2012-01-01

    The increasing diversity of hardware within a single system promises large performance gains but also poses a challenge for data management systems. Strategies for the efficient use of hardware with large performance differences are still lacking. For example, existing research on GPU

  5. Pragmatic Software Testing Becoming an Effective and Efficient Test Professional

    CERN Document Server

    Black, Rex

    2011-01-01

    A hands-on guide to testing techniques that deliver reliable software and systems. Testing even a simple system can quickly turn into a potentially infinite task. Faced with tight costs and schedules, testers need to have a toolkit of practical techniques combined with hands-on experience and the right strategies in order to complete a successful project. World-renowned testing expert Rex Black provides you with the proven methods and concepts that test professionals must know. He presents you with the fundamental techniques for testing and clearly shows you how to select and apply successful strategies.

  6. Journal: Efficient Hydrologic Tracer-Test Design for Tracer ...

    Science.gov (United States)

    Hydrological tracer testing is the most reliable diagnostic technique available for the determination of the basic hydraulic and geometric parameters necessary for establishing operative solute-transport processes. Tracer-test design can be difficult because of a lack of prior knowledge of the basic hydraulic and geometric parameters desired and of the appropriate tracer mass to release. A new efficient hydrologic tracer-test design (EHTD) methodology has been developed to facilitate the design of tracer tests by root determination of the one-dimensional advection-dispersion equation (ADE) using a preset average tracer concentration, which provides a theoretical basis for an estimate of the necessary tracer mass. The method uses basic measured field parameters (e.g., discharge, distance, cross-sectional area) that are combined in functional relationships that describe solute-transport processes related to flow velocity and time of travel. These initial estimates for time of travel and velocity are then applied to a hypothetical continuous stirred tank reactor (CSTR) as an analog for the hydrological-flow system to develop initial estimates for tracer concentration, tracer mass, and axial dispersion. Application of the predicted tracer mass with the hydraulic and geometric parameters in the ADE allows for an approximation of the initial sample-collection time and subsequent sample-collection frequency, where a maximum of 65 samples were determined to be necessary for describing the predicted tracer-breakthrough curve.
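    For an instantaneous release of mass M at x = 0 into uniform flow, the one-dimensional ADE referenced above has the standard closed-form solution C(x, t) = M / (A * sqrt(4*pi*D*t)) * exp(-(x - v*t)^2 / (4*D*t)). A sketch of that textbook solution (not the EHTD root-finding procedure itself):

```python
import math

def ade_concentration(mass, area, velocity, dispersion, x, t):
    """Concentration at (x, t) for an instantaneous mass release at x=0, t=0
    in 1-D uniform flow with longitudinal dispersion (infinite domain)."""
    if t <= 0.0:
        return 0.0
    spread = math.sqrt(4.0 * math.pi * dispersion * t)
    return (mass / (area * spread)
            * math.exp(-(x - velocity * t) ** 2 / (4.0 * dispersion * t)))
```

    At the mean travel time t = x/v the exponential term equals 1, so the concentration there is simply M / (A * sqrt(4*pi*D*x/v)), a convenient check when sizing tracer mass for a preset average concentration.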

  7. Test of spatial resolution and trigger efficiency

    CERN Document Server

    Benhammou, Y

    2015-01-01

    The forthcoming luminosity upgrade of the LHC to the super-LHC (sLHC) will increase the expected background rate in the forward region of the ATLAS Muon Spectrometer by approximately a factor of five. Some of the present Muon Spectrometer components will fail to cope with these high rates and will have to be replaced. The results of a test of a device consisting of Thin Gap Chambers (TGC) and a fast small-diameter Muon Drift Tube chamber (sMDT), using 180 GeV/c muons at the SPS-H8 muon beam at CERN, are presented. The goal of the test was to study the combined TGC-sMDT system as a tracking and triggering device in the ATLAS muon spectrometer after high-luminosity upgrades of the LHC. The analysis of the recorded data shows a very good correlation between the TGC and sMDT track position and inclination. This technology offers the combination of trigger and tracking and has good angular and spatial resolutions. The angular resolution is 0.4 mrad for each system individually. For the spatial resolution, the width of t...

  8. DU Processing Efficiency and Reclamation: Plasma Arc Melting

    Energy Technology Data Exchange (ETDEWEB)

    Imhoff, Seth D. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Aikin, Jr., Robert M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Swenson, Hunter [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Solis, Eunice Martinez [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-26

    The work described here corresponds to one piece of a larger effort to increase material usage efficiency during DU processing operations. In order to achieve this goal, multiple technologies and approaches are being tested. These technologies occupy a spectrum of technology readiness levels (TRLs). Plasma arc melting (PAM) is one of the technologies being investigated. PAM utilizes a high-temperature plasma to melt materials. Depending on process conditions, there are potential opportunities for recycling and material reclamation. When last routinely operational, the LANL research PAM showed extremely promising results for recycling and reclamation of DU and DU alloys. The current TRL is lower because the machine has sat idle for nearly two decades and has proved difficult to restart. This report describes the existing results, promising techniques, and the process of bringing this technology back to readiness at LANL.

  9. Efficient querying of large process model repositories

    NARCIS (Netherlands)

    Jin, Tao; Wang, Jianmin; La Rosa, M.; Hofstede, ter A.H.M.; Wen, Lijie

    2013-01-01

    Recent years have seen an increased uptake of business process management technology in industries. This has resulted in organizations trying to manage large collections of business process models. One of the challenges facing these organizations concerns the retrieval of models from large business

  10. Data-driven efficient score tests for deconvolution hypotheses

    NARCIS (Netherlands)

    Langovoy, M.

    2008-01-01

    We consider testing statistical hypotheses about densities of signals in deconvolution models. A new approach to this problem is proposed. We construct score tests for deconvolution density testing with a known noise density, and efficient score tests for the case of an unknown noise density.

  11. Refractories for Industrial Processing. Opportunities for Improved Energy Efficiency

    Energy Technology Data Exchange (ETDEWEB)

    Hemrick, James G. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Hayden, H. Wayne [Metals Manufacture Process and Controls Technology, Inc., Oak Ridge, TN (United States); Angelini, Peter [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Moore, Robert E. [R.E. Moore Associates, Maricopa, AZ (United States); Headrick, William L. [R.E. Moore Associates, Maricopa, AZ (United States)

    2005-01-01

    Refractories are a class of materials of critical importance to manufacturing industries with high-temperature unit processes. This study describes industrial refractory applications and identifies refractory performance barriers to energy efficiency for processing. The report provides recommendations for R&D pathways leading to improved refractories for energy-efficient manufacturing and processing.

  12. Design of sample analysis device for iodine adsorption efficiency test in NPPs

    International Nuclear Information System (INIS)

    Ji Jinnan

    2015-01-01

    In nuclear power plants, the iodine adsorption efficiency test checks the adsorption efficiency of the iodine adsorber. The efficiency can be calculated from analysis of the test sample, thereby determining whether the adsorber's performance meets the requirements on equipment operation and emissions. Considering the test process and actual demand, this paper presents the design of a special device for analysing this kind of test sample. Application shows that the device is convenient to operate, highly reliable and accurate in its calculations, improves experimental efficiency and reduces experimental risk. (author)

  13. Nearly Efficient Likelihood Ratio Tests for Seasonal Unit Roots

    DEFF Research Database (Denmark)

    Jansson, Michael; Nielsen, Morten Ørregaard

    In an important generalization of zero frequency autoregressive unit root tests, Hylleberg, Engle, Granger, and Yoo (1990) developed regression-based tests for unit roots at the seasonal frequencies in quarterly time series. We develop likelihood ratio tests for seasonal unit roots and show that these tests are "nearly efficient" in the sense of Elliott, Rothenberg, and Stock (1996), i.e. that their local asymptotic power functions are indistinguishable from the Gaussian power envelope. Currently available nearly efficient testing procedures for seasonal unit roots are regression-based and require the choice of a GLS detrending parameter, which our likelihood ratio tests do not.

  14. Efficiency of manufacturing processes energy and ecological perspectives

    CERN Document Server

    Li, Wen

    2015-01-01

     This monograph presents a reliable methodology for characterising the energy and eco-efficiency of unit manufacturing processes. The Specific Energy Consumption, SEC, will be identified as the key indicator for the energy efficiency of unit processes.  An empirical approach will be validated on different machine tools and manufacturing processes to depict the relationship between process parameters and energy consumptions. Statistical results and additional validation runs will corroborate the high level of accuracy in predicting the energy consumption. In relation to the eco-efficiency, the value and the associated environmental impacts of  manufacturing processes will also be discussed. The interrelationship between process parameters, process value and the associated environmental impact will be integrated in the evaluation of eco-efficiency. The book concludes with a further investigation of the results in order to develop strategies for further efficiency improvement. The target audience primarily co...

  15. Efficient individualization of hearing aid processed sound

    DEFF Research Database (Denmark)

    Nielsen, Jens Brehm; Nielsen, Jakob

    2013-01-01

    Due to the large number of options offered by the vast number of adjustable parameters in modern digital hearing aids, it is becoming increasingly daunting, even for a fine-tuning professional, to perform parameter fine-tuning that satisfactorily meets the preference of the hearing aid user. In addition, the communication between the fine-tuning professional and the hearing aid user might muddle the task. In the present paper, an interactive system is proposed to ease and speed up the fine-tuning of hearing aids to suit the preference of the individual user. The system simultaneously makes the user conscious of his own preferences while the system itself learns the user's preference. Since the learning is based on probabilistic modeling concepts, the system handles inconsistent user feedback efficiently. Experiments with hearing-impaired subjects show that the system quickly discovers individual preferred hearing...

  16. Industrial burner and process efficiency program

    Science.gov (United States)

    Huebner, S. R.; Prakash, S. N.; Hersh, D. B.

    1982-10-01

    There is an acute need for a burner that does not use excess air to provide the required thermal turndown and internal recirculation of furnace gases in direct fired batch type furnaces. Such a burner would improve fuel efficiency and product temperature uniformity. A high velocity burner has been developed which is capable of multi-fuel, preheated air, staged combustion. This burner is operated by a microprocessor to fire in a discrete pulse mode using Frequency Modulation (FM) for furnace temperature control by regulating the pulse duration. A flame safety system has been designed to monitor the pulse firing burners using Factory Mutual approved components. The FM combustion system has been applied to an industrial batch hardening furnace (1800 F maximum temperature, 2500 lbs load capacity).

  17. Power and thermal efficient numerical processing

    DEFF Research Database (Denmark)

    Liu, Wei; Nannarelli, Alberto

    2015-01-01

    Numerical processing is at the core of applications in many areas ranging from scientific and engineering calculations to financial computing. These applications are usually executed on large servers or supercomputers to exploit their high speed, high level of parallelism and high bandwidth...

  18. Enhanced substrate conversion efficiency of fermentation processes

    NARCIS (Netherlands)

    Sanders, J.P.M.; Weusthuis, R.A.; Mooibroek, H.

    2006-01-01

    The present invention relates to the field of fermentation technology. In particular the invention relates to fermentation processes for the production of a first and a second fermentation product by a single production organism wherein the first product is in a more reduced state than the substrate

  19. Process development for high-efficiency silicon solar cells

    Energy Technology Data Exchange (ETDEWEB)

    Gee, J.M.; Basore, P.A.; Buck, M.E.; Ruby, D.S.; Schubert, W.K.; Silva, B.L.; Tingley, J.W.

    1991-12-31

    Fabrication of high-efficiency silicon solar cells in an industrial environment requires a different optimization than in a laboratory environment. Strategies are presented for process development of high-efficiency silicon solar cells, with a goal of simplifying technology transfer into an industrial setting. The strategies emphasize the use of statistical experimental design for process optimization, and the use of baseline processes and cells for process monitoring and quality control. 8 refs.

  20. Optimization and Improvement of Test Processes on a Production Line

    Science.gov (United States)

    Sujová, Erika; Čierna, Helena

    2018-06-01

    The paper deals with increasing process efficiency on a production line for engine cylinder heads in a company operating in the automotive industry. The goal is to improve and optimize the test processes on the production line. It analyzes options for improving the capacity, availability and productivity of the output-test processes by using modern technology available on the market. We have focused on analysis of operation times before and after optimization of the test processes at specific production sections. By analyzing the measured results we have determined the differences in time before and after the process improvement. We have determined the overall equipment effectiveness (OEE) coefficient and, by comparing outputs, we have confirmed real improvement of the output-test process for cylinder heads.
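    The OEE coefficient mentioned above is conventionally the product of availability, performance and quality rates. A minimal sketch with illustrative shift numbers (not data from the plant in the paper):

```python
def oee(planned_time, run_time, ideal_cycle_time, total_count, good_count):
    """Overall equipment effectiveness = availability x performance x quality."""
    availability = run_time / planned_time                    # uptime share of planned time
    performance = ideal_cycle_time * total_count / run_time   # actual speed vs. ideal speed
    quality = good_count / total_count                        # first-pass yield
    return availability * performance * quality
```

    For example, a 480-minute shift with 400 minutes of run time, a 1-minute ideal cycle, 360 heads produced and 342 good heads gives OEE = 0.833 x 0.90 x 0.95 = 0.7125.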

  1. The Efficiency of Quantum Identity Testing of Multiple States

    OpenAIRE

    Kada, Masaru; Nishimura, Harumichi; Yamakami, Tomoyuki

    2008-01-01

    We examine two quantum operations, the Permutation Test and the Circle Test, which test the identity of n quantum states. These operations naturally extend the well-studied Swap Test on two quantum states. We first show the optimality of the Permutation Test for any input size n, as well as the optimality of the Circle Test for three input states. In particular, when n=3, we present a semi-classical protocol, incorporating the Swap Test, which approximates the Circle Test efficiently. Furt...
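    For two pure states, the Swap Test that these protocols build upon accepts ("same") with probability 1/2 + |<psi|phi>|^2 / 2, so identical states always pass and orthogonal states pass only half the time. A small numerical sketch of that acceptance probability:

```python
import numpy as np

def swap_test_accept(psi, phi):
    """P(accept) of the Swap Test on pure states: 1/2 + |<psi|phi>|^2 / 2."""
    psi = np.asarray(psi, dtype=complex)
    phi = np.asarray(phi, dtype=complex)
    psi = psi / np.linalg.norm(psi)   # normalize state vectors
    phi = phi / np.linalg.norm(phi)
    return 0.5 + 0.5 * abs(np.vdot(psi, phi)) ** 2
```

    For instance, |0> against the equal superposition |+> has overlap 1/2 and hence acceptance probability 0.75.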

  2. MODEL TESTING OF LOW PRESSURE HYDRAULIC TURBINE WITH HIGHER EFFICIENCY

    Directory of Open Access Journals (Sweden)

    V. K. Nedbalsky

    2007-01-01

    A design of a low-pressure turbine has been developed; it is covered by an invention patent and a useful-model patent. Testing of the hydraulic turbine model was carried out with the turbine installed on a vertical shaft. The efficiency was 76–78 %, which exceeds the efficiency of the known low-pressure blade turbines.

  3. PV inverter test setup for European efficiency, static and dynamic MPPT efficiency evaluation

    DEFF Research Database (Denmark)

    Sera, Dezso; Teodorescu, Remus; Valentini, Massimo

    2008-01-01

    This paper concerns the evaluation of the performance of grid-connected PV inverters in terms of conversion efficiency, European efficiency, and static and dynamic MPPT efficiency. Semi-automated tests were performed in the PV laboratory of the Institute of Energy Technology at Aalborg University (Denmark) on a commercial transformerless PV inverter. Thanks to the available experimental test setups, which provide the required high measuring accuracy, and the developed PV simulator, which is required for MPPT performance evaluation, PV inverters can be pretested before being tested by accredited laboratories.
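    The European efficiency evaluated in such tests is a fixed weighted average of the conversion efficiency measured at six partial-load points, with weights reflecting a central-European irradiance profile (0.03, 0.06, 0.13, 0.10, 0.48, 0.20 at 5, 10, 20, 30, 50 and 100 % load). A minimal sketch with illustrative efficiency values, not measurements from the tested inverter:

```python
# Standard load points (% of nominal output power) and European-efficiency weights.
EU_WEIGHTS = {5: 0.03, 10: 0.06, 20: 0.13, 30: 0.10, 50: 0.48, 100: 0.20}

def european_efficiency(eff_at_load):
    """Weighted average of conversion efficiencies keyed by load percentage."""
    assert abs(sum(EU_WEIGHTS.values()) - 1.0) < 1e-9  # weights must sum to 1
    return sum(w * eff_at_load[load] for load, w in EU_WEIGHTS.items())
```

    Because the 50 % point carries almost half the weight, an inverter's part-load behaviour dominates this figure, which is why the standard measures efficiency across the whole load range rather than only at nominal power.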

  4. Nearly Efficient Likelihood Ratio Tests of the Unit Root Hypothesis

    DEFF Research Database (Denmark)

    Jansson, Michael; Nielsen, Morten Ørregaard

    Seemingly absent from the arsenal of currently available "nearly efficient" testing procedures for the unit root hypothesis, i.e. tests whose local asymptotic power functions are indistinguishable from the Gaussian power envelope, is a test admitting a (quasi-)likelihood ratio interpretation. We show that the likelihood ratio unit root test derived in a Gaussian AR(1) model with standard normal innovations is nearly efficient in that model. Moreover, these desirable properties carry over to more complicated models allowing for serially correlated and/or non-Gaussian innovations.

  5. Eco-efficiency of grinding processes and systems

    CERN Document Server

    Winter, Marius

    2016-01-01

    This research monograph aims at presenting an integrated assessment approach to describe, model, evaluate and improve the eco-efficiency of existing and new grinding processes and systems. Various combinations of grinding process parameters and system configurations can be evaluated based on the eco-efficiency. The book presents the novel concept of empirical and physical modeling of technological, economic and environmental impact indicators. This includes the integrated evaluation of different grinding process and system scenarios. The book is a valuable read for research experts and practitioners in the field of eco-efficiency of manufacturing processes but the book may also be beneficial for graduate students.

  6. The impact of transport processes standardization on supply chain efficiency

    Directory of Open Access Journals (Sweden)

    Maciej Stajniak

    2016-03-01

    Background: Under continuous market competition, focusing on the customer service level, lead times and supply flexibility is very important when analysing the efficiency of logistics processes. Analysis of supply chain efficiency is one of the fundamental elements of controlling analysis. Transport processes are the key processes that provide the physical material flow through the supply chain. Therefore, in this article the authors focus on the efficiency of transport processes. Methods: The research was carried out in the second half of 2014 in 210 enterprises of the Wielkopolska Region. Observations and business practice studies conducted by the authors demonstrate a significant impact of process standardization on supply chain efficiency. Based on the research results, standard processes were developed for the operations assessed as requiring standardization in business practice. Results: On the basis of these results and observations, the authors developed standards for transport processes in BPMN notation, which allows multivariate simulation of these processes in further stages of the research. Conclusions: The developed standards are the initial stage of the authors' research into assessing the efficiency of transport processes. A further research direction is to analyse how efficiently the transport process standards are used in business practice, and their impact on the effectiveness of the entire supply chain.

  7. Greater efficiency in attentional processing related to mindfulness meditation.

    NARCIS (Netherlands)

    Hurk, P.A.M. van den; Giommi, F.; Gielen, S.C.A.M.; Speckens, A.E.M.; Barendregt, H.P.

    2010-01-01

    In this study, attentional processing in relation to mindfulness meditation was investigated. Since recent studies have suggested that mindfulness meditation may induce improvements in attentional processing, we have tested 20 expert mindfulness meditators in the attention network test. Their

  8. Energy efficient process planning based on numerical simulations

    OpenAIRE

    Neugebauer, Reimund; Hochmuth, C.; Schmidt, G.; Dix, M.

    2011-01-01

    The main goal of energy-efficient manufacturing is to generate products with maximum value-added at minimum energy consumption. To this end, in metal cutting processes, it is necessary to reduce the specific cutting energy while, at the same time, precision requirements have to be ensured. Precision is critical in metal cutting processes because they often constitute the final stages of metalworking chains. This paper presents a method for the planning of energy-efficient machining processes ...

  9. Investigation of combined coagulation and advanced oxidation process efficiency for the removal of Clarithromycin from wastewater

    Directory of Open Access Journals (Sweden)

    ahmad reza Yazdanbakhsh

    2011-06-01

    Conclusion: In general, the results of the performed tests indicated that the combined coagulation and advanced oxidation process has high efficiency in removing Clarithromycin COD from wastewater. However, application of this method in industry should be investigated further.

  10. Decontamination Efficiency of Fish Bacterial Flora from Processing Surfaces

    Directory of Open Access Journals (Sweden)

    Birna Guðbjörnsdóttir

    2009-01-01

    Full Text Available There are numerous parameters that can influence bacterial decontamination during washing of machinery and equipment in a food processing establishment. Incomplete decontamination of bacteria will increase the risk of biofilm formation and consequently increase the risk of pathogen contamination or prevalence of other undesirable microorganisms such as spoilage bacteria in the processing line. The efficiency of a typical washing protocol has been determined by testing three critical parameters and their effects on bacterial decontamination. Two surface materials (plastic and stainless steel, water temperatures (7 and 25 °C and detergent concentrations (2 and 4 % were used for this purpose in combination with two types of detergents. Biofilm was prepared on the surfaces with undefined bacterial flora obtained from minced cod fillets. The bacterial flora of the biofilm was characterised by cultivation and molecular analysis of 16S rRNA genes. All different combinations of washing protocols tested were able to remove more than 99.9 % of the bacteria in the biofilm and reduce the cell number from 7 to 0 or 2 log units of bacteria/cm2. The results show that it is possible to use less diluted detergents than recommended with comparable success, and it is easier to clean surface material made of stainless steel compared to polyethylene plastic.
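
The "log unit" figures quoted in such decontamination studies are decimal (log10) reductions in viable counts; a minimal sketch of the arithmetic, with illustrative values:

```python
import math

def log_reduction(count_before: float, count_after: float) -> float:
    """Decimal (log10) reduction in viable counts, e.g. CFU/cm^2."""
    if count_after <= 0:
        # zero counts are usually replaced by the detection limit
        raise ValueError("use the detection limit instead of zero")
    return math.log10(count_before / count_after)

# illustrative: 10^7 CFU/cm^2 before washing, 10^2 after
print(log_reduction(1e7, 1e2))  # -> 5.0
```

A ">99.9 %" removal, as reported above, corresponds to at least a 3-log reduction.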

  11. Efficient Processing of Multiple DTW Queries in Time Series Databases

    DEFF Research Database (Denmark)

    Kremer, Hardy; Günnemann, Stephan; Ivanescu, Anca-Maria

    2011-01-01

    In many of today’s applications, however, large numbers of queries arise at any given time. Existing DTW techniques do not process multiple DTW queries simultaneously, a serious limitation which slows down overall processing. In this paper, we propose an efficient processing approach for multiple DTW queries.

  12. Validation of measured friction by process tests

    DEFF Research Database (Denmark)

    Eriksen, Morten; Henningsen, Poul; Tan, Xincai

    The objective of sub-task 3.3 is to evaluate under actual process conditions the friction formulations determined by simulative testing. As regards task 3.3, the following tests have been used according to the original project plan: 1. standard ring test and 2. double cup extrusion test. The task has, however, been extended to include a number of newly developed process tests: 3. forward rod extrusion test, 4. special ring test at low normal pressure, 5. spike test (especially developed for warm and hot forging). Validation of the measured friction values in cold forming from sub-task 3.1 has been made with forward rod extrusion, and very good agreement was obtained between the friction values measured in simulative testing and in process testing.

  13. Recovery Efficiency Test Project: Phase 1, Activity report

    Energy Technology Data Exchange (ETDEWEB)

    Overbey, W.K. Jr.; Wilkins, D.W.; Keltch, B.; Saradji, B.; Salamy, S.P.

    1988-04-01

    This report is the second volume of the Recovery Efficiency Test Phase I Report of Activities. Volume 1 covered selection, well planning, drilling, coring, logging and completion operations. This volume reports on well testing activities, reclamation activities on the drilling site and access roads, and the results of physical and mechanical properties tests on the oriented core material obtained from a horizontal section of the well. 3 refs., 21 figs., 10 tabs.

  14. Event-related potential evidence for the processing efficiency theory.

    Science.gov (United States)

    Murray, N P; Janelle, C M

    2007-01-15

    The purpose of this study was to examine the central tenets of the processing efficiency theory using psychophysiological measures of attention and effort. Twenty-eight participants were divided equally into either a high or low trait anxiety group. They were then required to perform a simulated driving task while responding to one of four target light-emitting diodes. Cortical activity and dual task performance were recorded under two conditions -- baseline and competition -- with cognitive anxiety being elevated in the competitive session by an instructional set. Although driving speed was similar across sessions, a reduction in P3 amplitude to cue onset in the light detection task occurred for both groups during the competitive session, suggesting a reduction in processing efficiency as participants became more state anxious. Our findings provide more comprehensive and mechanistic evidence for processing efficiency theory, and confirm that increases in cognitive anxiety can result in a reduction of processing efficiency with little change in performance effectiveness.

  15. Price limits and stock market efficiency: Evidence from rolling bicorrelation test statistic

    International Nuclear Information System (INIS)

    Lim, Kian-Ping; Brooks, Robert D.

    2009-01-01

    Using the rolling bicorrelation test statistic, the present paper compares the efficiency of stock markets from China, Korea and Taiwan in selected sub-periods with different price limits regimes. The statistical results do not support the claims that restrictive price limits and price limits per se jeopardize market efficiency. However, the evidence does not imply that price limits have no effect on the price discovery process, but rather suggests that market efficiency is not merely determined by price limits.
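
The abstract does not spell the statistic out; one common form of the Hinich portmanteau bicorrelation H-statistic, computed over a rolling window, is sketched below (an assumption about the formulation, not the authors' exact implementation):

```python
import numpy as np

def bicorrelation_h(window: np.ndarray, max_lag: int) -> float:
    """Portmanteau H statistic: sums squared, scaled third-order
    correlations G(r, s) over 1 <= r < s <= max_lag. Under the null of
    pure noise, H is approximately chi-squared with
    max_lag * (max_lag - 1) / 2 degrees of freedom."""
    z = (window - window.mean()) / window.std()  # standardize the window
    n = len(z)
    h = 0.0
    for s in range(2, max_lag + 1):
        for r in range(1, s):
            # G(r, s) = (n - s)^(-1/2) * sum_t z_t * z_{t+r} * z_{t+s}
            g = np.sum(z[: n - s] * z[r : n - s + r] * z[s:]) / np.sqrt(n - s)
            h += g * g
    return h
```

In a rolling application, the statistic is recomputed on each successive fixed-length window of returns and the fraction of windows rejecting the null is tracked over time.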

  16. Testing efficiency and unbiasedness in the oil market

    International Nuclear Information System (INIS)

    Moosa, I.A.; Al-Loughani, N.

    1994-03-01

    This paper presents some empirical evidence on speculative efficiency or unbiasedness in the crude oil futures market and some related issues. On the basis of monthly observations on spot and futures prices of the WTI crude oil, several tests are carried out on the relevant hypotheses. The evidence suggests that futures prices are neither unbiased nor efficient forecasters of spot prices. Furthermore, a GARCH-M(1,1) model reveals the existence of a time-varying risk premium. (author)
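
The unbiasedness hypothesis referred to above is conventionally examined by regressing the realized spot price on the lagged futures price and checking for an intercept of 0 and a slope of 1; a minimal OLS sketch on synthetic data (not the authors' WTI series):

```python
import numpy as np

def unbiasedness_regression(spot_next, futures_now):
    """OLS of s_{t+1} = a + b * f_t; unbiasedness requires a = 0, b = 1."""
    X = np.column_stack([np.ones_like(futures_now), futures_now])
    (a, b), *_ = np.linalg.lstsq(X, spot_next, rcond=None)
    return a, b

# synthetic data constructed to satisfy unbiasedness (a = 0, b = 1)
rng = np.random.default_rng(1)
f = rng.uniform(15.0, 25.0, size=500)       # futures prices
s = f + rng.normal(0.0, 0.5, size=500)      # realized spot with noise
a_hat, b_hat = unbiasedness_regression(s, f)
```

A formal test would add standard errors and a joint Wald test of (a, b) = (0, 1); the paper's GARCH-M extension additionally lets the risk premium vary over time.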

  17. Natural language processing techniques for automatic test ...

    African Journals Online (AJOL)

    Natural language processing techniques for automatic test questions generation using discourse connectives. Journal of Computer Science and Its Application.

  18. Energy efficient processing of natural resources; Energieeffiziente Verarbeitung natuerlicher Rohstoffe

    Energy Technology Data Exchange (ETDEWEB)

    Pehlken, Alexandra [Univ. Bremen (Germany). Projekt FU2; Hans, Carl [Bremer Institut fuer Produktion und Logistik GmbH BIBA, Bremen (Germany). Abt. Intelligente Informations- und Kommunikationsumgebungen fuer die kooperative Produktion im Forschungsbereich Informations- und Kommunikationstechnische Anwendungen; Thoben, Klaus-Dieter [Univ. Bremen (Germany). Inst. fuer integrierte Produktentwicklung; Bremer Institut fuer Produktion und Logistik GmbH BIBA, Bremen (Germany). Forschungsbereich Informations- und kommunikationstechnische Anwendungen; Austing, Bernhard [Fa. Austing, Damme (Germany)

    2012-10-15

    Energy efficiency is gaining importance in production processes, since high energy consumption is directly related to high costs. The processing of natural resources requires additional energy input because of defined output quality demands. This paper discusses approaches and IT solutions for automatically adjusting production processes to cope with varying input qualities. The intention is to achieve the lowest energy input into the process without compromising quality.

  19. Efficient testing methodologies for microcameras in a gigapixel imaging system

    Science.gov (United States)

    Youn, Seo Ho; Marks, Daniel L.; McLaughlin, Paul O.; Brady, David J.; Kim, Jungsang

    2013-04-01

    Multiscale parallel imaging, based on a monocentric optical design, promises revolutionary advances in diverse imaging applications by enabling high-resolution, real-time image capture over a wide field-of-view (FOV), including sports broadcast, wide-field microscopy, astronomy, and security surveillance. The recently demonstrated AWARE-2 is a gigapixel camera consisting of an objective lens and 98 microcameras spherically arranged to capture an image over a FOV of 120° by 50°, using computational image processing to form a composite image of 0.96 gigapixels. Since microcameras are capable of individually adjusting exposure, gain, and focus, true parallel imaging is achieved with a high dynamic range. From the integration perspective, manufacturing and verifying consistent quality of microcameras is key to successful realization of AWARE cameras. We have developed an efficient testing methodology that utilizes a precisely fabricated dot grid chart as a calibration target to extract critical optical properties such as optical distortion, veiling glare index, and modulation transfer function to validate the imaging performance of microcameras. This approach utilizes an AWARE objective lens simulator which mimics the actual objective lens but operates with a short object distance, suitable for a laboratory environment. Here we describe the principles of the methodologies developed for AWARE microcameras and discuss the experimental results with our prototype microcameras. Reference: Brady, D. J., Gehm, M. E., Stack, R. A., Marks, D. L., Kittle, D. S., Golish, D. R., Vera, E. M., and Feller, S. D., "Multiscale gigapixel photography," Nature 486, 386-389 (2012).

  20. Firing: the proof test for ceramic processing

    International Nuclear Information System (INIS)

    Kingery, W.D.

    1975-01-01

    The object of ceramic processing is to form ware having certain shapes and properties. Thus, one test of the success of processing procedures must be in terms of the resulting structure and characteristics of a material after firing. During the firing process a few variations resulting from processing may be evened out, but the great majority of variation tends to be amplified. Examination of a few cases illustrates the nature of the defect amplification process. (U.S.)

  1. Process Cycle Efficiency Improvement Through Lean: A Case Study

    Directory of Open Access Journals (Sweden)

    P.V. Mohanram

    2011-06-01

    Full Text Available Lean manufacturing is an applied methodology of scientific, objective techniques that cause work tasks in a process to be performed with a minimum of non-value adding activities resulting in greatly reduced wait time, queue time, move time, administrative time, and other delays. This work addresses the implementation of lean principles in a construction equipment company. The prime objective is to evolve and test several strategies to eliminate waste on the shop floor. This paper describes an application of value stream mapping (VSM. Consequently, the present and future states of value stream maps are constructed to improve the production process by identifying waste and its sources. A noticeable reduction in cycle time and increase in cycle efficiency is confirmed. The production flow was optimized thus minimizing several non-value added activities/times such as bottlenecking time, waiting time, material handling time, etc. This case study can be useful in developing a more generic approach to design lean environment.
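
The cycle-efficiency metric that VSM exercises like this one target is simply the share of value-added time in total lead time; a minimal sketch with illustrative numbers:

```python
def process_cycle_efficiency(value_added_time: float, lead_time: float) -> float:
    """PCE = value-added time / total lead time (both in the same units)."""
    return value_added_time / lead_time

# illustrative: 4 h of actual processing inside a 50 h door-to-door lead time
pce = process_cycle_efficiency(4.0, 50.0)
print(round(pce, 2))  # -> 0.08
```

Eliminating non-value-added time (waiting, material handling, bottleneck queues) raises PCE by shrinking the denominator while the value-added numerator stays fixed.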

  2. Evaluation of Test Method for Solar Collector Efficiency

    DEFF Research Database (Denmark)

    Fan, Jianhua; Shah, Louise Jivan; Furbo, Simon

    The test method of the standard EN12975-2 (European Committee for Standardization, 2004) is used by European test laboratories to determine the efficiency of solar collectors. In the test method, the mean solar collector fluid temperature in the solar collector, Tm, is determined by an approximated equation, where Tin is the inlet temperature to the collector and Tout is the outlet temperature from the collector. The specific heat of the solar collector fluid is, in the test method, determined as an approximation: a constant equal to the specific heat of the solar collector fluid at the temperature Tm. ... and the sky temperature. Based on the investigations, recommendations for changes to the test methods and test conditions are considered. The investigations are carried out within the NEGST (New Generation of Solar Thermal Systems) project financed by the EU.

  3. Simple processing of high efficiency silicon solar cells

    International Nuclear Information System (INIS)

    Hamammu, I.M.; Ibrahim, K.

    2006-01-01

    Cost-effective photovoltaic devices have been an area of research since the development of the first solar cells, as cost is the major factor in their usage. Silicon solar cells have the biggest share of the photovoltaic market, though silicon is not the optimal material for solar cells. This work introduces a simplified approach to high-efficiency silicon solar cell processing, minimizing the processing steps and thereby reducing cost. The suggested procedure might also allow the use of lower-quality materials compared to those used today. The main features of the present work are: simplifying the diffusion process, edge shunt isolation, and using acidic texturing instead of standard alkaline processing. Solar cells of 17% efficiency have been produced using this procedure. Investigations into the possibility of improving the efficiency and using lower-quality material are still underway

  4. Regression Tests and the Efficiency of Fixed Odds Betting Markets

    NARCIS (Netherlands)

    Koning, Ruud H.

    The informational content of odds posted in sports betting market has been an ongoing topic of research. In this paper, I test whether fixed odds betting markets in soccer are informationally efficient. The contributions of the paper are threefold: first, I propose a simple yet flexible statistical

  5. Testing Methodology in the Student Learning Process

    Science.gov (United States)

    Gorbunova, Tatiana N.

    2017-01-01

    The subject of the research is to build methodologies to evaluate the student knowledge by testing. The author points to the importance of feedback about the mastering level in the learning process. Testing is considered as a tool. The object of the study is to create the test system models for defence practice problems. Special attention is paid…

  6. Live testing of the SCAT management process

    International Nuclear Information System (INIS)

    Martin, V.; Duhaime, C.; Boule, M.; Lamarche, A.

    2002-01-01

    The techniques developed by Environment Canada's Shoreline Cleanup Assessment Team (SCAT) have become the world standard for consistency in remedial efforts following an oil spill. This paper presented the results of a workshop that was aimed at testing the management process developed by two different agencies, Environment Canada and Eastern Canada Response Corporation (ECRC), following a spill incident in 1999 in which 150 km of Quebec's north shore near Havre-Saint-Pierre was polluted with 49 tonnes of bunker oil 180 from an ore ship. The issues of specific concern included fishing, mollusc harvesting, tourism, hunting and sites of environmental interest in the Mingan National Park. Both agencies realized they had to use the SCAT approach, but for different reasons: Environment Canada had to identify environmental impacts, while ECRC had to plan methods for shoreline treatment. Both agencies had to document the pollution using the SCAT method; therefore, they joined efforts and pooled their expertise to optimize resources. The newly developed management structure was aimed at determining how the SCAT approach should be planned, how data quality could be secured, and how the information should be managed. The main benefits of the joint structure were a flow chart and description of the different functions, and a list of deliverables to be produced by those in charge of managing the SCAT approach. It was determined that the new management process is efficient. A SCAT assessment and situation report were both produced within the prescribed time frame. Working in partnership allowed participants to acquire a common understanding of the SCAT approach. 2 refs., 2 tabs., 3 figs

  7. Mobile Energy Laboratory energy-efficiency testing programs

    Energy Technology Data Exchange (ETDEWEB)

    Parker, G B; Currie, J W

    1992-03-01

    This report summarizes energy-efficiency testing activities applying the Mobile Energy Laboratory (MEL) testing capabilities during the third and fourth quarters of fiscal year (FY) 1991. The MELs, developed by the US Department of Energy (DOE) Federal Energy Management Program (FEMP), are administered by Pacific Northwest Laboratory (PNL) and the Naval Energy and Environmental Support Activity (NEESA) for energy testing and energy conservation program support functions at federal facilities. The using agencies principally fund MEL applications, while DOE/FEMP funds program administration and capability enhancement activities. This report fulfills the requirements established in Section 8 of the MEL Use Plan (PNL-6861) for semi-annual reporting on energy-efficiency testing activities using the MEL capabilities. The MEL Use Committee, formally established in 1989, developed the MEL Use Plan and meets semi-annually to establish priorities for energy-efficient testing applications using the MEL capabilities. The MEL Use Committee is composed of one representative each of the US Department of Energy, US Army, US Air Force, US Navy, and other federal agencies.

  8. Mobile Energy Laboratory energy-efficiency testing programs

    International Nuclear Information System (INIS)

    Parker, G.B.; Currie, J.W.

    1992-03-01

    This report summarizes energy-efficiency testing activities applying the Mobile Energy Laboratory (MEL) testing capabilities during the third and fourth quarters of fiscal year (FY) 1991. The MELs, developed by the US Department of Energy (DOE) Federal Energy Management Program (FEMP), are administered by Pacific Northwest Laboratory (PNL) and the Naval Energy and Environmental Support Activity (NEESA) for energy testing and energy conservation program support functions at federal facilities. The using agencies principally fund MEL applications, while DOE/FEMP funds program administration and capability enhancement activities. This report fulfills the requirements established in Section 8 of the MEL Use Plan (PNL-6861) for semi-annual reporting on energy-efficiency testing activities using the MEL capabilities. The MEL Use Committee, formally established in 1989, developed the MEL Use Plan and meets semi-annually to establish priorities for energy-efficient testing applications using the MEL capabilities. The MEL Use Committee is composed of one representative each of the US Department of Energy, US Army, US Air Force, US Navy, and other federal agencies

  9. Efficient p-value evaluation for resampling-based tests

    KAUST Repository

    Yu, K.

    2011-01-05

    The resampling-based test, which often relies on permutation or bootstrap procedures, has been widely used for statistical hypothesis testing when the asymptotic distribution of the test statistic is unavailable or unreliable. It requires repeated calculations of the test statistic on a large number of simulated data sets for its significance level assessment, and thus it can become very computationally intensive. Here, we propose an efficient p-value evaluation procedure by adapting the stochastic approximation Markov chain Monte Carlo algorithm. The new procedure can be used easily for estimating the p-value for any resampling-based test. We show through numeric simulations that the proposed procedure can be 100-500,000 times as efficient (in terms of computing time) as the standard resampling-based procedure when evaluating a test statistic with a small p-value (e.g. less than 10^(-6)). With its computational burden reduced by this proposed procedure, the versatile resampling-based test becomes computationally feasible for a much wider range of applications. We demonstrate the application of the new method by applying it to a large-scale genetic association study of prostate cancer.
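
For context, the standard resampling baseline that the proposed procedure accelerates estimates a p-value by recomputing the statistic on shuffled data; a minimal two-sample permutation test is sketched below (the baseline, not the stochastic approximation MCMC procedure itself):

```python
import numpy as np

def permutation_pvalue(x, y, n_perm=10_000, seed=0):
    """Monte Carlo p-value for the difference in means of two samples.
    The '+1' correction keeps the estimate valid (never exactly zero)."""
    rng = np.random.default_rng(seed)
    observed = abs(x.mean() - y.mean())
    pooled = np.concatenate([x, y])
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)  # random relabeling of the pooled sample
        if abs(pooled[: len(x)].mean() - pooled[len(x):].mean()) >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)
```

With the "+1" correction the smallest reportable p-value is 1/(n_perm + 1), so resolving a p-value below 10^(-6) needs millions of permutations; that cost is what the paper's procedure avoids.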

  10. Comparison of high efficiency particulate filter testing methods

    International Nuclear Information System (INIS)

    1985-01-01

    High Efficiency Particulate Air (HEPA) filters are used for the removal of submicron-size particulates from air streams. In the nuclear industry they serve as an important engineering safeguard to prevent the release of airborne radioactive particulates to the environment. HEPA filters used in the nuclear industry should therefore be manufactured and operated under strict quality control. There are three levels of testing HEPA filters: i) testing of the filter media; ii) testing of the assembled filter, including filter media and filter housing; and iii) on-site testing of the complete filter installation before putting it into operation and later for the purpose of periodic control. A co-ordinated research programme on particulate filter testing methods was taken up by the Agency, and contracts were awarded to the Member Countries Belgium, the German Democratic Republic, India and Hungary. The investigations carried out by the participants of the present co-ordinated research programme cover the most frequently used HEPA filter testing methods for filter medium tests, rig tests and in-situ tests. Most of the experiments were carried out at ambient temperature and humidity, but indications were given to extend the investigations to elevated temperature and humidity in the future, for the purpose of testing the performance of HEPA filters under severe conditions. A major conclusion of the co-ordinated research programme was that it was not possible to recommend one method as a reference method for in-situ testing of high efficiency particulate air filters. Most of the present conventional methods are adequate for current requirements. The reasons why no single method could be recommended were multiple, ranging from economic aspects through incompatibility of materials to national regulations

  11. Process efficiency. Redesigning social networks to improve surgery patient flow.

    Science.gov (United States)

    Samarth, Chandrika N; Gloor, Peter A

    2009-01-01

    We propose a novel approach to improve throughput of the surgery patient flow process of a Boston area teaching hospital. A social network analysis was conducted in an effort to demonstrate that process efficiency gains could be achieved through redesign of social network patterns at the workplace; in conjunction with redesign of organization structure and the implementation of workflow over an integrated information technology system. Key knowledge experts and coordinators in times of crisis were identified and a new communication structure more conducive to trust and knowledge sharing was suggested. The new communication structure is scalable without compromising on coordination required among key roles in the network for achieving efficiency gains.

  12. Highly Automated Agile Testing Process: An Industrial Case Study

    Directory of Open Access Journals (Sweden)

    Jarosław Berłowski

    2016-09-01

    Full Text Available This paper presents a description of an agile testing process in a medium-size software project developed using Scrum. The research method used was the case study; surveys, quantifiable project data sources and qualitative opinions of project members were used for data collection. Challenges related to the testing process regarding a complex project environment and unscheduled releases were identified. Based on the obtained results, we concluded that the described approach addresses the aforementioned issues well. Therefore, recommendations were made with regard to the employed principles of agility, specifically: continuous integration, responding to change, test automation and test-driven development. Furthermore, an efficient testing environment that combines a number of test frameworks (e.g. JUnit, Selenium, Jersey Test) with custom-developed simulators is presented.

  13. In-situ high efficiency filter testing at AEE Winfrith

    International Nuclear Information System (INIS)

    Fraser, D.C.

    1977-01-01

    This paper discusses experience in the testing of high efficiency filters in a variety of reactor and plant installations at AEE Winfrith. There is rarely any concern about the effectiveness of the filter as supplied by any reputable manufacturer. Experience has shown there is a need to check for defects in the installation of filters which could lead to by-passing of aerosols and it is desirable to perform periodical re-tests to ensure that no subsequent deterioration occurs. It is important to use simple, portable apparatus for such tests; methods based on the use of sodium chloride aerosols, although suitable for the testing of filters prior to installation, involve apparatus which is too bulky for in-situ testing. At Winfrith a double automatic Pollak counter has been developed and used routinely since 1970. The aerosol involved has a particle size far smaller than the size most likely to penetrate intact filters, but this is irrelevant when one is primarily interested in particles which by-pass the filter. Comparisons with other methods of testing filters will be described. There is remarkably good agreement between the efficiency of the filter installation as measured by a Pollak counter compared with techniques involving aerosols of sodium chloride and Dioctyl Phthalate (DOP), presumably because the leakage around the filter is independent of particle size

  14. Implementation of a pedagogically efficient system for electronic testing

    OpenAIRE

    Preskar, Peter

    2012-01-01

    Nowadays online learning is a very common process, and alongside it online assessment systems have developed strongly. Time is money, and online assessment or electronic tests save us exactly that - time. For both teacher and student, they enable fast feedback. The diploma thesis first presents information and communications technology (ICT) and the role of ICT in the development of electronic tests and the standardisation of records of electronic tests. I...

  15. Test Program for High Efficiency Gas Turbine Exhaust Diffuser

    Energy Technology Data Exchange (ETDEWEB)

    Norris, Thomas R.

    2009-12-31

    This research relates to improving the efficiency of flow in a turbine exhaust, and thus that of the turbine and power plant. The Phase I SBIR project demonstrated the technical viability of "strutlets" to control stalls on a model diffuser strut. Strutlets are a novel flow-improving vane concept intended to improve the efficiency of flow in turbine exhausts. Strutlets can help reduce turbine back pressure and incrementally improve turbine efficiency, increase power, and reduce greenhouse gas emissions. The long-term goal is a 0.5 percent improvement in each item, averaged over the US gas turbine fleet. The strutlets were tested in a physical scale model of a gas turbine exhaust diffuser. The test flow passage is a straight, annular diffuser with three sets of struts. At the end of Phase 1, the ability of strutlets to keep flow attached to struts was demonstrated, but the strutlet drag was too high for a net efficiency advantage. An independently sponsored follow-up project did develop a highly modified low-drag strutlet. In combination with other flow-improving vanes, compliance with the stated goals was demonstrated for simple-cycle power plants, and with most of the goals for combined-cycle power plants using this particular exhaust geometry. Importantly, low-frequency diffuser noise was reduced by 5 dB or more compared to the baseline. Applicability to other diffuser geometries is yet to be demonstrated.

  16. An algorithm for testing the efficient market hypothesis.

    Directory of Open Access Journals (Sweden)

    Ioana-Andreea Boboc

    Full Text Available The objective of this research is to examine the efficiency of the EUR/USD market through the application of a trading system. The system uses a genetic algorithm based on technical analysis indicators such as Exponential Moving Average (EMA), Moving Average Convergence Divergence (MACD), Relative Strength Index (RSI) and Filter, which gives buying and selling recommendations to investors. The algorithm optimizes the strategies by dynamically searching for parameters that improve profitability in the training period. The best sets of rules are then applied to the testing period. The results show inconsistency in finding a set of trading rules that performs well in both periods. Strategies that achieve very good returns in the training period have difficulty returning positive results in the testing period, which is consistent with the efficient market hypothesis (EMH).

  17. An algorithm for testing the efficient market hypothesis.

    Science.gov (United States)

    Boboc, Ioana-Andreea; Dinică, Mihai-Cristian

    2013-01-01

    The objective of this research is to examine the efficiency of the EUR/USD market through the application of a trading system. The system uses a genetic algorithm based on technical analysis indicators such as Exponential Moving Average (EMA), Moving Average Convergence Divergence (MACD), Relative Strength Index (RSI) and Filter, which gives buying and selling recommendations to investors. The algorithm optimizes the strategies by dynamically searching for parameters that improve profitability in the training period. The best sets of rules are then applied to the testing period. The results show inconsistency in finding a set of trading rules that performs well in both periods. Strategies that achieve very good returns in the training period have difficulty returning positive results in the testing period, which is consistent with the efficient market hypothesis (EMH).
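
The indicators the genetic algorithm combines are standard technical-analysis quantities; a minimal sketch of EMA and a simple-average RSI, assuming the usual textbook definitions rather than the authors' exact parameterization:

```python
def ema(prices, period):
    """Exponential moving average with smoothing k = 2 / (period + 1)."""
    k = 2.0 / (period + 1)
    value = prices[0]
    out = [value]
    for p in prices[1:]:
        value = k * p + (1 - k) * value
        out.append(value)
    return out

def rsi(prices, period=14):
    """Relative Strength Index over the last `period` price changes,
    using simple averages of gains and losses (Wilder's smoothed
    variant differs slightly)."""
    deltas = [b - a for a, b in zip(prices[:-1], prices[1:])][-period:]
    gains = sum(d for d in deltas if d > 0) / period
    losses = sum(-d for d in deltas if d < 0) / period
    if losses == 0:
        return 100.0  # only gains in the window
    return 100.0 - 100.0 / (1.0 + gains / losses)
```

A genetic algorithm in this setting would encode indicator periods and thresholds as genes and score each candidate rule set by its profitability on the training window.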

  18. Catalysts Efficiency Evaluation by using CC Analysis Test

    Directory of Open Access Journals (Sweden)

    Arina Negoitescu

    2011-10-01

The study emphasizes the necessity of catalyst efficiency testing. Diagnosis systems using lambda probes are based on the oxygen storage capacity of the catalyst. Comparing the lambda probe signals upstream and downstream of the catalyst provides an indication of catalyst activity, although correlating oxygen storage capacity with catalyst efficiency remains difficult. Diagnosis of the 1.4 Renault Clio Symbol was carried out in the Road Vehicles Lab at the Politehnica University of Timisoara using an AVL Dicom 4000. The tests showed that the engine ran on a lean mixture, requiring a fuel mixture correction calculated by the engine control unit (ECU). A compensation of 0.14 vol% is required for correct engine operation and for keeping emissions within permissible limits.

  19. Testing weak form efficiency on the capital markets in Serbia

    Directory of Open Access Journals (Sweden)

    Kršikapa-Rašajski Jovana

    2016-01-01

The weak-form efficient market hypothesis assumes that participants in financial markets are not able to achieve above-average returns based on historical prices. In order to establish the presence of weak-form market efficiency in the Serbian market, the analysis incorporates daily data of the two most prominent indices on the Belgrade Stock Exchange, BELEX 15 and BELEX LINE, from their inception until 31 December 2014. The results obtained by the analysis and testing indicate that the capital market in Serbia cannot be considered sufficiently efficient; more precisely, the postulates assumed by weak-form market efficiency are not fully met. Given that the capital market in Serbia is still underdeveloped, primarily because of the small volumes, turnover and range of securities traded on the market, and that it is not sufficiently regulated and transparent, a lack of investors is noticeable. Consequently, the analysis presented in this paper indicates only weak support for the efficient market hypothesis in Serbia.
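A minimal version of such a weak-form check, illustrative only and not the paper's exact methodology, is a serial-correlation test on returns: under the random-walk null the lag-1 autocorrelation of returns should be near zero, with a standard error of roughly 1/√n.

```python
import math

def lag1_autocorr(returns):
    """Sample lag-1 autocorrelation of a return series."""
    n = len(returns)
    mean = sum(returns) / n
    cov = sum((returns[i] - mean) * (returns[i - 1] - mean) for i in range(1, n))
    var = sum((r - mean) ** 2 for r in returns)
    return cov / var

def weak_form_rejected(returns, z_crit=1.96):
    """True when serial correlation is too large for the random-walk null (5% level)."""
    rho = lag1_autocorr(returns)
    return abs(rho) > z_crit / math.sqrt(len(returns))
```

A strongly mean-reverting return series (e.g. strictly alternating gains and losses) yields an autocorrelation near -1 and is flagged as inconsistent with the random-walk null.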

  20. Efficiency tests on the pyrolysis gasifier stove Peko Pe

    DEFF Research Database (Denmark)

    Nielsen, Per Sieverts

    1996-01-01

This paper presents results from water boiling tests on the pyrolysis gasifier stove Peko Pe, which has been developed by the Norwegian Paal Wendelbo. The stove efficiencies determined vary between 21% and 29% when burning dry Danish woodchips (10% moisture) with an estimated calorific value of 16 MJ...... the water content in the grass. In the Adjumani refugee camp it was furthermore found that the stove was able to provide sufficient energy from solid combustion, after the pyrolysis was stopped, to boil water for an additional 25-30 minutes with the lid on. This effect was not seen in the tests on woodchips in Denmark...

  1. In Situ Field Testing of Processes

    International Nuclear Information System (INIS)

    Wang, J.

    2001-01-01

The purpose of this Analysis/Model Report (AMR) is to update and document the data and subsequent analyses from ambient field-testing activities performed in underground drifts of the Yucca Mountain Site Characterization Project (YMP). This revision updates data and analyses presented in the initial issue of this AMR. This AMR was developed in accordance with the ''Technical Work Plan for Unsaturated Zone (UZ) Flow and Transport Process Model Report'' and ''Technical Work Plan for UZ Flow, Transport, and Coupled Processes Process Model Report''. These activities were performed to investigate in situ flow and transport processes. The evaluations provide the necessary framework to: (1) refine and confirm the conceptual model of matrix and fracture processes in the unsaturated zone (UZ) and (2) analyze the impact of excavation (including use of construction water and effect of ventilation) on the UZ flow and transport processes. This AMR is intended to support revisions to ''Conceptual and Numerical Models for UZ Flow and Transport'' and ''Unsaturated Zone Flow and Transport Model Process Model Report''. In general, the results discussed in this AMR are from studies conducted using a combination or a subset of the following three approaches: (1) air-injection tests, (2) liquid-release tests, and (3) moisture monitoring using in-drift sensors or in-borehole sensors, to evaluate the impact of excavation, ventilation, and construction-water usage on the surrounding rocks. The liquid-release tests and air-injection tests provide an evaluation of in situ fracture flow and the competing processes of matrix imbibition. Only the findings from testing and data not covered in the ''Seepage Calibration Model and Seepage Testing Data'' are analyzed in detail in the AMR.

  2. In Situ Field Testing of Processes

    Energy Technology Data Exchange (ETDEWEB)

    J. Wang

    2001-12-14

The purpose of this Analysis/Model Report (AMR) is to update and document the data and subsequent analyses from ambient field-testing activities performed in underground drifts of the Yucca Mountain Site Characterization Project (YMP). This revision updates data and analyses presented in the initial issue of this AMR. This AMR was developed in accordance with the ''Technical Work Plan for Unsaturated Zone (UZ) Flow and Transport Process Model Report'' and ''Technical Work Plan for UZ Flow, Transport, and Coupled Processes Process Model Report''. These activities were performed to investigate in situ flow and transport processes. The evaluations provide the necessary framework to: (1) refine and confirm the conceptual model of matrix and fracture processes in the unsaturated zone (UZ) and (2) analyze the impact of excavation (including use of construction water and effect of ventilation) on the UZ flow and transport processes. This AMR is intended to support revisions to ''Conceptual and Numerical Models for UZ Flow and Transport'' and ''Unsaturated Zone Flow and Transport Model Process Model Report''. In general, the results discussed in this AMR are from studies conducted using a combination or a subset of the following three approaches: (1) air-injection tests, (2) liquid-release tests, and (3) moisture monitoring using in-drift sensors or in-borehole sensors, to evaluate the impact of excavation, ventilation, and construction-water usage on the surrounding rocks. The liquid-release tests and air-injection tests provide an evaluation of in situ fracture flow and the competing processes of matrix imbibition. Only the findings from testing and data not covered in the ''Seepage Calibration Model and Seepage Testing Data'' are analyzed in detail in the AMR.

  3. Efficient field testing for load rating railroad bridges

    Science.gov (United States)

Schulz, Jeffrey L.; Commander, Brett C.

    1995-06-01

As the condition of our infrastructure continues to deteriorate, and the loads carried by our bridges continue to increase, an ever-growing number of railroad and highway bridges require load limits. With safety and transportation costs at both ends of the spectrum, the need for accurate load rating is paramount. This paper describes a method that has been developed for efficient load testing and evaluation of short- and medium-span bridges. Through the use of a specially designed structural testing system and efficient load test procedures, a typical bridge can be instrumented and tested at 64 points in less than one working day and with minimum impact on rail traffic. Various techniques are available to evaluate structural properties and obtain a realistic model. With field data, a simple finite element model is 'calibrated' and its accuracy is verified. Appropriate design and rating loads are applied to the resulting model and stress predictions are made. This technique has been performed on numerous structures to address specific problems and to provide accurate load ratings. The merits and limitations of this approach are discussed in the context of actual examples of both rail and highway bridges that were tested and evaluated.

  4. Design process of an area-efficient photobioreactor

    NARCIS (Netherlands)

    Zijffers, J.F.; Janssen, M.G.J.; Tramper, J.; Wijffels, R.H.

    2008-01-01

    This article describes the design process of the Green Solar Collector (GSC), an area-efficient photobioreactor for the outdoor cultivation of microalgae. The overall goal has been to design a system in which all incident sunlight on the area covered by the reactor is delivered to the algae at such

  5. Boosting the IGCLC process efficiency by optimizing the desulfurization step

    International Nuclear Information System (INIS)

    Hamers, H.P.; Romano, M.C.; Spallina, V.; Chiesa, P.; Gallucci, F.; Sint Annaland, M. van

    2015-01-01

Highlights: • Pre-CLC hot gas desulfurization and post-CLC desulfurization are assessed. • Process efficiency increases by 0.5–1% points with alternative desulfurization methods. • Alternative desulfurization methods are more beneficial for CFB configurations. Abstract: In this paper the influence of the desulfurization method on the process efficiency of integrated gasification chemical-looping combustion (IGCLC) systems is investigated for both packed bed and circulating fluidized bed CLC systems. Both reactor types have been integrated in an IGCLC power plant, in which three desulfurization methods have been compared: conventional cold gas desulfurization with Selexol (CGD), hot gas desulfurization with ZnO (HGD) and flue gas desulfurization after the CLC reactors (post-CLC). For CLC with packed bed reactors, the efficiency gain of the alternative desulfurization methods is about 0.5–0.7% points. This is relatively small because of the relatively large amount of steam that has to be mixed with the fuel to avoid carbon deposition on the oxygen carrier. The HGD and post-CLC configurations do not contain a saturator, so more steam has to be mixed in, with a negative influence on the process efficiency. Carbon deposition is not an issue for circulating fluidized bed systems, and therefore a somewhat higher efficiency gain of 0.8–1.0% points can be reached for this reactor system, assuming that complete fuel conversion can be reached and no sulfur species are formed on the solid, which is, however, thermodynamically possible for iron and manganese based oxygen carriers. From this study, it can be concluded that adapting the desulfurization method results in higher process efficiencies, especially for the circulating fluidized bed system, while the number of operating units is reduced.

  6. Vitrification process testing for reference HWVP waste

    International Nuclear Information System (INIS)

    Perez, J.M. Jr.; Goles, R.W.; Nakaoka, R.K.; Kruger, O.L.

    1991-01-01

    The Hanford Waste Vitrification Plant (HWVP) is being designed to vitrify high-level radioactive wastes stored on the Hanford site. The vitrification flow-sheet is being developed to assure the plant will achieve plant production requirements and the glass product will meet all waste form requirements for final geologic disposal. The first Hanford waste to be processed by the HWVP will be a neutralized waste resulting from PUREX fuel reprocessing operations. Testing is being conducted using representative nonradioactive simulants to obtain process and product data required to support design, environmental, and qualification activities. Plant/process criteria, testing requirements and approach, and results to date will be presented

  7. Alexandria: towards an efficient centralised document management. More efficient business processes

    International Nuclear Information System (INIS)

    Couvreur, D.

    2011-01-01

The capital of SCK-CEN is the knowledge of its staff. An enormous amount of information circulates within the research centre, so a centralised management of all documents is critical to efficiently manage, share and unlock this expertise. Since 2009, SCK-CEN has been working on a document management system: Alexandria. A first test draft was completed in 2010.

  8. Origin of poor doping efficiency in solution processed organic semiconductors.

    Science.gov (United States)

    Jha, Ajay; Duan, Hong-Guang; Tiwari, Vandana; Thorwart, Michael; Miller, R J Dwayne

    2018-05-21

Doping is an extremely important process where intentional insertion of impurities in semiconductors controls their electronic properties. In organic semiconductors, one of the convenient, but inefficient, ways of doping is the spin casting of a precursor mixture of components in solution, followed by solvent evaporation. Active control over this process holds the key to significant improvements over current poor doping efficiencies. Yet, an optimized control can only come from a detailed understanding of electronic interactions responsible for the low doping efficiencies. Here, we use two-dimensional nonlinear optical spectroscopy to examine these interactions in the course of the doping process by probing the solution mixture of doped organic semiconductors. A dopant accepts an electron from the semiconductor and the two ions form a duplex of interacting charges known as ion-pair complexes. Well-resolved off-diagonal peaks in the two-dimensional spectra clearly demonstrate the electronic connectivity among the ions in solution. This electronic interaction represents a well resolved electrostatically bound state, as opposed to a random distribution of ions. We developed a theoretical model to recover the experimental data, which reveals an unexpectedly strong electronic coupling of ∼250 cm⁻¹ with an intermolecular distance of ∼4.5 Å between ions in solution, which is approximately the expected distance in processed films. The fact that this relationship persists from solution to the processed film gives direct evidence that Coulomb interactions are retained from the precursor solution to the processed films. This memory effect renders the charge carriers equally bound also in the film and, hence, results in poor doping efficiencies. This new insight will help pave the way towards rational tailoring of the electronic interactions to improve doping efficiencies in processed organic semiconductor thin films.

  9. Valve leakage inspection testing and maintenance process

    International Nuclear Information System (INIS)

    Aikin, J.A.; Reinwald, J.W.; Kittmer, C.A.

    1991-01-01

    In valve maintenance, packing rings that prevent leakage along the valve stem must periodically be replaced, either during routine maintenance or to correct a leak or valve malfunction. Tools and procedures currently in use for valve packing removal and inspection are generally of limited value due to various access and application problems. A process has been developed by AECL Research that addresses these problems. The process, using incompressible fluid pressure, quickly and efficiently confirms the integrity of the valve backseat, extracts hard-to-remove valve packing sets, and verifies the leak tightness of the repacked valve

  10. Increased Efficiencies in the INEEL SAR/TSR/USQ Process

    International Nuclear Information System (INIS)

    Cole, N.E.

    2002-01-01

The Idaho National Engineering and Environmental Laboratory (INEEL) has implemented a number of efficiencies to reduce the time and cost of preparing safety basis documents. The INEEL is continuing to look at other aspects of the safety basis process to identify further efficiencies that can be implemented while remaining in compliance with Title 10 Code of Federal Regulations (CFR) Part 830. A six-sigma approach is used to identify areas to improve efficiencies and develop the action plan for implementation of the new process, as applicable. Three improvement processes have been implemented. The first was the development of standardized Documented Safety Analysis (DSA) and technical safety requirement (TSR) documents that all nuclear facilities use, adding facility-specific details. The second is a material procurement process based on the safety systems specified in the individual safety basis documents. The third is a restructuring of the entire safety basis preparation and approval process. Significant savings in the time to prepare safety basis documents, the cost of materials, and the total cost of the documents are currently being realized.

  11. Efficient Adoption and Assessment of Multiple Process Improvement Reference Models

    Directory of Open Access Journals (Sweden)

    Simona Jeners

    2013-06-01

A variety of reference models such as CMMI, COBIT or ITIL support IT organizations in improving their processes. These process improvement reference models (IRMs) cover different domains such as IT development, IT services or IT governance, but also share some similarities. As there are organizations that address multiple domains and need to coordinate their processes in their improvement, we present MoSaIC, an approach to support organizations in efficiently adopting and conforming to multiple IRMs. Our solution realizes a semantic integration of IRMs based on common meta-models. The resulting IRM integration model enables organizations to efficiently implement and assess multiple IRMs and to benefit from synergy effects.

  12. Healthy Efficient New Gas Homes (HENGH) Pilot Test Results

    Energy Technology Data Exchange (ETDEWEB)

    Chan, Wanyu R. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Maddalena, Randy L [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Stratton, Chris [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Hotchi, Toshifumi [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Singer, Brett C. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Walker, Iain S. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Sherman, Max H. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2016-05-01

    The Healthy Efficient New Gas Homes (HENGH) is a field study that will collect data on ventilation systems and indoor air quality (IAQ) in new California homes that were built to 2008 Title 24 standards. A pilot test was performed to help inform the most time and cost effective approaches to measuring IAQ in the 100 test homes that will be recruited for this study. Two occupied, single-family detached homes built to 2008 Title 24 participated in the pilot test. One of the test homes uses exhaust-only ventilation provided by a continuous exhaust fan in the laundry room. The other home uses supply air for ventilation. Measurements of IAQ were collected for two weeks. Time-resolved concentrations of particulate matter (PM), nitrogen dioxide (NO2), carbon dioxide (CO2), carbon monoxide (CO), and formaldehyde were measured. Measurements of IAQ also included time-integrated concentrations of volatile organic compounds (VOCs), volatile aldehydes, and NO2. Three perfluorocarbon tracers (PFTs) were used to estimate the dilution rate of an indoor emitted air contaminant in the two pilot test homes. Diagnostic tests were performed to measure envelope air leakage, duct leakage, and airflow of range hood, exhaust fans, and clothes dryer vent when accessible. Occupant activities, such as cooking, use of range hood and exhaust fans, were monitored using various data loggers. This document describes results of the pilot test.

  13. Decomposition based parallel processing technique for efficient collaborative optimization

    International Nuclear Information System (INIS)

    Park, Hyung Wook; Kim, Sung Chan; Kim, Min Soo; Choi, Dong Hoon

    2000-01-01

In practical design studies, most designers solve multidisciplinary problems with a complex design structure. These multidisciplinary problems involve hundreds of analyses and thousands of variables. The sequence of processes used to solve these problems affects the speed of the total design cycle. Thus it is very important for designers to reorder the original design processes to minimize total cost and time. This is accomplished by decomposing the large multidisciplinary problem into several MultiDisciplinary Analysis SubSystems (MDASS) and processing them in parallel. This paper proposes a new strategy for parallel decomposition of multidisciplinary problems to raise design efficiency by using a genetic algorithm, and shows the relationship between decomposition and Multidisciplinary Design Optimization (MDO) methodology.

  14. Testing the Informational Efficiency on the Romanian Financial Market

    Directory of Open Access Journals (Sweden)

    Aurora Murgea

    2006-03-01

The classical models of portfolio selection cannot be applied in a market where the efficient market hypothesis is not valid (at least in a “weak” sense). The aim of this paper is to highlight the difficulties of portfolio construction in a financial market with institutional and structural deficiencies, like the Romanian one, and to propose an alternative approach to the problem. The main features of our analysis are: (1) an empirical test of the efficient market hypothesis for the Romanian financial market; (2) a critical distinction between the concept of “risk” and the concept of “incertitude”; (3) the use of the individual yield/risk ratio versus the market one as a selection variable; (4) the renunciation of the use of a “non-risky” asset in the selection procedure; (5) an example of the proposed selection procedure. The output of this approach can be summarized by the thesis that, even when the financial market is affected by severe dysfunctions, it is possible to build an “optimal” portfolio based on a yield-risk arbitrage inside an efficiency frontier and to obtain a “good” scheme for a financial placement, in spite of the limited possibilities for efficient portfolio management.

  15. Testing the Informational Efficiency on the Romanian Financial Market

    Directory of Open Access Journals (Sweden)

    Bogdan Dima

    2006-01-01

The classical models of portfolio selection cannot be applied in a market where the efficient market hypothesis is not valid (at least in a "weak" sense). The aim of this paper is to highlight the difficulties of portfolio construction in a financial market with institutional and structural deficiencies, like the Romanian one, and to propose an alternative approach to the problem. The main features of our analysis are: (1) an empirical test of the efficient market hypothesis for the Romanian financial market; (2) a critical distinction between the concept of "risk" and the concept of "incertitude"; (3) the use of the individual yield/risk ratio versus the market one as a selection variable; (4) the renunciation of the use of a "non-risky" asset in the selection procedure; (5) an example of the proposed selection procedure. The output of this approach can be summarized by the thesis that, even when the financial market is affected by severe dysfunctions, it is possible to build an "optimal" portfolio based on a yield-risk arbitrage inside an efficiency frontier and to obtain a "good" scheme for a financial placement, in spite of the limited possibilities for efficient portfolio management.
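The selection variable proposed in this line of work, the individual yield/risk ratio, can be read as mean return over return volatility. The sketch below is an illustrative reading of that criterion, not the authors' exact procedure:

```python
import statistics

def yield_risk_ratio(returns):
    """Average return divided by its (population) standard deviation, the 'risk'."""
    return statistics.mean(returns) / statistics.pstdev(returns)

def rank_by_yield_risk(assets):
    """Rank assets (name -> return history) by descending yield/risk ratio."""
    return sorted(assets, key=lambda name: yield_risk_ratio(assets[name]), reverse=True)
```

An asset with modest but steady returns can outrank one with a higher average return and much larger swings, which is the point of using the ratio rather than raw yield as the selection variable.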

  16. Efficient Data Generation and Publication as a Test Tool

    Science.gov (United States)

    Einstein, Craig Jakob

    2017-01-01

    A tool to facilitate the generation and publication of test data was created to test the individual components of a command and control system designed to launch spacecraft. Specifically, this tool was built to ensure messages are properly passed between system components. The tool can also be used to test whether the appropriate groups have access (read/write privileges) to the correct messages. The messages passed between system components take the form of unique identifiers with associated values. These identifiers are alphanumeric strings that identify the type of message and the additional parameters that are contained within the message. The values that are passed with the message depend on the identifier. The data generation tool allows for the efficient creation and publication of these messages. A configuration file can be used to set the parameters of the tool and also specify which messages to pass.

  17. Biomass Gasification for Power Generation Internal Combustion Engines. Process Efficiency

    International Nuclear Information System (INIS)

    Lesme-Jaén, René; Garcia-Faure, Luis; Oliva-Ruiz, Luis; Pajarín-Rodríguez, Juan; Revilla-Suarez, Dennis

    2016-01-01

Biomass is one of the renewable energy sources with the greatest worldwide prospects, owing to its potential and its lower environmental impact compared to fossil fuels. Through different processes and energy conversion technologies it is possible to obtain solid, liquid and gaseous fuels from any biomass. In this paper the evaluation of the thermal and overall efficiency of the gasification plant of the Integral Forestry Company of Santiago de Cuba is presented, designed for electricity generation from forest industry waste. The gasifier is a downdraft reactor, COMBO-80 model of Indian manufacture, coupled to a diesel engine (Leyland model) modified to run on producer gas. The evaluation was conducted at different loads (electric power generated) of the engine, based on experimental measurements of the flow and composition of the gas supplied to the engine. The results show that the engine operates with a thermal efficiency in the range of 20-32% and an overall efficiency between 12% and 25%. (author)
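The two figures reported, engine thermal efficiency and overall efficiency, relate as a simple chain: overall efficiency is electrical output over biomass energy input, while engine thermal efficiency is electrical output over the chemical energy of the producer gas. The sketch below uses illustrative flow and heating-value numbers, not measurements from the Santiago de Cuba plant.

```python
def engine_thermal_eff(p_el_kw, gas_flow_nm3_h, gas_lhv_mj_nm3):
    """Electrical output over chemical energy of the producer gas fed to the engine."""
    gas_power_kw = gas_flow_nm3_h * gas_lhv_mj_nm3 * 1000.0 / 3600.0  # MJ/h -> kW
    return p_el_kw / gas_power_kw

def overall_eff(p_el_kw, biomass_kg_h, biomass_lhv_mj_kg):
    """Electrical output over chemical energy of the biomass fed to the gasifier."""
    biomass_power_kw = biomass_kg_h * biomass_lhv_mj_kg * 1000.0 / 3600.0
    return p_el_kw / biomass_power_kw
```

With the assumed numbers (40 kW electrical, 120 Nm³/h of gas at 5 MJ/Nm³, 40 kg/h of biomass at 16 MJ/kg) this gives a thermal efficiency of 24% and an overall efficiency of 22.5%, both inside the ranges reported in the paper.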

  18. GPC Light Shaper for energy efficient laser materials processing

    DEFF Research Database (Denmark)

    Bañas, Andrew Rafael; Palima, Darwin; Villangca, Mark Jayson

    The biggest use of lasers is in materials processing. In manufacturing, lasers are used for cutting, drilling, marking and other machining processes. Similarly, lasers are important in microfabrication processes such as photolithography, direct laser writing, or ablation. Lasers are advantageous...... with steep, well defined edges that would further increase laser cutting precision or allow “single shot” laser engraving of arbitrary 2D profiles, as opposed to point scanning [3,4]. Instead of lossy approaches, GPC beam shaping is achieved with simplified, binary phase-only optics [5] that redistributes...... because they do not wear out, have no physical contact with the processed material, avoid heating or warping effects, and are generally more precise. Since lasers are easier to adapt to different optimized shapes, they can be even more precise and energy efficient for materials processing. The cost...

  19. Testing the efficiency of the wine market using unit root tests with sharp and smooth breaks

    Directory of Open Access Journals (Sweden)

    Elie Bouri

    2017-12-01

This paper examines the efficient market hypothesis for the wine market using a novel unit root test while accounting for sharp shifts and smooth breaks in the monthly data. We find evidence of structural shifts and nonlinearity in the wine indices. Contrary to the results from conventional linear unit root tests, when we account for sharp shifts and smooth breaks, the unit root null for each of the wine indices is rejected. Overall, our results suggest that the wine market is inefficient when we incorporate breaks. We provide some practical and policy implications of our findings. Keywords: Wine market, Efficiency, Sharp and smooth breaks, Unit root tests
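The unit root logic behind such tests can be illustrated with a plain Dickey-Fuller-style regression, without the sharp and smooth break terms the paper adds: regress Δy_t on y_{t-1} and check whether the slope is significantly negative. This is a pedagogical sketch, not the novel test used in the study.

```python
def dickey_fuller(y):
    """OLS slope b and its t-statistic for: dy_t = a + b * y_{t-1} + e_t.
    A strongly negative t-statistic speaks against a unit root."""
    x = y[:-1]
    dy = [y[t] - y[t - 1] for t in range(1, len(y))]
    n = len(dy)
    mx, md = sum(x) / n, sum(dy) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    b = sum((xi - mx) * (di - md) for xi, di in zip(x, dy)) / sxx
    a = md - b * mx
    resid = [di - a - b * xi for xi, di in zip(x, dy)]
    s2 = sum(e * e for e in resid) / (n - 2)  # residual variance
    return b, b / (s2 / sxx) ** 0.5
```

For a stationary series the estimated slope is clearly negative (mean reversion), while for a random walk it hovers near zero, which is why the sign and size of the t-statistic carry the efficiency evidence.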

  20. Application Research on Testing Efficiency of Main Drainage Pump in Coal Mine Using Thermodynamic Theories

    Directory of Open Access Journals (Sweden)

    Deyong Shang

    2017-01-01

The efficiency of a drainage pump should be tested at regular intervals to track the condition of the pump in real time and thus achieve the goal of saving energy. The ultrasonic flowmeter method is traditionally used to measure the flow of the pump, but this method has some drawbacks in underground coal mines. This paper first introduces the principle of testing the efficiency of main drainage pumps in coal mines using thermodynamic theories, then analyzes the energy transformation during the process of draining water, and finally derives calculation formulae for the pump efficiency that meet on-site engineering precision requirements. On the basis of this analysis, a protective sleeve and a base for the temperature sensor are designed to measure the water temperature at the inlet and outlet of the pump. The efficiencies of pumps of two specifications are measured using both the thermodynamic method and the ultrasonic flowmeter method. The comparison shows that the thermodynamic method can satisfy the testing accuracy requirements for high-flow, high-lift drainage pumps at normal temperatures. Moreover, some measures to improve the accuracy of testing the pump efficiency are summarized, which are of guiding significance for on-site testing of main drainage pump efficiency in coal mines.
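The core thermodynamic idea, that pump losses show up as a small temperature rise of the pumped water while useful work corresponds to the lift head, can be condensed into one formula. The sketch below neglects heat exchange with the surroundings and pressure-correction terms, so it is a simplified illustration of the general method rather than the paper's derived formulae.

```python
G = 9.81     # gravitational acceleration, m/s^2
CP = 4186.0  # specific heat of water, J/(kg*K)

def pump_efficiency(head_m, delta_t_k):
    """Efficiency from lift head and inlet-to-outlet water temperature rise.
    Per kg of water: useful work is g*H, losses appear as cp*dT of heating."""
    useful = G * head_m
    losses = CP * delta_t_k
    return useful / (useful + losses)
```

For an assumed 300 m lift with a measured 0.2 K temperature rise this gives an efficiency of about 78%. Note how a temperature resolution of a few millikelvin is needed for precise results, which is why a dedicated sensor mounting matters.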

  1. Acute physical exercise affected processing efficiency in an auditory attention task more than processing effectiveness.

    Science.gov (United States)

    Dutke, Stephan; Jaitner, Thomas; Berse, Timo; Barenberg, Jonathan

    2014-02-01

    Research on effects of acute physical exercise on performance in a concurrent cognitive task has generated equivocal evidence. Processing efficiency theory predicts that concurrent physical exercise can increase resource requirements for sustaining cognitive performance even when the level of performance is unaffected. This hypothesis was tested in a dual-task experiment. Sixty young adults worked on a primary auditory attention task and a secondary interval production task while cycling on a bicycle ergometer. Physical load (cycling) and cognitive load of the primary task were manipulated. Neither physical nor cognitive load affected primary task performance, but both factors interacted on secondary task performance. Sustaining primary task performance under increased physical and/or cognitive load increased resource consumption as indicated by decreased secondary task performance. Results demonstrated that physical exercise effects on cognition might be underestimated when only single task performance is the focus.

  2. The Efficiency of Halal Processed Food Industry in Malaysia

    Directory of Open Access Journals (Sweden)

    Mohd Ali Mohd Noor

    2016-06-01

Efficiency is indispensable for an industry to ensure cost reduction and profit maximization. It also helps the industry to be competitive and remain in the market. In 2010, Malaysia aimed to become the world halal hub. The hub was to capture at least five percent of the world halal market with at least 10,000 exporting firms. However, the target was not met, due in part to the small number of efficient firms, which in turn limited the number of exporting firms. Thus, this study aimed to measure the efficiency of the halal processed food industry in Malaysia using Data Envelopment Analysis (DEA). The input variables used were local raw inputs, labour, and monetary assets of the halal food industry in Malaysia, while the output used was the total sales revenue of the halal industry in Malaysia. The study shows that very few industries are efficient in each category, led by the meat, dairy, cordials and juices, marine products, food crops, and grains industries. Therefore, the government needs to emphasize industry efficiency for Malaysia to be competitive and become the world halal hub in the future.

  3. Overall equipment efficiency of Flexographic Printing process: A case study

    Science.gov (United States)

    Zahoor, S.; Shehzad, A.; Mufti, NA; Zahoor, Z.; Saeed, U.

    2017-12-01

    This paper reports the efficiency improvement of a flexographic printing machine, achieved by reducing breakdown time with the help of a total productive maintenance measure called overall equipment efficiency (OEE). The methodology comprises calculating the OEE of the machine before and after identifying the causes of the problems. A Pareto diagram is used to prioritize the main problem areas, and the 5-whys analysis approach is used to identify their root causes. The OEE of the process improved from 34% to 40.2% over a 30-day period. It is concluded that the OEE and 5-whys analysis techniques are useful for improving equipment effectiveness and for continuous process improvement.
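    The OEE measure used in the case study above is the standard TPM product of three factors. A minimal sketch follows; the individual factor values below are illustrative assumptions, since the paper reports only the aggregate figures (34% and 40.2%):

```python
# OEE = Availability x Performance x Quality (standard TPM definition).
def oee(availability, performance, quality):
    return availability * performance * quality

# Illustrative factor values only; the case study reports the aggregate
# OEE moving from 34% to 40.2%, not the individual factors.
before = oee(0.50, 0.80, 0.85)    # ~0.34
after = oee(0.60, 0.80, 0.8375)   # ~0.402
```

    Reducing breakdown time acts on the availability factor, which is why a maintenance-focused intervention moves the aggregate score.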

  4. Signal and image processing for monitoring and testing at EDF

    International Nuclear Information System (INIS)

    Georgel, B.; Garreau, D.

    1992-04-01

    The quality of monitoring and nondestructive testing devices in plants and utilities today greatly depends on efficient processing of signal and image data. In this context, signal and image processing techniques, such as adaptive filtering, detection, or 3D reconstruction, are required whenever manufacturing nonconformances or faulty operation must be recognized and identified. This paper reviews the issues of industrial image and signal processing by briefly considering the relevant studies and projects under way at EDF. (authors). 1 fig., 11 refs

  5. Cost-efficient enactment of stream processing topologies

    Directory of Open Access Journals (Sweden)

    Christoph Hochreiner

    2017-12-01

    Full Text Available The continuous increase of unbounded streaming data poses several challenges to established data stream processing engines. One of the most important challenges is the cost-efficient enactment of stream processing topologies under changing data volume. These changing volumes impose different loads on stream processing systems, whose resource provisioning needs to be continuously updated at runtime. First approaches already allow for resource provisioning at the level of virtual machines (VMs), but this only permits coarse resource provisioning strategies. Based on current advances in, and the benefits of, containerized software systems, we have designed a cost-efficient resource provisioning approach and integrated it into the runtime of the Vienna ecosystem for elastic stream processing. Our resource provisioning approach aims to maximize the resource usage of VMs obtained from cloud providers. This strategy releases processing capabilities only at the end of a VM's minimal leasing duration, instead of releasing them eagerly as soon as possible, as is the case for threshold-based approaches. This strategy allows us to improve service level agreement compliance by up to 25% and to reduce operational cost by up to 36%.

  6. Business Process Design Of An Efficient And Effective Literature Review

    Directory of Open Access Journals (Sweden)

    Sayuthi

    2015-08-01

    Full Text Available The objective of this article is to design an organization's business processes efficiently and effectively. Based on our literature review, the business process design best suited for an organization is that of Harrington (1992), namely the concept of Business Process Improvement (BPI), a systematic framework that helps organizations make significant progress in the implementation of business processes. BPI provides a system that simplifies or streamlines business processes, giving assurance that the internal and external customers of the organization will get better output. One advantage of the BPI concept suggested by Harrington is continuous improvement, an idea that other authors and experts on BPI have not recognized. With this idea, the products and services offered by the organization become more innovative.

  7. Quality Indicators for the Total Testing Process.

    Science.gov (United States)

    Plebani, Mario; Sciacovelli, Laura; Aita, Ada

    2017-03-01

    ISO 15189:2012 requires the use of quality indicators (QIs) to monitor and evaluate all steps of the total testing process, but several difficulties dissuade laboratories from effective and continuous use of QIs in routine practice. An International Federation of Clinical Chemistry and Laboratory Medicine working group addressed this problem and implemented a project to develop a model of QIs to be used in clinical laboratories worldwide to monitor and evaluate all steps of the total testing process, and decrease error rates and improve patient services in laboratory testing. All laboratories are invited, at no cost, to enroll in the project and contribute to harmonized management at the international level. Copyright © 2016 Elsevier Inc. All rights reserved.

  8. Business process reengineering and Nigerian banking system efficiency

    Directory of Open Access Journals (Sweden)

    John N. N. Ugoani

    2017-12-01

    Full Text Available Prior to 2000, before banks in Nigeria embraced business process reengineering (BPR), the Nigerian banking system (NBS) was inefficient, characterized by frauds, long queues, nonperforming loans, illiquidity and distress. As one way of overcoming these challenges, banks began to focus on BPR as a veritable tool to drive efficiency, customer satisfaction and improved shareholder value. With the advent of BPR and process improvement, efficiency gradually returned to the NBS. Against the pre-reengineering era, when the liquidity ratio of the NBS was minus 15.92 percent in 1996 and no bank met the 30 percent minimum prudential requirement, the NBS posted a positive average liquidity ratio of 65.69 percent in 2011, with all banks meeting the 30 percent minimum. The banks that introduced BPR early in the 2000s have remained free of distress: liquid and efficient, with high growth in gross earnings, total assets, profitability and total equity. A research design was deployed for the study, and it was found that BPR has a positive effect on NBS efficiency.

  9. Research on efficiency test of a turbine in Khan Khwar hydropower station

    International Nuclear Information System (INIS)

    Zhang, H K; Liang, Z; Deng, M G; Liu, X B; Wang, H Y; Liu, D M

    2012-01-01

    The efficiency test is an important means of evaluating the energy conversion performance of a hydraulic turbine. For hydropower stations that lack the conditions for direct flow measurement, whether the characteristic curve of a turbine obtained through similarity-theory conversion from the comprehensive characteristic curve of the turbine can correctly reflect the operating performance of the prototype unit is a key issue in the industry. Taking the No.1 unit of Khan Khwar hydropower station as the example, the efficiency test of this turbine was studied on site, including the measurement method for the test parameters, the configuration of the computer test system, and the processing and analysis of the test data.
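    The quantity such a field test estimates is the ratio of generated shaft power to the hydraulic power available in the water. A minimal sketch of the textbook relation (the numbers below are illustrative, not results from the Khan Khwar test):

```python
# Turbine efficiency: eta = P_shaft / (rho * g * Q * H)
RHO_WATER = 1000.0  # water density, kg/m^3
G = 9.81            # gravitational acceleration, m/s^2

def hydraulic_power_w(flow_m3s, head_m):
    # Power available in the water passing the turbine.
    return RHO_WATER * G * flow_m3s * head_m

def turbine_efficiency(shaft_power_w, flow_m3s, head_m):
    return shaft_power_w / hydraulic_power_w(flow_m3s, head_m)

# Illustrative figures: 100 m^3/s at 60 m net head, 53 MW measured output.
eta = turbine_efficiency(53.0e6, 100.0, 60.0)   # roughly 0.90
```

    The difficulty the abstract points to is that Q (flow) is the hard term to measure on site, which is why model-based characteristic curves are used in its place.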

  10. Efficient multitasking: parallel versus serial processing of multiple tasks.

    Science.gov (United States)

    Fischer, Rico; Plessow, Franziska

    2015-01-01

    In the context of performance optimizations in multitasking, a central debate has unfolded in multitasking research around whether cognitive processes related to different tasks proceed only sequentially (one at a time), or can operate in parallel (simultaneously). This review features a discussion of theoretical considerations and empirical evidence regarding parallel versus serial task processing in multitasking. In addition, we highlight how methodological differences and theoretical conceptions determine the extent to which parallel processing in multitasking can be detected, to guide their employment in future research. Parallel and serial processing of multiple tasks are not mutually exclusive. Therefore, questions focusing exclusively on either task-processing mode are too simplified. We review empirical evidence and demonstrate that shifting between more parallel and more serial task processing critically depends on the conditions under which multiple tasks are performed. We conclude that efficient multitasking is reflected by the ability of individuals to adjust multitasking performance to environmental demands by flexibly shifting between different processing strategies of multiple task-component scheduling.

  11. Lignocellulosic ethanol: Technology design and its impact on process efficiency.

    Science.gov (United States)

    Paulova, Leona; Patakova, Petra; Branska, Barbora; Rychtera, Mojmir; Melzoch, Karel

    2015-11-01

    This review provides current information on the production of ethanol from lignocellulosic biomass, with the main focus on relationships between process design and efficiency, expressed as ethanol concentration, yield and productivity. In spite of unquestionable advantages of lignocellulosic biomass as a feedstock for ethanol production (availability, price, non-competitiveness with food, waste material), many technological bottlenecks hinder its wide industrial application and competitiveness with 1st generation ethanol production. Among the main technological challenges are the recalcitrant structure of the material, and thus the need for extensive pretreatment (usually physico-chemical followed by enzymatic hydrolysis) to yield fermentable sugars, and a relatively low concentration of monosaccharides in the medium that hinder the achievement of ethanol concentrations comparable with those obtained using 1st generation feedstocks (e.g. corn or molasses). The presence of both pentose and hexose sugars in the fermentation broth, the price of cellulolytic enzymes, and the presence of toxic compounds that can inhibit cellulolytic enzymes and microbial producers of ethanol are major issues. In this review, different process configurations of the main technological steps (enzymatic hydrolysis, fermentation of hexose and/or pentose sugars) are discussed and their efficiencies are compared. The main features, benefits and drawbacks of simultaneous saccharification and fermentation (SSF), simultaneous saccharification and fermentation with delayed inoculation (dSSF), consolidated bioprocesses (CBP) combining production of cellulolytic enzymes, hydrolysis of biomass and fermentation into one step, together with an approach combining utilization of both pentose and hexose sugars, are discussed and compared with separate hydrolysis and fermentation (SHF) processes. The impact of individual technological steps on final process efficiency is emphasized and the potential for use

  12. Emergency preparedness planning: A process to insure effectiveness and efficiency

    International Nuclear Information System (INIS)

    Schroeder, A.J. Jr.

    1994-01-01

    Prevention is undoubtedly the preferred policy regarding emergency response. Unfortunately, despite best intentions, emergencies do occur. It is the prudent operator that has well-written and well-exercised plans in place to respond to the full suite of possible situations. This paper presents a planning process to help personnel develop and/or maintain emergency management capability. It is equally applicable at the field location, the district/regional office, or the corporate headquarters. It is not limited in scope and can be useful for planners addressing incidents ranging from fires, explosions, spills/releases, computer system failures and terrorist threats to natural disasters. By following the steps in the process diagram, the planner will document emergency management capability in a logical and efficient manner, which should result in effective emergency response and recovery plans. The astute planner will immediately see that the process presented is a continuing one, fully compatible with the principles of continuous improvement.

  13. Parallel processing based decomposition technique for efficient collaborative optimization

    International Nuclear Information System (INIS)

    Park, Hyung Wook; Kim, Sung Chan; Kim, Min Soo; Choi, Dong Hoon

    2001-01-01

    In practical design studies, most designers solve multidisciplinary problems within large and complex design systems. These multidisciplinary problems involve hundreds of analyses and thousands of variables. The sequence of processes used to solve these problems affects the speed of the total design cycle. Thus it is very important for the designer to reorder the original design processes to minimize total computational cost. This is accomplished by decomposing a large multidisciplinary problem into several MultiDisciplinary Analysis SubSystems (MDASS) and processing them in parallel. This paper proposes a new strategy for parallel decomposition of multidisciplinary problems, using a genetic algorithm to raise design efficiency, and shows the relationship between decomposition and Multidisciplinary Design Optimization (MDO) methodology.

  14. Efficient processing of MPEG-21 metadata in the binary domain

    Science.gov (United States)

    Timmerer, Christian; Frank, Thomas; Hellwagner, Hermann; Heuer, Jörg; Hutter, Andreas

    2005-10-01

    XML-based metadata is widely adopted across different communities, and plenty of commercial and open-source tools for processing and transforming it are available on the market. However, all of these tools have one thing in common: they operate on plain-text-encoded metadata, which may become a burden in constrained and streaming environments, i.e., when metadata needs to be processed together with multimedia content on the fly. In this paper we present an efficient approach for transforming this kind of metadata, encoded using MPEG's Binary Format for Metadata (BiM), without additional en-/decoding overheads, i.e., within the binary domain. To this end, we have developed an event-based push parser for BiM-encoded metadata which transforms the metadata with a limited set of processing instructions - based on traditional XML transformation techniques - operating on bit patterns instead of cost-intensive string comparisons.

  15. Quality and efficiency successes leveraging IT and new processes.

    Science.gov (United States)

    Chaiken, Barry P; Christian, Charles E; Johnson, Liz

    2007-01-01

    Today, healthcare annually invests billions of dollars in information technology, including clinical systems, electronic medical records and interoperability platforms. While continued investment and parallel development of standards are critical to secure exponential benefits from clinical information technology, intelligent and creative redesign of processes through path innovation is necessary to deliver meaningful value. Reports from two organizations included in this report review the steps taken to reinvent clinical processes that best leverage information technology to deliver safer and more efficient care. Good Samaritan Hospital, Vincennes, Indiana, implemented electronic charting, point-of-care bar coding of medications prior to administration, and integrated clinical documentation for nursing, laboratory, radiology and pharmacy. Tenet Healthcare, during its implementation and deployment of multiple clinical systems across several hospitals, focused on planning that included team-based process redesign. In addition, Tenet constructed valuable and measurable metrics that link outcomes with its strategic goals.

  16. Microbial electrolytic disinfection process for highly efficient Escherichia coli inactivation

    DEFF Research Database (Denmark)

    Zhou, Shaofeng; Huang, Shaobin; Li, Xiaohu

    2018-01-01

    Water quality deterioration caused by a wide variety of recalcitrant organics and pathogenic microorganisms has become a serious concern worldwide. Bio-electro-Fenton systems have been considered a cost-effective and highly efficient water treatment platform technology. While they have been extensively studied for recalcitrant organics removal, their application potential towards water disinfection (e.g., inactivation of pathogens) is still unknown. This study investigated the inactivation of Escherichia coli in a microbial electrolysis cell based bio-electro-Fenton system (renamed as microbial electrolytic disinfection). •OH was identified as one potential mechanism for disinfection. This study successfully demonstrated the feasibility of the bio-electro-Fenton process for pathogen inactivation, which offers insight for the future development of sustainable, efficient, and cost-effective biological water treatment technology.

  17. A Foundation for Efficient Indoor Distance-Aware Query Processing

    DEFF Research Database (Denmark)

    Lu, Hua; Cao, Xin; Jensen, Christian Søndergaard

    2012-01-01

    Indoor spaces accommodate large numbers of spatial objects, e.g., points of interest (POIs), and moving populations. A variety of services, e.g., location-based services and security control, are relevant to indoor spaces. Such services can be improved substantially if they are capable of utilizing indoor distances. We present a model that integrates indoor distance seamlessly. To enable the use of the model as a foundation for query processing, we develop accompanying, efficient algorithms that compute indoor distances for different indoor entities like doors as well as locations. We also propose an indexing framework that accommodates indoor distances pre-computed using the proposed algorithms. On top of this foundation, we develop efficient algorithms for typical indoor, distance-aware queries. The results of an extensive experimental evaluation demonstrate the efficacy of the proposals.

  18. The efficient market hypothesis: problems with interpretations of empirical tests

    Directory of Open Access Journals (Sweden)

    Denis Alajbeg

    2012-03-01

    Full Text Available Despite many “refutations” in empirical tests, the efficient market hypothesis (EMH remains the central concept of financial economics. The EMH’s resistance to the results of empirical testing emerges from the fact that the EMH is not a falsifiable theory. Its axiomatic definition shows how asset prices would behave under assumed conditions. Testing for this price behavior does not make much sense as the conditions in the financial markets are much more complex than the simplified conditions of perfect competition, zero transaction costs and free information used in the formulation of the EMH. Some recent developments within the tradition of the adaptive market hypothesis are promising regarding development of a falsifiable theory of price formation in financial markets, but are far from giving assurance that we are approaching a new formulation. The most that can be done in the meantime is to be very cautious while interpreting the empirical evidence that is presented as “testing” the EMH.

  19. Reduced Syntactic Processing Efficiency in Older Adults During Sentence Comprehension

    Directory of Open Access Journals (Sweden)

    Zude Zhu

    2018-03-01

    Full Text Available Researchers have frequently reported an age-related decline in semantic processing during sentence comprehension. However, it remains unclear whether syntactic processing also declines or whether it remains constant as people age. In the present study, 26 younger adults and 20 older adults were recruited and matched in terms of working memory, general intelligence, verbal intelligence and fluency. They were then asked to make semantic acceptability judgments while completing a Chinese sentence reading task. The behavioral results revealed that the older adults had significantly lower accuracy on measures of semantic and syntactic processing compared to the younger adults. Event-related potential (ERP) results showed that during semantic processing, older adults had a significantly reduced amplitude and delayed peak latency of the N400 compared to the younger adults. During syntactic processing, older adults also showed delayed peak latency of the P600 relative to younger adults. Moreover, while P600 amplitude was comparable between the two age groups, larger P600 amplitude was associated with worse performance only in the older adults. Together, the behavioral and ERP data suggest that there is an age-related decline in both semantic and syntactic processing, with a trend toward lower efficiency in syntactic ability.

  20. Auditory processing efficiency deficits in children with developmental language impairments

    Science.gov (United States)

    Hartley, Douglas E. H.; Moore, David R.

    2002-12-01

    The "temporal processing hypothesis" suggests that individuals with specific language impairments (SLIs) and dyslexia have severe deficits in processing rapidly presented or brief sensory information, both within the auditory and visual domains. This hypothesis has been supported through evidence that language-impaired individuals have excess auditory backward masking. This paper presents an analysis of masking results from several studies in terms of a model of temporal resolution. Results from this modeling suggest that the masking results can be better explained by an "auditory efficiency" hypothesis. If impaired or immature listeners have a normal temporal window, but require a higher signal-to-noise level (poor processing efficiency), this hypothesis predicts the observed small deficits in the simultaneous masking task, and the much larger deficits in backward and forward masking tasks amongst those listeners. The difference in performance on these masking tasks is predictable from the compressive nonlinearity of the basilar membrane. The model also correctly predicts that backward masking (i) is more prone to training effects, (ii) has greater inter- and intrasubject variability, and (iii) increases less with masker level than do other masking tasks. These findings provide a new perspective on the mechanisms underlying communication disorders and auditory masking.

  1. Recovery efficiency test project, Phase 2 activity report

    Energy Technology Data Exchange (ETDEWEB)

    Overbey, W.K. Jr.; Salamy, S.P.; Locke, C.D.

    1989-02-01

    The Recovery Efficiency Test well project addressed a number of technical issues. The primary objective was to determine the increased gas recovery efficiency of a long horizontal wellbore over that of a vertical wellbore and, more specifically, what improvements can be expected from inducing multiple hydraulic fractures from such a wellbore. This volume contains appendices for: (1) supporting material and procedures for "data frac" stimulation of zone 6 using nitrogen and nitrogen foam; (2) supporting material and procedures for stimulation no. 1, a nitrogen gas frac on zone no. 1; (3) supporting material and procedures for stimulation no. 2 in zone no. 1 using liquid CO{sub 2}; (4) supporting material and procedures for frac no. 3 on zone no. 1 using nitrogen foam and proppant; (5) supporting material and procedures for stimulation no. 4 in zones 2--3 and 4 using nitrogen foam and proppant; (6) supporting materials and procedures for stimulation no. 5 in zones 5 and 8; and (7) fracture diagnostics reports and supporting materials.

  2. Efficient Option Pricing under Levy Processes, with CVA and FVA

    Directory of Open Access Journals (Sweden)

    Jimmy eLaw

    2015-07-01

    Full Text Available We generalize the Piterbarg (2010) model to include (1) bilateral default risk, as in Burgard and Kjaer (2012), and (2) jumps in the dynamics of the underlying asset, using general classes of Lévy processes of exponential type. We develop an efficient explicit-implicit scheme for European options and barrier options, taking CVA and FVA into account. We highlight the importance of this work in the context of trading, pricing and management of a derivative portfolio given the trajectory of regulations.

  3. Women process multisensory emotion expressions more efficiently than men.

    Science.gov (United States)

    Collignon, O; Girard, S; Gosselin, F; Saint-Amour, D; Lepore, F; Lassonde, M

    2010-01-01

    Despite claims in the popular press, experiments investigating whether female observers are more efficient than male observers at processing expressions of emotion have produced inconsistent findings. In the present study, participants were asked to categorize fear and disgust expressions displayed auditorily, visually, or audio-visually. Results revealed an advantage for women in all conditions of stimulus presentation. We also observed more nonlinear probabilistic summation in the bimodal conditions in female than in male observers, indicating greater neural integration of different sensory-emotional information. These findings indicate robust differences between genders in the multisensory perception of emotion expression.

  4. Increasing operational efficiency in a radioactive waste processing plant - 16100

    International Nuclear Information System (INIS)

    Turner, T.W.; Watson, S.N.

    2009-01-01

    The solid waste plant at Harwell in Oxfordshire contains a purpose-built facility to input, assay, visually inspect and sort remote-handled intermediate level radioactive waste (RHILW). The facility includes a suite of remote handling cells, known as the head-end cells (HEC), which waste must pass through in order to be repackaged. Some newly created waste from decommissioning works on site passes through the cells, but the vast majority of waste for processing is historical waste, stored in below-ground tube stores. Existing containers are not suitable for long-term storage, and many are already badly corroded, so the waste must be efficiently processed and repackaged in order to achieve passive safety. The Harwell site is currently being decommissioned and the land is being restored. The site is being progressively de-licensed and redeveloped as a business park, which can only be completed when all the nuclear liabilities have been removed. The recovery and processing of old waste in the solid waste plant is a key project linked to de-licensing of a section of the site. Increasing the operational efficiency of the waste processing plant could shorten the time needed to clear the site and has the potential to save money for the Nuclear Decommissioning Authority (NDA). The waste processing facility was constructed in the mid-1990s and commissioned in 1999. Since operations began, the yearly throughput of the cells has increased significantly every year. To achieve targets set out in the lifetime plan (LTP) for the site, throughput must continue to increase. The operations department has measured the overall equipment effectiveness (OEE) of the process for the last few years, and has used continuous improvement techniques to decrease the average cycle time. Philosophies from operational management practices such as 'lean' and 'kaizen' have been employed successfully to drive out losses and increase plant efficiency. This paper will describe how the solid waste plant

  5. Pilot-scale tests of HEME and HEPA dissolution process

    International Nuclear Information System (INIS)

    Qureshi, Z.H.; Strege, D.K.

    1996-01-01

    A series of pilot-scale demonstration tests for the dissolution of High Efficiency Mist Eliminators (HEMEs) and High Efficiency Particulate Air filters (HEPA) were performed on a 1/5th linear scale. These filters are to be used in the Defense Waste Processing Facility (DWPF) to decontaminate the effluents from the off-gases generated during the feed preparation process and vitrification. When removed, these radioactively contaminated filters will be dissolved using caustic solutions. As a result of these tests, a simple dissolution process was developed. In this process, the contaminated filter is first immersed in boiling 5% caustic solution for 24 hours, and then water is sprayed on the filter. These steps break the filter down first chemically and then mechanically. The metal cage is rinsed and considered low-level waste. The dissolved filter is pumpable and is mixed with high-level waste. Compared to earlier dissolution studies using caustic-acid-caustic solutions, the proposed method represents a 66% saving in cycle time and in the amount of liquid waste generated. This paper provides the details of the filter mockups and the results of the dissolution tests.

  6. An Applied Image Processing for Radiographic Testing

    International Nuclear Information System (INIS)

    Ratchason, Surasak; Tuammee, Sopida; Srisroal Anusara

    2005-10-01

    Applied image processing for radiographic testing (RT) is desirable because it reduces the time and cost of an inspection process that otherwise requires experienced workers, and it improves inspection quality. This paper presents a primary study of image processing for RT films, namely weld films. The proposed approach determines the defects in weld images. The BMP image files are opened and processed by a computer program written in Borland C++. The software has five main methods: Histogram, Contrast Enhancement, Edge Detection, Image Segmentation and Image Restoration. Each main method has several sub-methods available as selectable options. The results showed that the software can effectively detect defects and that different methods suit different radiographic images. Furthermore, image improvement is better when two methods are combined.

  7. Non-Markovian quantum processes: Complete framework and efficient characterization

    Science.gov (United States)

    Pollock, Felix A.; Rodríguez-Rosario, César; Frauenheim, Thomas; Paternostro, Mauro; Modi, Kavan

    2018-01-01

    Currently, there is no systematic way to describe a quantum process with memory solely in terms of experimentally accessible quantities. However, recent technological advances mean we have control over systems at scales where memory effects are non-negligible. The lack of such an operational description has hindered advances in understanding physical, chemical, and biological processes, where often unjustified theoretical assumptions are made to render a dynamical description tractable. This has led to theories plagued with unphysical results and no consensus on what a quantum Markov (memoryless) process is. Here, we develop a universal framework to characterize arbitrary non-Markovian quantum processes. We show how a multitime non-Markovian process can be reconstructed experimentally, and that it has a natural representation as a many-body quantum state, where temporal correlations are mapped to spatial ones. Moreover, this state is expected to have an efficient matrix-product-operator form in many cases. Our framework constitutes a systematic tool for the effective description of memory-bearing open-system evolutions.

  8. Examining the Islamic stock market efficiency: Evidence from nonlinear ESTAR unit root tests

    Directory of Open Access Journals (Sweden)

    Rahmat Heru Setianto

    2015-04-01

    Full Text Available This paper empirically examines the efficient market hypothesis (EMH) in an Islamic stock market, namely the Jakarta Islamic Index, emphasizing random walk behavior and nonlinearity. In the first step, we employ the Brock et al. (1996) test to examine the presence of nonlinear behavior in the Jakarta Islamic Index. The evidence of nonlinear behavior in the index motivates us to use the nonlinear ESTAR unit root test procedures recently developed by Kapetanios et al. (2003) and Kruse (2011). The nonlinear unit root tests fail to reject the null hypothesis of a unit root for the index, suggesting that the Jakarta Islamic Index is characterized by a random walk process, supporting the theory of the efficient market hypothesis. In addition, the Lumsdaine and Papell (LP) test identified significant structural breaks in the index series.
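    The Kapetanios et al. (2003) procedure is a t-test on δ in the auxiliary regression Δy_t = δ·y_{t-1}³ + ε_t, run on the demeaned series. Below is a minimal sketch of that statistic (illustrative, not the authors' code; in practice the statistic must be compared against the critical values tabulated in the KSS paper):

```python
import numpy as np

def kss_stat(y):
    """t-statistic on delta in: dy_t = delta * y_{t-1}^3 + e_t (demeaned case)."""
    y = np.asarray(y, dtype=float)
    y = y - y.mean()                  # KSS "case 2": demeaned data
    dy = np.diff(y)
    x = y[:-1] ** 3
    delta = (x @ dy) / (x @ x)        # OLS slope on the cubed lag
    resid = dy - delta * x
    s2 = (resid @ resid) / (len(dy) - 1)
    se = np.sqrt(s2 / (x @ x))
    return delta / se                 # compare with KSS critical values

# Simulated illustration: a stationary series should give a large negative
# statistic (reject unit root); a random walk should not.
rng = np.random.default_rng(42)
stationary = rng.standard_normal(500)
random_walk = np.cumsum(rng.standard_normal(500))
```

    A full application would also include lagged Δy terms to absorb serial correlation, as in the augmented version of the test.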

  9. Pilot-scale tests of HEME and HEPA dissolution process

    Energy Technology Data Exchange (ETDEWEB)

    Qureshi, Z.H.; Strege, D.K.

    1994-06-01

    A series of pilot-scale demonstration tests for the dissolution of High Efficiency Mist Eliminators (HEMEs) and High Efficiency Particulate Air filters (HEPA) were performed on a 1/5th linear scale. These fiberglass filters are to be used in the Defense Waste Processing Facility (DWPF) to decontaminate the effluents from the off-gases generated during the feed preparation process and vitrification. When removed, these filters will be dissolved in the Decontamination Waste Treatment Tank (DWTT) using a 5 wt% NaOH solution. The contaminated fiberglass is converted to an aqueous stream which will be transferred to the waste tanks. The filter metal structure will be rinsed with process water before its disposal as low-level solid waste. The pilot-scale study reported here successfully demonstrated a simple one-step process using a 5 wt% NaOH solution. The proposed process requires the installation of a new water spray ring with 30 nozzles. In addition to the reduced waste generated, the total process time is reduced to only 48 hours (a 66% saving in time). The pilot-scale tests clearly demonstrated that the dissolution process for HEMEs has two stages: chemical digestion of the filter and mechanical erosion of the digested filter. The digestion is achieved by a boiling 5 wt% caustic solution, whereas the mechanical breakdown of the digested filter is successfully achieved by spraying process water on the digested filter. An alternate method of breaking down the digested filter by increased air sparging of the solution was found to be marginally successful at best. The pilot-scale tests also demonstrated that the products of dissolution are easily pumpable by a centrifugal pump.

  10. Pilot-scale tests of HEME and HEPA dissolution process

    International Nuclear Information System (INIS)

    Qureshi, Z.H.; Strege, D.K.

    1994-06-01

    A series of pilot-scale demonstration tests for the dissolution of High Efficiency Mist Eliminators (HEMEs) and High Efficiency Particulate Air filters (HEPA) were performed on a 1/5th linear scale. These fiberglass filters are to be used in the Defense Waste Processing Facility (DWPF) to decontaminate the effluents from the off-gases generated during the feed preparation process and vitrification. When removed, these filters will be dissolved in the Decontamination Waste Treatment Tank (DWTT) using a 5 wt% NaOH solution. The contaminated fiberglass is converted to an aqueous stream which will be transferred to the waste tanks. The filter metal structure will be rinsed with process water before its disposal as low-level solid waste. The pilot-scale study reported here successfully demonstrated a simple one-step process using a 5 wt% NaOH solution. The proposed process requires the installation of a new water spray ring with 30 nozzles. In addition to the reduced waste generated, the total process time is reduced to only 48 hours (a 66% saving in time). The pilot-scale tests clearly demonstrated that the dissolution process for HEMEs has two stages: chemical digestion of the filter and mechanical erosion of the digested filter. The digestion is achieved by a boiling 5 wt% caustic solution, whereas the mechanical breakdown of the digested filter is successfully achieved by spraying process water on the digested filter. An alternate method of breaking down the digested filter by increased air sparging of the solution was found to be marginally successful at best. The pilot-scale tests also demonstrated that the products of dissolution are easily pumpable by a centrifugal pump.

  11. Zerodur polishing process for high surface quality and high efficiency

    International Nuclear Information System (INIS)

    Tesar, A.; Fuchs, B.

    1992-08-01

    Zerodur is a glass-ceramic composite of importance in applications where temperature instabilities influence optical and mechanical performance, such as in earthbound and spaceborne telescope mirror substrates. Polished Zerodur surfaces of high quality have been required for laser gyro mirrors. The polished surface quality of substrates affects the performance of high-reflection coatings. Thus, the interest in improving Zerodur polished surface quality has become more general. Beyond eliminating subsurface damage, high quality surfaces are produced by reducing the amount of hydrated material redeposited on the surface during polishing. With the proper control of polishing parameters, such surfaces exhibit roughnesses of < 1 Angstrom rms. Zerodur polishing was studied to recommend a high surface quality polishing process which could be easily adapted to standard planetary continuous polishing machines and spindles. This summary contains information on a polishing process developed at LLNL which reproducibly provides high quality polished Zerodur surfaces at very high polishing efficiencies.

  12. Testing an alternate informed consent process.

    Science.gov (United States)

    Yates, Bernice C; Dodendorf, Diane; Lane, Judy; LaFramboise, Louise; Pozehl, Bunny; Duncan, Kathleen; Knodel, Kendra

    2009-01-01

    One of the main problems in conducting clinical trials is low participation rate due to potential participants' misunderstanding of the rationale for the clinical trial or perceptions of loss of control over treatment decisions. The objective of this study was to test an alternate informed consent process in cardiac rehabilitation participants that involved the use of a multimedia flip chart to describe a future randomized clinical trial and then asked, hypothetically, if they would participate in the future trial. An attractive and inviting visual presentation of the study was created in the form of a 23-page flip chart that included 24 color photographs displaying information about the purpose of the study, similarities and differences between the two treatment groups, and the data collection process. We tested the flip chart in 35 cardiac rehabilitation participants. Participants were asked if they would participate in this future study on two occasions: immediately after the description of the flip chart and 24 hours later, after reading through the informed consent document. Participants were also asked their perceptions of the flip chart and consent process. Of the 35 participants surveyed, 19 (54%) indicated that they would participate in the future study. No participant changed his or her decision 24 hours later after reading the full consent form. The participation rate improved 145% over that of an earlier feasibility study where the recruitment rate was 22%. Most participants stated that the flip chart was helpful and informative and that the photographs were effective in communicating the purpose of the study. Participation rates could be enhanced in future clinical trials by using a visual presentation to explain and describe the study as part of the informed consent process. More research is needed to test alternate methods of obtaining informed consent.
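
The reported improvement can be verified directly from the abstract's own numbers: 19 of 35 participants (54%) against the earlier 22% recruitment rate.

```python
# Relative improvement in participation rate, using the abstract's figures.
new_rate = round(19 / 35, 2)      # 0.54, as reported
old_rate = 0.22                   # earlier feasibility-study recruitment rate
improvement = (new_rate - old_rate) / old_rate
print(f"relative improvement: {improvement:.0%}")   # prints 145%
```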

  13. ADDED VALUE AS EFFICIENCY CRITERION FOR INDUSTRIAL PRODUCTION PROCESS

    Directory of Open Access Journals (Sweden)

    L. M. Korotkevich

    2016-01-01

    Full Text Available A literature analysis has shown that the majority of researchers use classical efficiency criteria to construct optimization models for the production process: profit maximization; cost minimization; maximization of commercial product output; minimization of the backlog in product demand; minimization of the total time consumed by production changes. The paper proposes using an index of added value as the efficiency criterion because it combines the economic and social interests of all the main parties to the business activity: national government, property owners, employees, and investors. The following types of added value are considered in the paper: joint-stock, market, monetary, economic, and notional (gross, net, real). The paper suggests using an index of real value added as the efficiency criterion. This approach makes it possible to bring notional added value into a comparable form, because added value can be increased not only through efficiency improvements in enterprise activity but also through environmental factors, for example when export prices rise faster than import prices. Methods for calculating real value added (extrapolation, simple deflation, and double deflation) were analyzed on a country-by-country basis. On the basis of this analysis, the method of double deflation was selected; it is computed according to the Laspeyres, Paasche, and Fisher indices. It is concluded that the standard expressions do not fully reflect the economic peculiarities of the Republic of Belarus: they are inappropriate when product cost is differentiated according to marketing outlets, and they do not take account of differences in the rates of several currencies, which are reflected in the export price of a released product and the import prices of raw materials, supplies, and component parts. Taking this into consideration, the expressions for calculating real value added have been refined.
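
The three price indices named above have standard definitions, sketched here on a hypothetical two-good basket (prices p and quantities q in base period 0 and current period 1; all numbers are illustrative).

```python
# Price indices used in double deflation (hypothetical basket).
p0, p1 = [10.0, 4.0], [12.0, 5.0]      # prices, base and current period
q0, q1 = [100.0, 50.0], [90.0, 60.0]   # quantities, base and current period

dot = lambda a, b: sum(x * y for x, y in zip(a, b))

laspeyres = dot(p1, q0) / dot(p0, q0)      # base-period quantity weights
paasche = dot(p1, q1) / dot(p0, q1)        # current-period quantity weights
fisher = (laspeyres * paasche) ** 0.5      # geometric mean of the two

# Real (deflated) value added: nominal current value deflated by the index.
real_va = dot(p1, q1) / fisher
```

The Fisher index always lies between the Laspeyres and Paasche indices, which is why it is often preferred as the deflator.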

  14. C-106 tank process ventilation test

    International Nuclear Information System (INIS)

    Bailey, J.W.

    1998-01-01

    Project W-320 Acceptance Test Report for tank 241-C-106, 296-C-006 Ventilation System: Acceptance Test Procedure (ATP) HNF-SD-W320-012, C-106 Tank Process Ventilation Test, was an in-depth test of the 296-C-006 ventilation system and the ventilation support systems required to perform the sluicing of tank C-106. The systems involved included electrical, instrumentation, chiller, and HVAC. Tests began at the component level, moved to the loop level, then to the system level, and finally to an integrated-systems-level test. One criterion was to perform the test with the least risk from the standpoint of potential radioactive contamination. To accomplish this, a temporary configuration was designed that would simulate operation of the systems without being connected directly to the waste tank air space. This was done by blanking off ducting to the tank and connecting temporary ducting and an inlet air filter and housing to the recirculation system. This configuration would eventually become the possible cause of exceptions. During the performance of the test, there were points where the equipment did not function per the directions listed in the ATP. These events fell into several categories. The first and easiest problems were field configurations that did not match the design documentation. These were corrected by modifying the field configuration to meet the design documentation and re-performing the applicable sections of the ATP. A second type of problem was associated with equipment that did not operate correctly, at which point an exception was written against the ATP, to be resolved later. A third type of problem was equipment that actually operated correctly but for which the directions in the ATP were in error. These were corrected by generating an Engineering Change Notice (ECN) against the ATP. The ATP with corrected directions was then re-performed.
    A fourth type of problem was where the directions in the ATP were as the equipment should operate, but the design of

  15. Pure sources and efficient detectors for optical quantum information processing

    Science.gov (United States)

    Zielnicki, Kevin

    Over the last sixty years, classical information theory has revolutionized the understanding of the nature of information, and how it can be quantified and manipulated. Quantum information processing extends these lessons to quantum systems, where the properties of intrinsic uncertainty and entanglement fundamentally defy classical explanation. This growing field has many potential applications, including computing, cryptography, communication, and metrology. As inherently mobile quantum particles, photons are likely to play an important role in any mature large-scale quantum information processing system. However, the available methods for producing and detecting complex multi-photon states place practical limits on the feasibility of sophisticated optical quantum information processing experiments. In a typical quantum information protocol, a source first produces an interesting or useful quantum state (or set of states), perhaps involving superposition or entanglement. Then, some manipulations are performed on this state, perhaps involving quantum logic gates which further manipulate or entangle the initial state. Finally, the state must be detected, obtaining some desired measurement result, e.g., for secure communication or computationally efficient factoring. The work presented here concerns the first and last stages of this process as they relate to photons: sources and detectors. Our work on sources is based on the need for optimized non-classical states of light delivered at high rates, particularly of single photons in a pure quantum state. We seek to better understand the properties of spontaneous parametric downconversion (SPDC) sources of photon pairs, and in doing so, produce such an optimized source. We report an SPDC source which produces pure heralded single photons with little or no spectral filtering, allowing a significant rate enhancement. Our work on detectors is based on the need to reliably measure single-photon states. We have focused on

  16. Effects of image processing on the detective quantum efficiency

    Science.gov (United States)

    Park, Hye-Suk; Kim, Hee-Joung; Cho, Hyo-Min; Lee, Chang-Lae; Lee, Seung-Wan; Choi, Yu-Na

    2010-04-01

    Digital radiography has gained popularity in many areas of clinical practice. This transition brings interest in advancing the methodologies for image quality characterization. However, as these methodologies have not been standardized, the results of such studies cannot be directly compared. The primary objective of this study was to standardize methodologies for image quality characterization. The secondary objective was to evaluate how an image processing algorithm affects the modulation transfer function (MTF), the noise power spectrum (NPS), and the detective quantum efficiency (DQE). Image performance parameters such as the MTF, NPS, and DQE were evaluated using the international electro-technical commission (IEC 62220-1)-defined RQA5 radiographic techniques. Computed radiography (CR) images of a hand in the posterior-anterior (PA) view for measuring the signal-to-noise ratio (SNR), slit images for measuring the MTF, and white images for measuring the NPS were obtained, and various Multi-Scale Image Contrast Amplification (MUSICA) parameters were applied to each of the acquired images. In the results, all of the modifications had a considerable influence on the evaluated SNR, MTF, NPS, and DQE. Images modified by post-processing had a higher DQE than the MUSICA=0 image. This suggests that MUSICA values, applied as post-processing, affect the image quality evaluation. In conclusion, the control parameters of image processing should be accounted for when characterizing image quality in the same way. The results of this study can serve as a baseline for evaluating imaging systems and their imaging characteristics by measuring the MTF, NPS, and DQE.
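
The three quantities are linked by a standard relationship used in IEC 62220-1 style analyses, DQE(f) = MTF(f)^2 / (q * NNPS(f)), where q is the incident photon fluence and NNPS the normalized noise power spectrum. The curves below are illustrative stand-ins, not measured data.

```python
import numpy as np

# Illustrative DQE computation from assumed MTF and NNPS curves.
f = np.linspace(0.05, 2.5, 50)      # spatial frequency (cycles/mm)
mtf = np.sinc(f / 3.0)              # assumed MTF shape (sinc-like falloff)
nnps = 5.0e-5 * (1 + 0.2 * f)       # assumed normalized NPS (mm^2)
q = 3.0e4                           # assumed photon fluence (photons/mm^2)

dqe = mtf**2 / (q * nnps)
# Post-processing such as MUSICA rescales the MTF and NNPS together, which is
# why the measured DQE shifts unless the processing parameters are accounted for.
```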

  17. Private and Efficient Query Processing on Outsourced Genomic Databases.

    Science.gov (United States)

    Ghasemi, Reza; Al Aziz, Md Momin; Mohammed, Noman; Dehkordi, Massoud Hadian; Jiang, Xiaoqian

    2017-09-01

    Applications of genomic studies are spreading rapidly in many domains of science and technology, such as healthcare, biomedical research, direct-to-consumer services, and legal and forensic investigations. However, a number of obstacles make it hard to access and process a big genomic database for these applications. First, sequencing a genome is a time-consuming and expensive process. Second, processing genomic sequences requires large-scale computation and storage systems. Third, genomic databases are often owned by different organizations and thus are not available for public use. The cloud computing paradigm can be leveraged to facilitate the creation and sharing of big genomic databases for these applications. Genomic data owners can outsource their databases to a centralized cloud server to ease access to their databases. However, data owners are reluctant to adopt this model, as it requires outsourcing the data to an untrusted cloud service provider that may cause data breaches. In this paper, we propose a privacy-preserving model for outsourcing genomic data to a cloud. The proposed model enables query processing while protecting the privacy of the genomic databases. Privacy of the individuals is guaranteed by permuting the database and adding fake genomic records. These techniques allow the cloud to evaluate count and top-k queries securely and efficiently. Experimental results demonstrate that a count and a top-k query over 40 Single Nucleotide Polymorphisms (SNPs) in a database of 20,000 records take around 100 and 150 s, respectively.
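
A toy sketch of the padding idea described above: the outsourced database holds real records mixed with fakes in permuted order, and the client discounts the fakes when interpreting a count query. The real scheme encrypts the records and the real/fake distinction; the plain flags and field names here are purely illustrative.

```python
import random

random.seed(1)
# Build a toy SNP database: 200 real records plus 50 client-injected fakes.
real = [{"snp": random.choice("ACGT"), "fake": False} for _ in range(200)]
fakes = [{"snp": random.choice("ACGT"), "fake": True} for _ in range(50)]
db = real + fakes
random.shuffle(db)              # permutation hides each record's origin

# Server-side count over the padded database (the server, in the real scheme,
# cannot tell real from fake).
server_count = sum(r["snp"] == "A" for r in db)

# The client knows how many fake 'A' records it injected and subtracts them.
fake_a = sum(r["snp"] == "A" for r in fakes)
true_count = server_count - fake_a
```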

  18. Efficient Market Hypothesis in South Africa: Evidence from Linear and Nonlinear Unit Root Tests

    Directory of Open Access Journals (Sweden)

    Andrew Phiri

    2015-12-01

    Full Text Available This study investigates the weak form efficient market hypothesis (EMH) for five generalized stock indices in the Johannesburg Stock Exchange (JSE) using weekly data collected from 31st January 2000 to 16th December 2014. In particular, we test for weak form market efficiency using a battery of linear and nonlinear unit root testing procedures comprising the classical augmented Dickey-Fuller (ADF) tests, the two-regime threshold autoregressive (TAR) unit root tests described in Enders and Granger (1998), and the three-regime unit root tests described in Bec, Salem, and Carrasco (2004). Based on our empirical analysis, we are able to demonstrate that whilst the linear unit root tests advocate for unit roots within the time series, the nonlinear unit root tests suggest that most stock indices are threshold stationary processes. These results bridge two opposing contentions obtained from previous studies by concluding that under a linear framework the JSE stock indices offer support in favour of weak form market efficiency, whereas when nonlinearity is accounted for, a majority of the indices violate the weak form EMH.
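
A bare-bones sketch of the Enders and Granger (1998) TAR unit-root regression, written by us without lag augmentation and with a known threshold of zero: the joint F statistic for the two regime coefficients being zero is compared against simulated critical values. All series below are simulated.

```python
import numpy as np

def tar_f_stat(y, tau=0.0):
    """F statistic for H0: rho1 = rho2 = 0 in the TAR regression
    diff(y)_t = rho1*I_t*y_{t-1} + rho2*(1-I_t)*y_{t-1} + e_t,
    with indicator I_t = 1 when y_{t-1} >= tau."""
    y = np.asarray(y, float)
    dy, lag = np.diff(y), y[:-1]
    I = (lag >= tau).astype(float)
    X = np.column_stack([I * lag, (1 - I) * lag])
    beta, *_ = np.linalg.lstsq(X, dy, rcond=None)
    rss1 = np.sum((dy - X @ beta) ** 2)    # unrestricted residual sum of squares
    rss0 = np.sum(dy**2)                   # restricted model: rho1 = rho2 = 0
    k, n = 2, len(dy)
    return ((rss0 - rss1) / k) / (rss1 / (n - k))

rng = np.random.default_rng(1)
rw = np.cumsum(rng.standard_normal(500))   # unit-root series: small F expected
ar = rng.standard_normal(500)
for t in range(1, 500):
    ar[t] = 0.5 * ar[t - 1] + ar[t]        # stationary series: large F expected
```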

  19. Processing multilevel secure test and evaluation information

    Science.gov (United States)

    Hurlburt, George; Hildreth, Bradley; Acevedo, Teresa

    1994-07-01

    The Test and Evaluation Community Network (TECNET) is building a Multilevel Secure (MLS) system. This system features simultaneous access to classified and unclassified information and easy access through widely available communications channels. It provides the necessary separation of classification levels, assured through the use of trusted system design techniques, security assessments and evaluations. This system enables cleared T&E users to view and manipulate classified and unclassified information resources either using a single terminal interface or multiple windows in a graphical user interface. TECNET is in direct partnership with the National Security Agency (NSA) to develop and field the MLS TECNET capability in the near term. The centerpiece of this partnership is a state-of-the-art Concurrent Systems Security Engineering (CSSE) process. In developing the MLS TECNET capability, TECNET and NSA are providing members, with various expertise and diverse backgrounds, to participate in the CSSE process. The CSSE process is founded on the concepts of both Systems Engineering and Concurrent Engineering. Systems Engineering is an interdisciplinary approach to evolve and verify an integrated and life cycle balanced set of system product and process solutions that satisfy customer needs (ASD/ENS-MIL STD 499B 1992). Concurrent Engineering is design and development using the simultaneous, applied talents of a diverse group of people with the appropriate skills. Harnessing diverse talents to support CSSE requires active participation by team members in an environment that both respects and encourages diversity.

  20. Can Intrinsic Fluctuations Increase Efficiency in Neural Information Processing?

    Science.gov (United States)

    Liljenström, Hans

    2003-05-01

    All natural processes are accompanied by fluctuations, characterized as noise or chaos. Biological systems, which have evolved during billions of years, are likely to have adapted, not only to cope with such fluctuations, but also to make use of them. We investigate how the complex dynamics of the brain, including oscillations, chaos and noise, can affect the efficiency of neural information processing. In particular, we consider the amplification and functional role of internal fluctuations. Using computer simulations of a neural network model of the olfactory cortex and hippocampus, we demonstrate how microscopic fluctuations can result in global effects at the network level. We show that the rate of information processing in associative memory tasks can be maximized for optimal noise levels, analogous to stochastic resonance phenomena. Noise can also induce transitions between different dynamical states, which could be of significance for learning and memory. A chaotic-like behavior, induced by noise or by an increase in neuronal excitability, can enhance system performance if it is transient and converges to a limit cycle memory state. We speculate whether this dynamical behavior perhaps could be related to (creative) thinking.
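
The stochastic-resonance effect mentioned above can be reproduced with a toy model of our own (not the paper's cortical network): a sub-threshold sine wave crosses a hard threshold only with the help of noise, and the detector output tracks the input best at an intermediate noise level.

```python
import numpy as np

rng = np.random.default_rng(42)
t = np.linspace(0, 20 * np.pi, 4000)
signal = 0.5 * np.sin(t)          # amplitude below the firing threshold of 1.0

def detector_corr(noise_sd):
    """Correlation between the thresholded noisy signal and the clean input."""
    out = (signal + rng.normal(0, noise_sd, t.size) > 1.0).astype(float)
    if out.std() == 0:            # detector never fires: no correlation
        return 0.0
    return float(np.corrcoef(out, signal)[0, 1])

corrs = {sd: detector_corr(sd) for sd in (0.05, 0.5, 5.0)}
# too little noise -> silent detector; too much -> nearly random firing;
# moderate noise -> firing pattern correlated with the hidden signal.
```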

  1. Effects of image processing on the detective quantum efficiency

    Energy Technology Data Exchange (ETDEWEB)

    Park, Hye-Suk; Kim, Hee-Joung; Cho, Hyo-Min; Lee, Chang-Lae; Lee, Seung-Wan; Choi, Yu-Na [Yonsei University, Wonju (Korea, Republic of)

    2010-02-15

    The evaluation of image quality is an important part of digital radiography. The modulation transfer function (MTF), the noise power spectrum (NPS), and the detective quantum efficiency (DQE) are widely accepted measurements of digital radiographic system performance. However, as the methodologies for such characterization have not been standardized, it is difficult to directly compare reported MTF, NPS, and DQE results. In this study, we evaluated the effect of an image processing algorithm on the estimated MTF, NPS, and DQE. The image performance parameters were evaluated using the international electro-technical commission (IEC 62220-1)-defined RQA5 radiographic techniques. Computed radiography (CR) posterior-anterior (PA) images of a hand for measuring the signal-to-noise ratio (SNR), slit images for measuring the MTF, and white images for measuring the NPS were obtained, and various Multi-Scale Image Contrast Amplification (MUSICA) factors were applied to each of the acquired images. All of the modifications obtained by image processing had a considerable influence on the evaluated image quality. In conclusion, the control parameters of image processing should be accounted for when characterizing image quality in the same way. The results of this study should serve as a baseline for evaluating imaging systems and their imaging characteristics by MTF, NPS, and DQE measurements.

  2. Effects of image processing on the detective quantum efficiency

    International Nuclear Information System (INIS)

    Park, Hye-Suk; Kim, Hee-Joung; Cho, Hyo-Min; Lee, Chang-Lae; Lee, Seung-Wan; Choi, Yu-Na

    2010-01-01

    The evaluation of image quality is an important part of digital radiography. The modulation transfer function (MTF), the noise power spectrum (NPS), and the detective quantum efficiency (DQE) are widely accepted measurements of digital radiographic system performance. However, as the methodologies for such characterization have not been standardized, it is difficult to directly compare reported MTF, NPS, and DQE results. In this study, we evaluated the effect of an image processing algorithm on the estimated MTF, NPS, and DQE. The image performance parameters were evaluated using the international electro-technical commission (IEC 62220-1)-defined RQA5 radiographic techniques. Computed radiography (CR) posterior-anterior (PA) images of a hand for measuring the signal-to-noise ratio (SNR), slit images for measuring the MTF, and white images for measuring the NPS were obtained, and various Multi-Scale Image Contrast Amplification (MUSICA) factors were applied to each of the acquired images. All of the modifications obtained by image processing had a considerable influence on the evaluated image quality. In conclusion, the control parameters of image processing should be accounted for when characterizing image quality in the same way. The results of this study should serve as a baseline for evaluating imaging systems and their imaging characteristics by MTF, NPS, and DQE measurements.

  3. Efficient processing of fluorescence images using directional multiscale representations.

    Science.gov (United States)

    Labate, D; Laezza, F; Negi, P; Ozcan, B; Papadakis, M

    2014-01-01

    Recent advances in high-resolution fluorescence microscopy have enabled the systematic study of morphological changes in large populations of cells induced by chemical and genetic perturbations, facilitating the discovery of signaling pathways underlying diseases and the development of new pharmacological treatments. In these studies, though, due to the complexity of the data, quantification and analysis of morphological features are for the most part handled manually, significantly slowing data processing and often limiting the information gained to a descriptive level. Thus, there is an urgent need for highly efficient automated analysis and processing tools for fluorescence images. In this paper, we present the application of a method based on the shearlet representation to confocal image analysis of neurons. The shearlet representation is a newly emerged method designed to combine multiscale data analysis with superior directional sensitivity, making this approach particularly effective for the representation of objects defined over a wide range of scales and with highly anisotropic features. Here, we apply the shearlet representation to the problems of soma detection of neurons in culture and extraction of geometrical features of neuronal processes in brain tissue, and we propose it as a new framework for large-scale fluorescence image analysis of biomedical data.
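
A full shearlet transform is beyond a short sketch, but the directional-sensitivity idea it builds on can be illustrated with a toy example of our own: second differences taken across an elongated structure respond strongly, while differences taken along it do not, so oriented filters can pick out anisotropic features such as neurites.

```python
import numpy as np

# Toy image: a single horizontal "neurite" in an otherwise empty field.
img = np.zeros((32, 32))
img[16, :] = 1.0

# Oriented second-difference responses (a crude stand-in for directional
# multiscale filters; real shearlets also vary scale and shear).
across = np.abs(np.diff(img, n=2, axis=0)).sum()  # responds to horizontal lines
along = np.abs(np.diff(img, n=2, axis=1)).sum()   # responds to vertical lines
# 'across' is large while 'along' is zero, so the orientation is detected.
```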

  4. 10 CFR 431.16 - Test procedures for the measurement of energy efficiency.

    Science.gov (United States)

    2010-01-01

    ... 10 Energy 3 2010-01-01 2010-01-01 false Test procedures for the measurement of energy efficiency. 431.16 Section 431.16 Energy DEPARTMENT OF ENERGY ENERGY CONSERVATION ENERGY EFFICIENCY PROGRAM FOR... Methods of Determining Efficiency § 431.16 Test procedures for the measurement of energy efficiency. For...

  5. Super Efficient Refrigerator Program (SERP) evaluation. Volume 1: Process evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Sandahl, L.J.; Ledbetter, M.R.; Chin, R.I.; Lewis, K.S.; Norling, J.M.

    1996-01-01

    The Pacific Northwest National Laboratory (PNNL) conducted this study for the US Department of Energy (DOE) as part of the Super Efficient Refrigerator Program (SERP) Evaluation. This report documents the SERP formation and implementation process, and identifies preliminary program administration and implementation issues. The findings are based primarily on interviews with those familiar with the program, such as utilities, appliance manufacturers, and SERP administrators. These interviews occurred primarily between March and April 1995, when SERP was in the early stages of program implementation. A forthcoming report will estimate the preliminary impacts of SERP within the industry and marketplace. Both studies were funded by DOE at the request of SERP Inc., which sought a third-party evaluation of its program.

  6. Improving IC process efficiency with critical materials management

    Science.gov (United States)

    Hanson, Kathy L.; Andrews, Robert E.

    2003-06-01

    The management of critical materials in a high technology manufacturing facility is crucial to obtaining consistently high production yield. This is especially true in an industry like semiconductors, where the success of the product is so dependent on the integrity of the critical production materials. Bar code systems, the traditional management tools, are voluntary, defeatable, and do not continuously monitor materials in use. The significant costs associated with mismanagement of chemicals can be captured with a customized model, resulting in highly favorable ROIs for the NOWTrak RFID chemical management system. This system transmits reliable chemical data about each individual container and generates information that can be used to increase wafer production efficiency and yield. The future of the RFID system will expand beyond the benefits of chemical management and into dynamic IC process management.

  7. Energy Efficient Pump Control for an Offshore Oil Processing System

    DEFF Research Database (Denmark)

    Yang, Zhenyu; Soleiman, Kian; Løhndorf, Bo

    2012-01-01

    The energy efficient control of a pump system for an offshore oil processing system is investigated. The seawater is lifted by a pump system consisting of three identical centrifugal pumps in parallel, and the lifted seawater is used to cool down the crude oil flowing out of a three-phase separator on one of the Danish North Sea platforms. A hierarchical pump-speed control strategy is developed for the considered system by minimizing the pump power consumption subject to keeping a satisfactory system performance. The proposed control strategy consists of online estimation of some system operating parameters, optimization of pump configurations, and real-time feedback control. Compared with the current control strategy at the considered system, where the pump system is on/off controlled and the seawater flows are controlled by a number of control valves, the proposed control strategy...
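
The pump-configuration choice above can be sketched with the cube affinity law alone (pump power scales roughly with the cube of relative speed). This is our own hypothetical sizing example: head constraints and pump efficiency curves are ignored, and all numbers are illustrative, not the platform's data.

```python
# Hypothetical rated values per pump and fleet size.
Q_NOM, P_NOM = 100.0, 75.0      # rated flow (m3/h) and rated power (kW)
MAX_PUMPS = 3

def best_config(q_total):
    """Return (n_pumps, total_power_kW) minimizing power for a required flow,
    assuming n identical pumps share the flow equally and P ~ s**3."""
    options = []
    for n in range(1, MAX_PUMPS + 1):
        s = q_total / (n * Q_NOM)       # relative speed via the affinity laws
        if s <= 1.0:                    # cannot run above rated speed
            options.append((n, n * P_NOM * s**3))
    return min(options, key=lambda c: c[1])

n, p = best_config(150.0)
# Three half-loaded pumps draw far less power than two pumps at 75% speed.
```

Under the cube law alone, more pumps at lower speed always win; in practice the head requirement sets a lower bound on speed, which is what makes the configuration optimization in the paper nontrivial.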

  8. IN SITU FIELD TESTING OF PROCESSES

    Energy Technology Data Exchange (ETDEWEB)

    J.S.Y. YANG

    2004-11-08

    The purpose of this scientific analysis report is to update and document the data and subsequent analyses from ambient field-testing activities performed in underground drifts and surface-based boreholes through unsaturated zone (UZ) tuff rock units. In situ testing, monitoring, and associated laboratory studies are conducted to directly assess and evaluate the waste emplacement environment and the natural barriers to radionuclide transport at Yucca Mountain. This scientific analysis report supports and provides data to UZ flow and transport model reports, which in turn contribute to the Total System Performance Assessment (TSPA) of Yucca Mountain, an important document for the license application (LA). The objectives of ambient field-testing activities are described in Section 1.1. This report is the third revision (REV 03), which supersedes REV 02. The scientific analyses of data for inputs to model calibration and validation, as documented in REV 02, were developed in accordance with the Technical Work Plan (TWP) "Technical Work Plan for: Performance Assessment Unsaturated Zone" (BSC 2004 [DIRS 167969]). This revision was developed in accordance with the "Technical Work Plan for: Unsaturated Zone Flow Analysis and Model Report Integration" (BSC 2004 [DIRS 169654], Section 1.2.4) for better integrated, consistent, transparent, traceable, and more complete documentation in this scientific analysis report and the associated UZ flow and transport model reports. No additional testing or analyses were performed as part of this revision. The list of relevant acceptance criteria is provided by the "Technical Work Plan for: Unsaturated Zone Flow Analysis and Model Report Integration" (BSC 2004 [DIRS 169654]), Table 3-1. Additional deviations from the TWP regarding the features, events, and processes (FEPs) list are discussed in Section 1.3. Documentation in this report includes descriptions of how, and under what

  9. IN SITU FIELD TESTING OF PROCESSES

    International Nuclear Information System (INIS)

    YANG, J.S.Y.

    2004-01-01

The purpose of this scientific analysis report is to update and document the data and subsequent analyses from ambient field-testing activities performed in underground drifts and surface-based boreholes through unsaturated zone (UZ) tuff rock units. In situ testing, monitoring, and associated laboratory studies are conducted to directly assess and evaluate the waste emplacement environment and the natural barriers to radionuclide transport at Yucca Mountain. This scientific analysis report supports and provides data to UZ flow and transport model reports, which in turn contribute to the Total System Performance Assessment (TSPA) of Yucca Mountain, an important document for the license application (LA). The objectives of ambient field-testing activities are described in Section 1.1. This report is the third revision (REV 03), which supersedes REV 02. The scientific analysis of data for inputs to model calibration and validation as documented in REV 02 was developed in accordance with the Technical Work Plan (TWP) ''Technical Work Plan for: Performance Assessment Unsaturated Zone'' (BSC 2004 [DIRS 167969]). This revision was developed in accordance with the ''Technical Work Plan for: Unsaturated Zone Flow Analysis and Model Report Integration'' (BSC 2004 [DIRS 169654], Section 1.2.4) for better integrated, consistent, transparent, traceable, and more complete documentation in this scientific analysis report and associated UZ flow and transport model reports. No additional testing or analyses were performed as part of this revision. The list of relevant acceptance criteria is provided by ''Technical Work Plan for: Unsaturated Zone Flow Analysis and Model Report Integration'' (BSC 2004 [DIRS 169654]), Table 3-1. Additional deviations from the TWP regarding the features, events, and processes (FEPs) list are discussed in Section 1.3. Documentation in this report includes descriptions of how, and under what conditions, the tests were conducted. The descriptions and analyses

  10. Middleware for big data processing: test results

    Science.gov (United States)

    Gankevich, I.; Gaiduchok, V.; Korkhov, V.; Degtyarev, A.; Bogdanov, A.

    2017-12-01

Dealing with large volumes of data is resource-consuming work which is increasingly delegated not to a single computer but to a whole distributed computing system. As the number of computers in a distributed system increases, the amount of effort put into effective management of the system grows. When the system reaches some critical size, much effort should be put into improving its fault tolerance. It is difficult to estimate when a particular distributed system needs such facilities for a given workload, so instead they should be implemented in a middleware which works efficiently with a distributed system of any size. It is also difficult to estimate whether a volume of data is large or not, so the middleware should also work with data of any volume. In other words, the purpose of the middleware is to provide facilities that adapt a distributed computing system to a given workload. In this paper we introduce such a middleware appliance. Tests show that this middleware is well-suited for typical HPC and big data workloads and its performance is comparable with well-known alternatives.

  11. Sodium removal by alcohol process: Basic tests and its application

    International Nuclear Information System (INIS)

    Nakai, S.; Yamamoto, S.; Akai, M.; Yatabe, T.

    1997-01-01

We have various methods for sodium removal: an alcohol cleaning process, a steam cleaning process, and a direct burning process. Sodium removal by the alcohol process has many advantages, such as causing no alkali corrosion of steel, short processing time, and easy operation. Therefore, the alcohol process was selected for the 1 MWt double wall tube straight type steam generator (DWTSG). We already had some experience with the alcohol process, but still needed to confirm the sodium removal rate in crevices and to develop an on-line method for monitoring the sodium concentration in alcohol during removal. We conducted a small-scale sodium removal test with flowing alcohol, in which the sodium removal rate in the crevice and the alcohol conductivity were measured as functions of sodium concentration in alcohol and alcohol temperature. Sodium removal from the DWTSG was conducted by the devised alcohol process safely and efficiently. The process took about one day. Visual inspection during dismantling of the DWTSG showed no evidence of any un-reacted sodium. (author)

  12. Study on highly efficient seismic data acquisition and processing methods based on sparsity constraint

    Science.gov (United States)

    Wang, H.; Chen, S.; Tao, C.; Qiu, L.

    2017-12-01

High-density, high-fold, and wide-azimuth seismic data acquisition methods are widely used to image increasingly sophisticated exploration targets, but acquisition periods grow ever longer and acquisition costs ever higher. We carry out a study of highly efficient seismic data acquisition and processing methods based on sparse representation theory (or compressed sensing theory), and achieve some innovative results. First, the theoretical principles of highly efficient acquisition and processing are studied. We present a sparse representation theory based on the wave equation, study highly efficient seismic sampling methods, and propose an optimized piecewise-random sampling method based on sparsity prior information. A reconstruction strategy with a sparsity constraint is then developed, and a two-step recovery approach combining a sparsity-promoting method with the hyperbolic Radon transform is also put forward. These three aspects constitute the enhanced theory of highly efficient seismic data acquisition. Next, specific implementation strategies are studied according to this theory. We propose a highly efficient acquisition network design method based on the optimized piecewise-random sampling method; two types of highly efficient seismic data acquisition methods based on (1) single sources and (2) blended (or simultaneous) sources; and the reconstruction procedures corresponding to these two acquisition methods, which recover the seismic data on a regular acquisition network. The impact of blended shooting on the imaging result is discussed. In the end, we implement numerical tests based on the Marmousi model.
The achieved results show: (1) the theoretical framework of highly efficient seismic data acquisition and processing
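
    The piecewise-random (jittered) sampling idea mentioned in this abstract can be illustrated with a minimal sketch: split a regular receiver line into equal segments and pick one random position inside each, which keeps the sampling irregular (as sparsity-based reconstruction prefers) while bounding the largest gap. The grid sizes below are arbitrary illustrations, not values from the study.

    ```python
    import random

    def piecewise_random_sample(n_grid, n_segments, seed=0):
        """Jittered sampling: divide an n_grid-point line into n_segments
        equal pieces and choose one receiver position uniformly at random
        inside each piece. Randomness within segments avoids aliasing;
        one pick per segment bounds the maximum gap between receivers.
        """
        rng = random.Random(seed)
        seg = n_grid // n_segments
        positions = []
        for k in range(n_segments):
            lo, hi = k * seg, (k + 1) * seg - 1   # bounds of segment k
            positions.append(rng.randint(lo, hi))
        return positions

    picks = piecewise_random_sample(120, 12)  # 12 receivers on a 120-point grid
    ```

    Because each segment contributes exactly one pick, consecutive positions can never be more than two segment lengths apart, unlike fully random subsampling.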

  13. Efficient Separations and Processing Crosscutting Program. Technology summary

    International Nuclear Information System (INIS)

    1995-06-01

    The Efficient Separations and Processing (ESP) Crosscutting Program was created in 1991 to identify, develop, and perfect separations technologies and processes to treat wastes and address environmental problems throughout the DOE Complex. The ESP funds several multi-year tasks that address high-priority waste remediation problems involving high-level, low-level, transuranic, hazardous, and mixed (radioactive and hazardous) wastes. The ESP supports applied research and development (R and D) leading to demonstration or use of these separations technologies by other organizations within DOE-EM. Treating essentially all DOE defense wastes requires separation methods that concentrate the contaminants and/or purify waste streams for release to the environment or for downgrading to a waste form less difficult and expensive to dispose of. Initially, ESP R and D efforts focused on treatment of high-level waste (HLW) from underground storage tanks (USTs) because of the potential for large reductions in disposal costs and hazards. As further separations needs emerge and as waste management and environmental restoration priorities change, the program has evolved to encompass the breadth of waste management and environmental remediation problems

  14. Efficient p-value evaluation for resampling-based tests

    KAUST Repository

    Yu, K.; Liang, F.; Ciampa, J.; Chatterjee, N.

    2011-01-01

    The resampling-based test, which often relies on permutation or bootstrap procedures, has been widely used for statistical hypothesis testing when the asymptotic distribution of the test statistic is unavailable or unreliable. It requires repeated

  15. Steady reconstruction process - development, testing and comparison in ultrasonic testing

    International Nuclear Information System (INIS)

    Langenberg, K.J.; Schmitz, V.

    1986-01-01

In steady-state test procedures, flaw parameters can be extracted from a small amount of high-quality data. The boundary conditions for the successful use of such a process were investigated and established: using theoretical models of the elastodynamic interaction between flaw and ultrasound, a concentration on wavefronts instead of resonances, together with careful wide-band data collection, makes a physical interpretation in the form of specific geometric moments possible. Models of the interaction of ultrasound with flaws were developed for two flaw geometries (cracks and pores); they permit the calculation of A-scans of any bandwidth and at any scattering angle for the direct and mode-converted parts of the elastic ultrasonic scattered wave. The curved pressure and shear waves, including the mode-converted bending fields over an angular range of 360°, were experimentally recorded. The agreement, including the additional wavefronts caused by the near field of the crack bending field, is close. For evaluation purposes, classification by moments is demonstrated on two examples (crack, cylinder). It was found that classification was possible according to the sign of the a_1 polynomial coefficient. (orig./HP) [de]

  16. TESTING INFORMATIONAL EFFICIENCY: THE CASE OF U.E. AND BRIC EMERGENT MARKETS

    OpenAIRE

    OPREAN Camelia

    2012-01-01

Empirical finance has produced a considerable number of studies on the informational efficiency of emerging financial markets. These studies have generated conflicting results on the efficient market hypothesis (EMH), so efficiency tests in emerging financial markets are rarely definitive in reaching a conclusion about the existence of informational efficiency. This paper tests weak-form market efficiency of eight emerging markets:...

  17. High Power High Efficiency Diode Laser Stack for Processing

    Science.gov (United States)

    Gu, Yuanyuan; Lu, Hui; Fu, Yueming; Cui, Yan

    2018-03-01

High-power diode lasers based on GaAs semiconductor bars are well established as reliable and highly efficient laser sources. Because diode lasers are simple in structure, small in size, long-lived, and inexpensive, they are widely used in industrial processing, such as heat treating, welding, hardening, cladding, and so on. Their rectangular beam patterns are also well suited to producing fine weld beads at modest power. At this power level, diode lasers have many important applications, such as surgery, welding of polymers, soldering, coatings, and surface treatment of metals. But some applications, e.g. hardening, keyhole welding, cutting, and metal welding, require much higher power and brightness than single emitters can deliver, mainly due to their low performance. In addition, high-power diode lasers have important applications in the military field, so all developed countries have attached great importance to high-power diode laser systems and their applications. In this paper we introduce the structure and the principle of the high-power diode stack.

  18. The Sequential Probability Ratio Test: An efficient alternative to exact binomial testing for Clean Water Act 303(d) evaluation.

    Science.gov (United States)

    Chen, Connie; Gribble, Matthew O; Bartroff, Jay; Bay, Steven M; Goldstein, Larry

    2017-05-01

The United States' Clean Water Act stipulates in section 303(d) that states must identify impaired water bodies, for which total maximum daily loads (TMDLs) of pollution inputs are then developed. Decision-making procedures about how to list, or delist, water bodies as impaired per Clean Water Act 303(d) differ across states. In states such as California, whether or not a particular monitoring sample suggests that water quality is impaired can be regarded as a binary outcome variable, and California's current regulatory framework invokes a version of the exact binomial test to consolidate evidence across samples and assess whether the overall water body complies with the Clean Water Act. Here, we contrast the performance of California's exact binomial test with one potential alternative, the Sequential Probability Ratio Test (SPRT). The SPRT uses a sequential testing framework, testing samples as they become available and evaluating evidence as it emerges, rather than measuring all the samples and calculating a test statistic at the end of the data collection process. Through simulations and theoretical derivations, we demonstrate that the SPRT on average requires fewer samples to achieve Type I and Type II error rates comparable to the current fixed-sample binomial test. Policymakers might consider efficient alternatives such as the SPRT to current procedures.
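
    Wald's SPRT for binary exceedance data can be sketched in a few lines. The impairment rates p0, p1 and the error targets below are illustrative placeholders, not the thresholds used by California or by the study.

    ```python
    import math

    def sprt_binary(samples, p0=0.10, p1=0.25, alpha=0.05, beta=0.05):
        """Wald's SPRT for a Bernoulli rate: H0 (p = p0) vs H1 (p = p1).

        Accumulates the log-likelihood ratio sample by sample and stops
        as soon as it crosses one of Wald's approximate thresholds.
        Returns ("accept H0" | "accept H1" | "continue", samples used).
        """
        a = math.log(beta / (1 - alpha))   # lower stopping threshold
        b = math.log((1 - beta) / alpha)   # upper stopping threshold
        llr = 0.0
        for n, x in enumerate(samples, start=1):
            if x:   # exceedance observed
                llr += math.log(p1 / p0)
            else:   # no exceedance
                llr += math.log((1 - p1) / (1 - p0))
            if llr <= a:
                return "accept H0", n
            if llr >= b:
                return "accept H1", n
        return "continue", len(samples)

    # 20 clean samples: the test stops early, before all 20 are used
    decision, n = sprt_binary([0] * 20)
    ```

    The early stopping is the efficiency gain the abstract describes: a fixed-sample binomial test would always use all 20 measurements.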

  19. The enhancement of visuospatial processing efficiency through Buddhist Deity meditation.

    Science.gov (United States)

    Kozhevnikov, Maria; Louchakova, Olga; Josipovic, Zoran; Motes, Michael A

    2009-05-01

    This study examined the effects of meditation on mental imagery, evaluating Buddhist monks' reports concerning their extraordinary imagery skills. Practitioners of Buddhist meditation were divided into two groups according to their preferred meditation style: Deity Yoga (focused attention on an internal visual image) or Open Presence (evenly distributed attention, not directed to any particular object). Both groups of meditators completed computerized mental-imagery tasks before and after meditation. Their performance was compared with that of control groups, who either rested or performed other visuospatial tasks between testing sessions. The results indicate that all the groups performed at the same baseline level, but after meditation, Deity Yoga practitioners demonstrated a dramatic increase in performance on imagery tasks compared with the other groups. The results suggest that Deity meditation specifically trains one's capacity to access heightened visuospatial processing resources, rather than generally improving visuospatial imagery abilities.

  20. Automating the simulator testing and data collection process

    Energy Technology Data Exchange (ETDEWEB)

    Magi, T.; Dimitri-Hakim, R. [L-3 Communications MAPPS Inc., Montreal, Quebec (Canada)

    2012-07-01

    Scenario-based training is a key process in the use of Full Scope Simulators (FSS) for operator training. Scenario-based training can be defined as any set of simulated plant operations performed with a specific training objective in mind. In order to meet this training objective, the ANSI/ANS-3.5-2009 standard requires that certain simulator training scenarios be tested to ensure that they reproduce the expected plant responses, that all plant procedures can be followed, and that scenario-based training objectives can be met. While malfunction testing provided a narrow view of the simulator performance revolving around the malfunction itself, scenario testing provides a broader, overall view. The concept of instructor validation of simulator scenarios to be used for training and evaluation, and oversight of simulator performance during the validation process, work hand-in-hand. This is where Scenario-Based Testing comes into play. With the description of Scenario-Based Testing (SBT) within Nuclear Energy Institute NEI 09-09 white paper and within the ANSI/ANS-3.5-2009 standard, the industry now has a way forward that reduces the regulatory uncertainty. Together, scenario-based testing and scenario-based training combine to produce better simulators which in turn can be used to more effectively and efficiently train new and existing power plant operators. However, they also impose a significant data gathering and analysis burden on FSS users. L-3 MAPPS Orchid Instructor Station (Orchid IS) facilitates this data gathering and analysis by providing features that automate this process with a simple, centralized, easy to use interface. (author)

  1. PWSCC Mitigation of alloy 182: Testing of various mitigation processes

    International Nuclear Information System (INIS)

    Curieres, I. de; Calonne, O.; Crooker, P.

    2011-01-01

Since the mid-nineties, Primary Water Stress Corrosion Cracking (PWSCC) of Alloy 182 welds has occurred. This affects different components, even ones considered to have low susceptibility due to a low operating temperature, such as the 'low operating temperature' reactor pressure vessel (RPV) heads in the global PWR fleet and bottom-mounted instrumentation nozzles, a location for which there is currently no ready-to-deploy repair or replacement solution. Hence, there is an incentive to identify effective remedial measures to delay or prevent PWSCC initiation, even at 'low temperature' RPV heads, in order to avoid wholesale replacement in the future. Working with EPRI, Areva has assessed the efficiency of various technological processes, including brushing, polishing, and compressive stress methods, to mitigate PWSCC in Alloy 182. The first phase of the program is complete and its results are presented, with emphasis on the program's different testing phases and the different mitigation processes that were tested. The efficiency of 'chemical' surface treatments is not yet proven. EPRI stabilized chromium had a deleterious effect on crack initiation that should be reproduced and understood before drawing a definitive conclusion. The electropolishing process considered does not seem to be sufficiently reliable on Alloy 182 surfaces, but longer exposures are required for a more definitive evaluation of this treatment. All tested 'mechanical' surface treatments, i.e., GE-RENEW brushing; fiber laser peening (Toshiba); water jet peening (Mitsubishi); water jet peening (Hitachi); and a combination of GE-RENEW and Hitachi WJP, have successfully inhibited crack initiation, even though the surface compressive stresses induced on U-bends are lower than those expected on massive components. Past experience shows that crack initiation occurs in less than 250 h on U-bends with 'heavily ground' reference surfaces.
Thus, it can be deduced that the present results show

  2. 76 FR 47178 - Energy Efficiency Program: Test Procedure for Lighting Systems (Luminaires)

    Science.gov (United States)

    2011-08-04

    ... DEPARTMENT OF ENERGY [Docket Number EERE-2011-BT-TP-0041] RIN 1904-AC50 Energy Efficiency Program: Test Procedure for Lighting Systems (Luminaires) AGENCY: Office of Energy Efficiency and Renewable... (``DOE'' or the ``Department'') is currently evaluating energy efficiency test procedures for luminaires...

  3. Global Envelope Tests for Spatial Processes

    DEFF Research Database (Denmark)

    Myllymäki, Mari; Mrkvička, Tomáš; Grabarnik, Pavel

    2017-01-01

    Envelope tests are a popular tool in spatial statistics, where they are used in goodness-of-fit testing. These tests graphically compare an empirical function T(r) with its simulated counterparts from the null model. However, the type I error probability α is conventionally controlled for a fixed d......) the construction of envelopes for a deviation test. These new tests allow the a priori selection of the global α and they yield p-values. We illustrate these tests using simulated and real point pattern data....

  4. Global envelope tests for spatial processes

    DEFF Research Database (Denmark)

    Myllymäki, Mari; Mrkvička, Tomáš; Grabarnik, Pavel

    Envelope tests are a popular tool in spatial statistics, where they are used in goodness-of-fit testing. These tests graphically compare an empirical function T(r) with its simulated counterparts from the null model. However, the type I error probability α is conventionally controlled for a fixed......) the construction of envelopes for a deviation test. These new tests allow the a priori selection of the global α and they yield p-values. We illustrate these tests using simulated and real point pattern data....
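
    The idea of a global (simultaneous) envelope can be illustrated with a simplified maximum-deviation variant: rank every curve by how far it strays from the pointwise mean of the simulations, then widen the band until a fraction 1-α of simulated curves fit entirely inside it. This is a sketch in the spirit of these papers, not the authors' extreme rank envelope.

    ```python
    import numpy as np

    def global_envelope(data_curve, sim_curves, alpha=0.05):
        """Simplified global envelope test for a functional statistic T(r).

        Each curve is summarized by its maximum absolute deviation from
        the pointwise mean of the simulations; the envelope half-width is
        the (1-alpha) quantile of those deviations, so roughly 1-alpha of
        null curves lie entirely within [lower, upper]. Returns
        (lower, upper, Monte Carlo p-value for the data curve).
        """
        sims = np.asarray(sim_curves)                 # (n_sim, n_r)
        mu = sims.mean(axis=0)
        dev_sims = np.max(np.abs(sims - mu), axis=1)  # one score per curve
        dev_data = np.max(np.abs(data_curve - mu))
        c = np.quantile(dev_sims, 1 - alpha)          # global half-width
        p = (1 + np.sum(dev_sims >= dev_data)) / (1 + len(dev_sims))
        return mu - c, mu + c, p

    rng = np.random.default_rng(1)
    sims = rng.normal(size=(999, 50))   # simulated curves under the null
    data = rng.normal(size=50)          # "observed" curve, also from the null
    lo, hi, p = global_envelope(data, sims)
    ```

    Because the band is calibrated on whole curves rather than pointwise, the selected α is global, and the construction yields a p-value, which is the key property the abstract emphasizes.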

  5. Efficient separations and processing crosscutting program 1996 technical exchange meeting. Proceedings

    International Nuclear Information System (INIS)

    1996-01-01

This document contains summaries of technology development presented at the 1996 Efficient Separations and Processing Crosscutting Program Technical Exchange Meeting. This meeting is held annually to promote a free exchange of ideas among technology developers, potential users, and other interested parties within the EM community. Many separation process technologies were discussed during the meeting, such as ion exchange, membrane separation, vacuum distillation, selective sorption, and solvent extraction. Other topics discussed include: waste forms; testing of inorganic sorbents for radionuclide and heavy metal removal; selective crystallization; and electrochemical treatment of liquid wastes. This is the leading abstract; individual papers have been indexed separately for the databases

  6. Efficient separations and processing crosscutting program 1996 technical exchange meeting. Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-02-01

This document contains summaries of technology development presented at the 1996 Efficient Separations and Processing Crosscutting Program Technical Exchange Meeting. This meeting is held annually to promote a free exchange of ideas among technology developers, potential users, and other interested parties within the EM community. Many separation process technologies were discussed during the meeting, such as ion exchange, membrane separation, vacuum distillation, selective sorption, and solvent extraction. Other topics discussed include: waste forms; testing of inorganic sorbents for radionuclide and heavy metal removal; selective crystallization; and electrochemical treatment of liquid wastes. This is the leading abstract; individual papers have been indexed separately for the databases.

  7. Economic efficiency analysis of electron accelerator for irradiation processing

    International Nuclear Information System (INIS)

    Shi Huidong; Chen Ronghui

    2003-01-01

The fixed assets, running cost, and economic efficiency are discussed in this paper. For an electron accelerator of 10 MeV and 3 kW, the running cost is about twice that of a cobalt source of 2.22 x 10^15 Bq, but the economic efficiency of building an electron accelerator is much higher than that of building a cobalt source

  8. Clean and efficient energy conversion processes (Cecon-project). Final report

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-12-31

The objectives of the work programme reported are the development and testing of two optimised energy conversion processes, both consisting of a radiant surface gas burner and a ceramic heat exchanger. The first sub-objective of the programme relates to industrial heating, drying, and curing processes requiring low and medium heat fluxes. It is estimated that around one tenth of the total EC industrial energy use is associated with such processes. The majority of these processes currently use convection and conduction as the main heat transfer mechanisms, and overall energy efficiencies are typically below 25%. For many drying and finishing processes (such as curing powder coatings and drying paints, varnishes, and inks, and for the fabrication of paper and textiles), radiant heating can achieve much faster drying rates and higher energy efficiency than convective heating. In the project, new concepts of natural gas fired radiant heating were investigated which would be much more efficient than the existing processes. One element of the programme was the development of gas burners having enhanced radiant efficiencies. A second concerned the investigation of the safety of gas burners containing significant volumes of mixed gas and air. Finally, the new gas burners were tested in combination with the high temperature heat exchanger to create highly efficient radiant heating systems. The second sub-objective concerned the development of a compact low cost heat exchanger capable of achieving high levels of heat recovery (up to 60%) which could be easily installed on industrial processes. This would make heat recovery a practical proposition on processes where existing heat recovery technology is currently not cost effective. The project will have an impact on industrial processes consuming around 80 MTOE of energy per year within EU countries (1 MTOE equals 41.8 PJ).
The overall energy saving potential of the project is estimated to be around 22 MTOE, which is around 10

  9. 33 CFR 159.121 - Sewage processing test.

    Science.gov (United States)

    2010-07-01

    ... 33 Navigation and Navigable Waters 2 2010-07-01 2010-07-01 false Sewage processing test. 159.121...) POLLUTION MARINE SANITATION DEVICES Design, Construction, and Testing § 159.121 Sewage processing test. (a) The device must process human sewage in the manner for which it is designed when tested in accordance...

  10. A test of a design process scale

    NARCIS (Netherlands)

    Marinakis, Yorgos; Harms, Rainer; Walsh, Steven Thomas

    2015-01-01

    Design is a type of innovation that focuses on creating new product and service meanings. Models of the design process are important because they can help firms manage their product and service design processes to obtain competitive advantage. Empirically-based models of the design process are

  11. Nonlinearity and intraday efficiency tests on energy futures markets

    International Nuclear Information System (INIS)

    Wang, Tao; Yang, Jian

    2010-01-01

Using high frequency data, this paper for the first time comprehensively examines the intraday efficiency of four major energy (crude oil, heating oil, gasoline, natural gas) futures markets. In contrast to earlier studies, which focus on in-sample evidence and assume linearity, the paper employs various nonlinear models and several model evaluation criteria to examine market efficiency in an out-of-sample forecasting context. Overall, there is evidence for intraday market inefficiency in two of the four energy futures markets (heating oil and natural gas), which exists particularly during the bull market condition but not during the bear market condition. The evidence is also robust against the data-snooping bias and the model overfitting problem, and its economic significance can be very substantial. (author)
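
    The out-of-sample forecasting approach described here can be illustrated with a minimal rolling-origin evaluation: refit a model on a moving window, forecast one step ahead, and compare the forecast errors against a benchmark. The AR(1) fit and random-walk benchmark below are illustrative stand-ins for the paper's nonlinear models, and the simulated series is not market data.

    ```python
    import numpy as np

    def rolling_oos_mse(series, window):
        """Rolling-origin one-step-ahead evaluation: at each origin t,
        fit AR(1) by least squares on the last `window` observations,
        forecast series[t+1], and compare against a random-walk forecast.
        Returns (mse_ar1, mse_random_walk) over the evaluation period.
        """
        errs_ar, errs_rw = [], []
        for t in range(window, len(series) - 1):
            past = series[t - window:t + 1]
            x, y = past[:-1], past[1:]
            b, a = np.polyfit(x, y, 1)        # slope, intercept of y = a + b*x
            f_ar = a + b * series[t]          # model forecast for t+1
            f_rw = series[t]                  # no-change benchmark
            errs_ar.append((series[t + 1] - f_ar) ** 2)
            errs_rw.append((series[t + 1] - f_rw) ** 2)
        return float(np.mean(errs_ar)), float(np.mean(errs_rw))

    rng = np.random.default_rng(0)
    vals = [0.0]
    for _ in range(300):                      # simulate an AR(1) process
        vals.append(0.8 * vals[-1] + rng.normal())
    mse_ar1, mse_rw = rolling_oos_mse(np.array(vals), window=100)
    ```

    Efficiency tests of this kind then ask whether any model beats the benchmark out of sample; under the EMH, none should do so systematically.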

  12. Nonlinearity and intraday efficiency tests on energy futures markets

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Tao [Department of Economics, Queens College and the Graduate Center, The City University of New York, Flushing, NY 11367 (United States); Yang, Jian [The Business School, PO Box 173364, University of Colorado Denver, Denver, CO 80217-3364 (United States)

    2010-03-15

Using high frequency data, this paper for the first time comprehensively examines the intraday efficiency of four major energy (crude oil, heating oil, gasoline, natural gas) futures markets. In contrast to earlier studies, which focus on in-sample evidence and assume linearity, the paper employs various nonlinear models and several model evaluation criteria to examine market efficiency in an out-of-sample forecasting context. Overall, there is evidence for intraday market inefficiency in two of the four energy futures markets (heating oil and natural gas), which exists particularly during the bull market condition but not during the bear market condition. The evidence is also robust against the data-snooping bias and the model overfitting problem, and its economic significance can be very substantial. (author)

  13. Initial Test Bed for Very High Efficiency Solar Cells

    Science.gov (United States)

    2008-05-01

efficiency, both at the solar cell and module levels. The optical system consists of a tiled nonimaging concentrating system, coupled with a spectral... To achieve the benefits of the new photovoltaic system architecture, a new optical element is designed that combines a nonimaging optical... of the power from each solar cell. Optics Design: The most advanced optical design is based on non-symmetric, nonimaging optics, tiled into an

  14. TESTING THE EFFICIENT MARKET HYPOTHESIS ON THE ROMANIAN CAPITAL MARKET

    OpenAIRE

    Daniel Stefan ARMEANU; Sorin-Iulian CIOACA

    2014-01-01

The Efficient Market Hypothesis (EMH) is one of the leading financial concepts that has dominated economic research over the last 50 years, being one of the pillars of modern economic science. This theory, developed by Eugene Fama in the '70s, was a landmark in the development of theoretical concepts and models trying to explain the price evolution of financial assets (considering the common assumptions of the main developed theories), and also for the development of some branches in the f...

  15. Testing the financial market informational efficiency in emerging states

    OpenAIRE

    Camelia Oprean

    2012-01-01

The Efficient Markets Hypothesis (EMH) has been one of the most influential ideas of the past decades and holds that asset prices incorporate all information rationally and instantaneously. The last financial crisis has led to criticism of this hypothesis. Many practical observations concerning investor reactions, as well as the mechanisms by which information is incorporated into stock prices, highlight aspects of 'market inefficiency'. Despite its simplicity, the EMH i...

  16. THEORETICAL AND EXPERIMENTAL STUDIES OF ENERGY-EFFICIENT GRINDING PROCESS OF CEMENT CLINKER IN A BALL MILL

    Directory of Open Access Journals (Sweden)

    Kuznetsova M.M.

    2014-08-01

The article presents results of theoretical and experimental research on the grinding process of bulk materials in a ball mill. A new method for determining the energy-efficient operating mode of ball mills in cement clinker grinding is proposed and experimentally tested.

  17. Fuel Cell Stations Automate Processes, Catalyst Testing

    Science.gov (United States)

    2010-01-01

    Glenn Research Center looks for ways to improve fuel cells, which are an important source of power for space missions, as well as the equipment used to test fuel cells. With Small Business Innovation Research (SBIR) awards from Glenn, Lynntech Inc., of College Station, Texas, addressed a major limitation of fuel cell testing equipment. Five years later, the company obtained a patent and provided the equipment to the commercial world. Now offered through TesSol Inc., of Battle Ground, Washington, the technology is used for fuel cell work, catalyst testing, sensor testing, gas blending, and other applications. It can be found at universities, national laboratories, and businesses around the world.

  18. Sulfide ore looping oxidation: an innovative process that is energy efficient and environmentally friendly

    Energy Technology Data Exchange (ETDEWEB)

    McHugh, L.F.; Balliett, R.; Mozolic, J.A. [Orchard Material Technology, North Andover, MA (United States)

    2008-07-01

    Many sulphide ore processing methods use different types of roasting technologies. These technologies are generally quite effective; however, they entail significant energy use and environmental cost. This paper discussed and validated the use of a two-step looping oxidation process that effectively removes sulphur while producing materials of adequate purity in an energy-efficient and environmentally sound manner. This paper described the process in detail and compared it with existing technologies in terms of energy efficiency and off-gas treatment energy requirements. Validation of the looping oxidation concept was described and the starting chemistries of each chemical were listed. Thermodynamic modeling was used to determine the temperature at which the reaction should begin and to predict the temperature at which the reaction should be complete. The test apparatus and run conditions were also described. It was concluded that there are several critical stages in the looping process where energy recovery is economically attractive and could easily be directed or converted for other plant operations. All reactions were fast and efficient, allowing for reduced equipment size as well as higher throughput rates. 11 refs., 3 tabs., 2 figs.

  19. Leveraging the Cloud for Robust and Efficient Lunar Image Processing

    Science.gov (United States)

    Chang, George; Malhotra, Shan; Wolgast, Paul

    2011-01-01

    The Lunar Mapping and Modeling Project (LMMP) is tasked to aggregate lunar data, from the Apollo era to the latest instruments on the LRO spacecraft, into a central repository accessible by scientists and the general public. A critical function of this task is to provide users with the best solution for browsing the vast amounts of imagery available. The image files LMMP manages range from a few gigabytes to hundreds of gigabytes in size, with new data arriving every day. Despite this ever-increasing amount of data, LMMP must make the data readily available in a timely manner for users to view and analyze. This is accomplished by tiling large images into smaller images using Hadoop, a distributed computing software platform implementation of the MapReduce framework, running on a small cluster of machines locally. Additionally, the software is implemented to use Amazon's Elastic Compute Cloud (EC2) facility. We also developed a hybrid solution to serve images to users by leveraging cloud storage using Amazon's Simple Storage Service (S3) for public data while keeping private information on our own data servers. By using cloud computing, we improve upon our local solution by reducing the need to manage our own hardware and computing infrastructure, thereby reducing costs. Further, by using a hybrid of local and cloud storage, we are able to provide data to our users more efficiently and securely. This paper examines the use of a distributed approach with Hadoop to tile images, an approach that provides significant improvements in image processing time, from hours to minutes. This paper describes the constraints imposed on the solution and the resulting techniques developed for the hybrid solution of a customized Hadoop infrastructure over local and cloud resources in managing this ever-growing data set. It examines the performance trade-offs of using the more plentiful resources of the cloud, such as those provided by S3, against the bandwidth limitations such use
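
    The tiling step described in this record can be illustrated with a small sketch. This is not LMMP's actual Hadoop code; the helper below is hypothetical and only computes the row-major tile boxes that a map task would then cut and scale independently.

```python
def tile_grid(width, height, tile_size):
    """Compute (x, y, w, h) boxes that partition a large image into tiles.

    Edge tiles may be smaller than tile_size. This mirrors the map step of
    a MapReduce tiling job: each box can be processed independently.
    """
    boxes = []
    for y in range(0, height, tile_size):
        for x in range(0, width, tile_size):
            boxes.append((x, y,
                          min(tile_size, width - x),
                          min(tile_size, height - y)))
    return boxes

# A 1000x600 image cut into 256-pixel tiles yields a 4x3 grid of 12 tiles.
print(len(tile_grid(1000, 600, 256)))  # 12
```

    Because the boxes are disjoint and cover the image exactly, they can be distributed across any number of workers, which is what makes the hours-to-minutes speedup reported above possible.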

  20. Functional Testing Protocols for Commercial Building Efficiency Baseline Modeling Software

    Energy Technology Data Exchange (ETDEWEB)

    Jump, David; Price, Phillip N.; Granderson, Jessica; Sohn, Michael

    2013-09-06

    This document describes procedures for testing and validating proprietary baseline energy modeling software accuracy in predicting energy use over the period of interest, such as a month or a year. The procedures are designed according to the methodology used for public domain baselining software in another LBNL report that was (like the present report) prepared for Pacific Gas and Electric Company: "Commercial Building Energy Baseline Modeling Software: Performance Metrics and Method Testing with Open Source Models and Implications for Proprietary Software Testing Protocols" (referred to here as the "Model Analysis Report"). The test procedure focuses on the quality of the software's predictions rather than on the specific algorithms used to predict energy use. In this way the software vendor is not required to divulge or share proprietary information about how their software works, while enabling stakeholders to assess its performance.
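
    A prediction-quality test of this kind scores accuracy metrics over the period of interest. As an illustration only (the report's exact metrics are not reproduced here), two metrics commonly applied to baseline-model predictions, CV(RMSE) and NMBE, can be computed as follows.

```python
import math

def cvrmse_nmbe(measured, predicted):
    """Coefficient of variation of the RMSE and normalized mean bias error,
    two accuracy metrics often used to judge baseline-model predictions
    against measured energy use over a month or a year."""
    n = len(measured)
    mean = sum(measured) / n
    rmse = math.sqrt(sum((m - p) ** 2 for m, p in zip(measured, predicted)) / n)
    nmbe = sum(m - p for m, p in zip(measured, predicted)) / (n * mean)
    return rmse / mean, nmbe
```

    A perfect predictor scores (0, 0); the testing protocol would compare a vendor's scores against agreed thresholds without inspecting the vendor's algorithm.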

  1. A study on the chemical cleaning process and its qualification test by eddy current testing

    Energy Technology Data Exchange (ETDEWEB)

    Shin, Ki Seok; Cheon, Keun Young; Nam, Min Woo [KHNP Central Research Institute, Daejeon (Korea, Republic of); Min, Kyoung Mahn [UMI Inc., Daejeon (Korea, Republic of)

    2013-12-15

    Steam generator (SG) tubes, as the barrier isolating the primary coolant system from the secondary side of a nuclear power plant (NPP), must maintain their structural integrity to ensure public safety and efficient power generation. SG tubes are therefore subject to periodic examination, and to repair if needed, so that no defective tube remains in service. Recently, corrosion-related degradation was detected in the tubes of a domestic OPR-1000 NPP in the form of axially oriented outer-diameter stress corrosion cracking (ODSCC). According to studies on the factors causing heat-transfer fouling and corrosion cracking, densely scaled deposits on the secondary side of the SG tubes are known to be the main problem, adversely affecting the soundness of the tubes. Various cleaning processes have therefore been applied to dissolve and remove the deposits efficiently, and it is imperative that the tubes retain their structural integrity after exposure to the cleaning agent. A qualification test (QT), consisting of applying the cleaning process and then performing eddy current testing (ECT), should be carried out to assess the adequacy of the chemical cleaning. In this paper, the chemical cleaning processes used to dissolve and remove the scaled deposits are introduced, and the results of ECT on artificially cracked specimens, performed to determine the effectiveness of those processes, are presented.

  2. The Warsaw Stock Exchange: A Test of Market Efficiency

    OpenAIRE

    Barry Gordon; Libby Rittenberg

    1995-01-01

    This paper analyzes the behavior of the Warsaw Stock Exchange in light of the efficient market hypothesis (EMH) and alternative models of market inefficiency. Following a brief history of the Warsaw Stock Exchange and a discussion of EMH and the Shiller (1991) critique, the Polish stock market is examined in terms of the extent to which the assumptions of EMH are met and in terms of the actual behavior of stock prices for the period of 1 June 1993 to 27 July 1994. The analysis suggests that E...

  3. Testing Efficiency of the London Metal Exchange: New Evidence

    Directory of Open Access Journals (Sweden)

    Jaehwan Park

    2018-03-01

    Full Text Available This paper explores the market efficiency of the six base metals traded on the LME (London Metal Exchange) using daily data from January 2000 to June 2016. The hypothesis that 3M (3-month) futures prices are unbiased predictors of spot (cash) prices in the LME is rejected based on the false premise that the financialization of commodities has been growing. For the robustness check, monthly data are analyzed using ordinary least squares (OLS) and GARCH(1,1) models. We reject the null hypothesis for all metals except zinc.
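
    The unbiasedness hypothesis is conventionally tested by regressing realized spot prices on the futures prices set three months earlier and checking whether the intercept is 0 and the slope is 1. A minimal sketch of the OLS step (illustrative only; the paper's own estimation details, standard errors, and GARCH specification are not reproduced here):

```python
def ols(x, y):
    """Simple OLS fit y = a + b*x. In an unbiasedness test, y is the realized
    spot (cash) price and x the 3-month futures price observed three months
    earlier; unbiasedness implies a = 0 and b = 1."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx
    return a, b
```

    In practice one would then test a = 0 and b = 1 jointly with their standard errors rather than by inspection.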

  4. Whiteboard Icons to Support the Blood-Test Process in an Emergency Department

    DEFF Research Database (Denmark)

    Torkilsheyggi, Arnvør Martinsdottir á; Hertzum, Morten; From, Gustav

    2013-01-01

    The competent treatment of emergency department (ED) patients requires an effective and efficient process for handling laboratory tests such as blood tests. This study investigates how ED clinicians go about the process, from ordering blood tests to acknowledging their results, and, specifically, assesses the use of whiteboard icons to support this process. On the basis of observation and interviews we find that the blood-test process is intertwined with multiple other temporal patterns in ED work. The whiteboard icons, which indicate four temporally distinct steps in the blood-test process, support the nurses in maintaining the flow of patients through the ED and the physicians in assessing test results at timeouts. The main results of this study are, however, that the blood-test process is temporally and collaboratively complex, that the whiteboard icons pass by most of this complexity...

  5. Efficiency of color vision tests in hereditary dyschromatopsia: case report

    OpenAIRE

    Fernandes, Luciene Chaves; Urbano, Lúcia Carvalho de Ventura

    2008-01-01

    The authors report two cases of hereditary dyschromatopsia and discuss the efficiency of color vision tests in diagnosing a dyschromatopsia. The patients had been rejected in different federal civil-service examinations after being diagnosed with hereditary dyschromatopsia by the Ishihara test. They underwent ophthalmologic examination, with results within normal limits, and sought a new opinion to better characterize their dyschromatopsia. There were no symptoms related to the deficiency. The...

  6. Selection Process for New Windows | Efficient Windows Collaborative

    Science.gov (United States)


  7. Selection Process for Replacement Windows | Efficient Windows Collaborative

    Science.gov (United States)


  8. Efficient process migration in the EMPS multiprocessor system

    NARCIS (Netherlands)

    van Dijk, G.J.W.; Gils, van M.J.

    1992-01-01

    The process migration facility in the Eindhoven multiprocessor system (EMPS) is presented. In the EMPS system, mailboxes are used for interprocess communication. These mailboxes provide transparency of location for communicating processes. The major advantages of mailbox communication in the EMPS

  9. GPC Light Shaper for energy efficient laser materials processing.

    OpenAIRE

    Bañas, Andrew Rafael; Palima, Darwin; Villangca, Mark Jayson; Aabo, Thomas; Glückstad, Jesper

    2014-01-01

    The biggest use of lasers is in materials processing. In manufacturing, lasers are used for cutting, drilling, marking and other machining processes. Similarly, lasers are important in microfabrication processes such as photolithography, direct laser writing, or ablation. Lasers are advantageous because they do not wear out, have no physical contact with the processed material, avoid heating or warping effects, and are generally more precise. Since lasers are easier to adapt to different opti...

  10. Development of a computer program to support an efficient non-regression test of a thermal-hydraulic system code

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jun Yeob; Jeong, Jae Jun [School of Mechanical Engineering, Pusan National University, Busan (Korea, Republic of); Suh, Jae Seung [System Engineering and Technology Co., Daejeon (Korea, Republic of); Kim, Kyung Doo [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-10-15

    During the development of a thermal-hydraulic system code, a non-regression test (NRT) must be performed repeatedly in order to prevent software regression. The NRT process, however, is time-consuming and labor-intensive, so automating it is an ideal solution. In this study, we developed a program to support an efficient NRT for the SPACE code and demonstrated its usability, resulting in a high degree of efficiency for code development. The program was developed using Visual Basic for Applications and designed so that it can be easily customized for the NRT of other computer codes.
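
    At its core, a non-regression test compares a new build's results against stored reference results within a tolerance. The authors' tool is written in Visual Basic for Applications; the sketch below shows only that core comparison, in Python for illustration, with hypothetical variable names.

```python
def non_regression_check(reference, current, rel_tol=1e-6):
    """Compare a new code version's results against stored reference results.

    Both arguments map output-variable names to values. Returns the names
    whose relative deviation exceeds rel_tol (or that are missing), i.e. the
    cases a non-regression test would flag for review.
    """
    failures = []
    for name, ref in reference.items():
        cur = current.get(name)
        if cur is None or abs(cur - ref) > rel_tol * max(abs(ref), 1e-30):
            failures.append(name)
    return failures
```

    Running such a check automatically over every test case in the suite, after every build, is what removes the time-consuming manual comparison step.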

  11. Zero-Release Mixed Waste Process Facility Design and Testing

    International Nuclear Information System (INIS)

    Richard D. Boardman; John A. Deldebbio; Robert J. Kirkham; Martin K. Clemens; Robert Geosits; Ping Wan

    2004-01-01

    A zero-release off-gas cleaning system for mixed-waste thermal treatment processes has been evaluated through experimental scoping tests and process modeling. The principles can possibly be adapted to a fluidized-bed calcination or steam reforming process, a waste melter, a rotary kiln process, and possibly other waste treatment thermal processes. The basic concept of a zero-release off-gas cleaning system is to recycle the bulk of the off-gas stream to the thermal treatment process. A slip stream is taken off the off-gas recycle to separate and purge benign constituents that may build up in the gas, such as water vapor, argon, nitrogen, and CO2. Contaminants are separated from the slip stream and returned to the thermal unit for eventual destruction or incorporation into the waste immobilization media. In the current study, a standard packed-bed scrubber, followed by gas separation membranes, is proposed for removal of contaminants from the off-gas recycle slipstream. The scrub solution is continuously regenerated by cooling and precipitating sulfate, nitrate, and other salts that reach a solubility limit in the scrub solution. Mercury is also separated by the scrubber. A miscible chemical oxidizing agent was shown to effectively oxidize mercury and also NO, thus increasing their removal efficiency. The current study indicates that the proposed process is a viable option for reducing off-gas emissions. Consideration of the proposed closed-system off-gas cleaning loop is warranted when emissions limits are stringent, or when a reduction in the total gas emissions volume is desired. Although the current closed loop appears to be technically feasible, economic considerations must also be evaluated on a case-by-case basis

  12. Nacelle Chine Installation Based on Wind-Tunnel Test Using Efficient Global Optimization

    Science.gov (United States)

    Kanazaki, Masahiro; Yokokawa, Yuzuru; Murayama, Mitsuhiro; Ito, Takeshi; Jeong, Shinkyu; Yamamoto, Kazuomi

    Design exploration of a nacelle chine installation was carried out. The nacelle chine improves stall performance when deploying multi-element high-lift devices. This study proposes an efficient design process using a Kriging surrogate model to determine the nacelle chine installation point in wind-tunnel tests. The design exploration was conducted in a wind tunnel using the JAXA high-lift aircraft model at the JAXA Large-scale Low-speed Wind Tunnel. The objective was to maximize the maximum lift. The chine installation points were designed on the engine nacelle in the axial and chord-wise directions, while the geometry of the chine was fixed. In the design process, efficient global optimization (EGO), which combines a Kriging model and a genetic algorithm (GA), was employed. This method makes it possible both to improve the accuracy of the response surface and to explore the global optimum efficiently. Detailed observations of flowfields using the Particle Image Velocimetry method confirmed the chine effect and design results.
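
    EGO selects the next test point by maximizing an acquisition function over the Kriging model, classically the expected improvement (EI). The sketch below gives the standard EI formula for a maximization problem such as maximum lift; it is the textbook form of EGO, not code taken from this paper.

```python
import math

def expected_improvement(mu, sigma, best):
    """Expected improvement for maximization: how much a candidate point with
    Kriging prediction mu and predictive standard deviation sigma is expected
    to exceed the current best observed objective value."""
    if sigma <= 0.0:
        return max(mu - best, 0.0)
    z = (mu - best) / sigma
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    return (mu - best) * cdf + sigma * pdf
```

    Because EI is large both where the predicted mean is high and where the model is uncertain, maximizing it (here via the GA) balances exploiting the response surface against refining it, which is what lets EGO find a good chine location in few wind-tunnel runs.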

  13. Novel efficient process for methanol synthesis by CO2 hydrogenation

    NARCIS (Netherlands)

    Kiss, Anton Alexandru; Pragt, J.J.; Vos, H.J.; Bargeman, Gerrald; de Groot, M.T.

    2016-01-01

    Methanol is an alternative fuel that offers a convenient solution for efficient energy storage. Complementary to carbon capture activities, significant effort is devoted to the development of technologies for methanol synthesis by hydrogenation of carbon dioxide. While CO2 is available from plenty

  14. An Integrated Refurbishment Design Process to Energy Efficiency

    NARCIS (Netherlands)

    Konstantinou, T.; Knaack, U.

    2013-01-01

    Given the very low renewal rate of the building stock, the efforts to reduce energy demand must focus on the existing residential buildings. Even though awareness has been raised, the effect on energy efficiency is often neglected during the design phase of refurbishment projects. This paper

  15. A novel eco-friendly technique for efficient control of lime water softening process.

    Science.gov (United States)

    Ostovar, Mohamad; Amiri, Mohamad

    2013-12-01

    Lime softening is an established type of water treatment used for water softening. The performance of this process is highly dependent on lime dosage. Currently, lime dosage is adjusted manually based on chemical tests, aimed at maintaining the phenolphthalein (P) and total (M) alkalinities within a certain range (2P - M ≥ 5). In this paper, a critical study of the softening process is presented, and it is shown that the current method is frequently incorrect. Furthermore, electrical conductivity (EC) is introduced as a novel indicator for effectively characterizing the lime softening process. This novel technique has several advantages over the current alkalinities method. Because EC measurement is a simple test that needs no chemical reagents for titration, test costs are considerably reduced. There is also a reduction in the treated water hardness and in the sludge generated during the lime softening process. The technique is therefore highly eco-friendly and a very cost-effective alternative for efficient control of the lime softening process.
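
    The conventional control rule quoted in this record can be stated directly in code. A trivial sketch (alkalinity units assumed to be mg/L as CaCO3, which the abstract does not specify):

```python
def alkalinity_rule_ok(P, M):
    """Conventional lime-dosage control rule from the abstract: keep
    2P - M at or above 5, where P is the phenolphthalein alkalinity
    and M the total alkalinity of the treated water."""
    return 2 * P - M >= 5
```

    The paper's point is that this rule is frequently wrong in practice, which is why a continuously measurable indicator such as EC is attractive.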

  16. Efficient processing of two-dimensional arrays with C or C++

    Science.gov (United States)

    Donato, David I.

    2017-07-20

    Because fast and efficient serial processing of raster-graphic images and other two-dimensional arrays is a requirement in land-change modeling and other applications, the effects of 10 factors on the runtimes for processing two-dimensional arrays with C and C++ are evaluated in a comparative factorial study. This study's factors include the choice among three C or C++ source-code techniques for array processing; the choice of Microsoft Windows 7 or a Linux operating system; the choice of 4-byte or 8-byte array elements and indexes; and the choice of 32-bit or 64-bit memory addressing. This study demonstrates how programmer choices can reduce runtimes by 75 percent or more, even after compiler optimizations. Ten points of practical advice for faster processing of two-dimensional arrays are offered to C and C++ programmers. Further study and the development of a C and C++ software test suite are recommended. Keywords: array processing, C, C++, compiler, computational speed, land-change modeling, raster-graphic image, two-dimensional array, software efficiency
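
    One source-code technique such studies commonly compare is storing the 2-D array in a single contiguous buffer and computing row-major indexes by hand, which keeps serial traversal cache-friendly; whether it is among this report's three techniques is an assumption here. The layout is shown in Python for brevity, though the study itself concerns C and C++.

```python
def make_flat(nrows, ncols, fill=0):
    """Allocate a 2-D array as one contiguous row-major buffer, the layout
    that favors cache-friendly serial traversal in C or C++."""
    return [fill] * (nrows * ncols)

def get(a, ncols, i, j):
    # Row-major address arithmetic: element (i, j) lives at i*ncols + j.
    return a[i * ncols + j]

def put(a, ncols, i, j, v):
    a[i * ncols + j] = v
```

    In C the same idea is `a[i * ncols + j]` on a single `malloc`'d block instead of an array of row pointers, so the inner loop touches memory sequentially.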

  17. 10 CFR 431.444 - Test procedures for the measurement of energy efficiency.

    Science.gov (United States)

    2010-01-01

    Test procedures for the measurement of energy efficiency. 431.444 Section 431.444 Energy DEPARTMENT OF ENERGY ENERGY CONSERVATION ENERGY EFFICIENCY PROGRAM FOR... procedures for the measurement of energy efficiency. (a) Scope. Pursuant to section 346(b)(1) of EPCA, this...

  18. Forest Seed Collection, Processing,and Testing

    DEFF Research Database (Denmark)

    Schmidt, Lars Holger

    2016-01-01

    This chapter pertains to the techniques of capturing the best genetic quality seeds a seed source can produce at the optimal time of high physiological maturity and maintaining these qualities throughout the handling processes, all at a minimum cost. Different collection and processing techniques apply to different species, seed types, situations, and purposes. Yet the collection and processing toolbox contains a number of "standard" methods for most of these groups. Records and documentation help in evaluating "best practice" for future method improvement, and in linking offspring to seed source. Conditions for short- and long-term seed storage are set by the seeds' inherent storability physiology. The potential storage life of seed may for some robust "orthodox" species be several decades, while no available storage conditions can maintain viability for sensitive "recalcitrant" seed. Seed...

  19. Image Segmentation and Processing for Efficient Parking Space Analysis

    OpenAIRE

    Tutika, Chetan Sai; Vallapaneni, Charan; R, Karthik; KP, Bharath; Muthu, N Ruban Rajesh Kumar

    2018-01-01

    In this paper, we develop a method to detect vacant parking spaces in an environment with unclear segments and contours with the help of MATLAB image processing capabilities. Due to the anomalies present in parking spaces, such as uneven illumination, distorted slot lines and overlapping cars, present-day conventional algorithms have difficulty processing the image for accurate results. The algorithm proposed uses a combination of image pre-processing and false contour detection ...

  20. Efficiency of alfalfa seed processing with different seed purity

    OpenAIRE

    Đokić, Dragoslav; Stanisavljević, Rade; Terzić, Dragan; Milenković, Jasmina; Radivojević, Gordana; Koprivica, Ranko; Štrbanović, Ratibor

    2015-01-01

    This work analyzes the impact of the initial purity of raw alfalfa seed on the amount of seed obtained in processing. Alfalfa is a very important perennial forage legume used for fodder and seed production. Alfalfa seed production can achieve high yields and very good financial results. To obtain seed material with good characteristics, complex machines for cleaning and sorting seeds are used. In the processing center of the Institute for forage crop...

  1. Efficient testing of the homogeneity, scale parameters and number of components in the Rayleigh mixture

    International Nuclear Information System (INIS)

    Stehlik, M.; Ososkov, G.A.

    2003-01-01

    The statistical problem of expanding the experimental distribution of transverse momenta into Rayleigh distributions is considered. A highly efficient procedure for testing the hypothesis of homogeneity of the observed measurements, optimal in the sense of Bahadur, is constructed. The exact likelihood ratio (LR) test of the scale parameter of the Rayleigh distribution is proposed for cases when the hypothesis of homogeneity holds. Otherwise, an efficient procedure for testing the number of components in the mixture is also proposed.
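
    For the scale-parameter case, the Rayleigh likelihood has a closed-form maximum, and an LR statistic follows directly. The sketch below is the standard construction under textbook assumptions, not the paper's exact (exact-distribution) test.

```python
import math

def rayleigh_loglik(data, sigma):
    """Log-likelihood of a Rayleigh sample: f(x; sigma) = (x/sigma^2) exp(-x^2 / (2 sigma^2))."""
    s2 = sigma * sigma
    return sum(math.log(x) - math.log(s2) - x * x / (2.0 * s2) for x in data)

def lr_statistic(data, sigma0):
    """-2 log likelihood ratio for H0: sigma = sigma0 against the Rayleigh MLE
    sigma_hat^2 = sum(x^2) / (2n); under H0 it is asymptotically chi-square
    with one degree of freedom."""
    n = len(data)
    sigma_hat = math.sqrt(sum(x * x for x in data) / (2.0 * n))
    return -2.0 * (rayleigh_loglik(data, sigma0) - rayleigh_loglik(data, sigma_hat))
```

    The statistic is zero at the MLE and grows as the hypothesized scale moves away from it, so large values reject the hypothesized scale parameter.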

  2. Operator-based linearization for efficient modeling of geothermal processes

    OpenAIRE

    Khait, M.; Voskov, D.V.

    2018-01-01

    Numerical simulation is one of the most important tools required for financial and operational management of geothermal reservoirs. The modern geothermal industry is challenged to run large ensembles of numerical models for uncertainty analysis, causing simulation performance to become a critical issue. Geothermal reservoir modeling requires the solution of governing equations describing the conservation of mass and energy. The robust, accurate and computationally efficient implementation of ...

  3. Cool metal roofing tested for energy efficiency and sustainability

    Energy Technology Data Exchange (ETDEWEB)

    Miller, W.A.; Desjarlais, A. [Oak Ridge National Laboratory, Oakridge, TN (United States); Parker, D.S. [Florida Solar Energy Center, Cocoa, FL (United States); Kriner, S. [Metal Construction Association, Glenview, IL (United States)

    2004-07-01

    A 3 year field study was conducted in which temperature, heat flow, reflectance and emittance field data were measured for 12 different painted and unpainted metal roofs exposed to weathering at an outdoor test facility at Oak Ridge National Laboratory in Oakridge, Tennessee. In addition, the Florida Solar Energy Center tested several Habitat for Humanity homes during one summer in Fort Myers, Florida. The objective was to determine how cooling and heating energy loads in a building are affected by the solar reflectance and infrared emittance of metal roofs. The Habitat for Humanity houses had different roofing systems which reduced the attic heat gain. White reflective roofs were shown to reduce cooling energy needs by 18 to 26 per cent and peak demand by 28 to 35 per cent. High solar reflectance and high infrared emittance roofs incur surface temperatures that are only about 3 degrees C warmer than the ambient air temperature, whereas a dark absorptive roof exceeds the ambient air temperature by more than 40 degrees C. In hot climates, a high solar reflectance and high infrared emittance roof can reduce the air conditioning load and reduce peak energy demands on the utility. It was concluded that an informed decision on the roof surface properties of reflectance and emittance can significantly reduce energy costs for homeowners and builders in hot climates. 7 refs., 2 tabs., 7 figs.

  4. Test operation of the uranium ore processing pilot plant and uranium conversion plant

    International Nuclear Information System (INIS)

    Suh, I.S.; Lee, K.I.; Whang, S.T.; Kang, Y.H.; Lee, C.W.; Chu, J.O.; Lee, I.H.; Park, S.C.

    1983-01-01

    To verify the acid leaching process of the Uranium Ore Processing Pilot Plant, the KAERI team performed the test operation in cooperation with the COGEMA engineers. The operation was successful, achieving a uranium leaching efficiency of 95%. After completing the guarantee test, a continuous test operation was carried out to reconfirm the reproducibility of the results and to check the function of every unit of the pilot plant while feeding low-grade domestic ore. The consistency of the facility was confirmed: the uranium is easily dissolved out of the ore in the temperature range of 60 degC to 70 degC with two hours of sulfuric acid leaching, and a leaching efficiency of 92% to 95% was obtained. The uranium recovery efficiencies for the extraction and stripping processes reached 99% and 99.6%, respectively. As an alternative process for separating solids from the ore pulp, four counter-current decanters connected in series replaced the belt filter; these had not been tested during the guarantee operation. It was found that the washing efficiency of the ore pulp in each decanter test increased in proportion to the quantity of washing water. Washing efficiencies of 95%, 85% and 83% were obtained for water-to-ore ratios of 3:1, 2:1 and 1.5:1, respectively. (Author)

  5. MARKET INFORMATIONAL EFFICIENCY TESTS AND ITS CRITICS: THE CASE OF EMERGENT CAPITAL MARKETS

    OpenAIRE

    OPREAN Camelia; BRATIAN Vasile

    2012-01-01

    Efficient Market Hypothesis (EMH) has attracted a considerable number of studies in empirical finance, particularly in determining the market efficiency of an emerging financial market. Conflicting and inconclusive outcomes have been generated by various existing studies in EMH. In addition, efficiency tests in the emerging financial markets are rarely definitive in reaching a conclusion about the issue. The paper proposes a critical analysis regarding the testing methods of the informational...

  6. New efficient hydrogen process production from organosilane hydrogen carriers derivatives

    Energy Technology Data Exchange (ETDEWEB)

    Brunel, Jean Michel [Unite URMITE, UMR 6236 CNRS, Faculte de Medecine et de Pharmacie, Universite de la Mediterranee, 27 boulevard Jean Moulin, 13385 Marseille 05 (France)

    2010-04-15

    While the source of hydrogen constitutes a significant scientific challenge, addressing issues of hydrogen storage, transport, and delivery is equally important. None of the current hydrogen storage options (liquefied or high-pressure H2 gas, metal hydrides, etc.) satisfy criteria of size, cost, kinetics, and safety for use in transportation. In this context, we have discovered a methodology for the production of hydrogen on demand, in high yield, under kinetic control, from organosilane hydrogen-carrier derivatives with methanol as co-reagent under mild conditions, catalyzed by a cheap ammonium fluoride salt. Finally, the silicon by-products can be efficiently recycled, leading to an environmentally friendly source of energy. (author)

  7. Efficient accesses of data structures using processing near memory

    Science.gov (United States)

    Jayasena, Nuwan S.; Zhang, Dong Ping; Diez, Paula Aguilera

    2018-05-22

    Systems, apparatuses, and methods for implementing efficient queues and other data structures. A queue may be shared among multiple processors and/or threads without using explicit software atomic instructions to coordinate access to the queue. System software may allocate an atomic queue and corresponding queue metadata in system memory and return, to the requesting thread, a handle referencing the queue metadata. Any number of threads may utilize the handle for accessing the atomic queue. The logic for ensuring the atomicity of accesses to the atomic queue may reside in a management unit in the memory controller coupled to the memory where the atomic queue is allocated.

  8. Waste plastics as supplemental fuel in the blast furnace process: improving combustion efficiencies.

    Science.gov (United States)

    Kim, Dongsu; Shin, Sunghye; Sohn, Seungman; Choi, Jinshik; Ban, Bongchan

    2002-10-14

    The possibility of using waste plastics as a source of secondary fuel in a blast furnace has been of recent interest. The success of this process, however, will be critically dependent upon the optimization of operating systems. For instance, the supply of waste plastics must be reliable as well as economically attractive compared with conventional secondary fuels such as heavy oil, natural gas and pulverized coal. In this work, we placed special importance on improving the combustibility of waste plastics as a way to enhance energy efficiency in a blast furnace. As experimental variables to approach this target, the effects of plastic particle size, blast temperature, and the level of oxygen enrichment were investigated using a custom-made blast model designed to simulate a real furnace. Lastly, the combustion efficiency of a mixture of waste plastics and pulverized coal was tested. The observations made from these experiments led us to the conclusion that with the increase of both blast temperature and the level of oxygen enrichment, and with a decrease in particle size, the combustibility of waste polyethylene could be improved at a given distance from the tuyere. It was also found that the efficiency of coal combustion decreased with the addition of plastics; however, the combustion efficiency of the mixture could be comparable at a longer distance from the tuyere.

  9. Laser processes and system technology for the production of high-efficient crystalline solar cells

    Science.gov (United States)

    Mayerhofer, R.; Hendel, R.; Zhu, Wenjie; Geiger, S.

    2012-10-01

    The laser as an industrial tool is an essential part of today's solar cell production. Due to the on-going efforts in the solar industry, to increase the cell efficiency, more and more laser-based processes, which have been discussed and tested at lab-scale for many years, are now being implemented in mass production lines. In order to cope with throughput requirements, standard laser concepts have to be improved continuously with respect to available average power levels, repetition rates or beam profile. Some of the laser concepts, that showed high potential in the past couple of years, will be substituted by other, more economic laser types. Furthermore, requirements for processing with less-heat affected zones fuel the development of industry-ready ultra short pulsed lasers with pulse widths even below the picosecond range. In 2011, the German Ministry of Education and Research (BMBF) had launched the program "PV-Innovation Alliance", with the aim to support the rapid transfer of high-efficiency processes out of development departments and research institutes into solar cell production lines. Here, lasers play an important role as production tools, allowing the fast implementation of high-performance solar cell concepts. We will report on the results achieved within the joint project FUTUREFAB, where efficiency optimization, throughput enhancement and cost reduction are the main goals. Here, the presentation will focus on laser processes like selective emitter doping and ablation of dielectric layers. An indispensable part of the efforts towards cost reduction in solar cell production is the improvement of wafer handling and throughput capabilities of the laser processing system. Therefore, the presentation will also elaborate on new developments in the design of complete production machines.

  10. Eco-efficient and high performance hard chrome process

    Energy Technology Data Exchange (ETDEWEB)

    Negre, P. [Protection des Metaux, Montreuil (France)

    2001-07-01

Clean manufacturing processes that can minimize effects on the environment are discussed using electrolytic hard chromium-plating as an example. The goal of the research proposal described in this paper is to develop processes based on a new, non-toxic electrolytic solution that would allow harder, thicker chromium coatings that are more resistant to corrosion than traditional coatings, without the carcinogenic effects of hexavalent chromium. The process, as envisaged, will combine the advantages of chromium plating processes based on non-toxic compounds already available, the result primarily of the chemical formulation of the solutions for chromium plating on which the microstructure and the capacity for structural hardening of the coatings depend. The paper provides an overview of the project, estimates personnel and time requirements (225 person-years and 3.5 years) and provides a list of participating partners.

  11. The Haber Process Made Efficient by Hydroxylated Graphene

    OpenAIRE

    Chaban, Vitaly; Prezhdo, Oleg

    2016-01-01

    The Haber-Bosch process is the main industrial method for producing ammonia from diatomic nitrogen and hydrogen. Very demanding energetically, it uses an iron catalyst, and requires high temperature and pressure. Any improvement of the Haber process will have an extreme scientific and economic impact. We report a significant increase of ammonia production using hydroxylated graphene. Exploiting the polarity difference between N2/H2 and NH3, as well as the universal proton acceptor behavior of...

  12. Precision Scaling of Neural Networks for Efficient Audio Processing

    OpenAIRE

    Ko, Jong Hwan; Fromm, Josh; Philipose, Matthai; Tashev, Ivan; Zarar, Shuayb

    2017-01-01

    While deep neural networks have shown powerful performance in many audio applications, their large computation and memory demand has been a challenge for real-time processing. In this paper, we study the impact of scaling the precision of neural networks on the performance of two common audio processing tasks, namely, voice-activity detection and single-channel speech enhancement. We determine the optimal pair of weight/neuron bit precision by exploring its impact on both the performance and ...
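
Precision scaling of this kind typically quantizes weights (and activations) to a reduced bit width. A minimal sketch of uniform symmetric quantization, assuming a simple max-abs scaling scheme; the paper's exact scheme and the bit widths it selects may differ:

```python
def quantize(weights, bits):
    """Uniform symmetric quantization to a signed integer grid of the
    given bit width, then mapped back to floats for comparison."""
    levels = 2 ** (bits - 1) - 1                    # e.g. 127 for 8-bit
    scale = max(abs(w) for w in weights) / levels   # max-abs scaling (an assumption)
    return [round(w / scale) * scale for w in weights]

# Made-up weights: the error introduced grows as precision shrinks
weights = [0.82, -0.41, 0.05, -0.77, 0.33]
for bits in (8, 4, 2):
    q = quantize(weights, bits)
    err = max(abs(w - v) for w, v in zip(weights, q))
    print(bits, round(err, 4))
```

Exploring this trade-off per layer (or per task, as in voice-activity detection versus speech enhancement) is what yields the "optimal pair" of weight/neuron bit precisions.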

  13. A new approach to correlate transport processes and optical efficiency in GaN-based LEDs

    International Nuclear Information System (INIS)

    Pavesi, M; Manfredi, M; Rossi, F; Salviati, G; Meneghini, M; Zanoni, E

    2009-01-01

Carrier injection and non-radiative processes are determinants of the optical efficiency of InGaN/GaN LEDs. Among transport mechanisms, tunnelling is crucial for device functioning, but other contributions can become decisive as the bias varies. It is not easy to identify the weights and roles of these terms by a simple current-voltage characterization, so a careful investigation by means of complementary experimental techniques is needed. The correlation between luminescence and microscopic transport processes in InGaN/GaN LEDs has been investigated by means of a set of techniques: electroluminescence, cathodoluminescence, current-voltage dc measurements and thermal admittance spectroscopy. Green and blue LEDs, designed with a multi-quantum-well injector layer and an optically active single quantum well, have been tested. They showed distinctive current and temperature dependences of the optical efficiency, with a better performance at room temperature observed for green devices. This was discussed in terms of the carrier injection efficiency controlled by electrically active traps. The comparative analysis of the optical and electrical experimental data serves as a methodological approach to correlate the emission properties with the carrier injection mechanisms and to improve the functionality of a large number of quantum well heterostructures for lighting applications.

  14. Feed Forward Artificial Neural Network Model to Estimate the TPH Removal Efficiency in Soil Washing Process

    Directory of Open Access Journals (Sweden)

    Hossein Jafari Mansoorian

    2017-01-01

Full Text Available Background & Aims of the Study: A feed forward artificial neural network (FFANN) was developed to predict the efficiency of total petroleum hydrocarbon (TPH) removal from a contaminated soil, using a soil washing process with Tween 80. The main objective of this study was to assess the performance of the developed FFANN model for the estimation of TPH removal. Materials and Methods: Several independent regressors, including pH, shaking speed, surfactant concentration and contact time, were used to describe the removal of TPH as a dependent variable in a FFANN model. Approximately 85% of the data set observations were used for training the model and the remaining 15% were used for model testing. The performance of the model was compared with linear regression and assessed using Root Mean Square Error (RMSE) as the goodness-of-fit measure. Results: For the prediction of TPH removal efficiency, a FFANN model with a 4-3-1 layer structure and a learning rate of 0.01 showed the best predictive results. The RMSE and R2 for the training and testing steps of the model were 2.596, 0.966, 10.70 and 0.78, respectively. Conclusion: About 80% of the TPH removal efficiency can be described by the assessed regressors in the developed model. Thus, focusing on the optimization of the soil washing process with respect to shaking speed, contact time, surfactant concentration and pH can improve the TPH removal performance from polluted soils. The results of this study could be the basis for the application of FFANN for the assessment of the soil washing process and the control of petroleum hydrocarbon emission into the environment.
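
A 4-3-1 network of the kind described (four inputs for pH, shaking speed, surfactant concentration and contact time; three hidden units; one output for % TPH removal) can be sketched as a forward pass plus the RMSE criterion. The weights and the single data pair below are made up for illustration; the study's trained parameters are not reproduced:

```python
import math, random

random.seed(0)

def forward(x, W1, b1, W2, b2):
    """One pass through a 4-3-1 network: 4 inputs, 3 sigmoid hidden
    units, 1 linear output."""
    h = [1 / (1 + math.exp(-(sum(w * xi for w, xi in zip(row, x)) + b)))
         for row, b in zip(W1, b1)]
    return sum(w * hi for w, hi in zip(W2, h)) + b2

def rmse(pairs, params):
    """Root Mean Square Error, the goodness-of-fit measure used above."""
    return math.sqrt(sum((forward(x, *params) - y) ** 2 for x, y in pairs) / len(pairs))

# Hypothetical untrained weights and one made-up (inputs, % removal) observation
W1 = [[random.uniform(-1, 1) for _ in range(4)] for _ in range(3)]
b1 = [0.0, 0.0, 0.0]
W2 = [random.uniform(-1, 1) for _ in range(3)]
b2 = 0.0
data = [([0.5, 0.2, 0.8, 0.3], 72.0)]   # pH, speed, surfactant, time (scaled 0-1)
print(rmse(data, (W1, b1, W2, b2)))
```

Training would adjust `W1`, `b1`, `W2`, `b2` (e.g. by backpropagation with the reported learning rate of 0.01) to drive this RMSE down on the 85% training split.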

  15. Multiple Monte Carlo Testing with Applications in Spatial Point Processes

    DEFF Research Database (Denmark)

    Mrkvička, Tomáš; Myllymäki, Mari; Hahn, Ute

with a function as the test statistic, 3) several Monte Carlo tests with functions as test statistics. The rank test has correct (global) type I error in each case and it is accompanied with a p-value and with a graphical interpretation which shows which subtest or which distances of the used test function(s) lead to the rejection at the prescribed significance level of the test. Examples of null hypotheses from point process and random set statistics are used to demonstrate the strength of the rank envelope test. The examples include goodness-of-fit test with several test functions, goodness-of-fit test...
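
The basic ingredient of such procedures is the Monte Carlo p-value: the rank of the observed statistic among statistics computed from simulations of the null model. A minimal sketch, with a function-valued sample reduced to one common scalar statistic (maximum absolute deviation); the rank envelope test of the record above generalizes this to whole functions with a graphical interpretation, which this sketch does not reproduce:

```python
import random
random.seed(1)

def mc_pvalue(t_obs, t_sims):
    """Standard Monte Carlo p-value: rank of the observed statistic
    among the simulated null statistics (one-sided, large = extreme)."""
    return (1 + sum(1 for t in t_sims if t >= t_obs)) / (1 + len(t_sims))

def stat(curve):
    # Reduce a function (here: 20 values) to a scalar test statistic.
    return max(abs(v) for v in curve)

# 99 curves simulated under a standard-normal null model
sims = [[random.gauss(0, 1) for _ in range(20)] for _ in range(99)]
# A made-up observed curve that clearly violates the null
obs = [5.0] * 20
p = mc_pvalue(stat(obs), [stat(c) for c in sims])
print(p)  # small: the observed curve is more extreme than the simulations
```

With several test functions (several subtests), the multiple-testing problem the record addresses is exactly that these p-values must be combined with correct global type I error.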

  16. Cibachrome testing. [photographic processing and printing materials

    Science.gov (United States)

    Weinstein, M. S.

    1974-01-01

The use of Cibachrome products as a solution to problems encountered when contact printing Kodak film type SO-397 onto Kodak Ektachrome color reversal paper type 1993 is investigated. A roll of aerial imagery consisting of Kodak film types SO-397 and 2443 was contact printed onto Cibachrome and Kodak materials and compared in terms of color quality, resolution, cost, and compatibility with existing equipment and techniques. Objective measurements are given in terms of resolution and sensitometric response. Comparison prints and transparencies were viewed and ranked according to overall quality and aesthetic appeal. It is recommended that Cibachrome Print material be used in place of Kodak Ektachrome paper because it is more easily processed, the cost is equivalent, and it provides improved resolution, color quality, and image fade resistance.

  17. ISOLOK VALVE ACCEPTANCE TESTING FOR DWPF SME SAMPLING PROCESS

    Energy Technology Data Exchange (ETDEWEB)

    Edwards, T.; Hera, K.; Coleman, C.; Jones, M.; Wiedenman, B.

    2011-12-05

Evaluation of the Defense Waste Processing Facility (DWPF) Chemical Process Cell (CPC) cycle time identified several opportunities to improve the CPC processing time. Of the opportunities, a focus area related to optimizing the equipment and efficiency of the sample turnaround time for the DWPF Analytical Laboratory was identified. The Mechanical Systems & Custom Equipment Development (MS&CED) Section of the Savannah River National Laboratory (SRNL) evaluated the possibility of using an Isolok® sampling valve as an alternative to the Hydragard® valve for taking process samples. Previous viability testing was conducted with favorable results using the Isolok sampler and reported in SRNL-STI-2010-00749 (1). This task has the potential to improve operability, reduce maintenance time and decrease CPC cycle time. This report summarizes the results from acceptance testing which was requested in Task Technical Request (TTR) HLW-DWPF-TTR-2010-0036 (2) and which was conducted as outlined in Task Technical and Quality Assurance Plan (TTQAP) SRNL-RP-2011-00145 (3). The Isolok to be tested is the same model which was tested, qualified, and installed in the Sludge Receipt Adjustment Tank (SRAT) sample system. RW-0333P QA requirements apply to this task. This task was to qualify the Isolok sampler for use in the DWPF Slurry Mix Evaporator (SME) sampling process. The Hydragard, which is the current baseline sampling method, was used for comparison to the Isolok sampling data. The Isolok sampler is an air-powered grab sampler used to 'pull' a sample volume from a process line. The operation of the sampler is shown in Figure 1. The image on the left shows the Isolok's spool extended into the process line and the image on the right shows the sampler retracted and then dispensing the liquid into the sampling container. To determine tank homogeneity, a Coliwasa sampler was used to grab samples at a high and low location within the mixing tank. Data from

  18. Enzymatic biodiesel synthesis. Key factors affecting efficiency of the process

    Energy Technology Data Exchange (ETDEWEB)

    Szczesna Antczak, Miroslawa; Kubiak, Aneta; Antczak, Tadeusz; Bielecki, Stanislaw [Institute of Technical Biochemistry, Faculty of Biotechnology and Food Sciences, Technical University of Lodz, Stefanowskiego 4/10, 90-924 Lodz (Poland)

    2009-05-15

Chemical processes of biodiesel production are energy-consuming and generate undesirable by-products such as soaps and polymeric pigments that retard the separation of pure methyl or ethyl esters of fatty acids from glycerol and di- and monoacylglycerols. Enzymatic, lipase-catalyzed biodiesel synthesis has no such drawbacks. Comprehension of the latter process and appreciable progress in the production of robust preparations of lipases may soon result in the replacement of chemical catalysts with enzymes in biodiesel synthesis. Engineering of enzymatic biodiesel synthesis processes requires optimization of such factors as: molar ratio of substrates (triacylglycerols: alcohol), temperature, type of organic solvent (if any) and water activity. All of them are correlated with the properties of the lipase preparation. This paper reports on the interplay between the crucial parameters of lipase-catalyzed reactions carried out in non-aqueous systems and the yield of biodiesel synthesis. (author)

  19. Energy efficient solvent regeneration process for carbon dioxide capture

    Science.gov (United States)

    Zhou, Shaojun; Meyer, Howard S.; Li, Shiguang

    2018-02-27

    A process for removing carbon dioxide from a carbon dioxide-loaded solvent uses two stages of flash apparatus. Carbon dioxide is flashed from the solvent at a higher temperature and pressure in the first stage, and a lower temperature and pressure in the second stage, and is fed to a multi-stage compression train for high pressure liquefaction. Because some of the carbon dioxide fed to the compression train is already under pressure, less energy is required to further compress the carbon dioxide to a liquid state, compared to conventional processes.

  20. Computationally efficient algorithms for statistical image processing : implementation in R

    NARCIS (Netherlands)

    Langovoy, M.; Wittich, O.

    2010-01-01

In the series of our earlier papers on the subject, we proposed a novel statistical hypothesis testing method for detection of objects in noisy images. The method uses results from percolation theory and random graph theory. We developed algorithms that allowed us to detect objects of unknown shapes in

  1. Auditory Processing Testing: In the Booth versus Outside the Booth.

    Science.gov (United States)

    Lucker, Jay R

    2017-09-01

Many audiologists believe that auditory processing testing must be carried out in a soundproof booth. This expectation is especially a problem in places such as elementary schools. Research comparing pure-tone thresholds obtained in sound booths with those obtained in quiet test environments outside of these booths does not support that belief. Auditory processing testing is generally carried out at above-threshold levels, and therefore may be even less likely to require a soundproof booth. The present study was carried out to compare test results in soundproof booths versus quiet rooms. The purpose of this study was to determine whether auditory processing tests can be administered in a quiet test room rather than in a soundproof test suite. The outcomes indicate that audiologists can provide auditory processing testing for children under various test conditions, including quiet rooms at their schools. A battery of auditory processing tests was administered at a test level equivalent to 50 dB HL through headphones. The same equipment was used for testing in both locations. Twenty participants identified with normal hearing were included in this study, ten having no auditory processing concerns and ten exhibiting auditory processing problems. All participants underwent a battery of tests, both inside the test booth and outside the booth in a quiet room. Order of testing (inside versus outside) was counterbalanced. Participants were first determined to have normal hearing thresholds for tones and speech. Auditory processing tests were recorded and presented from an HP EliteBook laptop computer with noise-canceling headphones attached to a y-cord that not only presented the test stimuli to the participants but also allowed monitor headphones to be worn by the evaluator. The same equipment was used inside as well as outside the booth. No differences were found for each auditory processing measure as a function of the test setting or the order in which testing was done.

  2. Predictability of Exchange Rates in Sri Lanka: A Test of the Efficient Market Hypothesis

    OpenAIRE

    Guneratne B Wickremasinghe

    2007-01-01

    This study examined the validity of the weak and semi-strong forms of the efficient market hypothesis (EMH) for the foreign exchange market of Sri Lanka. Monthly exchange rates for four currencies during the floating exchange rate regime were used in the empirical tests. Using a battery of tests, empirical results indicate that the current values of the four exchange rates can be predicted from their past values. Further, the tests of semi-strong form efficiency indicate that exchange rate pa...

  3. Eco-efficient butanol separation in the ABE fermentation process

    NARCIS (Netherlands)

    Patraşcu, Iulian; Bîldea, Costin Sorin; Kiss, Anton A.

    2017-01-01

    Butanol is considered a superior biofuel, as it is more energy dense and less hygroscopic than the more popular ethanol, resulting in higher possible blending ratios with gasoline. However, the production cost of the acetone-butanol-ethanol (ABE) fermentation process is still high, mainly due to the

  4. Nitrogen Processing Efficiency of an Upper Mississippi River Backwater Lake

    Science.gov (United States)

    2006-08-01

with denitrification accounting for ~32 percent. By subtraction, assimilation (bacteria, periphyton, phytoplankton, and macrophyte uptake) must...later release and processing or transport. Phytoplankton and periphyton can assimilate considerable nitrate for growth in the absence of ammonia in...over the relatively short period of this study. As with phytoplankton and periphyton biomass, the macrophyte N pool is subject to N transformation

5. High Efficiency Pneumatic Systems Compressors Hydrodynamics and Thermodynamics Process Research

    Directory of Open Access Journals (Sweden)

    Paulius Bogdevičius

    2016-12-01

Full Text Available The paper analyzes a pneumatic system consisting of three piston compressors, pipes and a receiver. A mathematical model of a two-cylinder piston compressor driven by an asynchronous electric motor was developed. The model accounts for the rod mechanism geometry and kinematic parameters, as well as the hydrodynamic and thermodynamic processes taking place in the cylinders. A mathematical experiment was also carried out and its results are presented.

  6. A new efficient statistical test for detecting variability in the gene expression data.

    Science.gov (United States)

    Mathur, Sunil; Dolo, Samuel

    2008-08-01

DNA microarray technology allows researchers to monitor the expressions of thousands of genes under different conditions. The detection of differential gene expression under two different conditions is very important in microarray studies. Microarray experiments are multi-step procedures and each step is a potential source of variance. This makes the measurement of variability difficult, because an approach based on gene-by-gene estimation of variance will have few degrees of freedom. It is highly possible that the assumption of equal variance for all the expression levels may not hold. Also, the assumption of normality of gene expressions may not hold. Thus it is essential to have a statistical procedure which is not based on the normality assumption and which can detect genes with differential variance efficiently. The detection of differential gene expression variance will allow us to identify experimental variables that affect different biological processes and the accuracy of DNA microarray measurements. In this article, a new nonparametric test for scale is developed based on the arctangent of the ratio of two expression levels. Most of the tests available in the literature require the assumption of a normal distribution, which makes them inapplicable in many situations, and it is also hard to verify the suitability of the normal distribution assumption for the given data set. The proposed test does not require an assumption about the distribution of the underlying population, which makes it more practical and widely applicable. The asymptotic relative efficiency is calculated under different distributions, which shows that the proposed test is very powerful when the assumption of normality breaks down. Monte Carlo simulation studies are performed to compare the power of the proposed test with some of the existing procedures. It is found that the proposed test is more powerful than commonly used tests under almost all the distributions considered in the study.
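
The arctangent-of-ratio idea can be illustrated with a toy Monte Carlo comparison. This sketch only shows why the angle of paired expression levels is sensitive to a scale difference; the paper's exact statistic, its ranking scheme, and its distribution-free null calibration are not reproduced here, and all numbers below are made up:

```python
import math, random
random.seed(2)

def angle_spread(xs, ys):
    # Mean absolute deviation of the ratio angles atan2(y, x) around
    # their median; a scale difference between the samples widens it.
    angles = [math.atan2(y, x) for x, y in zip(xs, ys)]
    med = sorted(angles)[len(angles) // 2]
    return sum(abs(t - med) for t in angles) / len(angles)

def draw(sd):
    # Toy "expression levels": same mean, scale controlled by sd
    return [random.gauss(10, sd) for _ in range(50)]

# Observed pair: condition B has inflated variance (differential variability)
t_obs = angle_spread(draw(1), draw(4))

# Monte Carlo null: both conditions share the same scale
null = [angle_spread(draw(1), draw(1)) for _ in range(200)]
p = (1 + sum(1 for t in null if t >= t_obs)) / (1 + len(null))
print(p)  # small: the scale difference is detected
```

Here the null distribution is simulated parametrically for brevity; the test in the record achieves the same calibration nonparametrically.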

  7. Reliable and Efficient Parallel Processing Algorithms and Architectures for Modern Signal Processing. Ph.D. Thesis

    Science.gov (United States)

    Liu, Kuojuey Ray

    1990-01-01

Least-squares (LS) estimations and spectral decomposition algorithms constitute the heart of modern signal processing and communication problems. Implementations of recursive LS and spectral decomposition algorithms onto parallel processing architectures such as systolic arrays with efficient fault-tolerant schemes are the major concerns of this dissertation. There are four major results in this dissertation. First, we propose the systolic block Householder transformation with application to the recursive least-squares minimization. It is successfully implemented on a systolic array with a two-level pipelined implementation at the vector level as well as at the word level. Second, a real-time algorithm-based concurrent error detection scheme based on the residual method is proposed for the QRD RLS systolic array. The fault diagnosis, order degraded reconfiguration, and performance analysis are also considered. Third, the dynamic range, stability, error detection capability under finite-precision implementation, order degraded performance, and residual estimation under faulty situations for the QRD RLS systolic array are studied in detail. Finally, we propose the use of multi-phase systolic algorithms for spectral decomposition based on the QR algorithm. Two systolic architectures, one based on a triangular array and another based on a rectangular array, are presented for the multi-phase operations with fault-tolerant considerations. Eigenvectors and singular vectors can be easily obtained by using the multi-phase operations. Performance issues are also considered.

  8. [Assessment of the efficiency of the auditory training in children with dyslalia and auditory processing disorders].

    Science.gov (United States)

    Włodarczyk, Elżbieta; Szkiełkowska, Agata; Skarżyński, Henryk; Piłka, Adam

    2011-01-01

To assess the effectiveness of auditory training in children with dyslalia and central auditory processing disorders. The material consisted of 50 children aged 7-9 years. Children with articulation disorders stayed under long-term speech therapy care in the Auditory and Phoniatrics Clinic. All children were examined by a laryngologist and a phoniatrician. Assessment included tonal and impedance audiometry and speech therapists' and psychologist's consultations. Additionally, a set of electrophysiological examinations was performed: registration of the N2, P2 and P300 waves and a psychoacoustic test of central auditory functions, the frequency pattern test (FPT). Next, the children took part in regular auditory training and attended speech therapy. Speech assessment followed treatment and therapy; again, psychoacoustic tests were performed and P300 cortical potentials were recorded. After that, statistical analyses were performed. The analyses revealed that the application of auditory training in patients with dyslalia and other central auditory disorders is very efficient. Auditory training may be a very efficient therapy supporting speech therapy in children suffering from dyslalia coexisting with articulation and central auditory disorders and in children with educational problems of audiogenic origin. Copyright © 2011 Polish Otolaryngology Society. Published by Elsevier Urban & Partner (Poland). All rights reserved.

  9. Dynamic CT perfusion image data compression for efficient parallel processing.

    Science.gov (United States)

    Barros, Renan Sales; Olabarriaga, Silvia Delgado; Borst, Jordi; van Walderveen, Marianne A A; Posthuma, Jorrit S; Streekstra, Geert J; van Herk, Marcel; Majoie, Charles B L M; Marquering, Henk A

    2016-03-01

    The increasing size of medical imaging data, in particular time series such as CT perfusion (CTP), requires new and fast approaches to deliver timely results for acute care. Cloud architectures based on graphics processing units (GPUs) can provide the processing capacity required for delivering fast results. However, the size of CTP datasets makes transfers to cloud infrastructures time-consuming and therefore not suitable in acute situations. To reduce this transfer time, this work proposes a fast and lossless compression algorithm for CTP data. The algorithm exploits redundancies in the temporal dimension and keeps random read-only access to the image elements directly from the compressed data on the GPU. To the best of our knowledge, this is the first work to present a GPU-ready method for medical image compression with random access to the image elements from the compressed data.
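
One way to exploit temporal redundancy while keeping element-wise random access, in the spirit of the scheme above, is to store a baseline frame plus narrow per-element differences. This is a hedged sketch, not the paper's algorithm: a real lossless codec needs a fallback for residuals that do not fit the narrow type, and the paper's method runs on the GPU:

```python
import array

def compress(frames):
    """Keep the first frame verbatim and store later frames as
    per-element differences from it. CTP intensities change little
    over time, so the differences fit one signed byte in this toy
    example, halving storage versus 16-bit values, and every element
    stays O(1)-addressable directly in the compressed form."""
    base = array.array('h', frames[0])                 # 16-bit baseline frame
    deltas = [array.array('b', (v - b for v, b in zip(f, base)))
              for f in frames[1:]]                     # 8-bit residuals
    return base, deltas

def read(compressed, t, i):
    # Random read of element i at time t with no decompression pass.
    base, deltas = compressed
    return base[i] if t == 0 else base[i] + deltas[t - 1][i]

frames = [[1000, 1020, 980], [1003, 1018, 985], [1010, 1015, 990]]
c = compress(frames)
print(read(c, 2, 0))  # 1010: the original value, recovered exactly
```

Because each element is reconstructed from exactly two array lookups, a GPU kernel can read the compressed data in place, which is the property the record emphasizes.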

  10. Multi states electromechanical switch for energy efficient parallel data processing

    KAUST Repository

    Kloub, Hussam

    2011-04-01

    We present a design, simulation results and fabrication of electromechanical switches enabling parallel data processing and multi functionality. The device is applied in logic gates AND, NOR, XNOR, and Flip-Flops. The device footprint size is 2μm by 0.5μm, and has a pull-in voltage of 5.15V which is verified by FEM simulation. © 2011 IEEE.

  11. Multi states electromechanical switch for energy efficient parallel data processing

    KAUST Repository

    Kloub, Hussam; Smith, Casey; Hussain, Muhammad Mustafa

    2011-01-01

    We present a design, simulation results and fabrication of electromechanical switches enabling parallel data processing and multi functionality. The device is applied in logic gates AND, NOR, XNOR, and Flip-Flops. The device footprint size is 2μm by 0.5μm, and has a pull-in voltage of 5.15V which is verified by FEM simulation. © 2011 IEEE.

  12. Consistent and efficient processing of ADCP streamflow measurements

    Science.gov (United States)

    Mueller, David S.; Constantinescu, George; Garcia, Marcelo H.; Hanes, Dan

    2016-01-01

The use of Acoustic Doppler Current Profilers (ADCPs) from a moving boat is a commonly used method for measuring streamflow. Currently, the algorithms used to compute the average depth, compute edge discharge, identify invalid data, and estimate velocity and discharge for invalid data vary among manufacturers. These differences could result in different discharges being computed from identical data. A consistent computational algorithm, automated filtering, and quality assessment of ADCP streamflow measurements that are independent of the ADCP manufacturer are being developed in a software program that can process ADCP moving-boat discharge measurements independent of the ADCP used to collect the data.

  13. Gaining efficiency by centralising the corporate business resiliency process.

    Science.gov (United States)

    Martinez, Robert

    2017-06-01

    Organisations have compiled many business continuity plans over the years in response to uncontrollable events and natural disasters. As the types of threats increase, even more plans are being created. Unfortunately, many corporations do not communicate the existence of these various plans outside of their centre of excellence. Creating a centralised oversight of your business resiliency process brings many benefits, including greater awareness, a bigger pool of expertise, common terminology and reducing the chances of redundant efforts. Having an overarching corporate response plan in place makes it possible to have high-level leadership trained and ready in case an extreme event occurs.

  14. Testing the Efficiency of the Foreign Exchange Spot Market in Iran

    OpenAIRE

    Borhan-Azad, Lida

    2006-01-01

    This dissertation aimed at testing the efficiency of the foreign exchange market of Iran in the weak and semi-strong form using data on the black market spot exchange rates between Iranian currency (i.e., Rial) and four major foreign currencies including US Dollar, German Mark/Euro, UK Pound and Japanese Yen. The weak form efficiency is examined by unit root tests including Augmented Dickey-Fuller (1979, 1981) (ADF) test and Phillips-Perron (1988) (PP) test. The results of these tests are con...

  15. An Iterative Procedure for Efficient Testing of B2B: A Case in Messaging Service Tests

    Energy Technology Data Exchange (ETDEWEB)

    Kulvatunyou, Boonserm [ORNL

    2007-03-01

Testing is a necessary step in systems integration. Testing in the context of inter-enterprise, business-to-business (B2B) integration is more difficult and expensive than intra-enterprise integration. Traditionally, the difficulty is alleviated by conducting the testing in two stages: conformance testing and then interoperability testing. In conformance testing, systems are tested independently against a reference system. In interoperability testing, they are tested simultaneously against one another. In the traditional approach, these two stages are performed sequentially with little feedback between them. In addition, test results and test traces are left only to human analysis or even discarded if the solution passes the test. This paper proposes an approach where test results and traces from both the conformance and interoperability tests are analyzed for potential interoperability issues; conformance test cases are then derived from the analysis. The result is that more interoperability issues can be resolved in the lower-cost conformance testing mode; consequently, the time and cost required for achieving interoperable solutions are reduced.

  16. Effective motion design applied to energy-efficient handling processes

    Energy Technology Data Exchange (ETDEWEB)

    Brett, Tobias

    2013-10-01

Industrial robots are available in a large variety of mechanical alternatives regarding size, motor power, link length ratio or payload. The four major types of serial kinematics dominating the market are complemented by various parallel kinematics for special purposes. In contrast, few other path planning alternatives are applied in industrial robotics, which are based on similar analytic solution principles. The objective of this thesis is to develop a systematic design method for artifacts in motion, integrating motion design and mechanical design to enable new processes for production. For each design, a theoretical benchmark is developed which cannot be attained by conventional robots in principle. A key performance indicator makes it possible to measure the degree of goal achievement towards the benchmark during all design phases. Motion behaviors are identified on a local level by dynamic systems modeling and are integrated into new global behavior featuring a new quality, suitable for exceeding the design benchmark in industrial processes. Two exemplary handling robot designs are presented. The first concept enables motion behavior that consumes less electrical power than the kinetic energy transferred to and from its payload during motion. The second concept enables motion with four degrees of freedom by single-motor stimulation, reducing idle power consumption by a factor of 4 compared to conventional robots.

  17. Test of Weak Form Efficiency: An Empirical Study of the Malaysian Foreign Exchange Market

    OpenAIRE

    Lim, Pei Mun

    2011-01-01

    This paper empirically tests the Efficient Market Hypothesis (EMH) in the weak sense for the Malaysian foreign exchange market. The hypothesis is tested in two ways: first, by testing the random walk hypothesis with individual unit root tests, and second, by testing the profitability of simple technical trading rules. The study covers high-frequency daily data from January 1997 to December 2010, and the spot exchange rates are quoted as Malaysian Ringgit per unit of US Dollar. Due...
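
    The random-walk side of such a weak-form test can be illustrated with a simple lag-1 autocorrelation check on returns, a stand-in for the paper's unit root tests, which require more machinery. The data below are simulated, not the actual Ringgit/USD series:

```python
import math
import random

def lag1_autocorrelation(returns):
    """Sample lag-1 autocorrelation of a return series."""
    n = len(returns)
    mean = sum(returns) / n
    var = sum((r - mean) ** 2 for r in returns)
    cov = sum((returns[i] - mean) * (returns[i - 1] - mean) for i in range(1, n))
    return cov / var

def is_weak_form_efficient(returns, z_crit=1.96):
    """Under the random-walk null, rho_1 is roughly N(0, 1/n); fail to reject if |z| <= z_crit."""
    rho = lag1_autocorrelation(returns)
    z = rho * math.sqrt(len(returns))
    return abs(z) <= z_crit, rho, z

# Simulated random-walk exchange rate: i.i.d. log-returns.
random.seed(42)
returns = [random.gauss(0.0, 0.005) for _ in range(2000)]
efficient, rho, z = is_weak_form_efficient(returns)
print(efficient, round(rho, 4))
```

A strongly mean-reverting series (e.g. strictly alternating returns) would be flagged as inefficient by the same check.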

  18. Milk pasteurization: efficiency of the HTST process according to its bacterial concentration

    Directory of Open Access Journals (Sweden)

    Ari Ajzental

    1993-12-01

    Full Text Available The efficiency of milk pasteurization (HTST) in relation to standard plate count (SPC) values was assessed in 41 milk samples using laboratory-designed pasteurizing equipment. Based on the results, it is demonstrated that the efficiency of the process is affected by the bacterial concentration of the milk: lower SPC values correspond to a decrease in efficiency, while the performance of the process is not affected by high SPC values in the raw product.

  19. Thermally Activated Delayed Fluorescence in Polymers: A New Route toward Highly Efficient Solution Processable OLEDs.

    Science.gov (United States)

    Nikolaenko, Andrey E; Cass, Michael; Bourcet, Florence; Mohamad, David; Roberts, Matthew

    2015-11-25

    Efficient intermonomer thermally activated delayed fluorescence is demonstrated for the first time, opening a new route to achieving high-efficiency solution-processable polymer light-emitting device materials. External quantum efficiency (EQE) of up to 10% is achieved in a simple, fully solution-processed device structure, and routes for further EQE improvement are identified. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  20. Process kinetics and digestion efficiency of anaerobic batch fermentation of brewer's spent grains (BSG)

    Energy Technology Data Exchange (ETDEWEB)

    Ezeonu, F.C.; Okaka, A.N.C. [Nnamdi Azikiwe University, Awka (Nigeria). Dept. of Applied Biochemistry

    1996-12-31

    The process kinetics of optimized anaerobic batch digestion of brewer's spent grains (BSG) reveal that biomethanation is essentially a first-order reaction interrupted intermittently by mixed-order reactions. An apparent cellulose degradation efficiency of approximately 60% and a lignin degradation efficiency of about 40% were observed in the optimized process. Using the Chen and Hashimoto model, the operational efficiency of the digester was determined to be 26%. (author)
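
    A first-order biomethanation model and the degradation-efficiency figures in the abstract reduce to simple formulas. This is a minimal sketch; the yield, rate constant and masses below are hypothetical placeholders, not the paper's data:

```python
import math

def first_order_yield(t, b_max, k):
    """Cumulative methane yield under first-order kinetics: B(t) = B_max * (1 - exp(-k * t))."""
    return b_max * (1.0 - math.exp(-k * t))

def degradation_efficiency(initial, residual):
    """Fraction of a substrate (e.g. cellulose) removed during digestion."""
    return (initial - residual) / initial

# Hypothetical figures: 100 g cellulose in, 40 g left, i.e. 60% removal,
# matching the magnitude reported in the abstract.
eff = degradation_efficiency(100.0, 40.0)
print(round(eff, 2))  # → 0.6
```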

  1. Testing the Efficient Markets Hypothesis on the Romanian Capital Market

    Directory of Open Access Journals (Sweden)

    Dragoș Mînjină

    2013-11-01

    Full Text Available The informational efficiency of capital markets has been the subject of numerous empirical studies. Intensive research in the field is justified by the important practical implications of knowing a market's level of informational efficiency. Empirical studies that have tested the efficient markets hypothesis on the Romanian capital market have mostly revealed that this market is not characterised by the weak form of the efficient markets hypothesis. However, recent empirical studies have obtained results consistent with the weak form of the efficient markets hypothesis. The present decline of the Romanian capital market, recorded against the background of adverse economic developments at home and abroad, will be an important test of whether these recent positive developments in informational efficiency continue.

  2. Spacecraft Testing Programs: Adding Value to the Systems Engineering Process

    Science.gov (United States)

    Britton, Keith J.; Schaible, Dawn M.

    2011-01-01

    Testing has long been recognized as a critical component of spacecraft development activities - yet many major systems failures may have been prevented with more rigorous testing programs. The question is why is more testing not being conducted? Given unlimited resources, more testing would likely be included in a spacecraft development program. Striking the right balance between too much testing and not enough has been a long-term challenge for many industries. The objective of this paper is to discuss some of the barriers, enablers, and best practices for developing and sustaining a strong test program and testing team. This paper will also explore the testing decision factors used by managers; the varying attitudes toward testing; methods to develop strong test engineers; and the influence of behavior, culture and processes on testing programs. KEY WORDS: Risk, Integration and Test, Validation, Verification, Test Program Development

  3. A Framework for Building Efficient Environmental Permitting Processes

    Directory of Open Access Journals (Sweden)

    Nicola Ulibarri

    2017-01-01

    Full Text Available Despite its importance as a tool for protecting air and water quality, and for mitigating impacts to protected species and ecosystems, the environmental permitting process is widely recognized to be inefficient and marked by delays. This article draws on a literature review and interviews with permitting practitioners to identify factors that contribute to delayed permit decisions. The sociopolitical context, projects that are complex or use novel technology, a fragmented and bureaucratic regulatory regime, serial permit applications and reviews, and applicant and permitting agency knowledge and resources each contribute to permitting inefficiency when they foster uncertainty, increase transaction costs, and allow divergent interests to multiply, yet remain unresolved. We then use the interviews to consider the potential of a collaborative dialogue between permitting agencies and applicants to mitigate these challenges, and argue that collaboration is well positioned to lessen permitting inefficiency.

  4. A Fast and Efficient Dehydration Process for Waste Drilling Slurry

    Directory of Open Access Journals (Sweden)

    Zheng Guo

    2017-01-01

    Full Text Available In this article, the slurry system was converted from fluid to colloid through the colloidization of high-viscosity polymer coagulants. The solid-liquid separation of the waste slurry was realized by a process of chemical colloidal gel breaking, coagulation and acidification gel-out. In addition, the surface morphology of the slurry cake was investigated using a field emission scanning electron microscope (FE-SEM). The results indicate that the mud separation performance depends on the type of flocculant and gel breaker used. The solid content of the mud cake increases from 40.5% to 77.5% when A-PA and H20 are employed as the flocculant and gel breaker, at dosages of 0.4 g and 0.5 g, respectively.

  5. Efficient "Myopic" Asset Pricing in General Equilibrium: A Potential Pitfall in Excess Volatility Tests

    OpenAIRE

    Willem H. Buiter

    1987-01-01

    Excess volatility tests for financial market efficiency maintain the hypothesis of risk-neutrality. This permits the specification of the benchmark efficient market price as the present discounted value of expected future dividends. By departing from the risk-neutrality assumption in a stripped-down version of Lucas's general equilibrium asset pricing model, I show that asset prices determined in a competitive asset market and efficient by construction can nevertheless violate the variance bo...

  6. Analysis of Power Transfer Efficiency of Standard Integrated Circuit Immunity Test Methods

    Directory of Open Access Journals (Sweden)

    Hai Au Huynh

    2015-01-01

    Full Text Available Direct power injection (DPI) and bulk current injection (BCI) methods are defined in IEC 62132-3 and IEC 62132-4 as electromagnetic immunity test methods for integrated circuits (ICs). The forward power measured at the RF noise generator when the IC malfunctions is used as the measure of the immunity level of the IC. However, the actual power that causes failure in ICs differs from the forward power measured at the noise source. Power transfer efficiency is used as a measure of the power loss of the noise injection path. In this paper, the power transfer efficiencies of the DPI and BCI methods are derived and validated experimentally with an immunity test setup of a clock divider IC. Power transfer efficiency varies significantly over the frequency range as a function of the test method used and the IC input impedance. For the frequency range of 15 kHz to 1 GHz, the power transfer efficiency of the BCI test was consistently higher than that of the DPI test. In the DPI test, power transfer efficiency is particularly low in the lower test frequency range up to 10 MHz. When performing IC immunity tests following the standards, these characteristics of the test methods need to be considered.
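
    The distinction between forward power and the power actually delivered to the IC can be captured with two small helpers. The figures are hypothetical, and the actual standards derive efficiency from calibrated measurements of the injection path, so this is only an illustration of the ratio being discussed:

```python
def transfer_efficiency(p_forward_w, p_delivered_w):
    """Power transfer efficiency of the injection path, in percent."""
    return 100.0 * p_delivered_w / p_forward_w

def reflection_loss(p_forward_w, p_reflected_w):
    """Power actually entering the injection path after reflection at the injection point."""
    return p_forward_w - p_reflected_w

# Hypothetical DPI measurement at a low test frequency: 1 W forward, 0.85 W reflected,
# so only a small fraction of the forward power ever reaches the IC pin.
entering = reflection_loss(1.0, 0.85)
print(round(transfer_efficiency(1.0, entering), 2))  # → 15.0 (percent)
```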

  7. The Design and Development of Test Platform for Wheat Precision Seeding Based on Image Processing Techniques

    OpenAIRE

    Li , Qing; Lin , Haibo; Xiu , Yu-Feng; Wang , Ruixue; Yi , Chuijie

    2009-01-01

    International audience; The test platform of wheat precision seeding based on image processing techniques is designed to develop the wheat precision seed metering device with high efficiency and precision. Using image processing techniques, this platform gathers images of seeds (wheat) on the conveyer belt which are falling from seed metering device. Then these data are processed and analyzed to calculate the qualified rate, reseeding rate and leakage sowing rate, etc. This paper introduces t...
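
    The qualified, reseeding and leakage-sowing rates mentioned above follow from simple spacing thresholds once seed positions have been extracted from the images. The thresholds (0.5x and 1.5x nominal spacing) are a common precision-seeding convention assumed here, and the data are invented:

```python
def seeding_rates(spacings_mm, nominal_mm):
    """Classify measured seed spacings into qualified, reseeding and leakage rates.

    Assumed convention: spacing <= 0.5x nominal counts as reseeding (a double),
    spacing > 1.5x nominal counts as leakage sowing (a miss).
    """
    n = len(spacings_mm)
    reseed = sum(1 for s in spacings_mm if s <= 0.5 * nominal_mm)
    leak = sum(1 for s in spacings_mm if s > 1.5 * nominal_mm)
    qualified = n - reseed - leak
    return qualified / n, reseed / n, leak / n

# Hypothetical spacings (mm) extracted from conveyor-belt images; nominal spacing 20 mm.
q, r, l = seeding_rates([19, 21, 8, 20, 35, 22, 18, 20, 9, 21], 20)
print(q, r, l)  # → 0.7 0.2 0.1
```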

  8. Early language processing efficiency predicts later receptive vocabulary outcomes in children born preterm.

    Science.gov (United States)

    Marchman, Virginia A; Adams, Katherine A; Loi, Elizabeth C; Fernald, Anne; Feldman, Heidi M

    2016-01-01

    As rates of prematurity continue to rise, identifying which preterm children are at increased risk for learning disabilities is a public health imperative. Identifying continuities between early and later skills in this vulnerable population can also illuminate fundamental neuropsychological processes that support learning in all children. At 18 months adjusted age, we used socioeconomic status (SES), medical variables, parent-reported vocabulary, scores on the Bayley Scales of Infant and Toddler Development (third edition) language composite, and children's lexical processing speed in the looking-while-listening (LWL) task as predictor variables in a sample of 30 preterm children. Receptive vocabulary as measured by the Peabody Picture Vocabulary Test (fourth edition) at 36 months was the outcome. Receptive vocabulary was correlated with SES, but uncorrelated with degree of prematurity or a composite of medical risk. Importantly, lexical processing speed was the strongest predictor of receptive vocabulary (r = -.81), accounting for 30% unique variance. Individual differences in lexical processing efficiency may be able to serve as a marker for information processing skills that are critical for language learning.

  9. Standardization on the specification, test and evaluation of high efficiency motors and inverters

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Kil Yong [Korea Electric Power Corp. (KEPCO), Taejon (Korea, Republic of). Research Center; Hyun, Chang Soon [Korea Academy of Industrial Technology, Incheon (Korea, Republic of)

    1995-12-31

    Most of the power systems energy is consumed by electrical motors. This report proposes a method for the standardization on the specification, test and evaluation of the high efficiency motors and related inverters. The results of this report can be referred to the rebate program for promoting the use of high efficiency motors and inverters (author). 26 refs., 102 figs.

  10. 76 FR 63211 - Energy Efficiency Program: Test Procedures for Residential Water Heaters, Direct Heating...

    Science.gov (United States)

    2011-10-12

    ... DEPARTMENT OF ENERGY 10 CFR Part 430 [Docket Number EERE-2011-BT-TP-0042] RIN 1904-AC53 Energy Efficiency Program: Test Procedures for Residential Water Heaters, Direct Heating Equipment, and Pool Heaters AGENCY: Office of Energy Efficiency and Renewable Energy, Department of Energy. ACTION: Request for...

  11. Reshaping Text Data for Efficient Processing on Amazon EC2

    Directory of Open Access Journals (Sweden)

    Gabriela Turcu

    2011-01-01

    Full Text Available Text analysis tools are nowadays required to process increasingly large corpora which are often organized as small files (abstracts, news articles, etc.). Cloud computing offers a convenient, on-demand, pay-as-you-go computing environment for solving such problems. We investigate provisioning on the Amazon EC2 cloud from the user perspective, attempting to provide a scheduling strategy that is both timely and cost effective. We derive an execution plan using an empirically determined application performance model. A first goal of our performance measurements is to determine an optimal file size for our application to consume. Using the subset-sum first fit heuristic we reshape the input data by merging files in order to match as closely as possible the desired file size. This also speeds up the task of retrieving the results of our application, by having the output be less segmented. Using predictions of the performance of our application based on measurements on small data sets, we devise an execution plan that meets a user specified deadline while minimizing cost.
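
    The subset-sum first-fit merging step described above can be sketched in a few lines. The file sizes and the 100 MB target are hypothetical:

```python
def first_fit_merge(file_sizes, target):
    """Greedy first-fit: pack small files into bundles whose total size
    does not exceed the target, approximating the desired chunk size.

    A sketch of the abstract's subset-sum first-fit idea; real provisioning
    would also weigh transfer costs and EC2 instance pricing.
    """
    bundles = []  # each bundle is a list of file sizes to be merged into one chunk
    for size in sorted(file_sizes, reverse=True):
        for bundle in bundles:
            if sum(bundle) + size <= target:
                bundle.append(size)
                break
        else:
            bundles.append([size])  # no existing bundle fits: open a new one
    return bundles

# Merge small text files toward a 100 MB target chunk.
sizes_mb = [70, 40, 30, 25, 20, 10, 5]
bundles = first_fit_merge(sizes_mb, 100)
print(len(bundles), [sum(b) for b in bundles])  # → 2 [100, 100]
```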

  12. Optimal energy efficiency policies and regulatory demand-side management tests: How well do they match?

    International Nuclear Information System (INIS)

    Brennan, Timothy J.

    2010-01-01

    Under conventional models, subsidizing energy efficiency requires electricity to be priced below marginal cost. Its benefits increase when electricity prices increase to finance the subsidy. With high prices, subsidies are counterproductive unless consumers fail to make efficiency investments when private benefits exceed costs. If the gain from adopting efficiency is only reduced electricity spending, capping revenues from energy sales may induce a utility to substitute efficiency for generation when the former is less costly. This goes beyond standard 'decoupling' of distribution revenues from sales, requiring complex energy price regulation. The models' results are used to evaluate tests in the 2002 California Standard Practice Manual for assessing demand-side management programs. Its 'Ratepayer Impact Measure' test best conforms to the condition that electricity price is too low. Its 'Total Resource Cost' and 'Societal Cost' tests resemble the condition for expanded decoupling. No test incorporates optimality conditions apart from consumer choice failure.
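
    The Ratepayer Impact Measure and Total Resource Cost tests discussed above are, at their core, benefit-cost ratios that differ in which costs they count. The sketch below uses their standard textbook form with made-up present-value figures, illustrating how one program can pass TRC while failing RIM:

```python
def rim_ratio(avoided_supply_cost, program_cost, lost_revenue):
    """Ratepayer Impact Measure: passes (> 1) when avoided supply costs cover
    both the utility's program cost and the revenue lost from reduced sales."""
    return avoided_supply_cost / (program_cost + lost_revenue)

def trc_ratio(avoided_supply_cost, program_cost, participant_cost):
    """Total Resource Cost test: ignores revenue shifts, comparing avoided
    supply costs with the total resources spent by utility and participants."""
    return avoided_supply_cost / (program_cost + participant_cost)

# Stylized program (present values, $M): the same program passes TRC but fails RIM,
# reflecting the different efficiency conditions the two tests embody.
print(round(rim_ratio(10.0, 2.0, 9.0), 2), round(trc_ratio(10.0, 2.0, 4.0), 2))  # → 0.91 1.67
```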

  13. Correction in the efficiency of uranium purification process by solvent extraction

    International Nuclear Information System (INIS)

    Franca Junior, J.M.

    1981-01-01

    A uranium solvent-extraction process achieving high purification, which takes full advantage of the uranium absorbed at the beginning of the process, is described. By including a pulsed column, called the correction column, the efficiency of the whole process is increased, dispensing with the recycling of uranium losses from the leaching column. With the correction column, the uranium losses pass continuously to the reextraction column, increasing the efficiency of the process. The purified uranium is removed in the reextraction column in the aqueous phase. The correction process can be carried out with full efficiency using pulsed columns or chemical mixer-settlers. (M.C.K.) [pt

  14. Errors in the Total Testing Process in the Clinical Chemistry ...

    African Journals Online (AJOL)

    2018-03-01

    Mar 1, 2018 ... Analytical errors related to internal and external quality control exceeding the target range, (14.4%) ... indicators to assess errors in the total testing process. The. University ... Evidence showed that the risk of .... Data management and quality control: Pre-test ..... indicators and specifications for key processes.

  15. Proton Testing of Advanced Stellar Compass Digital Processing Unit

    DEFF Research Database (Denmark)

    Thuesen, Gøsta; Denver, Troelz; Jørgensen, Finn E

    1999-01-01

    The Advanced Stellar Compass Digital Processing Unit was radiation tested with 300 MeV protons at the Proton Irradiation Facility (PIF), Paul Scherrer Institute, Switzerland.

  16. Test process for the safety-critical embedded software

    International Nuclear Information System (INIS)

    Sung, Ahyoung; Choi, Byoungju; Lee, Jangsoo

    2004-01-01

    Digitalization of nuclear Instrumentation and Control (I and C) systems requires high reliability of not only hardware but also software. A Verification and Validation (V and V) process is recommended for software reliability, but a more quantitative method such as software testing is also necessary. Most of the software in nuclear I and C systems is safety-critical embedded software, which is specified, verified and developed according to the V and V process. Hence, two types of software testing techniques are necessary for the developed code. First, code-based software testing is required to examine the developed code. Second, after code-based software testing, testing of the software as affected by hardware is required to reveal interaction faults that may cause unexpected results. We call this testing of hardware's influence on software interaction testing. In the case of safety-critical embedded software, it is important to consider the interaction between hardware and software: even if no faults are detected when testing either hardware or software alone, combining these components may lead to unexpected results due to the interaction. In this paper, we propose a software test process that embraces test levels, test techniques, and the required test tasks and documents for safety-critical embedded software. We apply the proposed test process to safety-critical embedded software as a case study and show its effectiveness. (author)

  17. Application Research on Testing Efficiency of Main Drainage Pump in Coal Mine Using Thermodynamic Theories

    OpenAIRE

    Shang, Deyong

    2017-01-01

    The efficiency of a drainage pump should be tested at regular intervals to track the status of the pump in real time and thus achieve the goal of saving energy. The ultrasonic flowmeter method is traditionally used to measure the flow of the pump, but this method has some drawbacks in underground coal mines. This paper first introduces the principle of testing the efficiency of a coal mine's main drainage pump using thermodynamic theories, then analyzes the energy transfor...
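
    The thermodynamic method infers pump efficiency from the temperature rise of the water rather than from a flow measurement. A minimal sketch of the underlying energy balance, with hypothetical head and temperature figures and standard constants, neglecting the correction terms a full treatment would include:

```python
def pump_efficiency_thermodynamic(head_m, delta_t_k, g=9.81, c_w=4186.0):
    """Thermodynamic (temperature-rise) estimate of pump efficiency.

    Simplified energy balance: pump losses appear as heat c_w * dT in the water,
    while useful work per kg is g * H. Neglects heat exchange with surroundings
    and pressure-dependent enthalpy terms.
    """
    useful = g * head_m        # J/kg of useful lifting work
    losses = c_w * delta_t_k   # J/kg dissipated as heat in the water
    return useful / (useful + losses)

# Hypothetical mine drainage pump: 300 m head, 0.17 K measured temperature rise.
eta = pump_efficiency_thermodynamic(300.0, 0.17)
print(round(eta, 3))
```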

  18. Design of Test Parts to Characterize Micro Additive Manufacturing Processes

    DEFF Research Database (Denmark)

    Thompson, Mary Kathryn; Mischkot, Michael

    2015-01-01

    The minimum feature size and obtainable tolerances of additive manufacturing processes are linked to the smallest volumetric elements (voxels) that can be created. This work presents the iterative design of a test part to investigate the resolution of AM processes with voxel sizes at the micro scale. Each design iteration reduces the test part size, increases the number of test features, improves functionality, and decreases coupling in the part. The final design is a set of three test parts that are easy to orient and measure, and that provide useful information about micro additive manufacturing processes.

  19. The cost efficiency of exploratory testing: ISTQB certified testing compared with RST

    OpenAIRE

    Niittyviita, S. (Sampo)

    2017-01-01

    Abstract The research of software testing and the practices of software testing in the industry are separated by gaps in some areas. One such gap regards Exploratory Testing (ET). ET is probably the most widely used software testing approach in the industry, yet it lacks research, and many software engineering manuals either ignore it or look down on it. In addition, ET has the absence of widespread methodology a...

  20. Efficient compliance with prescribed bounds on operational parameters by means of hypothesis testing using reactor data

    International Nuclear Information System (INIS)

    Sermer, P.; Olive, C.; Hoppe, F.M.

    2000-01-01

    A common problem in reactor operations is to comply with a requirement that certain operational parameters be constrained to lie within prescribed bounds. The fundamental issue to be addressed in any compliance description can be stated as follows: the compliance definition, compliance procedures, and allowances for uncertainties in data and accompanying methodologies should be well defined and justifiable. To this end, a mathematical framework for compliance, in which the computed or measured estimates of process parameters are considered random variables, is described in this paper. This allows a statistical formulation of the definition of compliance with licence or otherwise imposed limits. An important aspect of the proposed methodology is that the derived statistical tests are obtained by a Monte Carlo procedure using actual reactor operational data. The implementation of the methodology requires routine surveillance of the reactor core in order to perform the underlying statistical tests. The additional work required for surveillance is balanced by the fact that the resulting actions on reactor operations, implemented in station procedures, make the reactor 'safer' by increasing the operating margins. Furthermore, increased margins are also achieved by efficient solution techniques which may allow an increase in reactor power. A rigorous analysis of a compliance problem using statistical hypothesis testing based on extreme value probability distributions and actual reactor operational data leads to effective solutions in the areas of licensing, nuclear safety, reliability and competitiveness of operating nuclear reactors. (author)
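
    The Monte Carlo side of such a compliance test can be sketched as estimating an exceedance probability and comparing it with an acceptance level. The distribution, limit and 0.1% threshold below are invented for illustration; the paper uses actual reactor data and extreme value distributions:

```python
import random

def monte_carlo_exceedance(sample_fn, limit, n_trials=10000, seed=1):
    """Estimate P(parameter > limit) by Monte Carlo over (here simulated) process data."""
    rng = random.Random(seed)
    exceed = sum(1 for _ in range(n_trials) if sample_fn(rng) > limit)
    return exceed / n_trials

# Hypothetical channel power with Gaussian measurement scatter; notional limit 7.4 MW.
# Compliance is declared when the estimated exceedance probability stays below 0.1%.
p_exceed = monte_carlo_exceedance(lambda rng: rng.gauss(7.0, 0.1), 7.4)
compliant = p_exceed < 0.001
print(p_exceed, compliant)
```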

  1. Efficiency analysis of wood processing industry in China during 2006-2015

    Science.gov (United States)

    Zhang, Kun; Yuan, Baolong; Li, Yanxuan

    2018-03-01

    The wood processing industry is an important industry that affects national economic and social development. Data envelopment analysis (DEA) is a quantitative evaluation method for studying industrial efficiency. In this paper, the wood processing industry of 8 provinces in southern China is taken as the study object; the efficiency of each province from 2006 to 2015 was measured with the DEA method, and efficiency changes, technological changes and the Malmquist index were analyzed dynamically. The empirical results show that there is a widening gap in the efficiency of the wood processing industry across the 8 provinces, and that technological progress has lagged in promoting the wood processing industry. Based on these conclusions and the state of domestic and foreign wood processing industry development, the government should introduce relevant policies to strengthen the construction of a technology innovation policy system and a coordinated industrial development system for the wood processing industry.
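
    The Malmquist analysis mentioned above combines four DEA distance-function values per province; given those scores, the index and its usual decomposition are plain arithmetic. The scores below are invented for illustration, and computing the distance functions themselves requires solving the DEA linear programs, which is omitted here:

```python
import math

def malmquist_index(d_t_xt, d_t_xt1, d_t1_xt, d_t1_xt1):
    """Malmquist productivity index from four distance-function values.

    M = sqrt( (D_t(x_{t+1}) / D_t(x_t)) * (D_{t+1}(x_{t+1}) / D_{t+1}(x_t)) ).
    Values > 1 indicate productivity growth between periods t and t+1.
    """
    return math.sqrt((d_t_xt1 / d_t_xt) * (d_t1_xt1 / d_t1_xt))

def decompose(d_t_xt, d_t1_xt1, m_index):
    """Split M into efficiency change (EC) and technical change (TC = M / EC)."""
    ec = d_t1_xt1 / d_t_xt
    return ec, m_index / ec

# Hypothetical DEA efficiency scores for one province in two consecutive years.
m = malmquist_index(0.80, 0.92, 0.78, 0.88)
ec, tc = decompose(0.80, 0.88, m)
print(round(m, 3), round(ec, 3), round(tc, 3))
```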

  2. On the Bahadur-efficient testing of uniformity by means of entropy

    Czech Academy of Sciences Publication Activity Database

    Harremoes, P.; Vajda, Igor

    2008-01-01

    Roč. 54, č. 1 (2008), s. 321-331 ISSN 0018-9448 R&D Projects: GA MŠk(CZ) 1M0572; GA ČR(CZ) GA102/07/1131 Institutional research plan: CEZ:AV0Z10750506 Keywords : Bahadur-efficient testing * power divergence tests * power entropy tests Subject RIV: BD - Theory of Information Impact factor: 3.793, year: 2008 http://library.utia.cas.cz/separaty/2010/SI/vajda-on the bahadur-efficient testing of uniformity by means of entropy.pdf
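
    The power divergence family named in the record's keywords interpolates between the classical statistics for testing uniformity. A sketch of the Cressie-Read statistic, with invented cell counts and no claim to reproduce the paper's Bahadur-efficiency analysis:

```python
import math

def power_divergence(counts, lam):
    """Cressie-Read power divergence statistic against the uniform distribution.

    lam = 1 gives Pearson's chi-square; lam -> 0 gives the likelihood-ratio
    (entropy-based) statistic as a limit.
    """
    n = sum(counts)
    expected = n / len(counts)
    if abs(lam) < 1e-12:  # entropy / G-test limit
        return 2.0 * sum(c * math.log(c / expected) for c in counts if c > 0)
    s = sum(c * ((c / expected) ** lam - 1.0) for c in counts)
    return 2.0 * s / (lam * (lam + 1.0))

counts = [18, 22, 25, 15, 20]  # 100 observations over 5 cells
print(round(power_divergence(counts, 1.0), 3))  # Pearson chi-square
print(round(power_divergence(counts, 0.0), 3))  # entropy-based statistic
```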

  3. Minicomputer controlled test system for process control and monitoring systems

    International Nuclear Information System (INIS)

    Worster, L.D.

    A minicomputer controlled test system for testing process control and monitoring systems is described. This system, in service for over one year, has demonstrated that computerized control of such testing has a real potential for expanding the scope of the testing, improving accuracy of testing, and significantly reducing the time required to do the testing. The test system is built around a 16-bit minicomputer with 12K of memory. The system programming language is BASIC with the addition of assembly level routines for communication with the peripheral devices. The peripheral devices include a 100 channel scanner, analog-to-digital converter, visual display, and strip printer. (auth)

  4. Testing the Weak Form Efficiency in Pakistan’s Equity, Badla and Money Markets

    OpenAIRE

    Rashid, Abdul; Husain, Fazal

    2009-01-01

    The paper tests the weak-form market efficiency hypothesis for Pakistan's equity, badla and money markets with the aim of investigating which of them is most efficient in the weak-form sense. The analysis provides evidence, under the assumption of heteroscedasticity, that the KSE is weak-form efficient over the full-length sample period. Nevertheless, the analysis reports that over the same period the other two markets, viz. badla and money, are not weak-form efficient. The badla market was effi...

  5. Test processing integrated system (S.I.D.E.X.)

    International Nuclear Information System (INIS)

    Sabas, M.; Oules, H.; Badel, D.

    1969-01-01

    The Test Processing Integrated System is mostly composed of a CAE 9080 (equiv. S. D. S. 9300) computer equipped with a 100 000 samples/sec acquisition system. The system is designed for high-speed data acquisition and data processing in environmental tests, as well as the calculation of structural models. Such a digital approach to data processing has many advantages over conventional methods based on analog instruments. (author) [fr

  6. Impact of Educational Level on Performance on Auditory Processing Tests.

    Science.gov (United States)

    Murphy, Cristina F B; Rabelo, Camila M; Silagi, Marcela L; Mansur, Letícia L; Schochat, Eliane

    2016-01-01

    Research has demonstrated that a higher level of education is associated with better performance on cognitive tests among middle-aged and elderly people. However, the effects of education on auditory processing skills have not yet been evaluated. Previous demonstrations of sensory-cognitive interactions in the aging process indicate the potential importance of this topic. Therefore, the primary purpose of this study was to investigate the performance of middle-aged and elderly people with different levels of formal education on auditory processing tests. A total of 177 adults with no evidence of cognitive, psychological or neurological conditions took part in the research. The participants completed a series of auditory assessments, including dichotic digit, frequency pattern and speech-in-noise tests. A working memory test was also performed to investigate the extent to which auditory processing and cognitive performance were associated. The results demonstrated positive but weak correlations between years of schooling and performance on all of the tests applied. The factor "years of schooling" was also one of the best predictors of frequency pattern and speech-in-noise test performance. Additionally, performance on the working memory, frequency pattern and dichotic digit tests was also correlated, suggesting that the influence of educational level on auditory processing performance might be associated with the cognitive demands of the auditory processing tests rather than with auditory sensory aspects themselves. Longitudinal research is required to investigate the causal relationship between educational level and auditory processing skills.

  7. Testing Efficiency of Derivative Markets: ISE30, ISE100, USD and EURO

    OpenAIRE

    Akal, Mustafa; Birgili, Erhan; Durmuskaya, Sedat

    2012-01-01

    This study attempts to develop new market efficiency tests based on spot and futures prices, or their differences, as an alternative to traditional unit root tests built on univariate time series. Autocorrelation, normality and run tests were applied to spot and futures prices or their differences, together with an adapted Purchasing Power Parity test based on a regression, for the futures markets of the ISE30 and ISE100 index indicators and the USD and Euro currencies, all of which have been traded dail...

  8. Efficient reading in standardized tests for EFL learners : a case study of reading strategies used by Chinese English major students in TEM-4

    OpenAIRE

    Xia, Yan

    2011-01-01

    The aim of this study is to investigate the reading strategies used by Chinese English major students in the reading component in standardized national tests of TEM-4 with regard to reading efficiency. The research questions include: 1) what strategies are used by the students in TEM-4 test context; 2) whether there is a significant correlation between strategy use and efficient reading in the test; 3) what kinds of reading problems are revealed in the students’ use of processing strategies; ...

  9. Improvement for enhancing effectiveness of universal power system (UPS) continuous testing process

    Science.gov (United States)

    Sriratana, Lerdlekha

    2018-01-01

    This experiment aims to enhance the effectiveness of the Universal Power System (UPS) continuous testing process of the Electrical and Electronic Institute by applying work scheduling and time study methods. Initially, the standard time of the testing process had not been established, which resulted in inaccurate testing targets, and time wasting was also observed. To monitor and reduce wasted time and improve the efficiency of the testing process, a Yamazumi chart and job scheduling theory (North West Corner Rule) were applied to develop a new work process. After the improvements, the overall efficiency of the process could increase from 52.8% to 65.6%, or 12.7%. Moreover, the waste time could be reduced from 828.3 minutes to 653.6 minutes, or 21%, while testing units per batch could increase from 3 to 4 units. Therefore, the number of testing units would increase from 12 units up to 20 units per month, which would also contribute to an increase in the net income of the UPS testing process by 72%.
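
    The North West Corner Rule cited above builds an initial feasible allocation for a transportation-type scheduling problem by filling the allocation table from its top-left corner. A minimal sketch, with hypothetical rig capacities and batch demands rather than the institute's actual schedule:

```python
def north_west_corner(supply, demand):
    """North West Corner Rule: initial feasible allocation for a balanced
    transportation problem (total supply equals total demand)."""
    supply, demand = list(supply), list(demand)  # copy: we mutate both
    alloc = [[0] * len(demand) for _ in supply]
    i = j = 0
    while i < len(supply) and j < len(demand):
        qty = min(supply[i], demand[j])  # allocate as much as possible at (i, j)
        alloc[i][j] = qty
        supply[i] -= qty
        demand[j] -= qty
        if supply[i] == 0:
            i += 1  # row exhausted: move down
        else:
            j += 1  # column satisfied: move right
    return alloc

# Hypothetical: three test rigs' capacities vs three batches' demands (units).
print(north_west_corner([8, 6, 6], [10, 5, 5]))  # → [[8, 0, 0], [2, 4, 0], [0, 1, 5]]
```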

  10. Highly efficient polymer solar cells with printed photoactive layer: rational process transfer from spin-coating

    KAUST Repository

    Zhao, Kui

    2016-09-05

    Scalable and continuous roll-to-roll manufacturing is at the heart of the promise of low-cost and high throughput manufacturing of solution-processed photovoltaics. Yet, to date the vast majority of champion organic solar cells reported in the literature rely on spin-coating of the photoactive bulk heterojunction (BHJ) layer, with the performance of printed solar cells lagging behind in most instances. Here, we investigate the performance gap between polymer solar cells prepared by spin-coating and blade-coating the BHJ layer for the important class of modern polymers exhibiting no long range crystalline order. We find that thickness parity does not always yield performance parity even when using identical formulations. Significant differences in the drying kinetics between the processes are found to be responsible for BHJ nanomorphology differences. We propose an approach which benchmarks the film drying kinetics and associated BHJ nanomorphology development against those of the champion laboratory devices prepared by spin-coating the BHJ layer by adjusting the process temperature. If the optimization requires the solution concentration to be changed, then it is crucial to maintain the additive-to-solute volume ratio. Emulating the drying kinetics of spin-coating is also shown to help achieve morphological and performance parities. We put this approach to the test and demonstrate printed PTB7:PC71BM polymer solar cells with efficiency of 9% and 6.5% PCEs on glass and flexible PET substrates, respectively. We further demonstrate performance parity for two other popular donor polymer systems exhibiting rigid backbones and absence of a long range crystalline order, achieving a PCE of 9.7%, the highest efficiency reported to date for a blade coated organic solar cell. The rational process transfer illustrated in this study should help the broader and successful adoption of scalable printing methods for these material systems.

  11. Greater power and computational efficiency for kernel-based association testing of sets of genetic variants.

    Science.gov (United States)

    Lippert, Christoph; Xiang, Jing; Horta, Danilo; Widmer, Christian; Kadie, Carl; Heckerman, David; Listgarten, Jennifer

    2014-11-15

    Set-based variance component tests have been identified as a way to increase power in association studies by aggregating weak individual effects. However, the choice of test statistic has been largely ignored even though it may play an important role in obtaining optimal power. We compared a standard statistical test (a score test) with a recently developed likelihood ratio (LR) test. Further, when correction for hidden structure is needed, or gene-gene interactions are sought, state-of-the-art algorithms for both the score and LR tests can be computationally impractical. Thus we develop new computationally efficient methods. After reviewing theoretical differences in performance between the score and LR tests, we find empirically on real data that the LR test generally has more power. In particular, on 15 of 17 real datasets, the LR test yielded at least as many associations as the score test (up to 23 more associations), whereas the score test yielded at most one more association than the LR test in the two remaining datasets. On synthetic data, we find that the LR test yielded up to 12% more associations, consistent with our results on real data, but also observe a regime of extremely small signal where the score test yielded up to 25% more associations than the LR test, consistent with theory. Finally, our computational speedups now enable (i) efficient LR testing when the background kernel is full rank, and (ii) efficient score testing when the background kernel changes with each test, as for gene-gene interaction tests. The latter yielded a factor of 2000 speedup on a cohort of size 13 500. Software available at http://research.microsoft.com/en-us/um/redmond/projects/MSCompBio/Fastlmm/. heckerma@microsoft.com Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press.
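
    The mechanics of the comparison can be sketched generically: a likelihood-ratio statistic is twice the log-likelihood gap between the alternative and null models, referred to a chi-square distribution under the null. The helper below is an illustrative sketch (the name `lr_test_pvalue` is hypothetical, and this is not the FaST-LMM implementation); for one degree of freedom the chi-square survival function has a closed form via the complementary error function.

```python
import math

def lr_test_pvalue(loglik_null, loglik_alt, df=1):
    """Generic likelihood-ratio test: 2 * (logL_alt - logL_null) is referred
    to a chi-square distribution under the null hypothesis."""
    stat = 2.0 * (loglik_alt - loglik_null)
    if df != 1:
        raise NotImplementedError("closed form shown for df = 1 only")
    # For 1 degree of freedom: P(chi2 > stat) = erfc(sqrt(stat / 2))
    return math.erfc(math.sqrt(stat / 2.0))

# A statistic of 3.84 sits at the classic 5% threshold for one degree of freedom.
p = lr_test_pvalue(loglik_null=-100.0, loglik_alt=-98.08)  # stat = 3.84
print(round(p, 2))  # ≈ 0.05
```

    In practice the log-likelihoods would come from fitting the null and alternative variance-component models; only the final referral to the chi-square distribution is shown here.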

  12. A new condition for assessing the clinical efficiency of a diagnostic test.

    Science.gov (United States)

    Bokhari, Ehsan; Hubert, Lawrence

    2015-09-01

    When prediction using a diagnostic test outperforms simple prediction using base rates, the test is said to be "clinically efficient," a term first introduced into the literature by Meehl and Rosen (1955) in Psychological Bulletin. This article provides three equivalent conditions for determining the clinical efficiency of a diagnostic test: (a) Meehl-Rosen (Meehl & Rosen, 1955); (b) Dawes (Dawes, 1962); and (c) the Bokhari-Hubert condition, introduced here for the first time. Clinical efficiency is then generalized to situations where misclassification costs are considered unequal (for example, false negatives are more costly than false positives). As an illustration, the clinical efficiency of an actuarial device for predicting violent and dangerous behavior is examined that was developed as part of the MacArthur Violence Risk Assessment Study. (c) 2015 APA, all rights reserved.
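
    The core idea, that a test must beat prediction from base rates alone, can be written down directly. The sketch below assumes equal misclassification costs and an invented helper name (`clinically_efficient`); the specific algebraic forms of the Meehl-Rosen, Dawes, and Bokhari-Hubert conditions are given in the article itself.

```python
def clinically_efficient(sensitivity, specificity, base_rate):
    """A diagnostic test beats base-rate prediction when its overall accuracy
    exceeds the accuracy of always guessing the more common class."""
    accuracy = sensitivity * base_rate + specificity * (1.0 - base_rate)
    baseline = max(base_rate, 1.0 - base_rate)
    return accuracy > baseline

# A 90%-accurate test helps when the classes are balanced...
print(clinically_efficient(0.9, 0.9, base_rate=0.5))   # True
# ...but not when the condition is rare: always predicting "absent" wins.
print(clinically_efficient(0.9, 0.9, base_rate=0.02))  # False
```

    The second call illustrates why base rates matter: with a 2% prevalence, blanket prediction of "absent" is 98% accurate, so even a good test is not clinically efficient under equal costs.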

  13. High efficient ammonia heat pump system for industrial process water using the ISEC concept. Part 2

    DEFF Research Database (Denmark)

    Olesen, Martin F.; Madsen, Claus; Olsen, Lars

    2014-01-01

    The Isolated System Energy Charging (ISEC) concept allows for a high efficiency of a heat pump system for hot water production. The ISEC concept consists of two water storage tanks, one charged and one discharged. The charged tank is used for the industrial process, while the discharged tank is charging. The charging of the tank is done by recirculating water through the condenser and thereby gradually heating the water. The modelling of the system is described in Part I [1]. In this part, Part II, an experimental test setup of the tank system is reported, the results are presented and further modelling of the heat pump and tank system is performed (in continuation of Part I). The modelling is extended to include the system performance with different natural refrigerants and the influence of different types of compressors.

  14. Efficient Noninferiority Testing Procedures for Simultaneously Assessing Sensitivity and Specificity of Two Diagnostic Tests

    Directory of Open Access Journals (Sweden)

    Guogen Shan

    2015-01-01

    Full Text Available Sensitivity and specificity are often used to assess the performance of a diagnostic test with binary outcomes. Wald-type test statistics have been proposed for testing sensitivity and specificity individually. In the presence of a gold standard, simultaneous comparison between two diagnostic tests for noninferiority of sensitivity and specificity based on an asymptotic approach has been studied by Chen et al. (2003). However, the asymptotic approach may suffer from unsatisfactory type I error control as observed from many studies, especially in small to medium sample settings. In this paper, we compare three unconditional approaches for simultaneously testing sensitivity and specificity. They are approaches based on estimation, maximization, and a combination of estimation and maximization. Although the estimation approach does not guarantee type I error, it has satisfactory performance with regard to type I error control. The other two unconditional approaches are exact. The approach based on estimation and maximization is generally more powerful than the approach based on maximization.
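
    As a minimal illustration of the quantities being compared (not the exact unconditional procedures of the paper), sensitivity and specificity come straight from a 2x2 table against the gold standard, and a naive point-estimate noninferiority screen checks both endpoints jointly. The counts and margin below are invented for the sketch.

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity and specificity from a 2x2 table against a gold standard."""
    return tp / (tp + fn), tn / (tn + fp)

def noninferior(new, reference, margin):
    """Point-estimate noninferiority screen: the new test may be worse than
    the reference by at most `margin` (a formal test would add a confidence
    bound, which is exactly what the exact unconditional approaches refine)."""
    return new >= reference - margin

sens_new, spec_new = sens_spec(tp=88, fn=12, tn=90, fp=10)
sens_ref, spec_ref = sens_spec(tp=92, fn=8, tn=91, fp=9)
# Both endpoints must hold simultaneously for the joint noninferiority claim.
ok = noninferior(sens_new, sens_ref, 0.10) and noninferior(spec_new, spec_ref, 0.10)
print(sens_new, spec_new, ok)
```
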

  15. The Study of Image Processing Method for AIDS PA Test

    International Nuclear Information System (INIS)

    Zhang, H J; Wang, Q G

    2006-01-01

    At present, the main AIDS test technique in China is the PA test. Because judgment of the PA test image still depends on the operator, the error rate is high. To resolve this problem, we present a new image processing technique: many samples are first processed to obtain reference data, including the coordinates of the image centre and the ranges of the image classes; the image is then segmented using these data; finally, the result is exported after the data are judged. This technique is simple and accurate, and it also proves suitable for processing and analyzing the PA test images of other infectious diseases.

  16. Non-parametric tests of productive efficiency with errors-in-variables

    NARCIS (Netherlands)

    Kuosmanen, T.K.; Post, T.; Scholtes, S.

    2007-01-01

    We develop a non-parametric test of productive efficiency that accounts for errors-in-variables, following the approach of Varian [1985. Nonparametric analysis of optimizing behavior with measurement error. Journal of Econometrics 30(1/2), 445-458]. The test is based on the general Pareto-Koopmans

  17. Collaborative Testing: Cognitive and Interpersonal Processes Related to Enhanced Test Performance

    Science.gov (United States)

    Kapitanoff, Susan H.

    2009-01-01

    Research has demonstrated that collaborative testing, working on tests in groups, leads to improved test scores but the mechanism by which this occurs has not been specified. Three factors were proposed as mediators: cognitive processes, interpersonal interactions and reduced test-anxiety. Thirty-three students completed a multiple-choice exam…

  18. Optimized breeding strategies for multiple trait integration: II. Process efficiency in event pyramiding and trait fixation.

    Science.gov (United States)

    Peng, Ting; Sun, Xiaochun; Mumm, Rita H

    2014-01-01

    Multiple trait integration (MTI) is a multi-step process of converting an elite variety/hybrid for value-added traits (e.g. transgenic events) through backcross breeding. From a breeding standpoint, MTI involves four steps: single event introgression, event pyramiding, trait fixation, and version testing. This study explores the feasibility of marker-aided backcross conversion of a target maize hybrid for 15 transgenic events in light of the overall goal of MTI: recovering equivalent performance in the finished hybrid conversion along with reliable expression of the value-added traits. Building on the results of optimized single event introgression (Peng et al. Optimized breeding strategies for multiple trait integration: I. Minimizing linkage drag in single event introgression. Mol Breed, 2013), which produced single event conversions of recurrent parents (RPs) with ≤8 cM of residual non-recurrent parent (NRP) germplasm and ~1 cM of NRP germplasm in the 20 cM regions flanking the event, this study focused on optimizing process efficiency in the second and third steps of MTI: event pyramiding and trait fixation. Using computer simulation and probability theory, we aimed to (1) fit an optimal breeding strategy for pyramiding eight events into the female RP and seven into the male RP, and (2) identify optimal breeding strategies for trait fixation to create a 'finished' conversion of each RP homozygous for all events. In addition, next-generation seed needs were taken into account for a practical approach to process efficiency. Building on work by Ishii and Yonezawa (Optimization of the marker-based procedures for pyramiding genes from multiple donor lines: I. Schedule of crossing between the donor lines. Crop Sci 47:537-546, 2007a), a symmetric crossing schedule for event pyramiding was devised for stacking eight (seven) events in a given RP. Options for trait fixation breeding strategies considered selfing and doubled haploid approaches to achieve homozygosity

  19. Testing the evolution of crude oil market efficiency: Data have the conn

    International Nuclear Information System (INIS)

    Zhang, Bing; Li, Xiao-Ming; He, Fei

    2014-01-01

    Utilising a time-varying GAR(1)-TGARCH(1,1) model with different frequency data, we investigate the weak-form efficiency of major global crude oil spot markets in Europe, the US, the UAE and China for the period from December 2001 to August 2013. Our empirical results with weekly data indicate that all four markets have reached efficiency with few brief inefficient periods during the past decade, whereas the daily crude oil returns series suggest intermittent and inconsistent efficiency. We argue that the weekly Friday series fit the data better than the average series in autocorrelation tests. The evidence suggests that all four markets exhibit asymmetries in return-volatility reactions to different information shocks and that they react more strongly to bad news than to good news. The 2008 financial crisis has significantly affected the efficiency of oil markets. Furthermore, a comovement phenomenon and volatility spillover effects exist among the oil markets. Policy recommendations consistent with our empirical results are proposed, which address three issues: implementing prudential regulations, establishing an Asian pricing centre and improving transparency in crude oil spot markets. - Highlights: • We adopt a time-varying model to test the weak-form efficiency of crude oil markets. • Weekly oil returns series have been extremely efficient during the past decade. • Daily oil returns series have presented intermittent and inconsistent efficiency. • Oil markets react asymmetrically to different information shocks. • Policy recommendations are proposed according to the degree of efficiency.
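
    A weak-form efficiency check of the kind mentioned above boils down to asking whether returns are serially predictable. As a toy sketch (far simpler than the time-varying GAR-TGARCH model of the paper, and with invented helper names), lag-1 autocorrelation can be compared with the usual +/- 1.96/sqrt(n) white-noise band:

```python
import math

def lag1_autocorr(returns):
    """Lag-1 sample autocorrelation of a returns series."""
    n = len(returns)
    mean = sum(returns) / n
    var = sum((r - mean) ** 2 for r in returns)
    cov = sum((returns[t] - mean) * (returns[t - 1] - mean) for t in range(1, n))
    return cov / var

def looks_weak_form_efficient(returns):
    """Crude screen: a lag-1 autocorrelation inside the +/- 1.96/sqrt(n)
    white-noise band is consistent with unpredictable returns."""
    bound = 1.96 / math.sqrt(len(returns))
    return abs(lag1_autocorr(returns)) < bound

# A strongly alternating series is highly predictable, hence "inefficient".
print(looks_weak_form_efficient([0.01, -0.01] * 50))  # False
```

    Real tests of this kind also account for time-varying volatility, which is what the TGARCH component of the paper's model handles.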

  20. Processing efficiency theory in children: working memory as a mediator between trait anxiety and academic performance.

    Science.gov (United States)

    Owens, Matthew; Stevenson, Jim; Norgate, Roger; Hadwin, Julie A

    2008-10-01

    Working memory skills are positively associated with academic performance. In contrast, high levels of trait anxiety are linked with educational underachievement. Based on Eysenck and Calvo's (1992) processing efficiency theory (PET), the present study investigated whether associations between anxiety and educational achievement were mediated via poor working memory performance. Fifty children aged 11-12 years completed verbal (backwards digit span; tapping the phonological store/central executive) and spatial (Corsi blocks; tapping the visuospatial sketchpad/central executive) working memory tasks. Trait anxiety was measured using the State-Trait Anxiety Inventory for Children. Academic performance was assessed using school administered tests of reasoning (Cognitive Abilities Test) and attainment (Standard Assessment Tests). The results showed that the association between trait anxiety and academic performance was significantly mediated by verbal working memory for three of the six academic performance measures (math, quantitative and non-verbal reasoning). Spatial working memory did not significantly mediate the relationship between trait anxiety and academic performance. On average verbal working memory accounted for 51% of the association between trait anxiety and academic performance, while spatial working memory only accounted for 9%. The findings indicate that PET is a useful framework to assess the impact of children's anxiety on educational achievement.
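
    The proportion-mediated figures quoted above come from standard mediation arithmetic: with a the anxiety-to-working-memory slope, b the memory-to-performance slope controlling for anxiety, and c the total effect, the mediated share is a*b/c. The sketch below uses invented, noise-free data, not the study's measurements:

```python
import numpy as np

def slope(x, y):
    """OLS slope of y on x (with intercept)."""
    return np.polyfit(x, y, 1)[0]

def proportion_mediated(x, m, y):
    """Product-of-coefficients mediation: a*b / c, where
    a: x -> m, b: m -> y controlling for x, c: total effect x -> y."""
    a = slope(x, m)
    c = slope(x, y)
    X = np.column_stack([np.ones_like(x), x, m])
    b = np.linalg.lstsq(X, y, rcond=None)[0][2]
    return a * b / c

# Synthetic data: anxiety -> working memory -> performance, plus a direct path.
x = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])  # trait anxiety (centred)
e = np.array([1.0, -2.0, 2.0, -2.0, 1.0])  # disturbance, uncorrelated with x
m = 0.5 * x + 0.3 * e                      # working memory
y = 0.8 * m + 0.1 * x                      # academic performance
print(round(proportion_mediated(x, m, y), 3))  # 0.8
```

    With these constructed paths (a = 0.5, b = 0.8, c = 0.5) the mediator carries 80% of the total association; the study's 51% figure for verbal working memory is the same quantity estimated from real data.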

  1. Ammonia scrubber testing during IDMS SRAT and SME processing. Revision 1

    International Nuclear Information System (INIS)

    Lambert, D.P.

    1995-01-01

    This report summarizes results of the Integrated DWPF (Defense Waste Processing Facility) Melter System (IDMS) ammonia scrubber testing during the PX-7 run (the 7th IDMS run with a Purex-type sludge). Operation of the ammonia scrubber during IDMS Sludge Receipt and Adjustment Tank (SRAT) and Slurry Mix Evaporator (SME) processing has been completed. The ammonia scrubber was successful in removing ammonia from the vapor stream, achieving NH3 concentrations far below the 10 ppM vapor exit design basis during SRAT processing. However, during SME processing, vapor NH3 concentrations as high as 450 ppM were measured exiting the scrubber. Problems during the SRAT and SME testing were vapor bypassing the scrubber and inefficient scrubbing of the ammonia at the end of the SME cycle (50% removal efficiency versus a design-basis efficiency of 99.9%).

  2. A mechanistic understanding of processing additive-induced efficiency enhancement in bulk heterojunction organic solar cells

    KAUST Repository

    Schmidt, Kristin; Tassone, Christopher J.; Niskala, Jeremy R.; Yiu, Alan T.; Lee, Olivia P.; Weiss, Thomas M.; Wang, Cheng; Frechet, Jean; Beaujuge, Pierre; Toney, Michael F.

    2013-01-01

    The addition of processing additives is a widely used approach to increase power conversion efficiencies for many organic solar cells. We present how additives change the polymer conformation in the casting solution leading to a more intermixed

  3. Manufacturing polymer light emitting diode with high luminance efficiency by solution process

    Science.gov (United States)

    Kim, Miyoung; Jo, SongJin; Yang, Ho Chang; Yoon, Dang Mo; Kwon, Jae-Taek; Lee, Seung-Hyun; Choi, Ju Hwan; Lee, Bum-Joo; Shin, Jin-Koog

    2012-06-01

    In polymer light emitting diodes (polymer-LEDs) fabricated by a solution process, surface roughness influences the electro-optical (E-O) characteristics. We expect E-O characteristics such as luminance and power efficiency to be related to the surface roughness and thickness of the emitting layer with poly(9-vinylcarbazole). In this study, we fabricated polymer-LEDs by a solution process, which offers easy, eco-friendly and low-cost manufacturing for flexible display applications. To obtain high luminescence efficiency, the E-O characteristics of these devices were investigated while varying the printing process parameters. We thereby optimized the process conditions for polymer-LEDs, adjusting the annealing temperature and the thickness of the emission layer to reach an efficiency of 10.8 cd/A at 10 mA/cm2. We also examined the wavelength-dependent electroluminescence spectrum to find the correlation between the variation in efficiency and the thickness of the layer.

  4. IN-SITU TEST OF PRESSURE PIPELINE VIBRATION BASED ON DATA ACQUISITION AND SIGNAL PROCESSING

    OpenAIRE

    Hou, Huimin; Xu, Cundong; Liu, Hui; Wang, Rongrong; Jie, Junkun; Ding, Lianying

    2015-01-01

    Pipeline vibration of high frequency and large amplitude is an important factor that impacts the safe operation of pumping station and the efficiency of the pumps. Through conducting the vibration in-situ test of pipeline system in the pumping station, we can objectively analyze the mechanism of pipeline vibration and evaluate the stability of pipeline operation. By using DASP (data acquisition & signal processing) in the in-situ test on the 2# pipeline of the third pumping station in the gen...

  5. J-2X Test Articles Using FDM Process

    Science.gov (United States)

    Anderson, Ted; Ruf, Joe; Steele, Phil

    2010-01-01

    This viewgraph presentation gives a brief history of the J-2X engine, along with a detailed description of the material demonstrator and test articles that were created using the Fused Deposition Modeling (FDM) process.

  6. Non-process instrumentation surveillance and test reduction

    International Nuclear Information System (INIS)

    Ferrell, R.; LeDonne, V.; Donat, T.; Thomson, I.; Sarlitto, M.

    1993-12-01

    Analysis of operating experience, instrument failure modes, and degraded instrument performance has led to a reduction in Technical Specification surveillance and test requirements for nuclear power plant process instrumentation. These changes have resulted in lower plant operations and maintenance (O&M) labor costs. This report explores the possibility of realizing similar savings by reducing requirements for non-process instrumentation. The project team reviewed generic Technical Specifications for the four major US nuclear steam supply system (NSSS) vendors (Westinghouse, General Electric, Combustion Engineering, and Babcock & Wilcox) to identify non-process instrumentation for which surveillance/test requirements could be reduced. The team surveyed 10 utilities to identify specific non-process instrumentation at their plants for which requirements could be reduced. The team evaluated utility analytic approaches used to justify changes in surveillance/test requirements for process equipment to determine their applicability to non-process instrumentation. The report presents a prioritized list of non-process instrumentation systems suitable for surveillance/test requirements reduction. The top three systems in the list are vibration monitors, leak detection monitors, and chemistry monitors. In general, most non-process instrumentation governed by Technical Specification requirements are candidates for requirements reduction. If statistical requirements are somewhat relaxed, the analytic approaches previously used to reduce requirements for process instrumentation can be applied to non-process instrumentation. The report identifies as viable the technical approaches developed and successfully used by Southern California Edison, Arizona Public Service, and Boston Edison

  7. EFFICIENT QUANTITATIVE RISK ASSESSMENT OF JUMP PROCESSES: IMPLICATIONS FOR FOOD SAFETY

    OpenAIRE

    Nganje, William E.

    1999-01-01

    This paper develops a dynamic framework for efficient quantitative risk assessment from the simplest general risk, combining three parameters (contamination, exposure, and dose response) in a Kataoka safety-first model and a Poisson probability representing the uncertainty effect or jump processes associated with food safety. Analysis indicates that incorporating jump processes in food safety risk assessment provides more efficient cost/risk tradeoffs. Nevertheless, increased margin of safety...

  8. Energy Efficient Scheduling of Real Time Signal Processing Applications through Combined DVFS and DPM

    OpenAIRE

    Nogues , Erwan; Pelcat , Maxime; Menard , Daniel; Mercat , Alexandre

    2016-01-01

    This paper proposes a framework to design energy efficient signal processing systems. The energy efficiency is provided by combining Dynamic Voltage and Frequency Scaling (DVFS) and Dynamic Power Management (DPM). The framework is based on Synchronous Dataflow (SDF) modeling of signal processing applications. A transformation to a single rate form is performed to expose the application parallelism. An automated scheduling is then performed, minimizing the constraint of...

  9. An ultra-efficient nonlinear planar integrated platform for optical signal processing and generation

    DEFF Research Database (Denmark)

    Pu, Minhao; Ottaviano, Luisa; Semenova, Elizaveta

    2017-01-01

    This paper will discuss the recently developed integrated platform AlGaAs-on-insulator and its broad range of nonlinear applications. Recent demonstrations of broadband optical signal processing and efficient frequency comb generation in this platform will be reviewed.

  10. High Efficiency Mask Based Laser Materials Processing with TEA-CO2 - and Excimer Laser

    DEFF Research Database (Denmark)

    Bastue, Jens; Olsen, Flemming Ove

    1997-01-01

    In general, mask based laser materials processing techniques suffer from a very low energy efficiency. We have developed a simple device called an energy enhancer, which is capable of increasing the energy efficiency of typical mask based laser materials processing systems. A short review of the ...... line marking with TEA-CO2 laser of high speed canning lines. The second one is manufactured for marking or microdrilling with excimer laser....

  11. Time reversal signal processing in acoustic emission testing

    Czech Academy of Sciences Publication Activity Database

    Převorovský, Zdeněk; Krofta, Josef; Kober, Jan; Dvořáková, Zuzana; Chlada, Milan; Dos Santos, S.

    2014-01-01

    Roč. 19, č. 12 (2014) ISSN 1435-4934. [European Conference on Non-Destructive Testing (ECNDT 2014) /11./. Praha, 06.10.2014-10.10.2014] Institutional support: RVO:61388998 Keywords : acoustic emission (AE) * ultrasonic testing (UT) * signal processing * source location * time reversal acoustics * acoustic emission * signal processing and transfer Subject RIV: BI - Acoustics http://www.ndt.net/events/ECNDT2014/app/content/Slides/637_Prevorovsky.pdf

  12. Validation of the Vanderbilt Holistic Face Processing Test

    OpenAIRE

    Wang, Chao-Chih; Ross, David A.; Gauthier, Isabel; Richler, Jennifer J.

    2016-01-01

    The Vanderbilt Holistic Face Processing Test (VHPT-F) is a new measure of holistic face processing with better psychometric properties relative to prior measures developed for group studies (Richler et al., 2014). In fields where psychologists study individual differences, validation studies are commonplace and the concurrent validity of a new measure is established by comparing it to an older measure with established validity. We follow this approach and test whether the VHPT-F measures the ...

  14. L2 Writing in Test and Non-test Situations: Process and Product

    Directory of Open Access Journals (Sweden)

    Baraa Khuder

    2015-02-01

    Full Text Available Test writers sometimes complain that they cannot perform to their true abilities because of time constraints. We therefore examined differences in terms of process and product between texts produced under test and non-test conditions. Ten L2 postgraduates wrote two argumentative essays, one under test conditions, with only forty minutes allowed and without recourse to resources, and one under non-test conditions, with unlimited time as well as access to the Internet. Keystroke logging, screen capture software, and stimulated recall protocols were used, participants explaining and commenting on their writing processes. Sixteen writing process types were identified. Higher proportions of the processes of translation and surface revision were recorded in the test situation, while meaningful revision and evaluation were both higher in the non-test situation. There was a statistically significant difference in time allocation for different processes at different stages. Experienced teachers awarded the non-test texts a mean score almost one point (0.8) higher. A correlational analysis examining the relationship between writing process and product quality showed that while the distribution of writing processes can have an impact on text quality in the test situation, it had no effect on the product in the non-test situation.

  15. Non-destructive testing of tubes by electromagnetic processes

    International Nuclear Information System (INIS)

    Kowarski, A.

    1979-01-01

    This article reviews and assesses the non-destructive testing techniques used for locating defects in tubes by electromagnetic processes. These form the basis of many testing devices, whose diversity results from various factors: the range of materials, methods of fabrication, and specific defects of the product. There are two distinct main families of devices utilising two different principles: flux leakage and eddy (Foucault) currents.

  16. Using Knowledge Management to Revise Software-Testing Processes

    Science.gov (United States)

    Nogeste, Kersti; Walker, Derek H. T.

    2006-01-01

    Purpose: This paper aims to use a knowledge management (KM) approach to effectively revise a utility retailer's software testing process. This paper presents a case study of how the utility organisation's customer services IT production support group improved their test planning skills through applying the American Productivity and Quality Center…

  17. 10 CFR 431.86 - Uniform test method for the measurement of energy efficiency of commercial packaged boilers.

    Science.gov (United States)

    2010-01-01

    ... efficiency of commercial packaged boilers. 431.86 Section 431.86 Energy DEPARTMENT OF ENERGY ENERGY... Boilers Test Procedures § 431.86 Uniform test method for the measurement of energy efficiency of... packaged boiler equipment classes. (B) On or after March 2, 2012, conduct the thermal efficiency test as...

  18. Development of energy-efficient processes for natural gas liquids recovery

    International Nuclear Information System (INIS)

    Yoon, Sekwang; Binns, Michael; Park, Sangmin; Kim, Jin-Kuk

    2017-01-01

    A new NGL (natural gas liquids) recovery process configuration is proposed which can offer improved energy efficiency and hydrocarbon recovery. The new process configuration is an evolution of the conventional turboexpander processes with the introduction of a split stream transferring part of the feed to the demethanizer column. In this way additional heat recovery is possible which improves the energy efficiency of the process. To evaluate the new process configuration a number of different NGL recovery process configurations are optimized and compared using a process simulator linked interactively with external optimization methods. Process integration methodology is applied as part of the optimization to improve energy recovery during the optimization. Analysis of the new process configuration compared with conventional turbo-expander process designs demonstrates the benefits of the new process configuration. - Highlights: • Development of a new energy-efficient natural gas liquids recovery process. • Improving energy recovery with application of process integration techniques. • Considering multiple different structural changes lead to considerable energy savings.

  19. Development of an eco-efficient product/process for the vulcanising industry

    Directory of Open Access Journals (Sweden)

    Becerra, M. B.

    2014-08-01

    Full Text Available This paper presents the development of an eco-efficient product/process, which has improved mechanical properties from the introduction of natural fibres into the EPDM (Ethylene-Propylene-Diene Terpolymer) rubber formulation. The optimisation analysis is made with a 2^(11-7) fractional factorial design. Different formulations were evaluated using a multi-response desirability function, with the aim of finding efficient levels for the manufacturing time-cycle, improving the mechanical properties of the product, and reducing the raw material costs. The development of an eco-efficient product/process generates a sustainable alternative to conventional manufacturing.

  20. Efficiency of the pre-heater against flow rate on primary the beta test loop

    International Nuclear Information System (INIS)

    Edy Sumarno; Kiswanta; Bambang Heru; Ainur R; Joko P

    2013-01-01

    The efficiency of the pre-heater has been calculated as a function of the primary-side flow rate of the BETA Test Loop. The BETA Test Loop (UUB) is an experimental facility for studying thermal-hydraulic phenomena, especially post-LOCA (Loss of Coolant Accident) thermal hydraulics. The heat removal train of the BETA Test Loop contains a pre-heater that transfers heat from the primary side to the secondary side; its efficiency is determined by comparing the incoming heat energy with the energy taken out by the secondary fluid. The characterization is intended to determine the performance of the pre-heater, to be used as a tool for analysis and as a reference for design experiments. The efficiency was calculated by operating the pre-heater with varied fluid flow rates on the primary side. The results show that the efficiency changes with every change of flow rate: 71.26% at 163.50 ml/s and 60.65% at 850.90 ml/s. The efficiency could be even greater if the pre-heater tank were wrapped with thermal insulation so that there is no heat leakage. (author)
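
    The efficiency computation described, comparing heat delivered by the primary side with heat picked up by the secondary fluid, can be sketched as below; the flow rates, temperatures, and helper names are illustrative assumptions, not measurements from the BETA Test Loop:

```python
def heat_rate(flow_kg_s, cp_j_per_kg_k, t_in_c, t_out_c):
    """Heat transferred to/from a water stream: Q = m_dot * cp * dT (watts)."""
    return flow_kg_s * cp_j_per_kg_k * abs(t_out_c - t_in_c)

def preheater_efficiency(q_primary_in, q_secondary_out):
    """Efficiency as the ratio of heat picked up by the secondary fluid to
    heat delivered by the primary side; any shortfall is heat leakage."""
    return q_secondary_out / q_primary_in

CP_WATER = 4186.0  # J/(kg K), illustrative constant for liquid water

q_in = heat_rate(0.1635, CP_WATER, 80.0, 60.0)   # primary side cools by 20 K
q_out = heat_rate(0.20, CP_WATER, 25.0, 36.65)   # secondary side warms up
print(round(preheater_efficiency(q_in, q_out), 3))
```

    With an insulated tank the two heat rates would approach each other and the ratio would tend toward 1, which is the improvement the abstract suggests.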

  1. Errors in the Total Testing Process in the Clinical Chemistry ...

    African Journals Online (AJOL)

    2018-03-01

    Mar 1, 2018 ... testing processes impair the clinical decision-making process. Such errors are ... and external quality control exceeding the target range, (14.4%) and (51.4%) .... version 3.5.3 and transferred to Statistical Package for the ...

  2. The no-arbitrage efficiency test of the OMX Index option market

    DEFF Research Database (Denmark)

    Hou, Ai Jun; Wiktorsson, Magnus; Zhao, RuiZhi

    2012-01-01

    In this paper, we test the market efficiency of the OMXS30 Index option market. The market efficiency definition is the absence of arbitrage opportunity in the market. We first check the arbitrage opportunity by examining the boundary conditions and the Put-Call-Parity that must be satisfied in the market. Then a variance based efficiency test is performed by establishing a risk neutral portfolio and re-balancing the initial portfolio in different trading strategies. In order to choose the most appropriate model for option price and hedging strategies, we calibrate several most applied models, i.e. the BS, Merton, Heston, Bates model and Affine Jump Diffusion models. Our results indicate that the AJD model significantly outperforms other models in the option price forecast and the trading strategies. The boundary and the PCP test and the dynamic hedging strategy results evidence

  3. Final Technical Report on Development of an Economic and Efficient Biodiesel production Process (NC)

    Energy Technology Data Exchange (ETDEWEB)

    Tirla, Cornelia [Univ. of North Carolina, Pembroke, NC (United States); Dooling, Thomas A. [Univ. of North Carolina, Pembroke, NC (United States); Smith, Rachel B. [Univ. of North Carolina, Pembroke, NC (United States); Shi, Xinyan [Univ. of North Carolina, Pembroke, NC (United States); Shahbazi, Abolghasem [North Carolina Agricultural and Technical State Univ., Greensboro, NC (United States)

    2014-03-19

    The Biofuels Team at The University of North Carolina at Pembroke and North Carolina A&T State University carried out a joint research project aimed at developing an efficient process to produce biodiesel. In this project, the team developed and tested various types of homogeneous and heterogeneous catalysts which could replace the conventionally used soluble potassium hydroxide catalyst which, traditionally, must be separated and disposed of at the end of the process. As a result of this screening, the homogeneous catalyst choline hydroxide was identified as a potential replacement for the traditional catalyst used in this process, potassium hydroxide, due to its decreased corrosiveness and toxicity. A large number of heterogeneous catalysts were produced and tested in order to determine the scaffold, ion type and ion concentration which would produce optimum yield of biodiesel. The catalyst with 12% calcium on Zeolite β was identified as being highly effective and optimal reaction conditions were identified. Furthermore, a packed bed reactor utilizing this type of catalyst was designed, constructed and tested in order to further optimize the process. An economic analysis of the viability of the project showed that the cost of an independent farmer to produce the fuelstock required to produce biodiesel exceeds the cost of petroleum diesel under current conditions and that therefore without incentives, farmers would not be able to benefit economically from producing their own fuel. An educational website on biodiesel production and analysis was produced and a laboratory experiment demonstrating the production of biodiesel was developed and implemented into the Organic Chemistry II laboratory curriculum at UNCP. Five workshops for local farmers and agricultural agents were held in order to inform the broader community about the various fuelstock available, their cultivation and the process and advantages of biodiesel use and production. This project fits both

  4. Efficient high-throughput biological process characterization: Definitive screening design with the ambr250 bioreactor system.

    Science.gov (United States)

    Tai, Mitchell; Ly, Amanda; Leung, Inne; Nayar, Gautam

    2015-01-01

    The burgeoning pipeline for new biologic drugs has increased the need for high-throughput process characterization to efficiently use process development resources. Breakthroughs in highly automated and parallelized upstream process development have led to technologies such as the 250-mL automated mini bioreactor (ambr250™) system. Furthermore, developments in modern design of experiments (DoE) have promoted the use of definitive screening design (DSD) as an efficient method to combine factor screening and characterization. Here we utilize the 24-bioreactor ambr250™ system with 10-factor DSD to demonstrate a systematic experimental workflow to efficiently characterize an Escherichia coli (E. coli) fermentation process for recombinant protein production. The generated process model is further validated by laboratory-scale experiments and shows how the strategy is useful for quality by design (QbD) approaches to control strategies for late-stage characterization. © 2015 American Institute of Chemical Engineers.

  5. Collaborative testing for key-term definitions under representative conditions: Efficiency costs and no learning benefits.

    Science.gov (United States)

    Wissman, Kathryn T; Rawson, Katherine A

    2018-01-01

    Students are expected to learn key-term definitions across many different grade levels and academic disciplines. Thus, investigating ways to promote understanding of key-term definitions is of critical importance for applied purposes. A recent survey showed that learners report engaging in collaborative practice testing when learning key-term definitions, with outcomes also shedding light on the way in which learners report engaging in collaborative testing in real-world contexts (Wissman & Rawson, 2016, Memory, 24, 223-239). However, no research has directly explored the effectiveness of engaging in collaborative testing under representative conditions. Accordingly, the current research evaluates the costs (with respect to efficiency) and the benefits (with respect to learning) of collaborative testing for key-term definitions under representative conditions. In three experiments (ns = 94, 74, 95), learners individually studied key-term definitions and then completed retrieval practice, which occurred either individually or collaboratively (in dyads). Two days later, all learners completed a final individual test. Results from Experiments 1-2 showed a cost (with respect to efficiency) and no benefit (with respect to learning) of engaging in collaborative testing for key-term definitions. Experiment 3 evaluated a theoretical explanation for why collaborative benefits do not emerge under representative conditions. Collectively, outcomes indicate that collaborative testing versus individual testing is less effective and less efficient when learning key-term definitions under representative conditions.

  6. A Hydrogen Containment Process for Nuclear Thermal Engine Ground testing

    Science.gov (United States)

    Wang, Ten-See; Stewart, Eric; Canabal, Francisco

    2016-01-01

    The objective of this study is to propose a new total hydrogen containment process to enable the testing required for NTP engine development. This H2 removal process comprises two unit operations: an oxygen-rich burner and a shell-and-tube heat exchanger. The new process is demonstrated by simulation of the steady-state operation of the engine firing at nominal conditions.

  7. Measurements of Conversion Efficiency for a Flat Plate Thermophotovoltaic System Using a Photonic Cavity Test System

    International Nuclear Information System (INIS)

    Brown, E.J.; Ballinger, C.T.; Burger, S.R.; Charache, G.W.; Danielson, L.R.; DePoy, D.M.; Donovan, T.J.; LoCascio, M.

    2000-01-01

    The performance of a 1 cm² thermophotovoltaic (TPV) module was recently measured in a photonic cavity test system. A conversion efficiency of 11.7% was measured at a radiator temperature of 1076°C and a module temperature of 29.9°C. This experiment achieved the highest direct measurement of efficiency for an integrated TPV system. Efficiency was calculated from the ratio of the peak (load-matched) electrical power output and the heat absorption rate. Measurements of these two parameters were made simultaneously to assure the validity of the measured efficiency value. This test was conducted in a photonic cavity which mimicked a typical flat-plate TPV system. The radiator was a large, flat graphite surface. The module was affixed to the top of a copper pedestal for heat absorption measurements. The heat absorption rate was proportional to the axial temperature gradient in the pedestal under steady-state conditions. The test was run in a vacuum to eliminate conductive and convective heat transfer mechanisms. The photonic cavity provides the optimal test environment for TPV efficiency measurements because it incorporates all important physical phenomena found in an integrated TPV system: high radiator emissivity and blackbody spectral shape, photon recycling, Lambertian distribution of incident radiation and complex geometric effects. Furthermore, the large aspect ratio between radiating surface area and radiator/module spacing produces a view factor approaching unity with minimal photon leakage.
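    The efficiency calculation described above can be sketched directly: the heat absorption rate follows from 1-D steady-state Fourier conduction in the pedestal, Q = k·A·ΔT/Δx, and efficiency is peak electrical power divided by Q. The pedestal dimensions, temperature drop and conductivity below are illustrative, not the experiment's values.

```python
# Sketch of the TPV efficiency measurement: heat absorption inferred
# from the axial temperature gradient in the copper pedestal
# (Fourier's law), efficiency = peak power / heat absorbed.
# All dimensions and readings are illustrative.

K_COPPER = 400.0  # thermal conductivity of copper, W/(m*K)

def heat_absorption_w(area_m2: float, dT_K: float, dx_m: float,
                      k_w_mk: float = K_COPPER) -> float:
    """Q = k * A * dT/dx for 1-D steady-state conduction in the pedestal."""
    return k_w_mk * area_m2 * dT_K / dx_m

def tpv_efficiency(peak_power_w: float, q_absorbed_w: float) -> float:
    """Conversion efficiency = load-matched electrical power / heat absorbed."""
    return peak_power_w / q_absorbed_w

q = heat_absorption_w(area_m2=1e-4, dT_K=5.0, dx_m=0.02)  # 1 cm^2 pedestal
print(f"Q = {q:.2f} W, efficiency = {tpv_efficiency(1.17, q):.1%}")
```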

  8. Fidelity of test development process within a national science grant

    Science.gov (United States)

    Brumfield, Teresa E.

    In 2002, a math-science partnership (MSP) program was initiated by a national science grant. The purpose of the MSP program was to promote the development, implementation, and sustainability of promising partnerships among institutions of higher education, K-12 schools and school systems, as well as other important stakeholders. One of the funded projects included a teacher-scientist collaborative that instituted a professional development system to prepare teachers to use inquiry-based instructional modules. The MSP program mandated evaluations of its funded projects. One of the teacher-scientist collaborative project's outcomes specifically focused on teacher and student science content and process skills. In order to provide annual evidence of progress and to measure the impact of the project's efforts, and because no appropriate science tests were available to measure improvements in content knowledge of participating teachers and their students, the project contracted for the development of science tests. This dissertation focused on the process of test development within an evaluation and examined planned (i.e., expected) and actual (i.e., observed) test development, specifically concentrating on the factors that affected the actual test development process. Planned test development was defined as the process of creating tests according to the well-established test development procedures recommended by the AERA/APA/NCME 1999 Standards for Educational and Psychological Testing. Actual test development was defined as the process of creating tests as it actually took place. Because case study provides an in-depth, longitudinal examination of an event (i.e., case) in a naturalistic setting, it was selected as the appropriate methodology to examine the difference between planned and actual test development. 
The case (or unit of analysis) was the test development task, a task that was bounded by the context in which it occurred---and over which this researcher had

  9. Data triggered data processing at the Mirror Fusion Test Facility

    International Nuclear Information System (INIS)

    Jackson, R.J.; Balch, T.R.; Preckshot, G.G.

    1986-01-01

    A primary characteristic of most batch systems is that the input data files must exist before jobs are scheduled. On the Mirror Fusion Test Facility (MFTF-B) at Lawrence Livermore National Laboratory the authors schedule jobs to process experimental data to be collected during a five minute shot cycle. The data driven processing system emulates a coarsely granular data flow architecture. Processing jobs are scheduled before the experimental data is collected. Processing jobs "fire", or execute, as input data becomes available. Similar to UNIX "pipes", data produced by upstream processing nodes may be used as inputs by following nodes. Users, working on the networked SUN workstations, specify data processing templates which define processes and their data dependencies. Data specifications indicate the source of data; actual associations with specific data instantiations are made when the jobs are scheduled. The authors report here on details of diagnostic data processing and their experiences
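    The firing rule described above (a job executes once all of its declared inputs exist, and its outputs may feed downstream jobs) can be sketched as a toy scheduler; the job and data names are hypothetical and nothing here reflects the actual MFTF-B software.

```python
# Toy data-flow scheduler: jobs are declared with input/output data
# names before any data exist, and a job "fires" as soon as all of its
# inputs become available, feeding downstream nodes like a UNIX pipe.

class DataFlowScheduler:
    def __init__(self):
        self.jobs = []          # (name, inputs, outputs, fn)
        self.available = {}     # data name -> value
        self.fired = []         # names of jobs that have executed

    def add_job(self, name, inputs, outputs, fn):
        self.jobs.append((name, inputs, outputs, fn))

    def offer(self, data_name, value):
        """Called as experimental data arrives during the shot cycle."""
        self.available[data_name] = value
        self._fire_ready()

    def _fire_ready(self):
        progress = True
        while progress:
            progress = False
            for name, ins, outs, fn in self.jobs:
                if name not in self.fired and all(i in self.available for i in ins):
                    results = fn(*(self.available[i] for i in ins))
                    for out, val in zip(outs, results):
                        self.available[out] = val
                    self.fired.append(name)
                    progress = True

sched = DataFlowScheduler()
sched.add_job("calibrate", ["raw"], ["cal"], lambda raw: ([x * 2.0 for x in raw],))
sched.add_job("summarize", ["cal"], ["mean"], lambda cal: (sum(cal) / len(cal),))
sched.offer("raw", [1.0, 2.0, 3.0])  # both jobs fire in sequence
print(sched.fired, sched.available["mean"])
```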

  10. Test of Poisson Process for Earthquakes in and around Korea

    International Nuclear Information System (INIS)

    Noh, Myunghyun; Choi, Hoseon

    2015-01-01

    Since Cornell's work on probabilistic seismic hazard analysis (hereafter, PSHA), the majority of PSHA computer codes have assumed that earthquake occurrence is Poissonian, although to the authors' knowledge it is uncertain who first raised the issue of the Poisson process for earthquake occurrence. Systematic PSHA in Korea, led by the nuclear industry, has been carried out for more than 25 years under the assumption of the Poisson process. However, that assumption has never been tested, so such a test is of significance. We tested whether Korean earthquakes follow the Poisson process or not, applying the chi-square test with a significance level of 5%. The test showed that the Poisson process could not be rejected for earthquakes of magnitude 2.9 or larger. However, it was still observed in the graphical comparison that some portion of the observed distribution deviated significantly from the Poisson distribution. We think this is due to the small amount of earthquake data: earthquakes of magnitude 2.9 or larger occurred only 376 times during 34 years. Therefore, the judgment on the Poisson process derived in the present study is not conclusive
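    A chi-square goodness-of-fit test of the kind described can be sketched as follows; the yearly counts below are hypothetical, not the Korean catalog (whose 376 events over 34 years imply a mean rate of roughly 11 per year).

```python
import math

def poisson_pmf(k: int, lam: float) -> float:
    """P(K = k) for a Poisson distribution with rate lam."""
    return math.exp(-lam) * lam ** k / math.factorial(k)

def chi_square_poisson(counts, kmax=None):
    """Chi-square statistic for counts-per-interval against a Poisson
    distribution with the sample-mean rate; counts >= kmax are pooled
    into one tail bin. The result is compared with the chi-square
    critical value at the 5% level for (bins - 2) degrees of freedom."""
    n = len(counts)
    lam = sum(counts) / n
    if kmax is None:
        kmax = max(counts)
    observed = [sum(1 for c in counts if c == k) for k in range(kmax)]
    observed.append(sum(1 for c in counts if c >= kmax))
    expected = [n * poisson_pmf(k, lam) for k in range(kmax)]
    expected.append(n * (1.0 - sum(poisson_pmf(k, lam) for k in range(kmax))))
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected) if e > 0)

counts = [2, 3, 1, 4, 2, 3, 2, 1, 3, 2]  # hypothetical yearly earthquake counts
stat = chi_square_poisson(counts)
print(f"chi-square statistic = {stat:.3f}")
```

    In a real analysis, sparse bins would also be merged so that each expected count is large enough for the chi-square approximation to hold.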

  11. Test/score/report: Simulation techniques for automating the test process

    Science.gov (United States)

    Hageman, Barbara H.; Sigman, Clayton B.; Koslosky, John T.

    1994-01-01

    A Test/Score/Report capability is currently being developed for the Transportable Payload Operations Control Center (TPOCC) Advanced Spacecraft Simulator (TASS) system which will automate testing of the Goddard Space Flight Center (GSFC) Payload Operations Control Center (POCC) and Mission Operations Center (MOC) software in three areas: telemetry decommutation, spacecraft command processing, and spacecraft memory load and dump processing. Automated computer control of the acceptance test process is one of the primary goals of a test team. With the proper simulation tools and user interface, the tasks of acceptance testing, regression testing, and repeating specific test procedures of a ground data system become simpler. Ideally, the goal for complete automation would be to plug the operational deliverable into the simulator, press the start button, execute the test procedure, accumulate and analyze the data, score the results, and report the results to the test team along with a go/no-go recommendation. In practice, this may not be possible because of inadequate test tools, pressures of schedules, limited resources, etc. Most tests are accomplished using a certain degree of automation and test procedures that are labor intensive. This paper discusses some simulation techniques that can improve the automation of the test process. The TASS system tests the POCC/MOC software and provides a score based on the test results. The TASS system displays statistics on the success of the POCC/MOC system processing in each of the three areas as well as event messages pertaining to the Test/Score/Report processing. The TASS system also provides formatted reports documenting each step performed during the tests and the results of each step. A prototype of the Test/Score/Report capability is available and currently being used to test some POCC/MOC software deliveries.
When this capability is fully operational it should greatly reduce the time necessary

  12. An efficient ultrasonic SAFT imaging for pulse-echo immersion testing

    Energy Technology Data Exchange (ETDEWEB)

    Hu, Hong Wei [Changsha University of Science and Technology, Changsha (China); Jeong, Hyun Jo [Div. of Mechanical and Automotive Engineering, Wonkwang University, Iksan (Korea, Republic of)

    2017-04-15

    An ultrasonic synthetic aperture focusing technique (SAFT) using a root mean square (RMS) velocity model is proposed for pulse-echo immersion testing to improve the computational efficiency. Considering the immersion ultrasonic testing of a steel block as an example, three kinds of imaging were studied (B-Scan, SAFT imaging based on ray tracing technology and RMS velocity). The experimental results show that two kinds of SAFT imaging have almost the same imaging performance, while the efficiency of RMS velocity SAFT imaging is almost 25 times greater than the SAFT based on Snell's law.

  13. An efficient ultrasonic SAFT imaging for pulse-echo immersion testing

    International Nuclear Information System (INIS)

    Hu, Hong Wei; Jeong, Hyun Jo

    2017-01-01

    An ultrasonic synthetic aperture focusing technique (SAFT) using a root mean square (RMS) velocity model is proposed for pulse-echo immersion testing to improve the computational efficiency. Considering the immersion ultrasonic testing of a steel block as an example, three kinds of imaging were studied (B-Scan, SAFT imaging based on ray tracing technology and RMS velocity). The experimental results show that the two kinds of SAFT imaging have almost the same imaging performance, while the efficiency of RMS velocity SAFT imaging is almost 25 times greater than the SAFT based on Snell's law.

  14. What We Know about Software Test Maturity and Test Process Improvement

    NARCIS (Netherlands)

    Garousi, Vahid; Felderer, Michael; Hacaloglu, Tuna

    2018-01-01

    In many companies, software testing practices and processes are far from mature and are usually conducted in an ad hoc fashion. Such immature practices lead to negative outcomes - for example, testing that doesn't detect all the defects or that incurs cost and schedule overruns. To conduct test

  15. Expedited, uniform processing of multi-vendor gamma test data

    International Nuclear Information System (INIS)

    Zimmerman, R.E.

    2002-01-01

    Aim: Acceptance testing of gamma camera performance is time consuming. When data are collected from different vendors, image format and methodology differences can result in disjointed and difficult-to-compare results; even when performing NEMA-specified tests, processing data from multi-vendor cameras leads to methodological inconsistencies. A more uniform and consistent method of processing data collected from gamma cameras would be a boon to those involved in acquiring and processing such test data. Methods and Materials: Image J is an image processing program freely available on the Web at http://rsb.info.nih.gov/ij/. It can be run using a normal Web browser or installed on any computer. Since it is written in Java, it is platform and operating system independent. Image J is very extensible, object based, and has a large international user community in many imaging disciplines. Extensions to Image J are written using the Java programming language and the internal macro recording facility. Image J handles multiple image formats useful in nuclear medicine including DICOM, RAW, BMP, JPG, Interfile, AVI and QT. Extensions have been written to process and determine gamma camera intrinsic and extrinsic uniformity, COR errors, planar extrinsic resolution and reconstructed spatial resolution. The testing and processing adhere closely to NEMA-specified procedures and result in quantitative measures of performance traceable to NEMA and manufacturers' specifications. Results and Conclusions: Extensions to Image J written specifically to process gamma camera acceptance test data have resulted in a considerable decrease in the time needed to complete the analysis and allow a consistent, vendor-independent method to measure the performance of cameras from multiple vendors. Quality control images from multiple camera vendors are also easily processed in a consistent fashion. The availability of this or similar platform- and vendor-independent software should lead to more complete
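    As a sketch of one such measurement, NEMA-style integral uniformity of a flood-field image is IU = 100·(max − min)/(max + min) over the useful field of view. The full NEMA procedure also applies a smoothing kernel and field-of-view masking, which are omitted here; the pixel values are illustrative, and this is plain Python rather than an Image J macro.

```python
# Minimal sketch of a gamma camera integral uniformity calculation:
# IU = 100 * (max - min) / (max + min) over the useful field of view.
# NEMA smoothing and FOV masking are omitted; counts are illustrative.

def integral_uniformity(pixels) -> float:
    """pixels: 2-D list of flood-field counts inside the useful field of view."""
    flat = [v for row in pixels for v in row]
    hi, lo = max(flat), min(flat)
    return 100.0 * (hi - lo) / (hi + lo)

flood = [
    [980, 1010, 1000],
    [1005, 995, 990],
    [1000, 1020, 985],
]
print(f"integral uniformity = {integral_uniformity(flood):.2f}%")
```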

  16. The coupled process laboratory test of highly compacted bentonite

    International Nuclear Information System (INIS)

    Shen Zhenyao; Li Guoding; Li Shushen; Wang Chengzu

    2004-01-01

    Highly compacted bentonite blocks have been heated and hydrated in the laboratory in order to simulate the thermo-hydro-mechanical (THM) coupled processes in the buffer material of a high-level radioactive waste (HLW) repository. The experimental facility, which is composed of the experiment barrel, the heating system, the high-pressure water input system, the temperature measurement system, the water content measurement system and the swelling stress system, is introduced in this paper. The steps of the THM coupled experiment are also given in detail. A total of 10 highly compacted bentonite blocks were used in this test. Experiments 1-4 were tests with the heater and the hydration process, in which temperature distribution vs. time and final moisture distribution were measured. Experiments 5-8 were tests with the heater and without the hydration process, in which temperature distribution vs. time and final moisture distribution were measured. Experiments 9-10 were tests with the heater and the hydration process, in which temperature distribution vs. time, final moisture distribution and the swelling stress at some typical points vs. time were measured. The maximum test time was nearly 20 days and the minimum only 8 hours. The results show that the temperature field is little affected by the hydration process and the stress condition, but moisture transport and stress distribution are slightly affected by the thermal gradient. The results also show that the water head difference is the main driving force of the hydration process and that the swelling stress arises mainly from the hydration process. This will greatly help the understanding of heat and mass transfer in porous media and of the THM coupled processes in actual HLW disposal. (author)

  17. Motivation as a factor affecting the efficiency of cognitive processes in elderly patients with hypertension

    Directory of Open Access Journals (Sweden)

    Zinchenko, Yury P.

    2013-12-01

    Full Text Available The main purpose of the present study was to assess the role of motivation in the effective cognitive activity of elderly hypertension (HTN) patients provided with antihypertensive treatment. Twenty-five patients with stage 1-2 HTN took part in the study; their mean age was 67.6±6.1 years. The psychological examination program comprised a quantitative measurement of intelligence quotient (IQ) with the Wechsler Adult Intelligence Scale, and an investigation into the qualitative features of their cognitive processes, applying a pathopsychological study procedure (Zeigarnik, 1962, 1972) and the principles of psychological syndrome analysis (Vygotsky-Luria-Zeigarnik school). The results showed that within the psychological syndrome structure of cognitive disorders in HTN patients, the leading part is played by two syndrome-generating factors: a neurodynamic factor and a motivational factor. Patients with reduced motivation achieved poorer general test results than the group of highly motivated participants. A correlation analysis of the data revealed an interconnection between the frequency of motivational disturbances and the frequency of occurrence of various signs of cognitive decline, such as low efficiency in memorization and delayed recall, as well as lower IQ test results. The data provide a strong argument for the hypothesis that motivation is of particular importance as a factor in the generation of cognitive disorders in HTN patients.

  18. Improving laboratory efficiencies to scale-up HIV viral load testing.

    Science.gov (United States)

    Alemnji, George; Onyebujoh, Philip; Nkengasong, John N

    2017-03-01

    Viral load measurement is a key indicator that determines patients' response to treatment and risk for disease progression. Efforts are ongoing in different countries to scale up access to viral load testing to meet the Joint United Nations Programme on HIV and AIDS target of achieving 90% viral suppression among HIV-infected patients receiving antiretroviral therapy. However, the impact of these initiatives may be challenged by increased inefficiencies along the viral load testing spectrum, which translate to increased costs and ineffectiveness of scale-up approaches. This review describes different parameters that could be addressed across the viral load testing spectrum aimed at improving efficiencies and utilizing test results for patient management. Though progress is being made in some countries to scale up viral load testing, many others still face numerous challenges that may affect scale-up efficiencies: weak demand creation; ineffective supply chain management systems; poor specimen referral systems; inadequate data and quality management systems; and a weak laboratory-clinical interface leading to diminished uptake of test results. In scaling up access to viral load testing, there should be a renewed focus on addressing efficiencies across the entire spectrum, including factors related to access, uptake, and impact of test results.

  19. CLAMP - a toolkit for efficiently building customized clinical natural language processing pipelines.

    Science.gov (United States)

    Soysal, Ergin; Wang, Jingqi; Jiang, Min; Wu, Yonghui; Pakhomov, Serguei; Liu, Hongfang; Xu, Hua

    2017-11-24

    Existing general clinical natural language processing (NLP) systems such as MetaMap and Clinical Text Analysis and Knowledge Extraction System have been successfully applied to information extraction from clinical text. However, end users often have to customize existing systems for their individual tasks, which can require substantial NLP skills. Here we present CLAMP (Clinical Language Annotation, Modeling, and Processing), a newly developed clinical NLP toolkit that provides not only state-of-the-art NLP components, but also a user-friendly graphic user interface that can help users quickly build customized NLP pipelines for their individual applications. Our evaluation shows that the CLAMP default pipeline achieved good performance on named entity recognition and concept encoding. We also demonstrate the efficiency of the CLAMP graphic user interface in building customized, high-performance NLP pipelines with 2 use cases, extracting smoking status and lab test values. CLAMP is publicly available for research use, and we believe it is a unique asset for the clinical NLP community. © The Author 2017. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
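    CLAMP itself is a toolkit with a graphical pipeline builder, so the following is not its API; it is only a minimal regex sketch of the kind of lab-test-value extraction task the use case above refers to, with a hypothetical pattern and sample sentence.

```python
import re

# Hypothetical pattern for extracting (test, value, unit) triples from
# clinical text -- a toy stand-in for the "lab test values" use case,
# not CLAMP's actual components or output format.
LAB_PATTERN = re.compile(
    r"(?P<test>HbA1c|glucose|creatinine)\s*(?:of|was|:)?\s*"
    r"(?P<value>\d+(?:\.\d+)?)\s*(?P<unit>%|mg/dL)?",
    re.IGNORECASE,
)

def extract_lab_values(text: str):
    """Return a list of (test name, numeric value, unit) tuples."""
    return [(m.group("test").lower(), float(m.group("value")), m.group("unit"))
            for m in LAB_PATTERN.finditer(text)]

note = "Today's HbA1c was 7.2 % and fasting glucose 130 mg/dL."
print(extract_lab_values(note))
```

    A real clinical NLP pipeline would add tokenization, section detection, negation handling and concept encoding on top of pattern matching, which is the gap toolkits like CLAMP are built to fill.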

  20. TWRS tank waste pretreatment process development hot test siting report

    International Nuclear Information System (INIS)

    Howden, G.F.; Banning, D.L.; Dodd, D.A.; Smith, D.A.; Stevens, P.F.; Hansen, R.I.; Reynolds, B.A.

    1995-02-01

    This report is the sixth in a series that have assessed the hot testing requirements for TWRS pretreatment process development and identified the hot testing support requirements. This report, based on the previous work, identifies specific hot test work packages, matches those packages to specific hot cell facilities, and provides recommendations of specific facilities to be employed for the pretreatment hot test work. Also identified are serious limitations in the tank waste sample retrieval and handling infrastructure. Recommendations are provided for staged development of 500 mL, 3 L, 25 L and 4000 L sample recovery systems and specific actions to provide those capabilities

  1. Deployment Efficiency and Barrier Effectiveness Testing of a Temporary Anti-Personnel (TAP) Barrier System.

    Energy Technology Data Exchange (ETDEWEB)

    Allen, David James [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hedrick, Charles D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Martinez, Ruben [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-02-01

    This report documents tests conducted by Sandia National Laboratories (SNL) on behalf of the U.S. Department of State to evaluate a temporary anti-personnel (TAP) barrier system developed by Mitigation Technologies. For this, the SNL Denial and Structural Assessment department developed a test protocol for the evaluation of the TAP barrier system on the basis of deployment efficiency and barrier effectiveness against a riotous/mob attack threat. The test protocol was then executed by SNL personnel and the results of the testing are documented.

  2. Testing Weak-Form Efficiency of The Chinese Stock Market and Hong Kong Stock Market

    OpenAIRE

    Lei, Zhuolin

    2012-01-01

    This study examines the random walk hypothesis to determine the validity of weak-form efficiency for Shanghai, Shenzhen and Hong Kong Stock Exchanges. Daily returns from 2001-2010 for Shanghai A and B shares, Shenzhen A and B shares and Hong Kong Hang Seng Index are used in this study. The random walk hypothesis is examined by using four statistical methods, namely a serial correlation test, an Augmented Dickey-Fuller Unit Root test, a runs test and a variance ratio test. The empirical re...
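    One of the four methods mentioned, the runs test, can be sketched as follows; the return series is hypothetical, not Shanghai, Shenzhen or Hang Seng data.

```python
import math

def runs_test_z(returns):
    """Wald-Wolfowitz runs test on the signs of a return series.
    Under the random-walk hypothesis the number of runs is approximately
    normal; |z| > 1.96 rejects randomness at the 5% level."""
    signs = [1 if r >= 0 else 0 for r in returns]
    n1 = sum(signs)             # non-negative returns
    n2 = len(signs) - n1        # negative returns
    n = n1 + n2
    runs = 1 + sum(1 for a, b in zip(signs, signs[1:]) if a != b)
    mu = 2.0 * n1 * n2 / n + 1.0
    var = 2.0 * n1 * n2 * (2.0 * n1 * n2 - n) / (n ** 2 * (n - 1))
    return (runs - mu) / math.sqrt(var)

# Illustrative daily returns: strictly alternating signs produce many
# runs and a large positive z, i.e. the series looks non-random.
z = runs_test_z([0.01, -0.02, 0.015, -0.01, 0.02, -0.03, 0.01, -0.01])
print(round(z, 3))
```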

  3. Testing Weak Form Market Efficiency for Emerging Economies: A Nonlinear Approach

    OpenAIRE

    Omay, Nazli C.; Karadagli, Ece C.

    2010-01-01

    In this paper, we address the weak form stock market efficiency of emerging economies by testing whether the price series of these markets contain a unit root. Nonlinear behavior of stock prices is well documented in the literature, and thus linear unit root tests may not be appropriate in this case. For this purpose, we employ the nonlinear unit root test procedure recently developed by Kapetanios et al. (2003) and the nonlinear panel unit root test of Ucar and Omay (2009), which has better power than s...

  4. Influence analysis of sewage sludge methane fermentation parameters on process efficiency

    Directory of Open Access Journals (Sweden)

    Катерина Борисівна Сорокіна

    2016-12-01

    Full Text Available The dependence of the efficiency of sewage sludge organic matter decomposition on the organization and conditions of the process is analyzed. Maintaining the optimal values of several parameters ensures the completeness of the sludge fermentation process and the calculated biogas yield. Biogas utilization reduces the cost of heating the reactor and provides additional production of other types of energy

  5. Influence analysis of sewage sludge methane fermentation parameters on process efficiency

    OpenAIRE

    Катерина Борисівна Сорокіна

    2016-01-01

    The dependence of the efficiency of sewage sludge organic matter decomposition on the organization and conditions of the process is analyzed. Maintaining the optimal values of several parameters ensures the completeness of the sludge fermentation process and the calculated biogas yield. Biogas utilization reduces the cost of heating the reactor and provides additional production of other types of energy

  6. Efficient LIDAR Point Cloud Data Managing and Processing in a Hadoop-Based Distributed Framework

    Science.gov (United States)

    Wang, C.; Hu, F.; Sha, D.; Han, X.

    2017-10-01

    Light Detection and Ranging (LiDAR) is one of the most promising technologies in surveying and mapping, city management, forestry, object recognition, computer vision engineering and other fields. However, it is challenging to efficiently store, query and analyze high-resolution 3D LiDAR data due to its volume and complexity. In order to improve the productivity of LiDAR data processing, this study proposes a Hadoop-based framework to efficiently manage and process LiDAR data in a distributed and parallel manner, taking advantage of Hadoop's storage and computing capabilities. At the same time, the Point Cloud Library (PCL), an open-source project for 2D/3D image and point cloud processing, is integrated with HDFS and MapReduce to run the LiDAR data analysis algorithms provided by PCL in a parallel fashion. The experiment results show that the proposed framework can efficiently manage and process big LiDAR data.
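    The map/reduce pattern the framework relies on can be illustrated with a toy point cloud task: mappers compute per-chunk bounding boxes and a reducer merges them. Plain Python stands in for Hadoop and PCL here; the data are hypothetical.

```python
from functools import reduce

# Toy MapReduce over a LiDAR-like point cloud: each mapper emits the
# bounding box of its chunk of (x, y, z) points, and the reducer merges
# partial boxes into the global bounding box.

def map_bbox(chunk):
    """Map step: bounding box ((xmin, ymin, zmin), (xmax, ymax, zmax))."""
    xs, ys, zs = zip(*chunk)
    return (min(xs), min(ys), min(zs)), (max(xs), max(ys), max(zs))

def reduce_bbox(a, b):
    """Reduce step: merge two bounding boxes element-wise."""
    (amin, amax), (bmin, bmax) = a, b
    return (tuple(map(min, amin, bmin)), tuple(map(max, amax, bmax)))

chunks = [
    [(0.0, 1.0, 2.0), (3.0, -1.0, 0.5)],
    [(-2.0, 4.0, 1.0), (1.0, 0.0, 5.0)],
]
bbox = reduce(reduce_bbox, (map_bbox(c) for c in chunks))
print(bbox)
```

    Because the reduce step is associative and commutative, the partial results can be combined in any order across nodes, which is what makes the pattern scale on a cluster.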

  7. Testing the Week Form Efficiency of Pakistani Stock Market (2000-2010

    Directory of Open Access Journals (Sweden)

    Abdul Haque

    2011-01-01

    Full Text Available This empirical paper tests the weak form efficiency of the Pakistani stock market by examining the weekly index over the period. The return series has a leptokurtic and negatively skewed distribution, which departs from the normal distribution, as reflected by the significant Jarque-Bera statistic. Estimated results of the ADF (1979), PP (1988) and KPSS (1992) tests, the Ljung-Box Q-statistic of autocorrelations and the runs test of randomness reject the Random Walk Hypothesis (RWH) for the return series. Moreover, the results of the variance ratio test (Lo and MacKinlay, 1988) also reject the RWH and prove the robustness of the other estimated results. The rejection of the RWH reveals that Pakistani stock prices are not weak form efficient.

  8. Field test of radioactive high efficiency filter and filter exchange techniques of fuel cycle examination facility

    International Nuclear Information System (INIS)

    Hwang, Yong Hwa; Lee, Hyung Kwon; Chun, Young Bum; Park, Dae Gyu; Ahn, Sang Bok; Chu, Yong Sun; Kim, Eun Ka.

    1997-12-01

    The development of high efficiency filters was started to protect human beings from contamination by radioactive particles, toxic gases and bacilli, and gradual performance improvements led to the fabrication of today's Ultra Low Penetration Air (ULPA) filter. The application field of ULPA filters has spread not only to the air conditioning of nuclear power facilities, semiconductor industries, life science, optics, medical care and general facilities but also to the core of ultra-precision facilities. Periodic performance testing of the filters is essential to extend their life-time through effective maintenance. In particular, the bank test on HEPA filters of nuclear facilities handling radioactive materials is required for environmental safety. Nowadays, bank test technology has advanced to the use of miniaturized portable detecting instruments, and the evaluation techniques can provide high confidence in the areas of particle distribution and leakage test efficiency. (author). 16 refs., 13 tabs., 14 figs

  9. Durability and efficiency tests for direct methanol fuel cell's long-term performance assessment

    International Nuclear Information System (INIS)

    Yeh, Pulin; Chang, Chu Hsiang; Shih, Naichien; Yeh, Naichia

    2016-01-01

    This research assessed the long-term performance of direct methanol fuel cells. The experiment was performed at room temperature using 0.51∼0.651 mol/L methanol with a fuel consumption rate of 0.8 ± 0.1 cc/Wh at a stack temperature of 60 °C–70 °C. DuPont Nafion 115 proton exchange membrane was used as the base material of the MEA (membrane electrode assembly), which was then examined via a series of processes that included an I–V curve test, humidity cycle test, load cycle test, and hydrogen penetration test. The study employed membrane modification and cell structure adjustment approaches to reduce methanol crossover at the cathode and to identify the effect of the carbon paper gas diffusion layer on cell performance. The test results indicated that an efficiency of 25% can be achieved with a three-piece MEA assembly. According to the durability test, the stack power-generation efficiency was maintained at the 15%–25% level. With such efficiency, the stack voltage output stayed above 7.8 V for over 5000 h. This result is in line with industry standards. - Highlights: • Assess DMFC performance under non-optimal conditions for production readiness. • Output of 26-cell DMFC stack stays above 7.8 V after 5000 operation hours. • Power-generation efficiency of 26-cell DMFC stack maintains between 15% and 20%.
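    The reported fuel consumption rate translates directly into a specific energy figure for the fuel feed; a back-of-envelope check using only the abstract's numbers (illustrative arithmetic, not part of the study):

```python
def specific_energy(cc_per_wh):
    """Electrical energy (Wh) delivered per litre of fuel feed, given a
    consumption rate in cc of fuel per Wh of output."""
    return 1000.0 / cc_per_wh

# Nominal rate 0.8 cc/Wh, with the reported +/- 0.1 cc/Wh spread.
nominal = specific_energy(0.8)
low, high = specific_energy(0.9), specific_energy(0.7)
print(round(nominal), round(low), round(high))  # 1250 1111 1429
```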

  10. Buffer mass test - data aquisition and data processing systems

    International Nuclear Information System (INIS)

    Hagvall, B.

    1982-08-01

    This report describes data aquisition and data processing systems used for the Buffer Mass Test at Stripa. A data aquisition system, designed mainly to provide high reliability, in Stripa produces raw-data log tapes. Copies of these tapes are mailed to the computer center at the University of Luleaa for processing of raw-data. The computer systems in Luleaa offer a wide range of processing facilities: large mass storage units, several plotting facilities, programs for processing and monitoring of vast amounts of data, etc.. (Author)

  11. Is the Market Portfolio Efficient? A New Test to Revisit the Roll (1977) versus Levy and Roll (2010) Controversy

    OpenAIRE

    Marie Brière; Bastien Drut; Valérie Mignon; Kim Oosterlinck; Ariane Szafarz

    2011-01-01

    Levy and Roll (Review of Financial Studies, 2010) have recently revived the debate related to the market portfolio's efficiency suggesting that it may be mean-variance efficient after all. This paper develops an alternative test of portfolio mean-variance efficiency based on the realistic assumption that all assets are risky. The test is based on the vertical distance of a portfolio from the efficient frontier. Monte Carlo simulations show that our test outperforms the previous mean-variance ...

  12. Demonstration & Testing of ClimaStat for Improved DX Air-Conditioning Efficiency

    Science.gov (United States)

    2013-04-01

    … Performance; DDC – Direct Digital Control; DoD – Department of Defense; DOE – Department of Energy; DX – Direct Expansion; EER – Energy Efficiency Ratio … Field tests on four Trane (American Standard) systems at a university site were concluded in 2009. A production prototype was constructed based on … dehumidification, giving an 18% reduction in energy consumption. Field test data from four Trane Voyager rooftop package units at a university site …

  13. Waste water processing technology for Space Station Freedom - Comparative test data analysis

    Science.gov (United States)

    Miernik, Janie H.; Shah, Burt H.; Mcgriff, Cindy F.

    1991-01-01

    Comparative tests were conducted to choose the optimum technology for waste water processing on SSF. A thermoelectric integrated membrane evaporation (TIMES) subsystem and a vapor compression distillation subsystem (VCD) were built and tested to compare urine processing capability. Water quality, performance, and specific energy were compared for conceptual designs intended to function as part of the water recovery and management system of SSF. The VCD is considered the most mature and efficient technology and was selected to replace the TIMES as the baseline urine processor for SSF.

  14. Sustaining high energy efficiency in existing processes with advanced process integration technology

    International Nuclear Information System (INIS)

    Zhang, Nan; Smith, Robin; Bulatov, Igor; Klemeš, Jiří Jaromír

    2013-01-01

    Highlights: ► Process integration with better modelling and more advanced solution methods. ► Operational changes for better environmental performance through optimisation. ► Identification of process integration technology for operational optimisation. ► Systematic implementation procedure of process integration technology. ► A case study with crude oil distillation to demonstrate the operational flexibility. -- Abstract: To reduce emissions in the process industry, much emphasis has been put on making step changes in emission reduction, by developing new process technology and making renewable energy more affordable. However, the energy saving potential of existing systems cannot simply be ignored. In recent years, there have been significant advances in process integration technology with better modelling techniques and more advanced solution methods. These methods have been applied to new design and retrofit studies in the process industry. Here, attempts are made to apply these technologies to improve the environmental performance of existing facilities through operational changes. An industrial project was carried out to demonstrate the importance and effectiveness of exploiting operational flexibility for energy conservation. By applying advanced optimisation techniques to integrate the operation of distillation and heat recovery in a crude oil distillation unit, the energy consumption was reduced by 8% without capital expenditure. It shows that with correctly identified technology and a proper execution procedure, significant energy savings and emission reduction can be achieved very quickly without major capital expenditure. This allows the industry to improve its economic and environmental performance at the same time.

  15. A test of the survival processing advantage in implicit and explicit memory tests.

    Science.gov (United States)

    McBride, Dawn M; Thomas, Brandon J; Zimmerman, Corinne

    2013-08-01

    The present study was designed to investigate the survival processing effect (Nairne, Thompson, & Pandeirada, Journal of Experimental Psychology: Learning, Memory, and Cognition, 33, 263-273, 2007) in cued implicit and explicit memory tests. The survival effect has been well established in explicit free recall and recognition tests, but has not been evident in implicit memory tests or in cued explicit tests. In Experiment 1 of the present study, we tested implicit and explicit memory for words studied in survival, moving, or pleasantness contexts in stem completion tests. In Experiment 2, we further tested these effects in implicit and explicit category production tests. Across the two experiments, with four separate memory tasks that included a total of 525 subjects, no survival processing advantage was found, replicating the results from implicit tests reported by Tse and Altarriba (Memory & Cognition, 38, 1110-1121, 2010). Thus, although the survival effect appears to be quite robust in free recall and recognition tests, it has not been replicated in cued implicit and explicit memory tests. The similar results found for the implicit and explicit tests in the present study do not support encoding elaboration explanations of the survival processing effect.

  16. The efficiency of the crude oil markets: Evidence from variance ratio tests

    Energy Technology Data Exchange (ETDEWEB)

    Charles, Amelie, E-mail: acharles@audencia.co [Audencia Nantes, School of Management, 8 route de la Joneliere, 44312 Nantes (France); Darne, Olivier, E-mail: olivier.darne@univ-nantes.f [LEMNA, University of Nantes, IEMN-IAE, Chemin de la Censive du Tertre, 44322 Nantes (France)

    2009-11-15

    This study examines the random walk hypothesis for the crude oil markets, using daily data over the period 1982-2008. The weak-form efficient market hypothesis for two crude oil markets (UK Brent and US West Texas Intermediate) is tested with non-parametric variance ratio tests developed by [Wright J.H., 2000. Alternative variance-ratio tests using ranks and signs. Journal of Business and Economic Statistics, 18, 1-9] and [Belaire-Franch J. and Contreras D., 2004. Ranks and signs-based multiple variance ratio tests. Working paper, Department of Economic Analysis, University of Valencia] as well as the wild-bootstrap variance ratio tests suggested by [Kim, J.H., 2006. Wild bootstrapping variance ratio tests. Economics Letters, 92, 38-43]. We find that the Brent crude oil market is weak-form efficient while the WTI crude oil market seems to be inefficient over the 1994-2008 sub-period, suggesting that deregulation has not improved the efficiency of the WTI crude oil market in the sense of making returns less predictable.
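    The variance ratio at the core of these tests compares the variance of q-period returns with q times the one-period variance; under a random walk it is close to 1. A minimal homoskedastic sketch, without the bias corrections, rank/sign variants, or wild-bootstrap refinements used in the cited papers:

```python
import random
import statistics

def variance_ratio(returns, q):
    """Lo-MacKinlay variance ratio VR(q): variance of q-period
    (overlapping) returns over q times the one-period variance.
    Close to 1 under a random walk. Minimal sketch only."""
    q_sums = [sum(returns[i:i + q]) for i in range(len(returns) - q + 1)]
    return statistics.pvariance(q_sums) / (q * statistics.pvariance(returns))

random.seed(0)
iid = [random.gauss(0.0, 1.0) for _ in range(20000)]
print(round(variance_ratio(iid, 5), 2))  # close to 1 for an i.i.d. series

# A perfectly mean-reverting series collapses the ratio towards 0.
print(variance_ratio([1.0, -1.0] * 100, 2))  # 0.0
```

    Values significantly below 1 indicate mean reversion and values above 1 indicate persistence; either departure makes returns predictable and rejects weak-form efficiency.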

  17. The efficiency of the crude oil markets. Evidence from variance ratio tests

    International Nuclear Information System (INIS)

    Charles, Amelie; Darne, Olivier

    2009-01-01

    This study examines the random walk hypothesis for the crude oil markets, using daily data over the period 1982-2008. The weak-form efficient market hypothesis for two crude oil markets (UK Brent and US West Texas Intermediate) is tested with non-parametric variance ratio tests developed by [Wright J.H., 2000. Alternative variance-ratio tests using ranks and signs. Journal of Business and Economic Statistics, 18, 1-9] and [Belaire-Franch J. and Contreras D., 2004. Ranks and signs-based multiple variance ratio tests. Working paper, Department of Economic Analysis, University of Valencia] as well as the wild-bootstrap variance ratio tests suggested by [Kim, J.H., 2006. Wild bootstrapping variance ratio tests. Economics Letters, 92, 38-43]. We find that the Brent crude oil market is weak-form efficient while the WTI crude oil market seems to be inefficient over the 1994-2008 sub-period, suggesting that deregulation has not improved the efficiency of the WTI crude oil market in the sense of making returns less predictable. (author)

  18. The efficiency of the crude oil markets. Evidence from variance ratio tests

    Energy Technology Data Exchange (ETDEWEB)

    Charles, Amelie [Audencia Nantes, School of Management, 8 route de la Joneliere, 44312 Nantes (France); Darne, Olivier [LEMNA, University of Nantes, IEMN-IAE, Chemin de la Censive du Tertre, 44322 Nantes (France)

    2009-11-15

    This study examines the random walk hypothesis for the crude oil markets, using daily data over the period 1982-2008. The weak-form efficient market hypothesis for two crude oil markets (UK Brent and US West Texas Intermediate) is tested with non-parametric variance ratio tests developed by [Wright J.H., 2000. Alternative variance-ratio tests using ranks and signs. Journal of Business and Economic Statistics, 18, 1-9] and [Belaire-Franch J. and Contreras D., 2004. Ranks and signs-based multiple variance ratio tests. Working paper, Department of Economic Analysis, University of Valencia] as well as the wild-bootstrap variance ratio tests suggested by [Kim, J.H., 2006. Wild bootstrapping variance ratio tests. Economics Letters, 92, 38-43]. We find that the Brent crude oil market is weak-form efficient while the WTI crude oil market seems to be inefficient over the 1994-2008 sub-period, suggesting that deregulation has not improved the efficiency of the WTI crude oil market in the sense of making returns less predictable. (author)

  19. In Vitro Sensitivity Test in Antibiotics from the Fermentation Process in a Sugar-Alcohol Plant in the State of Paraná, Brazil

    OpenAIRE

    Lopes, Murilo Brandão; Faculdade de Apucarana – FAP; Silva, Thaís Medeiros Boldrin; Faculdade Metropolitana de Maringá – UNIFAMMA

    2011-01-01

    Since the development of different types of microorganisms is common during the fermentation process in sugar-alcohol plants, owing to the processing conditions of the raw material, microbiological control is mandatory. The in vitro sensitivity test is highly important for the fermentation process at sugar-alcohol plants since it evaluates which type of antibiotic has the best antibacterial activity. The test classifies antibiotics by their effects, namely efficient, less efficient, slightly efficient...

  20. Taylor-Made Education: The Influence of the Efficiency Movement on the Testing of Reading Skills.

    Science.gov (United States)

    Allen, JoBeth

    Much of what has developed in the testing of reading harkens back to the days of the "Cult of Efficiency" movement in education that can be largely attributed to Frederick Winslow Taylor. Taylor spent most of his productive years studying time and motion in an attempt to streamline industrial production so that people could work as…

  1. Diagnosis efficiency of urine malaria test kit for the diagnosis of ...

    African Journals Online (AJOL)

    The aim of this study was to determine the diagnostic efficiency of a urine malaria test kit, with microscopy as the gold standard, in detecting Plasmodium falciparum HRP-2, a poly-histidine antigen, in the urine of febrile patients. The study was conducted in a primary and a secondary health institution in Gombe Town, Gombe State, ...

  2. Results for the Brine Evaporation Bag (BEB) Brine Processing Test

    Science.gov (United States)

    Delzeit, Lance; Flynn, Michael; Fisher, John; Shaw, Hali; Kawashima, Brian; Beeler, David; Howard, Kevin

    2015-01-01

    The recent Brine Processing Test compared the NASA Forward Osmosis Brine Dewatering (FOBD) system, the Paragon Ionomer Water Processor (IWP), the UMPQUA Ultrasonic Brine Dewatering System (UBDS), and the NASA Brine Evaporation Bag (BEB). This paper reports the results for the BEB. The BEB was operated at 70 °C and a base pressure of 12 torr, in a batch mode, processing 0.4 L of brine per batch. Two different brine feeds were tested: a chromic acid-urine brine and a chromic acid-urine-hygiene mix brine. The chromic acid-urine brine, known as the ISS Alternate Pretreatment Brine, had an average processing rate of 95 mL/hr with a specific power of 5 kWh/L. The complete results of these tests are reported within this paper.
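    The reported rate and specific power imply the average power draw, batch duration and energy per batch; illustrative arithmetic from the abstract's figures only:

```python
# Illustrative arithmetic from the abstract's reported figures only.
rate_l_per_hr = 0.095     # 95 mL/hr average processing rate
specific_kwh_per_l = 5.0  # 5 kWh consumed per litre of brine processed
batch_l = 0.4             # brine processed per batch

avg_power_kw = rate_l_per_hr * specific_kwh_per_l  # average power draw
batch_hours = batch_l / rate_l_per_hr              # time per batch
batch_energy_kwh = batch_l * specific_kwh_per_l    # energy per batch
print(round(avg_power_kw, 3), round(batch_hours, 1), batch_energy_kwh)
# 0.475 4.2 2.0
```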

  3. Highly efficient electroluminescence from a solution-processable thermally activated delayed fluorescence emitter

    Energy Technology Data Exchange (ETDEWEB)

    Wada, Yoshimasa; Kubo, Shosei; Suzuki, Katsuaki; Kaji, Hironori, E-mail: kaji@scl.kyoto-u.ac.jp [Institute for Chemical Research, Kyoto University, Uji, Kyoto 611-0011 (Japan); Shizu, Katsuyuki [Institute for Chemical Research, Kyoto University, Uji, Kyoto 611-0011 (Japan); Center for Organic Photonics and Electronics Research (OPERA), Kyushu University, 744 Motooka, Nishi, Fukuoka 819-0395 (Japan); Tanaka, Hiroyuki [Center for Organic Photonics and Electronics Research (OPERA), Kyushu University, 744 Motooka, Nishi, Fukuoka 819-0395 (Japan); Adachi, Chihaya [Center for Organic Photonics and Electronics Research (OPERA), Kyushu University, 744 Motooka, Nishi, Fukuoka 819-0395 (Japan); Japan Science and Technology Agency (JST), ERATO, Adachi Molecular Exciton Engineering Project, 744 Motooka, Nishi, Fukuoka 819-0395 (Japan)

    2015-11-02

    We developed a thermally activated delayed fluorescence (TADF) emitter, 2,4,6-tris(4-(9,9-dimethylacridan-10-yl)phenyl)-1,3,5-triazine (3ACR-TRZ), suitable for use in solution-processed organic light-emitting diodes (OLEDs). When doped into 4,4′-bis(carbazol-9-yl)biphenyl (CBP) host at 16 wt. %, 3ACR-TRZ showed a high photoluminescence quantum yield of 98%. Transient photoluminescence decay measurements of the 16 wt. % 3ACR-TRZ:CBP film confirmed that 3ACR-TRZ exhibits efficient TADF with a triplet-to-light conversion efficiency of 96%. This high conversion efficiency makes 3ACR-TRZ attractive as an emitting dopant in OLEDs. Using 3ACR-TRZ as an emitter, we fabricated a solution-processed OLED exhibiting a maximum external quantum efficiency of 18.6%.
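    The reported figures can be checked for internal consistency. Assuming unity charge balance (a common simplification not stated in the abstract), the external quantum efficiency is roughly the internal conversion efficiency times the optical outcoupling efficiency:

```python
# Internal-consistency check on the reported numbers, assuming unity
# charge balance (an assumption for illustration, not from the paper):
# EQE ~= internal conversion efficiency x optical outcoupling.
eqe = 0.186  # maximum external quantum efficiency
iqe = 0.96   # reported triplet-to-light conversion efficiency
outcoupling = eqe / iqe
print(round(outcoupling, 2))  # 0.19, typical for planar OLED stacks
```

    The implied outcoupling of roughly 19% is consistent with the commonly cited ~20% light extraction of conventional planar OLEDs, which supports the plausibility of the reported EQE.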

  4. Design and Construction of a Test Bench to Characterize Efficiency and Reliability of High Voltage Battery Energy Storage Systems

    DEFF Research Database (Denmark)

    Blank, Tobias; Thomas, Stephan; Roggendorf, Christoph

    2010-01-01

    system efficiency. High voltage batteries may be advantageous for future medium voltage DC-grids as well. In all cases, high availability and reliability is indispensable. Investigations on the operating behavior of such systems are needed. For this purpose, a test bench for high voltage storage systems was built to analyze these processes for different battery technologies. A special safety infrastructure for the test bench was developed due to the high voltage and the storable energy of approximately 120 kWh. This paper presents the layout of the test bench for analyzing high voltage batteries with about 4,300 volts including all components, the safety requirements with the resultant safety circuit and the aim of the investigations to be performed with the test bench.

  5. Enhancement of efficiency of storage and processing of food raw materials using radiation technologies

    Energy Technology Data Exchange (ETDEWEB)

    Gracheva, A. Yu.; Zav’yalov, M. A.; Ilyukhina, N. V.; Kukhto, V. A.; Tarasyuk, V. T.; Filippovich, V. P. [All-Russia Research Institute of Preservation Technology (Russian Federation); Egorkin, A. V.; Chasovskikh, A. V. [Research Institute of Technical Physics and Automation (Russian Federation); Pavlov, Yu. S., E-mail: rad05@bk.ru [Frumkin Institute of Physical Chemistry and Electrochemistry, Russian Academy of Sciences (Russian Federation); Prokopenko, A. V., E-mail: pav14@mail.ru [National Research Nuclear University (Moscow Engineering Physics Institute) (Russian Federation); Strokova, N. E. [Moscow State University (Russian Federation); Artem’ev, S. A. [Russian Research Institute of Baking Industry (Russian Federation); Polyakova, S. P. [Russian Research Institute of Confectionery Industry (Russian Federation)

    2016-12-15

    The work is dedicated to improvement of efficiency of storage and processing of food raw materials using radiation technologies. International practice of radiation processing of food raw materials is presented and an increase in the consumption of irradiated food products is shown. The prospects of using radiation technologies for the processing of food products in Russia are discussed. The results of studies of radiation effects on various food products and packaging film by γ radiation and accelerated electrons are presented.

  6. Enhancement of efficiency of storage and processing of food raw materials using radiation technologies

    International Nuclear Information System (INIS)

    Gracheva, A. Yu.; Zav’yalov, M. A.; Ilyukhina, N. V.; Kukhto, V. A.; Tarasyuk, V. T.; Filippovich, V. P.; Egorkin, A. V.; Chasovskikh, A. V.; Pavlov, Yu. S.; Prokopenko, A. V.; Strokova, N. E.; Artem’ev, S. A.; Polyakova, S. P.

    2016-01-01

    The work is dedicated to improvement of efficiency of storage and processing of food raw materials using radiation technologies. International practice of radiation processing of food raw materials is presented and an increase in the consumption of irradiated food products is shown. The prospects of using radiation technologies for the processing of food products in Russia are discussed. The results of studies of radiation effects on various food products and packaging film by γ radiation and accelerated electrons are presented.

  7. Data processing codes for fatigue and tensile tests

    International Nuclear Information System (INIS)

    Sanchez Sarmiento, Gustavo; Iorio, A.F.; Crespi, J.C.

    1981-01-01

    The processing of fatigue and tensile test data to obtain several parameters of engineering interest requires considerable numerical calculation. In order to reduce the time spent on this work and to establish standard data processing for sets of similar tests, it is very advantageous to have a calculation code to run on a computer. Two codes have been developed in FORTRAN: one predicts cyclic properties of materials from monotonic and incremental or multiple cyclic step tests (ENSPRED code), and the other reduces data coming from strain-controlled low cycle fatigue tests (ENSDET code). Two examples are included using Zircaloy-4 material from different manufacturers. (author) [es

  8. Confirmation test of powder mixing process in J-MOX

    International Nuclear Information System (INIS)

    Ota, Hiroshi; Osaka, Shuichi; Kurita, Ichiro

    2009-01-01

    Japan Nuclear Fuel Ltd. (hereafter, JNFL) MOX Fuel Fabrication Plant (hereafter, J-MOX) fabricates MOX fuel for domestic light water power plants. Development of the design concept of J-MOX started in the mid 90's, and the frame of the J-MOX process was clarified around 2000, including the adoption of the MIMAS process as a part of the J-MOX powder process. JNFL requires answers to any technical questions that have never before been clarified by the world's MOX and/or uranium fabricators before it commissions equipment procurement. J-MOX is to be constructed adjacent to the Rokkasho Reprocessing Plant (RRP) and to utilize MH-MOX powder recovered at RRP. The combination of the MIMAS process and the MH-MOX powder has never been tried anywhere in the world. Therefore, JNFL started a series of confirmation tests, of which the most important is the powder test to confirm the applicability of MH-MOX powder to the MIMAS process. The MH-MOX powder, consisting of 50% plutonium oxide and 50% uranium oxide, originates from JAEA development utilizing microwave heating (MH) technology. The powder test started with laboratory-scale small equipment utilizing both uranium and MOX powder in 2000, solved tough problems such as powder adhesion onto equipment, and was then followed by a large-scale equipment test, again with uranium and MOX powder. For the MOX test, actual-size equipment within a glovebox was manufactured and installed in the JAEA plutonium fuel center in 2005, and based on the results taken so far, an understanding was developed that the MIMAS equipment, with the MH-MOX powder, can produce MOX pellets of almost the same quality as those fabricated in Europe. The test was finished at the end of Japanese fiscal year (JFY) 2007, and it was confirmed that the MOX pellets fabricated in this test almost satisfied the targeted specifications set for domestic LWR MOX fuels. (author)

  9. Comparison of ultrafiltration and dissolved air flotation efficiencies in industrial units during the papermaking process

    OpenAIRE

    Monte Lara, Concepción; Ordóñez Sanz, Ruth; Hermosilla Redondo, Daphne; Sánchez González, Mónica; Blanco Suárez, Ángeles

    2011-01-01

    The efficiency of an ultrafiltration unit has been studied and compared with a dissolved air flotation system to obtain water of a quality suited to reuse in the process. The study was done at a paper mill producing lightweight coated paper and newsprint from 100% recovered paper. Efficiency was analysed by the removal of turbidity, cationic demand, total and dissolved chemical oxygen demand, hardness, sulphates and microstickies. Moreover, the performance of the ultrafiltration unit an...

  10. A mechanistic understanding of processing additive-induced efficiency enhancement in bulk heterojunction organic solar cells

    KAUST Repository

    Schmidt, Kristin

    2013-10-31

    The addition of processing additives is a widely used approach to increase power conversion efficiencies for many organic solar cells. We present how additives change the polymer conformation in the casting solution leading to a more intermixed phase-segregated network structure of the active layer which in turn results in a 5-fold enhancement in efficiency. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. Development of Test Rig for Robotization of Mining Technological Processes - Oversized Rock Breaking Process Case

    Science.gov (United States)

    Pawel, Stefaniak; Jacek, Wodecki; Jakubiak, Janusz; Zimroz, Radoslaw

    2017-12-01

    The production chain (PCh) in an underground copper ore mine consists of several subprocesses. From our perspective, implementation of the so-called ZEPA approach (Zero Entry Production Area) might be very interesting [16]. In practice, it leads to automation/robotization of subprocesses in the production area. In this paper, a specific part of the PCh was investigated, i.e. the place where cyclic transport by LHDs is replaced with continuous transport by a conveying system. Such a place is called a dumping point. The objective of dumping points with a screen is primary classification of the material (into coarse and fine material) and breaking oversized rocks with a hydraulic hammer. Current challenges for underground mining include safety improvement as well as production optimization related to bottlenecks, stoppages and the operational efficiency of the machines. As a first step, remote control of the hydraulic hammer has been introduced, which not only moved the operator to a safe workplace, but also allowed for a more comfortable work environment and control over multiple technical objects by a single person. A literature analysis shows that the mining industry around the world is oriented towards automation and robotization of mining processes and reveals technological readiness for the 4th industrial revolution. The paper focuses on a preliminary analysis of the possibilities of applying a robotic system to the rock-breaking process. A prototype test rig has been proposed and experimental work has been carried out. Automatic algorithms for the detection of oversized rocks, crushing them, and sweeping and loosening the material have been formulated. Obviously, many simplifications have been assumed. Some near-future works have been proposed.
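    At its simplest, the primary classification at a dumping point reduces to routing fragments by size; a deliberately simplified sketch of that decision rule, in which the aperture size and fragment widths are assumptions and not the authors' parameters:

```python
# Simplified sketch of the oversize-routing decision at a dumping
# point; aperture size and fragment widths are assumed, not the
# authors' parameters.
APERTURE_MM = 400.0  # assumed screen aperture

def classify_fragments(widths_mm):
    """Split fragment widths into (oversize, passing) lists: oversize
    fragments go to the hydraulic hammer, the rest pass the screen."""
    oversize = [w for w in widths_mm if w > APERTURE_MM]
    passing = [w for w in widths_mm if w <= APERTURE_MM]
    return oversize, passing

oversize, passing = classify_fragments([120.0, 560.0, 380.0, 700.0])
print(len(oversize), len(passing))  # 2 2
```

    The paper's detection algorithms operate on sensor data rather than known widths; this sketch only shows the routing logic that such detection feeds.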

  12. Accelerated life testing design using geometric process for pareto distribution

    OpenAIRE

    Mustafa Kamal; Shazia Zarrin; Arif Ul Islam

    2013-01-01

    In this paper the geometric process is used for the analysis of accelerated life testing under constant stress for the Pareto distribution. Assuming that the lifetimes under increasing stress levels form a geometric process, estimates of the parameters are obtained using the maximum likelihood method for complete data. In addition, asymptotic interval estimates of the parameters of the distribution are obtained using the Fisher information matrix. The statistical properties of the parameters ...
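    For a single stress level with complete data, the Pareto maximum-likelihood estimate of the shape parameter has a closed form; the sketch below fits that one-level case and does not reproduce the paper's geometric-process linkage across stress levels:

```python
import math
import random

def pareto_shape_mle(sample):
    """Closed-form MLE of the Pareto shape parameter alpha, taking the
    sample minimum as the scale x_m. Single-stress-level sketch only;
    the paper chains stress levels through a geometric process, which
    is not reproduced here."""
    x_m = min(sample)
    return len(sample) / sum(math.log(x / x_m) for x in sample)

random.seed(1)
alpha_true, x_m = 3.0, 1.0
# Inverse-CDF sampling: X = x_m * U**(-1/alpha) is Pareto(alpha, x_m).
sample = [x_m * random.random() ** (-1.0 / alpha_true) for _ in range(5000)]
print(pareto_shape_mle(sample))  # close to the true shape 3.0
```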

  13. Efficient and Stable Carbon-coated Nickel Foam Cathodes for the Electro-Fenton Process

    International Nuclear Information System (INIS)

    Song, Shuqin; Wu, Mingmei; Liu, Yuhui; Zhu, Qiping; Tsiakaras, Panagiotis; Wang, Yi

    2015-01-01

    Highlights: • Carbon-coated nickel foam (C@NF) was prepared by a cyclic carbon-coating process. • Ni leaching can be effectively controlled at the C@NF4 (4 coating cycles) cathode. • C@NF4 exhibits excellent electro-Fenton performance with desirable stability. • C@NF4 exhibits low energy consumption for DMP degradation. - Abstract: Carbon-coated nickel foam (C@NF) electrodes were prepared via a simple and effective hydrothermal-carbonization cyclic coating process, characterized by scanning electron microscopy (SEM) with energy dispersive spectrometry (EDS), and employed as the electro-Fenton (E-Fenton) cathode for degrading dimethyl phthalate (DMP) in aqueous solution. For the sake of comparison, a nickel foam (NF) electrode and the conventional E-Fenton cathode (graphite gas diffusion electrode (GDE)) were also tested. Experimental results indicate that nickel leaching can be effectively controlled at the C@NF4 cathode (4 coating cycles), which has great significance for promoting the application of NF in E-Fenton systems. Moreover, the C@NF4 cathode presents excellent performance in DMP degradation: DMP can be completely degraded within 2 h at −0.5 V, and the total organic carbon (TOC) removal reaches as high as 82.1%, almost 3 times that at the graphite GDE. Furthermore, the current efficiency for H2O2 generation at C@NF4 is enhanced 12-fold compared to that at NF, and consequently the energy consumption during DMP degradation at C@NF4 is obviously lower than that at both the NF cathode and the graphite GDE. From the obtained results it can be deduced that C@NF4 is a promising and attractive alternative E-Fenton cathode for removing organic pollutants from wastewater

  14. Mobile Energy Laboratory energy-efficiency testing programs. Semiannual report, April 1, 1991--September 30, 1991

    Energy Technology Data Exchange (ETDEWEB)

    Parker, G. B.; Currie, J. W.

    1992-03-01

    This report summarizes energy-efficiency testing activities applying the Mobile Energy Laboratory (MEL) testing capabilities during the third and fourth quarters of fiscal year (FY) 1991. The MELs, developed by the US Department of Energy (DOE) Federal Energy Management Program (FEMP), are administered by Pacific Northwest Laboratory (PNL) and the Naval Energy and Environmental Support Activity (NEESA) for energy testing and energy conservation program support functions at federal facilities. The using agencies principally fund MEL applications, while DOE/FEMP funds program administration and capability enhancement activities. This report fulfills the requirements established in Section 8 of the MEL Use Plan (PNL-6861) for semi-annual reporting on energy-efficiency testing activities using the MEL capabilities. The MEL Use Committee, formally established in 1989, developed the MEL Use Plan and meets semi-annually to establish priorities for energy-efficiency testing applications using the MEL capabilities. The MEL Use Committee is composed of one representative each of the US Department of Energy, US Army, US Air Force, US Navy, and other federal agencies.

  15. A new device to test cutting efficiency of mechanical endodontic instruments

    Science.gov (United States)

    Rubini, Alessio Giansiracusa; Plotino, Gianluca; Al-Sudani, Dina; Grande, Nicola M.; Putorti, Ermanno; Sonnino, GianPaolo; Cotti, Elisabetta; Testarelli, Luca; Gambarini, Gianluca

    2014-01-01

    Background The purpose of the present study was to introduce a new device specifically designed to evaluate the cutting efficiency of mechanically driven endodontic instruments. Material/Methods Twenty new Reciproc R25 (VDW, Munich, Germany) files were investigated in the new device developed to test the cutting ability of endodontic instruments. The device consists of a main frame, to which a mobile plastic support for the hand-piece is connected, and a stainless-steel block containing a Plexiglas block against which the cutting efficiency of the instruments was tested. The length of the block cut in 1 minute was measured in a computerized program with a precision of 0.1 mm. The instruments were activated by using a torque-controlled motor (Silver Reciproc; VDW, Munich, Germany) in a reciprocating movement by the “Reciproc ALL” program (Group 1) and in counter-clockwise rotation at 300 rpm (Group 2). Mean and standard deviations of each group were calculated and data were statistically analyzed with a one-way ANOVA test (P < 0.05). Conclusions The cutting testing device evaluated in the present study was reliable and easy to use and may be effectively used to test the cutting efficiency of both rotary and reciprocating mechanical endodontic instruments. PMID:24603777

  16. ENERGY EFFICIENCY OF DIESEL LOCOMOTIVE HYDRAULIC TRANSMISSION TESTS AT LOCOMOTIVE REPAIR PLANT

    Directory of Open Access Journals (Sweden)

    B. E. Bodnar

    2015-10-01

    Full Text Available Purpose. In difficult economic conditions, reducing the electricity consumed for production needs is an urgent task for the country’s industrial enterprises. Technical specifications of enterprises that repair diesel locomotive hydraulic transmissions recommend conducting a certain amount of evaluation and regulatory testing to monitor their condition after repair. Experience shows that a significant portion of hydraulic transmission defects is revealed by bench tests. The advantages of bench tests include the ability to detect defects after repair, ease of maintenance of the hydraulic transmission, and the relatively low labour intensity of eliminating defects. The quality of these tests determines the transmission’s service life and efficiency. The purpose of this work is to improve the technology of post-repair plant testing of hydraulic transmissions in order to reduce electricity consumption during testing. Methodology. The possible options for improving the hydraulic transmission test bench were analysed. An energy-efficient method for testing diesel locomotive hydraulic transmissions in a locomotive repair plant environment was proposed. This is achieved by installing an additional drive motor which receives power from the load generator. Findings. Based on the conducted analysis, the necessity of improving the plant bench testing of hydraulic transmissions was proved. The variants of stand modernization were examined and analysed. Originality. The possibility of using the electric power of the load generator to power the stand’s electric drive motor or the additional drive motor was theoretically substantiated. Practical value. A variant of a hydraulic transmission test stand based on the mutual load method was proposed. Using this method increases the hydraulic transmission load range while the power consumed by the stand remains unchanged. The additional drive motor will increase the speed of the input shaft, which in turn will allow testing in

  17. Minimum detection efficiencies for a loophole-free observable-asymmetric Bell-type test

    International Nuclear Information System (INIS)

    Garbarino, G.

    2010-01-01

    We discuss the problem of finding the most favorable conditions for closing the detection loophole in a test of local realism with a Bell inequality. For a generic nonmaximally entangled two-qubit state and two incompatible bases to be adopted for alternative measurements of two observables a and b on each party, we apply Hardy's proof of nonlocality without inequality and derive an Eberhard-like inequality. For an infinity of nonmaximally entangled states we find that it is possible to refute local realism by requiring perfect detection efficiency for only one of the two observables, say b, to be measured on each party: The test is free from the detection loophole for any value of the detection efficiency corresponding to the other observable a. The maximum tolerable noise in such a loophole-free observable-asymmetric test is also evaluated.

  18. Testing of modular industrial solar retrofit industrial process steam systems

    Energy Technology Data Exchange (ETDEWEB)

    Cameron, C.P.; Dudley, V.E.

    1984-06-13

    Under the Department of Energy's Modular Industrial Solar Retrofit project, five industrial process heat systems incorporating line-focus solar collectors were designed and hardware was installed and tested at Sandia National Laboratories and the Solar Energy Research Institute. System designers and collector manufacturers participating in the project included Acurex Solar Corporation, BDM, Inc., Custom Engineering, Inc., Foster Wheeler Solar Development Corporation, Solar Kinetics, Inc., and Suntec Systems, Inc. This paper describes the testing of the qualification test systems, which has been under way since mid-1982. Each qualification test system includes an equipment skid sufficient to support a collector field of 2300 m² aperture and one delta-temperature string of from 175 to 460 m² aperture. Each system is capable of producing saturated steam at 1.7 MPa and operates at maximum outlet temperatures of from 250 to 290°C. The test series includes function and safety tests to determine that the systems operate as specified, an unattended operation test of at least two weeks' duration, performance tests to allow prediction of annual system performance, and life-cycle tests to evaluate component lifetime and maintenance requirements. Since the start of testing, some twenty-five modifications have been made to the various systems for the purpose of improving system performance and/or reliability, and appropriate tests of these modifications have been made or are under way. This paper presents a description of the approach to testing of the MISR systems and selected test results.

  19. Test generation for digital circuits using parallel processing

    Science.gov (United States)

    Hartmann, Carlos R.; Ali, Akhtar-Uz-Zaman M.

    1990-12-01

    The problem of test generation for digital logic circuits is an NP-Hard problem. Recently, the availability of low cost, high performance parallel machines has spurred interest in developing fast parallel algorithms for computer-aided design and test. This report describes a method of applying a 15-valued logic system for digital logic circuit test vector generation in a parallel programming environment. A concept called fault site testing allows for test generation, in parallel, that targets more than one fault at a given location. The multi-valued logic system allows results obtained by distinct processors and/or processes to be merged by means of simple set intersections. A machine-independent description is given for the proposed algorithm.

  20. Test Plan: Sludge Treatment Project Corrosion Process Chemistry Follow-on Testing

    Energy Technology Data Exchange (ETDEWEB)

    Delegard, Calvin H.; Schmidt, Andrew J.; Poloski, Adam P.

    2007-08-17

    This test plan was prepared by the Pacific Northwest National Laboratory (PNNL) under contract with Fluor Hanford (FH). The test plan describes the scope and conditions to be used to perform laboratory-scale testing of the Sludge Treatment Project (STP) hydrothermal treatment of K Basin sludge. The STP, managed for the U. S. Department of Energy (DOE) by FH, was created to design and operate a process to eliminate uranium metal from the sludge prior to packaging for Waste Isolation Pilot Plant (WIPP) by using high temperature liquid water to accelerate the reaction, produce uranium dioxide from the uranium metal, and safely discharge the hydrogen. The proposed testing builds on the approach and laboratory test findings for both K Basin sludge and simulated sludge garnered during prior testing from September 2006 to March 2007. The outlined testing in this plan is designed to yield further understanding of the nature of the chemical reactions, the effects of compositional and process variations and the effectiveness of various strategies to mitigate the observed high shear strength phenomenon observed during the prior testing. These tests are designed to provide process validation and refinement vs. process development and design input. The expected outcome is to establish a level of understanding of the chemistry such that successful operating strategies and parameters can be implemented within the confines of the existing STP corrosion vessel design. In July 2007, the DOE provided direction to FH regarding significant changes to the scope of the overall STP. As a result of the changes, FH directed PNNL to stop work on most of the planned activities covered in this test plan. Therefore, it is unlikely the testing described here will be performed. However, to preserve the test strategy and details developed to date, the test plan has been published.

  1. Highly Efficient Reproducible Perovskite Solar Cells Prepared by Low-Temperature Processing

    Directory of Open Access Journals (Sweden)

    Hao Hu

    2016-04-01

    Full Text Available In this work, we describe the role of the different layers in perovskite solar cells to achieve reproducible, ~16% efficient perovskite solar cells. We used a planar device architecture with PEDOT:PSS on the bottom, followed by the perovskite layer and an evaporated C60 layer before deposition of the top electrode. No high temperature annealing step is needed, which also allows processing on flexible plastic substrates. Only the optimization of all of these layers leads to highly efficient and reproducible results. In this work, we describe the effects of different processing conditions, especially the influence of the C60 top layer on the device performance.

  2. A method to determine stratification efficiency of thermal energy storage processes independently from storage heat losses

    DEFF Research Database (Denmark)

    Haller, M.Y.; Yazdanshenas, Eshagh; Andersen, Elsa

    2010-01-01

    A new method for the calculation of a stratification efficiency of thermal energy storages based on the second law of thermodynamics is presented. The biasing influence of heat losses is studied theoretically and experimentally. Theoretically, it does not make a difference if the stratification ... process is in agreement with the first law of thermodynamics. A comparison of the stratification efficiencies obtained from experimental results of charging, standby, and discharging processes gives meaningful insights into the different mixing behaviors of a storage tank that is charged and discharged ...

  3. Validation of the Vanderbilt Holistic Face Processing Test.

    Science.gov (United States)

    Wang, Chao-Chih; Ross, David A; Gauthier, Isabel; Richler, Jennifer J

    2016-01-01

    The Vanderbilt Holistic Face Processing Test (VHPT-F) is a new measure of holistic face processing with better psychometric properties relative to prior measures developed for group studies (Richler et al., 2014). In fields where psychologists study individual differences, validation studies are commonplace and the concurrent validity of a new measure is established by comparing it to an older measure with established validity. We follow this approach and test whether the VHPT-F measures the same construct as the composite task, which is a group-based measure at the center of the large literature on holistic face processing. In Experiment 1, we found a significant correlation between holistic processing measured in the VHPT-F and the composite task. Although this correlation was small, it was comparable to the correlation between holistic processing measured in the composite task with the same faces, but different target parts (top or bottom), which represents a reasonable upper limit for correlations between the composite task and another measure of holistic processing. These results confirm the validity of the VHPT-F by demonstrating shared variance with another measure of holistic processing based on the same operational definition. These results were replicated in Experiment 2, but only when the demographic profile of our sample matched that of Experiment 1.

  4. Validation of the Vanderbilt Holistic Face Processing Test.

    Directory of Open Access Journals (Sweden)

    Chao-Chih Wang

    2016-11-01

    Full Text Available The Vanderbilt Holistic Face Processing Test (VHPT-F) is a new measure of holistic face processing with better psychometric properties relative to prior measures developed for group studies (Richler et al., 2014). In fields where psychologists study individual differences, validation studies are commonplace and the concurrent validity of a new measure is established by comparing it to an older measure with established validity. We follow this approach and test whether the VHPT-F measures the same construct as the composite task, which is a group-based measure at the center of the large literature on holistic face processing. In Experiment 1, we found a significant correlation between holistic processing measured in the VHPT-F and the composite task. Although this correlation was small, it was comparable to the correlation between holistic processing measured in the composite task with the same faces, but different target parts (top or bottom), which represents a reasonable upper limit for correlations between the composite task and another measure of holistic processing. These results confirm the validity of the VHPT-F by demonstrating shared variance with another measure of holistic processing based on the same operational definition. These results were replicated in Experiment 2, but only when the demographic profile of our sample matched that of Experiment 1.

  5. Solar fuel processing efficiency for ceria redox cycling using alternative oxygen partial pressure reduction methods

    International Nuclear Information System (INIS)

    Lin, Meng; Haussener, Sophia

    2015-01-01

    Solar-driven non-stoichiometric thermochemical redox cycling of ceria for the conversion of solar energy into fuels shows promise in achieving high solar-to-fuel efficiency. This efficiency is significantly affected by the operating conditions, e.g. redox temperatures, reduction and oxidation pressures, solar irradiation concentration, or heat recovery effectiveness. We present a thermodynamic analysis of five redox cycle designs to investigate the effects of working conditions on fuel production. We focused on the influence of approaches to reducing the partial pressure of oxygen in the reduction step, namely mechanical approaches (sweep gassing or vacuum pumping), chemical approaches (a chemical scavenger), and combinations thereof. The results indicated that the sweep gas schemes work more efficiently at non-isothermal than at isothermal conditions, and that efficient gas-phase heat recovery and sweep gas recycling are important to ensure efficient fuel processing. The vacuum pump scheme achieved its best efficiencies at isothermal conditions, and at non-isothermal conditions heat recovery was less essential. The use of oxygen scavengers combined with sweep gas and vacuum pump schemes further increased the system efficiency. The present work can be used to predict the performance of solar-driven non-stoichiometric redox cycles and further offers quantifiable guidelines for system design and operation. - Highlights: • A thermodynamic analysis was conducted for ceria-based thermochemical cycles. • Five novel cycle designs and various operating conditions were proposed and investigated. • The pressure reduction method affects the optimal operating conditions for maximized efficiency. • A chemical oxygen scavenger proves promising for further increasing efficiency. • Formulation of quantifiable design guidelines for economically competitive solar fuel processing

  6. Dual-Process Theories of Reasoning: The Test of Development

    Science.gov (United States)

    Barrouillet, Pierre

    2011-01-01

    Dual-process theories have become increasingly influential in the psychology of reasoning. Though the distinction they introduced between intuitive and reflective thinking should have strong developmental implications, the developmental approach has rarely been used to refine or test these theories. In this article, I review several contemporary…

  7. Testing a Constrained MPC Controller in a Process Control Laboratory

    Science.gov (United States)

    Ricardez-Sandoval, Luis A.; Blankespoor, Wesley; Budman, Hector M.

    2010-01-01

    This paper describes an experiment performed by the fourth year chemical engineering students in the process control laboratory at the University of Waterloo. The objective of this experiment is to test the capabilities of a constrained Model Predictive Controller (MPC) to control the operation of a Double Pipe Heat Exchanger (DPHE) in real time.…

  8. Research process of nondestructive testing pitting corrosion in metal material

    Directory of Open Access Journals (Sweden)

    Bo ZHANG

    2017-12-01

    Full Text Available Pitting corrosion directly affects the usability and service life of metal materials, so effective nondestructive testing and evaluation of pitting corrosion is of great significance for fatigue life prediction, which it supports with data. The features of pitting corrosion are elaborated, and the relation between pitting corrosion parameters and fatigue performance is pointed out. Through introducing the fundamental principles of pitting corrosion testing, mainly magnetic flux leakage inspection, pulsed eddy current, and guided waves, the research status of nondestructive testing technology for pitting corrosion is summarized, and the key steps of the nondestructive testing technologies are compared and analyzed, from theoretical model and signal processing through to industrial application. Based on analysis of the signal-processing particularities of the different nondestructive testing technologies in detecting pitting corrosion, visualization combined with image processing and signal analysis is identified as the critical problem in accurately extracting pit defect information and quantitatively characterizing pitting corrosion. Further study of non-contact nondestructive testing technologies is important for improving detection precision and industrial applicability.

  9. Simple Retrofit High-Efficiency Natural Gas Water Heater Field Test

    Energy Technology Data Exchange (ETDEWEB)

    Schoenbauer, Ben [NorthernSTAR, St. Paul, MN (United States)

    2017-03-01

    High-performance water heaters are typically more time-consuming and costly to install in retrofit applications, making them difficult to justify economically. However, recent advancements in high-performance water heaters have targeted the retrofit market, simplifying installations and reducing costs. Four high-efficiency natural gas water heaters designed specifically for retrofit applications were installed in single-family homes along with detailed monitoring systems to characterize their savings potential, their installed efficiencies, and their ability to meet household demands. The water heaters tested for this project were designed to improve the cost-effectiveness and increase market penetration of high-efficiency water heaters in the residential retrofit market. The retrofit high-efficiency water heaters achieved their goal of reducing costs, matching the savings potential and installed efficiency of other high-efficiency water heaters, and meeting the necessary capacity in order to improve cost-effectiveness. However, the improvements were not sufficient to achieve simple paybacks of less than ten years for the incremental cost compared to a minimum-efficiency heater. Significant changes would be necessary to reduce the simple payback to six years or less. Annual energy savings in the range of $200 would also reduce paybacks to less than six years. These energy savings would require either significantly higher fuel costs (greater than $1.50 per therm) or very high usage (around 120 gallons per day). For current incremental costs, the water heater efficiency would need to be similar to that of a heat pump water heater to deliver a six-year payback.
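
The payback arithmetic behind the abstract's conclusions is simply the ratio of incremental installed cost to annual energy-cost savings. The figures below are illustrative, loosely based on the ranges quoted in the abstract, not the study's actual cost data:

```python
# Simple payback = incremental installed cost / annual energy-cost savings.
# The numbers here are illustrative, not measured values from the study.
def simple_payback_years(incremental_cost, annual_savings):
    return incremental_cost / annual_savings

# e.g. a $1200 incremental cost (assumed) with ~$200/yr savings pays back in
# 6 years, consistent with the six-year target discussed in the abstract.
print(simple_payback_years(1200, 200))  # → 6.0
```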

  10. Simple Retrofit High-Efficiency Natural Gas Water Heater Field Test

    Energy Technology Data Exchange (ETDEWEB)

    Schoenbauer, Ben [NorthernSTAR, St. Paul, MN (United States)

    2017-03-28

    High-performance water heaters are typically more time-consuming and costly to install in retrofit applications, making them difficult to justify economically. However, recent advancements in high-performance water heaters have targeted the retrofit market, simplifying installations and reducing costs. Four high-efficiency natural gas water heaters designed specifically for retrofit applications were installed in single-family homes along with detailed monitoring systems to characterize their savings potential, their installed efficiencies, and their ability to meet household demands. The water heaters tested for this project were designed to improve the cost-effectiveness and increase market penetration of high-efficiency water heaters in the residential retrofit market. The retrofit high-efficiency water heaters achieved their goal of reducing costs, matching the savings potential and installed efficiency of other high-efficiency water heaters, and meeting the necessary capacity in order to improve cost-effectiveness. However, the improvements were not sufficient to achieve simple paybacks of less than ten years for the incremental cost compared to a minimum-efficiency heater. Significant changes would be necessary to reduce the simple payback to six years or less. Annual energy savings in the range of $200 would also reduce paybacks to less than six years. These energy savings would require either significantly higher fuel costs (greater than $1.50 per therm) or very high usage (around 120 gallons per day). For current incremental costs, the water heater efficiency would need to be similar to that of a heat pump water heater to deliver a six-year payback.

  11. Construction Of A Computerised Information-Processing Test Battery

    Directory of Open Access Journals (Sweden)

    Johann M. Schepers

    2002-09-01

    Full Text Available The primary goal of the study was to construct a computerised information-processing test battery to measure choice reaction time for up to and including six bits of information, to measure discrimination reaction time with regard to colour patterns and form patterns, to measure rate of information processing with regard to perceptual stimuli and conceptual reasoning, and to develop a suitable scoring system for the respective tests. The battery of tests was applied to 58 pilots.

  12. Simulation based energy-resource efficient manufacturing integrated with in-process virtual management

    Science.gov (United States)

    Katchasuwanmanee, Kanet; Cheng, Kai; Bateman, Richard

    2016-09-01

    As energy efficiency is one of the key essentials of sustainability, the development of an energy-resource-efficient manufacturing system is among the great challenges facing industry today. Meanwhile, advanced technological innovation has created more complex manufacturing systems that involve a large variety of processes and machines serving different functions. To extend the limited knowledge on energy-efficient scheduling, the research presented in this paper attempts to model the production schedule of an operation process by considering the balance between reduced energy consumption in production, production work flow (productivity), and quality. An innovative systematic approach to manufacturing energy-resource efficiency is proposed, with virtual simulation as a predictive modelling enabler that provides real-time manufacturing monitoring, virtual displays, and decision making, and consequently an analytical, multidimensional correlation analysis of the interdependent relationships among energy consumption, work flow, and quality errors. The regression analysis results demonstrate positive relationships between work flow and quality errors and between work flow and energy consumption. When production scheduling is controlled through optimization of work flow, quality errors, and overall energy consumption, energy-resource efficiency can be achieved in production. Together, this proposed multidimensional modelling and analysis approach provides optimal conditions for production scheduling in the manufacturing system by taking account of production quality, energy consumption, and resource efficiency, which can lead to key competitive advantages and the sustainability of system operations in industry.
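
The positive relationships reported from the regression analysis can be sketched as ordinary least-squares slopes on shift-level data. The data below are invented purely for illustration, not taken from the paper:

```python
import numpy as np

# Hypothetical shift data: work flow (jobs/h), quality errors, energy (kWh).
workflow = np.array([10, 12, 14, 16, 18, 20], dtype=float)
errors   = np.array([1.1, 1.4, 1.5, 1.9, 2.1, 2.4])
energy   = np.array([50, 55, 62, 66, 73, 79], dtype=float)

# Degree-1 least-squares fit; a positive slope indicates the positive
# relationship the abstract reports between work flow and each response.
def slope(x, y):
    return np.polyfit(x, y, 1)[0]

print(slope(workflow, errors) > 0, slope(workflow, energy) > 0)  # → True True
```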

  13. Take-Off Efficiency: Transformation of Mechanical Work Into Kinetic Energy During the Bosco Test

    Directory of Open Access Journals (Sweden)

    Jandova Sona

    2017-09-01

    Full Text Available Purpose. The aim of the study is to present a new method for determining the efficiency of take-off during a 60-s Bosco repeated vertical jump test. Method. The study involved 15 physical education students (age: 21.5 ± 2.4 years; height: 1.81 ± 0.08 m; mass: 76 ± 9 kg). The data were collected with the use of a pedobarographic system (Pedar-x; Novel, Munich, Germany). The statistical analysis utilized a simple linear regression model. Results. Owing to possible fatigue, flight time and flight height decreased. The average flight height was 0.260 ± 0.063 m, and the average contact time equalled 0.54 ± 0.16 s. The average anaerobic power values calculated for the 60-s work period had a mean value of 21.9 ± 6.7 W·kg BW⁻¹; there was a statistically significant (p < 0.05) decrease in anaerobic power during the 60-s Bosco test. Conclusions. The efficiency of mechanical work was highest at the beginning of the test, reaching values of up to 50%. The efficiency of the conversion of mechanical work into kinetic energy seems to be an appropriate determinant of rising fatigue during the 60-s Bosco jumping test.
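
In repeated-jump protocols such as the Bosco test, flight height is conventionally derived from flight time under projectile assumptions, h = g·t_f²/8. A minimal sketch with illustrative numbers:

```python
# Jump height from flight time under projectile assumptions, standard in
# the Bosco test: h = g * t_f**2 / 8.  The flight time below is illustrative.
G = 9.81  # gravitational acceleration, m/s^2

def flight_height(t_flight):
    """Flight height (m) from flight time (s)."""
    return G * t_flight**2 / 8.0

# A flight time of ~0.46 s corresponds to roughly the 0.26 m average
# flight height reported in the abstract.
h = flight_height(0.46)
print(round(h, 3))  # → 0.259
```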

  14. Design and testing of solar dryers for processing food wastes

    Energy Technology Data Exchange (ETDEWEB)

    Nijmeh, M.N.; Ragab, A.S.; Emeish, M.S. [University of Jordan, Amman (Jordan). Mechanical Engineering Dept.]; Jubran, B.A. [International Islamic University of Malaysia, Kuala Lumpur (Malaysia). Dept. of Mechanical Engineering]

    1998-12-01

    This paper investigates the potential of using two solar dryers manufactured from locally available materials under Jordanian climatic conditions for drying food wastes for utilization as animal feed. The first dryer is a radiative-convective type, while the second is a solar boiler dryer. Tests were also conducted to investigate the nutritious values of the dried products and their suitability as animal feed. It was found from tests that the solar boiler dryer is more efficient than the radiative-convective dryer for producing animal feed in terms of both quality and quantity. The nutritious values of the end products from the dryers were found to be within the international recommended values used for feeding chickens. (author)

  15. Manual on laboratory testing for uranium ore processing

    International Nuclear Information System (INIS)

    1990-01-01

    Laboratory testing of uranium ores is an essential step in the economic evaluation of uranium occurrences and in the development of a project for the production of uranium concentrates. Although these tests represent only a small proportion of the total cost of a project, their proper planning, execution and interpretation are of crucial importance. The main purposes of this manual are to discuss the objectives of metallurgical laboratory ore testing, to show the specific role of these tests in the development of a project, and to provide practical instructions for performing the tests and for interpreting their results. Guidelines on the design of a metallurgical laboratory, on the equipment required to perform the tests and on laboratory safety are also given. This manual is part of a series of Technical Reports on uranium ore processing being prepared by the IAEA's Division of Nuclear Fuel Cycle and Waste Management. A report on the Significance of Mineralogy in the Development of Flowsheets for Processing Uranium Ores (Technical Reports Series No. 196, 1980) and an instruction manual on Methods for the Estimation of Uranium Ore Reserves (No. 255, 1985) have already been published. 17 refs, 40 figs, 17 tabs

  16. Numeric signal analysis process, with particular application to eddy current testing

    International Nuclear Information System (INIS)

    Combes, J.Y.; Ledinghen, E. de; Lionti, F.

    1996-01-01

    Eddy current testing conventionally uses an analog demodulation process, followed by analog or digital phase-shift measurement. These techniques are efficient, but not always versatile enough to adapt to different configurations, in particular when a change of operating frequency is required. The present method performs entirely digital conditioning. Excitation is performed simultaneously at N different frequencies (typically N = 4). By sampling at a much higher frequency, 2N equations are obtained, allowing the linear equation system to be solved. (D.L.)
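
The scheme described — simultaneous excitation at N frequencies, sampling well above them, and solving a 2N-unknown linear system — can be sketched as least-squares recovery of the in-phase/quadrature components at each frequency. All frequencies, amplitudes, and noise levels below are illustrative assumptions, not values from the paper:

```python
import numpy as np

# Simultaneous excitation at N = 4 frequencies; solve for the 2N unknown
# in-phase (I) and quadrature (Q) amplitudes by linear least squares.
fs = 100_000.0                                 # sampling frequency (Hz), assumed
freqs = [1000.0, 2000.0, 4000.0, 8000.0]       # excitation frequencies, assumed
t = np.arange(1024) / fs

# Synthesize a "measured" signal with known I/Q components plus noise.
true_iq = {1000.0: (1.0, 0.5), 2000.0: (0.3, -0.2),
           4000.0: (-0.7, 0.1), 8000.0: (0.2, 0.9)}
sig = sum(i * np.cos(2 * np.pi * f * t) + q * np.sin(2 * np.pi * f * t)
          for f, (i, q) in true_iq.items())
rng = np.random.default_rng(0)
sig = sig + 0.01 * rng.standard_normal(t.size)

# Build the 2N-column design matrix (cos and sin per frequency) and solve.
cols = []
for f in freqs:
    cols.append(np.cos(2 * np.pi * f * t))
    cols.append(np.sin(2 * np.pi * f * t))
A = np.column_stack(cols)
x, *_ = np.linalg.lstsq(A, sig, rcond=None)

recovered = {f: (x[2 * k], x[2 * k + 1]) for k, f in enumerate(freqs)}
print(all(abs(recovered[f][0] - i) < 0.01 and abs(recovered[f][1] - q) < 0.01
          for f, (i, q) in true_iq.items()))
```

With many samples per excitation period, the system is heavily overdetermined and the I/Q components (and hence amplitude and phase shift) are recovered to well within the noise level.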

  17. Coffee beans as a natural test food for the evaluation of the masticatory efficiency.

    Science.gov (United States)

    Schneider, G; Senger, B

    2001-04-01

    Many test foods have been used during this century to evaluate the masticatory ability of human subjects. Nevertheless, none has been universally adopted. While the test food itself is important, attention should also be paid to its behaviour during the chewing test procedure. Therefore, we analysed coffee beans step by step through the chewing test procedure and dry sieving. The development of a compression test and a computer simulation showed that groups of 11 coffee beans give satisfactory results and deserve to be used in mastication studies.

  18. Comparison of different testing methods for gas fired domestic boiler efficiency determination

    International Nuclear Information System (INIS)

    De Paepe, M.; T'Joen, C.; Huisseune, H.; Van Belleghem, M.; Kessen, V.

    2013-01-01

    As the Energy Performance of Buildings Directive is being implemented throughout the European Union, a clear need for certification of boilers and domestic heating devices has arisen. Several ‘Notified Bodies’ exist, spread across the different member states; each acts as the notified body of its member state and focuses on local certification. A boiler manufacturer has its equipment tested according to the ‘Boiler Efficiency directive 92/42/EC’. Recently, tests done in sequence by several notified bodies on an identical unit from one manufacturer showed that results could differ depending on which notified body performed the test. In cooperation with ‘Technigas’ (the Notified Body in Belgium), a detailed study was made of the measurement setup and devices for determining boiler efficiencies. Several aspects were studied: the measurement devices (absolute or differential types), their location within the test setup (focussing on accuracy and their overall impact on the result), and the measurement strategy (measuring on the primary or the secondary water side). The study was performed for both full-load and part-load scenarios of a gas-fired domestic boiler (smaller than 70 kW [4]). The results clearly indicate that temperature measurements are critical for assessing boiler efficiency. Secondly, the test setup using secondary-circuit measurements should be preferred. Tests were performed at ‘Technigas’ on different setups in order to validate the findings. - Highlights: ► Labelling of boilers is now required by European standards. ► Error propagation is analysed for different methods of boiler performance testing. ► Secondary water-side measurement with separate calibration has the highest quality. ► A sensitivity analysis showed that the water temperatures are important factors.
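
Why temperature accuracy dominates can be seen from first-order error propagation through a water-side efficiency measurement, η = ṁ·cp·ΔT/Q. The sketch below uses illustrative numbers, not values from the directive or the study:

```python
import math

# First-order uncertainty propagation for a water-side efficiency measurement:
# eta = m_dot * cp * dT / Q_gas.  All numbers are illustrative assumptions:
# 0.2 kg/s flow, 20 K temperature rise, 18 kW gas input.
m_dot, cp, dT, Q = 0.2, 4186.0, 20.0, 18000.0   # SI units
eta = m_dot * cp * dT / Q

# With independent absolute sensors of +/-0.2 K on each water temperature,
# the uncertainty on dT is sqrt(2)*0.2 K, and the relative error on eta
# scales directly as u_dT / dT.
u_T = 0.2                        # per-sensor uncertainty, K (assumed)
u_dT = math.sqrt(2) * u_T        # uncertainty on the temperature difference
u_eta = eta * u_dT / dT          # propagated efficiency uncertainty
print(round(eta, 3), round(u_eta, 4))
```

Even modest 0.2 K sensor errors translate into an efficiency uncertainty of over a percentage point at a 20 K rise, which is why differential temperature measurement and careful calibration matter so much here.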

  19. In-situ testing of high efficiency filters at AEE Winfrith

    International Nuclear Information System (INIS)

    Fraser, D.C.

    1977-10-01

    This paper discusses experience in the testing of high efficiency filters in a variety of reactor and plant installations at AEE Winfrith. There is rarely any concern about the effectiveness of the filter as supplied by any reputable manufacturer. Experience has shown there is a need to check for defects in the installation of filters which could lead to by-passing of aerosols, and it is desirable to perform periodic re-tests to ensure that no subsequent deterioration occurs. It is important to use simple, portable apparatus for such tests; methods based on the use of sodium chloride aerosols, although suitable for the testing of filters prior to installation, involve apparatus which is too bulky for in-situ testing. At Winfrith a double automatic Pollak counter has been developed and used routinely since 1970. The aerosol involved has a particle size far smaller than the size most likely to penetrate intact filters, but this is irrelevant when one is primarily interested in particles which by-pass the filter. Comparisons with other methods of testing filters are described. There is remarkably good agreement between the efficiency of the filter installation as measured by a Pollak counter and by techniques involving aerosols of sodium chloride and dioctyl phthalate (DOP), presumably because the leakage around the filter is independent of particle size. (author)

  20. A laboratory dispersant effectiveness test which reflects dispersant efficiency in the field

    International Nuclear Information System (INIS)

    Lunel, T.; Wood, P.

    1996-01-01

    Oil dispersion efficiencies of surfactants from laboratory dispersion tests and field data were compared and calibrated. Data from an oil spill where dispersants were used as a major part of the response were analysed. The data were accumulated through the monitoring of the dispersant operation during the Sea Empress spill incident, in which Forties Blend oil was spilled at sea. This detailed data set was used to calibrate existing laboratory dispersant tests, and to devise a new International Dispersant Effectiveness Test. The objective was to create a comprehensive guide to decision making on whether and when to start a dispersant spraying operation. The dispersion efficiencies obtained from the laboratory dispersant tests were compared with field data. Flume tests produced the highest percentage of dispersed oil of all the dispersant tests. However, it was emphasised that the total percentage of oil dispersed should not be the only measure of dispersant effectiveness, since it does not distinguish between the contribution of natural and chemically enhanced dispersion. 9 refs., 1 tab., 9 figs

  1. Comparison of Soybean Transformation Efficiency and Plant Factors Affecting Transformation during the Agrobacterium Infection Process.

    Science.gov (United States)

    Jia, Yuying; Yao, Xingdong; Zhao, Mingzhe; Zhao, Qiang; Du, Yanli; Yu, Cuimei; Xie, Futi

    2015-08-07

    The susceptibility of soybean genotype to Agrobacterium infection is a key factor for the high level of genetic transformation efficiency. The objective of this study is to evaluate the plant factors related to transformation in cotyledonary nodes during the Agrobacterium infection process. This study selected three genotypes (Williams 82, Shennong 9 and Bert) with high transformation efficiency, which presented better susceptibility to Agrobacterium infection, and three low transformation efficiency genotypes (General, Liaodou 16 and Kottman), which showed a relatively weak susceptibility. Gibberellin (GA) levels and soybean GA20ox2 and CYP707A2 transcripts of high-efficiency genotypes increased and were higher than those of low-efficiency genotypes; however, the opposite performance was shown in abscisic acid (ABA). Higher zeatin riboside (ZR) content and DNA quantity, and relatively higher expression of soybean IPT5, CYCD3 and CYCA3 were obtained in high-efficiency genotypes. High-efficiency genotypes had low methyl jasmonate (MeJA) content, polyphenol oxidase (PPO) and peroxidase (POD) activity, and relatively lower expression of soybean OPR3, PPO1 and PRX71. GA and ZR were positive plant factors for Agrobacterium-mediated soybean transformation by facilitating germination and growth, and increasing the number of cells in DNA synthesis cycle, respectively; MeJA, PPO, POD and ABA were negative plant factors by inducing defence reactions and repressing germination and growth, respectively.

  2. Comparison of Soybean Transformation Efficiency and Plant Factors Affecting Transformation during the Agrobacterium Infection Process

    Directory of Open Access Journals (Sweden)

    Yuying Jia

    2015-08-01

    Full Text Available The susceptibility of soybean genotype to Agrobacterium infection is a key factor for the high level of genetic transformation efficiency. The objective of this study is to evaluate the plant factors related to transformation in cotyledonary nodes during the Agrobacterium infection process. This study selected three genotypes (Williams 82, Shennong 9 and Bert) with high transformation efficiency, which presented better susceptibility to Agrobacterium infection, and three low transformation efficiency genotypes (General, Liaodou 16 and Kottman), which showed a relatively weak susceptibility. Gibberellin (GA) levels and soybean GA20ox2 and CYP707A2 transcripts of high-efficiency genotypes increased and were higher than those of low-efficiency genotypes; however, the opposite performance was shown in abscisic acid (ABA). Higher zeatin riboside (ZR) content and DNA quantity, and relatively higher expression of soybean IPT5, CYCD3 and CYCA3 were obtained in high-efficiency genotypes. High-efficiency genotypes had low methyl jasmonate (MeJA) content, polyphenol oxidase (PPO) and peroxidase (POD) activity, and relatively lower expression of soybean OPR3, PPO1 and PRX71. GA and ZR were positive plant factors for Agrobacterium-mediated soybean transformation by facilitating germination and growth, and increasing the number of cells in the DNA synthesis cycle, respectively; MeJA, PPO, POD and ABA were negative plant factors by inducing defence reactions and repressing germination and growth, respectively.

  3. Analyzing the Efficient Execution of In-Store Logistics Processes in Grocery Retailing

    DEFF Research Database (Denmark)

    Reiner, Gerald; Teller, Christop; Kotzab, Herbert

    2013-01-01

    In this article, we examine in-store logistics processes for handling dairy products, from the incoming dock to the shelves of supermarkets and hypermarkets. The efficient execution of the in-store logistics related to such fast-moving, sensitive, and essential items is challenging and crucial for grocery retailers' sales, profits, and image. In our empirical study, we survey in-store logistics processes in 202 grocery supermarkets and hypermarkets belonging to a major retail chain in central Europe. Using a data envelopment analysis (DEA) and simulation, we facilitate process benchmarking. In particular, we identify ways of improving in-store logistics processes by showing the performance impacts of different managerial strategies and tactics. The DEA results indicate different efficiency levels for different store formats; the hybrid store format of the small hypermarket exhibits a comparatively...

  4. Systematic, efficient and consistent LCA calculations for chemical and biochemical processes

    DEFF Research Database (Denmark)

    Petchkaewkul, Kaesinee; Malakul, Pomthong; Gani, Rafiqul

    2016-01-01

    Life Cycle Assessment (LCA) is a technique which is applied for the study and evaluation of quantitative environmental impacts through the entire life cycle of products, processes or services, in order to improve and/or evaluate the design of existing as well as new processes. The LCA factors can ... that allow a wider coverage of chemical and biochemical processes. Improvements of LCIA calculations and eco-efficiency evaluation are introduced. Also, a new model for photochemical ozone formation has been developed and implemented. Performance of LCSoft in terms of accuracy and reliability is compared with another well-known LCA software, SimaPro, for a biochemical process – the production of bioethanol from cassava rhizome. The results show a very good match of the newly added impact categories. Also, the results from a new feature in LCSoft, which is eco-efficiency evaluation, are presented.

  5. Energy efficiency solutions for driers used in the glass manufacturing and processing industry

    Directory of Open Access Journals (Sweden)

    Pătrașcu Roxana

    2017-07-01

    Full Text Available Energy conservation is relevant to increasing efficiency in energy projects, by saving energy, by its rational use or by switching to other forms of energy. The goal is to secure energy supply on the short and long term, while increasing efficiency. These goals are pursued by evaluating the companies' energy status, by monitoring and adjusting energy consumption and by organising a coherent energy management. The manufacturing process is described, starting from the state and properties of the raw material and ending with the glass drying technological processes involved. Raw materials are selected considering technological and economic criteria. Manufacturing is treated as a two-stage process, consisting of the logistic, preparation aspect of unloading, transporting and storing materials, and the manufacturing process itself, by which the glass is sifted, shredded, deferrized and dried. The interest in analyzing the latter is justified by the fact that it has a big impact on the final energy consumption values; hence, in order to improve the general performance, the driers' energy losses are to be reduced. Technological, energy and management solutions are stated to meet this problem. In the present paper, the emphasis is on the energy perspective of enhancing the overall efficiency. The case study stresses the effects of heat recovery on the efficiency of a glass drier. Audits are conducted, both before and after its implementation, to observe point by point the balance between the entering and exiting heat in the drying process. The reduction in fuel consumption and the increase in thermal performance and fuel usage performance reveal the importance of using all available exiting heat from processes. Technical faults, either in exploitation or in management, lead to additional expenses. Improving them is in congruence with the energy conservation concept and in accordance with the Energy Efficiency Improvement Program for industrial facilities.

  6. Process configuration of Liquid-nitrogen Energy Storage System (LESS) for maximum turnaround efficiency

    Science.gov (United States)

    Dutta, Rohan; Ghosh, Parthasarathi; Chowdhury, Kanchan

    2017-12-01

    The diverse power generation sector requires energy storage due to the penetration of variable renewable energy sources and the use of CO2 capture plants with fossil fuel based power plants. Cryogenic energy storage, being a large-scale, decoupled system capable of producing large power in the range of MWs, is one of the options. The drawback of these systems is low turnaround efficiency, because the liquefaction processes are highly energy intensive. In this paper, the scopes for improving the turnaround efficiency of such a plant based on liquid nitrogen were identified and some of them were addressed. A method using multiple stages of reheat and expansion was proposed, improving the turnaround efficiency from 22% to 47% using four such stages in the cycle. The novelty here is the application of reheating in a cryogenic system and the utilization of waste heat for that purpose. Based on the study, process conditions for a laboratory-scale setup were determined and are presented here.
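
The gain from multiple reheat-and-expansion stages can be sketched with ideal-gas relations. The calculation below is a hypothetical illustration (nitrogen treated as an ideal gas, isentropic stages, reheat back to the inlet temperature before every stage, equal pressure-ratio split per stage; none of the figures come from the paper):

```python
def expansion_work_per_kg(t_in, p_ratio_total, stages, cp=1040.0, gamma=1.4):
    """Total specific work (J/kg) of `stages` isentropic expansions with
    reheat back to t_in before each stage; the overall pressure ratio is
    split equally across the stages. Ideal-gas relations throughout."""
    pr_stage = p_ratio_total ** (1.0 / stages)
    # Fractional temperature drop of one isentropic expansion stage
    drop = 1.0 - pr_stage ** (-(gamma - 1.0) / gamma)
    return stages * cp * t_in * drop

# Same overall pressure ratio, more reheat stages -> more work per kg
single = expansion_work_per_kg(300.0, 100.0, stages=1)
four = expansion_work_per_kg(300.0, 100.0, stages=4)
print(four > single)  # True
```

For a fixed overall pressure ratio, each added reheat stage raises the total specific work, which is the qualitative effect the multi-stage reheat scheme exploits.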

  7. A Moving-Object Index for Efficient Query Processing with PeerWise Location Privacy

    DEFF Research Database (Denmark)

    Lin, Dan; Jensen, Christian S.; Zhang, Rui

    2011-01-01

    attention has been paid to enabling so-called peer-wise privacy—the protection of a user's location from unauthorized peer users. This paper identifies an important efficiency problem in existing peer-privacy approaches that simply apply a filtering step to identify users that are located in a query range, but that do not want to disclose their location to the querying peer. To solve this problem, we propose a novel, privacy-policy enabled index called the PEB-tree that seamlessly integrates location proximity and policy compatibility. We propose efficient algorithms that use the PEB-tree for processing privacy-aware range and kNN queries. Extensive experiments suggest that the PEB-tree enables efficient query processing.

  8. Increasing a large petrochemical company efficiency by improvement of decision making process

    OpenAIRE

    Kirin Snežana D.; Nešić Lela G.

    2010-01-01

    The paper shows the results of a research conducted in a large petrochemical company, in a state under transition, with the aim to "shed light" on the decision making process from the aspect of personal characteristics of the employees, in order to use the results to improve decision making process and increase company efficiency. The research was conducted by a survey, i.e. by filling out a questionnaire specially made for this purpose, in real conditions, during working hours. The sample of...

  9. Ozone using outlook for efficiency increasing of transportation and processing of high viscous petroleum raw materials

    International Nuclear Information System (INIS)

    Nadirov, N.K.; Zajkina, R.F.; Mamonova, T.B.

    1997-01-01

    The main types of oxidation reactions occurring during petroleum feedstock ozonization are reviewed. Using the example of mild ozone treatment of high-paraffin-content petroleum at processing sites, it is shown that ozonization can raise pipeline transport efficiency and increase the light-fraction content of petroleums. The prospects of applying ozone formed as a by-product of the operation of radiation-chemical facilities to petroleum feedstock processing are discussed. (author)

  10. Gas dynamic design of the pipe line compressor with 90% efficiency. Model test approval

    Science.gov (United States)

    Galerkin, Y.; Rekstin, A.; Soldatova, K.

    2015-08-01

    Gas dynamic design of the 32 MW pipeline compressor was made for PAO SMPO (Sumy, Ukraine). The technical specification requires a compressor efficiency of 90%. The customer offered a favorable scheme - a single-stage design with an overhung (console) impeller and axial inlet. The authors used the standard optimization methodology of 2D impellers. The original methodology of internal scroll profiling was used to minimize efficiency losses. The radically improved 5th version of the Universal modeling method computer programs was used for precise calculation of the expected performances. The customer fulfilled model tests at a 1:2 scale. Tests confirmed the calculated parameters at the design point (maximum efficiency of 90%) and in the whole range of flow rates. As far as the authors know, no other compressors have achieved such efficiency. The principles and methods of gas-dynamic design are presented below. The data of the 32 MW compressor were presented by the customer in their report at the 16th International Compressor Conference (September 2014, Saint-Petersburg) and later transferred to the authors.

  11. Anaerobic digestion of post-hydrothermal liquefaction wastewater for improved energy efficiency of hydrothermal bioenergy processes.

    Science.gov (United States)

    Zhou, Yan; Schideman, Lance; Zheng, Mingxia; Martin-Ryals, Ana; Li, Peng; Tommaso, Giovana; Zhang, Yuanhui

    2015-01-01

    Hydrothermal liquefaction (HTL) is a promising process for converting wet biomass and organic wastes into bio-crude oil. It also produces an aqueous product referred to as post-hydrothermal liquefaction wastewater (PHWW) containing up to 40% of the original feedstock carbon, which reduces the overall energy efficiency of the HTL process. This study investigated the feasibility of using anaerobic digestion (AD) to treat PHWW, with the aid of activated carbon. Results showed that successful AD occurred at relatively low concentrations of PHWW (≤ 6.7%), producing a biogas yield of 0.5 ml/mg COD removed, and ∼53% energy recovery efficiency. Higher concentrations of PHWW (≥13.3%) had an inhibitory effect on the AD process, as indicated by delayed, slower, or no biogas production. Activated carbon was shown to effectively mitigate this inhibitory effect by enhancing biogas production and allowing digestion to proceed at higher PHWW concentrations (up to 33.3%), likely due to sequestering toxic organic compounds. The addition of activated carbon also increased the net energy recovery efficiency of AD with a relatively high concentration of PHWW (33.3%), taking into account the energy for producing activated carbon. These results suggest that AD is a feasible approach to treat PHWW, and to improve the energy efficiency of the HTL processes.

  12. Word Recognition Processing Efficiency as a Component of Second Language Listening

    Science.gov (United States)

    Joyce, Paul

    2013-01-01

    This study investigated the application of the speeded lexical decision task to L2 aural processing efficiency. One-hundred and twenty Japanese university students completed an aural word/nonword task. When the variation of lexical decision time (CV) was correlated with reaction time (RT), the results suggested that the single-word recognition…

  13. Future energy-efficient and low-emissions glass melting processes

    NARCIS (Netherlands)

    Beerkens, R.G.C.; Limpt, J.A.C. van; Lankhorst, A.M.; Santen, P.J. van

    2012-01-01

    All over the world, there is an increasing drive to develop new technologies or concepts for industrial glass melting furnaces, with the main aim to increase the energy efficiency, stabilize production and reduce emissions. The application of new process sensors, improved furnace design, intelligent

  14. An Efficient Experimental Design Strategy for Modelling and Characterization of Processes

    DEFF Research Database (Denmark)

    Tajsoleiman, Tannaz; Semenova, Daria; Oliveira Fernandes, Ana Carolina

    2017-01-01

    Designing robust, efficient and economic processes is a main challenge for the biotech industries. To achieve a well-designed bioprocess, understanding the ongoing phenomena and the involved reaction kinetics is crucial. By development of advanced miniaturized reactors, a promising opportunity ar...

  15. Investigation of Processing, Microstructures and Efficiencies of Polycrystalline CdTe Photovoltaic Films and Devices

    Science.gov (United States)

    Munshi, Amit Harenkumar

    CdTe based photovoltaics have been commercialized at multiple GWs/year level. The performance of CdTe thin film photovoltaic devices is sensitive to process conditions. Variations in deposition temperatures as well as other treatment parameters have a significant impact on film microstructure and device performance. In this work, extensive investigations are carried out using advanced microstructural characterization techniques in an attempt to relate microstructural changes due to varying deposition parameters and their effects on device performance for cadmium telluride based photovoltaic cells deposited using close space sublimation (CSS). The goal of this investigation is to apply advanced material characterization techniques to aid process development for higher efficiency CdTe based photovoltaic devices. Several techniques have been used to observe the morphological changes to the microstructure along with materials and crystallographic changes as a function of deposition temperature and treatment times. Traditional device structures as well as advanced structures with electron reflector and films deposited on Mg1-xZnxO instead of conventional CdS window layer are investigated. These techniques include Scanning Electron Microscopy (SEM) with Electron Back Scattered Diffraction (EBSD) and Energy dispersive X-ray spectroscopy (EDS) to study grain structure and High Resolution Transmission Electron Microscopy (TEM) with electron diffraction and EDS. These investigations have provided insights into the mechanisms that lead to change in film structure and device performance with change in deposition conditions. Energy dispersive X-ray spectroscopy (EDS) is used for chemical mapping of the films as well as to understand interlayer material diffusion between subsequent layers. Electrical performance of these devices has been studied using current density vs voltage plots. Devices with efficiency over 18% have been fabricated on low cost commercial glass substrates.

  16. Efficient statistical tests to compare Youden index: accounting for contingency correlation.

    Science.gov (United States)

    Chen, Fangyao; Xue, Yuqiang; Tan, Ming T; Chen, Pingyan

    2015-04-30

    Youden index is widely utilized in studies evaluating accuracy of diagnostic tests and performance of predictive, prognostic, or risk models. However, both one and two independent sample tests on Youden index have been derived ignoring the dependence (association) between sensitivity and specificity, resulting in potentially misleading findings. Besides, paired sample test on Youden index is currently unavailable. This article develops efficient statistical inference procedures for one sample, independent, and paired sample tests on Youden index by accounting for contingency correlation, namely associations between sensitivity and specificity and paired samples typically represented in contingency tables. For one and two independent sample tests, the variances are estimated by Delta method, and the statistical inference is based on the central limit theory, which are then verified by bootstrap estimates. For paired samples test, we show that the estimated covariance of the two sensitivities and specificities can be represented as a function of kappa statistic so the test can be readily carried out. We then show the remarkable accuracy of the estimated variance using a constrained optimization approach. Simulation is performed to evaluate the statistical properties of the derived tests. The proposed approaches yield more stable type I errors at the nominal level and substantially higher power (efficiency) than does the original Youden's approach. Therefore, the simple explicit large sample solution performs very well. Because we can readily implement the asymptotic and exact bootstrap computation with common software like R, the method is broadly applicable to the evaluation of diagnostic tests and model performance. Copyright © 2015 John Wiley & Sons, Ltd.
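
For reference, the Youden index and its classical one-sample Wald test are easy to state in code. The sketch below deliberately uses the textbook independent-binomials variance — the very simplification this paper improves on by accounting for contingency correlation — with made-up counts:

```python
import math

def youden_test(tp, fn, tn, fp, j0=0.0):
    """One-sample Wald test of H0: J = j0 for the Youden index
    J = sensitivity + specificity - 1, using the classical
    independent-binomials variance (the simplification the paper
    refines by modelling the contingency correlation)."""
    se = tp / (tp + fn)                 # sensitivity from diseased subjects
    sp = tn / (tn + fp)                 # specificity from healthy subjects
    j = se + sp - 1.0
    var_j = se * (1 - se) / (tp + fn) + sp * (1 - sp) / (tn + fp)
    z = (j - j0) / math.sqrt(var_j)     # compare against N(0, 1) quantiles
    return j, z

j, z = youden_test(tp=90, fn=10, tn=80, fp=20)
print(round(j, 2), round(z, 1))  # 0.7 14.0
```

The paired-sample case the paper treats additionally requires the covariance between the two tests' sensitivities and specificities (expressed via the kappa statistic), which this simple sketch omits.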

  17. An Efficient Secret Key Homomorphic Encryption Used in Image Processing Service

    Directory of Open Access Journals (Sweden)

    Pan Yang

    2017-01-01

    Full Text Available Homomorphic encryption can protect user’s privacy when operating on user’s data in cloud computing. But it is not practical for wide use as the data and service types in cloud computing are diverse. Among these data types, digital image is an important personal data for users. There are also many image processing services in cloud computing. To protect user’s privacy in these services, this paper proposed a scheme using homomorphic encryption in image processing. Firstly, a secret key homomorphic encryption (IGHE) was constructed for encrypting images. IGHE can operate on encrypted floating numbers efficiently to adapt to the image processing service. Then, by translating the traditional image processing methods into the operations on encrypted pixels, the encrypted image can be processed homomorphically. That is, the service can process the encrypted image directly, and the result after decryption is the same as processing the plain image. To illustrate our scheme, three common image processing instances were given in this paper. The experiments show that our scheme is secure, correct, and efficient enough to be used in practical image processing applications.
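
The homomorphic property the service relies on can be shown with a deliberately simplified toy: a one-time-pad-style secret-key cipher on pixel values mod 256. This is not the paper's IGHE construction (which operates on encrypted floating-point numbers); it only demonstrates that a server can brighten an image without ever seeing the plaintext pixels:

```python
import random

def keygen(n, seed=42):
    """Per-pixel secret key stream (toy; a real scheme needs a CSPRNG)."""
    rng = random.Random(seed)
    return [rng.randrange(256) for _ in range(n)]

def encrypt(pixels, key):
    return [(p + k) % 256 for p, k in zip(pixels, key)]

def decrypt(cipher, key):
    return [(c - k) % 256 for c, k in zip(cipher, key)]

def brighten_encrypted(cipher, delta):
    # Additive homomorphism: adding delta to each ciphertext value adds
    # delta to the underlying pixel (mod 256) after decryption.
    return [(c + delta) % 256 for c in cipher]

pixels = [10, 120, 250]
key = keygen(len(pixels))
out = decrypt(brighten_encrypted(encrypt(pixels, key), 20), key)
print(out)  # [30, 140, 14]
```

Note the wrap-around at 255, a saturation issue any real encrypted-image pipeline has to handle explicitly.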

  18. [Efficient and rapid non-test tube cloning of Jatropha curcas].

    Science.gov (United States)

    Wang, Zhao-Yu; Lin, Jing-Ming; Xu, Zeng-Fu

    2007-08-01

    To develop a new technique for efficient and rapid non-test-tube cloning of the medicinal and energy-producing plant Jatropha curcas, mini-stem fragments (2-3 cm) of Jatropha curcas with merely one axillary bud were used as the explants, and the effect of the auxin IBA concentration on plantlet regeneration was studied. When treated with 1 mg/L IBA for 1 h, the explants showed the most rapid propagation. The mini-stem fragments showed a high root regeneration ratio (96.7%), a short root regeneration period (18.2±2.0 d), a large number of new roots per explant (6.3±1.8), and a long total root length (6.8±3.5 cm), demonstrating that this technique can be a simple and efficient method for rapid non-test-tube cloning of Jatropha curcas of potential industrial value.

  19. Experimental Study of Tensile Test in Resistance Spot Welding Process

    Directory of Open Access Journals (Sweden)

    Lebbal Habib

    Full Text Available Abstract Resistance spot welding (RSW) is a widely used joining process for fabricating sheet metal assemblies in the automobile industry. In comparison with other welding processes, RSW is faster and easier to automate. This process involves electrical, thermal and mechanical interactions. Resistance spot welding primarily takes place by a localized melting spot at the interface of the sheets, followed by its quick solidification under sequential control of the pressure of the water-cooled electrode and the flow of the required electric current for a certain duration. In this work tensile tests were studied; the results obtained show that the type of material, the overlap length, the angle to the rolling direction and the thickness of the sheet have an influence on the resistance spot welding process.

  20. TESTING TECHNICAL AND SCALE EFFICIENCY OF KAZAKH BANKS: EVIDENCE BASED ON DATA ENVELOPMENT ANALYSIS

    Directory of Open Access Journals (Sweden)

    Razzaque H Bhatti

    2013-01-01

    Full Text Available This paper tests the technical and scale efficiency of 20 Kazakh banks using annual data on three inputs (interest expenses, non-interest expenses and deposits) and three outputs (interest income, non-interest income and loans) over the period 2007-2011. Two input-oriented data envelopment analysis models of Charnes et al (1978) and Banker et al (1984), which are based on constant returns to scale and variable returns to scale respectively, are used to evaluate technical efficiency, whereas scale efficiency is computed by dividing the former efficiency ratio by the latter one. The results obtained show that the average efficiency ratios of individual banks under constant and variable returns to scale range from 0.88 to 1.00 and from 0.93 to 1.00 respectively, whereas those of all banks lie between 0.95 and 0.98 respectively. Only the five banks (ATFB, Citibank, HSBC bank, KazInvest bank and Exim bank) are the most efficient banks in Kazakhstan, since their efficiency ratios have been consistently equal to unity, implying that these banks operate at their optimal levels. The efficiency scores of the remaining 15 banks range from 0.88 to 0.99, and as such the majority of these banks do not seem to operate far below their optimal level. The results indicate that the performance of the Kazakh banks deteriorated substantially during the global financial crisis of 2008, because the CRS ratio dropped from 0.65 in 2007 to 0.50 in 2008 and to 0.40 in 2009. The results also confirm that most of the foreign banks perform relatively better than domestic banks.
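
The scale-efficiency computation described in the abstract — the constant-returns (CCR) technical efficiency divided by the variable-returns (BCC) technical efficiency — is a one-line operation once both DEA scores are available. A minimal sketch with hypothetical bank scores, not the paper's data:

```python
def scale_efficiency(crs_score, vrs_score):
    """Scale efficiency = CCR (constant-returns-to-scale) technical
    efficiency divided by BCC (variable-returns-to-scale) efficiency."""
    return crs_score / vrs_score

# Hypothetical scores for illustration only (not the paper's data)
banks = {"Bank A": (0.88, 0.93), "Bank B": (1.00, 1.00), "Bank C": (0.90, 1.00)}
for name, (crs, vrs) in banks.items():
    se = scale_efficiency(crs, vrs)
    # se == 1.0 means the bank operates at its most productive scale size
    print(name, round(se, 3))
```

A scale efficiency below one indicates that the unit's inefficiency is partly due to operating away from its optimal size rather than purely to managerial (technical) inefficiency.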

  1. Installation and Testing of a Jorin Visual Process Analyzer

    International Nuclear Information System (INIS)

    Christensen, Kristi M.

    2010-01-01

    The Jorin Visual Process Analyzer (ViPA) is an on-line instrument that uses video microscope imaging to detect and measure the physical characteristics of dispersed objects within a process stream or laboratory sample. Object analysis is performed by capturing an on-going sequence of single frames from the video feed and relaying the images in real time to a nearby control computer where the ViPA software then processes and transfigures the information from the images into meaningful process data. The ViPA captures and analyzes approximately 15 images per second and continuously records 17 material parameters including size, shape and optical density. The ViPA software uses the measured parameters to differentiate between different classes of objects including organic droplets, gas bubbles, and solid particles. Procurement of this instrument provides a unique capability to support predictive modeling and further understanding of mass transfer during solvent extraction processes. Organic droplet data collected using the ViPA can be used to develop dispersion profiles of the liquid-liquid mixing and disengagement sections for each type of process equipment. These profiles will provide insight into mixing dynamics and will guide the prevention of emulsion formation that leads to system losses. Additionally, the measurement capabilities of the ViPA will provide the input needed to create new two-phase Computational Fluid Dynamics (CFD) models that characterize both mixing and separation operations in the various types of equipment. These models can then be used to improve process efficiency by optimizing operation parameters for each proposed extraction cycle.

  2. Enhancement of the efficiency of the Open Cycle Phillips Optimized Cascade LNG process

    International Nuclear Information System (INIS)

    Fahmy, M.F.M.; Nabih, H.I.; El-Nigeily, M.

    2016-01-01

    Highlights: • Expanders replaced JT valves in the Phillips Optimized Cascade liquefaction process. • Improvement in plant liquefaction efficiency was evaluated in the presence of expanders. • Comparison of the different optimum cases for the liquefaction process was presented. - Abstract: This study aims to improve the performance of the Open Cycle Phillips Optimized Cascade Process for the production of liquefied natural gas (LNG) through the replacement of Joule–Thomson (JT) valves by expanders. The expander has a higher thermodynamic efficiency than the JT valve. Moreover, the produced shaft power from the expander is integrated into the process. The study is conducted using the Aspen HYSYS-V7 simulation software for simulation of the Open Cycle Phillips Optimized Cascade Process having the JT valves. Simulation of several proposed cases in which expanders are used instead of JT valves at different locations in the process, such as the propane cycle, ethylene cycle, methane cycle and the upstream of the heavies removal column, is conducted. The optimum cases clearly indicate that expanders not only produce power, but also offer significant improvements in the process performance as shown by the total plant power consumption, LNG production, thermal efficiency, plant specific power and CO_2 emissions reduction. Results also reveal that replacing JT valves by expanders in the methane cycle has a dominating influence on all performance criteria and hence can be considered the main key contributor affecting the Phillips Optimized Cascade Process, leading to a notable enhancement in its efficiency. This replacement of JT valves by liquid expanders at different locations of the methane cycle yields power savings in the range of 4.92–5.72%, plant thermal efficiency of 92.64–92.97% and an increase in LNG production of 5.77–7.04%. Moreover, applying liquid expanders at the determined optimum cases for the different cycles improves process performance and

  3. DEVELOPMENT OF AN ARMY STATIONARY AXLE TEST STAND FOR LUBRICANT EFFICIENCY EVALUATION-PART II

    Science.gov (United States)

    2017-01-13

    electric AC motor, and the two outputs of the axle are coupled using speed-increasing gear boxes and absorbed by an identically sized AC motor ... work using the stationary axle efficiency test stand was completed using hardware representative of light and medium duty tactical wheeled vehicles ... vehicle results. However, for the results to be applicable to real-world field use, the driving cycle being replicated should be representative of real

  4. Efficiency Analysis of German Electricity Distribution Utilities: Non-Parametric and Parametric Tests

    OpenAIRE

    von Hirschhausen, Christian R.; Cullmann, Astrid

    2005-01-01

    Abstract This paper applies non-parametric and parametric tests to assess the efficiency of electricity distribution companies in Germany. We address traditional issues in electricity sector benchmarking, such as the role of scale effects and optimal utility size, as well as new evidence specific to the situation in Germany. We use labour, capital, and peak load capacity as inputs, and units sold and the number of customers as output. The data cover 307 (out of 553) ...

  5. On the Bahadur-efficient testing of uniformity by means of entropy

    Czech Academy of Sciences Publication Activity Database

    Vajda, Igor; Harremoës, P.

    2008-01-01

    Roč. 54, 1/2 (2008), s. 321-331 ISSN 0018-9448 R&D Projects: GA MŠk(CZ) 1M0572; GA ČR(CZ) GA102/07/1131 Institutional research plan: CEZ:AV0Z10750506 Keywords : Bahadur efficiency * entropy * goodness-of-fit * power divergences * uniformity testing Subject RIV: BD - Theory of Information Impact factor: 3.793, year: 2008

  6. Eco-Efficient Process Improvement at the Early Development Stage: Identifying Environmental and Economic Process Hotspots for Synergetic Improvement Potential.

    Science.gov (United States)

    Piccinno, Fabiano; Hischier, Roland; Seeger, Stefan; Som, Claudia

    2018-05-15

    We present here a new eco-efficiency process-improvement method to highlight combined environmental and cost hotspots of the production process of a new material at a very early development stage. Production-specific and scaled-up results for life cycle assessment (LCA) and production costs are combined in a new analysis to identify synergetic improvement potentials and trade-offs, setting goals for the eco-design of new processes. The identified hotspots and bottlenecks will help users to focus on the relevant steps for improvements from an eco-efficiency perspective and potentially reduce their associated environmental impacts and production costs. Our method is illustrated with a case study of nanocellulose. The results indicate that the production route should start with carrot pomace, use heat and solvent recovery, and deactivate the enzymes with bleach instead of heat. To further improve the process, the results show that focus should be placed on the carrier polymer, sodium alginate, and the production of the GripX coating. Overall, the method shows that the underlying LCA scale-up framework is valuable for purposes beyond conventional LCA studies and is applicable at a very early stage to provide researchers with a better understanding of their production process.

  7. Industrial Compositional Streamline Simulation for Efficient and Accurate Prediction of Gas Injection and WAG Processes

    Energy Technology Data Exchange (ETDEWEB)

    Margot Gerritsen

    2008-10-31

    Gas-injection processes are widely and increasingly used for enhanced oil recovery (EOR). In the United States, for example, EOR production by gas injection accounts for approximately 45% of total EOR production and has tripled since 1986. The understanding of the multiphase, multicomponent flow taking place in any displacement process is essential for successful design of gas-injection projects. Due to complex reservoir geometry, reservoir fluid properties and phase behavior, the design of accurate and efficient numerical simulations for the multiphase, multicomponent flow governing these processes is nontrivial. In this work, we developed, implemented and tested a streamline-based solver for gas-injection processes that is computationally very attractive: compared to traditional Eulerian solvers in use by industry, it computes solutions orders of magnitude faster with comparable accuracy, provided that cross-flow effects do not dominate. We contributed to the development of compositional streamline solvers in three significant ways: improvement of the overall framework, allowing improved streamline coverage and partial streamline tracing, amongst others; parallelization of the streamline code, which significantly improves wall-clock time; and development of new compositional solvers that can be implemented along streamlines as well as in existing Eulerian codes used by industry. We introduced several novel ideas in the streamline framework. First, we developed an adaptive streamline coverage algorithm. Adding streamlines locally can reduce computational costs by concentrating computational efforts where needed, and reduce mapping errors. Adapting streamline coverage effectively controls mass balance errors that mostly result from the mapping from streamlines to the pressure grid. We also introduced the concept of partial streamlines: streamlines that do not necessarily start and/or end at wells. This allows more efficient coverage and avoids

  8. Test bank to accompany Computers data and processing

    CERN Document Server

    Deitel, Harvey M

    1980-01-01

    Test Bank to Accompany Computers and Data Processing provides a variety of questions from which instructors can easily custom tailor exams appropriate for their particular courses. This book contains over 4000 short-answer questions that span the full range of topics for an introductory computing course. This book is organized into five parts encompassing 19 chapters. This text provides a very large number of questions so that instructors can produce different exams testing essentially the same topics in succeeding semesters. Three types of questions are included in this book, including multiple ch

  9. A new method for flight test determination of propulsive efficiency and drag coefficient

    Science.gov (United States)

    Bull, G.; Bridges, P. D.

    1983-01-01

    A flight test method is described from which propulsive efficiency as well as parasite and induced drag coefficients can be directly determined using relatively simple instrumentation and analysis techniques. The method uses information contained in the transient response in airspeed for a small power change in level flight in addition to the usual measurement of power required for level flight. Measurements of pitch angle and longitudinal and normal acceleration are eliminated. The theoretical basis for the method, the analytical techniques used, and the results of application of the method to flight test data are presented.

  10. Development of an Optimal Controller and Validation Test Stand for Fuel Efficient Engine Operation

    Science.gov (United States)

    Rehn, Jack G., III

    There are numerous motivations for improvements in automotive fuel efficiency. As concerns over the environment grow at a rate unmatched by hybrid and electric automotive technologies, the need for reductions in fuel consumed by current road vehicles has never been more present. Studies have shown that a major cause of poor fuel consumption in automobiles is improper driving behavior, which cannot be mitigated by purely technological means. The emergence of autonomous driving technologies has provided an opportunity to alleviate this inefficiency by removing the necessity of a driver. Before autonomous technology can be relied upon to reduce gasoline consumption on a large scale, robust programming strategies must be designed and tested. The goal of this thesis work was to design and deploy an autonomous control algorithm to navigate a four-cylinder gasoline combustion engine through a series of changing load profiles in a manner that prioritizes fuel efficiency. The experimental setup is analogous to a passenger vehicle driving over hilly terrain at highway speeds. The proposed approach accomplishes this using a model-predictive, real-time optimization algorithm that was calibrated to the engine. Performance of the optimal control algorithm was tested on the engine against contemporary cruise control. Results indicate that the "efficient" strategy achieved one to two percent reductions in total fuel consumed for all load profiles tested. The consumption data gathered also suggest that further improvements could be realized on a different subject engine using extended models and a slightly modified optimal control approach.

  11. Towards efficient next generation light sources: combined solution processed and evaporated layers for OLEDs

    Science.gov (United States)

    Hartmann, D.; Sarfert, W.; Meier, S.; Bolink, H.; García Santamaría, S.; Wecker, J.

    2010-05-01

    Typically, highly efficient OLED device structures are based on a multitude of stacked thin organic layers prepared by thermal evaporation. For lighting applications these efficient device stacks have to be scaled up to large areas, which is clearly challenging in terms of high-throughput processing at low cost. One promising approach to meeting cost efficiency, high throughput and high light output is the combination of solution and evaporation processing. Moreover, the objective is to substitute as many thermally evaporated layers as possible by solution processing without sacrificing device performance. Hence, starting from the anode side, evaporated layers of an efficient white-light-emitting OLED stack are stepwise replaced by solution-processable polymer and small-molecule layers. In doing so, different solution-processable hole injection layers (= polymer HILs) are integrated into small-molecule devices and evaluated with regard to their electro-optical performance as well as their planarizing properties, meaning the ability to cover ITO spikes, defects and dust particles. Two approaches are followed: in the "single HIL" approach only one polymer HIL is coated, while in the "combined HIL" concept the coated polymer HIL is combined with a thin evaporated HIL. These HIL architectures are studied in unipolar as well as bipolar devices. As a result, the combined HIL approach facilitates better control over the hole current, improved device stability and improved current and power efficiency compared to a single HIL as well as to pure small-molecule-based OLED stacks. Furthermore, emitting layers based on guest/host small molecules are fabricated from solution and integrated into a white hybrid stack (WHS). Up to three evaporated layers were successfully replaced by solution processing, showing white-light emission spectra comparable to an evaporated small-molecule reference stack and lifetime values of several hundred hours.

  12. Overview of planning process at FFTF [Fast Flux Test Facility

    International Nuclear Information System (INIS)

    Gadeken, A.D.

    1986-03-01

    The planning process at the Fast Flux Test Facility (FFTF) is controlled through a hierarchy of documents ranging from a ten-year strategic plan to a weekly schedule. Within the hierarchy are a Near-Term (three-year) Operating Plan, a Cycle (six-month) Plan, and an Outage/Operating Phase Schedule. Coordination of the planning process is accomplished by a dedicated preparation team that also provides an overview of the formal planning timetable, which identifies key action items required to be completed before an outage/operating phase can begin.

  13. Iodox process tests in a transuranium element production campaign

    International Nuclear Information System (INIS)

    Collins, E.D.; Benker, D.E.

    1978-01-01

    The Iodox process removes gaseous iodine from air by oxidation of organic iodides and by hydrolysis-oxidation of free iodine to the stable iodate form. An Iodox system for treatment of the 10⁻⁴ m³/s dissolver off-gas (DOG) stream was installed and is used for initial removal of radioiodine, thus allowing the Hopcalite-charcoal to serve as a backup system in TRU. During a recent TRU production campaign, three dissolver runs were made to test the Iodox process.

  14. Processing speed training increases the efficiency of attentional resource allocation in young adults

    Directory of Open Access Journals (Sweden)

    Wesley K Burge

    2013-10-01

    Full Text Available Cognitive training has been shown to improve performance on a range of tasks. However, the mechanisms underlying these improvements are still unclear. Given the wide range of transfer effects, it is likely that these effects are due to a factor common to a wide range of tasks. One such factor is a participant’s efficiency in allocating limited cognitive resources. The impact of a cognitive training program, Processing Speed Training (PST), on the allocation of resources to a set of visual tasks was measured using pupillometry in 10 young adults, as compared to a control group of 10 young adults (total n = 20). PST is a well-studied computerized training program that involves identifying simultaneously presented central and peripheral stimuli. As training progresses, the task becomes increasingly more difficult by including peripheral distracting stimuli and decreasing the duration of stimulus presentation. Analysis of baseline data confirmed that pupil diameter reflected cognitive effort. After training, participants randomized to PST used fewer attentional resources to perform complex visual tasks as compared to the control group. These pupil diameter data indicate that PST appears to increase the efficiency of attentional resource allocation. Increases in cognitive efficiency have been hypothesized to underlie improvements following experience with action video games, and improved cognitive efficiency has been hypothesized to underlie the benefits of processing speed training in older adults. These data reveal that these training schemes may share a common underlying mechanism of increasing cognitive efficiency in younger adults.

  15. Testing for coevolutionary diversification: linking pattern with process.

    Science.gov (United States)

    Althoff, David M; Segraves, Kari A; Johnson, Marc T J

    2014-02-01

    Coevolutionary diversification is cited as a major mechanism driving the evolution of diversity, particularly in plants and insects. However, tests of coevolutionary diversification have focused on elucidating macroevolutionary patterns rather than the processes giving rise to such patterns. Hence, there is weak evidence that coevolution promotes diversification. This is in part due to a lack of understanding about the mechanisms by which coevolution can cause speciation and the difficulty of integrating results across micro- and macroevolutionary scales. In this review, we highlight potential mechanisms of coevolutionary diversification, outline approaches to examine this process across temporal scales, and propose a set of minimal requirements for demonstrating coevolutionary diversification. Our aim is to stimulate research that tests more rigorously for coevolutionary diversification. Copyright © 2013 Elsevier Ltd. All rights reserved.

  16. Testing the causality of Hawkes processes with time reversal

    Science.gov (United States)

    Cordi, Marcus; Challet, Damien; Muni Toke, Ioane

    2018-03-01

    We show that univariate and symmetric multivariate Hawkes processes are only weakly causal: the true log-likelihoods of real and reversed event time vectors are almost equal, thus parameter estimation via maximum likelihood only weakly depends on the direction of the arrow of time. In ideal (synthetic) conditions, tests of goodness of parametric fit unambiguously reject backward event times, which implies that inferring kernels from time-symmetric quantities, such as the autocovariance of the event rate, only rarely produce statistically significant fits. Finally, we find that fitting financial data with many-parameter kernels may yield significant fits for both arrows of time for the same event time vector, sometimes favouring the backward time direction. This goes to show that a significant fit of Hawkes processes to real data with flexible kernels does not imply a definite arrow of time unless one tests it.
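The time-reversal check described in this record can be sketched numerically: simulate a univariate Hawkes process with an exponential kernel, then evaluate the same closed-form log-likelihood on the forward event times and on their time-reversed counterparts (t → T − t). The simulator, parameter values and function names below are illustrative assumptions for a minimal sketch, not the authors' code; a rigorous version would refit parameters by maximum likelihood for each arrow of time and compare goodness-of-fit.

```python
import numpy as np

def simulate_hawkes(mu, alpha, beta, T, seed=0):
    # Ogata thinning for lambda(t) = mu + alpha*beta*sum_i exp(-beta*(t - t_i)).
    # alpha < 1 is the branching ratio, so the process is stationary.
    rng = np.random.default_rng(seed)
    times, t = [], 0.0
    while True:
        past = np.array(times)
        # Intensity is non-increasing between events, so lambda(t) is a valid bound.
        lam_bar = mu + (alpha * beta * np.sum(np.exp(-beta * (t - past))) if times else 0.0)
        t += rng.exponential(1.0 / lam_bar)
        if t >= T:
            break
        lam_t = mu + (alpha * beta * np.sum(np.exp(-beta * (t - past))) if times else 0.0)
        if rng.uniform() <= lam_t / lam_bar:  # accept with prob lambda(t)/lambda_bar
            times.append(t)
    return np.array(times)

def hawkes_loglik(times, mu, alpha, beta, T):
    # Exact log-likelihood via the standard recursion for the exponential kernel:
    # A_i = exp(-beta*(t_i - t_{i-1})) * (1 + A_{i-1}), A_1 = 0.
    A, ll, prev = 0.0, 0.0, None
    for t in times:
        if prev is not None:
            A = np.exp(-beta * (t - prev)) * (1.0 + A)
        ll += np.log(mu + alpha * beta * A)
        prev = t
    # Compensator: integral of the intensity over [0, T].
    comp = mu * T + alpha * np.sum(1.0 - np.exp(-beta * (T - times)))
    return ll - comp

T = 500.0
fwd = simulate_hawkes(mu=1.0, alpha=0.5, beta=2.0, T=T, seed=42)
rev = np.sort(T - fwd)  # reversed arrow of time
ll_fwd = hawkes_loglik(fwd, 1.0, 0.5, 2.0, T)
ll_rev = hawkes_loglik(rev, 1.0, 0.5, 2.0, T)
```

The paper's "weak causality" observation corresponds to `ll_fwd` and `ll_rev` being nearly equal per event, even though the reversed series is not a realization of the model.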

  17. Progress in high-efficient solution process organic photovoltaic devices fundamentals, materials, devices and fabrication

    CERN Document Server

    Li, Gang

    2015-01-01

    This book presents an important technique to process organic photovoltaic devices. The basics, materials aspects and manufacturing of photovoltaic devices with solution processing are explained. Solution-processable organic solar cells - polymer or solution-processable small molecules - have the potential to significantly reduce the costs for solar electricity and energy payback time due to the low material costs for the cells, low-cost and fast fabrication processes (ambient, roll-to-roll), high material utilization etc. In addition, organic photovoltaics (OPV) also provides attractive properties like flexibility, colorful displays and transparency which could open new market opportunities. The material and device innovations led to an improved efficiency of 8% for organic photovoltaic solar cells, compared to 4% in 2005. Both academic and industry research have significant interest in the development of this technology. This book gives an overview of the booming technology, focusing on the solution process fo...

  18. An Efficient Method to Search Real-Time Bulk Data for an Information Processing System

    International Nuclear Information System (INIS)

    Kim, Seong Jin; Kim, Jong Myung; Suh, Yong Suk; Keum, Jong Yong; Park, Heui Youn

    2005-01-01

    The Man Machine Interface System (MMIS) of the System-integrated Modular Advanced ReacTor (SMART) is designed with fully digitalized features. The Information Processing System (IPS) of the MMIS acquires and processes plant data from other systems. In addition, the IPS provides plant operation information to operators in the control room. The IPS is required to process bulky data in real time, so it is necessary to consider a special processing method with regard to flexibility and performance, because more than a few thousand plant information points converge on the IPS. Among other things, the processing time for searching the bulk data consumes much more than the other processing times. Thus, this paper explores an efficient method for the search and examines its feasibility.
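The abstract does not spell out the search method itself, but a common way to make lookups over a few thousand acquired plant points constant-time is to index them by tag ID in a hash map rather than scanning a list on every query. A minimal illustrative sketch (the `PlantPoint` record and `PT-…` tag scheme are invented for the example, not taken from the IPS design):

```python
from dataclasses import dataclass

@dataclass
class PlantPoint:
    tag: str        # point identifier
    value: float    # latest acquired value
    quality: str    # data quality flag

# Simulated acquisition result: a large batch of points per scan cycle.
points = [PlantPoint(f"PT-{i:05d}", float(i), "GOOD") for i in range(100_000)]

# Build the index once per acquisition cycle; lookups are then O(1) on average,
# instead of an O(n) linear scan per query.
index = {p.tag: p for p in points}

def lookup(tag):
    return index.get(tag)  # returns None for unknown tags
```

For repeated queries between scans, the one-off cost of building the dictionary is amortized over every constant-time `lookup` call.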

  19. A preferential design approach for energy-efficient and robust implantable neural signal processing hardware.

    Science.gov (United States)

    Narasimhan, Seetharam; Chiel, Hillel J; Bhunia, Swarup

    2009-01-01

    For implantable neural interface applications, it is important to compress data and analyze spike patterns across multiple channels in real time. Such a computational task for online neural data processing requires an innovative circuit-architecture level design approach for low-power, robust and area-efficient hardware implementation. Conventional microprocessor or Digital Signal Processing (DSP) chips would dissipate too much power and are too large in size for an implantable system. In this paper, we propose a novel hardware design approach, referred to as "Preferential Design" that exploits the nature of the neural signal processing algorithm to achieve a low-voltage, robust and area-efficient implementation using nanoscale process technology. The basic idea is to isolate the critical components with respect to system performance and design them more conservatively compared to the noncritical ones. This allows aggressive voltage scaling for low power operation while ensuring robustness and area efficiency. We have applied the proposed approach to a neural signal processing algorithm using the Discrete Wavelet Transform (DWT) and observed significant improvement in power and robustness over conventional design.

  20. Increasing the efficiency of designing hemming processes by using an element-based metamodel approach

    Science.gov (United States)

    Kaiser, C.; Roll, K.; Volk, W.

    2017-09-01

    In the automotive industry, the manufacturing of automotive outer panels requires hemming processes in which two sheet metal parts are joined together by bending the flange of the outer part over the inner part. Because of decreasing development times and the steadily growing number of vehicle derivatives, an efficient digital product and process validation is necessary. Commonly used simulations, which are based on the finite element method, demand significant modelling effort, which results in disadvantages especially in the early product development phase. To increase the efficiency of designing hemming processes this paper presents a hemming-specific metamodel approach. The approach includes a part analysis in which the outline of the automotive outer panels is initially split into individual segments. By parametrizing each of the segments and assigning basic geometric shapes, the outline of the part is approximated. Based on this, the hemming parameters such as flange length, roll-in, wrinkling and plastic strains are calculated for each of the geometric basic shapes by performing a metamodel-based segmental product validation. The metamodel is based on an element-based formulation that includes a reference dataset of various geometric basic shapes. A random automotive outer panel can now be analysed and optimized based on the hemming-specific database. By implementing this approach into a planning system, an efficient optimization of designing hemming processes will be enabled. Furthermore, valuable time and cost benefits can be realized in a vehicle’s development process.

  1. Parallel-Processing Test Bed For Simulation Software

    Science.gov (United States)

    Blech, Richard; Cole, Gary; Townsend, Scott

    1996-01-01

    Second-generation Hypercluster computing system is multiprocessor test bed for research on parallel algorithms for simulation in fluid dynamics, electromagnetics, chemistry, and other fields with large computational requirements but relatively low input/output requirements. Built from standard, off-the-shelf hardware readily upgraded as improved technology becomes available. System used for experiments with such parallel-processing concepts as message-passing algorithms, debugging software tools, and computational steering. First-generation Hypercluster system described in "Hypercluster Parallel Processor" (LEW-15283).

  2. Testing of new banknotes for machines that process currency

    Science.gov (United States)

    Foster, Eugenie E.

    2000-04-01

    Banknotes are now frequently used in machines. The Federal Reserve Board and the US Department of the Treasury have identified a need to produce notes that are reliably accepted in a variety of machine applications. This paper describes the steps that led to identifying the requirements of manufacturers of machines that process banknotes for test notes, and the program developed for the Bureau of Engraving and Printing to address those requirements.

  3. Hybrid microcircuit technology handbook materials, processes, design, testing and production

    CERN Document Server

    Licari, James J

    1998-01-01

    The Hybrid Microcircuit Technology Handbook integrates the many diverse technologies used in the design, fabrication, assembly, and testing of hybrid segments crucial to the success of producing reliable circuits in high yields. Among these are: resistor trimming, wire bonding, die attachment, cleaning, hermetic sealing, and moisture analysis. In addition to thin films, thick films, and assembly processes, important chapters on substrate selections, handling (including electrostatic discharge), failure analysis, and documentation are included. A comprehensive chapter of design guidelines will

  4. Proposing and testing SOA governance process: A case study approach

    DEFF Research Database (Denmark)

    Koumaditis, Konstantinos; Themistocleous, Marinos

    2015-01-01

    Longstanding Healthcare Information Systems (HIS) integration challenges drove healthcare organisations to invest in new paradigms like Service Oriented Architecture (SOA). Yet, SOA holds challenges of its own, with SOA Governance surfacing on the top. This research depicts the development..., grounded in the normative literature and further developed to include healthcare aspects. The proposition is tested in a large Greek hospital utilising qualitative methods and the findings presented herein. This proposal aims to pinpoint attributes and guidelines for SOA Governance Process, required...

  5. Acceptance test report for 241-AW process air system

    International Nuclear Information System (INIS)

    Kostelnik, A.J.

    1994-01-01

    The acceptance test procedure (ATP) for the compressed air system at building 241-AW-273 was completed on March 11, 1993. The system was upgraded to provide a reliable source of compressed air to the tank farm. The upgrade included the demolition of the existing air compressor and associated piping, as well as the installation of a new air compressor with a closed-loop cooling system. A compressed air cross-tie was added to allow the process air compressor to function as a back-up to the existing instrument air compressor. The purpose of the ATP was to achieve three primary objectives: verify the system upgrade in accordance with the design media; provide a functional test of system components and controls; and prepare the system for the Operational Test. The ATP was successfully completed with thirteen exceptions, which were resolved prior to completing the acceptance test. The resolved exceptions had no impact on safety or the environment and are briefly summarized. Testing ensured that the system was installed per design, that its components function as required, and that it is ready for operational testing and subsequent turnover to operations.

  6. Evaluation of energy efficiency efforts of oil and gas offshore processing

    DEFF Research Database (Denmark)

    Nguyen, Tuong-Van; Voldsund, Mari; Breuhaus, Peter

    2015-01-01

    the energy performance of these facilities, by decreasing the power and heating requirements and designing more efficient processes. Several technologies that have been proposed are to (i) promote energy integration within the oil and gas processing plant, (ii) add an additional pressure extraction level, (iii) implement multiphase expanders, and (iv) install a waste heat recovery system. The present work builds on two case studies located in the North and Norwegian Seas, which differ by the type of oil processed, operating conditions and strategies. The findings suggest that no generic improvement can

  7. Interrelationships between trait anxiety, situational stress and mental effort predict phonological processing efficiency, but not effectiveness.

    Science.gov (United States)

    Edwards, Elizabeth J; Edwards, Mark S; Lyvers, Michael

    2016-08-01

    Attentional control theory (ACT) describes the mechanisms associated with the relationship between anxiety and cognitive performance. We investigated the relationship between cognitive trait anxiety, situational stress and mental effort on phonological performance using a simple (forward-) and complex (backward-) word span task. Ninety undergraduate students participated in the study. Predictor variables were cognitive trait anxiety, indexed using questionnaire scores; situational stress, manipulated using ego threat instructions; and perceived level of mental effort, measured using a visual analogue scale. Criterion variables (a) performance effectiveness (accuracy) and (b) processing efficiency (accuracy divided by response time) were analyzed in separate multiple moderated-regression analyses. The results revealed (a) no relationship between the predictors and performance effectiveness, and (b) a significant 3-way interaction on processing efficiency for both the simple and complex tasks, such that at higher effort, trait anxiety and situational stress did not predict processing efficiency, whereas at lower effort, higher trait anxiety was associated with lower efficiency at high situational stress, but not at low situational stress. Our results were in full support of the assumptions of ACT and implications for future research are discussed. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
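The moderated multiple regression described in this record rests on fitting all main effects together with their two- and three-way products and testing the three-way coefficient. The sketch below shows the design-matrix construction with ordinary least squares on synthetic data; the predictor names mirror the study's variables, but the data and coefficient values are invented for the illustration, constructed so that only the three-way term carries a nonzero interaction.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
anxiety, stress, effort = rng.normal(size=(3, n))  # standardized predictors

# Synthetic criterion with a pure three-way interaction (coefficients are
# made up for the example, not taken from the study).
y = 1.0 + 2.0 * anxiety + 3.0 * stress + 0.5 * anxiety * stress * effort

# Design matrix: intercept, main effects, all two-way and three-way products.
X = np.column_stack([
    np.ones(n), anxiety, stress, effort,
    anxiety * stress, anxiety * effort, stress * effort,
    anxiety * stress * effort,
])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)  # OLS estimates
```

In a real analysis the three-way coefficient would be tested against its standard error, and a significant interaction would be probed with simple-slope analyses at high and low values of the moderators, as in the study's procedure.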

  8. An efficient, maintenance free and approved method for spectroscopic control and monitoring of blend uniformity: The moving F-test.

    Science.gov (United States)

    Besseling, Rut; Damen, Michiel; Tran, Thanh; Nguyen, Thanh; van den Dries, Kaspar; Oostra, Wim; Gerich, Ad

    2015-10-10

    Dry powder mixing is a widespread unit operation in the pharmaceutical industry. With the advent of in-line Near Infrared (NIR) spectroscopy and Quality by Design principles, application of Process Analytical Technology to monitor Blend Uniformity (BU) is taking a more prominent role. Yet routine use of NIR for monitoring, let alone control, of blending processes is not common in the industry, despite the improved process understanding and (cost) efficiency that it may offer. Method maintenance, robustness and translation to regulatory requirements have been important barriers to implementing the method. This paper presents a qualitative NIR-BU method offering a convenient and compliant approach to apply BU control for routine operation and process understanding, without extensive calibration and method maintenance requirements. The method employs a moving F-test to detect the steady state of measured spectral variances and the endpoint of mixing. The fundamentals and performance characteristics of the method are first presented, followed by a description of the link to regulatory BU criteria, the method sensitivity and practical considerations. Applications in upscaling, tech transfer and commercial production are described, along with evaluation of the method performance by comparison with results from quantitative calibration models. A full application, in which end-point detection via the F-test controls the blending process of a low-dose product, was successfully filed in Europe and Australia, implemented in commercial production and routinely used for about five years and more than 100 batches. Copyright © 2015 Elsevier B.V. All rights reserved.
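The abstract does not give the paper's exact algorithm, but the idea of a moving F-test for blend end-point detection can be sketched as follows: compare the sample variance of the current window of the monitored spectral statistic with that of the preceding window, and declare steady state once the one-sided F-ratio stops being significant for a few consecutive steps. The window size, significance level, consecutive-hit rule and the synthetic decay trace below are all illustrative assumptions.

```python
import numpy as np
from scipy.stats import f as f_dist

def blend_endpoint(signal, window=10, alpha=0.05, consecutive=3):
    # One-sided moving F-test: while mixing proceeds, the variance of the
    # previous window exceeds that of the current one; steady state is
    # declared after `consecutive` non-significant ratios in a row.
    crit = f_dist.ppf(1.0 - alpha, window - 1, window - 1)
    hits = 0
    for i in range(window, len(signal) - window + 1):
        s2_prev = np.var(signal[i - window:i], ddof=1)
        s2_cur = np.var(signal[i:i + window], ddof=1)
        hits = hits + 1 if s2_prev / s2_cur < crit else 0
        if hits >= consecutive:
            return i  # index at which steady state is declared
    return None  # blend never reached steady state within the trace

# Synthetic "spectral variance" trace: exponential homogenization plus noise.
rng = np.random.default_rng(1)
t = np.arange(80)
trace = 2.0 * np.exp(-t / 8.0) + rng.normal(0.0, 0.02, t.size)
endpoint = blend_endpoint(trace)
```

The consecutive-hit requirement guards against a single noisy window triggering a premature end-point call, which matters when the detected end-point stops the blender in routine production.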

  9. The Internet Process Addiction Test: Screening for Addictions to Processes Facilitated by the Internet

    Directory of Open Access Journals (Sweden)

    Jason C. Northrup

    2015-07-01

    Full Text Available The Internet Process Addiction Test (IPAT) was created to screen for potential addictive behaviors that could be facilitated by the internet. The IPAT was created with the mindset that the term “Internet addiction” is structurally problematic, as the Internet is simply the medium that one uses to access various addictive processes. The role of the internet in facilitating addictions, however, cannot be minimized. A new screening tool that effectively directed researchers and clinicians to the specific processes facilitated by the internet would therefore be useful. This study shows that the Internet Process Addiction Test (IPAT) demonstrates good validity and reliability. Four addictive processes were effectively screened for with the IPAT: online video game playing, online social networking, online sexual activity, and web surfing. Implications for further research and limitations of the study are discussed.

  10. Processing of LEU targets for 99Mo production--testing and modification of the Cintichem process

    International Nuclear Information System (INIS)

    Wu, D.; Landsberger, S.; Buchholz, B.

    1995-09-01

    Recent experimental results on testing and modification of the Cintichem process to allow substitution of low enriched uranium (LEU) for high enriched uranium (HEU) targets are presented in this report. The main focus is on 99Mo recovery and purification by its precipitation with α-benzoin oxime. Parameters that were studied include concentrations of nitric and sulfuric acids, partial neutralization of the acids, molybdenum and uranium concentrations, and the ratio of α-benzoin oxime to molybdenum. Decontamination factors for uranium, neptunium, and various fission products were measured. Experiments with tracer levels of irradiated LEU were conducted for testing the 99Mo recovery and purification during each step of the Cintichem process. Improving the process with additional processing steps was also attempted. The results indicate that the conversion of molybdenum chemical processing from HEU to LEU targets is possible.

  11. The Internet Process Addiction Test: Screening for Addictions to Processes Facilitated by the Internet.

    Science.gov (United States)

    Northrup, Jason C; Lapierre, Coady; Kirk, Jeffrey; Rae, Cosette

    2015-07-28

    The Internet Process Addiction Test (IPAT) was created to screen for potential addictive behaviors that could be facilitated by the internet. The IPAT was created with the mindset that the term "Internet addiction" is structurally problematic, as the Internet is simply the medium that one uses to access various addictive processes. The role of the internet in facilitating addictions, however, cannot be minimized. A new screening tool that effectively directed researchers and clinicians to the specific processes facilitated by the internet would therefore be useful. This study shows that the Internet Process Addiction Test (IPAT) demonstrates good validity and reliability. Four addictive processes were effectively screened for with the IPAT: Online video game playing, online social networking, online sexual activity, and web surfing. Implications for further research and limitations of the study are discussed.

  12. Process improvement methods increase the efficiency, accuracy, and utility of a neurocritical care research repository.

    Science.gov (United States)

    O'Connor, Sydney; Ayres, Alison; Cortellini, Lynelle; Rosand, Jonathan; Rosenthal, Eric; Kimberly, W Taylor

    2012-08-01

    Reliable and efficient data repositories are essential for the advancement of research in Neurocritical care. Various factors, such as the large volume of patients treated within the neuro ICU, their differing length and complexity of hospital stay, and the substantial amount of desired information can complicate the process of data collection. We adapted the tools of process improvement to the data collection and database design of a research repository for a Neuroscience intensive care unit. By the Shewhart-Deming method, we implemented an iterative approach to improve the process of data collection for each element. After an initial design phase, we re-evaluated all data fields that were challenging or time-consuming to collect. We then applied root-cause analysis to optimize the accuracy and ease of collection, and to determine the most efficient manner of collecting the maximal amount of data. During a 6-month period, we iteratively analyzed the process of data collection for various data elements. For example, the pre-admission medications were found to contain numerous inaccuracies after comparison with a gold standard (sensitivity 71% and specificity 94%). Also, our first method of tracking patient admissions and discharges contained higher than expected errors (sensitivity 94% and specificity 93%). In addition to increasing accuracy, we focused on improving efficiency. Through repeated incremental improvements, we reduced the number of subject records that required daily monitoring from 40 to 6 per day, and decreased daily effort from 4.5 to 1.5 h/day. By applying process improvement methods to the design of a Neuroscience ICU data repository, we achieved a threefold improvement in efficiency and increased accuracy. Although individual barriers to data collection will vary from institution to institution, a focus on process improvement is critical to overcoming these barriers.
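
    The sensitivity and specificity figures quoted above are simple ratios against a gold standard; as a reminder of the arithmetic, here is a generic sketch (the counts in the test are invented purely to reproduce the quoted 71%/94%, not taken from the repository):

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN): fraction of true entries recovered.
    Specificity = TN / (TN + FP): fraction of absent entries correctly
    left out.  Inputs are raw counts from a comparison with a gold
    standard."""
    return tp / (tp + fn), tn / (tn + fp)
```
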

  13. A test for judging the presence of additional scatter in a Poisson process

    International Nuclear Information System (INIS)

    Mueller, J.W.

    1978-01-01

    The effect of additional scatter on a Poisson process is studied. Possible causes for such fluctuations are insufficient stability of the detection efficiency or of the associated electronics. It is shown with a simple model that the presence of fluctuations results in a characteristic broadening of the counting distribution. Comparison of the observed distribution with the one expected for a Poisson process with the same mean value will show three different regions, each with a predictable sign of the deviation; the presence of scatter can thus be decided upon by a sign test. Experimental results are in excellent agreement with this expectation.
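
    The broadening criterion lends itself to a quick numerical illustration. The sketch below (my own toy construction, not Müller's model) draws an over-dispersed counting record, histograms it, and reports the sign of the deviation from the Poisson law with the same mean for every count value:

```python
import math
import random

def poisson_pmf(k, mu):
    """Poisson probability of observing k counts given mean mu."""
    return math.exp(-mu) * mu ** k / math.factorial(k)

def poisson_sample(mu, rng):
    """Draw one Poisson variate (Knuth's multiplication method)."""
    limit = math.exp(-mu)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def deviation_signs(counts):
    """Sign of (observed - expected) per count value k, where 'expected'
    is the Poisson distribution with the sample mean.  Extra scatter
    broadens the histogram: positive deviations in both tails, negative
    ones around the mean - the three-region pattern the sign test uses."""
    n = len(counts)
    mu = sum(counts) / n
    hist = [0] * (max(counts) + 1)
    for c in counts:
        hist[c] += 1
    return [0 if hist[k] == n * poisson_pmf(k, mu)
            else (1 if hist[k] > n * poisson_pmf(k, mu) else -1)
            for k in range(len(hist))]
```

    Feeding it counts drawn from a mixture of two Poisson means (a crude stand-in for an unstable detection efficiency) yields both positive and negative regions, whereas a pure Poisson sample shows only statistical jitter.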

  14. Ontological Analysis of Integrated Process Models: testing hypotheses

    Directory of Open Access Journals (Sweden)

    Michael Rosemann

    2001-11-01

    Full Text Available Integrated process modeling is achieving prominence in helping to document and manage business administration and IT processes in organizations. The ARIS framework is a popular example of a framework for integrated process modeling, not least because it underlies the 800 or more reference models embedded in the world's most popular ERP package, SAP R/3. This paper demonstrates the usefulness of the Bunge-Wand-Weber (BWW) representation model for evaluating modeling grammars such as those constituting ARIS. It reports some initial insights gained from pilot testing Green and Rosemann's (2000) evaluative propositions. Even when considering all five views of ARIS, modelers have problems representing business rules, the scope and boundary of systems, and decomposing models. However, even though it is completely ontologically redundant, users still find the function view useful in modeling.

  15. Comparison of the energy efficiency to produce agroethanol between various industries and processes: Synthesis

    International Nuclear Information System (INIS)

    Chavanne, Xavier; Frangi, Jean-Pierre

    2011-01-01

    The article assesses the energy R required by a system to transform a cereal or sugar plant into ethanol. From the specific consumption r_j of each process j and its weight w_j in the system, the process consumption share R_j is deduced, and hence R, the sum of the R_j. Depending on the definition of w_j, R_j and R are relative either to 100 J of ethanol produced or to 100 J of plant harvested. Depending on the nature of r_j, R_j and R represent either only primary external energies, or all fuel and electricity consumed directly, or external and internal energies. From one definition to another, R for average sugar-cane-based industries is the best or the worst relative to other plants. This also results from the use of cane residues as fuels while operating outdated processes. Through r_j, the process-based analysis allows the impact of modern processes or a different use of residues to be examined for each system. All systems benefit except the sugar-beet-based industry, which is close to its best efficiency. This flexibility even permits building a self-sufficient system in which existing processes produce, from system resources, substitutes for external energies. R then becomes an unambiguous definition of a system's efficiency. It shows that all agroethanol systems consume more than the petroleum industry. The system can be expanded to the vehicle stage to compare with alternatives to ethanol such as electricity and biogas. Wheat straw burnt to produce electricity used in an electric vehicle presents an R close to that of the petroleum industry. -- Highlights: → Study of the energy consumption of agroethanol industries with a process-based analysis. → Different definitions of energy efficiency with potentially opposite conclusions. → The previous highlight is overcome using self-sufficient systems with existing processes. → Consumption of average and improved agroethanol industries is larger than for petroleum industries. → Electricity from wheat straw combustion can compete with gasoline from crude oil.
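
    The bookkeeping behind R is a weighted sum over processes; a minimal sketch (the weights and specific consumptions below are purely illustrative numbers, not the article's data):

```python
def system_requirement(processes):
    """Given (w_j, r_j) pairs - the weight of process j in the system and
    its specific consumption - return each process share R_j = w_j * r_j
    and the system total R = sum of the R_j."""
    shares = [w * r for w, r in processes]
    return shares, sum(shares)
```

    Changing how the w_j are defined (per 100 J of ethanol produced vs per 100 J of plant harvested) rescales every share, which is exactly why the record stresses that different definitions can reverse a ranking.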

  16. Arx: a toolset for the efficient simulation and direct synthesis of high-performance signal processing algorithms

    NARCIS (Netherlands)

    Hofstra, K.L.; Gerez, Sabih H.

    2007-01-01

    This paper addresses the efficient implementation of high-performance signal-processing algorithms. In early stages of such designs many computation-intensive simulations may be necessary. This calls for hardware description formalisms targeted for efficient simulation (such as the programming

  17. Towards a utilisation of transient processing in the technology of high efficiency silicon solar cells

    International Nuclear Information System (INIS)

    Eichhammer, W.

    1989-01-01

    The use of transient processing in the technology of high-efficiency silicon solar cells is investigated. An ultraviolet laser (an ArF pulsed excimer laser working at 193 nm) is applied. Laser processing induces only a short superficial melting of the material and does not modify the transport properties in the base of the material. This mode of processing is associated with ion implantation to form the junction, as well as an oxide layer grown in an oxygen atmosphere; the bulk is left entirely cold in this process. The results of the investigation show: that an entirely cold process of solar-cell fabrication still needs a thermal treatment at a temperature around 600 °C; that the oxides obtained are not satisfactory as passivating layers; and that the recombination centers induced by Rapid Thermal Processing (RTP) are not directly related to the quenching step but are a consequence of the presence of metal impurities. The use of transient processing in the adiabatic regime (laser) and in the rapid isothermal regime (RTP) is possible as two complementary techniques for the realization of high-efficiency solar cells.

  18. Energy- and cost-efficient lattice-QCD computations using graphics processing units

    Energy Technology Data Exchange (ETDEWEB)

    Bach, Matthias

    2014-07-01

    Quarks and gluons are the building blocks of all hadronic matter, like protons and neutrons. Their interaction is described by Quantum Chromodynamics (QCD), a theory under test by large scale experiments like the Large Hadron Collider (LHC) at CERN and in the future at the Facility for Antiproton and Ion Research (FAIR) at GSI. However, perturbative methods can only be applied to QCD for high energies. Studies from first principles are possible via a discretization onto an Euclidean space-time grid. This discretization of QCD is called Lattice QCD (LQCD) and is the only ab-initio option outside of the high-energy regime. LQCD is extremely compute and memory intensive. In particular, it is by definition always bandwidth limited. Thus - despite the complexity of LQCD applications - it led to the development of several specialized compute platforms and influenced the development of others. However, in recent years General-Purpose computation on Graphics Processing Units (GPGPU) came up as a new means for parallel computing. Contrary to machines traditionally used for LQCD, graphics processing units (GPUs) are a mass-market product. This promises advantages in both the pace at which higher-performing hardware becomes available and its price. CL2QCD is an OpenCL based implementation of LQCD using Wilson fermions that was developed within this thesis. It operates on GPUs by all major vendors as well as on central processing units (CPUs). On the AMD Radeon HD 7970 it provides the fastest double-precision D kernel for a single GPU, achieving 120 GFLOPS. D - the most compute intensive kernel in LQCD simulations - is commonly used to compare LQCD platforms. This performance is enabled by an in-depth analysis of optimization techniques for bandwidth-limited codes on GPUs. Further, analysis of the communication between GPU and CPU, as well as between multiple GPUs, enables high-performance Krylov space solvers and linear scaling to multiple GPUs within a single system. LQCD

  19. Energy- and cost-efficient lattice-QCD computations using graphics processing units

    International Nuclear Information System (INIS)

    Bach, Matthias

    2014-01-01

    Quarks and gluons are the building blocks of all hadronic matter, like protons and neutrons. Their interaction is described by Quantum Chromodynamics (QCD), a theory under test by large scale experiments like the Large Hadron Collider (LHC) at CERN and in the future at the Facility for Antiproton and Ion Research (FAIR) at GSI. However, perturbative methods can only be applied to QCD for high energies. Studies from first principles are possible via a discretization onto an Euclidean space-time grid. This discretization of QCD is called Lattice QCD (LQCD) and is the only ab-initio option outside of the high-energy regime. LQCD is extremely compute and memory intensive. In particular, it is by definition always bandwidth limited. Thus - despite the complexity of LQCD applications - it led to the development of several specialized compute platforms and influenced the development of others. However, in recent years General-Purpose computation on Graphics Processing Units (GPGPU) came up as a new means for parallel computing. Contrary to machines traditionally used for LQCD, graphics processing units (GPUs) are a mass-market product. This promises advantages in both the pace at which higher-performing hardware becomes available and its price. CL2QCD is an OpenCL based implementation of LQCD using Wilson fermions that was developed within this thesis. It operates on GPUs by all major vendors as well as on central processing units (CPUs). On the AMD Radeon HD 7970 it provides the fastest double-precision D kernel for a single GPU, achieving 120 GFLOPS. D - the most compute intensive kernel in LQCD simulations - is commonly used to compare LQCD platforms. This performance is enabled by an in-depth analysis of optimization techniques for bandwidth-limited codes on GPUs. Further, analysis of the communication between GPU and CPU, as well as between multiple GPUs, enables high-performance Krylov space solvers and linear scaling to multiple GPUs within a single system. LQCD

  20. Diffraction Efficiency Testing of Sinusoidal and Blazed Off-Plane Reflection Gratings

    Science.gov (United States)

    Tutt, James H.; McEntaffer, Randall L.; Marlowe, Hannah; Miles, Drew M.; Peterson, Thomas J.; Deroo, Casey T.; Scholze, Frank; Laubis, Christian

    2016-09-01

    Reflection gratings in the off-plane mount have the potential to enhance the performance of future high resolution soft X-ray spectrometers. Diffraction efficiency can be optimized through the use of blazed grating facets, achieving high-throughput on one side of zero-order. This paper presents the results from a comparison between a grating with a sinusoidally grooved profile and two gratings that have been blazed. The results show that the blaze does increase throughput to one side of zero-order; however, the total throughput of the sinusoidal gratings is greater than the blazed gratings, suggesting the method of manufacturing the blazed gratings does not produce precise facets. The blazed gratings were also tested in their Littrow and anti-Littrow configurations to quantify diffraction efficiency sensitivity to rotations about the grating normal. Only a small difference in the energy at which efficiency is maximized between the Littrow and anti-Littrow configurations is seen with a small shift in peak efficiency towards higher energies in the anti-Littrow case. This is due to a decrease in the effective blaze angle in the anti-Littrow mounting. This is supported by PCGrate-SX V6.1 modeling carried out for each blazed grating which predicts similar response trends in the Littrow and anti-Littrow orientations.

  1. Effect of Shoes on Stiffness and Energy Efficiency of Ankle-Foot Orthosis: Bench Testing Analysis.

    Science.gov (United States)

    Kobayashi, Toshiki; Gao, Fan; LeCursi, Nicholas; Foreman, K Bo; Orendurff, Michael S

    2017-12-01

    Understanding the mechanical properties of ankle-foot orthoses (AFOs) is important to maximize their benefit for those with movement disorders during gait. Though mechanical properties such as stiffness and/or energy efficiency of AFOs have been extensively studied, it remains unknown how and to what extent shoes influence their properties. The aim of this study was to investigate the effect of shoes on stiffness and energy efficiency of an AFO using a custom mechanical testing device. Stiffness and energy efficiency of the AFO were measured in the plantar flexion and dorsiflexion range, respectively, under AFO-alone and AFO-Shoe combination conditions. The results of this study demonstrated that the stiffness of the AFO-Shoe combination was significantly decreased compared to the AFO-alone condition, but no significant differences were found in energy efficiency. From the results, we recommend that shoes used with AFOs should be carefully selected not only based on their effect on alignment of the lower limb, but also their effects on overall mechanical properties of the AFO-Shoe combination. Further study is needed to clarify the effects of differences in shoe designs on AFO-Shoe combination mechanical properties.
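
    The record does not describe how stiffness and energy efficiency were computed from the bench-test data, so the following is only a generic sketch of how such quantities are typically extracted from loading/unloading moment-angle curves (the least-squares fit and trapezoidal integration are my assumptions, not the authors' protocol):

```python
def slope(xs, ys):
    """Least-squares slope of the moment-angle curve, used as stiffness."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return num / sum((x - mx) ** 2 for x in xs)

def trapz(xs, ys):
    """Trapezoidal integral of y dx along the given path."""
    return sum((xs[i + 1] - xs[i]) * (ys[i + 1] + ys[i]) / 2.0
               for i in range(len(xs) - 1))

def energy_efficiency(load_ang, load_mom, unload_ang, unload_mom):
    """Energy returned during unloading divided by energy stored during
    loading; the hysteresis loop between the two curves is the loss."""
    stored = trapz(load_ang, load_mom)
    returned = -trapz(unload_ang, unload_mom)  # unloading path runs backwards
    return returned / stored
```

    On a synthetic loop that stores 50 units and returns 40, the efficiency comes out at 0.8; comparing such numbers for AFO-alone vs AFO-Shoe conditions is the kind of contrast the study reports.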

  2. Testing the weak-form efficiency of the WTI crude oil futures market

    Science.gov (United States)

    Jiang, Zhi-Qiang; Xie, Wen-Jie; Zhou, Wei-Xing

    2014-07-01

    The weak-form efficiency of energy futures markets has long been studied and empirical evidence suggests controversial conclusions. In this work, nonparametric methods are adopted to estimate the Hurst indexes of the WTI crude oil futures prices (1983-2012) and a strict statistical test in the spirit of bootstrapping is put forward to verify the weak-form market efficiency hypothesis. The results show that the crude oil futures market is efficient when the whole period is considered. When the whole series is divided into three sub-series separated by the outbreaks of the Gulf War and the Iraq War, it is found that the Gulf War reduced the efficiency of the market. If the sample is split into two sub-series based on the signing date of the North American Free Trade Agreement, the market is found to be inefficient in the sub-periods during which the Gulf War broke out. The same analysis on short-time series in moving windows shows that the market is inefficient only when some turbulent events occur, such as the oil price crash in 1985, the Gulf war, and the oil price crash in 2008.
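
    The record relies on nonparametric Hurst estimation plus a bootstrap test, but specifies neither; the sketch below is a generic rescaled-range (R/S) estimator with a shuffle-based null (all names and parameters are my own, and the plain R/S estimator is known to be biased upward on short series):

```python
import math
import random

def hurst_rs(series, min_chunk=8):
    """Rescaled-range estimate of the Hurst exponent.  H near 0.5 is
    consistent with weak-form efficiency; H above (below) 0.5 suggests
    persistent (anti-persistent) behaviour."""
    n = len(series)
    sizes, rs_means = [], []
    size = min_chunk
    while size <= n // 2:
        rs_vals = []
        for start in range(0, n - size + 1, size):
            chunk = series[start:start + size]
            mean = sum(chunk) / size
            devs = [x - mean for x in chunk]
            cum, s = [], 0.0
            for d in devs:
                s += d
                cum.append(s)
            spread = max(cum) - min(cum)          # range of cumulative deviations
            sd = math.sqrt(sum(d * d for d in devs) / size)
            if sd > 0:
                rs_vals.append(spread / sd)
        if rs_vals:
            sizes.append(size)
            rs_means.append(sum(rs_vals) / len(rs_vals))
        size *= 2
    # Hurst exponent = slope of log(R/S) against log(chunk size)
    xs = [math.log(s) for s in sizes]
    ys = [math.log(v) for v in rs_means]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

def shuffle_null(series, n_boot=20, seed=0):
    """Shuffling destroys temporal dependence while keeping the marginal
    distribution, giving a null distribution of H to compare against."""
    rng = random.Random(seed)
    out = []
    for _ in range(n_boot):
        s = list(series)
        rng.shuffle(s)
        out.append(hurst_rs(s))
    return out
```

    The observed H is then judged against the quantiles of the shuffled values; an H outside that band flags inefficiency, much as the record reports for the Gulf War sub-period.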

  3. Phaco-efficiency test and re-aspiration analysis of repulsed particle in phacoemulsification.

    Science.gov (United States)

    Kim, Jae-hyung; Ko, Dong-Ah; Kim, Jae Yong; Kim, Myoung Joon; Tchah, Hungwon

    2013-04-01

    To measure the efficiency of phacoemulsification, we developed a new experimental model for testing phaco-efficiency and analyzed the re-aspiration of repulsed particles. Using a Kitaro wetlab system, a piece of blood agar (BA) was placed in an artificial chamber and the phacoemulsifier was placed horizontally. The settings of the phacoemulsifier (Infiniti, Alcon Laboratories) were 26 cc/min for aspiration, 350 cc/min for vacuum, and 95 cm of bottle height. The time to remove BAs was measured using Ozil 100%, Ozil 40%, and longitudinal 40% of phaco power. The angle between the re-aspirated BA particles and the axis of the phacoemulsifier (re-aspiration zone, in degrees) was analyzed. The average time (seconds) to remove BAs was lower in the Ozil 100% and Ozil 40% modes than in the longitudinal mode (0.37 ± 0.39, 0.85 ± 0.57, and 2.22 ± 1.40, respectively, P value < 0.01). Repulsion exceeding 1 mm occurred more frequently in the longitudinal mode than in the Ozil 100% mode (100% vs 40%, P value = 0.01, Fisher's exact test). The average re-aspiration zone was 25.9 ± 14.5 degrees in the longitudinal 40% mode and 54.0 ± 23.0 degrees in the Ozil 40% mode (P value = 0.016). The Ozil mode was more efficient than the longitudinal mode. In addition, the Ozil mode provided less repulsion and a wider re-aspiration zone.
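
    The group comparison above uses Fisher's exact test. For a 2x2 table it can be computed directly from the hypergeometric distribution; the sketch below is a generic two-sided implementation (the example table is the classic lady-tasting-tea layout, not the study's data):

```python
from math import comb

def fisher_exact_2x2(a, b, c, d):
    """Two-sided Fisher's exact test for the 2x2 table [[a, b], [c, d]]:
    sums the hypergeometric probabilities of all tables with the same
    margins that are no more likely than the observed one."""
    row1, row2, col1 = a + b, c + d, a + c
    n = row1 + row2

    def p(x):  # P(a table with x in the top-left cell, margins fixed)
        return comb(row1, x) * comb(row2, col1 - x) / comb(n, col1)

    p_obs = p(a)
    lo, hi = max(0, col1 - row2), min(row1, col1)
    # small tolerance guards against float ties when p(x) == p_obs
    return sum(p(x) for x in range(lo, hi + 1) if p(x) <= p_obs * (1 + 1e-9))
```
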

  4. EFFICIENT LIDAR POINT CLOUD DATA MANAGING AND PROCESSING IN A HADOOP-BASED DISTRIBUTED FRAMEWORK

    Directory of Open Access Journals (Sweden)

    C. Wang

    2017-10-01

    Full Text Available Light Detection and Ranging (LiDAR) is one of the most promising technologies in surveying and mapping, city management, forestry, object recognition, computer vision engineering and other fields. However, it is challenging to efficiently store, query and analyze high-resolution 3D LiDAR data due to its volume and complexity. In order to improve the productivity of LiDAR data processing, this study proposes a Hadoop-based framework to efficiently manage and process LiDAR data in a distributed and parallel manner, taking advantage of Hadoop's storage and computing ability. At the same time, the Point Cloud Library (PCL), an open-source project for 2D/3D image and point cloud processing, is integrated with HDFS and MapReduce to run the LiDAR data analysis algorithms provided by PCL in a parallel fashion. The experimental results show that the proposed framework can efficiently manage and process big LiDAR data.
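
    The framework itself runs on Hadoop/HDFS, but the map/reduce decomposition it relies on can be illustrated in-process; the spatial tiling below is a sketch of the idea only (tile size and key scheme are my assumptions, not the paper's job design):

```python
from collections import defaultdict

def map_point(point, tile=10.0):
    """Map step: key each 3-D point by the 2-D grid tile it falls in, so
    spatially close points land on the same reducer."""
    x, y, z = point
    return (int(x // tile), int(y // tile)), point

def reduce_tiles(mapped):
    """Reduce step: gather the points of every tile; per-tile algorithms
    (filtering, normal estimation, ...) can then run independently and
    in parallel across reducers."""
    tiles = defaultdict(list)
    for key, point in mapped:
        tiles[key].append(point)
    return dict(tiles)
```

    In the actual framework the map and reduce steps would be Hadoop tasks and the per-tile work would call into PCL; the in-memory version just shows the partitioning logic.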

  5. Energy use and implications for efficiency strategies in global fluid-milk processing industry

    International Nuclear Information System (INIS)

    Xu Tengfang; Flapper, Joris

    2009-01-01

    The fluid-milk processing industry around the world processes approximately 60% of total raw milk production to create diverse fresh fluid-milk products. This paper reviews energy usage in existing global fluid-milk markets to identify baseline information that allows comparisons of energy performance of individual plants and systems. In this paper, we analyzed energy data compiled through extensive literature reviews on fluid-milk processing across a number of countries and regions. The study has found that the average final energy intensity of individual plants exhibited significant large variations, ranging from 0.2 to 12.6 MJ per kg fluid-milk product across various plants in different countries and regions. In addition, it is observed that while the majority of larger plants tended to exhibit higher energy efficiency, some exceptions existed for smaller plants with higher efficiency. These significant differences have indicated large potential energy-savings opportunities in the sector across many countries. Furthermore, this paper illustrates a positive correlation between implementing energy-monitoring programs and curbing the increasing trend in energy demand per equivalent fluid-milk product over time in the fluid-milk sector, and suggests that developing an energy-benchmarking framework, along with promulgating new policy options should be pursued for improving energy efficiency in global fluid-milk processing industry.

  6. Bacterial magnetic particles improve testes-mediated transgene efficiency in mice.

    Science.gov (United States)

    Wang, Chao; Sun, Guanghong; Wang, Ye; Kong, Nana; Chi, Yafei; Yang, Leilei; Xin, Qiliang; Teng, Zhen; Wang, Xu; Wen, Yujun; Li, Ying; Xia, Guoliang

    2017-11-01

    Nano-scaled materials have been proved to be ideal DNA carriers for transgenes. Bacterial magnetic particles (BMPs) help to reduce the toxicity of polyethylenimine (PEI), an efficient gene-transferring agent, and assist tissue transgene delivery ex vivo. Here, the effectiveness of the BMP-PEI complex-conjugated foreign DNAs (BPDs) in promoting testes-mediated gene transfer (TMGT) in mouse was compared with that of liposome-conjugated foreign DNAs. The results proved that, through testes injection, the clusters of BPDs successfully reached the cytoplasm and the nuclei of spermatogenic cells and were expressed in testes of transgene founder mice. Additionally, the ratio of founder mice obtained from BPDs (88%) was about 3 times higher than the control (25%) (p < 0.05). Transgene outcomes in mice from the BPD group were significantly improved compared with the control (p < 0.05), and the ratio of transgenic mice within the first filial generation was significantly higher for BPDs than for the control (73.8% versus 11.6%, p < 0.05), suggesting that BPDs improve TMGT efficiency in mice in vivo.

  7. High efficiency grating couplers based on shared process with CMOS MOSFETs

    International Nuclear Information System (INIS)

    Qiu Chao; Sheng Zhen; Wu Ai-Min; Wang Xi; Zou Shi-Chang; Gan Fu-Wan; Li Le; Albert Pang

    2013-01-01

    Grating couplers are widely investigated as coupling interfaces between silicon-on-insulator waveguides and optical fibers. In this work, a high-efficiency, complementary metal-oxide-semiconductor (CMOS) process-compatible grating coupler is proposed. The poly-Si layer used as a gate in the CMOS metal-oxide-semiconductor field-effect transistor (MOSFET) is combined with a normal fully etched grating coupler, which greatly enhances its coupling efficiency. With optimal structure parameters, the coupling efficiency can reach as high as ∼ 70% at a wavelength of 1550 nm, as indicated by simulation. From the fabrication point of view, all masks and etching steps are shared between MOSFETs and grating couplers, making the high-performance grating couplers easy to integrate with CMOS circuits. Fabrication errors such as alignment shift are also simulated, showing that the device is quite tolerant in fabrication.

  8. Conversion efficiency in the process of copolarized spontaneous four-wave mixing

    International Nuclear Information System (INIS)

    Garay-Palmett, Karina; U'Ren, Alfred B.; Rangel-Rojo, Raul

    2010-01-01

    We study the process of copolarized spontaneous four-wave mixing in single-mode optical fibers, with an emphasis on an analysis of the conversion efficiency. We consider both the monochromatic-pump and pulsed-pump regimes, as well as both the degenerate-pump and nondegenerate-pump configurations. We present analytical expressions for the conversion efficiency, which are given in terms of double integrals. In the case of pulsed pumps we take these expressions to closed analytical form with the help of certain approximations. We present results of numerical simulations, and compare them to values obtained from our analytical expressions, for the conversion efficiency as a function of several key experimental parameters.

  9. Self Cleaning High Efficiency Particulate Air (HEPA) Filtration without Interrupting Process Flow - 59347

    International Nuclear Information System (INIS)

    Chadwick, Chris

    2012-01-01

    The strategy of protecting the traditional glass-fibre HEPA filtration train from its blinding contamination, and of recovering dust by means of self-cleaning pre-filtration, is a proven means of reducing ultimate disposal volumes and has been used within the fuel production industry. However, there is an increasing demand in nuclear applications requiring elevated operating temperatures, fire resistance, moisture resistance and a chemical composition that existing glass-fibre HEPA filtration cannot accommodate, which can be remedied by the use of a metallic HEPA filter medium. Previous research suggests that the then cost to the Department of Energy (DOE), based on a five-year life cycle, was $29.5 million for the installation, testing, removal and disposal of glass-fibre HEPA filtration trains. Within these costs, $300 was the value given to the filter and $4,450 was given to the peripheral activity. Development of a low-cost, cleanable, metallic, direct replacement of the traditional filter train would be the clear solution. The Bergman et al. work has suggested that a 1000 ft3/min, cleanable, stainless HEPA filter could be commercially available for $5,000 each, whereas the industry has determined that the truer cost of such an item in isolation would be closer to $15,000. This results in a conflict within the requirement between 'low cost' and 'stainless HEPA'. By proposing a system that combines metallic HEPA filtration with the ability to self-clean without interrupting the process flow, the need for a traditional HEPA filtration train is eliminated, dramatically reducing the resources required for cleaning or disposal and thus presenting a route to reducing ultimate costs. The paper examines the performance characteristics, filtration efficiency, flow versus differential pressure, and cleanability of a self-cleaning HEPA-grade sintered metal filter element, together with data to prove the contention. (authors)

  10. Economic efficiency of countries' clinical review processes and competitiveness on the market of human experimentation.

    Science.gov (United States)

    Ippoliti, Roberto

    2013-01-01

    Clinical research is a specific phase of pharmaceutical industry's production process in which companies test candidate drugs on patients to collect clinical evidence about safety and effectiveness. Information is essential to obtain manufacturing authorization from the national drug agency and, in this way, make profits on the market. Considering this activity, however, the public stakeholder has to face a conflict of interests. On the one side, there is society's necessity to make advances in medicine and, of course, to promote pharmaceutical companies' investments in this specific phase (new generation). On the other side, there is the duty to protect patients involved in these experimental treatments (old generation). To abide by this moral duty, a protection system was developed through the years, based on two legal institutions: informed consent and institutional review board. How should an efficient protection system that would take human experimentation into account be shaped? Would it be possible for the national protection system of patients' rights to affect the choice of whether to develop a clinical trial in a given country or not? Looking at Europe and considering a protection system that is shaped around institutional review boards, this article is an empirical work that tries to give answers to these open questions. It shows how a protection system that can minimize the time necessary to start a trial can positively affect pharmaceutical clinical research, that is, the choice of pharmaceutical companies to start innovative medical treatments in a given country. Copyright © 2013 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  11. Energy Efficient Microwave Hybrid Processing of Lime for Cement, Steel, and Glass Industries

    Energy Technology Data Exchange (ETDEWEB)

    Fall, Morgana L; Yakovlev, Vadim; Sahi, Catherine; Baranova, Inessa; Bowers, Johnney G; Esquenazi, Gibran L

    2012-02-10

    In this study, microwave-material interactions were studied through dielectric property measurements, process modeling, and lab-scale microwave hybrid calcination tests. Characterization and analysis were performed to evaluate material reactions and energy usage. Processing parameters for laboratory-scale and larger-scale calcining experiments were developed for MAT limestone calcination. Early-stage equipment design concepts were developed, with a focus on microwave post-heating treatment. The retrofitting of existing rotary calcining equipment in the lime industry was assessed and found to be feasible. Ceralink sought to address some of the major barriers to the uptake of MAT, identified as the need for (1) a team approach with end users, technology partners, and equipment manufacturers, and (2) modeling that incorporates kiln materials and variations to the design of industrial microwave equipment. This project has furthered the commercialization effort of MAT by working closely with an industrial lime manufacturer to educate them regarding MAT, identifying an equipment manufacturer to supply microwave equipment, and developing sophisticated MAT modeling with WPI, the university partner. MAT was shown to enhance calcining through lower energy consumption and faster reaction rates compared to conventional processing. Laboratory testing concluded that a 23% reduction in energy was possible for calcining small batches (5 kg). Scale-up testing indicated that the energy savings increased as a function of load size, and 36% energy savings was demonstrated (22 kg). A sophisticated model was developed which combines simultaneous microwave and conventional heating. Continued development of this modeling software could be used for larger-scale calcining simulations, which would be a beneficial low-cost tool for exploring equipment design prior to actual building. Based on these findings, estimates for production-scale MAT calcining benefits were calculated, assuming uptake of

  12. Efficient energy conversion in the pulp and paper industry: application to a sulfite wood pulping process

    Energy Technology Data Exchange (ETDEWEB)

    Marechal, F.

    2007-07-01

    This report summarizes the actions performed in 2006 and the actions planned for 2007 within the framework of the project Efficient Energy Conversion in the Pulp and Paper Industry. In addition to the data reconciliation models of the steam and condensate networks and of the process of Borregaard Schweiz AG, process models have been developed with the goal of defining the heat requirements of the process. The combination of utility system data reconciliation with the process models considerably reduces the need for detailed process modelling and for on-site data collection and measurement. A systematic definition of the hot and cold streams in the process has been developed in order to compute the minimum energy requirement of the process. The process requirements have been defined using the dual representation concept, in which the energy requirements of the process unit operations are systematically analysed in terms of their thermodynamic requirement and the way they are satisfied by the technology that implements the operation. Representing the same energy requirement as realised at different temperature levels makes it possible, on the one hand, to define the exergy efficiency of the heat transfer system in each process unit operation and, on the other, to identify possible energy savings by heat exchange in the system. The analysis has been completed by the definition of the possible energy recovery from waste streams. The minimum energy requirement of the process using the different requirement representations has been determined, and the analysis of the energy savings opportunities is now under preparation. This next step will first concern the definition of the utility system integration and the systematic analysis of the energy savings opportunities, followed by the techno-economic evaluation of the most profitable energy savings options in the process. The national and international collaborations also constitute an important part of this project. The project is done in close

  13. Image processing of globular clusters - Simulation for deconvolution tests (GlencoeSim)

    Science.gov (United States)

    Blazek, Martin; Pata, Petr

    2016-10-01

    This paper presents an algorithmic approach for efficiency tests of deconvolution algorithms in astronomical image processing. Due to the existence of noise in astronomical data, there is no certainty that a mathematically exact result of stellar deconvolution exists, and iterative or other methods, such as aperture or PSF-fitting photometry, are commonly used. Iterative methods are important namely in the case of crowded fields (e.g., globular clusters). For tests of the efficiency of these iterative methods on various stellar fields, information about the real fluxes of the sources is essential. For this purpose, a simulator of artificial images with crowded stellar fields provides initial information on source fluxes for a robust statistical comparison of various deconvolution methods. The "GlencoeSim" simulator and the algorithms presented in this paper consider various settings of Point-Spread Functions, noise types and spatial distributions, with the aim of producing as realistic an astronomical optical stellar image as possible.
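The simulator's core idea, known ground-truth fluxes rendered with a chosen PSF and noise model, can be sketched in a few lines (an illustrative reconstruction, not the GlencoeSim code; all names and parameter values are hypothetical):

```python
import numpy as np

def simulate_star_field(size=128, n_stars=50, fwhm=3.0, sky=100.0, seed=0):
    """Render a crowded artificial star field: Gaussian PSF plus photon noise.

    Returns the noisy image and the ground-truth (x, y, flux) table that a
    deconvolution test can compare its recovered photometry against.
    """
    rng = np.random.default_rng(seed)
    sigma = fwhm / 2.3548                            # FWHM -> Gaussian sigma
    xs = rng.uniform(0, size, n_stars)
    ys = rng.uniform(0, size, n_stars)
    fluxes = 10.0 ** rng.uniform(2.5, 4.5, n_stars)  # wide brightness range

    yy, xx = np.mgrid[0:size, 0:size]
    image = np.full((size, size), sky, dtype=float)
    for x, y, f in zip(xs, ys, fluxes):
        psf = np.exp(-((xx - x) ** 2 + (yy - y) ** 2) / (2 * sigma ** 2))
        image += f * psf / (2 * np.pi * sigma ** 2)  # unit-flux PSF

    noisy = rng.poisson(image).astype(float)         # Poisson (shot) noise
    return noisy, np.column_stack([xs, ys, fluxes])

img, truth = simulate_star_field()
```

A statistical comparison then runs each deconvolution method on `img` and scores the recovered fluxes against `truth`.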

  14. Reliability demonstration methodology for products with Gamma Process by optimal accelerated degradation testing

    International Nuclear Information System (INIS)

    Zhang, Chunhua; Lu, Xiang; Tan, Yuanyuan; Wang, Yashun

    2015-01-01

    For products with high reliability and long lifetime, accelerated degradation testing (ADT) may be adopted during the product development phase to verify whether reliability satisfies the predetermined level within a feasible test duration. Actual degradation in engineering is usually a strictly monotonic process, such as fatigue crack growth, wear, and erosion. However, methods for reliability demonstration by ADT with a monotonic degradation process have not been investigated so far. This paper proposes a reliability demonstration methodology by ADT for this kind of product. We first apply the Gamma process to describe the monotonic degradation. Next, we present a reliability demonstration method that converts the required reliability level into an allowable cumulative degradation in ADT and compares the actual accumulated degradation with the allowable level. Further, we suggest an analytical optimal ADT design method for more efficient reliability demonstration, minimizing the asymptotic variance of the decision variable under constraints on sample size, test duration, test cost, and predetermined decision risks. The method is validated and illustrated with an example on reliability demonstration of an alloy product, and is finally applied to demonstrate the wear reliability of a spherical plain bearing over a long service duration. - Highlights: • We present a reliability demonstration method by ADT for products with a monotonic degradation process, which may be applied to verify reliability with long service life within a feasible test duration. • We suggest an analytical optimal ADT design method for more efficient reliability demonstration, which differs from existing optimal ADT designs (aimed at more accurate reliability estimation) in its objective function and constraints. • The methods are applied to demonstrate the wear reliability within long service duration of
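The demonstration logic, accumulating strictly monotone Gamma-process degradation and comparing it with an allowable level, can be illustrated with a simulation (a sketch with made-up parameters, not the paper's test design):

```python
import numpy as np

def gamma_degradation_paths(shape_rate, scale, t_end, dt, n_paths, seed=1):
    """Monotone degradation paths X(t) from a stationary Gamma process.

    Independent increments over each dt are Gamma(shape_rate*dt, scale),
    so every path is non-decreasing, like wear or fatigue crack growth.
    """
    rng = np.random.default_rng(seed)
    n_steps = int(t_end / dt)
    increments = rng.gamma(shape_rate * dt, scale, size=(n_paths, n_steps))
    return np.cumsum(increments, axis=1)

paths = gamma_degradation_paths(shape_rate=0.5, scale=2.0,
                                t_end=100.0, dt=1.0, n_paths=1000)

# Demonstration decision: compare the degradation accumulated by the end of
# the (accelerated) test with the allowable cumulative degradation level.
allowable = 150.0
pass_fraction = float(np.mean(paths[:, -1] <= allowable))
```

With these parameters the mean terminal degradation is 100, well below the allowable 150, so nearly all simulated units pass the demonstration.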

  15. Highly efficient red phosphorescent organic light-emitting diodes based on solution processed emissive layer

    International Nuclear Information System (INIS)

    Liu, Baiquan; Xu, Miao; Tao, Hong; Ying, Lei; Zou, Jianhua; Wu, Hongbin; Peng, Junbiao

    2013-01-01

    Highly efficient red phosphorescent organic light-emitting diodes (PhOLEDs) were fabricated based on a solution-processed small-molecule host 4,4′-bis(N-carbazolyl)-1,1′-biphenyl (CBP) doped with an iridium complex, tris(1-(2,6-dimethylphenoxy)-4-(4-chlorophenyl)phthalazine)iridium(III) (Ir(MPCPPZ)3). A hole-blocking layer of 1,3,5-tri(1-phenyl-1H-benzo[d]imidazol-2-yl)phenyl (TPBI), which also functions as an electron transporter, was thermally deposited onto the top of the CBP layer. The diode with the structure ITO/PEDOT:PSS (50 nm)/CBP:Ir(MPCPPZ)3 (55 nm)/TPBI (30 nm)/Ba (4 nm)/Al (120 nm) showed an external quantum efficiency (QEext) of 19.3% and a luminous efficiency (LE) of 18.3 cd/A at a current density of 0.16 mA/cm2, with Commission Internationale de l'Eclairage (CIE) coordinates of (0.607, 0.375). It was suggested that the diodes using the TPBI layer exhibited nearly 100% internal quantum efficiency and an order-of-magnitude enhancement in LE and QEext. -- Highlights: • Efficient red PhOLEDs based on a solution-processed small-molecule host were fabricated. • By altering the volume ratio of the chloroform/chlorobenzene solvent, we obtained the best film quality of CBP. • The EQE of the diode was 19.3%, indicating that nearly 100% internal quantum yield was achieved

  16. Energy-efficient neural information processing in individual neurons and neuronal networks.

    Science.gov (United States)

    Yu, Lianchun; Yu, Yuguo

    2017-11-01

    Brains are composed of networks of an enormous number of neurons interconnected with synapses. Neural information is carried by the electrical signals within neurons and the chemical signals among neurons. Generating these electrical and chemical signals is metabolically expensive. The fundamental issue raised here is whether brains have evolved efficient ways of developing an energy-efficient neural code from the molecular level to the circuit level. Here, we summarize the factors and biophysical mechanisms that could contribute to the energy-efficient neural code for processing input signals. The factors range from ion channel kinetics, body temperature, axonal propagation of action potentials, low-probability release of synaptic neurotransmitters, optimal input and noise, the size of neurons and neuronal clusters, excitation/inhibition balance, coding strategy, cortical wiring, and the organization of functional connectivity. Both experimental and computational evidence suggests that neural systems may use these factors to maximize the efficiency of energy consumption in processing neural signals. Studies indicate that efficient energy utilization may be universal in neuronal systems as an evolutionary consequence of the pressure of limited energy. As a result, neuronal connections may be wired in a highly economical manner to lower energy costs and space. Individual neurons within a network may encode independent stimulus components to allow a minimal number of neurons to represent whole stimulus characteristics efficiently. This basic principle may fundamentally change our view of how billions of neurons organize themselves into complex circuits to operate and generate the most powerful intelligent cognition in nature. © 2017 Wiley Periodicals, Inc.

  17. Defense Waste Processing Facility Canister Closure Weld Current Validation Testing

    Energy Technology Data Exchange (ETDEWEB)

    Korinko, P. S. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Maxwell, D. N. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2018-01-29

    Two closure welds on filled Defense Waste Processing Facility (DWPF) canisters were outside the acceptance criteria in the DWPF operating procedure SW4-15.80-2.3 (1). In one case, the weld heat setting was inadvertently provided to the canister at the value used for test welds (i.e., 72%) and this oversight produced a weld at a current of nominally 210 kA compared to the operating procedure range (i.e., 82%) of 240 kA to 263 kA. The second weld appeared to experience an instrumentation and data acquisition upset. The current for this weld was reported as 191 kA. Review of the data from the Data Acquisition System (DAS) indicated that three of the four current legs were reading the expected values, approximately 62 kA each, and the fourth leg read zero current. Since there is no feasible way, by further examination of the process data, to ascertain whether this weld was actually made at either the target current or the lower current, a test plan was executed to provide assurance that these Nonconforming Welds (NCWs) meet the requirements for strength and leak tightness. Acceptance of the welds is based on evaluation of Test Nozzle Welds (TNW) made specifically for comparison. The TNW were nondestructively and destructively evaluated for plug height, heat tint, ultrasonic testing (UT) for bond length and ultrasonic volumetric examination for weld defects, burst pressure, fractography, and metallography. The testing was conducted in agreement with a Task Technical and Quality Assurance Plan (TTQAP) (2) and applicable procedures.

  18. The retrieval efficiency test of descriptors and free vocabulary terms in INIS on-line search

    International Nuclear Information System (INIS)

    Ebinuma, Yukio; Takahashi, Satoko

    1981-01-01

    The test was done for 1) search topics with appropriate descriptors, 2) search topics with considerably broader descriptors, and 3) search topics with no appropriate descriptors. For (1) and (2), the retrieval efficiency was the same on the descriptor system as on the keyword system (descriptors + free terms), and the search formulas were easily constructed. For (3), the descriptor system ensured the recall ratio but decreased the precision ratio; the keyword system, on the other hand, made the construction of search formulas easy and resulted in good retrieval efficiency. A search system that supports both the full-match method on descriptors and the truncation method on keywords is desirable, because either method can then be selected according to the searcher's strategy and search topics. A free-term-only system seems unnecessary. (author)

  19. Integration of solar thermal for improved energy efficiency in low-temperature-pinch industrial processes

    International Nuclear Information System (INIS)

    Atkins, Martin J.; Walmsley, Michael R.W.; Morrison, Andrew S.

    2010-01-01

    Solar thermal systems have the potential to provide renewable industrial process heat and are especially suited for low pinch temperature processes such as those in the food, beverage, and textile sectors. When correctly integrated within an industrial process, they can provide significant progress towards both increased energy efficiency and reduction in emissions. However, the integration of renewable solar energy into industrial processes presents a challenge for existing process integration techniques due to the non-continuous nature of the supply. A thorough pinch analysis study of the industrial process, taking into account non-continuous operating rates, should be performed to evaluate the utility demand profile. Solar collector efficiency data under variable climatic conditions should also be collected for the specific site. A systematic method of combining this information leads to improved design and an optimal operating strategy. This approach has been applied to a New Zealand milk powder plant and benefits of several integration strategies, including mass integration, are investigated. The appropriate placement of the solar heat is analogous to the placement of a hot utility source and an energy penalty will be incurred when the solar thermal system provides heat below the pinch temperature.
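The "appropriate placement" argument rests on locating the process pinch. The standard problem-table cascade that yields the pinch and the hot-utility target (which solar heat should then supply above the pinch) can be sketched as follows, using invented stream data rather than the milk powder plant's:

```python
# Streams: (supply T [C], target T [C], heat capacity flowrate CP [kW/K])
hot_streams = [(200.0, 80.0, 1.0), (90.0, 50.0, 2.0)]   # need cooling
cold_streams = [(50.0, 180.0, 1.5)]                     # need heating
dt_min = 10.0

# Shift temperatures by dt_min/2: hot streams down, cold streams up
shifted = [(ts - dt_min / 2, tt - dt_min / 2, cp, "hot")
           for ts, tt, cp in hot_streams]
shifted += [(ts + dt_min / 2, tt + dt_min / 2, cp, "cold")
            for ts, tt, cp in cold_streams]

# Temperature interval boundaries, hottest first
bounds = sorted({t for ts, tt, _, _ in shifted for t in (ts, tt)}, reverse=True)

# Cascade the net heat surplus/deficit down the intervals
cascade, heat = [0.0], 0.0
for hi, lo in zip(bounds, bounds[1:]):
    net = 0.0
    for ts, tt, cp, kind in shifted:
        if min(ts, tt) <= lo and max(ts, tt) >= hi:     # stream spans interval
            net += cp * (hi - lo) * (1 if kind == "hot" else -1)
    heat += net
    cascade.append(heat)

q_hot_min = -min(cascade)                 # minimum hot utility target [kW]
pinch_shifted = bounds[cascade.index(min(cascade))]     # shifted pinch T
```

For this invented data the pinch sits at a shifted 85 degrees C (hot side 90, cold side 80) with a 40 kW hot utility target; solar heat is only "appropriately placed" if delivered above that pinch, since heat supplied below it merely displaces heat recovery.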

  20. Integration of solar thermal for improved energy efficiency in low-temperature-pinch industrial processes

    Energy Technology Data Exchange (ETDEWEB)

    Atkins, Martin J.; Walmsley, Michael R.W.; Morrison, Andrew S. [Energy Research Group, School of Science and Engineering, University of Waikato, Private Bag 3105, Hamilton 3240 (New Zealand)

    2010-05-15

    Solar thermal systems have the potential to provide renewable industrial process heat and are especially suited for low pinch temperature processes such as those in the food, beverage, and textile sectors. When correctly integrated within an industrial process, they can provide significant progress towards both increased energy efficiency and reduction in emissions. However, the integration of renewable solar energy into industrial processes presents a challenge for existing process integration techniques due to the non-continuous nature of the supply. A thorough pinch analysis study of the industrial process, taking into account non-continuous operating rates, should be performed to evaluate the utility demand profile. Solar collector efficiency data under variable climatic conditions should also be collected for the specific site. A systematic method of combining this information leads to improved design and an optimal operating strategy. This approach has been applied to a New Zealand milk powder plant and benefits of several integration strategies, including mass integration, are investigated. The appropriate placement of the solar heat is analogous to the placement of a hot utility source and an energy penalty will be incurred when the solar thermal system provides heat below the pinch temperature. (author)

  1. [Efficiency indicators to assess the organ donation and transplantation process: systematic review of the literature].

    Science.gov (United States)

    Siqueira, Marina Martins; Araujo, Claudia Affonso; de Aguiar Roza, Bartira; Schirmer, Janine

    2016-08-01

    To search the literature and identify indicators used to monitor and control the organ donation and transplantation process and to group these indicators into categories. In November 2014, a systematic review of the literature was carried out in the following databases: Biblioteca Virtual em Saúde (BVS), EBSCO, Emerald, Proquest, Science Direct, and Web of Science. The following search terms (and the corresponding terms in Brazilian Portuguese) were employed: "efficiency," "indicators," "organ donation," "tissue and organ procurement," and "organ transplantation." Of the 344 articles retrieved, 23 original articles published between 1992 and 2013 were selected and reviewed for analysis of efficiency indicators. The review revealed 117 efficiency indicators, which were grouped according to similarity of content and divided into three categories: 1) 71 indicators related to organ donation, covering mortality statistics, communication of brain death, clinical status of donors and exclusion of donors for medical reasons, attitude of families, confirmation of donations, and extraction of organs and tissues; 2) 22 indicators related to organ transplantation, covering the surgical procedure per se and post-transplantation follow-up; and 3) 24 indicators related to the demand for organs and the resources of hospitals involved in the process. Even though organ transplantation is a recent phenomenon, the high number of efficiency indicators described in the literature suggests that scholars interested in this field have been searching for ways to measure performance. However, there is little standardization of the indicators used. Also, most indicators focus on the donation step, suggesting gaps in the measurement of efficiency at other points in the process. Additional indicators are needed to monitor important stages, such as organ distribution (for example, organ loss indicators) and post-transplantation aspects (for example, survival and quality of life).

  2. On the definition of exergy efficiencies for petroleum systems: Application to offshore oil and gas processing

    International Nuclear Information System (INIS)

    Nguyen, Tuong-Van; Voldsund, Mari; Elmegaard, Brian; Ertesvåg, Ivar Ståle; Kjelstrup, Signe

    2014-01-01

    Exergy-based efficiencies are measures of the thermodynamic perfection of systems and processes. A meaningful formulation of these performance criteria for petroleum systems is difficult because of (i) the high chemical exergy of hydrocarbons, (ii) the large variety of chemical components, and (iii) the differences in operating conditions between facilities. This work focuses on offshore processing plants, considering four oil platforms that differ in their working conditions and designs. Several approaches from the scientific literature for similar processes are presented and applied to the four cases. They showed a low sensitivity to performance improvements, gave inconsistent results, or favoured facilities operating under certain conditions. We suggest an alternative formulation, called the component-by-component exergy efficiency, which builds on the decomposition of the exergy flows at the level of the chemical compounds. It therefore allows for sound comparisons of separation systems, while successfully evaluating their theoretical improvement potentials. The platform displaying the lowest efficiency (1.7%) is characterised by little pumping and compression work, in contrast to the one displaying the highest performance (29.6%). A more realistic measure of the technical potential for improving these systems can be obtained by further splitting the exergy destruction into its avoidable and unavoidable parts. - Highlights: • Different exergy efficiency definitions for petroleum systems are reviewed. • These definitions are applied to four oil and gas platforms and are revealed to be inapplicable. • A new formulation, namely the component-by-component efficiency, is proposed. • The performance of the offshore platforms under study varies between 1.7% and 29.6%
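The sensitivity problem with conventional definitions (point (i) above) can be made concrete with the textbook input-output exergy efficiency (a standard relation, not the paper's new formulation):

```latex
\varepsilon_{\mathrm{IO}}
  = \frac{\dot{E}x_{\mathrm{out}}}{\dot{E}x_{\mathrm{in}}}
  = 1 - \frac{\dot{E}x_{\mathrm{destroyed}} + \dot{E}x_{\mathrm{losses}}}{\dot{E}x_{\mathrm{in}}}
```

Because the chemical exergy of the hydrocarbons dominates both the inlet and outlet flows and passes through the plant nearly unchanged, this ratio is close to 1 for any offshore facility regardless of how well it performs, which is what motivates decomposing the exergy flows compound by compound.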

  3. Photonic efficiency of the photodegradation of paracetamol in water by the photo-Fenton process.

    Science.gov (United States)

    Yamal-Turbay, E; Ortega, E; Conte, L O; Graells, M; Mansilla, H D; Alfano, O M; Pérez-Moya, M

    2015-01-01

    An experimental study of the homogeneous Fenton and photo-Fenton degradation of 4-acetamidophenol (paracetamol, PCT) is presented. For all the operating conditions evaluated, PCT degradation is efficiently attained by both the Fenton and photo-Fenton processes. Also, photonic efficiencies of PCT degradation and mineralization are determined under different experimental conditions, characterizing the influence of hydrogen peroxide (H2O2) and Fe(II) on both contaminant degradation and sample mineralization. The maximum photonic degradation efficiencies for 5 and 10 mg L(-1) Fe(II) were 3.9 (H2O2 = 189 mg L(-1)) and 5 (H2O2 = 378 mg L(-1)), respectively. At higher oxidant concentrations, H2O2 acts as a radical scavenger, competing with pollutant degradation for hydroxyl radicals and reducing the reaction rate. Moreover, in order to quantify the consumption of the oxidizing agent, the specific consumption of hydrogen peroxide was also evaluated. For all operating conditions of both hydrogen peroxide and Fe(II) concentration, the consumption values obtained for the Fenton process were always higher than the corresponding values observed for photo-Fenton. This implies a less efficient use of the oxidizing agent under dark conditions.

  4. Zinc oxide nanostructures and its nano-compounds for efficient visible light photo-catalytic processes

    Science.gov (United States)

    Adam, Rania E.; Alnoor, Hatim; Elhag, Sami; Nur, Omer; Willander, Magnus

    2017-02-01

    Zinc oxide (ZnO) in its nanostructured form is a promising material for visible light emission/absorption and utilization in different energy-efficient photocatalytic processes. We first present our recent results on the effect of varying the molar ratio of the synthesis nutrients on visible light emission. We then use the optimized conditions from the molar ratio experiments to vary synthesis processing parameters such as stirring time, and the effect of all these parameters on the efficiency and the emission spectrum is investigated using different complementary techniques. Cathodoluminescence (CL) is combined with photoluminescence (PL) and electroluminescence (EL) as the techniques to investigate and optimize visible light emission from ZnO/GaN light-emitting diodes. We then show and discuss our recent finding on the use of high-quality ZnO nanoparticles (NPs) for efficient photo-degradation of toxic dyes using the visible spectrum, namely wavelengths up to 800 nm. Finally, we show how ZnO nanorods (NRs) are used as a first template to be transferred to bismuth zinc vanadate (BiZn2VO6). The BiZn2VO6 is then used to demonstrate efficient and cost-effective hydrogen production through photoelectrochemical water splitting using solar radiation.

  5. 10 CFR 431.107 - Uniform test method for the measurement of energy efficiency of commercial heat pump water...

    Science.gov (United States)

    2010-01-01

    ... 10 Energy 3 2010-01-01 2010-01-01 false Uniform test method for the measurement of energy efficiency of commercial heat pump water heaters. [Reserved] 431.107 Section 431.107 Energy DEPARTMENT OF....107 Uniform test method for the measurement of energy efficiency of commercial heat pump water heaters...

  6. The Rapid Integration and Test Environment - A Process for Achieving Software Test Acceptance

    OpenAIRE

    Jack, Rick

    2010-01-01

    Proceedings Paper (for Acquisition Research Program) Approved for public release; distribution unlimited. The Rapid Integration and Test Environment (RITE) initiative, implemented by the Program Executive Office, Command, Control, Communications, Computers and Intelligence, Command and Control Program Office (PMW-150), was born of necessity. Existing processes for requirements definition and management, as well as those for software development, did not consistently deliver high-quality...

  7. ENERGY EFFICIENCY AS A CRITERION IN THE VEHICLE FLEET MANAGEMENT PROCESS

    Directory of Open Access Journals (Sweden)

    Davor Vujanović

    2010-01-01

    Full Text Available Transport represents an industry sector with intense energy consumption; the road transport sector is the dominant subsector within it. The objective of the research presented in this paper is to define activities which, when applied within road freight transport companies, contribute to enhancing vehicle energy efficiency. The effects of the vehicle fleet operation management process on reducing fuel consumption have been examined, and the operation parameters that influence vehicle fuel consumption were analysed. In this sense, a survey was carried out to evaluate the impact of the vehicle load factor on specific fuel consumption. Measures for enhancing vehicle logistics efficiency have been defined. As a tool for implementing those measures, an algorithm for vehicle fleet operation management was developed, which served as a basis for developing a dedicated software package for decision support in the vehicle dispatching process. A set of measures has been recommended and their effects on fuel savings were evaluated.

  8. The improving of the heat networks operating process under the conditions of the energy efficiency providing

    Directory of Open Access Journals (Sweden)

    Blinova Tatiana

    2016-01-01

    Full Text Available Among the priorities, it is important to highlight the modernization and improvement of the energy efficiency of housing and communal services, as well as the transition to the principle of using the most efficient technologies in reproduction (construction, creation of municipal infrastructure objects, and housing modernization). The main hypothesis of this study is that, under modern conditions, the realization of the most important priorities of state policy in the sphere of housing and communal services is possible when the most effective technologies for managing the reproduction of heat networks are used. It is possible to raise the level of information security of the heat distribution company and other market participants by improving business processes through the development of an organizational and economic mechanism under comprehensive monitoring of heat network operation processes.

  9. Advanced Thermoelectric Materials for Efficient Waste Heat Recovery in Process Industries

    Energy Technology Data Exchange (ETDEWEB)

    Adam Polcyn; Moe Khaleel

    2009-01-06

    The overall objective of the project was to integrate advanced thermoelectric materials into a power generation device that could convert waste heat from an industrial process to electricity with an efficiency approaching 20%. Advanced thermoelectric materials were developed with figure-of-merit ZT of 1.5 at 275 degrees C. These materials were not successfully integrated into a power generation device. However, waste heat recovery was demonstrated from an industrial process (the combustion exhaust gas stream of an oxyfuel-fired flat glass melting furnace) using a commercially available (5% efficiency) thermoelectric generator coupled to a heat pipe. It was concluded that significant improvements both in thermoelectric material figure-of-merit and in cost-effective methods for capturing heat would be required to make thermoelectric waste heat recovery viable for widespread industrial application.
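The scale of the challenge can be checked against the standard expression for the maximum efficiency of a thermoelectric generator with device-average figure of merit ZT (a textbook relation, not taken from the report; the 25 degrees C sink temperature is an assumption):

```python
from math import sqrt

def te_max_efficiency(zt, t_hot_k, t_cold_k):
    """Maximum TEG efficiency: Carnot factor times the ZT-dependent factor."""
    carnot = 1.0 - t_cold_k / t_hot_k
    m = sqrt(1.0 + zt)                      # sqrt(1 + ZT_average)
    return carnot * (m - 1.0) / (m + t_cold_k / t_hot_k)

# Hot side at the reported 275 C, sink assumed at 25 C
eta = te_max_efficiency(1.5, 275.0 + 273.15, 25.0 + 273.15)
```

This gives roughly 12-13%, consistent with the report's conclusion that a ZT of 1.5 alone is not sufficient to approach 20% conversion efficiency.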

  10. TDat: An Efficient Platform for Processing Petabyte-Scale Whole-Brain Volumetric Images.

    Science.gov (United States)

    Li, Yuxin; Gong, Hui; Yang, Xiaoquan; Yuan, Jing; Jiang, Tao; Li, Xiangning; Sun, Qingtao; Zhu, Dan; Wang, Zhenyu; Luo, Qingming; Li, Anan

    2017-01-01

    Three-dimensional imaging of whole mammalian brains at single-neuron resolution has generated terabyte (TB)- and even petabyte (PB)-sized datasets. Due to their size, processing these massive image datasets can be hindered by the computer hardware and software typically found in biological laboratories. To fill this gap, we have developed an efficient platform named TDat, which adopts a novel data reformatting strategy by reading cuboid data and employing parallel computing. In data reformatting, TDat is more efficient than any other software. In data accessing, we adopted parallelization to fully exploit the data transmission capability of computers. We applied TDat to rigid registration of large-volume data and to neuron tracing in whole-brain data at single-neuron resolution, which has never been demonstrated in other studies. We also showed its compatibility with various computing platforms, image processing software and imaging systems.
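The two ideas the abstract credits for TDat's speed, cuboid-wise reads and parallel processing, can be sketched generically (this is not TDat's actual API; the block sizes and the per-cuboid task are placeholders):

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor
from itertools import product

def cuboid_slices(shape, block):
    """Yield slice tuples tiling a 3-D volume with cuboids of size `block`."""
    for origin in product(*(range(0, s, b) for s, b in zip(shape, block))):
        yield tuple(slice(o, min(o + b, s))
                    for o, b, s in zip(origin, block, shape))

# A small stand-in for a TB-scale whole-brain volume
volume = np.random.default_rng(0).integers(0, 255, size=(64, 256, 256),
                                           dtype=np.uint16)

def process(sl):
    # Placeholder per-cuboid work (reformat, compress, write out, ...)
    return float(volume[sl].mean())

blocks = list(cuboid_slices(volume.shape, (32, 128, 128)))
with ThreadPoolExecutor(max_workers=4) as pool:
    means = list(pool.map(process, blocks))
```

Each cuboid is independent, so the same tiling works whether the workers are threads on one machine or processes across a cluster.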

  11. High efficiency processing for reduced amplitude zones detection in the HRECG signal

    Science.gov (United States)

    Dugarte, N.; Álvarez, A.; Balacco, J.; Mercado, G.; Gonzalez, A.; Dugarte, E.; Olivares, A.

    2016-04-01

    Summary - This article presents part of a broader research effort planned for the medium to long term, with the intention of establishing a new philosophy of surface electrocardiogram analysis. This research aims to find indicators of cardiovascular disease in its early stage that may go unnoticed with conventional electrocardiography. This paper reports the development of processing software which collects some existing techniques and incorporates novel methods for the detection of reduced amplitude zones (RAZ) in the high-resolution electrocardiographic (HRECG) signal. The algorithm consists of three stages: efficient QRS detection, an averaging filter using correlation techniques, and a RAZ detection step. Preliminary results show the efficiency of the system and point to the incorporation of new techniques using signal analysis involving the 12 leads.
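The first two stages, QRS detection and correlation-based beat averaging, can be illustrated on a synthetic record (a minimal sketch with an idealised Gaussian "QRS" and assumed parameters, not the authors' software):

```python
import numpy as np

fs = 1000                                  # Hz, assumed sampling rate
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)

# Synthetic single-lead record: one Gaussian "QRS" per second plus noise
ecg = sum(np.exp(-((t - c) ** 2) / (2 * 0.01 ** 2))
          for c in np.arange(0.5, 10, 1.0))
ecg = ecg + 0.01 * rng.normal(size=t.size)

# Stage 1: QRS detection from smoothed derivative energy + refractory period
energy = np.convolve(np.gradient(ecg) ** 2, np.ones(20) / 20, mode="same")
threshold = 5 * energy.mean()
peaks, last = [], -fs
for i in np.flatnonzero(energy > threshold):
    if i - last > int(0.3 * fs):           # 300 ms refractory period
        peaks.append(int(i))
        last = i

# Stage 2: correlation-aligned averaging to raise the SNR of the mean beat
w = 100
template = ecg[peaks[0] - w: peaks[0] + w]
beats = [template]
for p in peaks[1:]:
    segment = ecg[p - 2 * w: p + 2 * w]
    lag = int(np.argmax(np.correlate(segment, template, mode="valid")))
    beats.append(segment[lag: lag + 2 * w])
avg_beat = np.mean(beats, axis=0)
```

Averaging correlation-aligned beats suppresses uncorrelated noise, which is what makes microvolt-level features such as reduced amplitude zones detectable in the averaged beat.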

  12. HANFORD CONTAINERIZED CAST STONE FACILITY TASK 1 PROCESS TESTING & DEVELOPMENT FINAL TEST REPORT

    Energy Technology Data Exchange (ETDEWEB)

    LOCKREM, L L

    2005-07-13

    Laboratory testing and technical evaluation activities on Containerized Cast Stone (CCS) were conducted under the Scope of Work (SOW) contained in CH2M HILL Hanford Group, Inc. (CHG) Contract No. 18548 (CHG 2003a). This report presents the results of testing and demonstration activities discussed in SOW Section 3.1, Task 1, "Process Development Testing", and described in greater detail in the "Containerized Grout - Phase I Testing and Demonstration Plan" (CHG, 2003b). CHG (2003b) divided the CCS testing and evaluation activities into six categories, as follows: (1) A short set of tests with simulant to select a preferred dry reagent formulation (DRF), determine allowable liquid addition levels, and confirm the Part 2 test matrix. (2) Waste form performance testing on cast stone made from the preferred DRF and a backup DRF, as selected in Part 1, and using low activity waste (LAW) simulant. (3) Waste form performance testing on cast stone made from the preferred DRF using radioactive LAW. (4) Waste form validation testing on a selected nominal cast stone formulation using the preferred DRF and LAW simulant. (5) Engineering evaluations of explosive/toxic gas evolution, including hydrogen, from the cast stone product. (6) Technetium "getter" testing with cast stone made with LAW simulant and with radioactive LAW. In addition, nitrate leaching observations were drawn from nitrate leachability data obtained in the course of the Parts 2 and 3 waste form performance testing. The nitrate leachability index results are presented along with other data from the applicable activity categories.

  13. Automation and efficiency in the operational processes: a case study in a logistics operator

    OpenAIRE

    Nascimento, Dener Gomes do; Silva, Giovanni Henrique da

    2017-01-01

    Globalization has made automation increasingly feasible, and with technological development many operations can be optimized, bringing productivity gains. Logistics is a major beneficiary of all this development, because it is living through an extremely competitive time, in which being efficient is a requirement for staying alive in the market. In this context, this article seeks, through analysis of the processes in a distribution center, to identify opportunities to automate operations to gain...

  14. Automatic Scaling Hadoop in the Cloud for Efficient Process of Big Geospatial Data

    Directory of Open Access Journals (Sweden)

    Zhenlong Li

    2016-09-01

    Full Text Available Efficient processing of big geospatial data is crucial for tackling global and regional challenges such as climate change and natural disasters, but it is challenging not only due to the massive data volume but also due to the intrinsic complexity and high dimensions of the geospatial datasets. While traditional computing infrastructure does not scale well with the rapidly increasing data volume, Hadoop has attracted increasing attention in geoscience communities for handling big geospatial data. Recently, many studies were carried out to investigate adopting Hadoop for processing big geospatial data, but how to adjust the computing resources to efficiently handle the dynamic geoprocessing workload was barely explored. To bridge this gap, we propose a novel framework to automatically scale the Hadoop cluster in the cloud environment to allocate the right amount of computing resources based on the dynamic geoprocessing workload. The framework and auto-scaling algorithms are introduced, and a prototype system was developed to demonstrate the feasibility and efficiency of the proposed scaling mechanism using Digital Elevation Model (DEM) interpolation as an example. Experimental results show that this auto-scaling framework could (1) significantly reduce the computing resource utilization (by 80% in our example) while delivering similar performance as a full-powered cluster; and (2) effectively handle the spike processing workload by automatically increasing the computing resources to ensure the processing is finished within an acceptable time. Such an auto-scaling approach provides a valuable reference to optimize the performance of geospatial applications to address data- and computational-intensity challenges in GIScience in a more cost-efficient manner.
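The scaling decision at the core of such a framework can be illustrated with a simple threshold rule: grow the cluster when it is near saturation, shrink it when mostly idle, and clamp the result to configured bounds. This is a generic sketch, not the authors' algorithm; the thresholds, per-node capacity, and node limits below are hypothetical:

```python
def target_cluster_size(pending_tasks, tasks_per_node, current_nodes,
                        min_nodes=2, max_nodes=20,
                        scale_up_threshold=0.9, scale_down_threshold=0.3):
    """Threshold-based auto-scaling decision (illustrative only):
    return the node count the cluster should be resized to."""
    capacity = current_nodes * tasks_per_node
    utilization = pending_tasks / capacity if capacity else 1.0
    if utilization > scale_up_threshold:
        # grow: enough nodes to absorb the pending work, at least one more node
        needed = -(-pending_tasks // tasks_per_node)  # ceiling division
        return min(max(needed, current_nodes + 1), max_nodes)
    if utilization < scale_down_threshold:
        # shrink: release idle nodes, but never below the configured floor
        needed = max(-(-pending_tasks // tasks_per_node), min_nodes)
        return max(min(needed, current_nodes - 1), min_nodes)
    return current_nodes  # within the dead band: leave the cluster alone

print(target_cluster_size(pending_tasks=95, tasks_per_node=10, current_nodes=10))  # spike -> 11
print(target_cluster_size(pending_tasks=10, tasks_per_node=10, current_nodes=10))  # idle -> 2
```

The dead band between the two thresholds prevents the cluster from oscillating when the workload hovers near a single trigger point, which matters because cloud node start-up is slow relative to task scheduling.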

  15. Efficient simulation of press hardening process through integrated structural and CFD analyses

    International Nuclear Information System (INIS)

    Palaniswamy, Hariharasudhan; Mondalek, Pamela; Wronski, Maciek; Roy, Subir

    2013-01-01

    Press hardened steel parts are being increasingly used in automotive structures for their higher strength to meet safety standards while reducing vehicle weight to improve fuel consumption. However, manufacturing of sheet metal parts by the press hardening process to achieve desired properties is extremely challenging, as it involves complex interaction of plastic deformation, metallurgical change, thermal distribution, and fluid flow. Numerical simulation is critical for successful design of the process and for understanding the interaction among the numerous process parameters, in order to control the press hardening process and consistently achieve desired part properties. Until now there has been no integrated commercial software solution that can efficiently model the complete process: forming of the blank, heat transfer between the blank and tool, microstructure evolution in the blank, and heat loss from the tool to the fluid that flows through water channels in the tools. In this study, a numerical solution based on the Altair HyperWorks® product suite, involving RADIOSS®, a non-linear finite element based structural analysis solver, and AcuSolve®, an incompressible fluid flow solver based on the Galerkin Least Squares Finite Element Method, has been utilized to develop an efficient solution for complete press hardening process design and analysis. RADIOSS is used to handle the plastic deformation, heat transfer between the blank and tool, and microstructure evolution in the blank during cooling, while AcuSolve is used to efficiently model heat loss from the tool to the fluid that flows through the water channels in the tools. The approach is demonstrated through some case studies.
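The blank-to-tool heat transfer that the coupled solvers must capture can be illustrated with a lumped-capacitance model: the hot blank loses heat to the cooled tool at a rate proportional to the contact conductance and the temperature difference. This is a deliberately simplified sketch, not the RADIOSS/AcuSolve formulation, and all material and contact parameters below are hypothetical:

```python
def blank_cooling(T0, T_tool, h, area, mass, cp, dt, steps):
    """Lumped-capacitance cooling of a hot blank held against a cooled tool:
    m * cp * dT/dt = -h * A * (T - T_tool), integrated with explicit Euler."""
    T = T0
    history = [T]
    for _ in range(steps):
        T += -h * area * (T - T_tool) / (mass * cp) * dt
        history.append(T)
    return history

# Hypothetical values: 900 C austenitized blank, 50 C water-cooled tool,
# contact conductance h in W/(m^2 K), 5 s of contact in 0.05 s steps
temps = blank_cooling(T0=900.0, T_tool=50.0, h=3000.0, area=0.1,
                      mass=2.0, cp=650.0, dt=0.05, steps=100)
print(round(temps[-1], 1))
```

The exponential decay toward the tool temperature is what determines whether the blank crosses the critical cooling rate for martensite formation, which is why the water-channel heat removal modeled by the CFD solver feeds back into the metallurgical result.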

  16. Testing the efficiency of rover science protocols for robotic sample selection: A GeoHeuristic Operational Strategies Test

    Science.gov (United States)

    Yingst, R. A.; Bartley, J. K.; Chidsey, T. C.; Cohen, B. A.; Gilleaudeau, G. J.; Hynek, B. M.; Kah, L. C.; Minitti, M. E.; Williams, R. M. E.; Black, S.; Gemperline, J.; Schaufler, R.; Thomas, R. J.

    2018-05-01

    The GHOST field tests are designed to isolate and test science-driven rover operations protocols, to determine best practices. During a recent field test at a potential Mars 2020 landing site analog, we tested two Mars Science Laboratory data-acquisition and decision-making methods to assess resulting science return and sample quality: a linear method, where sites of interest are studied in the order encountered, and a "walkabout-first" method, where sites of interest are examined remotely before down-selecting to a subset of sites that are interrogated with more resource-intensive instruments. The walkabout method cost less time and fewer resources, while increasing confidence in interpretations. Contextual data critical to evaluating site geology were acquired earlier than with the linear method, and given a higher priority, which resulted in development of more mature hypotheses earlier in the analysis process. Combined, this saved time and energy in the collection of data with more limited spatial coverage. Based on these results, we suggest that the walkabout method be used where doing so would provide early context and time for the science team to develop hypotheses and critical tests, and that in gathering context, coverage may be more important than higher resolution.

  17. Operating Room Efficiency before and after Entrance in a Benchmarking Program for Surgical Process Data.

    Science.gov (United States)

    Pedron, Sara; Winter, Vera; Oppel, Eva-Maria; Bialas, Enno

    2017-08-23

    Operating room (OR) efficiency continues to be a high priority for hospitals. In this context the concept of benchmarking has gained increasing importance as a means to improve OR performance. The aim of this study was to investigate whether and how participation in a benchmarking and reporting program for surgical process data was associated with a change in OR efficiency, measured through raw utilization, turnover times, and first-case tardiness. The main analysis is based on panel data from 202 surgical departments in German hospitals, which were derived from the largest database for surgical process data in Germany. Panel regression modelling was applied. Results revealed no clear and unequivocal effect of participation in a benchmarking and reporting program for surgical process data. The largest effect was observed for first-case tardiness. Contrary to expectations, turnover times showed a generally increasing trend during participation. For raw utilization no clear and statistically significant trend was evident. Subgroup analyses revealed differences in effects across hospital types and department specialties. Participation in a benchmarking and reporting program, and thus the availability of reliable, timely, and detailed analysis tools to support OR management, seemed to be correlated especially with an improvement in the timeliness of staff members regarding first-case starts. The increasing trend in turnover time revealed the absence of effective strategies to improve this aspect of OR efficiency in German hospitals and could have meaningful consequences for medium- and long-run capacity planning in the OR.
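The panel regression used in such a study can be sketched with the standard within (fixed-effects) estimator: demean the outcome and the regressors inside each department, then run pooled OLS on the deviations, so that stable department-level differences cannot bias the slope. The toy data below are illustrative only, not the authors' specification or data:

```python
import numpy as np

def fixed_effects_ols(y, X, groups):
    """Within-estimator for a panel: subtract each group's mean from y and X
    (removing time-invariant group effects), then solve OLS on the residuals."""
    y = np.asarray(y, float)
    X = np.asarray(X, float)
    groups = np.asarray(groups)
    yd, Xd = y.copy(), X.copy()
    for g in np.unique(groups):
        m = groups == g
        yd[m] -= y[m].mean()
        Xd[m] -= X[m].mean(axis=0)
    beta, *_ = np.linalg.lstsq(Xd, yd, rcond=None)
    return beta

# Toy panel: two departments with different baselines but a common slope of 2.0
groups = np.array([0, 0, 0, 1, 1, 1])
X = np.array([[1.0], [2.0], [3.0], [1.0], [2.0], [3.0]])
y = np.array([10.0, 12.0, 14.0, 20.0, 22.0, 24.0])  # y = baseline + 2 * x
print(fixed_effects_ols(y, X, groups))  # slope recovered, approx. 2.0
```

A pooled OLS on the raw data would mix the between-department baseline differences into the slope; the within transformation is what lets the panel design isolate the change associated with program participation.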

  18. Efficient point cloud data processing in shipbuilding: Reformative component extraction method and registration method

    Directory of Open Access Journals (Sweden)

    Jingyu Sun

    2014-07-01

    Full Text Available To survive in the current shipbuilding industry, it is of vital importance for shipyards to have the ship components’ accuracy evaluated efficiently during most of the manufacturing steps. Evaluating components’ accuracy by comparing each component’s point cloud data, scanned by laser scanners, against the ship’s design data in CAD format cannot be done efficiently when (1) the components extracted from the point cloud data contain irregular obstacles, or (2) the registration of the two data sets has no clear initial direction. This paper presents reformative point cloud data processing methods to solve these problems. K-d tree construction over the point cloud data speeds up neighbor searching for each point. A region growing method performed on the neighbors of a seed point extracts the continuous part of the component, while curved surface fitting and B-spline curve fitting at the edge of the continuous part recognize neighboring domains of the same component that are divided by obstacles’ shadows. The ICP (Iterative Closest Point) algorithm conducts a registration of the two data sets after a proper registration direction is decided by principal component analysis. In experiments conducted at the shipyard, 200 curved shell plates were extracted from the scanned point cloud data, and registrations were conducted between them and the designed CAD data using the proposed methods for accuracy evaluation. Results show that the proposed methods support accuracy-evaluation-targeted point cloud data processing efficiently in practice.
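The registration step can be sketched as a single ICP iteration: match each source point to its nearest target point (brute force below, where the paper uses a k-d tree for speed), then solve for the best rigid rotation and translation with the SVD-based Kabsch method. This is a minimal illustration under those assumptions, not the authors' implementation:

```python
import numpy as np

def icp_iteration(source, target):
    """One ICP step: nearest-neighbor matching followed by the optimal
    rigid transform (Kabsch algorithm); returns the transformed source."""
    # Brute-force nearest neighbors; a k-d tree replaces this at scale
    d = np.linalg.norm(source[:, None, :] - target[None, :, :], axis=2)
    matched = target[d.argmin(axis=1)]
    # Best rotation between centered point sets via SVD
    src_c, tgt_c = source.mean(axis=0), matched.mean(axis=0)
    H = (source - src_c).T @ (matched - tgt_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:  # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = tgt_c - R @ src_c
    return source @ R.T + t

# Toy check: target is the source shifted by (0.2, 0); the offset is small
# relative to point spacing, so nearest neighbors match correctly and one
# iteration recovers the alignment exactly
src = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
tgt = src + np.array([0.2, 0.0])
aligned = icp_iteration(src, tgt)
print(np.allclose(aligned, tgt))  # -> True
```

In practice ICP is iterated until the alignment error stops decreasing, and, as the abstract notes, a PCA-derived initial orientation keeps the nearest-neighbor matching from locking onto a wrong local minimum.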

  19. The analysis of the process in the cooling tower with the low efficiency

    Science.gov (United States)

    Badriev, A. I.; Sharifullin, V. N.

    2017-11-01

    At thermal power plants, maintaining a temperature drop of 11-12 degrees in the cooling towers is quite a difficult task, yet it is required to ensure the necessary depth of vacuum in the condenser. When the apparatus operates at low efficiency, this requirement can only be met by reducing the hydraulic load. This work therefore set out to analyze the process in such an apparatus and to identify the causes of its poor performance. One possible cause is the heterogeneity of the process across the volume of the apparatus, so the distributions of irrigation water and air flow over the cross-section of industrial cooling towers were investigated experimentally. A significantly uneven distribution of water and air flows across the volume of the apparatus was found. It is shown theoretically that the uneven distribution of irrigation leads to a significant decrease in evaporation efficiency in the cooling tower. The velocity distributions of the air, both across the tower sections and within each section, are of particular interest. The experimental data made it possible to establish an internal relationship: the effect of the distribution of irrigation density over the sections of the apparatus on the distributions of temperature change and air velocity. These results allowed a methodology to be formulated for diagnosing process problems and for developing measures to increase the efficiency of the cooling tower.
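The depth of cooling discussed above is commonly summarized as a thermal effectiveness: the achieved cooling range divided by the maximum range available, i.e. cooling all the way down to the ambient wet-bulb temperature. This is the standard textbook definition rather than anything specific to this paper, and the temperatures below are hypothetical:

```python
def cooling_effectiveness(t_water_in, t_water_out, t_wet_bulb):
    """Cooling tower thermal effectiveness:
    achieved range / maximum possible range (down to the wet-bulb limit)."""
    cooling_range = t_water_in - t_water_out   # what the tower achieved
    max_range = t_water_in - t_wet_bulb        # thermodynamic limit
    return cooling_range / max_range

# Hypothetical conditions: 40 C inlet water, 28 C outlet, 21 C wet-bulb
print(round(cooling_effectiveness(40.0, 28.0, 21.0), 3))  # -> 0.632
```

Evaluating this quantity section by section, with the measured local irrigation density and air velocity, is one way to expose the maldistribution the paper describes: sections that are over-irrigated or starved of air will show markedly lower effectiveness than the tower average.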

  20. System and process for efficient separation of biocrudes and water in a hydrothermal liquefaction system

    Science.gov (United States)

    Elliott, Douglas C.; Hart, Todd R.; Neuenschwander, Gary G.; Oyler, James R.; Rotness, Jr, Leslie J.; Schmidt, Andrew J.; Zacher, Alan H.

    2016-08-02

    A system and process are described for clean separation of biocrudes and water by-products from hydrothermal liquefaction (HTL) product mixtures of organic and biomass-containing feedstocks at elevated temperatures and pressures. Inorganic compound solids are removed prior to separation of the biocrude and water by-product fractions, to minimize formation of emulsions that impede separation. Separation may be performed at higher temperatures, which reduces heat loss and the need to cool product mixtures to ambient temperature. The present invention thus achieves separation efficiencies not achieved in conventional HTL processing.