WorldWideScience

Sample records for performance analysis based

  1. Structured Performance Analysis for Component Based Systems

    OpenAIRE

    Salmi, N.; Moreaux, Patrice; Ioualalen, M.

    2012-01-01

    The Component Based System (CBS) paradigm is now largely used to design software systems. In addition, performance and behavioural analysis remains a required step for the design and the construction of efficient systems. This is especially the case for CBS, which involve interconnected components running concurrent processes. This paper proposes a compositional method for modeling and structured performance analysis of CBS. Modeling is based on Stochastic Well-formed...

  2. Web-based turbine cycle performance analysis for nuclear power plants

    International Nuclear Information System (INIS)

    Heo, Gyun Young; Lee, Sung Jin; Chang, Soon Heung; Choi, Seong Soo

    2000-01-01

    As an approach to improving the economic efficiency of operating nuclear power plants, a thermal performance analysis tool for the steam turbine cycle has been developed. For the validation and prediction of the signals used in thermal performance analysis, a few statistical signal processing techniques are integrated. The developed tool provides a predicted-performance calculation capability, namely steady-state wet steam turbine cycle simulation, and a measured-performance calculation capability, which determines component- and cycle-level performance indexes. A web-based interface to all performance analyses is implemented, so even remote users can carry out performance analysis. Compared to ASME PTC 6 (Performance Test Code 6), the focus of the developed tool is historical performance analysis rather than a single accurate performance test. The proposed signal processing techniques are validated using actual plant signals, and the turbine cycle models are tested by benchmarking against a commercial thermal analysis tool.

  3. Cash-Flow Analysis Base of the Company's Performance Evaluation

    OpenAIRE

    Radu Riana Iren; Mihalcea Lucean; Negoescu Gheorghe

    2013-01-01

    Analyses based on the study of financial flows allow a coherent assessment of the firm's financial equilibrium and performance. Static analysis assesses the financial imbalance at a given point in time but does not explain its evolution; in contrast, dynamic analysis highlights the evolution of the financial imbalance but does not indicate its extent. It follows that the two kinds of analysis are complementary and should be pursued simultaneously. Dynamic analysis is based on the concept of st...

  4. Distance Based Root Cause Analysis and Change Impact Analysis of Performance Regressions

    Directory of Open Access Journals (Sweden)

    Junzan Zhou

    2015-01-01

    Performance regression testing is applied to uncover both performance and functional problems of software releases. A performance problem revealed by performance testing can manifest as high response time, low throughput, or even loss of service. A mature performance testing process helps systematically detect software performance problems. However, it is difficult to identify the root cause and evaluate the potential change impact. In this paper, we present an approach leveraging server-side logs for identifying root causes of performance problems. First, server-side logs are used to recover the call tree of each business transaction. We define a novel distance-based metric computed from call trees for root cause analysis and apply an inverted index from methods to business transactions for change impact analysis. Empirical studies show that our approach can effectively and efficiently help developers diagnose the root cause of performance problems.
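
    A minimal sketch of the change-impact side of the approach described in this record, assuming call trees have already been recovered from server-side logs; the transaction and method names below are hypothetical illustrations, not the paper's data or implementation.

    from collections import defaultdict

    # Hypothetical call trees recovered from server-side logs: each business
    # transaction maps to the set of methods appearing on its call tree.
    call_trees = {
        "placeOrder":  {"OrderService.create", "Inventory.reserve", "Db.write"},
        "queryOrder":  {"OrderService.find", "Db.read"},
        "cancelOrder": {"OrderService.cancel", "Inventory.release", "Db.write"},
    }

    # Inverted index from method -> business transactions that call it.
    inverted_index = defaultdict(set)
    for txn, methods in call_trees.items():
        for m in methods:
            inverted_index[m].add(txn)

    def impacted_transactions(changed_methods):
        """Change impact analysis: transactions whose call trees contain a changed method."""
        impacted = set()
        for m in changed_methods:
            impacted |= inverted_index.get(m, set())
        return impacted

    print(impacted_transactions({"Db.write"}))  # {'placeOrder', 'cancelOrder'}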

  5. A Knowledge-based Environment for Software Process Performance Analysis

    Directory of Open Access Journals (Sweden)

    Natália Chaves Lessa Schots

    2015-08-01

    Background: Process performance analysis is a key step for implementing continuous improvement in software organizations. However, the knowledge needed to execute such analysis is not trivial, and the person responsible for executing it must be provided with appropriate support. Aim: This paper presents a knowledge-based environment, named SPEAKER, proposed for supporting software organizations during the execution of process performance analysis. SPEAKER comprises a body of knowledge and a set of activities and tasks for software process performance analysis, along with supporting tools for executing these activities and tasks. Method: We conducted an informal literature review and a systematic mapping study, which provided the basic requirements for the proposed environment. We implemented the SPEAKER environment, integrating supporting tools for the execution of the activities and tasks of performance analysis and the knowledge necessary to execute them, in order to accommodate the variability presented by the characteristics of these activities. Results: In this paper, we describe each SPEAKER module and the individual evaluations of these modules, and also present an example of use showing how the environment can guide the user through a specific performance analysis activity. Conclusion: Although we only conducted individual evaluations of SPEAKER's modules, the example of use indicates the feasibility of the proposed environment. Therefore, the environment as a whole will be further evaluated to verify whether it attains its goal of assisting in the execution of process performance analysis by non-specialists.

  6. Estimating Driving Performance Based on EEG Spectrum Analysis

    Directory of Open Access Journals (Sweden)

    Jung Tzyy-Ping

    2005-01-01

    The growing number of traffic accidents in recent years has become a serious concern to society. Accidents caused by the driver's drowsiness behind the steering wheel have a high fatality rate because of the marked decline in the driver's perception, recognition, and vehicle control abilities while sleepy. Preventing such accidents is highly desirable but requires techniques for continuously detecting, estimating, and predicting the level of alertness of drivers and delivering effective feedback to maintain their maximum performance. This paper proposes an EEG-based drowsiness estimation system that combines electroencephalogram (EEG) log sub-band power spectra, correlation analysis, principal component analysis, and linear regression models to indirectly estimate the driver's drowsiness level in a virtual-reality-based driving simulator. Our results demonstrate that it is feasible to accurately and quantitatively estimate driving performance, expressed as the deviation between the center of the vehicle and the center of the cruising lane, in a realistic driving simulator.
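
    A hedged sketch of the estimation pipeline named in this record (log sub-band power spectrum, correlation analysis, PCA, linear regression), using randomly generated stand-in data rather than real EEG recordings or the authors' feature definitions.

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(0)

    # Hypothetical inputs: log sub-band power spectra for 30 EEG epochs
    # (spectral bins over a few channels, flattened) and the concurrent
    # lane-deviation measure from the driving simulator.
    log_band_power = rng.normal(size=(30, 40))       # epochs x spectral features
    lane_deviation = log_band_power[:, :5].sum(axis=1) + rng.normal(scale=0.5, size=30)

    # Correlation analysis: keep the spectral features most correlated with deviation.
    corr = np.array([np.corrcoef(log_band_power[:, j], lane_deviation)[0, 1]
                     for j in range(log_band_power.shape[1])])
    selected = np.argsort(-np.abs(corr))[:10]

    # PCA decorrelates the selected features; linear regression then estimates
    # driving performance (lane deviation) from the projected spectrum.
    pca = PCA(n_components=3)
    scores = pca.fit_transform(log_band_power[:, selected])
    model = LinearRegression().fit(scores, lane_deviation)
    print("R^2 on training epochs:", model.score(scores, lane_deviation))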

  7. Orthogonal Analysis Based Performance Optimization for Vertical Axis Wind Turbine

    Directory of Open Access Journals (Sweden)

    Lei Song

    2016-01-01

    The geometrical shape of a vertical axis wind turbine (VAWT) is composed of multiple structural parameters. Since there are interactions among the structural parameters, traditional research approaches, which usually focus on one parameter at a time, cannot assess the performance of the wind turbine accurately. In order to exploit the overall effect of a novel VAWT, we first use a single-parameter optimization method to obtain optimal values of the structural parameters, respectively, by the Computational Fluid Dynamics (CFD) method; based on the results, we then use an orthogonal analysis method to investigate the influence of interactions of the structural parameters on the performance of the wind turbine and to obtain an optimal combination of the structural parameters considering the interactions. Results of the analysis of variance indicate that interactions among the structural parameters influence the performance of the wind turbine, and the optimization results based on orthogonal analysis achieve higher wind energy utilization than those of traditional research approaches.

  8. Analysis of swimming performance: perceptions and practices of US-based swimming coaches.

    Science.gov (United States)

    Mooney, Robert; Corley, Gavin; Godfrey, Alan; Osborough, Conor; Newell, John; Quinlan, Leo Richard; ÓLaighin, Gearóid

    2016-01-01

    In elite swimming, a broad range of methods are used to assess performance, inform coaching practices and monitor athletic progression. The aim of this paper was to examine the performance analysis practices of swimming coaches and to explore the reasons behind the decisions that coaches take when analysing performance. Survey data were analysed from 298 Level 3 competitive swimming coaches (245 male, 53 female) based in the United States. Results were compiled to provide a generalised picture of practices and perceptions and to examine key emerging themes. It was found that a disparity exists between the importance swim coaches place on biomechanical analysis of swimming performance and the types of analyses that are actually conducted. Video-based methods are most frequently employed, with over 70% of coaches using these methods at least monthly; the analyses are mainly qualitative in nature rather than quantitative. Barriers to the more widespread use of quantitative biomechanical analysis in elite swimming environments were explored. Constraints include time, cost and availability of resources, but other factors such as sources of information on swimming performance and analysis and control over service provision are also discussed, with particular emphasis on video-based methods and emerging sensor-based technologies.

  9. Performance-based training: from job and task analysis to training materials

    International Nuclear Information System (INIS)

    Davis, L.T.; Spinney, R.W.

    1983-01-01

    Historically, a smoke-filled-room approach has been used to revise training programs: instructors would sit down and design a program based on existing training materials and any federal requirements that applied. This failure to reflect a systematic definition of required job functions, responsibilities and performance standards in training programs has resulted in generic program deficiencies: they do not provide complete training of required skills and knowledge. Recognition of this need for change, coupled with a decrease in experienced industry personnel inputs and long training pipelines, has heightened the need for efficient performance-based training programs which are derived from and referenced to job performance criteria. This paper presents the process for developing performance-based training materials based on job and task analysis products.

  10. Performance monitoring and analysis of task-based OpenMP.

    Directory of Open Access Journals (Sweden)

    Yi Ding

    OpenMP, a typical shared memory programming paradigm, has been extensively applied in the high performance computing community due to the popularity of multicore architectures in recent years. The most significant feature of the OpenMP 3.0 specification is the introduction of the task constructs to express parallelism at a much finer level of detail. This feature, however, has posed new challenges for performance monitoring and analysis. In particular, task creation is separated from its execution, causing traditional monitoring methods to be ineffective. This paper presents a mechanism to monitor task-based OpenMP programs with interposition and proposes two demonstration graphs for performance analysis as well. The results of two experiments are discussed to evaluate the overhead of the monitoring mechanism and to verify the effects of the demonstration graphs using the BOTS benchmarks.

  11. Performance Evaluation of Hadoop-based Large-scale Network Traffic Analysis Cluster

    Directory of Open Access Journals (Sweden)

    Tao Ran

    2016-01-01

    As Hadoop has gained popularity in the big data era, it is widely used in various fields. A self-designed and self-developed large-scale network traffic analysis cluster works well based on Hadoop, with off-line applications running on it to analyze the massive network traffic data. In order to evaluate the performance of the analysis cluster scientifically and reasonably, we propose a performance evaluation system. Firstly, we set the execution times of three benchmark applications as the benchmark of the performance and pick 40 metrics of customized statistical resource data. Then we identify the relationship between the resource data and the execution times by a statistical modeling approach composed of principal component analysis and multiple linear regression. After training the models with historical data, we can predict the execution times from current resource data. Finally, we evaluate the performance of the analysis cluster by the validated prediction of execution times. Experimental results show that the execution times predicted by the trained models are within an acceptable error range, and the evaluation results of performance are accurate and reliable.
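
    A hedged sketch of the statistical modeling step described here (principal component analysis followed by multiple linear regression to predict benchmark execution times from resource metrics); the data below are synthetic stand-ins, not the cluster's actual 40 metrics.

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LinearRegression
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(1)

    # Hypothetical historical data: 40 customized resource metrics per run
    # (CPU, memory, disk, network counters) and the benchmark execution time.
    resource_metrics = rng.normal(size=(200, 40))
    exec_time = (300 + resource_metrics[:, :8] @ rng.uniform(5, 20, size=8)
                 + rng.normal(scale=10, size=200))

    # PCA reduces the 40 correlated metrics to a few components;
    # multiple linear regression then predicts execution time.
    model = make_pipeline(PCA(n_components=5), LinearRegression())
    model.fit(resource_metrics[:150], exec_time[:150])       # train on history

    pred = model.predict(resource_metrics[150:])             # predict for current runs
    err = np.abs(pred - exec_time[150:]) / exec_time[150:]
    print("mean relative prediction error:", err.mean())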

  12. Asymptotic performance of regularized quadratic discriminant analysis based classifiers

    KAUST Repository

    Elkhalil, Khalil

    2017-12-13

    This paper carries out a large dimensional analysis of the standard regularized quadratic discriminant analysis (QDA) classifier designed on the assumption that data arise from a Gaussian mixture model. The analysis relies on fundamental results from random matrix theory (RMT) when both the number of features and the cardinality of the training data within each class grow large at the same pace. Under some mild assumptions, we show that the asymptotic classification error converges to a deterministic quantity that depends only on the covariances and means associated with each class as well as the problem dimensions. Such a result permits a better understanding of the performance of regularized QDA and can be used to determine the optimal regularization parameter that minimizes the misclassification error probability. Despite being valid only for Gaussian data, our theoretical findings are shown to yield high accuracy in predicting the performance achieved with real data sets drawn from popular databases, thereby making an interesting connection between theory and practice.
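
    An illustrative sketch, not the paper's derivation: an empirical check of how the misclassification error of regularized QDA varies with the regularization parameter on synthetic Gaussian mixture data, using scikit-learn's reg_param covariance shrinkage as the regularizer.

    import numpy as np
    from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis

    rng = np.random.default_rng(2)
    p, n = 50, 100     # feature dimension and per-class training size, grown at the same pace

    # Two-class Gaussian mixture with different means and covariances.
    mean0, mean1 = np.zeros(p), 0.3 * np.ones(p)
    cov0 = np.eye(p)
    cov1 = np.eye(p) + 0.1 * np.ones((p, p))

    X_train = np.vstack([rng.multivariate_normal(mean0, cov0, n),
                         rng.multivariate_normal(mean1, cov1, n)])
    y_train = np.r_[np.zeros(n), np.ones(n)]
    X_test = np.vstack([rng.multivariate_normal(mean0, cov0, 2000),
                        rng.multivariate_normal(mean1, cov1, 2000)])
    y_test = np.r_[np.zeros(2000), np.ones(2000)]

    # Sweep the regularization parameter and record the empirical misclassification
    # rate; the paper's asymptotic result predicts such a curve deterministically.
    for reg in (0.0, 0.1, 0.5, 0.9):
        qda = QuadraticDiscriminantAnalysis(reg_param=reg).fit(X_train, y_train)
        print(f"reg_param={reg}: error = {1 - qda.score(X_test, y_test):.3f}")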

  13. Thermal Power Plant Performance Analysis

    CERN Document Server

    2012-01-01

    The analysis of the reliability and availability of power plants is frequently based on simple indexes that do not take into account the criticality of some failures used for availability analysis. This criticality should be evaluated based on concepts of reliability which consider the effect of a component failure on the performance of the entire plant. System reliability analysis tools provide a root-cause analysis leading to the improvement of the plant maintenance plan. Taking the view that power plant performance can be evaluated not only on the basis of thermodynamics-related indexes such as heat rate, Thermal Power Plant Performance Analysis focuses on the presentation of reliability-based tools used to define the performance of complex systems and introduces the basic concepts of reliability, maintainability and risk analysis, aiming at their application as tools for power plant performance improvement, including the selection of critical equipment and components, the defini...

  14. Residents' surgical performance during the laboratory years: an analysis of rule-based errors.

    Science.gov (United States)

    Nathwani, Jay N; Wise, Brett J; Garren, Margaret E; Mohamadipanah, Hossein; Van Beek, Nicole; DiMarco, Shannon M; Pugh, Carla M

    2017-11-01

    Nearly one-third of surgical residents will enter into academic development during their surgical residency by dedicating time to a research fellowship for 1-3 y. Major interest lies in understanding how laboratory residents' surgical skills are affected by minimal clinical exposure during academic development. A widely held concern is that the time away from clinical exposure results in surgical skills decay. This study examines the impact of the academic development years on residents' operative performance. We hypothesize that the use of repeated, annual assessments may result in learning even without individual feedback on participants' simulated performance. Surgical performance data were collected from laboratory residents (postgraduate years 2-5) during the summers of 2014, 2015, and 2016. Residents had 15 min to complete a shortened, simulated laparoscopic ventral hernia repair procedure. Final hernia repair skins from all participants were scored using a previously validated checklist. An analysis of variance test compared the mean performance scores of repeat participants to those of first-time participants. Twenty-seven (37% female) laboratory residents provided 2-year assessment data over the 3-year span of the study. Second-time performance revealed improvement from a mean score of 14 (standard error = 1.0) in the first year to 17.2 (SD = 0.9) in the second year (F[1, 52] = 5.6, P = 0.022). Detailed analysis demonstrated improvement in performance for 3 grading criteria that were considered to be rule-based errors. There was no improvement in operative strategy errors. Analysis of the longitudinal performance of laboratory residents shows higher scores for repeat participants in the category of rule-based errors. These findings suggest that laboratory residents can learn from rule-based mistakes when provided with annual performance-based assessments. This benefit was not seen with operative strategy errors and has important implications for
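
    A small illustrative sketch of the kind of analysis-of-variance comparison reported above, using simulated checklist scores (the study's actual data are not reproduced here).

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)

    # Hypothetical checklist scores: first-time performances vs. the same
    # residents' repeat performances roughly a year later.
    first_time = rng.normal(loc=14.0, scale=5.0, size=27)
    repeat = rng.normal(loc=17.2, scale=4.7, size=27)

    # One-way ANOVA comparing the mean scores of the two groups,
    # analogous to the F[1, 52] test reported in the abstract.
    f_stat, p_value = stats.f_oneway(first_time, repeat)
    print(f"F = {f_stat:.2f}, p = {p_value:.3f}")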

  15. Hydrocarbon Fuel Thermal Performance Modeling based on Systematic Measurement and Comprehensive Chromatographic Analysis

    Science.gov (United States)

    2016-07-31

    ...of vital importance for hydrocarbon-fueled propulsion systems: fuel thermal performance as indicated by physical and chemical effects of cooling passage... analysis. The selection and acquisition of a set of chemically diverse fuels is pivotal for a successful outcome since test method validation and...

  16. Development of web based performance analysis program for nuclear power plant turbine cycle

    International Nuclear Information System (INIS)

    Park, Hoon; Yu, Seung Kyu; Kim, Seong Kun; Ji, Moon Hak; Choi, Kwang Hee; Hong, Seong Ryeol

    2002-01-01

    Performance improvement of the turbine cycle affects the economic operation of a nuclear power plant. We developed a performance analysis system for the nuclear power plant turbine cycle. The system is based on the PTC (Performance Test Code), the standard for estimating nuclear power plant performance. The system is developed using Java Web Start and JSP (JavaServer Pages).

  17. SMART performance analysis methodology

    International Nuclear Information System (INIS)

    Lim, H. S.; Kim, H. C.; Lee, D. J.

    2001-04-01

    To ensure the required and desired operation over the plant lifetime, the performance analysis for the SMART NSSS design is done by means of the specified analysis methodologies for the performance-related design basis events (PRDBEs). A PRDBE is an occurrence (event) that shall be accommodated in the design of the plant and whose consequence would be no more severe than the normal service effects of the plant equipment. The performance analysis methodology, which systematizes the methods and procedures to analyze the PRDBEs, is as follows. Based on the operation modes suited to the characteristics of the SMART NSSS, the corresponding PRDBEs and the allowable ranges of process parameters for these events are deduced. With the developed control logic for each operation mode, the system thermal-hydraulics are analyzed for the chosen PRDBEs using the system analysis code. In particular, because the system characteristics of SMART differ from those of existing commercial nuclear power plants, the operation modes, PRDBEs, control logic, and analysis code should be consistent with the SMART design. This report presents the categories of PRDBEs chosen based on each operation mode and the transitions among them, as well as the acceptance criteria for each PRDBE. It also includes the analysis methods and procedures for each PRDBE and the concept of the control logic for each operation mode. Therefore this report, in which the overall details for SMART performance analysis are specified based on the current SMART design, can be utilized as a guide for the detailed performance analysis.

  18. Exergy Based Performance Analysis of Double Flow Solar Air Heater with Corrugated Absorber

    OpenAIRE

    S. P. Sharma; Som Nath Saha

    2017-01-01

    This paper presents the performance, based on exergy analysis, of double flow solar air heaters with corrugated and flat plate absorbers. A mathematical model of a double flow solar air heater based on energy balance equations is presented, and the results obtained are compared with those of a conventional flat-plate solar air heater. The double flow corrugated absorber solar air heater performs thermally better than the flat plate double flow and conventional flat-plate solar air heate...

  19. Performance analysis of a potassium-base AMTEC cell

    International Nuclear Information System (INIS)

    Huang, C.; Hendricks, T.J.; Hunt, T.K.

    1998-01-01

    Sodium-BASE Alkali-Metal-Thermal-to-Electric-Conversion (AMTEC) cells have been receiving increased attention and funding from the Department of Energy, NASA and the United States Air Force. Recently, sodium-BASE (Na-BASE) AMTEC cells were selected for the Advanced Radioisotope Power System (ARPS) program for the next generation of deep-space missions and spacecraft. Potassium-BASE (K-BASE) AMTEC cells have not received as much attention to date, even though the vapor pressure of potassium is higher than that of sodium at the same temperature, so K-BASE AMTEC cells with potentially higher open circuit voltage and higher power output than Na-BASE AMTEC cells are possible. Because the surface tension of potassium is about half that of sodium at the same temperature, the artery and evaporator design in a potassium AMTEC cell has much more challenging pore size requirements than designs using sodium. This paper uses a flexible thermal/fluid/electrical model to predict the performance of a K-BASE AMTEC cell. Pore sizes in the artery of K-BASE AMTEC cells must be smaller by an order of magnitude than in Na-BASE AMTEC cells. The performance of a K-BASE AMTEC cell was higher than that of a Na-BASE AMTEC cell at low voltages/high currents. K-BASE AMTEC cells also have the potential of much better electrode performance, thereby creating another avenue for potentially better performance in K-BASE AMTEC cells.

  20. Simulation-Based Stochastic Sensitivity Analysis of a Mach 4.5 Mixed-Compression Intake Performance

    Science.gov (United States)

    Kato, H.; Ito, K.

    2009-01-01

    A sensitivity analysis of a supersonic mixed-compression intake of a variable-cycle turbine-based combined cycle (TBCC) engine is presented. The TBCC engine is designed to power a long-range Mach 4.5 transport capable of antipodal missions studied in the framework of an EU FP6 project, LAPCAT. The nominal intake geometry was designed using DLR abpi cycle analysis program by taking into account various operating requirements of a typical mission profile. The intake consists of two movable external compression ramps followed by an isolator section with bleed channel. The compressed air is then diffused through a rectangular-to-circular subsonic diffuser. A multi-block Reynolds-averaged Navier-Stokes (RANS) solver with Srinivasan-Tannehill equilibrium air model was used to compute the total pressure recovery and mass capture fraction. While RANS simulation of the nominal intake configuration provides more realistic performance characteristics of the intake than the cycle analysis program, the intake design must also take into account in-flight uncertainties for robust intake performance. In this study, we focus on the effects of the geometric uncertainties on pressure recovery and mass capture fraction, and propose a practical approach to simulation-based sensitivity analysis. The method begins by constructing a light-weight analytical model, a radial-basis function (RBF) network, trained via adaptively sampled RANS simulation results. Using the RBF network as the response surface approximation, stochastic sensitivity analysis is performed using analysis of variance (ANOVA) technique by Sobol. This approach makes it possible to perform a generalized multi-input-multi-output sensitivity analysis based on high-fidelity RANS simulation. The resulting Sobol's influence indices allow the engineer to identify dominant parameters as well as the degree of interaction among multiple parameters, which can then be fed back into the design cycle.
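
    A hedged sketch of the surrogate-plus-Sobol workflow outlined in this record, with a toy analytic function standing in for the RANS simulation and scipy's RBFInterpolator standing in for the RBF network; the estimator used here is the standard Jansen pick-freeze formula, which may differ in detail from the authors' implementation.

    import numpy as np
    from scipy.interpolate import RBFInterpolator

    rng = np.random.default_rng(4)

    # Toy stand-in for the RANS simulation: "pressure recovery" as a function of
    # two normalized geometric uncertainties (e.g. ramp angles).
    def simulation(x):
        return 0.6 + 0.1 * x[:, 0] - 0.05 * x[:, 1] ** 2 + 0.02 * x[:, 0] * x[:, 1]

    # Train a radial-basis-function surrogate on a modest number of "simulations".
    x_train = rng.uniform(-1, 1, size=(60, 2))
    surrogate = RBFInterpolator(x_train, simulation(x_train))

    # First-order Sobol indices on the surrogate via the pick-freeze estimator.
    N = 20000
    A = rng.uniform(-1, 1, size=(N, 2))
    B = rng.uniform(-1, 1, size=(N, 2))
    yA, yB = surrogate(A), surrogate(B)
    var_y = np.var(np.concatenate([yA, yB]))
    for i in range(2):
        ABi = A.copy()
        ABi[:, i] = B[:, i]                  # replace column i of A with column i of B
        yABi = surrogate(ABi)
        S_i = 1.0 - 0.5 * np.mean((yB - yABi) ** 2) / var_y   # Jansen first-order estimator
        print(f"first-order Sobol index of input {i}: {S_i:.3f}")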

  1. Two-stage process analysis using the process-based performance measurement framework and business process simulation

    NARCIS (Netherlands)

    Han, K.H.; Kang, J.G.; Song, M.S.

    2009-01-01

    Many enterprises have recently been pursuing process innovation or improvement to attain their performance goals. To align a business process with enterprise performances, this study proposes a two-stage process analysis for process (re)design that combines the process-based performance measurement

  2. Model Solutions for Performance-Based Seismic Analysis of an Anchored Sheet Pile Quay Wall

    NARCIS (Netherlands)

    Habets, C.J.W.; Peters, D.J.; de Gijt, J.G.; Metrikine, A.; Jonkman, S.N.

    2016-01-01

    Conventional seismic designs of quay walls in ports are mostly based on pseudo-static analysis. A more advanced alternative is the Performance-Based Design (PBD) method, which evaluates permanent deformations and amounts of (repairable) damage under seismic loading. The aim of this study is to

  3. Performance analysis of a minichannel-based solar collector using different nanofluids

    International Nuclear Information System (INIS)

    Mahian, Omid; Kianifar, Ali; Sahin, Ahmet Z.; Wongwises, Somchai

    2014-01-01

    Highlights: • Performance of a minichannel-based solar collector has been studied using four different nanofluids. • First and second law thermodynamic analyses are conducted by considering a constant mass flow rate of nanofluid. • Al2O3/water nanofluids show the highest heat transfer coefficient in the tubes. • The highest outlet temperature is provided by Cu/water nanofluids. • Cu/water nanofluid produces the lowest entropy generation among the nanofluids. - Abstract: In this paper, an analytical analysis has been performed to evaluate the performance of a minichannel-based solar collector using four different nanofluids: Cu/water, Al2O3/water, TiO2/water, and SiO2/water. The analysis of the first and second laws is conducted for turbulent flow by considering a constant mass flow rate of nanofluid. The results are presented for volume fractions up to 4% and a nanoparticle size of 25 nm, where the inner diameter of the risers of the flat plate collector is assumed to be 2 mm. Analysis of the first law of thermodynamics reveals that Al2O3/water nanofluids show the highest heat transfer coefficient in the tubes, while the lowest value belongs to SiO2/water nanofluids. The highest outlet temperature is provided by Cu/water nanofluids, and after that TiO2/water, Al2O3/water, and SiO2/water nanofluids rank second to fourth. The results of the second law analysis elucidate that Cu/water nanofluid produces the lowest entropy generation among the nanofluids. It is found that although the effective thermal conductivity of TiO2/water nanofluids is less than that of Al2O3/water nanofluids, the entropy generation of TiO2/water is lower than that of Al2O3/water. Finally, some recommendations are given for future studies on the applications of nanofluids in solar collectors.

  4. CONTEMPORARY APPROACHES OF COMPANY PERFORMANCE ANALYSIS BASED ON RELEVANT FINANCIAL INFORMATION

    Directory of Open Access Journals (Sweden)

    Sziki Klara

    2012-12-01

    In this paper we chose to present two components of the financial statements: the profit and loss account and the cash flow statement. These summary documents, and the different indicators calculated based on them, allow us to formulate assessments of the performance and profitability of the various functions and levels of the company's activity. This paper aims to support the hypothesis that the accounting information presented in the profit and loss account and in the cash flow statement is an appropriate source for assessing company performance. The purpose of this research is to answer the question linked to the main hypothesis: is it the profit and loss statement or the cash flow statement that better reflects the performance of a business? Based on the specialty literature studied, we took a conceptual, analytical and practical approach to the term performance, reviewing some terminological acceptations of the term as well as the main indicators of performance analysis on the basis of the profit and loss account and of the cash flow statement: aggregated indicators, also known as intermediary balances of administration, economic rate of return, rate of financial profitability, rate of return through cash flows, operating cash flow rate, and rate of generating operating cash out of gross operating result. At the same time we took a comparative approach to the profit and loss account and cash flow statement, outlining the main advantages and disadvantages of these documents. In order to demonstrate the above theoretical assessments, we chose to analyze these indicators based on information from the financial statements of SC Sinteza SA, a company in Bihor county listed on the Bucharest Stock Exchange.

  5. Performance analysis of CDMA-based wireless communication

    African Journals Online (AJOL)

  6. Performance-based seismic assessment of vulnerability of dam using time history analysis

    Directory of Open Access Journals (Sweden)

    Elmrabet Oumnia

    2018-01-01

    The current performance-based seismic assessment procedure can be computationally intensive, as it requires many time history analyses (THA), each requiring time-intensive post-processing of results. Time history analysis is a part of structural analysis and is the calculation of the response of a structure to any earthquake. It is one of the main processes of structural design in regions where earthquakes are prevalent. The objective of this study is to evaluate the seismic performance of an embankment dam located on the Oued RHISS in the Province of AL HOCEIMA using the THA method. To monitor structural behavior, the seismic vulnerability of the structure is evaluated under real earthquake records while considering the soil-structure-fluid interaction. In this study, a simple assistant program is developed for implementing earthquake analyses of the structure with ANSYS; ground acceleration-time history data are used for seismic analysis, and dynamic numerical simulations were conducted to study and identify the total response of the soil-structure system.

  7. Fuel performance analysis code 'FAIR'

    International Nuclear Information System (INIS)

    Swami Prasad, P.; Dutta, B.K.; Kushwaha, H.S.; Mahajan, S.C.; Kakodkar, A.

    1994-01-01

    For modelling nuclear reactor fuel rod behaviour of water cooled reactors under severe power maneuvering and high burnups, a mechanistic fuel performance analysis code FAIR has been developed. The code incorporates a finite element based thermomechanical module, a physically based fission gas release module and relevant models for fuel related phenomena, such as pellet cracking, densification and swelling, radial flux redistribution across the pellet due to the build up of plutonium near the pellet surface, and pellet clad mechanical interaction/stress corrosion cracking (PCMI/SCC) failure of the sheath. The code follows the established principles of fuel rod analysis programmes, such as coupling of the thermal and mechanical solutions along with the fission gas release calculations, analysing different axial segments of the fuel rod simultaneously, and providing means for performing local analysis such as clad ridging analysis. The modular nature of the code offers flexibility in making modifications to the code for modelling MOX fuels and thorium based fuels. For performing analysis of fuel rods subjected to very long power histories within a reasonable amount of time, the code has been parallelised and is commissioned on the ANUPAM parallel processing system developed at Bhabha Atomic Research Centre (BARC). (author). 37 refs

  8. CFD analysis of heat transfer performance of graphene based hybrid nanofluid in radiators

    Science.gov (United States)

    Bharadwaj, Bharath R.; Sanketh Mogeraya, K.; Manjunath, D. M.; Rao Ponangi, Babu; Rajendra Prasad, K. S.; Krishna, V.

    2018-04-01

    For improved performance of an automobile engine, the cooling system is one of the critical systems that needs attention. With an increased capacity to carry away large amounts of waste heat, the performance of an engine is increased. Current research on nanofluids suggests that they offer a higher heat transfer rate compared to that of conventional coolants. Hence this project seeks to investigate the use of hybrid nanofluids in radiators so as to increase their heat transfer performance. Carboxyl graphene and graphene oxide based nanoparticles were selected due to the very high thermal conductivity of graphene. System analysis of the radiator was performed by considering a small part of the whole automobile radiator modelled using SIEMENS NX. CFD analysis was conducted using ANSYS FLUENT® for the nanofluid defined, and the increase in effectiveness was compared to that of conventional coolants. Usage of such nanofluids for a fixed cooling requirement in the future can lead to significant downsizing of the radiator.

  9. Importance-Performance Analysis of Personal Health Records in Taiwan: A Web-Based Survey

    Science.gov (United States)

    Rau, Hsiao-Hsien; Chen, Kang-Hua

    2017-01-01

    Background: Empowering personal health records (PHRs) provides a basic human right, awareness, and intention for health promotion. As health care delivery changes toward patient-centered services, PHRs become an indispensable platform for consumers and providers. Recently, the government introduced "My health bank," a Web-based electronic medical records (EMRs) repository for consumers. However, it is not yet a PHR. To date, we do not have a platform that can let patients manage their own PHRs. Objective: This study creates a vision of a value-added platform for personal health data analysis and management of one's health record based on the contents of "My health bank." This study aimed to examine consumer expectations regarding PHRs, using importance-performance analysis. The purpose of this study was to explore consumer perception regarding this type of platform: it would try to identify the key success factors and important aspects by using the importance-performance analysis, and give some suggestions for future development based on it. Methods: This is a cross-sectional study conducted in Taiwan. A Web-based invitation to participate in this study was distributed through Facebook. Respondents were asked to watch an introductory movie regarding PHRs before filling in the questionnaire. The questionnaire was focused on 2 aspects: (1) system functions, and (2) system design and security and privacy, addressed by 12 and 7 questions respectively. The questionnaire was designed on a 5-point Likert scale ranging from 1 ("disagree strongly") to 5 ("agree strongly"). Afterwards, the questionnaire data were sorted using IBM SPSS Statistics 21 for descriptive statistics and the importance-performance analysis. Results: This research received 350 valid questionnaires. Most respondents were female (219 of 350 participants, 62.6%), 21-30 years old (238 of 350 participants, 68.0%), with a university degree (228 of 350 participants, 65
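
    A minimal sketch of the importance-performance analysis itself, on hypothetical attribute ratings; the attribute names and numbers below are invented for illustration and are not the survey's items or results.

    import numpy as np

    # Hypothetical 5-point Likert ratings (1-5): for each PHR platform attribute,
    # an importance rating and a perceived performance rating per respondent.
    attributes = ["health data upload", "trend charts", "medication reminders",
                  "data export", "access control", "privacy policy clarity"]
    importance = np.array([[5, 4, 4, 3, 5, 5],      # respondent 1
                           [4, 4, 3, 3, 5, 4],      # respondent 2
                           [5, 5, 4, 2, 4, 5]])     # respondent 3
    performance = np.array([[4, 3, 3, 3, 3, 2],
                            [4, 4, 2, 3, 3, 3],
                            [5, 3, 3, 2, 2, 3]])

    # Importance-performance analysis: place each attribute in a quadrant defined
    # by the grand means of importance and performance.
    imp_mean, perf_mean = importance.mean(axis=0), performance.mean(axis=0)
    imp_cut, perf_cut = imp_mean.mean(), perf_mean.mean()

    quadrants = {(True, True): "keep up the good work", (True, False): "concentrate here",
                 (False, True): "possible overkill",     (False, False): "low priority"}
    for name, i, p in zip(attributes, imp_mean, perf_mean):
        key = (bool(i >= imp_cut), bool(p >= perf_cut))
        print(f"{name:25s} importance={i:.2f} performance={p:.2f} -> {quadrants[key]}")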

  10. Airports’ Operational Performance and Efficiency Evaluation Based on Multicriteria Decision Analysis (MCDA and Data Envelopment Analysis (DEA Tools

    Directory of Open Access Journals (Sweden)

    João Jardim

    2015-12-01

    Airport benchmarking depends on airports' operational performance and efficiency indicators, which are important for business agents, operational managers, regulatory agencies, airlines and passengers. There are several sets of single and complex indicators to evaluate airports' performance and efficiency, as well as several techniques to benchmark such infrastructures. The general aim of this work is twofold: to balance the data envelopment analysis (DEA) and multicriteria decision analysis (MCDA) tools and to show that airport benchmarking is also possible using a multicriteria decision analysis tool called Measuring Attractiveness by a Categorical Based Evaluation Technique (MACBETH). Whilst DEA measures relative performance in the presence of multiple inputs and outputs, MCDA/MACBETH uses performance and efficiency indicators to support benchmark results, being useful for evaluating the real importance and weight of the selected indicators. The work is structured as follows: first, a state-of-the-art review concerning both airport benchmarking and performance indicators and the DEA and MCDA techniques; second, an overview of the impacts of emergent operational factors (sudden meteorological/natural phenomena) on airports' operational performance and efficiency; third, two case studies on a set of worldwide airports and Madeira (FNC) Airport; and fourth, some insights into and challenges for future research that are still under development.
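
    A hedged sketch of a basic input-oriented CCR DEA model solved as a linear program; the airport figures are invented placeholders, not the study's data, and the study's exact DEA formulation may differ.

    import numpy as np
    from scipy.optimize import linprog

    # Hypothetical airport data following the structure described above: four inputs
    # (handling capacity, berths/stands, area, storage) and one output (throughput).
    # Rows are decision-making units (airports), columns are variables.
    inputs = np.array([[50., 20., 300., 80.],
                       [70., 25., 420., 95.],
                       [40., 15., 250., 60.],
                       [90., 35., 500., 120.]])
    outputs = np.array([[1200.], [1500.], [1100.], [1600.]])

    def ccr_efficiency(inputs, outputs, o):
        """Input-oriented CCR efficiency of DMU o (envelopment form), solved as an LP."""
        n, m = inputs.shape          # number of DMUs, number of inputs
        s = outputs.shape[1]         # number of outputs
        c = np.r_[1.0, np.zeros(n)]  # minimize theta; variables are [theta, lambda_1..n]
        A_ub, b_ub = [], []
        for i in range(m):           # sum_j lambda_j * x_ij <= theta * x_io
            A_ub.append(np.r_[-inputs[o, i], inputs[:, i]]); b_ub.append(0.0)
        for r in range(s):           # sum_j lambda_j * y_rj >= y_ro
            A_ub.append(np.r_[0.0, -outputs[:, r]]); b_ub.append(-outputs[o, r])
        res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                      bounds=[(None, None)] + [(0, None)] * n, method="highs")
        return res.fun               # theta = relative efficiency in (0, 1]

    for o in range(inputs.shape[0]):
        print(f"DMU {o}: CCR efficiency = {ccr_efficiency(inputs, outputs, o):.3f}")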

  11. ANALYSIS OF FREE ROUTE AIRSPACE AND PERFORMANCE BASED NAVIGATION IMPLEMENTATION IN THE EUROPEAN AIR NAVIGATION SYSTEM

    Directory of Open Access Journals (Sweden)

    Svetlana Pavlova

    2014-12-01

    The European Air Traffic Management system requires continuous improvement as air traffic is increasing day by day. For this purpose, international organizations developed the Free Route Airspace and Performance Based Navigation concepts, which allow a required level of safety, capacity and environmental performance to be offered along with cost-effectiveness. The aim of the article is to provide a detailed analysis of the Free Route Airspace and Performance Based Navigation implementation status within the European region, including the Ukrainian air navigation system.

  12. Performance of an Axisymmetric Rocket Based Combined Cycle Engine During Rocket Only Operation Using Linear Regression Analysis

    Science.gov (United States)

    Smith, Timothy D.; Steffen, Christopher J., Jr.; Yungster, Shaye; Keller, Dennis J.

    1998-01-01

    The all-rocket mode of operation is shown to be a critical factor in the overall performance of a rocket based combined cycle (RBCC) vehicle. An axisymmetric RBCC engine was used to determine specific impulse efficiency values based upon both full flow and gas generator configurations. Design of experiments methodology was used to construct a test matrix, and multiple linear regression analysis was used to build parametric models. The main parameters investigated in this study were: rocket chamber pressure, rocket exit area ratio, injected secondary flow, mixer-ejector inlet area, mixer-ejector area ratio, and mixer-ejector length-to-inlet diameter ratio. A perfect gas computational fluid dynamics analysis, using both the Spalart-Allmaras and k-omega turbulence models, was performed with the NPARC code to obtain values of vacuum specific impulse. Results from the multiple linear regression analysis showed that for both the full flow and gas generator configurations, increasing mixer-ejector area ratio and rocket area ratio increases performance, while increasing mixer-ejector inlet area ratio and mixer-ejector length-to-diameter ratio decreases performance. Increasing injected secondary flow increased performance for the gas generator analysis, but was not statistically significant for the full flow analysis. Chamber pressure was found to be not statistically significant.

  13. TAP 2: Performance-Based Training Manual

    International Nuclear Information System (INIS)

    1993-08-01

    The cornerstone of safe operation of DOE nuclear facilities is personnel performing the day-to-day functions which accomplish the facility mission. Performance-based training is fundamental to safe operation. This manual has been developed to support the Training Accreditation Program (TAP) and assist contractors in their efforts to develop performance-based training programs. It provides contractors with narrative procedures on performance-based training that can be modified and incorporated for facility-specific application. It is divided into sections dealing with analysis, design, development, implementation, and evaluation.

  14. Performance analysis

    International Nuclear Information System (INIS)

    2008-05-01

    This book introduces the energy and resource technology development business together with its performance analysis, covering the business division and definition, an analysis of the current support situation, the substance of the basic national plan for energy and resource technology development, the selection of analysis indexes, performance analysis results by index, investigation results, and the analysis and appraisal of the energy and resource technology development business in 2007.

  15. Performance-Based Technology Selection Filter description report. INEL Buried Waste Integrated Demonstration System Analysis project

    Energy Technology Data Exchange (ETDEWEB)

    O'Brien, M.C.; Morrison, J.L.; Morneau, R.A.; Rudin, M.J.; Richardson, J.G.

    1992-05-01

    A formal methodology has been developed for identifying technology gaps and assessing innovative or postulated technologies for inclusion in proposed Buried Waste Integrated Demonstration (BWID) remediation systems. Called the Performance-Based Technology Selection Filter, the methodology provides a formalized selection process in which technologies and systems are rated and assessments are made based on performance measures and regulatory and technical requirements. The results are auditable and can be validated with field data. This analysis methodology will be applied to the remedial action of transuranic contaminated waste pits and trenches buried at the Idaho National Engineering Laboratory (INEL).

  16. Importance-Performance Analysis of Personal Health Records in Taiwan: A Web-Based Survey.

    Science.gov (United States)

    Rau, Hsiao-Hsien; Wu, Yi-Syuan; Chu, Chi-Ming; Wang, Fu-Chung; Hsu, Min-Huei; Chang, Chi-Wen; Chen, Kang-Hua; Lee, Yen-Liang; Kao, Senyeong; Chiu, Yu-Lung; Wen, Hsyien-Chia; Fuad, Anis; Hsu, Chien-Yeh; Chiu, Hung-Wen

    2017-04-27

    Empowering personal health records (PHRs) provides a basic human right, awareness, and intention for health promotion. As health care delivery changes toward patient-centered services, PHRs become an indispensable platform for consumers and providers. Recently, the government introduced "My health bank," a Web-based electronic medical records (EMRs) repository for consumers. However, it is not yet a PHR. To date, we do not have a platform that can let patients manage their own PHRs. This study creates a vision of a value-added platform for personal health data analysis and management of one's health record based on the contents of "My health bank." This study aimed to examine consumer expectations regarding PHRs, using importance-performance analysis. The purpose of this study was to explore consumer perception regarding this type of platform: it would try to identify the key success factors and important aspects by using the importance-performance analysis, and give some suggestions for future development based on it. This is a cross-sectional study conducted in Taiwan. A Web-based invitation to participate in this study was distributed through Facebook. Respondents were asked to watch an introductory movie regarding PHRs before filling in the questionnaire. The questionnaire was focused on 2 aspects: (1) system functions, and (2) system design and security and privacy, addressed by 12 and 7 questions respectively. The questionnaire was designed on a 5-point Likert scale ranging from 1 ("disagree strongly") to 5 ("agree strongly"). Afterwards, the questionnaire data were sorted using IBM SPSS Statistics 21 for descriptive statistics and the importance-performance analysis. This research received 350 valid questionnaires. Most respondents were female (219 of 350 participants, 62.6%), 21-30 years old (238 of 350 participants, 68.0%), with a university degree (228 of 350 participants, 65.1%). They were still students (195 out of 350

  17. Performance analysis of a lunar based solar thermal power system with regolith thermal storage

    International Nuclear Information System (INIS)

    Lu, Xiaochen; Ma, Rong; Wang, Chao; Yao, Wei

    2016-01-01

    Manned deep-space exploration is a hot topic of current space activities. The continuous supply of thermal and electrical energy for scientific equipment and human beings is a crucial issue for lunar outposts. Since the night lasts about 350 h at most locations on the lunar surface, massive energy storage is required for continuous energy supply during the lengthy lunar night, and in-situ resource utilization is demanded. A lunar based solar thermal power system with regolith thermal storage is presented in this paper. The performance analysis is carried out by finite-time thermodynamics to take into account major irreversible losses. The influences of some key design parameters are analyzed for system optimization. The analytical results show that the lunar based solar thermal power system with regolith thermal storage can meet the requirement of continuous energy supply for lunar outposts. - Highlights: • A lunar based solar thermal power system with regolith thermal storage is presented. • The performance analysis is carried out by finite-time thermodynamics. • The influences of some key design parameters are analyzed.

  18. The Effectiveness of Self-Regulated Learning Scaffolds on Academic Performance in Computer-Based Learning Environments: A Meta-Analysis

    Science.gov (United States)

    Zheng, Lanqin

    2016-01-01

    This meta-analysis examined research on the effects of self-regulated learning scaffolds on academic performance in computer-based learning environments from 2004 to 2015. A total of 29 articles met inclusion criteria and were included in the final analysis with a total sample size of 2,648 students. Moderator analyses were performed using a…

  19. Model Solutions for Performance-Based Seismic Analysis of an Anchored Sheet Pile Quay Wall

    OpenAIRE

    C. J. W. Habets; D. J. Peters; J. G. de Gijt; A. V. Metrikine; S. N. Jonkman

    2016-01-01

    Conventional seismic designs of quay walls in ports are mostly based on pseudo-static analysis. A more advanced alternative is the Performance-Based Design (PBD) method, which evaluates permanent deformations and amounts of (repairable) damage under seismic loading. The aim of this study is to investigate the suitability of this method for anchored sheet pile quay walls that were not purposely designed for seismic loads. A research methodology is developed in which pseudo-static, permanent-di...

  20. An investigation and comparison on network performance analysis

    OpenAIRE

    Lanxiaopu, Mi

    2012-01-01

    This thesis is generally about network performance analysis. It contains two parts. The theory part summarizes what network performance is and introduces the methods of doing network performance analysis. To answer what network performance is, a study of what network services are is done. Based on the background research, two important network performance metrics, network delay and throughput, should be included in network performance analysis. Among the methods of network a...

  1. Performance evaluation of tile-based Fisher Ratio analysis using a benchmark yeast metabolome dataset.

    Science.gov (United States)

    Watson, Nathanial E; Parsons, Brendon A; Synovec, Robert E

    2016-08-12

    Performance of tile-based Fisher Ratio (F-ratio) data analysis, recently developed for discovery-based studies using comprehensive two-dimensional gas chromatography coupled with time-of-flight mass spectrometry (GC×GC-TOFMS), is evaluated with a metabolomics dataset that had been previously analyzed in great detail, but while taking a brute force approach. The previously analyzed data (referred to herein as the benchmark dataset) were intracellular extracts from Saccharomyces cerevisiae (yeast), either metabolizing glucose (repressed) or ethanol (derepressed), which define the two classes in the discovery-based analysis to find metabolites that are statistically different in concentration between the two classes. Beneficially, this previously analyzed dataset provides a concrete means to validate the tile-based F-ratio software. Herein, we demonstrate and validate the significant benefits of applying tile-based F-ratio analysis. The yeast metabolomics data are analyzed more rapidly in about one week versus one year for the prior studies with this dataset. Furthermore, a null distribution analysis is implemented to statistically determine an adequate F-ratio threshold, whereby the variables with F-ratio values below the threshold can be ignored as not class distinguishing, which provides the analyst with confidence when analyzing the hit table. Forty-six of the fifty-four benchmarked changing metabolites were discovered by the new methodology while consistently excluding all but one of the benchmarked nineteen false positive metabolites previously identified.
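
    A simplified, hedged sketch of the F-ratio plus null-distribution idea on synthetic two-class data; real tile-based GC×GC-TOFMS processing involves tiling and aligning the chromatograms, which is not reproduced here.

    import numpy as np

    rng = np.random.default_rng(5)

    # Synthetic stand-in data: peak volumes for 500 chromatographic variables in
    # two classes (e.g. repressed vs. derepressed cultures), 6 replicates each.
    n_vars, n_rep = 500, 6
    class_a = rng.lognormal(mean=2.0, sigma=0.3, size=(n_rep, n_vars))
    class_b = rng.lognormal(mean=2.0, sigma=0.3, size=(n_rep, n_vars))
    class_b[:, :20] *= 3.0                  # 20 variables truly differ between the classes

    def fisher_ratio(a, b):
        """Univariate F-ratio per variable: between-class over within-class variance."""
        grand = np.vstack([a, b]).mean(axis=0)
        n1, n2 = len(a), len(b)
        between = n1 * (a.mean(axis=0) - grand) ** 2 + n2 * (b.mean(axis=0) - grand) ** 2
        within = ((a - a.mean(axis=0)) ** 2).sum(axis=0) + ((b - b.mean(axis=0)) ** 2).sum(axis=0)
        return between / (within / (n1 + n2 - 2))   # df_between = 1 for two classes

    f_obs = fisher_ratio(class_a, class_b)

    # Null distribution analysis: shuffle class labels many times and take a high
    # percentile of the resulting F-ratios as the significance threshold.
    pooled = np.vstack([class_a, class_b])
    null_values = []
    for _ in range(200):
        perm = rng.permutation(len(pooled))
        null_values.append(fisher_ratio(pooled[perm[:n_rep]], pooled[perm[n_rep:]]))
    threshold = np.percentile(np.concatenate(null_values), 99)

    hits = np.flatnonzero(f_obs > threshold)
    print(f"threshold = {threshold:.2f}, class-distinguishing variables found: {len(hits)}")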

  2. Multiple performance optimization of electrochemical drilling of Inconel 625 using Taguchi based Grey Relational Analysis

    Directory of Open Access Journals (Sweden)

    N. Manikandan

    2017-04-01

    In the present investigation, a multi-performance-characteristics optimization based on the Taguchi approach with Grey Relational Analysis (GRA) is proposed for the electrochemical drilling process on Inconel 625, a material used for marine, nuclear and aerospace applications, especially in corrosive environments. Experimental runs were planned as per Taguchi's principle with three input machining variables: feed rate, electrolyte flow rate and electrolyte concentration. Besides the material removal rate and surface roughness, geometric measures such as overcut and form and orientation tolerances are included as performance measures in this investigation. Outcomes of the analysis show that the feed rate is the predominant variable for the desired performance characteristics. Multiple regression models are also developed for the performance measures to be used as predictive tools. A confirmation test was conducted to validate the results attained by the GRA approach and affirmed that there is considerable improvement with the proposed approach.
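
    A minimal sketch of the grey relational grade computation that GRA rests on, using invented response values rather than the paper's measurements; the Taguchi design, S/N ratios and ANOVA steps are omitted.

    import numpy as np

    # Hypothetical results: each row is one experimental run, columns are material
    # removal rate (larger is better), surface roughness, overcut and circularity
    # error (all smaller is better).
    responses = np.array([
        [0.42, 2.1, 0.35, 0.020],
        [0.55, 2.4, 0.40, 0.025],
        [0.61, 2.9, 0.52, 0.031],
        [0.48, 1.9, 0.33, 0.018],
        [0.66, 2.6, 0.47, 0.027],
        [0.58, 2.2, 0.38, 0.022],
    ])
    larger_is_better = np.array([True, False, False, False])

    # Step 1: grey relational normalization of every response to [0, 1].
    lo, hi = responses.min(axis=0), responses.max(axis=0)
    norm = np.where(larger_is_better, (responses - lo) / (hi - lo),
                                      (hi - responses) / (hi - lo))

    # Step 2: grey relational coefficients against the ideal sequence (all ones),
    # with the usual distinguishing coefficient zeta = 0.5.
    zeta = 0.5
    dev = 1.0 - norm
    coeff = (dev.min() + zeta * dev.max()) / (dev + zeta * dev.max())

    # Step 3: grey relational grade = mean coefficient per run; rank the runs.
    grade = coeff.mean(axis=1)
    print("grades:", np.round(grade, 3), "best run:", int(grade.argmax()) + 1)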

  3. A Visual Basic simulation software tool for performance analysis of a membrane-based advanced water treatment plant.

    Science.gov (United States)

    Pal, P; Kumar, R; Srivastava, N; Chaudhuri, J

    2014-02-01

    A Visual Basic simulation software (WATTPPA) has been developed to analyse the performance of an advanced wastewater treatment plant. This user-friendly and menu-driven software is based on a dynamic mathematical model for an industrial wastewater treatment scheme that integrates chemical, biological and membrane-based unit operations. The software-predicted results corroborate very well with the experimental findings, as indicated by an overall correlation coefficient of the order of 0.99. The software permits pre-analysis and manipulation of input data, helps in optimization and exhibits the performance of an integrated plant visually on a graphical platform. It allows quick performance analysis of the whole system as well as of the individual units. The software, the first of its kind in its domain and built in the well-known Microsoft Excel environment, is likely to be very useful in the successful design, optimization and operation of an advanced hybrid treatment plant for hazardous wastewater.

  4. Conflict Resolution for Product Performance Requirements Based on Propagation Analysis in the Extension Theory

    Directory of Open Access Journals (Sweden)

    Yanwei Zhao

    2014-01-01

    Full Text Available Traditional product data mining methods are mainly focused on the static data. Performance requirements are generally met as possible by finding some cases and changing their structures. However, when one is satisfied with the structures changed, the other effects are not taken into account by analyzing the correlations; that is, design conflicts are not identified and resolved. An approach to resolving the conflict problems is proposed based on propagation analysis in Extension Theory. Firstly, the extension distance is improved to better fit evaluating the similarity among cases, then, a case retrieval method is developed. Secondly, the transformations that can be made on selected cases are formulated by understanding the conflict natures in the different performance requirements, which leads to the extension transformation strategy development for coordinating conflicts using propagation analysis. Thirdly, the effects and levels of propagation are determined by analyzing the performance values before and after the transformations, thus the co-existing conflict coordination strategy of multiple performances is developed. The method has been implemented in a working prototype system for supporting decision-making. And it has been demonstrated the feasible and effective through resolving the conflicts of noise, exhaust, weight and intake pressure for the screw air compressor performance design.

  5. Performance Based Clustering for Benchmarking of Container Ports: an Application of Dea and Cluster Analysis Technique

    Directory of Open Access Journals (Sweden)

    Jie Wu

    2010-12-01

    Full Text Available The operational performance of container ports has received more and more attentions in both academic and practitioner circles, the performance evaluation and process improvement of container ports have also been the focus of several studies. In this paper, Data Envelopment Analysis (DEA, an effective tool for relative efficiency assessment, is utilized for measuring the performances and benchmarking of the 77 world container ports in 2007. The used approaches in the current study consider four inputs (Capacity of Cargo Handling Machines, Number of Berths, Terminal Area and Storage Capacity and a single output (Container Throughput. The results for the efficiency scores are analyzed, and a unique ordering of the ports based on average cross efficiency is provided, also cluster analysis technique is used to select the more appropriate targets for poorly performing ports to use as benchmarks.

  6. Thermal performance of gas turbine power plant based on exergy analysis

    International Nuclear Information System (INIS)

    Ibrahim, Thamir K.; Basrawi, Firdaus; Awad, Omar I.; Abdullah, Ahmed N.; Najafi, G.; Mamat, Rizlman; Hagos, F.Y.

    2017-01-01

    Highlights: • A modelling theoretical framework for the energy and exergy analysis of the gas turbine. • The effects of ambient temperature on the energy and exergy performance are investigated. • The maximum exergy loss occurs in the gas turbine components. - Abstract: This study concerns the energy and exergy analysis of a gas turbine power plant. Energy analysis is mainly quantitative, while exergy analysis adds a qualitative dimension. The lack of quality of the thermodynamic processes in the system leads to a waste of potential energy, also known as exergy destruction, which affects the efficiency of the power plant. Using the first and second laws of thermodynamics, the model for the gas turbine power plant is built. Each component in the thermal system, namely the air compressor, combustion chamber and gas turbine, plays a role in the efficiency of the gas turbine power plant. The exergy flow rates at the compressor (AC), combustion chamber (CC) and gas turbine (GT) inlets and outlets are calculated based on the physical exergy and chemical exergy. The exergy destruction is calculated based on the difference between the exergy flow into and out of each component. The combustion chamber has the highest exergy destruction. The air compressor has exergy and energy efficiencies of 94.9% and 92%, respectively. The combustion chamber has exergy and energy efficiencies of 67.5% and 61.8%, respectively, while the gas turbine has exergy and energy efficiencies of 92% and 82%, respectively. Overall, the plant has exergy and energy efficiencies of 32.4% and 34.3%, respectively. To enhance the efficiency, the intake air temperature should be reduced, the combustion chamber should be modified to achieve a better air-fuel ratio, and the capability of the gas turbine to receive a high inlet temperature should be increased.
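
    A minimal sketch of the kind of component-level exergy bookkeeping described here, for the air compressor alone, assuming ideal-gas air with constant cp and purely illustrative stream conditions (not the plant data used in the study).

    import math

    cp, R, T0 = 1.005, 0.287, 298.15        # kJ/(kg.K), kJ/(kg.K), dead-state temperature in K
    p0 = 101.325                            # dead-state pressure, kPa

    def flow_exergy(T, p):
        """Specific physical (flow) exergy of an ideal-gas air stream, kJ/kg."""
        dh = cp * (T - T0)
        ds = cp * math.log(T / T0) - R * math.log(p / p0)
        return dh - T0 * ds

    m_dot = 400.0                           # air mass flow rate, kg/s
    T_in, p_in = 298.15, 101.325            # compressor inlet state
    T_out, p_out = 650.0, 1013.25           # compressor outlet state (pressure ratio ~10)

    w_actual = m_dot * cp * (T_out - T_in)                        # compressor power input, kW
    ex_rise = m_dot * (flow_exergy(T_out, p_out) - flow_exergy(T_in, p_in))
    ex_destroyed = w_actual - ex_rise                             # exergy destruction rate, kW
    print(f"exergy efficiency = {ex_rise / w_actual:.3f}, destruction = {ex_destroyed:.0f} kW")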

  7. Performance Analysis of Spectral Amplitude Coding Based OCDMA System with Gain and Splitter Mismatch

    Science.gov (United States)

    Umrani, Fahim A.; Umrani, A. Waheed; Umrani, Naveed A.; Memon, Kehkashan A.; Kalwar, Imtiaz Hussain

    2013-09-01

    This paper presents a practical analysis of optical code-division multiple-access (O-CDMA) systems based on perfect difference codes. The work uses an SNR criterion to select the optimal avalanche photodiode (APD) gain and shows how mismatches in the splitters and in the APD gains used in the transmitters and receivers of the network can degrade the BER performance of the system. The investigations also reveal that higher APD gains are not suitable for such systems, even at higher powers. The system performance, with consideration of shot noise, thermal noise, and bulk and surface leakage currents, is also investigated.

  8. Performance analysis of a novel energy storage system based on liquid carbon dioxide

    International Nuclear Information System (INIS)

    Wang, Mingkun; Zhao, Pan; Wu, Yi; Dai, Yiping

    2015-01-01

    Due to the intermittence and fluctuation of the wind resource, the increasing penetration level of wind power will bring huge challenges to maintaining the stability of the power system. Integrating compressed air energy storage (CAES) systems with wind farms can mitigate this negative effect. However, CAES needs large caverns or mines to store the compressed air, which restricts its application. In this paper, a novel energy storage system based on liquid carbon dioxide is presented. The mathematical models of the compressed liquid carbon dioxide energy storage system are developed. A parametric analysis is conducted to examine the effect of some key thermodynamic parameters on the system performance. Compared with AA-CAES, the liquid carbon dioxide energy storage system has advantages such as a high energy density and a high EVR. Moreover, the round-trip efficiency of this system can reach about 56.64%, which is acceptable in consideration of the storage volume. Therefore, the proposed system has good potential for storing wind power on a large scale and offers an attractive solution to the challenges of the increasing penetration level of wind power. - Highlights: • A novel energy storage system based on liquid carbon dioxide is presented. • The effects of some key parameters on the system performance are studied. • The operation optimization is conducted by genetic algorithm. • Comparative analysis of AA-CAES and the liquid carbon dioxide system is studied.

  9. Cognitive performance modeling based on general systems performance theory.

    Science.gov (United States)

    Kondraske, George V

    2010-01-01

    General Systems Performance Theory (GSPT) was initially motivated by problems associated with quantifying different aspects of human performance. It has proved to be invaluable for measurement development and understanding quantitative relationships between human subsystem capacities and performance in complex tasks. It is now desired to bring focus to the application of GSPT to modeling of cognitive system performance. Previous studies involving two complex tasks (i.e., driving and performing laparoscopic surgery) and incorporating measures that are clearly related to cognitive performance (information processing speed and short-term memory capacity) were revisited. A GSPT-derived method of task analysis and performance prediction termed Nonlinear Causal Resource Analysis (NCRA) was employed to determine the demand on basic cognitive performance resources required to support different levels of complex task performance. This approach is presented as a means to determine a cognitive workload profile and the subsequent computation of a single number measure of cognitive workload (CW). Computation of CW may be a viable alternative to measuring it. Various possible "more basic" performance resources that contribute to cognitive system performance are discussed. It is concluded from this preliminary exploration that a GSPT-based approach can contribute to defining cognitive performance models that are useful for both individual subjects and specific groups (e.g., military pilots).

  10. Importance-performance analysis based SWOT analysis

    OpenAIRE

    Phadermrod, Boonyarat; Crowder, Richard M.; Wills, Gary B.

    2016-01-01

    SWOT analysis, a commonly used tool for strategic planning, is traditionally a form of brainstorming. Hence, it has been criticised for being likely to hold the subjective views of the individuals who participate in a brainstorming session, and for the fact that SWOT factors are not prioritized by their significance, which may result in improper strategic action. While most studies of SWOT analysis have only focused on solving these shortcomings separately, this study offers an approach to diminish both s...

  11. Performance analysis of opportunistic nonregenerative relaying

    KAUST Repository

    Tourki, Kamel; Alouini, Mohamed-Slim; Qaraqe, Khalid A.; Yang, Hongchuan

    2013-01-01

    Opportunistic relaying in cooperative communication depends on careful relay selection. However, the traditional centralized method used for opportunistic amplify-and-forward protocols requires precise measurements of channel state information at the destination. In this paper, we adopt the max-min criterion as a relay selection framework for opportunistic amplify-and-forward cooperative communications, which has been used extensively for the decode-and-forward protocol, and offer an accurate performance analysis based on exact statistics of the local signal-to-noise ratios of the best relay. Furthermore, we evaluate the asymptotic performance and deduce the diversity order of our proposed scheme. Finally, we validate our analysis by showing that performance simulation results coincide with our analytical results over Rayleigh fading channels, and we compare the max-min relay selection with its centralized channel state information-based and partial relay selection counterparts.
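
    A minimal sketch of the max-min selection rule itself is given below, assuming exponentially distributed per-hop SNRs to mimic Rayleigh fading; the relay count, average SNR and random draws are illustrative assumptions, not the paper's analytical model.

```python
# Hedged sketch of the max-min relay-selection rule for amplify-and-forward
# relaying: pick the relay whose weaker hop (source-relay or relay-destination
# SNR) is largest. The Rayleigh-fading SNR draws are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
num_relays, avg_snr = 5, 10.0                   # average per-hop SNR (linear)

# exponentially distributed instantaneous SNRs model Rayleigh fading
snr_sr = rng.exponential(avg_snr, num_relays)   # source -> relay k
snr_rd = rng.exponential(avg_snr, num_relays)   # relay k -> destination

bottleneck = np.minimum(snr_sr, snr_rd)         # weaker hop per relay
best = int(np.argmax(bottleneck))               # max-min selection
print(f"selected relay {best}, bottleneck SNR = {bottleneck[best]:.2f}")
```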

  12. Cost and performance analysis of physical security systems

    International Nuclear Information System (INIS)

    Hicks, M.J.; Yates, D.; Jago, W.H.

    1997-01-01

    CPA - Cost and Performance Analysis - is a prototype integration of existing PC-based cost and performance analysis tools: ACEIT (Automated Cost Estimating Integrated Tools) and ASSESS (Analytic System and Software for Evaluating Safeguards and Security). ACE is an existing DOD PC-based tool that supports cost analysis over the full life cycle of a system; that is, the cost to procure, operate, maintain and retire the system and all of its components. ASSESS is an existing DOE PC-based tool for analysis of performance of physical protection systems. Through CPA, the cost and performance data are collected into Excel workbooks, making the data readily available to analysts and decision makers in both tabular and graphical formats and at both the system and subsystem levels. The structure of the cost spreadsheets incorporates an activity-based approach to cost estimation. Activity-based costing (ABC) is an accounting philosophy used by industry to trace direct and indirect costs to the products or services of a business unit. By tracing costs through security sensors and procedures and then mapping the contributions of the various sensors and procedures to system effectiveness, the CPA architecture can provide security managers with information critical for both operational and strategic decisions. The architecture, features and applications of the CPA prototype are presented. 5 refs., 3 figs

  13. Performance Analysis of a Threshold-Based Parallel Multiple Beam Selection Scheme for WDM FSO Systems

    KAUST Repository

    Nam, Sung Sik

    2018-04-09

    In this paper, we statistically analyze the performance of a threshold-based parallel multiple beam selection scheme for a free-space optical (FSO) based system with wavelength division multiplexing (WDM), in cases where a pointing error has occurred under independent identically distributed Gamma-Gamma fading conditions. To simplify the mathematical analysis, we additionally consider Gamma turbulence conditions, which are a good approximation of the Gamma-Gamma distribution. Specifically, we statistically analyze the operating characteristics under conventional detection schemes (i.e., heterodyne detection (HD) and intensity modulation/direct detection (IM/DD) techniques) for both the adaptive modulation (AM) case and the non-AM case (i.e., coherent/non-coherent binary modulation). Then, based on the statistically derived results, we evaluate the outage probability of a selected beam, the average spectral efficiency (ASE), the average number of selected beams (ANSB) and the average bit error rate (BER). Selected results show that we can obtain higher spectral efficiency while reducing the implementation complexity introduced by the beam selection scheme, without considerable performance loss. Especially for the AM case, the ASE can be increased further compared to the non-AM cases. Our results, derived using the Gamma distribution as an approximation of the Gamma-Gamma distribution, can be used as approximate performance bounds; in particular, they may serve as lower bounds on the considered performance measures.
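
    A toy version of the threshold-based selection step is sketched below under assumed Gamma-distributed beam SNRs; the threshold, beam count and fallback rule are illustrative guesses rather than the scheme's exact specification.

```python
# Illustrative threshold-based beam selection: activate every WDM beam whose
# instantaneous SNR clears a preset threshold, falling back to the single best
# beam if none qualifies. Threshold, beam count and fading draws are assumptions.
import numpy as np

rng = np.random.default_rng(1)
num_beams, threshold_db = 8, 6.0
snr_db = 10.0 * np.log10(rng.gamma(shape=2.0, scale=3.0, size=num_beams))

selected = np.flatnonzero(snr_db >= threshold_db)
if selected.size == 0:                      # fallback: best single beam
    selected = np.array([np.argmax(snr_db)])

print("beam SNRs (dB):", np.round(snr_db, 1))
print("selected beams:", selected.tolist())
```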

  14. Risk-based performance indicators

    International Nuclear Information System (INIS)

    Azarm, M.A.; Boccio, J.L.; Vesely, W.E.; Lofgren, E.

    1987-01-01

    The purpose of risk-based indicators is to monitor plant safety. Safety is measured by monitoring the potential for core melt (core-melt frequency) and the public risk. Targets for these measures can be set consistent with NRC safety goals. In this process, the performance of safety systems, support systems, major components, and initiating events can be monitored using measures such as unavailability and failure or occurrence frequency. The changes in performance measures and their trends are determined from the time behavior of the monitored measures by differentiating between stochastic and actual variations. Therefore, degradation, as well as improvement, in plant safety performance can be determined. The development of risk-based performance indicators will also provide the means to trace a change in the safety measures to specific problem areas which are amenable to root cause analysis and inspection audits. In addition, systematic methods will be developed to identify specific improvement policies using the plant information system for the identified problem areas. The final product of the performance indicator project will be a methodology, and an integrated and validated set of software packages which, if properly interfaced with the logic model software of a plant, can monitor the plant performance as plant information is provided as input.

  15. A Performance-Based Instructional Theory

    Science.gov (United States)

    Lawson, Tom E.

    1974-01-01

    The rationale for a performance-based instructional theory has arisen from significant advances during the past several years in instructional psychology. Four major areas of concern are: analysis of subject-matter content in terms of performance competencies, diagnosis of pre-instructional behavior, formulation of an instructional…

  16. Performance analysis of vortex based mixers for confined flows

    Science.gov (United States)

    Buschhagen, Timo

    The hybrid rocket is still sparsely employed within major space or defense projects due to its relatively poor combustion efficiency and low fuel grain regression rate. Although hybrid rockets can claim advantages in safety, environmental and performance aspects over established solid and liquid propellant systems, the boundary layer combustion process and the diffusion-based mixing within a hybrid rocket grain port leave the core flow unmixed and limit the system performance. One principle used to enhance the mixing of gaseous flows is to induce streamwise vorticity. The counter-rotating vortex pair (CVP) mixer utilizes this principle and introduces two vortices into a confined flow, generating a stirring motion in order to transport near-wall media towards the core and vice versa. Recent studies investigated the velocity field introduced by this type of swirler. The current work evaluates the mixing performance of the CVP concept, using an experimental setup to simulate an axial primary pipe flow with a radially entering secondary flow. Hereby the primary flow is altered by the CVP swirler unit. The resulting setup therefore emulates a hybrid rocket motor with a cylindrical single-port grain. In order to evaluate the mixing performance, the secondary flow concentration at the pipe assembly exit is measured, utilizing a pressure-sensitive paint based procedure.

  17. Performance analysis of a microcontroller based slip power recovery ...

    African Journals Online (AJOL)

    Slip power recovery wound rotor induction motor drives are used in high power, limited speed range applications where control of slip power provides the variable speed drive system. In this paper, the steady state performance analysis of conventional slip power recovery scheme using static line commutated inverter in the ...

  18. An analysis on equal width quantization and linearly separable subcode encoding-based discretization and its performance resemblances

    Directory of Open Access Journals (Sweden)

    Lim Meng-Hui

    2011-01-01

    Full Text Available Biometric discretization extracts a binary string from a set of real-valued features per user. This representative string can be used as a cryptographic key in many security applications upon error correction. Discretization performance should not degrade significantly from the classification performance of the underlying continuous features. However, numerous discretization approaches based on ineffective encoding schemes have been put forward, and the correlation between such discretization and classification has never been made clear. In this article, we aim to bridge the gap between the continuous and Hamming domains, and provide insight into how discretization based on equal-width quantization and linearly separable subcode encoding affects the classification performance in the Hamming domain. We further illustrate how such discretization can be applied in order to obtain a closely matching classification performance under the general Lp distance and the inner product metrics. Finally, empirical studies conducted on two benchmark face datasets vindicate our analysis results.
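
    To make the encoding concrete, the sketch below performs equal-width quantization followed by a linearly separable subcode style (unary) encoding, in which the Hamming distance between codewords equals the difference of interval indices; the interval count and sample feature values are assumptions for illustration.

```python
# Sketch of equal-width quantization followed by a linearly separable subcode
# (LSSC) style encoding: `index` ones followed by zeros, so Hamming distance
# between codewords equals the index difference. Values are illustrative.
import numpy as np

def equal_width_index(x, lo, hi, n_intervals):
    """Map a real feature into an interval index 0..n_intervals-1."""
    edges = np.linspace(lo, hi, n_intervals + 1)
    return int(np.clip(np.searchsorted(edges, x, side="right") - 1,
                       0, n_intervals - 1))

def lssc_encode(index, n_intervals):
    """Unary-style codeword of length n_intervals - 1."""
    bits = np.zeros(n_intervals - 1, dtype=int)
    bits[:index] = 1
    return bits

features = [0.12, 0.55, 0.91]               # assumed user features in [0, 1]
n_intervals = 8
codeword = np.concatenate(
    [lssc_encode(equal_width_index(f, 0.0, 1.0, n_intervals), n_intervals)
     for f in features])
print("binary template:", "".join(map(str, codeword)))
```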

  19. Performance analysis of an optical self-interference cancellation system with a directly modulated laser-based demonstration.

    Science.gov (United States)

    Yu, Yinghong; Zhang, Yunhao; Huang, Lin; Xiao, Shilin

    2018-02-20

    In this paper, two main performance indices of the optical self-interference cancellation (OSIC) system are theoretically analyzed: cancellation bandwidth and depth. Delay deviation is shown to be the determining factor of cancellation bandwidth; based on this, the bandwidth advantage of the OSIC system over electrical schemes is also proven theoretically. Cancellation depth in the narrowband case is mostly influenced by attenuation and delay-adjusting deviation, while in the broadband case the performance is mostly limited by frequency-dependent amplitude and phase mismatch. The cancellation performance analysis is suitable for most linear modulation-demodulation OSIC systems, including the directly modulated laser (DML)-based OSIC system verified experimentally in this paper. The cancellation model is well demonstrated by the agreement between experimental cancellation results and predicted performance. For the over-the-air demonstration with antennas, broadband cancellation of 22 dB and 25 dB within a 450 MHz bandwidth is achieved at 900 MHz and 2.4 GHz, respectively. In addition, orthogonal frequency division multiplexing signals are employed to show in-band full-duplex transmission with good performance by the DML-based OSIC system, with successful suppression of self-interference and recovery of the signal of interest.

  20. An EGR performance evaluation and decision-making approach based on grey theory and grey entropy analysis.

    Science.gov (United States)

    Zu, Xianghuan; Yang, Chuanlei; Wang, Hechun; Wang, Yinyan

    2018-01-01

    Exhaust gas recirculation (EGR) is one of the main methods of reducing NOX emissions and has been widely used in marine diesel engines. This paper proposes an optimized comprehensive assessment method based on multi-objective grey situation decision theory, grey relation theory and grey entropy analysis to evaluate EGR performance and determine the optimal EGR rate, tasks which currently lack clear theoretical guidance. First, multi-objective grey situation decision theory is used to establish the initial decision-making model according to the main EGR parameters. The optimal compromise between diesel engine combustion and emission performance is transformed into a decision-making target weight problem. After establishing the initial model and considering the characteristics of EGR under different conditions, an optimized target weight algorithm based on grey relation theory and grey entropy analysis is applied to generate the comprehensive evaluation and decision-making model. Finally, the proposed method is successfully applied to a TBD234V12 turbocharged diesel engine, and the results clearly illustrate the feasibility of the proposed method for providing theoretical support and a reference for further EGR optimization.
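
    As a loose illustration of how grey relational coefficients and entropy-based weights can be combined into a single grade, consider the sketch below; the candidate EGR rates, criteria and all numbers are invented, and the formulation is a generic one rather than the paper's exact decision model.

```python
# Generic grey-relational + entropy-weight scoring sketch; the candidate rows
# and criteria columns are purely illustrative, not the engine's data.
import numpy as np

# rows: candidate EGR rates, columns: criteria (e.g. NOx, BSFC, smoke), all
# already oriented so that larger is better
data = np.array([[0.80, 0.62, 0.70],
                 [0.95, 0.55, 0.65],
                 [0.70, 0.75, 0.85],
                 [0.60, 0.90, 0.60]])

# min-max normalisation per criterion
x = (data - data.min(0)) / (data.max(0) - data.min(0))

# entropy weights: criteria whose values differ more across candidates get more weight
p = x / x.sum(0)
p = np.where(p > 0, p, 1.0)                 # log(1) = 0, so zero shares contribute nothing
e = -(p * np.log(p)).sum(0) / np.log(len(x))
w = (1 - e) / (1 - e).sum()

# grey relational coefficients against the ideal (all-ones) reference series
rho = 0.5                                   # distinguishing coefficient
delta = np.abs(1.0 - x)
xi = (delta.min() + rho * delta.max()) / (delta + rho * delta.max())

grade = xi @ w                              # weighted grey relational grade
print("entropy weights:", np.round(w, 3))
print("grades per EGR candidate:", np.round(grade, 3))
```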

  1. Performance Analysis of SAC Optical PPM-CDMA System-Based Interference Rejection Technique

    Science.gov (United States)

    Alsowaidi, N.; Eltaif, Tawfig; Mokhtar, M. R.

    2016-03-01

    In this paper, we theoretically analyse an optical code division multiple access (OCDMA) system based on successive interference cancellation (SIC) using pulse position modulation (PPM), considering the interference between users, the imperfect cancellation that occurs during the cancellation process, and receiver noises. A spectral amplitude coding (SAC) scheme is used to suppress the overlap between users and reduce the effect of receiver noises. The theoretical analysis of the multiple access interference (MAI)-limited performance of this approach indicates the influence of the size of M-ary PPM on the OCDMA system. The OCDMA system performance improves with increasing M-ary PPM. It was found that the SIC/SAC-OCDMA system using the PPM technique, along with modified prime (MPR) codes used as the signature sequence code, offers significant improvement over the one without cancellation and can support up to 103 users at the benchmarking value of bit error rate (BER) = 10^-9 with prime number p = 11, while the system without the cancellation scheme can support only up to 52 users.

  2. Energetic and exergetic performance analysis of CdS/CdTe based photovoltaic technology in real operating conditions of composite climate

    International Nuclear Information System (INIS)

    Rawat, Rahul; Kaushik, S.C.; Sastry, O.S.; Singh, Y.K.; Bora, B.

    2016-01-01

    Highlights: • Performance analysis of a 3.2 kWp CdTe PV system has been carried out. • Performance ratio and thermodynamic efficiencies have been determined. • Electrical parameters have been assessed at alternative reporting conditions. • Exergy analysis of the system has been carried out. • The degradation rate of the system has been calculated to be 0.18%/year. - Abstract: The solar photovoltaic (PV) technology market has increased rapidly with the continuously increasing electricity demand and climate concerns. With the increased PV installed capacity, the performance analysis of these systems has become critically important in order to ensure their reliable operation over a long lifetime and monetary payback, and to identify the scope for improvement. The well-established conventional energetic performance analysis techniques are quantitative approaches based on energy conservation, while exergetic performance analysis is a qualitative method based on the second law of thermodynamics. In this paper, an approach for energetic and exergetic performance analysis has been developed to determine the long-term performance of PV systems in real operating conditions. A methodology has been proposed for utilizing long-term time series outdoor data in order to assess the performance of a PV system statistically and to determine the system degradation rate. The degradation rate of the 3.2 kWp CdTe PV system is found to be 0.18% per year after 23 months of operation in a composite climate, which is lower than the reported degradation rate of earlier CdTe technology. The average performance ratio (PR), energetic efficiency and exergetic efficiency of the system are found to be 0.89, 9.84% and 10.62%, respectively. The average exergetic efficiency is increased by 12% by utilizing the recoverable thermal exergy loss in a photovoltaic-thermal system. The instantaneous PR is found to be in the range of 0.84–0.95 for 93.5% of the per-minute data. Additionally, the
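
    For readers unfamiliar with the performance ratio, the following back-of-the-envelope sketch computes PR and array efficiency from assumed energy and irradiation figures; the numbers are placeholders chosen only to land near the reported ranges, not the plant's measurements.

```python
# Back-of-the-envelope performance-ratio and array-efficiency calculation for a
# grid-tied PV system; the energy and irradiation figures are invented
# placeholders, not the 3.2 kWp plant's measurements.
P_STC_kWp = 3.2            # rated DC power at standard test conditions
G_STC = 1.0                # kW/m^2 reference irradiance
array_area_m2 = 26.0       # assumed module area

e_ac_kwh = 412.0           # measured AC energy over the period (assumed)
h_poa_kwh_m2 = 145.0       # in-plane irradiation over the same period (assumed)

final_yield = e_ac_kwh / P_STC_kWp           # kWh per kWp
reference_yield = h_poa_kwh_m2 / G_STC       # equivalent full-sun hours
performance_ratio = final_yield / reference_yield
array_efficiency = e_ac_kwh / (h_poa_kwh_m2 * array_area_m2)

print(f"PR = {performance_ratio:.2f}, array efficiency = {array_efficiency:.1%}")
```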

  3. Performance analysis of ventilation systems with desiccant wheel cooling based on exergy destruction

    International Nuclear Information System (INIS)

    Tu, Rang; Liu, Xiao-Hua; Hwang, Yunho; Ma, Fei

    2016-01-01

    Highlights: • Ventilation systems with a desiccant wheel were analyzed from the perspective of exergy destruction. • The main factors influencing the performance of ventilation systems are put forward. • Improved ventilation systems with lower exergy destruction are suggested. • The performance of heat pump driven ventilation systems is greatly increased. - Abstract: This paper investigates the performance of ventilation systems with desiccant wheel cooling from the perspective of exergy destruction. The inherent factors influencing the exergy destruction of heat and mass transfer and of heat sources provide guidelines for efficient system design. First, the performance of a basic ventilation system, operated at a high regeneration temperature and a low coefficient of performance (COP), is simulated. Exergy analysis of the basic ventilation system then shows that exergy destruction mainly occurs in the heat and mass transfer components and the heat source. The inherent influencing factors for the heat and mass transfer exergy destruction are the heat and mass transfer capacities, which are related to over-dehumidification of the desiccant wheel, and the unmatched coefficients, which represent the uniformity of the temperature or humidity ratio difference fields in the heat and mass transfer components. Based on these findings, two improved ventilation systems are suggested. For the first system, over-dehumidification is avoided and the unmatched coefficients for each component are reduced. With lower heat and mass transfer exergy destruction and a lower regeneration temperature, the COP and exergy efficiency of the first system are increased compared with the basic ventilation system. For the second system, a heat pump, which recovers heat from the process air to heat the regeneration air, is adopted to replace the electrical heater and cooling devices. The exergy destruction of the heat pump is considerably reduced as compared with the heat source exergy destruction of the basic ventilation

  4. Building America House Performance Analysis Procedures

    Energy Technology Data Exchange (ETDEWEB)

    Hendron, R.; Farrar-Nagy, S.; Anderson, R.; Judkoff, R.

    2001-10-29

    As the Building America Program has grown to include a large and diverse cross section of the home building industry, accurate and consistent analysis techniques have become more important to help all program partners as they perform design tradeoffs and calculate energy savings for prototype houses built as part of the program. This document illustrates some of the analysis concepts proven effective and reliable for analyzing the transient energy usage of advanced energy systems as well as entire houses. The analysis procedure described here provides a starting point for calculating energy savings of a prototype house relative to two base cases: builder standard practice and regional standard practice. It also provides a building simulation analysis to calculate annual energy savings based on side-by-side short-term field testing of a prototype house.

  5. Long term energy performance analysis of Egbin thermal power ...

    African Journals Online (AJOL)

    This study is aimed at providing an energy performance analysis of the Egbin thermal power plant. The plant operates on a regenerative Rankine cycle with steam as its working fluid. The model equations were formulated based on some performance parameters used in power plant analysis. The considered criteria were plant ...

  6. Performance Based Plastic Design of Concentrically Braced Frame attuned with Indian Standard code and its Seismic Performance Evaluation

    Directory of Open Access Journals (Sweden)

    Sejal Purvang Dalal

    2015-12-01

    Full Text Available In the Performance-Based Plastic Design method, the failure mechanism is predetermined, which has made it popular throughout the world. However, due to the lack of proper guidelines and a simple stepwise methodology, it is not widely used in India. In this paper, a stepwise design procedure for Performance-Based Plastic Design of a concentrically braced frame attuned with the Indian Standard code is presented. A comparative seismic performance evaluation of a six-storey concentrically braced frame designed using the displacement-based Performance-Based Plastic Design (PBPD) method and the currently used force-based Limit State Design (LSD) method has also been carried out by nonlinear static pushover analysis and time history analysis under three different ground motions. Results show that the Performance-Based Plastic Design method is superior to the current design in terms of displacement and acceleration response. Also, total collapse of the frame is prevented in the PBPD frame.

  7. Modeling and performance analysis of movement-based group location management using RFID sensing in public transportation systems.

    Science.gov (United States)

    Chung, Yun Won

    2012-11-22

    Location management, which consists of location registration and paging, is essential to provide mobile communication services to mobile stations (MSs). Since MSs riding on a public transportation system (TS) generate significant location registration signaling loads simultaneously when the TS with riding MSs moves between location areas (LAs), group location management was proposed. Under group location management, an MS performs group registration when it gets on a TS and group deregistration when it gets off. Then, only the TS updates its current location when it changes LAs, on behalf of all riding MSs. In this paper, movement-based group location management using radio frequency identification (RFID) is proposed, where the boarding and alighting of MSs are detected using RFID and a location update of the TS, on behalf of all riding MSs, is carried out only if the number of cells crossed since the last updated cell exceeds a predefined movement threshold. We then develop an analytical model for the performance analysis of movement-based group location management and analyze the effects of various parameters on the performance. The results show that movement-based group location management reduces the signaling cost compared with movement-based individual location management, and that optimal performance can be achieved by choosing appropriate movement threshold values.
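
    A simplified sketch of the movement-based group update logic is given below; the class structure, threshold value and RFID-triggered boarding events are illustrative assumptions rather than the paper's analytical model.

```python
# Simplified sketch of movement-based group registration: the transportation
# system (TS) updates its location on behalf of all riding MSs only after
# crossing `movement_threshold` cells. Cell IDs, threshold and the
# RFID-triggered boarding events are illustrative assumptions.
class TransportationSystem:
    def __init__(self, movement_threshold):
        self.threshold = movement_threshold
        self.riding_ms = set()          # MSs detected on board via RFID
        self.last_registered_cell = None
        self.crossed_cells = 0

    def board(self, ms_id):             # RFID read at the door -> group registration
        self.riding_ms.add(ms_id)

    def alight(self, ms_id):            # RFID read on exit -> group deregistration
        self.riding_ms.discard(ms_id)

    def enter_cell(self, cell_id):
        if self.last_registered_cell is None:
            self.last_registered_cell = cell_id
            return
        self.crossed_cells += 1
        if self.crossed_cells > self.threshold:   # one update covers all riders
            print(f"group location update at cell {cell_id} "
                  f"for {len(self.riding_ms)} MSs")
            self.last_registered_cell = cell_id
            self.crossed_cells = 0

ts = TransportationSystem(movement_threshold=3)
for ms in ("ms-1", "ms-2", "ms-3"):
    ts.board(ms)
for cell in range(10):                  # the bus crosses ten cells
    ts.enter_cell(cell)
```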

  8. Analysis of concept and application of Risk-Informed Performance-Based Regulation (RI-PBR)

    International Nuclear Information System (INIS)

    Kim, W. S.; Sung, K. Y.; Lee, C. J.; Kim, H. J.

    2002-01-01

    For improving the regulation of nuclear power plants, the USNRC is adopting Risk-Informed Performance-Based Regulation (RI-PBR) as an alternative, in parallel with implementing the current deterministic regulation. This paper introduces a research plan for the 'Institutionalization of RI-PBR' that is being conducted by KINS as a national project for evaluating the feasibility of applying the alternative to the Korean regulation system. Analysis of regulation characteristics, case studies and experience with RI-PBR are presented as interim research results. In addition, the future plan for developing an RI-PBR concept understandable to the public and for evaluating the level of techniques needed to implement RI-PBR is introduced.

  9. Cost and performance analysis of conceptual designs of physical protection systems

    International Nuclear Information System (INIS)

    Hicks, M.J.; Snell, M.S.; Sandoval, J.S.; Potter, C.S.

    1998-01-01

    CPA -- Cost and Performance Analysis -- is a methodology that joins Activity Based Cost (ABC) estimation with performance-based analysis of physical protection systems. CPA offers system managers an approach that supports both tactical decision making and strategic planning. Current exploratory applications of the CPA methodology are addressing the analysis of alternative conceptual designs. To support these activities, the original architecture for CPA is being expanded to incorporate results from a suite of performance and consequence analysis tools such as JTS (Joint Tactical Simulation), ERAD (Explosive Release Atmospheric Dispersion) and blast effect models. The process flow for applying CPA to the development and analysis of conceptual designs is illustrated graphically.

  10. Performance Analysis of 3D Massive MIMO Cellular Systems with Collaborative Base Station

    Directory of Open Access Journals (Sweden)

    Xingwang Li

    2014-01-01

    Full Text Available Massive MIMO systems have drawn considerable attention as they enable significant capacity and coverage improvements in wireless cellular networks. However, pilot contamination is a great challenge in massive MIMO systems. Under this circumstance, cooperation and three-dimensional (3D) MIMO are emerging technologies to eliminate the pilot contamination and to enhance the performance relative to traditional interference-limited implementations. Motivated by this, we investigate the achievable uplink sum rate performance of MIMO systems employing cooperative base stations (BSs) and 3D MIMO. In our model, we consider the effects of both large-scale and small-scale fading, as well as the spatial correlation and the indoor-to-outdoor high-rise propagation environment. In particular, we investigate a cooperative communication model based on 3D MIMO and propose a closed-form lower bound on the sum rate. Utilizing this bound, we pursue a "large-system" analysis and provide the asymptotic expression when the number of antennas at the BS grows large, and when the numbers of antennas at the transceivers grow large with a fixed ratio. We demonstrate that the lower bound is very tight and becomes exact in the massive MIMO system limit. Finally, under the sum rate maximization condition, we derive the optimal number of UTs to be served.
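
    As a rough numerical companion, the Monte-Carlo sketch below evaluates the classical uplink sum rate log2 det(I + SNR H^H H) under i.i.d. Rayleigh fading for growing BS antenna counts; the channel model, SNR and dimensions are assumptions and do not reproduce the paper's 3D correlated-channel bound.

```python
# Monte-Carlo sketch of the uplink sum rate log2 det(I + snr * H^H H) for K
# single-antenna users and an M-antenna BS under i.i.d. Rayleigh fading; the
# antenna counts, SNR and channel model are illustrative only.
import numpy as np

rng = np.random.default_rng(7)
K, snr_linear, trials = 8, 10.0, 200

def sum_rate(M):
    rates = []
    for _ in range(trials):
        H = (rng.standard_normal((M, K)) + 1j * rng.standard_normal((M, K))) / np.sqrt(2)
        _, logdet = np.linalg.slogdet(np.eye(K) + snr_linear * (H.conj().T @ H))
        rates.append(logdet / np.log(2))    # convert natural log to bits
    return np.mean(rates)

for M in (16, 64, 256):                     # sum rate grows with BS antenna count
    print(f"M = {M:4d}: average uplink sum rate = {sum_rate(M):6.1f} bit/s/Hz")
```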

  11. Space Launch System Base Heating Test: Sub-Scale Rocket Engine/Motor Design, Development and Performance Analysis

    Science.gov (United States)

    Mehta, Manish; Seaford, Mark; Kovarik, Brian; Dufrene, Aaron; Solly, Nathan; Kirchner, Robert; Engel, Carl D.

    2014-01-01

    The Space Launch System (SLS) base heating test is broken down into two test programs: (1) Pathfinder and (2) Main Test. The Pathfinder Test Program focuses on the design, development, hot-fire test and performance analyses of the 2% sub-scale SLS core-stage and booster element propulsion systems. The core-stage propulsion system is composed of four gaseous oxygen/hydrogen RS-25D model engines and the booster element is composed of two aluminum-based model solid rocket motors (SRMs). The first section of the paper discusses the motivation and test facility specifications for the test program. The second section briefly investigates the internal flow path of the design. The third section briefly shows the performance of the model RS-25D engines and SRMs for the conducted short duration hot-fire tests. Good agreement is observed based on design prediction analysis and test data. This program is a challenging research and development effort that has not been attempted in 40+ years for a NASA vehicle.

  12. Performance optimisations for distributed analysis in ALICE

    International Nuclear Information System (INIS)

    Betev, L; Gheata, A; Grigoras, C; Hristov, P; Gheata, M

    2014-01-01

    Performance is a critical issue in a production system accommodating hundreds of analysis users. Compared to a local session, distributed analysis is exposed to services and network latencies, remote data access and heterogeneous computing infrastructure, creating a more complex performance and efficiency optimization matrix. During the last 2 years, ALICE analysis shifted from a fast development phase to more mature and stable code. At the same time, the frameworks and tools for deployment, monitoring and management of large productions have evolved considerably too. The ALICE Grid production system is currently used by a fair share of organized and individual user analysis, consuming up to 30% of the available resources and ranging from fully I/O-bound analysis code to CPU-intensive correlation or resonance studies. While the intrinsic analysis performance is unlikely to improve by a large factor during the LHC long shutdown (LS1), the overall efficiency of the system still has to be improved by an important factor to satisfy the analysis needs. We have instrumented all analysis jobs with "sensors" collecting comprehensive monitoring information on the job running conditions and performance in order to identify bottlenecks in the data processing flow. These data are collected by the MonALISA-based ALICE Grid monitoring system and are used to steer and improve the job submission and management policy, to identify operational problems in real time and to perform automatic corrective actions. In parallel with an upgrade of our production system, we are aiming for low-level improvements related to data format, data management and merging of results to allow for a better performing ALICE analysis.

  13. submitter Simulation-Based Performance Analysis of the ALICE Mass Storage System

    CERN Document Server

    Vickovic, L; Celar, S

    2016-01-01

    CERN, the European Organization for Nuclear Research, is today, in the era of big data, one of the biggest data generators in the world. Especially interesting is the transient data storage system of the ALICE experiment. With the goal of optimizing its performance, this paper discusses a dynamic, discrete-event simulation model of a disk-based Storage Area Network (SAN) and its usage for performance analyses. The storage system model is based on a modular, bottom-up approach, and the differences between measured and simulated values vary between 1.5% and 4% depending on the simulated component. Once finished, the simulation model was used for detailed performance analyses. Among other findings, it showed that system performance can be seriously affected if the array stripe size is larger than the size of the cache on the individual disks in the array, a factor which so far has been completely ignored in the literature.

  14. Hand-Based Biometric Analysis

    Science.gov (United States)

    Bebis, George (Inventor); Amayeh, Gholamreza (Inventor)

    2015-01-01

    Hand-based biometric analysis systems and techniques are described which provide robust hand-based identification and verification. An image of a hand is obtained, which is then segmented into a palm region and separate finger regions. Acquisition of the image is performed without requiring particular orientation or placement restrictions. Segmentation is performed without the use of reference points on the images. Each segment is analyzed by calculating a set of Zernike moment descriptors for the segment. The feature parameters thus obtained are then fused and compared to stored sets of descriptors in enrollment templates to arrive at an identity decision. By using Zernike moments, and through additional manipulation, the biometric analysis is invariant to rotation, scale, or translation of an input image. Additionally, the analysis reuses commonly seen terms in the Zernike calculations to achieve additional efficiencies over traditional Zernike moment calculation.

  15. Performance Based Logistics... What’s Stopping Us

    Science.gov (United States)

    2016-03-01

    performance-based life cycle product support, where outcomes are acquired through performance-based arrangements that deliver Warfighter requirements and...correlates to the acquisition life cycle framework: spend the time and effort to identify and lock in the PBL requirements; conduct an analysis to...studies and reports on PBLs over the past 15 or more years. Much like fashion styles, the affinity for PBLs has ebbed and flowed during this time

  16. Performance analysis of a model-sized superconducting DC transmission system based VSC-HVDC transmission technologies using RTDS

    International Nuclear Information System (INIS)

    Dinh, Minh-Chau; Ju, Chang-Hyeon; Kim, Sung-Kyu; Kim, Jin-Geun; Park, Minwon; Yu, In-Keun

    2012-01-01

    The combination of a high temperature superconducting DC power cable and a voltage source converter based HVDC (VSC-HVDC) creates a new option for transmitting power with multiple collection and distribution points for long-distance and bulk power transmission. It offers great advantages compared with HVAC or conventional HVDC transmission systems, and it is well suited for the grid integration of renewable energy sources in existing distribution or transmission systems. For this reason, a superconducting DC transmission system based on HVDC transmission technologies is planned to be set up in the Jeju power system, Korea. Before applying this system to a real power system on Jeju Island, a system analysis should be performed through a real-time test. In this paper, a model-sized superconducting VSC-HVDC system, which consists of a small model-sized VSC-HVDC connected to a 2 m YBCO HTS DC model cable, is implemented. The authors apply a real-time simulation method that incorporates the model-sized superconducting VSC-HVDC system into the simulated Jeju power system using a Real Time Digital Simulator (RTDS). The performance analysis of the superconducting VSC-HVDC system has been verified on the proposed test platform and the results are discussed in detail.

  17. Performance analysis of a model-sized superconducting DC transmission system based VSC-HVDC transmission technologies using RTDS

    Energy Technology Data Exchange (ETDEWEB)

    Dinh, Minh-Chau, E-mail: thanchau7787@gmail.com [Changwon National University, 9 Sarim-Dong, Changwon 641-733 (Korea, Republic of); Ju, Chang-Hyeon; Kim, Sung-Kyu; Kim, Jin-Geun; Park, Minwon [Changwon National University, 9 Sarim-Dong, Changwon 641-733 (Korea, Republic of); Yu, In-Keun, E-mail: yuik@changwon.ac.kr [Changwon National University, 9 Sarim-Dong, Changwon 641-733 (Korea, Republic of)

    2012-08-15

    The combination of a high temperature superconducting DC power cable and a voltage source converter based HVDC (VSC-HVDC) creates a new option for transmitting power with multiple collection and distribution points for long-distance and bulk power transmission. It offers great advantages compared with HVAC or conventional HVDC transmission systems, and it is well suited for the grid integration of renewable energy sources in existing distribution or transmission systems. For this reason, a superconducting DC transmission system based on HVDC transmission technologies is planned to be set up in the Jeju power system, Korea. Before applying this system to a real power system on Jeju Island, a system analysis should be performed through a real-time test. In this paper, a model-sized superconducting VSC-HVDC system, which consists of a small model-sized VSC-HVDC connected to a 2 m YBCO HTS DC model cable, is implemented. The authors apply a real-time simulation method that incorporates the model-sized superconducting VSC-HVDC system into the simulated Jeju power system using a Real Time Digital Simulator (RTDS). The performance analysis of the superconducting VSC-HVDC system has been verified on the proposed test platform and the results are discussed in detail.

  18. Monitoring the metering performance of an electronic voltage transformer on-line based on cyber-physics correlation analysis

    Science.gov (United States)

    Zhang, Zhu; Li, Hongbin; Tang, Dengping; Hu, Chen; Jiao, Yang

    2017-10-01

    Metering performance is the key parameter of an electronic voltage transformer (EVT), and it requires high accuracy. The conventional off-line calibration method using a standard voltage transformer is not suitable for this key equipment in a smart substation, which needs on-line monitoring. In this article, we propose a method for monitoring the metering performance of an EVT on-line based on cyber-physics correlation analysis. Exploiting the electrical and physical properties of a substation running in three-phase symmetry, principal component analysis is used to separate the metering deviation caused by primary-side fluctuations from that caused by an EVT anomaly. The characteristic statistics of the measured data during operation are extracted, and the metering performance of the EVT is evaluated by analyzing the change in these statistics. The experimental results show that the method accurately monitors the metering deviation of a Class 0.2 EVT. The method demonstrates accurate on-line monitoring of the metering performance of an EVT without a standard voltage transformer.

  19. Monitoring the metering performance of an electronic voltage transformer on-line based on cyber-physics correlation analysis

    International Nuclear Information System (INIS)

    Zhang, Zhu; Li, Hongbin; Hu, Chen; Jiao, Yang; Tang, Dengping

    2017-01-01

    Metering performance is the key parameter of an electronic voltage transformer (EVT), and it requires high accuracy. The conventional off-line calibration method using a standard voltage transformer is not suitable for this key equipment in a smart substation, which needs on-line monitoring. In this article, we propose a method for monitoring the metering performance of an EVT on-line based on cyber-physics correlation analysis. Exploiting the electrical and physical properties of a substation running in three-phase symmetry, principal component analysis is used to separate the metering deviation caused by primary-side fluctuations from that caused by an EVT anomaly. The characteristic statistics of the measured data during operation are extracted, and the metering performance of the EVT is evaluated by analyzing the change in these statistics. The experimental results show that the method accurately monitors the metering deviation of a Class 0.2 EVT. The method demonstrates accurate on-line monitoring of the metering performance of an EVT without a standard voltage transformer. (paper)

  20. Science-Based Simulation Model of Human Performance for Human Reliability Analysis

    International Nuclear Information System (INIS)

    Kelly, Dana L.; Boring, Ronald L.; Mosleh, Ali; Smidts, Carol

    2011-01-01

    Human reliability analysis (HRA), a component of an integrated probabilistic risk assessment (PRA), is the means by which the human contribution to risk is assessed, both qualitatively and quantitatively. However, among the literally dozens of HRA methods that have been developed, most cannot fully model and quantify the types of errors that occurred at Three Mile Island. Furthermore, all of the methods lack a solid empirical basis, relying heavily on expert judgment or empirical results derived in non-reactor domains. Finally, all of the methods are essentially static, and are thus unable to capture the dynamics of an accident in progress. The objective of this work is to begin exploring a dynamic simulation approach to HRA, one whose models have a basis in psychological theories of human performance, and whose quantitative estimates have an empirical basis. This paper highlights a plan to formalize collaboration among the Idaho National Laboratory (INL), the University of Maryland, and The Ohio State University (OSU) to continue development of a simulation model initially formulated at the University of Maryland. Initial work will focus on enhancing the underlying human performance models with the most recent psychological research, and on planning follow-on studies to establish an empirical basis for the model, based on simulator experiments to be carried out at the INL and at the OSU.

  1. Performance-Based Technology Selection Filter description report

    International Nuclear Information System (INIS)

    O'Brien, M.C.; Morrison, J.L.; Morneau, R.A.; Rudin, M.J.; Richardson, J.G.

    1992-05-01

    A formal methodology has been developed for identifying technology gaps and assessing innovative or postulated technologies for inclusion in proposed Buried Waste Integrated Demonstration (BWID) remediation systems. Called the Performance-Based Technology Selection Filter, the methodology provides a formalized selection process where technologies and systems are rated and assessments made based on performance measures, and regulatory and technical requirements. The results are auditable, and can be validated with field data. This analysis methodology will be applied to the remedial action of transuranic contaminated waste pits and trenches buried at the Idaho National Engineering Laboratory (INEL)

  2. Performance-Based Technology Selection Filter description report

    Energy Technology Data Exchange (ETDEWEB)

    O'Brien, M.C.; Morrison, J.L.; Morneau, R.A.; Rudin, M.J.; Richardson, J.G.

    1992-05-01

    A formal methodology has been developed for identifying technology gaps and assessing innovative or postulated technologies for inclusion in proposed Buried Waste Integrated Demonstration (BWID) remediation systems. Called the Performance-Based Technology Selection Filter, the methodology provides a formalized selection process where technologies and systems are rated and assessments made based on performance measures, and regulatory and technical requirements. The results are auditable, and can be validated with field data. This analysis methodology will be applied to the remedial action of transuranic contaminated waste pits and trenches buried at the Idaho National Engineering Laboratory (INEL).

  3. Techno-economic performance analysis of bio-oil based Fischer-Tropsch and CHP synthesis platform

    International Nuclear Information System (INIS)

    Ng, Kok Siew; Sadhukhan, Jhuma

    2011-01-01

    The techno-economic potential of UK poplar wood and imported oil palm empty fruit bunch derived bio-oil integrated gasification and Fischer-Tropsch (BOIG-FT) systems for the generation of transportation fuels and combined heat and power (CHP) was investigated. The bio-oil was represented in terms of its main chemical constituents, i.e. acetic acid, acetol and guaiacol. The compositional model of bio-oil was validated based on its performance through a gasification process. Given the availability of large-scale gasification and FT technologies and the logistic constraints in transporting biomass in large quantities, distributed bio-oil generation using biomass pyrolysis and centralised bio-oil processing in a BOIG-FT system are technically more feasible. Heat integration heuristics and composite curve analysis were employed for once-through and full conversion configurations, and for a range of economies of scale: 1 MW, 675 MW and 1350 MW LHV of bio-oil. The economic competitiveness increases with increasing scale. A cost of production of FT liquids of 78.7 Euro/MWh was obtained based on 80.12 Euro/MWh of electricity, 75 Euro/t of bio-oil and 116.3 million Euro/y of annualised capital cost. -- Highlights: → Biomass to liquid process and gas to liquid process synthesis. → Biorefinery economic analysis. → Pyrolysis oil to biofuel. → Gasification and Fischer-Tropsch. → Process integration, pinch analysis and energy efficiency.

  4. Workplace-based assessment: raters' performance theories and constructs.

    Science.gov (United States)

    Govaerts, M J B; Van de Wiel, M W J; Schuwirth, L W T; Van der Vleuten, C P M; Muijtjens, A M M

    2013-08-01

    Weaknesses in the nature of rater judgments are generally considered to compromise the utility of workplace-based assessment (WBA). In order to gain insight into the underpinnings of rater behaviours, we investigated how raters form impressions of and make judgments on trainee performance. Using theoretical frameworks of social cognition and person perception, we explored raters' implicit performance theories, use of task-specific performance schemas and the formation of person schemas during WBA. We used think-aloud procedures and verbal protocol analysis to investigate schema-based processing by experienced (N = 18) and inexperienced (N = 16) raters (supervisor-raters in general practice residency training). Qualitative data analysis was used to explore schema content and usage. We quantitatively assessed rater idiosyncrasy in the use of performance schemas and we investigated effects of rater expertise on the use of (task-specific) performance schemas. Raters used different schemas in judging trainee performance. We developed a normative performance theory comprising seventeen inter-related performance dimensions. Levels of rater idiosyncrasy were substantial and unrelated to rater expertise. Experienced raters made significantly more use of task-specific performance schemas compared to inexperienced raters, suggesting more differentiated performance schemas in experienced raters. Most raters started to develop person schemas the moment they began to observe trainee performance. The findings further our understanding of processes underpinning judgment and decision making in WBA. Raters make and justify judgments based on personal theories and performance constructs. Raters' information processing seems to be affected by differences in rater expertise. The results of this study can help to improve rater training, the design of assessment instruments and decision making in WBA.

  5. Impact of a function-based payment model on the financial performance of acute inpatient medical rehabilitation providers: a simulation analysis.

    Science.gov (United States)

    Sutton, J P; DeJong, G; Song, H; Wilkerson, D

    1997-12-01

    To operationalize research findings about a medical rehabilitation classification and payment model by building a prototype of a prospective payment system, and to determine whether this prototype model promotes payment equity. This latter objective is accomplished by identifying whether any facility or payment model characteristics are systematically associated with financial performance. This study was conducted in two phases. In Phase 1 the components of a diagnosis-related group (DRG)-like payment system, including a base rate, function-related group (FRG) weights, and adjusters, were identified and estimated using hospital cost functions. Phase 2 consisted of a simulation analysis in which each facility's financial performance was modeled, based on its 1990-1991 case mix. A multivariate regression analysis was conducted to assess the extent to which characteristics of 42 rehabilitation facilities contribute toward determining financial performance under the present Medicare payment system as well as under the hypothetical model developed. Phase 1 (model development) included 61 rehabilitation hospitals. Approximately 59% were rehabilitation units within a general hospital and 48% were teaching facilities. The number of rehabilitation beds averaged 52. Phase 2 of the simulation analysis included 42 rehabilitation facilities, subscribers to UDS in 1990-1991. Of these, 69% were rehabilitation units and 52% were teaching facilities. The number of rehabilitation beds averaged 48. Financial performance was measured by the ratio of reimbursement to average costs. Case-mix index is the primary determinant of financial performance under the present Medicare payment system. None of the facility characteristics included in this analysis were associated with financial performance under the hypothetical FRG payment model. The most notable impact of an FRG-based payment model would be to create a stronger link between resource intensity and level of reimbursement.

  6. Performance analysis of InSb based QWFET for ultra high speed applications

    International Nuclear Information System (INIS)

    Subash, T. D.; Gnanasekaran, T.; Divya, C.

    2015-01-01

    An indium antimonide based QWFET (quantum well field effect transistor) with a gate length down to 50 nm has been designed and investigated for the first time for L-band radar applications at 230 GHz. The QWFETs are designed to meet the high-performance-node drive current requirements of the International Technology Roadmap for Semiconductors (ITRS) (Semiconductor Industry Association 2010). The performance of the device is investigated using Synopsys TCAD software. The InSb based QWFET could be a promising device technology for very low power and ultra-high speed performance, with 5–10 times lower DC power dissipation. (semiconductor devices)

  7. AHP-based risk analysis of energy performance contracting projects in Russia

    International Nuclear Information System (INIS)

    Garbuzova-Schlifter, Maria; Madlener, Reinhard

    2016-01-01

    Understanding and properly managing risks that could potentially affect the target- and performance-based profits of energy performance contracting (EPC) projects is essential. It is particularly important for the establishment and success of energy service companies (ESCOs) acting in the vulnerable environment of the vast but highly energy-inefficient Russian market. This study systematically explores common risk factors and causes of risk associated with EPC projects executed in three Russian sectors: (1) industrial; (2) housing and communal services; and (3) public. Several interviews with Russian EPC experts were conducted, together with a qualitative risk assessment using an analytic hierarchy process (AHP) approach. The data were obtained from a web-based questionnaire survey conducted among Russian EPC project executors. For each focus sector, a specific preference-based ranking of the identified risk factors and causes of risk was derived. The AHP results show that causes of risk related to financial and regulatory aspects contribute most to the riskiness of EPC projects performed in all three focus sectors in Russia, calling for the special attention of EPC policy- and business-makers. Due to sectoral particularities and the different actors involved, we conclude that there is a need for the elaboration of sector-specific contractual schemes for EPC projects. - Highlights: • AHP- and survey-based study of energy performance contracting (EPC) projects in Russia. • Main risk factors and causes of risk associated with EPC projects are investigated. • In practice, there is a lack of a feasible risk management approach in EPC projects. • Regulatory and financial risks contribute most to the EPC projects' riskiness. • Elaboration of sector-specific EPC project contractual schemes is required.
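
    For context on the AHP step, the sketch below derives priority weights from a pairwise comparison matrix via its principal eigenvector and checks the consistency ratio; the 3x3 example matrix and criteria labels are invented, not the survey data of the study.

```python
# Generic AHP weighting sketch: priority weights from a pairwise comparison
# matrix via its principal eigenvector, plus a consistency check. The example
# matrix comparing financial, regulatory and technical risk is invented.
import numpy as np

# A[i, j] = how much more important criterion i is than j (Saaty's 1-9 scale)
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 3.0],
              [1/5, 1/3, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                       # principal eigenvalue
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)              # consistency index
ri = 0.58                                         # random index for n = 3
print("weights (financial, regulatory, technical):", np.round(weights, 3))
print(f"consistency ratio = {ci / ri:.3f} (acceptable below 0.1)")
```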

  8. Exploratory analysis of normative performance on the UCSD Performance-Based Skills Assessment-Brief.

    Science.gov (United States)

    Vella, Lea; Patterson, Thomas L; Harvey, Philip D; McClure, Margaret McNamara; Mausbach, Brent T; Taylor, Michael J; Twamley, Elizabeth W

    2017-10-01

    The UCSD Performance-Based Skills Assessment (UPSA) is a performance-based measure of functional capacity. The brief, two-domain (finance and communication ability) version of the assessment (UPSA-B) is now widely used in both clinical research and treatment trials. To date, research has not examined possible demographic-UPSA-B relationships within a non-psychiatric population. We aimed to produce and describe preliminary normative scores for the UPSA-B over a full range of ages and educational attainment. The finance and communication subscales of the UPSA were administered to 190 healthy participants in the context of three separate studies. These data were combined to examine the effects of age, sex, and educational attainment on the UPSA-B domain and total scores. Fractional polynomial regression was used to compute demographically-corrected T-scores for the UPSA-B total score, and percentile rank conversion was used for the two subscales. Age and education both had significant non-linear effects on the UPSA-B total score. The finance subscale was significantly related to both gender and years of education, whereas the communication subscale was not significantly related to any of the demographic characteristics. Demographically corrected T-scores and percentile ranks for UPSA-B scores are now available for use in clinical research. Published by Elsevier B.V.

  9. Design and performance analysis of gas and liquid radial turbines

    Science.gov (United States)

    Tan, Xu

    In the first part of the research, pumps running in reverse as turbines are studied. This work uses experimental data for a wide range of pumps representing typical centrifugal pump configurations in terms of specific speed. Based on specific speed and specific diameter, an accurate correlation is developed to predict the performance at the best efficiency point of a centrifugal pump in its turbine mode of operation. The proposed prediction method yields very good results compared to previous such attempts. The present method is compared to nine previous methods found in the literature. The comparison results show that the method proposed in this paper is the most accurate. The proposed method can be further complemented and supplemented by future tests to increase its accuracy. The proposed method is meaningful because it is based on both specific speed and specific diameter. The second part of the research is focused on the design and analysis of a radial gas turbine. The specification of the turbine is obtained from a solar biogas hybrid system. The system is theoretically analyzed and constructed based on the purchased compressor. Theoretical analysis results in a specification of 100 lb/min mass flow, 900 °C inlet total temperature and 1.575 atm inlet total pressure. The 1-D and 3-D geometry of the rotor is generated based on Aungier's method. A 1-D loss model analysis and 3-D CFD simulations are performed to examine the performance of the rotor. The total-to-total efficiency of the rotor is more than 90%. With the help of the CFD analysis, modifications to the preliminary design yielded optimized aerodynamic performance. Finally, a theoretical performance analysis of the hybrid system is performed with the designed turbine.
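
    The correlation itself is not reproduced in this record, but the two dimensionless groups it is built on can be computed as follows; the best-efficiency-point data in the snippet are illustrative placeholders, not values from the study.

```python
import math

def specific_speed(omega_rad_s, q_m3_s, g_h_m2_s2):
    """Dimensionless specific speed ns = omega * sqrt(Q) / (gH)^0.75."""
    return omega_rad_s * math.sqrt(q_m3_s) / g_h_m2_s2 ** 0.75

def specific_diameter(d_m, q_m3_s, g_h_m2_s2):
    """Dimensionless specific diameter ds = D * (gH)^0.25 / sqrt(Q)."""
    return d_m * g_h_m2_s2 ** 0.25 / math.sqrt(q_m3_s)

# Illustrative best-efficiency-point data for a pump running as a turbine.
g = 9.81
omega = 1450 * 2 * math.pi / 60      # 1450 rpm expressed in rad/s
q, head, d = 0.05, 20.0, 0.25        # flow (m^3/s), head (m), impeller diameter (m)

ns = specific_speed(omega, q, g * head)
ds = specific_diameter(d, q, g * head)
print(f"ns = {ns:.3f}, ds = {ds:.3f}")
```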

  10. The Blame Game: Performance Analysis of Speaker Diarization System Components

    NARCIS (Netherlands)

    Huijbregts, M.A.H.; Wooters, Chuck

    2007-01-01

    In this paper we discuss the performance analysis of a speaker diarization system similar to the system that was submitted by ICSI at the NIST RT06s evaluation benchmark. The analysis, which is based on a series of oracle experiments, provides a good understanding of the performance of each system

  11. A virtualized software based on the NVIDIA cuFFT library for image denoising: performance analysis

    DEFF Research Database (Denmark)

    Galletti, Ardelio; Marcellino, Livia; Montella, Raffaele

    2017-01-01

    Generic Virtualization Service (GVirtuS) is a new solution for enabling GPGPU on Virtual Machines or low powered devices. This paper focuses on the performance analysis that can be obtained using GPGPU virtualized software. Recently, GVirtuS has been extended in order to support CUDA...... ancillary libraries with good results. Here, our aim is to analyze the applicability of this powerful tool to a real problem, which uses the NVIDIA cuFFT library. As case study we consider a simple denoising algorithm, implementing a virtualized GPU-parallel software based on the convolution theorem...
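
    As an illustration of the convolution-theorem denoising named as the case study, the sketch below applies a Gaussian low-pass filter by multiplying spectra; NumPy's FFT on the CPU stands in here for the NVIDIA cuFFT calls routed through GVirtuS, and the image and kernel parameters are assumptions.

```python
import numpy as np

def gaussian_kernel(shape, sigma):
    """Centered 2-D Gaussian low-pass kernel, normalized to unit sum."""
    y, x = np.indices(shape)
    cy, cx = (shape[0] - 1) / 2.0, (shape[1] - 1) / 2.0
    k = np.exp(-((y - cy) ** 2 + (x - cx) ** 2) / (2.0 * sigma ** 2))
    return k / k.sum()

def fft_denoise(image, sigma=2.0):
    """Smooth an image by multiplying spectra (convolution theorem)."""
    kernel = gaussian_kernel(image.shape, sigma)
    img_f = np.fft.fft2(image)
    ker_f = np.fft.fft2(np.fft.ifftshift(kernel))  # move kernel center to the origin
    return np.real(np.fft.ifft2(img_f * ker_f))

# Illustrative noisy test image: a bright square with additive Gaussian noise.
rng = np.random.default_rng(0)
clean = np.zeros((256, 256))
clean[96:160, 96:160] = 1.0
noisy = clean + 0.2 * rng.standard_normal(clean.shape)
print("residual noise std:", np.std(fft_denoise(noisy) - clean).round(3))
```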

  12. A Shot Number Based Approach to Performance Analysis in Table Tennis

    Directory of Open Access Journals (Sweden)

    Tamaki Sho

    2017-01-01

    Full Text Available The current study proposes a novel approach that improves the conventional performance analysis in table tennis by introducing the concept of frequency, or the number of shots, of each shot number. The improvements over the conventional method are as follows: better accuracy in the evaluation of the skills and tactics of players, additional insights into scoring and returning skills, and ease of understanding the results with a single criterion. A performance analysis of matches played at the 2012 Summer Olympics in London was conducted using the proposed method. The results showed some effects of shot number and gender differences in table tennis. Furthermore, comparisons were made between Chinese players and players from other countries, which shed light on the skills and tactics of the Chinese players. The present findings demonstrate that the proposed method provides useful information and has some advantages over the conventional method.

  13. Parametric Analysis to Study the Influence of Aerogel-Based Renders' Components on Thermal and Mechanical Performance.

    Science.gov (United States)

    Ximenes, Sofia; Silva, Ana; Soares, António; Flores-Colen, Inês; de Brito, Jorge

    2016-05-04

    Statistical models using multiple linear regression are some of the most widely used methods to study the influence of independent variables in a given phenomenon. This study's objective is to understand the influence of the various components of aerogel-based renders on their thermal and mechanical performance, namely cement (three types), fly ash, aerial lime, silica sand, expanded clay, type of aerogel, expanded cork granules, expanded perlite, air entrainers, resins (two types), and rheological agent. The statistical analysis was performed using SPSS (Statistical Package for Social Sciences), based on 85 mortar mixes produced in the laboratory and on their values of thermal conductivity and compressive strength obtained using tests in small-scale samples. The results showed that aerial lime assumes the main role in improving the thermal conductivity of the mortars. Aerogel type, fly ash, expanded perlite and air entrainers are also relevant components for a good thermal conductivity. Expanded clay can improve the mechanical behavior and aerogel has the opposite effect.
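
    A minimal sketch of the multiple-linear-regression step is given below; the synthetic mix-design matrix, coefficient values and NumPy least-squares fit are stand-ins for the 85 laboratory mixes and the SPSS workflow described above.

```python
import numpy as np

# Synthetic stand-in for the mix-design matrix: columns could represent,
# e.g., aerial lime, fly ash, aerogel and expanded perlite contents.
rng = np.random.default_rng(1)
X = rng.uniform(0.0, 1.0, size=(85, 4))
true_coefs = np.array([-0.030, -0.012, -0.025, -0.008])   # illustrative only
thermal_conductivity = 0.06 + X @ true_coefs + 0.002 * rng.standard_normal(85)

# Ordinary least squares with an intercept term.
X1 = np.column_stack([np.ones(len(X)), X])
beta, *_ = np.linalg.lstsq(X1, thermal_conductivity, rcond=None)

# Coefficient of determination of the fitted model.
resid = thermal_conductivity - X1 @ beta
r2 = 1 - np.sum(resid ** 2) / np.sum((thermal_conductivity - thermal_conductivity.mean()) ** 2)
print("intercept and coefficients:", beta.round(4), "R^2:", round(r2, 3))
```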

  14. Parametric Analysis to Study the Influence of Aerogel-Based Renders’ Components on Thermal and Mechanical Performance

    Directory of Open Access Journals (Sweden)

    Sofia Ximenes

    2016-05-01

    Full Text Available Statistical models using multiple linear regression are some of the most widely used methods to study the influence of independent variables in a given phenomenon. This study’s objective is to understand the influence of the various components of aerogel-based renders on their thermal and mechanical performance, namely cement (three types), fly ash, aerial lime, silica sand, expanded clay, type of aerogel, expanded cork granules, expanded perlite, air entrainers, resins (two types), and rheological agent. The statistical analysis was performed using SPSS (Statistical Package for Social Sciences), based on 85 mortar mixes produced in the laboratory and on their values of thermal conductivity and compressive strength obtained using tests in small-scale samples. The results showed that aerial lime assumes the main role in improving the thermal conductivity of the mortars. Aerogel type, fly ash, expanded perlite and air entrainers are also relevant components for a good thermal conductivity. Expanded clay can improve the mechanical behavior and aerogel has the opposite effect.

  15. Risk-based plant performance indicators

    International Nuclear Information System (INIS)

    Boccio, J.L.; Azarm, M.A.; Hall, R.E.

    1991-01-01

    Tasked by the 1979 President's Commission on the Accident at Three Mile Island, the U.S. nuclear power industry has put into place a performance indicator program as one means for showing a demonstrable record of achievement. Largely through the efforts of the Institute of Nuclear Power Operations (INPO), plant performance data has, since 1983, been collected and analyzed to aid utility management in measuring their plants' performance progress. The U.S. Nuclear Regulatory Commission (NRC) has also developed a set of performance indicators. This program, conducted by NRC's Office for the Analysis and Evaluation of Operational Data (AEOD), is structured to present information on plant operational performance in a manner that could enhance the staff's ability to recognize changes in the safety performance. Both organizations recognized that performance indicators have limitations and could be subject to misinterpretation and misuse with the potential for an adverse impact on safety. This paper reports on performance indicators presently in use, e.g., unplanned automatic scrams, unplanned safety system actuation, safety system failures, etc., which are logically related to safety. But, a reliability/risk-based method for evaluating either individual indicators or an aggregated set of indicators is not yet available

  16. Quantitative comparison of performance analysis techniques for modular and generic network-on-chip

    Directory of Open Access Journals (Sweden)

    M. C. Neuenhahn

    2009-05-01

    Full Text Available NoC-specific parameters have a huge impact on the performance and implementation costs of NoC. Hence, performance and cost evaluation of these parameter-dependent NoC is crucial in different design-stages, but the requirements on performance analysis differ from stage to stage. In an early design-stage an analysis technique featuring reduced complexity and limited accuracy can be applied, whereas in subsequent design-stages more accurate techniques are required.

    In this work several performance analysis techniques at different levels of abstraction are presented and quantitatively compared. These techniques include a static performance analysis using timing-models, a Colored Petri Net-based approach, VHDL- and SystemC-based simulators and an FPGA-based emulator. Conducting NoC-experiments with NoC-sizes from 9 to 36 functional units and various traffic patterns, characteristics of these experiments concerning accuracy, complexity and effort are derived.

    The performance analysis techniques discussed here are quantitatively evaluated and finally assigned to the appropriate design-stages in an automated NoC-design-flow.

  17. Performance evaluation of existing building structure with pushover analysis

    Science.gov (United States)

    Handana, MAP; Karolina, R.; Steven

    2018-02-01

    In the management of building infrastructure, buildings commonly suffer damage during their service period for several reasons, earthquakes being among the most common. A building is planned to function for a certain service life, but during that service life it remains vulnerable to damage from various causes. Any damage should be detected as early as possible, because damage can spread, triggering and exacerbating further deterioration. The newest concept in earthquake engineering is Performance Based Earthquake Engineering (PBEE). PBEE is divided into two branches, namely Performance Based Seismic Design (PBSD) and Performance Based Seismic Evaluation (PBSE). One of the evaluation methods in PBSE is nonlinear pushover analysis. Pushover analysis is a nonlinear static analysis in which the design earthquake effect on the building structure is represented by static loads applied at the center of mass of each floor; these loads are increased gradually until the loading causes the first yielding (plastic hinge) within the building structure, and then increased further so that the structure undergoes large post-elastic deformations until it reaches its inelastic limit condition, with yielding (plastic hinges) subsequently forming at other locations in the structure.
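
    A minimal sketch of the load-stepping idea behind a pushover (capacity) curve is given below, using an elastic-perfectly-plastic single-degree-of-freedom model with nominal hardening; the stiffness and yield values are illustrative assumptions, not data from the study.

```python
import numpy as np

# Minimal pushover sketch for an elastic-plastic SDOF oscillator: the lateral
# load (base shear) is increased step by step and the roof displacement is
# recorded, giving a capacity (pushover) curve.
k_elastic = 50_000.0   # kN/m, illustrative lateral stiffness
v_yield = 800.0        # kN, illustrative base shear at first plastic hinge

def pushover_curve(v_max, steps=40):
    shears = np.linspace(0.0, v_max, steps)
    displacements = []
    for v in shears:
        if v <= v_yield:
            d = v / k_elastic                                  # elastic branch
        else:
            # Post-yield branch with 5% hardening stands in for the
            # progressive formation of plastic hinges.
            d = v_yield / k_elastic + (v - v_yield) / (0.05 * k_elastic)
        displacements.append(d)
    return shears, np.array(displacements)

shear, disp = pushover_curve(1000.0)
print("yield displacement ~", round(v_yield / k_elastic, 4), "m;",
      "displacement at 1000 kN ~", round(disp[-1], 4), "m")
```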

  18. Preliminary Analysis of the General Performance and Mechanical Behavior of Irradiated FeCrAl Base Alloys and Weldments

    Energy Technology Data Exchange (ETDEWEB)

    Gussev, Maxim N. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Field, Kevin G. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Briggs, Samuel A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Yamamoto, Yukinori [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-09-30

    The iron-based, iron-chromium-aluminum (FeCrAl) alloys are promising, robust materials for deployment in current and future nuclear power plants. This class of alloys demonstrates excellent performance in a range of environments and conditions, including high-temperature steam (>1000°C). Furthermore, these alloys have the potential to have prolonged survival under loss-of-coolant accident (LOCA) conditions compared to the more traditional cladding materials that are either Zr-based alloys or austenitic steels. However, one of the issues associated with FeCrAl alloys is cracking during welding. The present project investigates the possibility of mitigating welding-induced cracking via alloying and precise structure control of the weldments; in the framework of the project, several advanced alloys were developed and are being investigated prior to and after neutron irradiation to provide insight into the radiation tolerance and mechanical performance of the weldments. The present report provides preliminary results on the post-irradiation characterization and mechanical tests performed during United States Fiscal Year (FY) 2016. Chapter 1 provides a general introduction, and Chapter 2 describes the alloy compositions, welding procedure, specimen geometry and manufacturing parameters. Also, a brief discussion of the irradiation at the High Flux Isotope Reactor (HFIR) is provided. Chapter 3 is devoted to the analysis of mechanical tests performed at the hot cell facility; tensile curves and mechanical properties are discussed in detail focusing on the irradiation temperature. Limited fractography results are also presented and analyzed. The discussion highlights the limitations of the testing within a hot cell. Chapter 4 underlines the advantages of in-situ testing and discusses the preliminary results obtained with newly developed miniature specimens. Specimens were moved to the Low Activation Materials Development and Analysis (LAMDA) laboratory and prepared for

  19. Factors affecting construction performance: exploratory factor analysis

    Science.gov (United States)

    Soewin, E.; Chinda, T.

    2018-04-01

    The present work attempts to develop a multidimensional performance evaluation framework for a construction company by considering all relevant measures of performance. Based on previous studies, this study hypothesizes nine key factors, with a total of 57 associated items. The hypothesized factors, with their associated items, are then used to develop a questionnaire survey to gather data. Exploratory factor analysis (EFA) was applied to the collected data, which gave rise to 10 factors with 57 items affecting construction performance. The findings further reveal the items constituting ten key performance factors (KPIs), namely: 1) Time, 2) Cost, 3) Quality, 4) Safety & Health, 5) Internal Stakeholder, 6) External Stakeholder, 7) Client Satisfaction, 8) Financial Performance, 9) Environment, and 10) Information, Technology & Innovation. The analysis helps to develop a multi-dimensional performance evaluation framework for effective measurement of construction performance. The 10 key performance factors can be broadly categorized into economic, social, environmental, and technology aspects. It is important to understand a multi-dimensional performance evaluation framework that includes all key factors affecting the construction performance of a company, so that the management level can effectively plan and implement a performance development plan that matches the mission and vision of the company.
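
    A hedged sketch of the EFA step follows, using scikit-learn's FactorAnalysis on synthetic questionnaire responses; the item data, factor count and varimax rotation shown here are assumptions standing in for the study's actual survey data and software.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

# Synthetic stand-in for the questionnaire data: responses to 57 items
# (the real study's items and sample are not reproduced here).
rng = np.random.default_rng(0)
latent = rng.standard_normal((300, 10))            # 10 underlying factors
loadings = rng.uniform(-1, 1, size=(10, 57))
responses = latent @ loadings + 0.5 * rng.standard_normal((300, 57))

# Exploratory factor analysis with varimax rotation (the rotation argument is
# available in recent scikit-learn releases; drop it for older versions).
efa = FactorAnalysis(n_components=10, rotation="varimax", random_state=0)
efa.fit(responses)

# Items loading strongly (|loading| > 0.4) on the first extracted factor.
first_factor = efa.components_[0]
print("items on factor 1:", np.where(np.abs(first_factor) > 0.4)[0])
```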

  20. Performance-Based Funding Brief

    Science.gov (United States)

    Washington Higher Education Coordinating Board, 2011

    2011-01-01

    A number of states have made progress in implementing performance-based funding (PFB) and accountability. This policy brief summarizes main features of performance-based funding systems in three states: Tennessee, Ohio, and Indiana. The brief also identifies key issues that states considering performance-based funding must address, as well as…

  1. Performance analysis of indigenous spectroscopy system based on Lanthanum Bromide (LaBr3(Ce)) scintillation detector

    International Nuclear Information System (INIS)

    Kulkarni, C.P.; Punnal, Mahesh; Vinod, M.; Padmini, S.; Bhatnagar, P.V.; Behere, Anita; Kulkarni, D.B.; Paranjape, D.B.

    2018-01-01

    This paper presents a detailed performance analysis of the LaBr3(Ce)-based compact spectroscopy system indigenously developed in the Electronics Division, BARC. The system incorporates state-of-the-art low-power electronic components along with advanced spectroscopy software. Performance parameters and spectral response are experimentally determined and the results are presented in comparison with a standard HPGe system under similar test conditions. These experiments were conducted in the Radiation Standards Section, RSSD, BARC, using calibration sources to acquire gamma spectra for various combinations of source-to-detector (S-D) distances and acquisition times. The acquired data are used for deriving the energy calibration and computing the FWHM, dead time, count rate (cps) and efficiency to evaluate system performance, particularly with the short acquisition times necessitated by field applications. The self-activity of the detector is also determined experimentally and presented, along with comments on its effect on low count-rate applications

  2. Performance analysis and dynamic modeling of a single-spool turbojet engine

    Science.gov (United States)

    Andrei, Irina-Carmen; Toader, Adrian; Stroe, Gabriela; Frunzulica, Florin

    2017-01-01

    The purposes of modeling and simulation of a turbojet engine are steady-state analysis and transient analysis. The steady-state analysis, which consists in the investigation of the operating (equilibrium) regimes and is based on appropriate modeling of the operation of a turbojet engine at design and off-design regimes, yields the performance analysis, summarized by the engine's operational maps (i.e. the altitude map, velocity map and speed map) and the engine's universal map. The mathematical model that allows the calculation of the design and off-design performances of a single-spool turbojet is detailed. An in-house code was developed, and its calibration was done for the J85 turbojet engine as the test case. The dynamic model of the turbojet engine is obtained from the energy balance equations for the compressor, combustor and turbine, as the engine's main parts. The transient analysis, which is based on appropriate modeling of the engine and its main parts, expresses the dynamic behavior of the turbojet engine and, further, provides details regarding the engine's control. The aim of the dynamic analysis is to determine a control program for the turbojet, based on the results provided by the performance analysis. In the case of a single-spool turbojet engine with fixed nozzle geometry, the thrust is controlled by one parameter, which is the fuel flow rate. The design and management of the aircraft engine controls are based on the results of the transient analysis. The construction of the design model is complex, since it is based on both steady-state and transient analysis, further allowing flight path cycle analysis and optimizations. This paper presents numerical simulations for a single-spool turbojet engine (J85 as the test case), with appropriate modeling for steady-state and dynamic analysis.
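
    As a hedged illustration of a steady-state operating-point calculation, the sketch below evaluates an ideal single-spool turbojet cycle (ram, compressor, burner, turbine work balance, nozzle expansion); the J85-like input values are assumptions and the components are taken as ideal, unlike the calibrated in-house code described above.

```python
import math

# Minimal ideal-cycle sketch of a single-spool turbojet operating point
# (station numbering and the J85-like inputs are illustrative).
gamma, cp = 1.4, 1005.0                   # air properties, cp in J/(kg K)
T0, p0, M0 = 288.15, 101_325.0, 0.0       # sea-level static inlet conditions
pi_c, T_t4, fuel_lhv = 7.0, 1200.0, 43e6  # pressure ratio, TIT (K), LHV (J/kg)

# Ram conditions, compressor and burner (ideal components, no pressure loss).
T_t2 = T0 * (1 + (gamma - 1) / 2 * M0 ** 2)
T_t3 = T_t2 * pi_c ** ((gamma - 1) / gamma)
f = cp * (T_t4 - T_t3) / fuel_lhv                  # fuel-air ratio

# Turbine work balances compressor work; the rest expands in the nozzle.
T_t5 = T_t4 - (T_t3 - T_t2)
p_t5 = p0 * pi_c * (T_t5 / T_t4) ** (gamma / (gamma - 1))
v9 = math.sqrt(2 * cp * T_t5 * (1 - (p0 / p_t5) ** ((gamma - 1) / gamma)))

specific_thrust = (1 + f) * v9                     # N per (kg/s), static case
print(f"fuel-air ratio {f:.4f}, specific thrust {specific_thrust:.0f} N*s/kg")
```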

  3. PERFORMANCE ANALYSIS OF AN ENTITY FROM CONSTRUCTION SECTOR USING DASHBOARD

    Directory of Open Access Journals (Sweden)

    SORIN BRICIU

    2015-12-01

    Full Text Available This research paper deals with the analysis of the performance of economic entities in the Romanian construction sector. The data needed to prepare and analyze the dashboard of an economic entity are provided by managerial accounting through the Target Costing method. The way of implementing and observing the stages completed in managerial accounting through the Target Costing method is also presented in the paper, based on the existing literature. For the data analysis, a questionnaire based on three questions was used; its results were analyzed and formed the basis of the entire scientific approach. The paper ends with the authors' conclusions about the performance analysis of economic entities in a construction project using a dashboard, showing its benefits for long-term decisions.

  4. Performance-based contracts for road projects comparative analysis of different types

    CERN Document Server

    Gajurel, Ashish

    2014-01-01

    This book focuses on aspects of contracting, mainly related to road construction and management contracts. The book presents an analytical study of Performance-Based Road Management and Maintenance (PMMR), Funktionsbauvertrag (FBV) (Function-Based Construction Contract) and Public Private Partnerships (PPP). A separate chapter is also included on the comparative study of these contract types. The book provides useful material for university libraries, construction companies and government departments of construction.

  5. Performance criteria for emergency medicine residents: a job analysis.

    Science.gov (United States)

    Blouin, Danielle; Dagnone, Jeffrey Damon

    2008-11-01

    A major role of admission interviews is to assess a candidate's suitability for a residency program. Structured interviews have greater reliability and validity than do unstructured ones. The development of content for a structured interview is typically based on the dimensions of performance that are perceived as important to succeed in a particular line of work. A formal job analysis is normally conducted to determine these dimensions. The dimensions essential to succeed as an emergency medicine (EM) resident have not yet been studied. We aimed to analyze the work of EM residents to determine these essential dimensions. The "critical incident technique" was used to generate scenarios of poor and excellent resident performance. Two reviewers independently read each scenario and labelled the performance dimensions that were reflected in each. All labels assigned to a particular scenario were pooled and reviewed again until a consensus was reached. Five faculty members (25% of our total faculty) comprised the subject experts. Fifty-one incidents were generated and 50 different labels were applied. Eleven dimensions of performance applied to at least 5 incidents. "Professionalism" was the most valued performance dimension, represented in 56% of the incidents, followed by "self-confidence" (22%), "experience" (20%) and "knowledge" (20%). "Professionalism," "self-confidence," "experience" and "knowledge" were identified as the performance dimensions essential to succeed as an EM resident based on our formal job analysis using the critical incident technique. Performing a formal job analysis may assist training program directors with developing admission interviews.

  6. Cost and performance analysis of physical protection systems - a case study

    International Nuclear Information System (INIS)

    Hicks, M.J.; Snell, M.S.; Sandoval, J.S.; Potter, C.S.

    1998-01-01

    Design and analysis of physical protection systems requires (1) identification of mission critical assets; (2) identification of potential threats that might undermine mission capability; (3) identification of the consequences of loss of mission-critical assets (e.g., time and cost to recover required capability and impact on operational readiness); and (4) analysis of the effectiveness of physical protection elements. CPA -- Cost and Performance Analysis -- addresses the fourth of these four issues. CPA is a methodology that joins Activity Based Cost estimation with performance-based analysis of physical protection systems. CPA offers system managers an approach that supports both tactical decision making and strategic planning. Current exploratory applications of the CPA methodology address analysis of alternative conceptual designs. Hypothetical data is used to illustrate this process

  7. Offline analysis of context contribution to ERP-based typing BCI performance

    Science.gov (United States)

    Orhan, Umut; Erdogmus, Deniz; Roark, Brian; Oken, Barry; Fried-Oken, Melanie

    2013-12-01

    Objective. We aim to increase the symbol rate of electroencephalography (EEG) based brain-computer interface (BCI) typing systems by utilizing context information. Approach. Event related potentials (ERP) corresponding to a stimulus in EEG can be used to detect the intended target of a person for BCI. This paradigm is widely utilized to build letter-by-letter BCI typing systems. Nevertheless, currently available BCI typing systems still require improvement due to low typing speeds. This is mainly due to the reliance on multiple repetitions before making a decision to achieve higher typing accuracy. Another possible approach to increase the speed of typing, while not significantly reducing the accuracy of typing, is to use additional context information. In this paper, we study the effect of using a language model (LM) as additional evidence for intent detection. Bayesian fusion of an n-gram symbol model with EEG features is proposed, and a regularized discriminant analysis ERP classifier is used to obtain EEG-based features. The target detection accuracies are rigorously evaluated for varying LM orders, as well as the number of ERP-inducing repetitions. Main results. The results demonstrate that the LMs contribute significantly to letter classification accuracy. For instance, we find that a single-trial ERP detection supported by a 4-gram LM may achieve the same performance as using 3-trial ERP classification for the non-initial letters of words. Significance. Overall, the fusion of evidence from EEG and LMs yields a significant opportunity to increase the symbol rate of a BCI typing system.
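
    A minimal sketch of the Bayesian fusion idea follows: the posterior over candidate letters is the product of the LM prior and the EEG (ERP-score) likelihood, renormalized; the letter set and probability values are illustrative placeholders, not quantities from the study.

```python
import numpy as np

# Hedged sketch of fusing an n-gram language-model prior with EEG evidence:
# posterior(letter) is proportional to P(EEG features | letter) * P(letter | typed context).
letters = np.array(list("abcd_"))

lm_prior = np.array([0.50, 0.20, 0.15, 0.10, 0.05])        # P(letter | context), illustrative
eeg_likelihood = np.array([0.10, 0.60, 0.10, 0.10, 0.10])  # P(ERP score | letter), illustrative

posterior = lm_prior * eeg_likelihood
posterior /= posterior.sum()

best = letters[np.argmax(posterior)]
print("posterior:", dict(zip(letters, posterior.round(3))), "-> select", best)
```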

  8. Performance analysis of switching based hybrid FSO/RF transmission

    KAUST Repository

    Usman, Muneer; Yang, Hongchuan; Alouini, Mohamed-Slim

    2014-01-01

    Hybrid free space optical (FSO)/radio frequency (RF) systems have emerged as a promising solution for high data rate wireless backhaul. We present and analyze a switching based transmission scheme for a hybrid FSO/RF system. Specifically, either the FSO or the RF link will be active at a certain time instance, with the FSO link enjoying a higher priority. Analytical expressions have been obtained for the outage probability, average bit error rate and ergodic capacity of the resulting system. Numerical examples are presented to compare the performance of the hybrid scheme with the FSO-only scenario.
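
    A hedged Monte Carlo sketch of the switching rule is given below: the FSO link is used whenever its SNR clears a threshold, the RF link otherwise, and an outage occurs only when both fail; the fading models (log-normal FSO, exponential-SNR RF) and thresholds are assumptions, not the closed-form analysis of the paper.

```python
import numpy as np

# Monte Carlo sketch of the switching idea: FSO is active whenever its SNR
# exceeds its threshold; otherwise the system switches to RF, and an outage
# is declared only if the RF SNR is also below its own threshold.
rng = np.random.default_rng(42)
n = 1_000_000

fso_snr = 10.0 * rng.lognormal(mean=0.0, sigma=0.8, size=n)   # illustrative FSO turbulence
rf_snr = rng.exponential(scale=5.0, size=n)                   # illustrative Rayleigh-faded RF SNR
fso_th, rf_th = 6.0, 3.0                                      # illustrative SNR thresholds

use_fso = fso_snr >= fso_th
outage = (~use_fso) & (rf_snr < rf_th)

print("FSO active fraction:", use_fso.mean().round(3),
      "hybrid outage probability:", outage.mean().round(4))
```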

  9. Performance analysis of switching based hybrid FSO/RF transmission

    KAUST Repository

    Usman, Muneer

    2014-09-01

    Hybrid free space optical (FSO)/radio frequency (RF) systems have emerged as a promising solution for high data rate wireless backhaul. We present and analyze a switching based transmission scheme for a hybrid FSO/RF system. Specifically, either the FSO or the RF link will be active at a certain time instance, with the FSO link enjoying a higher priority. Analytical expressions have been obtained for the outage probability, average bit error rate and ergodic capacity of the resulting system. Numerical examples are presented to compare the performance of the hybrid scheme with the FSO-only scenario.

  10. A performance analysis of advanced I/O architectures for PC-based network file servers

    Science.gov (United States)

    Huynh, K. D.; Khoshgoftaar, T. M.

    1994-12-01

    In the personal computing and workstation environments, more and more I/O adapters are becoming complete functional subsystems that are intelligent enough to handle I/O operations on their own without much intervention from the host processor. The IBM Subsystem Control Block (SCB) architecture has been defined to enhance the potential of these intelligent adapters by defining services and conventions that deliver command information and data to and from the adapters. In recent years, a new storage architecture, the Redundant Array of Independent Disks (RAID), has been quickly gaining acceptance in the world of computing. In this paper, we would like to discuss critical system design issues that are important to the performance of a network file server. We then present a performance analysis of the SCB architecture and disk array technology in typical network file server environments based on personal computers (PCs). One of the key issues investigated in this paper is whether a disk array can outperform a group of disks (of same type, same data capacity, and same cost) operating independently, not in parallel as in a disk array.

  11. A string matching based algorithm for performance evaluation of ...

    Indian Academy of Sciences (India)

    Zanibbi et al (2011) have proposed performance metrics based on bipartite graphs at stroke level. ... bipartite graphs on which metrics based on Hamming distances are defined.

  12. PERFORMANCE ANALYSIS OF PILOT BASED CHANNEL ESTIMATION TECHNIQUES IN MB OFDM SYSTEMS

    Directory of Open Access Journals (Sweden)

    M. Madheswaran

    2011-12-01

    Full Text Available Ultra wideband (UWB) communication is mainly used for short range communication in wireless personal area networks. Orthogonal Frequency Division Multiplexing (OFDM) is being used as a key physical layer technology for Fourth Generation (4G) wireless communication. OFDM based communication gives high spectral efficiency and mitigates Inter-symbol Interference (ISI) in a wireless medium. In this paper the IEEE 802.15.3a based Multiband OFDM (MB OFDM) system is considered. Pilot based channel estimation techniques are considered to analyze the performance of MB OFDM systems over Linear Time Invariant (LTI) channel models. In this paper, pilot based Least Square (LS) and Linear Minimum Mean Square Error (LMMSE) channel estimation techniques have been considered for the UWB OFDM system. In the proposed method, the estimated Channel Impulse Responses (CIRs) are filtered in the time domain to take the channel delay spread into consideration. Also the performance of the proposed system has been analyzed for different modulation techniques and various pilot density patterns.
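
    A minimal sketch of pilot-based LS channel estimation follows: the channel is estimated as Y/X at the pilot tones and interpolated to the data tones; the channel taps, pilot spacing and SNR are illustrative, not the IEEE 802.15.3a channel models used in the study.

```python
import numpy as np

# Hedged sketch of pilot-based least-squares (LS) channel estimation for one
# OFDM symbol: H_LS = Y_pilot / X_pilot at the pilot tones, then linear
# interpolation to the remaining subcarriers.
rng = np.random.default_rng(7)
n_sub, pilot_step, snr_db = 128, 8, 20

h_taps = (rng.standard_normal(6) + 1j * rng.standard_normal(6)) / np.sqrt(12)
H_true = np.fft.fft(h_taps, n_sub)                     # true channel frequency response

X = np.ones(n_sub, dtype=complex)                      # known transmitted symbols
noise_std = 10 ** (-snr_db / 20)
Y = H_true * X + noise_std * (rng.standard_normal(n_sub)
                              + 1j * rng.standard_normal(n_sub)) / np.sqrt(2)

pilots = np.arange(0, n_sub, pilot_step)
H_ls_pilots = Y[pilots] / X[pilots]                    # LS estimate at pilot tones
H_est = np.interp(np.arange(n_sub), pilots, H_ls_pilots.real) \
        + 1j * np.interp(np.arange(n_sub), pilots, H_ls_pilots.imag)

mse = np.mean(np.abs(H_est - H_true) ** 2)
print("LS channel estimation MSE:", round(float(mse), 4))
```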

  13. Comparative performance analysis of the artificial-intelligence-based thermal control algorithms for the double-skin building

    International Nuclear Information System (INIS)

    Moon, Jin Woo

    2015-01-01

    This study aimed at developing artificial-intelligence-(AI)-theory-based optimal control algorithms for improving the indoor temperature conditions and heating energy efficiency of the double-skin buildings. For this, one conventional rule-based and four AI-based algorithms were developed, including artificial neural network (ANN), fuzzy logic (FL), and adaptive neuro fuzzy inference systems (ANFIS), for operating the surface openings of the double skin and the heating system. A numerical computer simulation method incorporating the matrix laboratory (MATLAB) and the transient systems simulation (TRNSYS) software was used for the comparative performance tests. The analysis results revealed that advanced thermal-environment comfort and stability can be provided by the AI-based algorithms. In particular, the FL and ANFIS algorithms were superior to the ANN algorithm in terms of providing better thermal conditions. The ANN-based algorithm, however, proved its potential to be the most energy-efficient and stable strategy among the four AI-based algorithms. It can be concluded that the optimal algorithm can be differently determined according to the major focus of the strategy. If comfortable thermal condition is the principal interest, then the FL or ANFIS algorithm could be the proper solution, and if energy saving for space heating and system operation stability is the main concerns, then the ANN-based algorithm may be applicable. - Highlights: • Integrated control algorithms were developed for the heating system and surface openings. • AI theories were applied to the control algorithms. • ANN, FL, and ANFIS were the applied AI theories. • Comparative performance tests were conducted using computer simulation. • AI algorithms presented superior temperature environment.

  14. Activity Performance Management Framework Based on Outcome Based Budgeting Malaysian Nuclear Agency

    International Nuclear Information System (INIS)

    Aisya Raihan Abdul Kadir; Mohd Azmi Sidid Omar; Noriah Jamal

    2015-01-01

    The implementation of Outcome Based Budgeting (OBB) in the planning and implementation of national development and public spending emphasizes the impact and effectiveness of programs and activities, in line with the policies and objectives of the four pillars of the National Transformation Programme, which are 1 Malaysia: People First, Performance Now, the Government Transformation Programme (GTP), the Economic Transformation Programme (ETP) and the Malaysia Five Year Development Plan. Effective implementation of OBB at the ministry level is carried out by the Ministry OBB Implementation Committee (OIC) and the Programme Performance Management Committee (PPMC); at the agency level it is implemented by the Activity Performance Management Committee (APMC). OBB involves a strategic implementation cycle consisting of four main processes, namely outcome-based planning, budgeting, monitoring and evaluation, and performance reporting. OBB will be fully implemented in 2016 to replace the Modified Budgeting System (MBS). The Activity Performance Management Framework (APMF), which is based on outcome-based planning, has been developed using methodologies such as the ProLL Model (Logic and Linkages Programme), Problem Tree Analysis (PTA), the top-down approach, the SMART principle, the framework approach and rigour testing. By applying these methodologies, an APMF has been produced which consists of 3 outputs, 6 output KPIs, 3 outcomes and 8 outcome KPIs, in line with the direction and outcomes at the programme and ministry levels. The APMF is planned at the beginning of each year, and performance is reported on a quarterly basis through the My Results application. (author)

  15. Performance-based analysis of current South African semi-trailer designs

    CSIR Research Space (South Africa)

    Thorogood, R

    2009-07-01

    Full Text Available Keywords: performance based standards, dynamic stability, tractor semi-trailers, directional response, static rollover threshold. Introduction: South African heavy vehicles are currently designed according to prescriptive standards designed and enforced... productivity. These include Central Tyre Inflation (CTI), on-board weighing, new materials such as Domex and vehicle satellite tracking, all leading towards increased payloads and reduced costs. There have also been improvements in technology...

  16. Human reliability analysis of performing tasks in plants based on fuzzy integral

    International Nuclear Information System (INIS)

    Washio, Takashi; Kitamura, Yutaka; Takahashi, Hideaki

    1991-01-01

    The effective improvement of the human working conditions in nuclear power plants might be a solution for the enhancement of the operation safety. The human reliability analysis (HRA) gives a methodological basis of the improvement based on the evaluation of human reliability under various working conditions. This study investigates some difficulties of the human reliability analysis using conventional linear models and recent fuzzy integral models, and provides some solutions to the difficulties. The following practical features of the provided methods are confirmed in comparison with the conventional methods: (1) Applicability to various types of tasks (2) Capability of evaluating complicated dependencies among working condition factors (3) A priori human reliability evaluation based on a systematic task analysis of human action processes (4) A conversion scheme to probability from indices representing human reliability. (author)
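
    As a hedged illustration of the fuzzy-integral aggregation underlying such HRA models, the sketch below evaluates a discrete Choquet integral of working-condition scores with respect to a non-additive fuzzy measure; the factor names, scores and measure values are hypothetical, not taken from the study.

```python
# Hedged sketch of a discrete Choquet (fuzzy) integral, the kind of
# aggregation used in fuzzy-integral HRA models to combine working-condition
# factors whose importances interact (are not simply additive).
scores = {"training": 0.8, "workload": 0.4, "interface": 0.6}

def fuzzy_measure(subset):
    """Illustrative non-additive measure g(A) on subsets of the factors."""
    weights = {
        frozenset(): 0.0,
        frozenset({"training"}): 0.4,
        frozenset({"workload"}): 0.3,
        frozenset({"interface"}): 0.3,
        frozenset({"training", "workload"}): 0.8,      # > 0.4 + 0.3: synergy
        frozenset({"training", "interface"}): 0.6,
        frozenset({"workload", "interface"}): 0.5,
        frozenset({"training", "workload", "interface"}): 1.0,
    }
    return weights[frozenset(subset)]

def choquet_integral(x, g):
    """Sum of (x_(i) - x_(i-1)) * g({factors whose score is >= x_(i)})."""
    items = sorted(x.items(), key=lambda kv: kv[1])     # ascending scores
    total, prev = 0.0, 0.0
    for i, (_, value) in enumerate(items):
        remaining = {name for name, _ in items[i:]}
        total += (value - prev) * g(remaining)
        prev = value
    return total

print("aggregated reliability index:", round(choquet_integral(scores, fuzzy_measure), 3))
```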

  17. Enhancing importance-performance analysis

    DEFF Research Database (Denmark)

    Eskildsen, Jacob Kjær; Kristensen, Kai

    2006-01-01

    Purpose: The interpretation of the importance/performance map is based on an assumption of independence between importance and performance but many studies question the validity of this assumption. The aim of this research is to develop a new typology for job satisfaction attributes as well...... as a new importance/performance map that can be an aid for organizations when they prioritize their improvement actions based on a job satisfaction study. Design/methodology/approach: A typology for possible relationships between importance and performance in job satisfaction studies is developed based...... on theoretical considerations. This typology is then applied and validated on approximately 10,000 responses from the European Employee Index 2002. Ultimately a new importance/performance map for priority setting in job satisfaction studies is developed based on the new typology for possible relationships...

  18. An SQL-based approach to physics analysis

    International Nuclear Information System (INIS)

    Limper, Dr Maaike

    2014-01-01

    As part of the CERN openlab collaboration a study was made into the possibility of performing analysis of the data collected by the experiments at the Large Hadron Collider (LHC) through SQL-queries on data stored in a relational database. Currently LHC physics analysis is done using data stored in centrally produced 'ROOT-ntuple' files that are distributed through the LHC computing grid. The SQL-based approach to LHC physics analysis presented in this paper allows calculations in the analysis to be done at the database and can make use of the database's in-built parallelism features. Using this approach it was possible to reproduce results for several physics analysis benchmarks. The study shows the capability of the database to handle complex analysis tasks but also illustrates the limits of using row-based storage for storing physics analysis data, as performance was limited by the I/O read speed of the system.
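
    A minimal sketch of the SQL-based analysis idea follows, using SQLite in memory: event-level quantities live in a relational table and the selection plus aggregation is pushed to the database as a single query; the schema and cuts are illustrative, not the benchmark analyses of the study.

```python
import sqlite3

# Event-level muon candidates stored in a relational table; the analysis
# selection is expressed as one SQL query instead of a loop over ntuple rows.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE muons (event_id INTEGER, pt REAL, eta REAL, charge INTEGER)")
con.executemany(
    "INSERT INTO muons VALUES (?, ?, ?, ?)",
    [(1, 32.5, 0.4, 1), (1, 28.1, -1.2, -1), (2, 15.0, 2.3, 1), (3, 45.2, 0.1, -1)],
)

# Count events with at least two opposite-charge muons passing illustrative cuts.
query = """
    SELECT COUNT(*) FROM (
        SELECT event_id
        FROM muons
        WHERE pt > 20 AND ABS(eta) < 2.4
        GROUP BY event_id
        HAVING COUNT(*) >= 2 AND MIN(charge) <> MAX(charge)
    ) AS sel
"""
print("selected events:", con.execute(query).fetchone()[0])
```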

  19. Multi performance option in direct displacement based design

    Directory of Open Access Journals (Sweden)

    Muljati Ima

    2017-01-01

    Full Text Available Compared to the traditional method, direct displacement based design (DDBD) offers a more rational design choice due to its compatibility with performance based design, which is controlled by the targeted displacement in design. The objectives of this study are: 1) to explore the performance of DDBD for design Level-1, -2 and -3; 2) to determine the most appropriate design level based on material efficiency and damage risk; and 3) to verify the chosen design in order to check its performance under small, moderate and severe earthquakes. As case studies, it uses regular concrete frame structures consisting of four- and eight-story buildings with a typical plan, located in low- and high-risk seismicity areas. The study shows that design Level-2 (repairable damage) is the most appropriate choice. Nonlinear time history analysis is run for each case study in order to verify its performance based on the parameters story drift, damage indices, and plastic mechanism. It can be concluded that DDBD performed very well in predicting the seismic demand of the observed structures. Design Level-2 can be chosen as the most appropriate design level. Structures exhibit a safe plastic mechanism under all levels of seismicity, although some plastic hinges formed at unexpected locations.

  20. Performance analysis and prediction in triathlon.

    Science.gov (United States)

    Ofoghi, Bahadorreza; Zeleznikow, John; Macmahon, Clare; Rehula, Jan; Dwyer, Dan B

    2016-01-01

    Performance in triathlon is dependent upon factors that include somatotype, physiological capacity, technical proficiency and race strategy. Given the multidisciplinary nature of triathlon and the interaction between each of the three race components, the identification of target split times that can be used to inform the design of training plans and race pacing strategies is a complex task. The present study uses machine learning techniques to analyse a large database of performances in Olympic distance triathlons (2008-2012). The analysis reveals patterns of performance in five components of triathlon (three race "legs" and two transitions) and the complex relationships between performance in each component and overall performance in a race. The results provide three perspectives on the relationship between performance in each component of triathlon and the final placing in a race. These perspectives allow the identification of target split times that are required to achieve a certain final place in a race and the opportunity to make evidence-based decisions about race tactics in order to optimise performance.

  1. An Analysis of Performance-Based Funding Policies and Recommendations for the Florida College System

    Science.gov (United States)

    Balog, Scott E.

    2016-01-01

    Nearly 30 states have adopted or are transitioning to performance-based funding programs for community colleges that allocate funding based on institutional performance according to defined metrics. While embraced by state lawmakers and promoted by outside advocacy groups as a method to improve student outcomes, enhance accountability and ensure…

  2. Performance Analysis of the Decentralized Eigendecomposition and ESPRIT Algorithm

    Science.gov (United States)

    Suleiman, Wassim; Pesavento, Marius; Zoubir, Abdelhak M.

    2016-05-01

    In this paper, we consider performance analysis of the decentralized power method for the eigendecomposition of the sample covariance matrix based on the averaging consensus protocol. An analytical expression of the second order statistics of the eigenvectors obtained from the decentralized power method which is required for computing the mean square error (MSE) of subspace-based estimators is presented. We show that the decentralized power method is not an asymptotically consistent estimator of the eigenvectors of the true measurement covariance matrix unless the averaging consensus protocol is carried out over an infinitely large number of iterations. Moreover, we introduce the decentralized ESPRIT algorithm which yields fully decentralized direction-of-arrival (DOA) estimates. Based on the performance analysis of the decentralized power method, we derive an analytical expression of the MSE of DOA estimators using the decentralized ESPRIT algorithm. The validity of our asymptotic results is demonstrated by simulations.
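
    A hedged sketch of the power method at the core of the decentralized scheme is given below; the effect of carrying out the averaging consensus for only a finite number of iterations is emulated here by perturbing each matrix-vector product with noise, which illustrates the bias discussed above, and all parameters are illustrative.

```python
import numpy as np

# Power iteration on a sample covariance matrix.  In the decentralized
# setting each product R @ v is obtained by averaging consensus over the
# nodes; the noise term below stands in for the residual error left by a
# finite number of consensus iterations.
rng = np.random.default_rng(3)
k = 6                                                    # number of sensors / nodes
snapshots = rng.standard_normal((k, 500)) + \
            0.8 * np.outer(np.ones(k), rng.standard_normal(500))   # correlated component
R = snapshots @ snapshots.T / snapshots.shape[1]         # sample covariance matrix

def power_method(R, consensus_noise=0.0, iters=200):
    v = rng.standard_normal(R.shape[0])
    for _ in range(iters):
        v = R @ v + consensus_noise * rng.standard_normal(R.shape[0])
        v = v / np.linalg.norm(v)
    return v

v_exact = power_method(R)                                # ideal (exact averaging)
v_decent = power_method(R, consensus_noise=0.05)         # imperfect consensus
print("alignment |<v_exact, v_decent>| =", round(abs(v_exact @ v_decent), 4))
```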

  3. Plant operator performance evaluation based on cognitive process analysis experiment

    International Nuclear Information System (INIS)

    Ujita, H.; Fukuda, M.

    1990-01-01

    This paper reports on an experiment performed to clarify plant operators' cognitive processes, with the aim of improving the man-machine interface that supports their diagnoses and decisions. The cognitive processes under abnormal conditions were evaluated by protocol analyses, interviews, etc., in an experiment using a plant training simulator. A cognitive process model is represented by a stochastic network, based on Rasmussen's decision making model. Each node of the network corresponds to an element of the cognitive process, such as observation, interpretation, execution, etc. Some observations were obtained as follows by comparing Monte Carlo simulation results with the experimental results: A process to reconfirm the plant parameters after execution of a task, and feedback paths from this process to the observation and to the task definition of the next task, were observed. The feedback probability average and standard deviation should be determined for each incident type to explain correctly the individual differences in the cognitive processes. The tendency for the operator's cognitive level to change from skill-based to knowledge-based via rule-based behavior was observed during the feedback process

  4. Performance analysis of communication links based on VCSEL and silicon photonics technology for high-capacity data-intensive scenario.

    Science.gov (United States)

    Boletti, A; Boffi, P; Martelli, P; Ferrario, M; Martinelli, M

    2015-01-26

    To face the increased demand for bandwidth, cost-effectiveness and simplicity in future Ethernet data communications, a comparison between two different solutions based on directly-modulated VCSEL sources and Silicon Photonics technologies is carried out. Also, by exploiting 4-PAM modulation, the transmission of 50-Gb/s and beyond capacity per channel is analyzed by means of BER performance. Applications for optical backplanes, very short reach links, client-optics networks, and intra- and inter-communications of massive data centers (up to 10 km) are taken into account. A comparative analysis based on power consumption is also proposed.

  5. Experimental analysis on the performance of lithium based batteries for road full electric and hybrid vehicles

    International Nuclear Information System (INIS)

    Capasso, Clemente; Veneri, Ottorino

    2014-01-01

    Highlights: • Performance analysis of lithium storage technologies, such as Li[NiCoMn]O2 and LiFePO4 batteries. • The actual capacity of the analyzed lithium technologies is close to their nominal capacity, even at high discharge currents. • The charging efficiency of Li[NiCoMn]O2 positively affects regenerative braking and fast recharging operations. • The analyzed battery packs follow the dynamic power requirements of the performed road driving cycles. • Experimental results demonstrate that the driving range is much higher when battery packs are based on lithium technology. - Abstract: This paper deals with an experimental evaluation of the real performance of lithium based energy storage systems for automotive applications. In particular, real working operations of different lithium based storage system technologies, such as Li[NiCoMn]O2 and LiFePO4 batteries, are compared in this work from the point of view of their application in supplying full electric and hybrid vehicles, taking as a reference the well-known behavior of lead acid batteries. For this purpose, the experimental tests carried out in the laboratory are first performed on single storage modules in stationary conditions. In this case the related results are obtained by means of a bidirectional cycle tester based on IGBT technology, and make it possible to evaluate, compare and contrast the charge/discharge characteristics and efficiency at constant values of current/voltage/power for each storage technology analyzed. Then, lithium battery packs are tested in supplying a 1.8 kW electric power train using a laboratory test bench, based on a 48 V DC bus and specifically configured to simulate the working operations of electric vehicles on the road. For this second experiment the test bench is equipped with an electric brake and an acquisition/control system, able to reproduce in the laboratory the real vehicle conditions and road characteristics on predefined driving cycles at different slopes. The obtained

  6. QUALITY OF NURSING CARE BASED ON ANALYSIS OF NURSING PERFORMANCE AND NURSE AND PATIENT SATISFACTION

    Directory of Open Access Journals (Sweden)

    Abdul Muhith

    2017-04-01

    Full Text Available Introduction: Nurses, who are in frequent contact with patients and spend most of their time serving patients around the clock, have an important role in caring for the patient. Patient satisfaction, as a quality indicator, is the key to the competitiveness of hospital services. The aim of this research was to develop a nursing service quality model based on nursing performance and nurse and patient satisfaction. Method: The research used a cross-sectional design in 14 wards of Gresik Hospital. The research factors were: organization characteristics (organization culture and leadership), work factors (feedback and variety of nurses' work), nurse characteristics (motivation, attitude, commitment and mental model), nursing practice, interpersonal communication, and nurse and patient satisfaction. The study data were analyzed by Partial Least Squares (PLS). Results: Nurse characteristics were not affected by organization culture and leadership style; nurse characteristics were affected by work factors; nurse characteristics affected nursing service quality (nursing practice, nursing professionalism, nurse and patient satisfaction); and nurse satisfaction did not affect nursing professionalism. Discussion: The developed nursing care model, which originally emphasized only the process of nursing care, should also consider the input factors of organizational characteristics, job characteristics and individual nurse characteristics, the process factors of nursing care standards and professional performance of nurses, and the outcome factors of nurse and patient satisfaction. In general, the developed model of nursing care quality thus refers to a comprehensive quality system.

  7. Performance analysis of tandem queues with small buffers

    NARCIS (Netherlands)

    Vuuren, van M.; Adan, I.J.B.F.; Papadopoulos, C.T.

    2005-01-01

    In this paper we present an approximation for the performance analysis of single-server tandem queues with small buffers and generally distributed service times. The approximation is based on decomposition of the tandem queue in subsystems, the parameters of which are determined by an iterative

  8. Performance of Water-Based Liquid Scintillator: An Independent Analysis

    Directory of Open Access Journals (Sweden)

    D. Beznosko

    2014-01-01

    Full Text Available The water-based liquid scintillator (WbLS) is a new material currently under development. It is based on the idea of dissolving the organic scintillator in water using special surfactants. This material strives to achieve the novel detection techniques by combining the Cerenkov rings and scintillation light, as well as the total cost reduction compared to pure liquid scintillator (LS). An independent analysis of the light yield measurements using three different proton beam energies (210 MeV, 475 MeV, and 2000 MeV) for water, two different WbLS formulations (0.4% and 0.99%), and pure LS, conducted at Brookhaven National Laboratory, USA, is presented. The results show that a goal of ~100 optical photons/MeV, indicated by the simulation to be an optimal light yield for observing both the Cerenkov ring and the scintillation light from proton decay in a large water detector, has been achieved.

  9. Frequency selective surfaces based high performance microstrip antenna

    CERN Document Server

    Narayan, Shiv; Jha, Rakesh Mohan

    2016-01-01

    This book focuses on performance enhancement of printed antennas using frequency selective surfaces (FSS) technology. The growing demand of stealth technology in strategic areas requires high-performance low-RCS (radar cross section) antennas. Such requirements may be accomplished by incorporating FSS into the antenna structure either in its ground plane or as the superstrate, due to the filter characteristics of FSS structure. In view of this, a novel approach based on FSS technology is presented in this book to enhance the performance of printed antennas including out-of-band structural RCS reduction. In this endeavor, the EM design of microstrip patch antennas (MPA) loaded with FSS-based (i) high impedance surface (HIS) ground plane, and (ii) the superstrates are discussed in detail. The EM analysis of proposed FSS-based antenna structures have been carried out using transmission line analogy, in combination with the reciprocity theorem. Further, various types of novel FSS structures are considered in desi...

  10. Caffeine ingestion enhances Wingate performance: a meta-analysis.

    Science.gov (United States)

    Grgic, Jozo

    2018-03-01

    The positive effects of caffeine ingestion on aerobic performance are well-established; however, recent findings are suggesting that caffeine ingestion might also enhance components of anaerobic performance. A commonly used test of anaerobic performance and power output is the 30-second Wingate test. Several studies explored the effects of caffeine ingestion on Wingate performance, with equivocal findings. To elucidate this topic, this paper aims to determine the effects of caffeine ingestion on Wingate performance using meta-analytic statistical techniques. Following a search through PubMed/MEDLINE, Scopus, and SportDiscus ® , 16 studies were found meeting the inclusion criteria (pooled number of participants = 246). Random-effects meta-analysis of standardized mean differences (SMD) for peak power output and mean power output was performed. Study quality was assessed using the modified version of the PEDro checklist. Results of the meta-analysis indicated a significant difference (p = .005) between the placebo and caffeine trials on mean power output with SMD values of small magnitude (0.18; 95% confidence interval: 0.05, 0.31; +3%). The meta-analysis performed for peak power output indicated a significant difference (p = .006) between the placebo and caffeine trials (SMD = 0.27; 95% confidence interval: 0.08, 0.47 [moderate magnitude]; +4%). The results from the PEDro checklist indicated that, in general, studies are of good and excellent methodological quality. This meta-analysis adds on to the current body of evidence showing that caffeine ingestion can also enhance components of anaerobic performance. The results presented herein may be helpful for developing more efficient evidence-based recommendations regarding caffeine supplementation.

  11. State of art in FE-based fuel performance codes

    International Nuclear Information System (INIS)

    Kim, Hyo Chan; Yang, Yong Sik; Kim, Dae Ho; Bang, Je Geon; Kim, Sun Ki; Koo, Yang Hyun

    2013-01-01

    Fuel performance codes approximate this complex behavior using an axisymmetric, axially-stacked, one-dimensional radial representation to save computation cost. However, the need for improved modeling of PCMI and, particularly, the importance of multidimensional capability for accurate fuel performance simulation have been identified as safety margins decrease. The finite element (FE) method, a reliable and proven solution in the mechanical field, has been introduced into fuel performance codes for multidimensional analysis. The present state of the art in FE-based fuel performance simulation predominantly involves 2-D axisymmetric models and 3-D volumetric models. The FRAPCON and FRAPTRAN codes own 1.5-D and 2-D FE models to simulate PCMI and cladding ballooning. In 2-D simulation, the FALCON code, developed by EPRI, is a 2-D (R-Z and R-θ) fully thermo-mechanically coupled steady-state and transient FE-based fuel behavior code. The French codes TOUTATIS and ALCYONE are 3-D and are typically used to investigate localized behavior. Since 2008, the Idaho National Laboratory (INL) has been developing a multidimensional (2-D and 3-D) nuclear fuel performance code called BISON. In this paper, the current state of FE-based fuel performance codes and their models is presented. Based on an investigation of these codes, the requirements for and direction of development of a new FE-based fuel performance code can be discussed, and the state of the art in these codes can be assessed. A new FE-based fuel performance code should include the typical pellet and cladding models that all codes own. In particular, specialized pellet and cladding models, such as gaseous swelling and high burnup structure (HBS) models, should be developed to improve the accuracy of the code as well as to consider AC conditions. To reduce computation cost, approximate gap and optimized contact models should also be developed

  12. A Cross-Platform Infrastructure for Scalable Runtime Application Performance Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Jack Dongarra; Shirley Moore; Bart Miller, Jeffrey Hollingsworth; Tracy Rafferty

    2005-03-15

    The purpose of this project was to build an extensible cross-platform infrastructure to facilitate the development of accurate and portable performance analysis tools for current and future high performance computing (HPC) architectures. Major accomplishments include tools and techniques for multidimensional performance analysis, as well as improved support for dynamic performance monitoring of multithreaded and multiprocess applications. Previous performance tool development has been limited by the burden of having to re-write a platform-dependent low-level substrate for each architecture/operating system pair in order to obtain the necessary performance data from the system. Manual interpretation of performance data is not scalable for large-scale long-running applications. The infrastructure developed by this project provides a foundation for building portable and scalable performance analysis tools, with the end goal being to provide application developers with the information they need to analyze, understand, and tune the performance of terascale applications on HPC architectures. The backend portion of the infrastructure provides runtime instrumentation capability and access to hardware performance counters, with thread-safety for shared memory environments and a communication substrate to support instrumentation of multiprocess and distributed programs. Front end interfaces provides tool developers with a well-defined, platform-independent set of calls for requesting performance data. End-user tools have been developed that demonstrate runtime data collection, on-line and off-line analysis of performance data, and multidimensional performance analysis. The infrastructure is based on two underlying performance instrumentation technologies. These technologies are the PAPI cross-platform library interface to hardware performance counters and the cross-platform Dyninst library interface for runtime modification of executable images. The Paradyn and KOJAK

  13. Measurement-based reliability/performability models

    Science.gov (United States)

    Hsueh, Mei-Chen

    1987-01-01

    Measurement-based models based on real error-data collected on a multiprocessor system are described. Model development from the raw error-data to the estimation of cumulative reward is also described. A workload/reliability model is developed based on low-level error and resource usage data collected on an IBM 3081 system during its normal operation in order to evaluate the resource usage/error/recovery process in a large mainframe system. Thus, both normal and erroneous behavior of the system are modeled. The results provide an understanding of the different types of errors and recovery processes. The measured data show that the holding times in key operational and error states are not simple exponentials and that a semi-Markov process is necessary to model the system behavior. A sensitivity analysis is performed to investigate the significance of using a semi-Markov process, as opposed to a Markov process, to model the measured system.

  14. Facility/equipment performance evaluation using microcomputer simulation analysis

    International Nuclear Information System (INIS)

    Chockie, A.D.; Hostick, C.J.

    1985-08-01

    A computer simulation analysis model was developed at the Pacific Northwest Laboratory to assist in assuring the adequacy of the Monitored Retrievable Storage facility design to meet the specified spent nuclear fuel throughput requirements. The microcomputer-based model was applied to the analysis of material flow, equipment capability and facility layout. The simulation analysis evaluated uncertainties concerning both facility throughput requirements and process duration times as part of the development of a comprehensive estimate of facility performance. The evaluations provided feedback into the design review task to identify areas where design modifications should be considered

  15. Intelligent Performance Analysis with a Natural Language Interface

    Science.gov (United States)

    Juuso, Esko K.

    2017-09-01

    Performance improvement is taken as the primary goal in the asset management. Advanced data analysis is needed to efficiently integrate condition monitoring data into the operation and maintenance. Intelligent stress and condition indices have been developed for control and condition monitoring by combining generalized norms with efficient nonlinear scaling. These nonlinear scaling methodologies can also be used to handle performance measures used for management since management oriented indicators can be presented in the same scale as intelligent condition and stress indices. Performance indicators are responses of the process, machine or system to the stress contributions analyzed from process and condition monitoring data. Scaled values are directly used in intelligent temporal analysis to calculate fluctuations and trends. All these methodologies can be used in prognostics and fatigue prediction. The meanings of the variables are beneficial in extracting expert knowledge and representing information in natural language. The idea of dividing the problems into the variable specific meanings and the directions of interactions provides various improvements for performance monitoring and decision making. The integrated temporal analysis and uncertainty processing facilitates the efficient use of domain expertise. Measurements can be monitored with generalized statistical process control (GSPC) based on the same scaling functions.

  16. Quality control and performance evaluation of k0-based neutron activation analysis at the Portuguese research reactor

    International Nuclear Information System (INIS)

    Dung, H.M.; Freitas, M.C.; Blaauw, M.; Almeida, S.M.; Dionisio, I.; Canha, N.H.

    2010-01-01

    The quality control (QC) and performance evaluation for k0-based neutron activation analysis (k0-NAA) at the Portuguese research reactor (RPI) have been developed with the intention of using the method to meet the demands of trace element analysis for applications in environmental, epidemiological and nutritional studies, amongst others. The QC and performance evaluation include the following aspects: (1) estimation of the overall/combined standard uncertainty from the primary uncertainty sources; (2) validation of the method using a synthetic multi-element standard (SMELS); and (3) analysis of the certified reference materials from the National Institute of Standards and Technology (USA), NIST-SRM-1633a and NIST-SRM-1648, and the reference material from the International Atomic Energy Agency, IAEA-RM-336, for the purpose of controlling the overall accuracy and precision of the analytical results. The obtained results revealed that the k0-NAA method established at the RPI was fit for purpose. The overall/combined standard uncertainty was estimated for the elements of interest in the intended applications. Compared with the assigned values, the laboratory's analytical results showed biases of less than 12% for most elements, with a few elements biased within 13-18%. The u-score values for most elements were less than |1.64|, except for Co, La and Ti, which fell within |1.64|-|1.96|, and Sc, Cr, K and Sb, which fell within |1.96|-|2.58|. NIST-1633a was also analyzed over 14 months for the purpose of evaluating the reproducibility of the method. The quality factors of the k0-NAA established at RPI were evaluated, showing that the method meets the requirements of trace element analysis and that k0-NAA affords a specific, rapid and convenient capability for the intended applications.
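
    The u-score criterion used above is a standard compatibility test between a laboratory result and an assigned value. A minimal Python sketch of that check is given below; the element names, concentrations and uncertainties are illustrative placeholders, not data from the RPI study.

    ```python
    import math

    def u_score(lab_value, lab_unc, ref_value, ref_unc):
        """Compatibility score between a measured and an assigned value.

        |u| < 1.64 is consistent at the 95 % level, 1.64-1.96 is borderline,
        and values above 2.58 indicate a significant discrepancy.
        """
        return abs(lab_value - ref_value) / math.sqrt(lab_unc ** 2 + ref_unc ** 2)

    def relative_bias(lab_value, ref_value):
        """Relative bias of the laboratory result, in percent."""
        return 100.0 * (lab_value - ref_value) / ref_value

    # Illustrative numbers only (mg/kg); not results from the RPI laboratory.
    results = {
        "Co": (46.1, 1.8, 45.0, 1.2),
        "Cr": (215.0, 9.0, 196.0, 6.0),
    }
    for element, (x, ux, ref, uref) in results.items():
        print(f"{element}: bias = {relative_bias(x, ref):+.1f} %, "
              f"u-score = {u_score(x, ux, ref, uref):.2f}")
    ```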

  17. XML-based analysis interface for particle physics data analysis

    International Nuclear Information System (INIS)

    Hu Jifeng; Lu Xiaorui; Zhang Yangheng

    2011-01-01

    This letter focuses on an XML-based interface and its framework for particle physics data analysis. The interface uses a concise XML syntax to describe the basic tasks of data analysis (event selection, kinematic fitting, particle identification, etc.) and a basic processing logic: the next step goes on if and only if this step succeeds. The framework can perform an analysis without compiling, by loading the XML interface file, setting parameters at run time and running dynamically. Coding an analysis in XML instead of C++ is easy to understand and use, effectively reduces the workload, and enables users to carry out their analyses quickly. The framework has been developed on the BESIII offline software system (BOSS) with object-oriented C++ programming. The functions required by the regular tasks and the basic processing logic are either implemented as standard modules or inherited from modules in BOSS. The interface and its framework have been tested to perform physics analysis. (authors)
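
    As a rough illustration of the idea of driving an analysis from an XML job description without recompiling, the Python sketch below applies a chain of selection cuts read from XML. The tag names, attributes and event fields are invented for illustration and do not reproduce the actual BOSS interface syntax.

    ```python
    import xml.etree.ElementTree as ET

    # Hypothetical job description; tag and attribute names are illustrative.
    JOB = """
    <analysis>
      <cut name="nCharged" min="2" max="2"/>
      <cut name="totalEnergy" min="3.0" max="3.2"/>
    </analysis>
    """

    def passes(event, cut):
        value = event[cut.get("name")]
        return float(cut.get("min")) <= value <= float(cut.get("max"))

    def run(events, job_xml):
        cuts = ET.fromstring(job_xml).findall("cut")
        selected = []
        for event in events:
            # Basic processing logic: the next step runs only if this one succeeds
            # (all() short-circuits on the first failed cut).
            if all(passes(event, cut) for cut in cuts):
                selected.append(event)
        return selected

    events = [{"nCharged": 2, "totalEnergy": 3.1},
              {"nCharged": 3, "totalEnergy": 3.1}]
    print(run(events, JOB))   # only the first event survives the selection
    ```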

  18. Computer-Based Image Analysis for Plus Disease Diagnosis in Retinopathy of Prematurity: Performance of the "i-ROP" System and Image Features Associated With Expert Diagnosis.

    Science.gov (United States)

    Ataer-Cansizoglu, Esra; Bolon-Canedo, Veronica; Campbell, J Peter; Bozkurt, Alican; Erdogmus, Deniz; Kalpathy-Cramer, Jayashree; Patel, Samir; Jonas, Karyn; Chan, R V Paul; Ostmo, Susan; Chiang, Michael F

    2015-11-01

    We developed and evaluated the performance of a novel computer-based image analysis system for grading plus disease in retinopathy of prematurity (ROP), and identified the image features, shapes, and sizes that best correlate with expert diagnosis. A dataset of 77 wide-angle retinal images from infants screened for ROP was collected. A reference standard diagnosis was determined for each image by combining image grading from 3 experts with the clinical diagnosis from ophthalmoscopic examination. Manually segmented images were cropped into a range of shapes and sizes, and a computer algorithm was developed to extract tortuosity and dilation features from arteries and veins. Each feature was fed into our system to identify the set of characteristics that yielded the highest-performing system compared to the reference standard, which we refer to as the "i-ROP" system. Among the tested crop shapes, sizes, and measured features, point-based measurements of arterial and venous tortuosity (combined), and a large circular cropped image (with radius 6 times the disc diameter), provided the highest diagnostic accuracy. The i-ROP system achieved 95% accuracy for classifying preplus and plus disease compared to the reference standard. This was comparable to the performance of the 3 individual experts (96%, 94%, 92%), and significantly higher than the mean performance of 31 nonexperts (81%). This comprehensive analysis of computer-based plus disease suggests that it may be feasible to develop a fully-automated system based on wide-angle retinal images that performs comparably to expert graders at three-level plus disease discrimination. Computer-based image analysis, using objective and quantitative retinal vascular features, has potential to complement clinical ROP diagnosis by ophthalmologists.

  19. [The debate concerning performance-based financing in Africa South of the Sahara: analysis of the nature].

    Science.gov (United States)

    Manitu, Serge Mayaka; Meessen, Bruno; Lushimba, Michel Muvudi; Macq, Jean

    2015-01-01

    Performance-based financing (PBF) is a strategy designed to link the funding of health services to predetermined results. Payment by an independent strategic purchaser is subject to verification of effective achievement of health outcomes in terms of quantity and quality. This article investigates the complex tensions observed in relation to performance-based financing (PBF) and identifies some reasons for disagreement on this approach. This study was essentially qualitative. Interviews were conducted with a panel of experts on PBF, mobilizing their ability to reflect on the various arguments and positions concerning this financing mechanism. To enhance our analyses, we proposed a framework based on the main reasons for scientific or political controversies and the factors involved in their emergence. Analysis of the information collected therefore consisted of combining the experts' verbatim reports with the corresponding factors of controversy in our framework. Graphic representations of the differences were also established. Tensions concerning PBF are based on facts (experts' interpretation of PBF), principles and values (around each expert's conceptual framework), balances of power between experts, but also inappropriate behavior in the discussion process. Viewpoints remain isolated, and both individual experience and an overall view are lacking, which can interfere with decision-making and maintain the health system reform crisis. Potential solutions to reduce these tensions are proposed. Our study shows that experts have difficulties agreeing on a theoretical priority approach to PBF. A good understanding of the nature of the tensions and an improvement in the quality of dialogue will promote a real dynamic of change and the proposal of an agenda of PBF actions.

  20. Validation of Fuel Performance Uncertainty for RIA Safety Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Park, Nam-Gyu; Yoo, Jong-Sung; Jung, Yil-Sup [KEPCO Nuclear Fuel Co., Daejeon (Korea, Republic of)

    2016-10-15

    To achieve this, the computer code performance has to be validated against experimental results. For the uncertainty quantification, important uncertainty parameters need to be selected, and the combined uncertainty has to be evaluated with an acceptable statistical treatment. Uncertainty parameters important to rod performance, such as fuel enthalpy, fission gas release and cladding hoop strain, were chosen through rigorous sensitivity studies, and their validity was assessed using the results of experiments performed in CABRI and NSRR. An assessment of fuel performance with an extended fuel power uncertainty was also carried out for the rods tested in NSRR and CABRI. The analysis results showed that several tested rods were not bounded within the calculated combined fuel performance uncertainty. This implies that the currently considered uncertainty range of the parameters is not sufficient to cover the fuel performance.

  1. Design and performance analysis of delay insensitive multi-ring structures

    DEFF Research Database (Denmark)

    Sparsø, Jens; Staunstrup, Jørgen

    1993-01-01

    A set of simple design and performance analysis techniques that have been successfully used to design a number of nontrivial delay insensitive circuits is described. Examples are building blocks for digital filters and a vector multiplier using a serial-parallel multiply and accumulate algorithm. The vector multiplier circuit has been laid out, submitted for fabrication and successfully tested. Throughout the analysis, elements from this design are used to illustrate the design and performance analysis techniques. The design technique is based on a data flow approach using pipelines and rings that are composed into larger multiring structures by joining and forking of signals. By limiting to this class of structures, it is possible, even for complex designs, to analyze the performance and establish an understanding of the bottlenecks.

  2. Evaluating the spatio-temporal performance of sky-imager-based solar irradiance analysis and forecasts

    Science.gov (United States)

    Schmidt, Thomas; Kalisch, John; Lorenz, Elke; Heinemann, Detlev

    2016-03-01

    Clouds are the dominant source of small-scale variability in surface solar radiation and uncertainty in its prediction. However, the increasing share of solar energy in the worldwide electric power supply increases the need for accurate solar radiation forecasts. In this work, we present results of a very short term global horizontal irradiance (GHI) forecast experiment based on hemispheric sky images. A 2-month data set with images from one sky imager and high-resolution GHI measurements from 99 pyranometers distributed over 10 km by 12 km is used for validation. We developed a multi-step model and processed GHI forecasts up to 25 min with an update interval of 15 s. A cloud type classification is used to separate the time series into different cloud scenarios. Overall, the sky-imager-based forecasts do not outperform the reference persistence forecasts. Nevertheless, we find that analysis and forecast performance depends strongly on the predominant cloud conditions. Especially convective type clouds lead to high temporal and spatial GHI variability. For cumulus cloud conditions, the analysis error is found to be lower than that introduced by a single pyranometer if it is used representatively for the whole area in distances from the camera larger than 1-2 km. Moreover, forecast skill is much higher for these conditions compared to overcast or clear sky situations causing low GHI variability, which is easier to predict by persistence. In order to generalize the cloud-induced forecast error, we identify a variability threshold indicating conditions with positive forecast skill.
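
    The forecast skill referred to above is commonly defined relative to the persistence reference, e.g. skill = 1 - RMSE_forecast / RMSE_persistence, so that positive values indicate an improvement over persistence. A minimal sketch with made-up GHI values (not data from the experiment):

    ```python
    import numpy as np

    def rmse(predicted, observed):
        return np.sqrt(np.mean((np.asarray(predicted) - np.asarray(observed)) ** 2))

    def forecast_skill(forecast, observed, persistence):
        """Skill relative to persistence: positive means the forecast beats the
        persistence reference, negative means it does worse."""
        return 1.0 - rmse(forecast, observed) / rmse(persistence, observed)

    # Synthetic GHI samples in W/m^2 (illustrative only).
    observed    = np.array([620.0, 430.0, 510.0, 700.0])
    forecast    = np.array([600.0, 455.0, 490.0, 690.0])
    persistence = np.full_like(observed, observed[0])  # "no change" reference

    print(f"skill = {forecast_skill(forecast, observed, persistence):+.2f}")
    ```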

  3. Evaluating the spatio-temporal performance of sky-imager-based solar irradiance analysis and forecasts

    Directory of Open Access Journals (Sweden)

    T. Schmidt

    2016-03-01

    Full Text Available Clouds are the dominant source of small-scale variability in surface solar radiation and uncertainty in its prediction. However, the increasing share of solar energy in the worldwide electric power supply increases the need for accurate solar radiation forecasts. In this work, we present results of a very short term global horizontal irradiance (GHI) forecast experiment based on hemispheric sky images. A 2-month data set with images from one sky imager and high-resolution GHI measurements from 99 pyranometers distributed over 10 km by 12 km is used for validation. We developed a multi-step model and processed GHI forecasts up to 25 min with an update interval of 15 s. A cloud type classification is used to separate the time series into different cloud scenarios. Overall, the sky-imager-based forecasts do not outperform the reference persistence forecasts. Nevertheless, we find that analysis and forecast performance depends strongly on the predominant cloud conditions. Especially convective type clouds lead to high temporal and spatial GHI variability. For cumulus cloud conditions, the analysis error is found to be lower than that introduced by a single pyranometer if it is used representatively for the whole area in distances from the camera larger than 1–2 km. Moreover, forecast skill is much higher for these conditions compared to overcast or clear sky situations causing low GHI variability, which is easier to predict by persistence. In order to generalize the cloud-induced forecast error, we identify a variability threshold indicating conditions with positive forecast skill.

  4. Evaluating the spatio-temporal performance of sky imager based solar irradiance analysis and forecasts

    Science.gov (United States)

    Schmidt, T.; Kalisch, J.; Lorenz, E.; Heinemann, D.

    2015-10-01

    Clouds are the dominant source of variability in surface solar radiation and uncertainty in its prediction. However, the increasing share of solar energy in the world-wide electric power supply increases the need for accurate solar radiation forecasts. In this work, we present results of a shortest-term global horizontal irradiance (GHI) forecast experiment based on hemispheric sky images. A two-month data set with images from one sky imager and high-resolution GHI measurements from 99 pyranometers distributed over 10 km by 12 km is used for validation. We developed a multi-step model and processed GHI forecasts up to 25 min with an update interval of 15 s. A cloud type classification is used to separate the time series into different cloud scenarios. Overall, the sky imager based forecasts do not outperform the reference persistence forecasts. Nevertheless, we find that analysis and forecast performance depend strongly on the predominant cloud conditions. Especially convective type clouds lead to high temporal and spatial GHI variability. For cumulus cloud conditions, the analysis error is found to be lower than that introduced by a single pyranometer if it is used representatively for the whole area in distances from the camera larger than 1-2 km. Moreover, forecast skill is much higher for these conditions compared to overcast or clear sky situations causing low GHI variability, which is easier to predict by persistence. In order to generalize the cloud-induced forecast error, we identify a variability threshold indicating conditions with positive forecast skill.

  5. Performance Analysis of the Romanian Administration

    Directory of Open Access Journals (Sweden)

    Marius Constantin PROFIROIU

    2013-10-01

    Full Text Available The performance of public administration is one of the top priorities of the national governments worldwide, not only for Romania. The role of a performing management system at the level of public administration is to ensure a high quality and efficiency of the adopted policies and strategies, of the provided public services and of the administrative act itself, and to guarantee the advantage of a competitive and efficient administration both in relation to its own citizens, and in competition with other cities and countries throughout Europe and all around the world. Following these considerations, and based upon an empirical research conducted with the aid of a survey regarding ‘The analysis of the performance level of the Romanian public administration’ the article aims to (1) identify modern management tools that determine and influence the performance of Romanian public institutions, (2) analyze the effects of using project management as organizational capacity development instruments by public administration in Romania, and (3) determine the influence and effects of the external factors on the performance and development of Romanian public administration.

  6. Experience with performance based training of nuclear criticality safety engineers

    International Nuclear Information System (INIS)

    Taylor, R.G.

    1993-01-01

    For non-reactor nuclear facilities, the U.S. Department of Energy (DOE) does not require that nuclear criticality safety engineers demonstrate qualification for their job. It is likely, however, that more formalism will be required in the future. Current DOE requirements for those positions which do have to demonstrate qualification indicate that qualification should be achieved by using a systematic approach such as performance based training (PBT). Assuming that PBT would be an acceptable mechanism for nuclear criticality safety engineer training in a more formal environment, a site-specific analysis of the nuclear criticality safety engineer job was performed. Based on this analysis, classes are being developed and delivered to a target audience of newer nuclear criticality safety engineers. Because current interest is in developing training for selected aspects of the nuclear criticality safety engineer job, the analysis is incompletely developed in some areas

  7. Importance Performance Analysis as a Trade Show Performance Evaluation and Benchmarking Tool

    OpenAIRE

    Tafesse, Wondwesen; Skallerud, Kåre; Korneliussen, Tor

    2010-01-01

    Author's accepted version (post-print). The purpose of this study is to introduce importance performance analysis as a trade show performance evaluation and benchmarking tool. Importance performance analysis considers exhibitors’ performance expectation and perceived performance in unison to evaluate and benchmark trade show performance. The present study uses data obtained from exhibitors of an international trade show to demonstrate how importance performance analysis can be used to eval...

  8. Moments Based Framework for Performance Analysis of One-Way/Two-Way CSI-Assisted AF Relaying

    KAUST Repository

    Xia, Minghua

    2012-09-01

    When analyzing system performance of conventional one-way relaying or advanced two-way relaying, these two techniques are always dealt with separately and, thus, their performance cannot be compared efficiently. Moreover, for ease of mathematical tractability, channels considered in such studies are generally assumed to be subject to Rayleigh fading or to be Nakagami-m channels with integer fading parameters, which is impractical in typical urban environments. In this paper, we propose a unified moments-based framework for general performance analysis of channel-state-information (CSI) assisted amplify-and-forward (AF) relaying systems. The framework is applicable to both one-way and two-way relaying over arbitrary Nakagami-m fading channels, and it includes previously reported results as special cases. Specifically, the mathematical framework is firstly developed under the umbrella of the weighted harmonic mean of two Gamma-distributed variables in conjunction with the theory of Padé approximants. Then, general expressions for the received signal-to-noise ratios of the users in one-way/two-way relaying systems and the corresponding moments, moment generation function, and cumulative density function are established. Subsequently, the mathematical framework is applied to analyze, compare, and gain insights into system performance of one-way and two-way relaying techniques, in terms of outage probability, average symbol error probability, and achievable data rate. All analytical results are corroborated by simulation results as well as previously reported results whenever available, and they are shown to be efficient tools to evaluate and compare system performance of one-way and two-way relaying.
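
    As a rough numerical companion to such an analytical framework, the sketch below estimates the outage probability of a dual-hop CSI-assisted AF link over Nakagami-m fading by Monte Carlo simulation, using the common half-harmonic-mean approximation of the end-to-end SNR. The fading parameters and SNR values are illustrative assumptions, not cases from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def outage_probability(m1, m2, avg_snr1, avg_snr2, threshold_db, n=200_000):
        """Monte Carlo outage estimate for dual-hop CSI-assisted AF relaying.

        Nakagami-m fading on each hop makes the per-hop SNR Gamma distributed
        (shape m, mean avg_snr). The end-to-end SNR is approximated by the
        half-harmonic-mean bound g1*g2/(g1+g2), a standard tight approximation.
        """
        g1 = rng.gamma(m1, avg_snr1 / m1, n)
        g2 = rng.gamma(m2, avg_snr2 / m2, n)
        end_to_end = g1 * g2 / (g1 + g2)
        return np.mean(end_to_end < 10 ** (threshold_db / 10))

    # Illustrative parameters: non-integer fading, 10 dB average SNR per hop.
    print(outage_probability(m1=1.5, m2=2.3, avg_snr1=10.0, avg_snr2=10.0,
                             threshold_db=3.0))
    ```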

  9. Performance Analysis of AODV Routing Protocol for Wireless Sensor Network based Smart Metering

    International Nuclear Information System (INIS)

    Farooq, Hasan; Jung, Low Tang

    2013-01-01

    Today no one can deny the need for Smart Grid and it is being considered as of utmost importance to upgrade outdated electric infrastructure to cope with the ever increasing electric load demand. Wireless Sensor Network (WSN) is considered a promising candidate for internetworking of smart meters with the gateway using mesh topology. This paper investigates the performance of AODV routing protocol for WSN based smart metering deployment. Three case studies are presented to analyze its performance based on four metrics of (i) Packet Delivery Ratio, (ii) Average Energy Consumption of Nodes (iii) Average End-End Delay and (iv) Normalized Routing Load.
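
    The four metrics listed above are simple ratios and averages over a simulation trace. A small Python sketch, with an invented trace summary rather than results from the case studies:

    ```python
    def packet_delivery_ratio(received, sent):
        """Fraction of data packets delivered to the gateway."""
        return received / sent

    def average_end_to_end_delay(delays_s):
        """Mean delay, in seconds, over the delivered packets."""
        return sum(delays_s) / len(delays_s)

    def normalized_routing_load(routing_packets, data_packets_received):
        """Routing overhead packets per delivered data packet."""
        return routing_packets / data_packets_received

    # Illustrative trace summary (not measurements from the paper).
    sent, received = 1000, 942
    delays = [0.08, 0.11, 0.09, 0.15]          # seconds, per delivered packet
    routing_packets = 3100

    print(f"PDR   = {packet_delivery_ratio(received, sent):.2%}")
    print(f"delay = {average_end_to_end_delay(delays) * 1000:.1f} ms")
    print(f"NRL   = {normalized_routing_load(routing_packets, received):.2f}")
    ```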

  10. ACHIEVING MATURITY (AND MEASURING PERFORMANCE) THROUGH MODEL-BASED PROCESS IMPROVEMENT

    Directory of Open Access Journals (Sweden)

    Jose Marcelo Almeida Prado Cestari

    2013-08-01

    Full Text Available This paper presents the approach adopted by a software development unit in order to achieve maturity level 3 of CMMI-DEV and thereby obtain better performance. Through historical research and secondary data analysis of the organization, the paper intends to answer the following research question: "Could the adoption of maturity/best practices models bring better performance results to small and medium organizations?" The data and analysis show that, besides the creation of indicator-based management, there are quantitative performance improvements in indicators such as Schedule Deviation Rate, Effort Deviation Rate, Percent Late Delivery, Productivity Deviation and Internal Rework Rate.

  11. Machine learning-based analysis of MR radiomics can help to improve the diagnostic performance of PI-RADS v2 in clinically relevant prostate cancer

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Jing [CFDA, Center for Medical Device Evaluation, Beijing (China); Wu, Chen-Jiang; Zhang, Jing; Wang, Xiao-Ning; Zhang, Yu-Dong [First Affiliated Hospital with Nanjing Medical University, Department of Radiology, Nanjing, Jiangsu Province (China); Bao, Mei-Ling [First Affiliated Hospital with Nanjing Medical University, Department of Pathology, Nanjing (China)

    2017-10-15

    To investigate whether machine learning-based analysis of MR radiomics can help improve the performance of PI-RADS v2 in clinically relevant prostate cancer (PCa). This IRB-approved study included 54 patients with PCa undergoing multi-parametric (mp) MRI before prostatectomy. Imaging analysis was performed on 54 tumours, 47 normal peripheral (PZ) and 48 normal transitional (TZ) zones based on histological-radiological correlation. Mp-MRI was scored via PI-RADS, and quantified by measuring radiomic features. A predictive model was developed using a novel support vector machine trained with: (i) radiomics, (ii) PI-RADS scores, (iii) radiomics and PI-RADS scores. Paired comparison was made via ROC analysis. For PCa versus normal TZ, the model trained with radiomics had a significantly higher area under the ROC curve (Az) (0.955 [95% CI 0.923-0.976]) than PI-RADS (Az: 0.878 [0.834-0.914], p < 0.001). The difference in Az between them was not significant for PCa versus PZ (0.972 [0.945-0.988] vs. 0.940 [0.905-0.965], p = 0.097). When radiomics was added, performance of PI-RADS was significantly improved for PCa versus PZ (Az: 0.983 [0.960-0.995]) and PCa versus TZ (Az: 0.968 [0.940-0.985]). Machine learning analysis of MR radiomics can help improve the performance of PI-RADS in clinically relevant PCa. (orig.)
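
    A minimal sketch of the kind of model described (a support vector machine trained on radiomic features and scored by the area under the ROC curve) is given below using scikit-learn; the feature matrix is synthetic and only loosely modeled on the study's sample sizes, so the output is purely illustrative.

    ```python
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_predict
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(1)

    # Synthetic stand-ins: 54 "tumour" vs 48 "normal zone" cases, 20 radiomic features.
    X_tumour = rng.normal(0.6, 1.0, size=(54, 20))
    X_normal = rng.normal(0.0, 1.0, size=(48, 20))
    X = np.vstack([X_tumour, X_normal])
    y = np.array([1] * 54 + [0] * 48)

    # Probability-calibrated SVM scored with cross-validated ROC AUC (Az).
    clf = SVC(kernel="rbf", probability=True, random_state=0)
    scores = cross_val_predict(clf, X, y, cv=5, method="predict_proba")[:, 1]
    print(f"Az (ROC AUC) = {roc_auc_score(y, scores):.3f}")
    ```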

  12. Machine learning-based analysis of MR radiomics can help to improve the diagnostic performance of PI-RADS v2 in clinically relevant prostate cancer

    International Nuclear Information System (INIS)

    Wang, Jing; Wu, Chen-Jiang; Zhang, Jing; Wang, Xiao-Ning; Zhang, Yu-Dong; Bao, Mei-Ling

    2017-01-01

    To investigate whether machine learning-based analysis of MR radiomics can help improve the performance of PI-RADS v2 in clinically relevant prostate cancer (PCa). This IRB-approved study included 54 patients with PCa undergoing multi-parametric (mp) MRI before prostatectomy. Imaging analysis was performed on 54 tumours, 47 normal peripheral (PZ) and 48 normal transitional (TZ) zones based on histological-radiological correlation. Mp-MRI was scored via PI-RADS, and quantified by measuring radiomic features. A predictive model was developed using a novel support vector machine trained with: (i) radiomics, (ii) PI-RADS scores, (iii) radiomics and PI-RADS scores. Paired comparison was made via ROC analysis. For PCa versus normal TZ, the model trained with radiomics had a significantly higher area under the ROC curve (Az) (0.955 [95% CI 0.923-0.976]) than PI-RADS (Az: 0.878 [0.834-0.914], p < 0.001). The difference in Az between them was not significant for PCa versus PZ (0.972 [0.945-0.988] vs. 0.940 [0.905-0.965], p = 0.097). When radiomics was added, performance of PI-RADS was significantly improved for PCa versus PZ (Az: 0.983 [0.960-0.995]) and PCa versus TZ (Az: 0.968 [0.940-0.985]). Machine learning analysis of MR radiomics can help improve the performance of PI-RADS in clinically relevant PCa. (orig.)

  13. A Preliminary Model for Spacecraft Propulsion Performance Analysis Based on Nuclear Gain and Subsystem Mass-Power Balances

    Science.gov (United States)

    Chakrabarti, Suman; Schmidt, George R.; Thio, Y. C.; Hurst, Chantelle M.

    1999-01-01

    A preliminary model for spacecraft propulsion performance analysis based on nuclear gain and subsystem mass-power balances is presented in viewgraph form. For very fast missions with straight-line trajectories, it has been shown that mission trip time is proportional to the cube root of alpha, the vehicle mass-power ratio. Analysis of spacecraft power systems via a power balance and examination of gain vs. mass-power ratio has shown: 1) A minimum gain is needed to have enough power for thruster and driver operation; and 2) Increases in gain result in decreases in overall mass-power ratio, which in turn leads to greater achievable accelerations. However, subsystem mass-power ratios and efficiencies are crucial: less efficient values for these can partially offset the effect of nuclear gain. Therefore, it is of interest to monitor the progress of gain-limited subsystem technologies and it is also possible that power-limited systems with sufficiently low alpha may be competitive for such ambitious missions. Topics include Space flight requirements; Spacecraft energy gain; Control theory for performance; Mission assumptions; Round trips: Time and distance; Trip times; Vehicle acceleration; and Minimizing trip times.

  14. Analysis of business process maturity and organisational performance relations

    Directory of Open Access Journals (Sweden)

    Kalinowski T. Bartosz

    2016-12-01

    Full Text Available The paper aims to present results of the study on business process maturity in relation to organisational performance. A two-phase methodology, based on a literature review and a survey, was used. The literature is a source of knowledge about business process maturity and organisational performance, whereas the research on process maturity vs organisational performance in Polish enterprises provides findings based on 84 surveyed companies. The main areas of the research covered the identification and analysis of maturity-related variables and the identification of organisational performance perspectives and their relation to process maturity. The study shows that there is a significant positive relation between process maturity and organisational performance. Although research on such a relation is available, it is scarce and has some significant limitations in terms of the research sample or the scope of maturity or organisational performance covered. This publication is part of a project funded by the National Science Centre awarded by decision number DEC-2011/01/D/HS4/04070.

  15. Performance analysis of switching systems

    NARCIS (Netherlands)

    Berg, van den R.A.

    2008-01-01

    Performance analysis is an important aspect in the design of dynamic (control) systems. Without a proper analysis of the behavior of a system, it is impossible to guarantee that a certain design satisfies the system’s requirements. For linear time-invariant systems, accurate performance analyses are

  16. Performance of Loaded Thermal Storage Unit with a Commercial Phase Change Materials based on Energy and Exergy Analysis

    Directory of Open Access Journals (Sweden)

    Abdullah Nasrallh Olimat

    2017-11-01

    Article History: Received July 6th 2017; Received in revised form September 15th 2017; Accepted September 25th 2017; Available online. How to Cite This Article: Olimat, A.N., Awad, A.S., Al-Gathain, F.M., and Shaban, N.A. (2017) Performance of Loaded Thermal Storage Unit with a Commercial Phase Change Material Based on Energy and Exergy Analysis. International Journal of Renewable Energy Development, 6(3), 283-290. https://doi.org/10.14710/ijred.6.3.283-290

  17. Performance Analysis of Korean Liquid metal type TBM based on Monte Carlo code

    Energy Technology Data Exchange (ETDEWEB)

    Kim, C. H.; Han, B. S.; Park, H. J.; Park, D. K. [Seoul National Univ., Seoul (Korea, Republic of)

    2007-01-15

    The objective of this project is to analyze a nuclear performance of the Korean HCML(Helium Cooled Molten Lithium) TBM(Test Blanket Module) which will be installed in ITER(International Thermonuclear Experimental Reactor). This project is intended to analyze a neutronic design and nuclear performances of the Korean HCML ITER TBM through the transport calculation of MCCARD. In detail, we will conduct numerical experiments for analyzing the neutronic design of the Korean HCML TBM and the DEMO fusion blanket, and improving the nuclear performances. The results of the numerical experiments performed in this project will be utilized further for a design optimization of the Korean HCML TBM. In this project, Monte Carlo transport calculations for evaluating TBR (Tritium Breeding Ratio) and EMF (Energy Multiplication factor) were conducted to analyze a nuclear performance of the Korean HCML TBM. The activation characteristics and shielding performances for the Korean HCML TBM were analyzed using ORIGEN and MCCARD. We proposed the neutronic methodologies for analyzing the nuclear characteristics of the fusion blanket, which was applied to the blanket analysis of a DEMO fusion reactor. In the results, the TBR of the Korean HCML ITER TBM is 0.1352 and the EMF is 1.362. Taking into account a limitation for the Li amount in ITER TBM, it is expected that tritium self-sufficiency condition can be satisfied through a change of the Li quantity and enrichment. In the results of activation and shielding analysis, the activity drops to 1.5% of the initial value and the decay heat drops to 0.02% of the initial amount after 10 years from plasma shutdown.

  18. Integrated design and performance analysis of the KO HCCR TBM for ITER

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Dong Won, E-mail: dwlee@kaeri.re.kr [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Jin, Hyung Gon; Lee, Eo Hwak; Yoon, Jae Sung; Kim, Suk Kwon; Lee, Cheol Woo [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Ahn, Mu-Young; Cho, Seungyon [National Fusion Research Institute, Daejeon (Korea, Republic of)

    2015-10-15

    Highlights: • Integrated analysis is performed with the conventional CFD code (ANSYS-CFX). • Overall pressure drop and coolant flow scheme are investigated. • Manifold design is being performed considering flow distribution. - Abstract: To develop tritium breeding technology for a Fusion Reactor, Korea has participated in the Test Blanket Module (TBM) program in ITER. The He Cooled Ceramic Reflector (HCCR) TBM consists of functional components such as First Wall (FW), Breeding Zone (BZ), Side Wall (SW), and Back Manifold (BM) and it was designed based on the separate analyses for each component in 2012. Based on the each component analysis model, the integrated model is prepared and thermal-hydraulic analysis for the HCCR TBM is performed in the present study. The coolant flow distribution from BM and SW to FW and BZ, and resulted structure temperatures are obtained with the integrated model. It is found that the non-uniform flow rate occurs at FW and BZ and it causes excess of the design limit (550 °C) at some region. Based on this integrated model, we will perform the design optimization for obtaining uniform flow distribution for satisfying the design requirements.

  19. A Case Study for Student Performance Analysis based on Educational Data Mining (EDM)

    OpenAIRE

    Daxa Kundariya; Prof. Vaseem Ghada

    2016-01-01

    Educational Data Mining (EDM) is a study methodology and an application of data mining techniques to students' data from academic databases. Like other domains, the educational domain also produces vast amounts of study data. To enhance the quality of an education system, student performance analysis plays an important role in decision support. This paper elaborates a study on various educational data mining techniques and how they could be applied in an educational system to analyse student perfor...

  20. Analysis of tag-based Recommendation Performance for a Semantic Wiki

    DEFF Research Database (Denmark)

    Durao, Frederico; Dolog, Peter

    2009-01-01

    Recommendations play a very important role for revealing related topics addressed in the wikis beyond the currently viewed page. In this paper, we extend KiWi, a semantic wiki, with three different recommendation approaches. The first approach is implemented as a traditional tag-based retrieval, the second takes into account external factors such as tag popularity, tag representativeness and the affinity between user and tag, and the third approach recommends pages grouped by tag. The experiment evaluates the wiki performance in different scenarios regarding the amount of pages, tags and users...

  1. Optimal depth-based regional frequency analysis

    Science.gov (United States)

    Wazneh, H.; Chebana, F.; Ouarda, T. B. M. J.

    2013-06-01

    Classical methods of regional frequency analysis (RFA) of hydrological variables face two drawbacks: (1) the restriction to a particular region which can lead to a loss of some information and (2) the definition of a region that generates a border effect. To reduce the impact of these drawbacks on regional modeling performance, an iterative method was proposed recently, based on the statistical notion of the depth function and a weight function φ. This depth-based RFA (DBRFA) approach was shown to be superior to traditional approaches in terms of flexibility, generality and performance. The main difficulty of the DBRFA approach is the optimal choice of the weight function ϕ (e.g., φ minimizing estimation errors). In order to avoid a subjective choice and naïve selection procedures of φ, the aim of the present paper is to propose an algorithm-based procedure to optimize the DBRFA and automate the choice of ϕ according to objective performance criteria. This procedure is applied to estimate flood quantiles in three different regions in North America. One of the findings from the application is that the optimal weight function depends on the considered region and can also quantify the region's homogeneity. By comparing the DBRFA to the canonical correlation analysis (CCA) method, results show that the DBRFA approach leads to better performances both in terms of relative bias and mean square error.
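
    The relative bias and relative root-mean-square error used as performance criteria in such comparisons can be computed as below; the quantile estimates are invented numbers for illustration, not results from the North American case studies.

    ```python
    import numpy as np

    def relative_bias(estimates, true_value):
        """Mean relative bias of regional quantile estimates, in percent."""
        estimates = np.asarray(estimates, dtype=float)
        return 100.0 * np.mean((estimates - true_value) / true_value)

    def relative_rmse(estimates, true_value):
        """Relative root-mean-square error, in percent."""
        estimates = np.asarray(estimates, dtype=float)
        return 100.0 * np.sqrt(np.mean(((estimates - true_value) / true_value) ** 2))

    # Illustrative 100-year flood quantile estimates (m^3/s) at one target site.
    q_true  = 850.0
    q_dbrfa = [878.0, 905.0, 812.0, 861.0]
    q_cca   = [930.0, 1015.0, 760.0, 890.0]

    for name, q in [("DBRFA", q_dbrfa), ("CCA", q_cca)]:
        print(f"{name}: RB = {relative_bias(q, q_true):+.1f} %, "
              f"RRMSE = {relative_rmse(q, q_true):.1f} %")
    ```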

  2. Performance Analysis of a Cluster-Based MAC Protocol for Wireless Ad Hoc Networks

    Directory of Open Access Journals (Sweden)

    Jesús Alonso-Zárate

    2010-01-01

    Full Text Available An analytical model to evaluate the non-saturated performance of the Distributed Queuing Medium Access Control Protocol for Ad Hoc Networks (DQMAN) in single-hop networks is presented in this paper. DQMAN is comprised of a spontaneous, temporary, and dynamic clustering mechanism integrated with a near-optimum distributed queuing Medium Access Control (MAC) protocol. Clustering is executed in a distributed manner using a mechanism inspired by the Distributed Coordination Function (DCF) of the IEEE 802.11. Once a station seizes the channel, it becomes the temporary clusterhead of a spontaneous cluster and it coordinates the peer-to-peer communications between the clustermembers. Within each cluster, a near-optimum distributed queuing MAC protocol is executed. The theoretical performance analysis of DQMAN in single-hop networks under non-saturation conditions is presented in this paper. The approach integrates the analysis of the clustering mechanism into the MAC layer model. To the knowledge of the authors, this approach is novel in the literature. In addition, the performance of an ad hoc network using DQMAN is compared to that obtained when using the DCF of the IEEE 802.11, as a benchmark reference.

  3. Dynamic performances analysis of a real vehicle driving

    Science.gov (United States)

    Abdullah, M. A.; Jamil, J. F.; Salim, M. A.

    2015-12-01

    Vehicle dynamics refers to the effects of the movement of a vehicle generated by acceleration, braking, ride and handling activities. The dynamic behaviour is determined by the tire, gravity and aerodynamic forces acting on the vehicle. This paper emphasizes the analysis of the dynamic performance of a real vehicle. A real driving experiment is conducted to determine the behaviour of the vehicle in terms of roll, pitch and yaw as well as longitudinal, lateral and vertical acceleration. The experiment uses an accelerometer to record the vehicle's dynamic response when it is driven on the road. The experiment starts with weighing the car model to obtain the center of gravity (COG), where the accelerometer sensor for data acquisition (DAQ) is placed. The COG of the vehicle is determined using the weight of the vehicle. A rural route is set for the experiment and the road conditions are determined for the test. The dynamic performance of the vehicle depends on the road conditions and the driving maneuver. The stability of a vehicle can be controlled based on the dynamic performance analysis.

  4. Heat transfer and performance analysis of thermoelectric stoves

    International Nuclear Information System (INIS)

    Najjar, Yousef S.H.; Kseibi, Musaab M.

    2016-01-01

    Highlights: • Design and testing of a thermo electric stove. • Three biofuels namely: wood, peat and manure are used. • Heat transfer analysis is detailed. • Resulting thermoelectric energy for vital purposes in remote poor regions. • Evaluation of performance of the stove subcomponents. - Abstract: Access to electricity is one of the important challenges for remote poor regions of the world. Adding TEG (thermoelectric generators) to stoves can provide electricity for the basic benefits such as: operating radio, light, phones, medical instruments and other small electronic devices. Heat transfer analysis of a multi-purpose stove coupled with 12 TEG modules is presented. This analysis comprises a well aerodynamically designed combustor, finned TEG base plate, cooker and water heater beside the outer surface for space heating. Heat transfer analysis was also carried out for all the subcomponents of the stove, and performance predicted against the experimental results. It was found that the maximum power obtained is about 7.88 W using wood, manure or peat with an average overall efficiency of the stove about 60%.

  5. Discrete Discriminant analysis based on tree-structured graphical models

    DEFF Research Database (Denmark)

    Perez de la Cruz, Gonzalo; Eslava, Guillermina

    The purpose of this paper is to illustrate the potential use of discriminant analysis based on tree-structured graphical models for discrete variables. This is done by comparing its empirical performance using estimated error rates for real and simulated data. The results show that discriminant analysis based on tree-structured graphical models is a simple nonlinear method competitive with, and sometimes superior to, other well-known linear methods like those assuming mutual independence between variables and linear logistic regression.

  6. Analysis of ESG indicators for measuring enterprise performance

    Directory of Open Access Journals (Sweden)

    Zuzana Chvátalová

    2013-01-01

    Full Text Available In this article the authors focus on the analysis of the whole set of environmental, social and corporate governance (ESG) indicators for the elimination of double or triple effects in the subsequent construction of methods for measuring corporate performance. They build on their previously published results (in Acta univ. agric. et silvic. Mendel. Brun., 2012). Selected partial results of a recently undertaken project entitled ‘Construction of Methods for Multifactorial Assessment of Company Complex Performance in Selected Sectors’ were used. This project has been carried out since 2011 by research teams of the Faculty of Business and Management of Brno University of Technology and the Faculty of Business and Economics of Mendel University in Brno. Further theoretical resources in the environmental, social and corporate governance area, known indicator databases (namely the Global Reporting Initiative), and comparative analyses and syntheses for identifying possible common indicator properties were used to classify indicator subsets and to preclude double or even triple effects based on mathematical set theory (Venn diagrams). The indicator analysis in the constructed multi-factorial methods contributes to precise decision making in management to improve corporate performance.

  7. CASAS: Cancer Survival Analysis Suite, a web based application.

    Science.gov (United States)

    Rupji, Manali; Zhang, Xinyan; Kowalski, Jeanne

    2017-01-01

    We present CASAS, a shiny R based tool for interactive survival analysis and visualization of results. The tool provides a web-based one stop shop to perform the following types of survival analysis:  quantile, landmark and competing risks, in addition to standard survival analysis.  The interface makes it easy to perform such survival analyses and obtain results using the interactive Kaplan-Meier and cumulative incidence plots.  Univariate analysis can be performed on one or several user specified variable(s) simultaneously, the results of which are displayed in a single table that includes log rank p-values and hazard ratios along with their significance. For several quantile survival analyses from multiple cancer types, a single summary grid is constructed. The CASAS package has been implemented in R and is available via http://shinygispa.winship.emory.edu/CASAS/. The developmental repository is available at https://github.com/manalirupji/CASAS/.
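
    CASAS itself is an R/Shiny application, but the standard analysis it exposes (Kaplan-Meier estimation with a log-rank comparison) can be sketched in a few lines of Python with the lifelines package; the toy survival times and groups below are invented for illustration and are not CASAS data.

    ```python
    import numpy as np
    from lifelines import KaplanMeierFitter
    from lifelines.statistics import logrank_test

    # Toy cohort: survival time in months and an event indicator (1 = event),
    # split by an illustrative grouping variable.
    time_a  = np.array([5, 8, 12, 20, 24, 30, 36])
    event_a = np.array([1, 1, 0, 1, 0, 1, 0])
    time_b  = np.array([3, 6, 7, 10, 14, 18, 22])
    event_b = np.array([1, 1, 1, 0, 1, 1, 1])

    # Kaplan-Meier fit for one group and its median survival time.
    kmf = KaplanMeierFitter()
    kmf.fit(time_a, event_observed=event_a, label="group A")
    print(kmf.median_survival_time_)

    # Log-rank comparison between the two groups.
    result = logrank_test(time_a, time_b, event_observed_A=event_a,
                          event_observed_B=event_b)
    print(f"log-rank p = {result.p_value:.3f}")
    ```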

  8. Automated Cache Performance Analysis And Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Mohror, Kathryn [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2013-12-23

    While there is no lack of performance counter tools for coarse-grained measurement of cache activity, there is a critical lack of tools for relating data layout to cache behavior to application performance. Generally, any nontrivial optimizations are either not done at all, or are done "by hand", requiring significant time and expertise. To the best of our knowledge no tool available to users measures the latency of memory reference instructions for particular addresses and makes this information available to users in an easy-to-use and intuitive way. In this project, we worked to enable the Open|SpeedShop performance analysis tool to gather memory reference latency information for specific instructions and memory addresses, and to gather and display this information in an easy-to-use and intuitive way to aid performance analysts in identifying problematic data structures in their codes. This tool was primarily designed for use in the supercomputer domain as well as grid, cluster, cloud-based parallel e-commerce, and engineering systems and middleware. Ultimately, we envision a tool to automate optimization of application cache layout and utilization in the Open|SpeedShop performance analysis tool. To commercialize this software, we worked to develop core capabilities for gathering enhanced memory usage performance data from applications and create and apply novel methods for automatic data structure layout optimizations, tailoring the overall approach to support existing supercomputer and cluster programming models and constraints. In this Phase I project, we focused on infrastructure necessary to gather performance data and present it in an intuitive way to users. With the advent of enhanced Precise Event-Based Sampling (PEBS) counters on recent Intel processor architectures and equivalent technology on AMD processors, we are now in a position to access memory reference information for particular addresses. Prior to the introduction of PEBS counters

  9. PREPS2 - a PC-based computer program for performing economic analysis of capital projects

    International Nuclear Information System (INIS)

    Blake, M.W.; Brand, D.O.; Chastain, E.T.; Johnson, E.D.

    1990-01-01

    In these times of increased spending to finance new capacity and to meet clean air act legislation, many electric utilities are giving a high priority to controlling capital expenditures at existing generating facilities. Determining the level of capital expenditures which are economically justified is very difficult; units which have higher capacity factors are worth more to the utility. Therefore, the utility can more readily justify higher capital expenditures to improve or maintain reliability and heat rate than on units with lower capacity factors. This paper describes a PC-based computer program (PREPS2) which performs an economic analysis of individual capital projects. The program incorporates tables which describe the worth to the system of making improvements in each unit. This computer program is currently being used by the six Southern Company operating companies to evaluate all production capital projects over $50,000. Approximately 500 projects representing about $300 million are being analyzed each year
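
    The core of such an economic screening is a net-present-value calculation over a project's yearly cash flows. A minimal sketch, with invented figures rather than values used in the Southern Company analyses:

    ```python
    def npv(rate, cash_flows):
        """Net present value of a series of yearly cash flows.

        cash_flows[0] is the (usually negative) outlay at year 0.
        """
        return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

    # Illustrative capital project: $250k outlay, five years of benefits.
    flows = [-250_000, 60_000, 70_000, 75_000, 75_000, 75_000]
    rate = 0.09   # discount rate; chosen to match the utility's cost of capital

    value = npv(rate, flows)
    print(f"NPV = {value:,.0f} USD  ->  {'justify' if value > 0 else 'defer'}")
    ```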

  10. Frequency domain performance analysis of marginally stable LTI systems with saturation

    NARCIS (Netherlands)

    Berg, van den R.A.; Pogromski, A.Y.; Rooda, J.E.; Leonov, G.; Nijmeijer, H.; Pogromsky, A.; Fradkov, A.

    2009-01-01

    In this paper we discuss the frequency domain performance analysis of a marginally stable linear time-invariant (LTI) system with saturation in the feedback loop. We present two methods, both based on the notion of convergent systems, that allow to evaluate the performance of this type of systems in

  11. Performance Analysis of MYSEA

    Science.gov (United States)

    2012-09-01

    …inspection takes place in different processes in the server architecture. Key Performance Indicators (KPIs) associated with the system need to be … application and risk analysis of security controls. Thus, measurement of the KPIs is needed before an informed tradeoff between the performance penalties

  12. Improving the performance of a filling line based on simulation

    Science.gov (United States)

    Jasiulewicz-Kaczmarek, M.; Bartkowiak, T.

    2016-08-01

    The paper describes a method of improving the performance of a filling line based on simulation. This study concerns a production line located in a manufacturing centre of an FMCG company. A discrete event simulation model was built using data provided by a maintenance data acquisition system. Two types of failures were identified in the system and were approximated using continuous statistical distributions. The model was validated taking into consideration line performance measures. A brief Pareto analysis of line failures was conducted to identify potential areas of improvement. Two improvement scenarios were proposed and tested via simulation. The outcomes of the simulations were the basis of a financial analysis. NPV and ROI values were calculated taking into account depreciation, profits, losses, the current CIT rate and inflation. A validated simulation model can be a useful tool in the maintenance decision-making process.

  13. Sports Stars: Analyzing the Performance of Astronomers at Visualization-based Discovery

    Science.gov (United States)

    Fluke, C. J.; Parrington, L.; Hegarty, S.; MacMahon, C.; Morgan, S.; Hassan, A. H.; Kilborn, V. A.

    2017-05-01

    In this data-rich era of astronomy, there is a growing reliance on automated techniques to discover new knowledge. The role of the astronomer may change from being a discoverer to being a confirmer. But what do astronomers actually look at when they distinguish between “sources” and “noise?” What are the differences between novice and expert astronomers when it comes to visual-based discovery? Can we identify elite talent or coach astronomers to maximize their potential for discovery? By looking to the field of sports performance analysis, we consider an established, domain-wide approach, where the expertise of the viewer (i.e., a member of the coaching team) plays a crucial role in identifying and determining the subtle features of gameplay that provide a winning advantage. As an initial case study, we investigate whether the SportsCode performance analysis software can be used to understand and document how an experienced Hi astronomer makes discoveries in spectral data cubes. We find that the process of timeline-based coding can be applied to spectral cube data by mapping spectral channels to frames within a movie. SportsCode provides a range of easy to use methods for annotation, including feature-based codes and labels, text annotations associated with codes, and image-based drawing. The outputs, including instance movies that are uniquely associated with coded events, provide the basis for a training program or team-based analysis that could be used in unison with discipline specific analysis software. In this coordinated approach to visualization and analysis, SportsCode can act as a visual notebook, recording the insight and decisions in partnership with established analysis methods. Alternatively, in situ annotation and coding of features would be a valuable addition to existing and future visualization and analysis packages.

  14. α-Cut method based importance measure for criticality analysis in fuzzy probability – Based fault tree analysis

    International Nuclear Information System (INIS)

    Purba, Julwan Hendry; Sony Tjahyani, D.T.; Widodo, Surip; Tjahjono, Hendro

    2017-01-01

    Highlights: •FPFTA deals with epistemic uncertainty using fuzzy probability. •Criticality analysis is important for reliability improvement. •An α-cut method based importance measure is proposed for criticality analysis in FPFTA. •The α-cut method based importance measure utilises α-cut multiplication, α-cut subtraction, and area defuzzification technique. •Benchmarking confirm that the proposed method is feasible for criticality analysis in FPFTA. -- Abstract: Fuzzy probability – based fault tree analysis (FPFTA) has been recently developed and proposed to deal with the limitations of conventional fault tree analysis. In FPFTA, reliabilities of basic events, intermediate events and top event are characterized by fuzzy probabilities. Furthermore, the quantification of the FPFTA is based on fuzzy multiplication rule and fuzzy complementation rule to propagate uncertainties from basic event to the top event. Since the objective of the fault tree analysis is to improve the reliability of the system being evaluated, it is necessary to find the weakest path in the system. For this purpose, criticality analysis can be implemented. Various importance measures, which are based on conventional probabilities, have been developed and proposed for criticality analysis in fault tree analysis. However, not one of those importance measures can be applied for criticality analysis in FPFTA, which is based on fuzzy probability. To be fully applied in nuclear power plant probabilistic safety assessment, FPFTA needs to have its corresponding importance measure. The objective of this study is to develop an α-cut method based importance measure to evaluate and rank the importance of basic events for criticality analysis in FPFTA. To demonstrate the applicability of the proposed measure, a case study is performed and its results are then benchmarked to the results generated by the four well known importance measures in conventional fault tree analysis. The results
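
    The α-cut operations underlying the proposed measure reduce to interval arithmetic on the α-cuts of the fuzzy probabilities. The sketch below shows α-cut extraction, multiplication (an AND gate) and complementation for triangular fuzzy numbers; the membership functions and numerical values are illustrative, and this is not the paper's full importance-measure algorithm.

    ```python
    def alpha_cut(tri, alpha):
        """Alpha-cut interval of a triangular fuzzy number (a, b, c)."""
        a, b, c = tri
        return (a + alpha * (b - a), c - alpha * (c - b))

    def interval_mul(x, y):
        """AND gate on positive probability intervals (alpha-cut multiplication)."""
        return (x[0] * y[0], x[1] * y[1])

    def interval_complement(x):
        """Alpha-cut subtraction from 1, used when combining OR gates."""
        return (1.0 - x[1], 1.0 - x[0])

    # Two illustrative basic-event fuzzy probabilities (triangular).
    p1 = (1e-3, 2e-3, 4e-3)
    p2 = (5e-4, 1e-3, 2e-3)

    for alpha in (0.0, 0.5, 1.0):
        cut = interval_mul(alpha_cut(p1, alpha), alpha_cut(p2, alpha))
        print(f"alpha={alpha:.1f}: AND-gate interval = ({cut[0]:.2e}, {cut[1]:.2e})")
    ```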

  15. Experience with performance based training of nuclear criticality safety engineers

    International Nuclear Information System (INIS)

    Taylor, R.G.

    1993-01-01

    Historically, new entrants to the practice of nuclear criticality safety have learned their job primarily by on-the-job training (OJT), often by association with an experienced nuclear criticality safety engineer who probably also learned the job by OJT. Typically, the new entrant learned what he/she needed to know to solve a particular problem and accumulated experience as more problems were solved. It is likely that more formalism will be required in the future. Current US Department of Energy requirements for those positions which have to demonstrate qualification indicate that this should be achieved by using a systematic approach such as performance based training (PBT). Assuming that PBT would be an acceptable mechanism for nuclear criticality safety engineer training in a more formal environment, a site-specific analysis of the nuclear criticality safety engineer job was performed. Based on this analysis, classes are being developed and delivered to a target audience of newer nuclear criticality safety engineers. Because current interest is in developing training for selected aspects of the nuclear criticality safety engineer job, the analysis is incompletely developed in some areas. Details of this analysis are provided in this report

  16. Optimization of cooling tower performance analysis using Taguchi method

    Directory of Open Access Journals (Sweden)

    Ramkumar Ramakrishnan

    2013-01-01

    Full Text Available This study discusses the application of the Taguchi method in assessing maximum cooling tower effectiveness for a counter-flow cooling tower using expanded wire mesh packing. The experiments were planned based on Taguchi's L27 orthogonal array. The trials were performed under different inlet conditions of water flow rate, air flow rate and water temperature. Signal-to-noise ratio (S/N) analysis, analysis of variance (ANOVA) and regression were carried out in order to determine the effects of the process parameters on cooling tower effectiveness and to identify optimal factor settings. Finally, confirmation tests verified the reliability of the Taguchi method for optimization of counter-flow cooling tower performance with sufficient accuracy.
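
    For readers unfamiliar with the S/N computation used in Taguchi analysis, the following sketch evaluates the larger-the-better signal-to-noise ratio for a few hypothetical L27 runs; the effectiveness values are invented and the ANOVA step is omitted.

```python
# Illustrative sketch: larger-the-better signal-to-noise ratio, as used in
# Taguchi analysis when the response (cooling tower effectiveness) should be maximized.
# The trial results below are made up for demonstration.
import numpy as np

def sn_larger_is_better(y):
    """S/N = -10 * log10( mean(1 / y^2) ) for replicated responses y."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(1.0 / y**2))

trials = {
    "L27 run 1": [0.61, 0.63, 0.60],   # effectiveness, three repetitions (assumed)
    "L27 run 2": [0.70, 0.68, 0.71],
    "L27 run 3": [0.55, 0.57, 0.56],
}
for run, y in trials.items():
    print(f"{run}: S/N = {sn_larger_is_better(y):.2f} dB")
```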

  17. Diagnostic Performance of Mammographic Texture Analysis in the Differential Diagnosis of Benign and Malignant Breast Tumors.

    Science.gov (United States)

    Li, Zhiming; Yu, Lan; Wang, Xin; Yu, Haiyang; Gao, Yuanxiang; Ren, Yande; Wang, Gang; Zhou, Xiaoming

    2017-11-09

    The purpose of this study was to investigate the diagnostic performance of mammographic texture analysis in the differential diagnosis of benign and malignant breast tumors. Digital mammography images were obtained from the Picture Archiving and Communication System at our institute. Texture features of mammographic images were calculated. The Mann-Whitney U test was used to identify differences between the benign and malignant groups. Receiver operating characteristic (ROC) curve analysis was used to assess the diagnostic performance of the texture features. Significant differences in texture features of the histogram, gray-level co-occurrence matrix (GLCM) and run length matrix (RLM) were found between the benign and malignant breast groups (P < .05). The AUROCs of imaging-based diagnosis, texture analysis, and imaging-based diagnosis combined with texture analysis were 0.873, 0.863, and 0.961, respectively. When imaging-based diagnosis was combined with texture analysis, the AUROC was higher than that of imaging-based diagnosis or texture analysis alone (P < .05). Mammographic texture analysis can thus help in the differential diagnosis of benign and malignant breast tumors. Furthermore, the combination of imaging-based diagnosis and texture analysis can significantly improve diagnostic performance. Copyright © 2017 Elsevier Inc. All rights reserved.
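
    A minimal sketch of GLCM-style texture feature extraction and AUC scoring of the kind described above is given below; it assumes scikit-image (version 0.19 or later, where the functions are named graycomatrix/graycoprops) and scikit-learn, and it uses synthetic patches rather than mammograms.

```python
# Hedged sketch: computing a few GLCM texture features from grayscale patches
# and scoring one feature with ROC AUC, loosely mirroring the workflow above.
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)

def glcm_contrast(patch):
    """GLCM contrast for a 0-255 grayscale patch (distance 1, angle 0)."""
    glcm = graycomatrix(patch, distances=[1], angles=[0], levels=256,
                        symmetric=True, normed=True)
    return graycoprops(glcm, "contrast")[0, 0]

# Synthetic "benign" (smooth) and "malignant" (heterogeneous) patches
benign = [rng.integers(100, 120, size=(32, 32), dtype=np.uint8) for _ in range(20)]
malignant = [rng.integers(0, 256, size=(32, 32), dtype=np.uint8) for _ in range(20)]

labels = [0] * len(benign) + [1] * len(malignant)
contrast = [glcm_contrast(p) for p in benign + malignant]
print("AUC of GLCM contrast on synthetic patches:", roc_auc_score(labels, contrast))
```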

  18. Optimal depth-based regional frequency analysis

    Directory of Open Access Journals (Sweden)

    H. Wazneh

    2013-06-01

    Full Text Available Classical methods of regional frequency analysis (RFA) of hydrological variables face two drawbacks: (1) the restriction to a particular region, which can lead to a loss of some information, and (2) the definition of a region that generates a border effect. To reduce the impact of these drawbacks on regional modeling performance, an iterative method was proposed recently, based on the statistical notion of the depth function and a weight function φ. This depth-based RFA (DBRFA) approach was shown to be superior to traditional approaches in terms of flexibility, generality and performance. The main difficulty of the DBRFA approach is the optimal choice of the weight function φ (e.g., the φ minimizing estimation errors). In order to avoid a subjective choice and naïve selection procedures of φ, the aim of the present paper is to propose an algorithm-based procedure to optimize the DBRFA and automate the choice of φ according to objective performance criteria. This procedure is applied to estimate flood quantiles in three different regions in North America. One of the findings from the application is that the optimal weight function depends on the considered region and can also quantify the region's homogeneity. By comparing the DBRFA to the canonical correlation analysis (CCA) method, results show that the DBRFA approach leads to better performance both in terms of relative bias and mean square error.

  19. Induction Machine with Improved Operating Performances for Electric Trucks. A FEM-Based Analysis

    Directory of Open Access Journals (Sweden)

    MUNTEANU, A.

    2010-05-01

    Full Text Available The paper presents a study concerning the performance developed by induction motors intended for the motorization of heavy electric vehicles such as trucks. Taking into consideration the imposed restrictions, the main geometrical parameters which come from the classical design algorithms are presented in a comparative manner. Special attention is dedicated to the winding design, since it has to ensure two synchronous speeds corresponding to 16 and 8 poles, respectively. Moreover, the influence of the rotor slot shape on the improvement of the start-up is analyzed. Finally, a FEM-based study (an approach based on the finite element method) is performed to highlight specific torque and slip values such as the rated, start-up and pull-out ones.

  20. Development of Human Performance Analysis and Advanced HRA Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Won Dea; Park, Jin Kyun; Kim, Jae Whan; Kim, Seong Whan; Kim, Man Cheol; Ha, Je Joo

    2007-06-15

    The purpose of this project is to build a systematic framework that can evaluate the effect of human factors related problems on the safety of nuclear power plants (NPPs) as well as to develop a technology that can be used to enhance human performance. The research goal of this project is twofold: (1) the development of a human performance database and a framework to enhance human performance, and (2) the analysis of human error with the construction of a technical basis for human reliability analysis. There are three kinds of main results of this study. The first result is the development of a human performance database, called OPERA-I/II (Operator Performance and Reliability Analysis, Part I and Part II). In addition, a standard communication protocol was developed based on OPERA to reduce human error caused by communication errors in the phase of event diagnosis. The task complexity (TACOM) measure and the methodology of optimizing diagnosis procedures were also finalized during this research phase. The second main result is the development of a software tool, K-HRA, which supports the standard HRA method. Finally, an advanced HRA method named AGAPE-ET was developed by combining the MDTA (misdiagnosis tree analysis) technique and K-HRA, which can be used to analyze EOC (errors of commission) and EOO (errors of omission). These research results, such as OPERA-I/II, TACOM, a standard communication protocol, and the K-HRA and AGAPE-ET methods, will be used to improve the quality of HRA and to enhance human performance in nuclear power plants.

  1. Development of Human Performance Analysis and Advanced HRA Methodology

    International Nuclear Information System (INIS)

    Jung, Won Dea; Park, Jin Kyun; Kim, Jae Whan; Kim, Seong Whan; Kim, Man Cheol; Ha, Je Joo

    2007-06-01

    The purpose of this project is to build a systematic framework that can evaluate the effect of human factors related problems on the safety of nuclear power plants (NPPs) as well as to develop a technology that can be used to enhance human performance. The research goal of this project is twofold: (1) the development of a human performance database and a framework to enhance human performance, and (2) the analysis of human error with the construction of a technical basis for human reliability analysis. There are three kinds of main results of this study. The first result is the development of a human performance database, called OPERA-I/II (Operator Performance and Reliability Analysis, Part I and Part II). In addition, a standard communication protocol was developed based on OPERA to reduce human error caused by communication errors in the phase of event diagnosis. The task complexity (TACOM) measure and the methodology of optimizing diagnosis procedures were also finalized during this research phase. The second main result is the development of a software tool, K-HRA, which supports the standard HRA method. Finally, an advanced HRA method named AGAPE-ET was developed by combining the MDTA (misdiagnosis tree analysis) technique and K-HRA, which can be used to analyze EOC (errors of commission) and EOO (errors of omission). These research results, such as OPERA-I/II, TACOM, a standard communication protocol, and the K-HRA and AGAPE-ET methods, will be used to improve the quality of HRA and to enhance human performance in nuclear power plants.

  2. Performance analysis of automated evaluation of Crithidia luciliae-based indirect immunofluorescence tests in a routine setting - strengths and weaknesses.

    Science.gov (United States)

    Hormann, Wymke; Hahn, Melanie; Gerlach, Stefan; Hochstrate, Nicola; Affeldt, Kai; Giesen, Joyce; Fechner, Kai; Damoiseaux, Jan G M C

    2017-11-27

    Antibodies directed against dsDNA are a highly specific diagnostic marker for the presence of systemic lupus erythematosus and of particular importance in its diagnosis. To assess anti-dsDNA antibodies, the Crithidia luciliae-based indirect immunofluorescence test (CLIFT) is one of the assays considered to be the best choice. To overcome the drawback of subjective result interpretation that is inherent to indirect immunofluorescence assays in general, automated systems have been introduced into the market in recent years. Among these systems is the EUROPattern Suite, an advanced automated fluorescence microscope equipped with different software packages, capable of automated pattern interpretation and result suggestion for ANA, ANCA and CLIFT analysis. We analyzed the performance of the EUROPattern Suite with its automated fluorescence interpretation for CLIFT in a routine setting, reflecting the everyday life of a diagnostic laboratory. Three hundred and twelve consecutive samples were collected, sent to the Central Diagnostic Laboratory of the Maastricht University Medical Centre with a request for anti-dsDNA analysis over a period of 7 months. Agreement between EUROPattern assay analysis and the visual read was 93.3%. Sensitivity and specificity were 94.1% and 93.2%, respectively. The EUROPattern Suite performed reliably and greatly supported result interpretation. Automated image acquisition is readily performed and automated image classification gives a reliable recommendation for assay evaluation to the operator. The EUROPattern Suite optimizes workflow and contributes to standardization between different operators or laboratories.

  3. A proposal for performing software safety hazard analysis

    International Nuclear Information System (INIS)

    Lawrence, J.D.; Gallagher, J.M.

    1997-01-01

    Techniques for analyzing the safety and reliability of analog-based electronic protection systems that serve to mitigate hazards in process control systems have been developed over many years, and are reasonably understood. An example is the protection system in a nuclear power plant. The extension of these techniques to systems which include digital computers is not well developed, and there is little consensus among software engineering experts and safety experts on how to analyze such systems. One possible technique is to extend hazard analysis to include digital computer-based systems. Software is frequently overlooked during system hazard analyses, but this is unacceptable when the software is in control of a potentially hazardous operation. In such cases, hazard analysis should be extended to fully cover the software. A method for performing software hazard analysis is proposed in this paper. The method concentrates on finding hazards during the early stages of the software life cycle, using an extension of HAZOP

  4. Performance-based regulation. Panel Discussion

    International Nuclear Information System (INIS)

    Youngblood, Robert; Bier, Vicki M.; Bukowski, Richard W.; Prasad Kadambi, N.; Koonce, James F.

    2001-01-01

    Full text of publication follows: Performance-based regulation is a part of the NRC's Strategic Plan and is realizing steady progress in conceptual development for actual applications. For example, high-level, conceptual guidelines have been proposed that would apply to reactors, materials, and waste areas. Performance-based approaches are also being applied in other regulated industries such as FAA and OSHA. The discussion will include comments from speakers from different parts of the nuclear industry and other industries regarding benefits and weaknesses of performance-based regulation. (authors)

  5. Optimization of cooling tower performance analysis using Taguchi method

    OpenAIRE

    Ramkumar Ramakrishnan; Ragupathy Arumugam

    2013-01-01

    This study discusses the application of the Taguchi method in assessing maximum cooling tower effectiveness for a counter-flow cooling tower using expanded wire mesh packing. The experiments were planned based on Taguchi's L27 orthogonal array. The trials were performed under different inlet conditions of water flow rate, air flow rate and water temperature. Signal-to-noise ratio (S/N) analysis, analysis of variance (ANOVA) and regression were carried out in order to determine the effects of process...

  6. SCALEA-G: A Unified Monitoring and Performance Analysis System for the Grid

    Directory of Open Access Journals (Sweden)

    Hong-Linh Truong

    2004-01-01

    Full Text Available This paper describes SCALEA-G, a unified monitoring and performance analysis system for the Grid. SCALEA-G is implemented as a set of grid services based on the Open Grid Services Architecture (OGSA). SCALEA-G provides an infrastructure for conducting online monitoring and performance analysis of a variety of Grid services including computational and network resources, and Grid applications. Both push and pull models are supported, providing flexible and scalable monitoring and performance analysis. Source code and dynamic instrumentation are implemented to perform profiling and monitoring of Grid applications. A novel instrumentation request language for dynamic instrumentation and a standardized intermediate representation for binary code have been developed to facilitate the interaction between client and instrumentation services.

  7. Techno-economic analysis of a 2.1 kW rooftop photovoltaic-grid-tied system based on actual performance

    International Nuclear Information System (INIS)

    Adaramola, Muyiwa S.

    2015-01-01

    Highlights: • The economic analysis of a rooftop PV grid-tied installation is examined. • Based on actual performance, the LCOE of the system is estimated as US$0.246/kW h. • A feed-in tariff of US$0.356/kW h is estimated with no financial support. • To encourage installation of PV systems, financial support of up to 40% of the investment is suggested. - Abstract: As more attention is being focused on the development of renewable energy resources globally, technical and economic assessments of these resources are crucial to ascertain their viability. These assessments can be more meaningful if they are based on the field and actual performance of the renewable energy conversion systems. This study presents the economic analysis of a rooftop 2.07 kW grid-connected photovoltaic energy system installation located in Ås (latitude 59.65°N, longitude 10.76°E, about 105 m above sea level), Norway. Both the annual and monthly costs of energy produced by the system are determined. In addition, the feed-in tariff that would give an internal rate of return of about 7.5% on the investment in this installation was examined. Based on the assumptions used in this study, a feed-in tariff of US$0.356/kW h is estimated for a project with an economic life of 25 years and no other financial support. This translates to a US$0.110/kW h premium over the levelized cost of energy of US$0.246/kW h generated by the system. However, if the financial support is more than 45% of the initial investment cost, no further premium fee is necessary to support this type of system
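
    The levelized cost of energy quoted above is, in general, a ratio of discounted lifetime cost to discounted lifetime energy; the sketch below shows that calculation with illustrative inputs only (the study's actual cash-flow assumptions are not reproduced here).

```python
# Illustrative sketch of a levelized cost of energy (LCOE) calculation of the kind
# used in this type of techno-economic analysis. All numbers below are assumptions
# for demonstration, not the values from the study.
def lcoe(capex, annual_om, annual_energy_kwh, discount_rate, lifetime_years):
    """LCOE = discounted lifetime cost / discounted lifetime energy."""
    disc_cost = capex
    disc_energy = 0.0
    for year in range(1, lifetime_years + 1):
        factor = (1.0 + discount_rate) ** year
        disc_cost += annual_om / factor
        disc_energy += annual_energy_kwh / factor
    return disc_cost / disc_energy

print(f"LCOE = {lcoe(capex=8000.0, annual_om=80.0, annual_energy_kwh=1800.0, discount_rate=0.075, lifetime_years=25):.3f} USD/kWh")
```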

  8. Reprint of “Performance analysis of a model-sized superconducting DC transmission system based VSC-HVDC transmission technologies using RTDS”

    International Nuclear Information System (INIS)

    Dinh, Minh-Chau; Ju, Chang-Hyeon; Kim, Sung-Kyu; Kim, Jin-Geun; Park, Minwon; Yu, In-Keun

    2013-01-01

    Highlights: ► A model-sized superconducting VSC-HVDC system was designed and fabricated. ► A real-time simulation using Real Time Digital Simulator has been performed. ► The AC loss characteristics of HTS DC power cable caused by harmonics were analyzed. ► The AC loss of the HTS DC power cable will be used as a parameter to design the cable cooling system. -- Abstract: The combination of a high temperature superconducting DC power cable and a voltage source converter based HVDC (VSC-HVDC) creates a new option for transmitting power with multiple collection and distribution points for long distance and bulk power transmissions. It offers some greater advantages compared with HVAC or conventional HVDC transmission systems, and it is well suited for the grid integration of renewable energy sources in existing distribution or transmission systems. For this reason, a superconducting DC transmission system based HVDC transmission technologies is planned to be set up in the Jeju power system, Korea. Before applying this system to a real power system on Jeju Island, system analysis should be performed through a real time test. In this paper, a model-sized superconducting VSC-HVDC system, which consists of a small model-sized VSC-HVDC connected to a 2 m YBCO HTS DC model cable, is implemented. The authors have performed the real-time simulation method that incorporates the model-sized superconducting VSC-HVDC system into the simulated Jeju power system using Real Time Digital Simulator (RTDS). The performance analysis of the superconducting VSC-HVDC systems has been verified by the proposed test platform and the results were discussed in detail

  9. Reprint of “Performance analysis of a model-sized superconducting DC transmission system based VSC-HVDC transmission technologies using RTDS”

    Energy Technology Data Exchange (ETDEWEB)

    Dinh, Minh-Chau, E-mail: thanchau7787@gmail.com [Changwon National University, 9 Sarim-Dong, Changwon 641-733 (Korea, Republic of); Ju, Chang-Hyeon; Kim, Sung-Kyu; Kim, Jin-Geun; Park, Minwon [Changwon National University, 9 Sarim-Dong, Changwon 641-733 (Korea, Republic of); Yu, In-Keun, E-mail: yuik@changwon.ac.kr [Changwon National University, 9 Sarim-Dong, Changwon 641-733 (Korea, Republic of)

    2013-01-15

    Highlights: ► A model-sized superconducting VSC-HVDC system was designed and fabricated. ► A real-time simulation using Real Time Digital Simulator has been performed. ► The AC loss characteristics of HTS DC power cable caused by harmonics were analyzed. ► The AC loss of the HTS DC power cable will be used as a parameter to design the cable cooling system. -- Abstract: The combination of a high temperature superconducting DC power cable and a voltage source converter based HVDC (VSC-HVDC) creates a new option for transmitting power with multiple collection and distribution points for long distance and bulk power transmissions. It offers some greater advantages compared with HVAC or conventional HVDC transmission systems, and it is well suited for the grid integration of renewable energy sources in existing distribution or transmission systems. For this reason, a superconducting DC transmission system based HVDC transmission technologies is planned to be set up in the Jeju power system, Korea. Before applying this system to a real power system on Jeju Island, system analysis should be performed through a real time test. In this paper, a model-sized superconducting VSC-HVDC system, which consists of a small model-sized VSC-HVDC connected to a 2 m YBCO HTS DC model cable, is implemented. The authors have performed the real-time simulation method that incorporates the model-sized superconducting VSC-HVDC system into the simulated Jeju power system using Real Time Digital Simulator (RTDS). The performance analysis of the superconducting VSC-HVDC systems has been verified by the proposed test platform and the results were discussed in detail.

  10. Analysis of Human Error Types and Performance Shaping Factors in the Next Generation Main Control Room

    International Nuclear Information System (INIS)

    Sin, Y. C.; Jung, Y. S.; Kim, K. H.; Kim, J. H.

    2008-04-01

    The main control rooms of nuclear power plants have been computerized and digitalized in new and modernized plants, as information and digital technologies make great progress and become mature. A survey on human factors engineering issues in advanced MCRs was carried out using a model-based approach and a literature-survey-based approach. The analysis of human error types and performance shaping factors comprises the analysis of three human errors. The results of the project can be used for task analysis, evaluation of human error probabilities, and analysis of performance shaping factors in the HRA analysis

  11. Performance Analysis of ARQ-Based RF-FSO Links

    KAUST Repository

    Makki, Behrooz

    2017-02-22

    We study the performance of hybrid radio-frequency (RF) and free-space optical (FSO) links using automatic repeat request (ARQ). We derive closed-form expressions for the throughput and outage probability with different channel models. We also evaluate the effect of adaptive power allocation between the ARQ retransmissions on the system performance. The results show that joint implementation of the RF and FSO links leads to substantial performance improvement, compared to the cases with only the RF or the FSO link.

  12. Network performance analysis

    CERN Document Server

    Bonald, Thomas

    2013-01-01

    The book presents some key mathematical tools for the performance analysis of communication networks and computer systems. Communication networks and computer systems have become extremely complex. The statistical resource sharing induced by the random behavior of users and the underlying protocols and algorithms may affect Quality of Service. This book introduces the main results of queuing theory that are useful for analyzing the performance of these systems. These mathematical tools are key to the development of robust dimensioning rules and engineering methods. A number of examples i

  13. Performance analysis: a study using data envelopment analysis in 26 Brazilian hospitals.

    Science.gov (United States)

    Guerra, Mariana; de Souza, Antônio Artur; Moreira, Douglas Rafael

    2012-01-01

    This article describes a proposal for analyzing the performance of public Brazilian hospitals using financial and non-financial (i.e., operational) rates, and thereby highlights the effectiveness (or otherwise) of the financial management of the organizations in this study. A total of 72 hospitals in the Brazilian Unified Health Care System (in Portuguese, Sistema Único de Saúde, SUS) were selected for the accessibility and completeness of their data. Twenty-six organizations were used for the study sample, consisting of entities that had publicly disclosed financial statements for the period from 2008 (in particular, via the Internet) and whose operational data could be found in the SUS database. Our proposal, based on models using the method of Data Envelopment Analysis (DEA), was the construction of six initial models that were later compiled into a standard model. The relations between the rates that comprised the models were based on the variables and the notes of: Schuhmann, McCue and Nayar, Barnum and Kutzin, Younis, Younies, and Okojie, Marinho, Moreno, and Cavalini, and Ersoy, Kavuncubasi, Ozcan, and Harris II. We put forward an enhanced grant proposal applicable to Brazil aiming to (i) confirm or refute the rates that show the effectiveness or ineffectiveness of the financial management of national hospitals; and (ii) determine the best performances, which could be used as a reference for future studies. Results obtained: (i) of all the financial indicators considered, only one showed no significance in all models; and (ii) for operational indicators, the results were not relevant when the number of occupied beds was considered. Though the analysis was related only to services provided by SUS, we conclude that our study has great potential for analyzing the financial management performance of Brazilian hospitals in general, for the following reasons: (i) it shows the relationship of financial and operational rates that can be used to analyze the performance of

  14. Performance analysis of a threshold-based parallel multiple beam selection scheme for WDM-based systems for Gamma-Gamma distributions

    KAUST Repository

    Nam, Sung Sik; Yoon, Chang Seok; Alouini, Mohamed-Slim

    2017-01-01

    In this paper, we statistically analyze the performance of a threshold-based parallel multiple beam selection scheme (TPMBS) for Free-space optical (FSO) based system with wavelength division multiplexing (WDM) in cases where a pointing error has

  15. Performance Analysis of ARQ-Based RF-FSO Links

    KAUST Repository

    Makki, Behrooz; Svensson, Tommy; Eriksson, Thomas; Alouini, Mohamed-Slim

    2017-01-01

    evaluate the effect of adaptive power allocation between the ARQ retransmissions on the system performance. The results show that joint implementation of the RF and FSO links leads to substantial performance improvement, compared to the cases with only

  16. Discriminant analysis in Polish manufacturing sector performance assessment

    Directory of Open Access Journals (Sweden)

    Józef Dziechciarz

    2004-01-01

    Full Text Available This is a presentation of the preliminary results of a larger project on the determination of the attractiveness of manufacturing branches. Results of the performance assessment of Polish manufacturing branches in 2000 (section D „Manufacturing”, based on NACE, the Nomenclature des Activités de la Communauté Européenne) are shown. In the research, the classical (Fisher's) linear discriminant analysis technique was used for the analysis of the profit generation ability of the firms belonging to a certain production branch. For estimation, the data describing the group level were used; for cross-validation, the class data.
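
    As a generic illustration of the Fisher linear discriminant analysis workflow (not the study's data or features), the sketch below fits an LDA classifier to two synthetic groups described by two invented financial ratios, using scikit-learn.

```python
# Hedged sketch of a Fisher linear discriminant analysis of the kind described above:
# classifying units (here, synthetic "branches") into profitable vs. unprofitable
# groups from a few financial ratios. Data and features are invented.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(2)
n = 60
# two assumed ratios per branch: return on sales, asset turnover
profitable = rng.normal(loc=[0.08, 1.2], scale=[0.02, 0.3], size=(n, 2))
unprofitable = rng.normal(loc=[-0.02, 0.8], scale=[0.02, 0.3], size=(n, 2))

X = np.vstack([profitable, unprofitable])
y = np.array([1] * n + [0] * n)

lda = LinearDiscriminantAnalysis()
lda.fit(X, y)
print("in-sample accuracy:", lda.score(X, y))   # cross-validation would be used in practice
print("discriminant coefficients:", lda.coef_)
```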

  17. Performance Analysis of Virtual MIMO Relaying Schemes Based on Detect–Split–Forward

    KAUST Repository

    Al-Basit, Suhaib M.

    2014-10-29

    © 2014, Springer Science+Business Media New York. Virtual multi-input multi-output (vMIMO) schemes in wireless communication systems improve coverage, throughput, capacity, and quality of service. In this paper, we propose three uplink vMIMO relaying schemes based on detect–split–forward (DSF). In addition, we investigate the effect of several physical parameters such as distance, modulation type and number of relays. Furthermore, an adaptive vMIMO DSF scheme based on VBLAST and STBC is proposed. To this end, we provide analytical tools to evaluate the performance of the proposed vMIMO relaying schemes.

  18. Performance Analysis of Virtual MIMO Relaying Schemes Based on Detect–Split–Forward

    KAUST Repository

    Al-Basit, Suhaib M.; Al-Ghadhban, Samir; Zummo, Salam A.

    2014-01-01

    © 2014, Springer Science+Business Media New York. Virtual multi-input multi-output (vMIMO) schemes in wireless communication systems improve coverage, throughput, capacity, and quality of service. In this paper, we propose three uplink vMIMO relaying schemes based on detect–split–forward (DSF). In addition, we investigate the effect of several physical parameters such as distance, modulation type and number of relays. Furthermore, an adaptive vMIMO DSF scheme based on VBLAST and STBC is proposed. To this end, we provide analytical tools to evaluate the performance of the proposed vMIMO relaying schemes.

  19. Risk Management and Simulation Based Live Fire Test and Evaluation in the Performance Based Defense Business Environment

    National Research Council Canada - National Science Library

    Brown, R

    1999-01-01

    The objective of this analysis is to reduce the policy and management process costs of Congressionally mandated Live Fire Test and Evaluation procedures in the new Performance Based Defense Acquisition environment...

  20. A trigeneration system based on polymer electrolyte fuel cell and desiccant wheel – Part B: Overall system design and energy performance analysis

    International Nuclear Information System (INIS)

    Intini, M.; De Antonellis, S.; Joppolo, C.M.; Casalegno, A.

    2015-01-01

    Highlights: • Seasonal simulation of a trigeneration system for building air-conditioning. • Effects of technical constraints on trigeneration system power consumption. • Optimal PEMFC unit size for maximizing trigeneration primary energy savings. - Abstract: This paper is the second part of a larger work focusing on a trigeneration system integrating a low temperature polymer electrolyte fuel cell (PEMFC) and a desiccant wheel-based air handling unit. Low temperature PEMFC systems have a significant potential in combined heating, cooling and power applications. However, the cogenerated heat temperature is relatively low (up to 65–70 °C), resulting in low efficiency of the cooling process, and the fuel processor is far from being flexible, hindering the operation of the system at low load conditions. Therefore a trigeneration system based on a PEMFC should be carefully designed through accurate simulation tools. In the current paper a detailed analysis of the energy performance of the trigenerative system is provided, taking into account constraints of real applications, such as PEMFC part load behavior, desiccant wheel effectiveness, heat storage losses and air handling unit electrical consumption. The methodology adopted to model the system components is described in detail. Energy simulations are performed on a yearly basis with variable building air conditioning loads and climate conditions, in order to investigate the optimal trigenerative unit size. A sensitivity analysis on crucial design parameters is provided. It is shown that constraints of actual applications have relevant effects on system energy consumption, which is significantly far from the values expected based on a simplified analysis. Primary energy savings can be positive in winter time if the ratio of PEMFC heating capacity to air conditioning peak heating load is close to 0.15. Instead, on a yearly basis, primary energy savings cannot be achieved with present component performance. Positive savings

  1. Diagnostic performance of semi-quantitative and quantitative stress CMR perfusion analysis: a meta-analysis.

    Science.gov (United States)

    van Dijk, R; van Assen, M; Vliegenthart, R; de Bock, G H; van der Harst, P; Oudkerk, M

    2017-11-27

    Stress cardiovascular magnetic resonance (CMR) perfusion imaging is a promising modality for the evaluation of coronary artery disease (CAD) due to high spatial resolution and absence of radiation. Semi-quantitative and quantitative analysis of CMR perfusion are based on signal-intensity curves produced during the first pass of gadolinium contrast. Multiple semi-quantitative and quantitative parameters have been introduced. Diagnostic performance of these parameters varies extensively among studies and standardized protocols are lacking. This study aims to determine the diagnostic accuracy of semi-quantitative and quantitative CMR perfusion parameters, compared to multiple reference standards. Pubmed, WebOfScience, and Embase were systematically searched using predefined criteria (3272 articles). A check for duplicates was performed (1967 articles). Eligibility and relevance of the articles were determined by two reviewers using pre-defined criteria. The primary data extraction was performed independently by two researchers with the use of a predefined template. Differences in extracted data were resolved by discussion between the two researchers. The quality of the included studies was assessed using the 'Quality Assessment of Diagnostic Accuracy Studies Tool' (QUADAS-2). True positives, false positives, true negatives, and false negatives were extracted or calculated from the articles. The principal summary measures used to assess diagnostic accuracy were sensitivity, specificity, and area under the receiver operating characteristic curve (AUC). Data were pooled according to analysis territory, reference standard and perfusion parameter. Twenty-two articles were eligible based on the predefined study eligibility criteria. The pooled diagnostic accuracy for segment-, territory- and patient-based analyses showed good diagnostic performance with sensitivity of 0.88, 0.82, and 0.83, specificity of 0.72, 0.83, and 0.76 and AUC of 0.90, 0.84, and 0.87, respectively. In per territory
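
    The per-study sensitivity and specificity calculations described above reduce to simple ratios of the extracted 2x2 counts; the sketch below shows that arithmetic with made-up counts and a naive sum-based pooling (a formal meta-analysis would instead use a bivariate random-effects model).

```python
# Minimal sketch: per-study sensitivity/specificity from extracted 2x2 counts and a
# simple pooled estimate obtained by summing counts. The counts are illustrative.
studies = [
    # (true positives, false positives, true negatives, false negatives)
    (45, 8, 60, 5),
    (30, 10, 50, 7),
    (80, 12, 95, 9),
]

def sens_spec(tp, fp, tn, fn):
    return tp / (tp + fn), tn / (tn + fp)

for i, counts in enumerate(studies, start=1):
    se, sp = sens_spec(*counts)
    print(f"study {i}: sensitivity={se:.2f}, specificity={sp:.2f}")

pooled = tuple(sum(col) for col in zip(*studies))
se, sp = sens_spec(*pooled)
print(f"naively pooled: sensitivity={se:.2f}, specificity={sp:.2f}")
```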

  2. Systematic review on sports performance in beach volleyball from match analysis

    Directory of Open Access Journals (Sweden)

    Alexandre Igor Araripe Medeiros

    2014-10-01

    Full Text Available DOI: http://dx.doi.org/10.5007/1980-0037.2014v16n6p698   The present article aimed to perform a systematic review of the available literature on the analysis of sports performance in beach volleyball through match analysis. The Web of Science, SportDiscus®, PubMed, Scopus and Academic Search Complete databases were used to identify peer-reviewed published articles. The authors conducted a content analysis according to the goals, variables of analysis and methods used in the studies. In general, three research lines were determined: analysis of the functional dependence of the game actions and their relation with success, performance according to gender, and the effect of changing the rules on game performance. In relation to methodology, an evolution from descriptive studies to studies of a comparative nature can be seen and, more recently, there has been a focus on prediction. This new trend breaks with research based on simple cause and effect relations, and focuses on the analysis of game events, namely tactical-technical performance indicators, in a non-linear and interactive way, considering the game as a complex and dynamic system. The limitations of the studies analyzed show the need for further studies to investigate the identification of game patterns for the different game levels, and the integration of situational variables (such as match status and the quality of opposition) in the study of team performance.

  3. Funding Ohio Community Colleges: An Analysis of the Performance Funding Model

    Science.gov (United States)

    Krueger, Cynthia A.

    2013-01-01

    This study examined Ohio's community college performance funding model that is based on seven student success metrics. A percentage of the regular state subsidy is withheld from institutions; funding is earned back based on the three-year average of success points achieved in comparison to other community colleges in the state. Analysis of…

  4. Financial analysis and forecasting of the results of small businesses performance based on regression model

    Directory of Open Access Journals (Sweden)

    Svetlana O. Musienko

    2017-03-01

    Full Text Available Objective: to develop an economic-mathematical model of the dependence of revenue on other balance sheet items, taking into account the sectoral affiliation of the companies. Methods: using comparative analysis, the article studies the existing approaches to the construction of company management models. Applying regression analysis and the least squares method, which is widely used for the financial management of enterprises in Russia and abroad, the author builds a model of the dependence of revenue on other balance sheet items, taking into account the sectoral affiliation of the companies, which can be used in the financial analysis and prediction of small enterprises' performance. Results: the article states the need to identify factors affecting the efficiency of financial management. The author analyzed scientific research and revealed the lack of comprehensive studies on the methodology for assessing the management of small enterprises, while the methods used for large companies are not always suitable for the task. The systematized approaches of various authors to the formation of regression models describe the influence of certain factors on company activity. It is revealed that the resulting indicators in the studies were revenue, profit, or the company's relative profitability. The main drawback of most models is the mathematical, not economic, approach to the definition of the dependent and independent variables. Basing on the analysis, it was determined that the most correct is the model of dependence between revenue and total assets of the company using the decimal logarithm. The model was built using data on the activities of 507 small businesses operating in three spheres of economic activity. Using the presented model, it was proved that there is a direct dependence between the sales proceeds and the main items of the asset balance, as well as differences in the degree of this effect depending on the economic activity of small
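
    A minimal sketch of the kind of log-log least-squares regression discussed above, relating log10(revenue) to log10(total assets), is shown below with synthetic data; the exponent and noise level are assumptions, not the study's estimates.

```python
# Hedged sketch: least-squares fit of log10(revenue) on log10(total assets),
# illustrating the decimal-logarithm dependence described above with synthetic data.
import numpy as np

rng = np.random.default_rng(3)
assets = rng.uniform(1e5, 5e6, size=100)                        # total assets (assumed)
revenue = 2.0 * assets**0.9 * rng.lognormal(0.0, 0.15, 100)     # assumed relation + noise

slope, intercept = np.polyfit(np.log10(assets), np.log10(revenue), deg=1)
print(f"log10(revenue) ~ {intercept:.2f} + {slope:.2f} * log10(total assets)")

log_pred = intercept + slope * np.log10(assets)
resid = np.log10(revenue) - log_pred
r2 = 1 - np.sum(resid**2) / np.sum((np.log10(revenue) - np.mean(np.log10(revenue)))**2)
print(f"R^2 on the log scale: {r2:.3f}")
```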

  5. 3D Massive MIMO Systems: Modeling and Performance Analysis

    KAUST Repository

    Nadeem, Qurrat-Ul-Ain

    2015-07-30

    Multiple-input-multiple-output (MIMO) systems of current LTE releases are capable of adaptation in the azimuth only. Recently, the trend is to enhance system performance by exploiting the channel's degrees of freedom in the elevation, which necessitates the characterization of 3D channels. We present an information-theoretic channel model for MIMO systems that supports the elevation dimension. The model is based on the principle of maximum entropy, which enables us to determine the distribution of the channel matrix consistent with the prior information on the angles. Based on this model, we provide an analytical expression for the cumulative distribution function (CDF) of the mutual information (MI) for systems with a single receive and a finite number of transmit antennas in the general signal-to-interference-plus-noise ratio (SINR) regime. The result is extended to systems with finite receive antennas in the low SINR regime. A Gaussian approximation to the asymptotic behavior of the MI distribution is derived for the regime of a large number of transmit antennas and paths. We corroborate our analysis with simulations that study the performance gains realizable through meticulous selection of the transmit antenna downtilt angles, confirming the potential of elevation beamforming to enhance system performance. The results are directly applicable to the analysis of 5G 3D massive MIMO systems.
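
    While the paper derives analytical expressions for the MI distribution, the empirical CDF can also be estimated by plain Monte Carlo simulation for a single-receive-antenna link, as in the hedged sketch below; the antenna count, SNR, and equal-power beam are illustrative assumptions, not the paper's setup.

```python
# Hedged Monte Carlo sketch: empirical CDF of the mutual information
# I = log2(1 + SNR * |h w|^2) for a single-receive-antenna link with n_tx transmit
# antennas and an equal-power beamforming vector. Parameters are illustrative only.
import numpy as np

rng = np.random.default_rng(4)
n_tx, snr, trials = 8, 1.0, 100_000

# i.i.d. complex Gaussian channel realizations, unit average power per antenna
h = (rng.normal(size=(trials, n_tx)) + 1j * rng.normal(size=(trials, n_tx))) / np.sqrt(2)
w = np.ones(n_tx) / np.sqrt(n_tx)               # equal power over transmit antennas
mi = np.log2(1.0 + snr * np.abs(h @ w) ** 2)

for threshold in (0.5, 1.0, 2.0):
    print(f"P(MI <= {threshold} bit/s/Hz) = {np.mean(mi <= threshold):.3f}")
```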

  6. Performance analysis of GSM networks in Minna

    African Journals Online (AJOL)

    eobe

    in terms of key performance indicators (KPI) based on statistics performance indicators ... in this study. Keywords: GSM Network, Drive Test, KPI and Radio Frequency Network Optimization.

  7. Effect of Manager’s Role in Performance Based Pay on Employee Outcomes

    Directory of Open Access Journals (Sweden)

    Azman, I

    2014-12-01

    Full Text Available According to the recent literature on Islamic-based organizational compensation, performance based pay consists of two essential features: communication and performance appraisal. Recent studies in this field highlight that the ability of managers to appropriately communicate pay information and appraise employee performance may have a significant impact on employee outcomes, especially job satisfaction and organizational commitment. Therefore, this study was undertaken to assess the relationship between the manager's role in performance based pay and employee outcomes, using self-administered questionnaires collected from employees at a district council in Peninsular Malaysia. The outcomes of the SmartPLS path model analysis showed that pay communication does not act as an important determinant of job satisfaction, but performance appraisal does act as an important determinant of job satisfaction. Conversely, pay communication and performance appraisal both act as important determinants of organizational commitment. In addition, this study provides discussion, implications and conclusion

  8. Analysis of Economic Performance in Mergers and Acquisition

    Institute of Scientific and Technical Information of China (English)

    王立杰; 孙涛

    2003-01-01

    Based on the methods of financial analysis, the direct earnings in mergers and acquisitions (M&A), the profit or loss from stock price fluctuation, and the influence on earnings per share (EPS) and revenue growth after M&A were analyzed in detail, and several quantitative models were established in the relevant parts accordingly. This can be useful for improving the presently low efficiency of M&A performance in the Chinese capital market.
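
    One of the quantities mentioned above, the influence of a deal on earnings per share, can be illustrated with a simple accretion/dilution check; all figures in the sketch below are assumptions for demonstration, not data from the paper.

```python
# Illustrative sketch of a post-merger EPS accretion/dilution check, one of the
# quantities discussed above. All figures are assumptions (in millions / per share).
acquirer = {"net_income": 500.0, "shares": 200.0, "price": 25.0}
target = {"net_income": 120.0}
deal_value = 1500.0                 # all-stock deal, assumed

new_shares = deal_value / acquirer["price"]                       # shares issued to pay for the target
standalone_eps = acquirer["net_income"] / acquirer["shares"]
combined_eps = (acquirer["net_income"] + target["net_income"]) / (acquirer["shares"] + new_shares)

print(f"standalone EPS = {standalone_eps:.2f}, post-deal EPS = {combined_eps:.2f}")
print("accretive" if combined_eps > standalone_eps else "dilutive")
```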

  9. State of art in FE-based fuel performance codes

    International Nuclear Information System (INIS)

    Kim, Hyo Chan; Yang, Yong Sik; Kim, Dae Ho; Bang, Je Geon; Kim, Sun Ki; Koo, Yang Hyun

    2013-01-01

    The finite element (FE) method, a reliable and proven solution in the mechanical field, has been introduced into fuel performance codes for multidimensional analysis. The present state of the art in FE-based numerical simulation of fuel performance predominantly involves 2-D axisymmetric models and 3-D volumetric models. The FRAPCON and FRAPTRAN codes contain 1.5-D and 2-D FE models to simulate PCMI and cladding ballooning. In 2-D simulation, the FALCON code, developed by EPRI, is a 2-D (R-Z and R-θ) fully thermal-mechanically coupled steady-state and transient FE-based fuel behavior code. The French codes TOUTATIS and ALCYONE are 3-D and are typically used to investigate localized behavior. In 2008, the Idaho National Laboratory (INL) began developing a multidimensional (2-D and 3-D) nuclear fuel performance code called BISON. In this paper, the current state of FE-based fuel performance codes and their models is presented. Based on an investigation of the codes, requirements and directions of development for a new FE-based fuel performance code can be discussed, and by comparing the models in FE-based fuel performance codes, the state of the art in the codes can be assessed. A new FE-based fuel performance code should include the typical pellet and cladding models which all the codes contain. In particular, specialized pellet and cladding models such as gaseous swelling and high burnup structure (HBS) models should be developed to improve the accuracy of the code as well as to consider AC conditions. To reduce computation cost, an approximated gap model and an optimized contact model should also be developed. Nuclear fuel operates in an extreme environment that induces complex multiphysics phenomena, occurring over distances ranging from inter-atomic spacing to meters, and time scales ranging from microseconds to years. This multiphysics behavior is often tightly coupled, a well known example being the thermomechanical behavior. Adding to this complexity, important aspects of fuel behavior are inherently

  10. Relative performance of academic departments using DEA with sensitivity analysis.

    Science.gov (United States)

    Tyagi, Preeti; Yadav, Shiv Prasad; Singh, S P

    2009-05-01

    The process of liberalization and globalization of the Indian economy has brought new opportunities and challenges in all areas of human endeavor, including education. Educational institutions have to adopt new strategies to make the best use of the opportunities and counter the challenges. One of these challenges is how to assess the performance of academic programs based on multiple criteria. Keeping this in view, this paper attempts to evaluate the performance efficiencies of 19 academic departments of IIT Roorkee (India) through the data envelopment analysis (DEA) technique. The technique has been used to assess the performance of academic institutions in a number of countries such as the USA, UK and Australia, but to the best of our knowledge it is applied here for the first time in the Indian context. Applying DEA models, we calculate technical, pure technical and scale efficiencies and identify the reference sets for inefficient departments. Input and output projections are also suggested for inefficient departments to reach the frontier. Overall performance, research performance and teaching performance are assessed separately using sensitivity analysis.
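
    For readers who want to reproduce the kind of technical-efficiency scores mentioned above, the sketch below solves the input-oriented CCR DEA envelopment problem as a linear program with SciPy; the departments' input/output data are invented, and this is only a minimal illustration rather than the authors' full model.

```python
# Hedged sketch: input-oriented CCR DEA model (envelopment form) solved as a
# linear program with SciPy. The input/output data below are invented.
import numpy as np
from scipy.optimize import linprog

# rows = decision-making units (e.g., departments); columns = inputs / outputs (assumed)
X = np.array([[20.0, 300.0], [30.0, 500.0], [25.0, 400.0], [40.0, 600.0]])  # inputs
Y = np.array([[60.0, 10.0], [90.0, 12.0], [70.0, 15.0], [80.0, 11.0]])      # outputs

def ccr_efficiency(k):
    n, m = X.shape
    _, s = Y.shape
    # decision variables: [theta, lambda_1 .. lambda_n]; minimize theta
    c = np.zeros(1 + n)
    c[0] = 1.0
    A_ub = np.zeros((m + s, 1 + n))
    b_ub = np.zeros(m + s)
    A_ub[:m, 0] = -X[k]          # sum_j lambda_j * x_ij <= theta * x_ik
    A_ub[:m, 1:] = X.T
    A_ub[m:, 1:] = -Y.T          # sum_j lambda_j * y_rj >= y_rk
    b_ub[m:] = -Y[k]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (1 + n), method="highs")
    return res.x[0]

for k in range(len(X)):
    print(f"DMU {k}: technical efficiency = {ccr_efficiency(k):.3f}")
```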

  11. Notes on human performance analysis

    International Nuclear Information System (INIS)

    Hollnagel, E.; Pedersen, O.M.; Rasmussen, J.

    1981-06-01

    This paper contains a framework for the integration of observation and analysis of human performance in nuclear environments - real or simulated. It identifies four main sources of data, and describes the characteristic data types and methods of analysis for each source in relation to a common conceptual background. The general conclusion is that it is highly useful to combine the knowledge and experience from different contexts into a coherent picture of how nuclear operators perform under varying circumstances. (author)

  12. Quantitative analysis of the security performance in wireless LANs

    Directory of Open Access Journals (Sweden)

    Poonam Jindal

    2017-07-01

    Full Text Available A comprehensive experimental study to analyze the security performance of a WLAN based on IEEE 802.11 b/g/n standards in various network scenarios is presented in this paper. By setting up an experimental testbed we have measured results for a layered security model in terms of throughput, response time, encryption overheads, frame loss and jitter. Through numerical results obtained from the testbed, we have presented quantitative as well as realistic findings for both security mechanisms and network performance. It establishes the fact that there is always a tradeoff between security strength and the associated network performance. It is observed that the non-roaming network always performs better than the roaming network under all network scenarios. To analyze the benefits offered by a particular security protocol, a relative security strength index model is demonstrated. Further, we have presented the statistical analysis of our experimental data. We found that different security protocols have different robustness against mobility. By choosing a robust security protocol, network performance can be improved. The presented analysis is significant and useful with reference to the assessment of the suitability of security protocols for given real-time applications.

  13. Error performance analysis in downlink cellular networks with interference management

    KAUST Repository

    Afify, Laila H.

    2015-05-01

    Modeling aggregate network interference in cellular networks has recently gained immense attention both in academia and industry. While stochastic geometry based models have succeeded in accounting for the cellular network geometry, they mostly abstract many important wireless communication system aspects (e.g., modulation techniques, signal recovery techniques). Recently, a novel stochastic geometry model, based on the Equivalent-in-Distribution (EiD) approach, succeeded in capturing the aforementioned communication system aspects and extending the analysis to averaged error performance, however at the expense of increased modeling complexity. Inspired by the EiD approach, the analysis developed in [1] takes into consideration the key system parameters, while providing a simple tractable analysis. In this paper, we extend this framework to study the effect of different interference management techniques in the downlink of cellular networks. The accuracy of the proposed analysis is verified via Monte Carlo simulations.

  14. Cost/Performance Ratio Achieved by Using a Commodity-Based Cluster

    Science.gov (United States)

    Lopez, Isaac

    2001-01-01

    Researchers at the NASA Glenn Research Center acquired a commodity cluster based on Intel Corporation processors to compare its performance with a traditional UNIX cluster in the execution of aeropropulsion applications. Since the cost differential of the clusters was significant, a cost/performance ratio was calculated. After executing a propulsion application on both clusters, the researchers demonstrated a 9.4 cost/performance ratio in favor of the Intel-based cluster. These researchers utilize the Aeroshark cluster as one of the primary testbeds for developing NPSS parallel application codes and system software. The Aeroshark cluster provides 64 Intel Pentium II 400-MHz processors, housed in 32 nodes. Recently, APNASA, a code developed by a Government/industry team for the design and analysis of turbomachinery systems, was used for a simulation on Glenn's Aeroshark cluster.

  15. Performance of laser ablation. Quadrupole-based ICP-MS coupling for the analysis of single micrometric uranium particles

    International Nuclear Information System (INIS)

    Fabien Pointurier; Amelie Hubert; Anne-Claire Pottin

    2013-01-01

    In this paper we describe the application of laser ablation-inductively coupled plasma-mass spectrometry (LA-ICP-MS) coupling to particle analysis, i.e., the determination of the isotopic composition of micrometric uranium particles. The performance of this technique is compared with that of the two reference particle analysis techniques, secondary ion mass spectrometry (SIMS) and fission track-thermo-ionization mass spectrometry (FT-TIMS), based on the measurement of the 235U/238U isotopic ratios of particles present in an inter-comparison particulate sample. The agreement of the results obtained using LA-ICP-MS with the target values and with the results obtained using FT-TIMS and SIMS was good. Accuracy was equivalent to that of the other two techniques (±3 % deviation). However, the relative experimental uncertainties obtained with LA-ICP-MS (7 %) were higher than those obtained with FT-TIMS (4.5 %) and SIMS (3 %). Furthermore, the measurement yield of the LA-ICP-MS coupling was close to that obtained with the same quadrupole ICP-MS for the measurement of a liquid sample (∼10⁻⁴), but lower than that obtained with FT-TIMS and SIMS by factors of 10 and 20, respectively, although the particles analyzed using LA-ICP-MS were most likely smaller (diameter ∼0.6 μm, containing 4–7 fg of 235U). Nevertheless, thanks to the brevity of the signals obtained, the detection capacity for low isotopic concentrations by LA-ICP-MS coupling is equivalent to that of FT-TIMS, although it remains well below that of SIMS (×15). However, with a more sensitive double-focusing ICP-MS, performance equivalent to that achieved using SIMS could be obtained. (author)

  16. An Evidence-Based Videotaped Running Biomechanics Analysis.

    Science.gov (United States)

    Souza, Richard B

    2016-02-01

    Running biomechanics play an important role in the development of injuries. Performing a running biomechanics analysis on injured runners can help to develop treatment strategies. This article provides a framework for a systematic video-based running biomechanics analysis plan based on the current evidence on running injuries, using 2-dimensional (2D) video and readily available tools. Fourteen measurements are proposed in this analysis plan from lateral and posterior video. Identifying simple 2D surrogates for 3D biomechanical variables of interest allows for widespread translation of best practices, and offers the best opportunity to impact the highly prevalent problem of the injured runner. Copyright © 2016 Elsevier Inc. All rights reserved.

  17. Metrological analysis of a virtual flowmeter-based transducer for cryogenic helium

    Energy Technology Data Exchange (ETDEWEB)

    Arpaia, P., E-mail: pasquale.arpaia@unina.it [Department of Electrical Engineering and Information Technology, University of Napoli Federico II, Naples (Italy); Technology Department, European Organization for Nuclear Research (CERN), Geneva (Switzerland); Girone, M., E-mail: mario.girone@cern.ch [Technology Department, European Organization for Nuclear Research (CERN), Geneva (Switzerland); Department of Engineering, University of Sannio, Benevento (Italy); Liccardo, A., E-mail: annalisa.liccardo@unina.it [Department of Electrical Engineering and Information Technology, University of Napoli Federico II, Naples (Italy); Pezzetti, M., E-mail: marco.pezzetti@cern.ch [Technology Department, European Organization for Nuclear Research (CERN), Geneva (Switzerland); Piccinelli, F., E-mail: fabio.piccinelli@cern.ch [Department of Mechanical Engineering, University of Brescia, Brescia (Italy)

    2015-12-15

    The metrological performance of a virtual flowmeter-based transducer for monitoring helium under cryogenic conditions is assessed. To this aim, an uncertainty model of the transducer is presented, based mainly on a valve model exploiting a finite-element approach and on a virtual flowmeter model based on the Sereg-Schlumberger method. The models are validated experimentally on a case study for helium monitoring in cryogenic systems at the European Organization for Nuclear Research (CERN). The impact of uncertainty sources on the transducer's metrological performance is assessed by a sensitivity analysis, based on statistical experiment design and analysis of variance. In this way, the uncertainty sources most influencing the metrological performance of the transducer are singled out over the input range as a whole, at varying operating and setting conditions. This analysis turns out to be important for CERN cryogenics operation because the metrological design of the transducer is validated, and its components and working conditions with critical specifications for future improvements are identified.

  18. Contribution analysis as an evaluation strategy in the context of a sector-wide approach: Performance-based health financing in Rwanda

    Directory of Open Access Journals (Sweden)

    Martin Noltze

    2014-12-01

    Full Text Available Sector-wide approaches (SWAps emerged as a response to donor fragmentation and non-adjusted and parallel programming. In the health sector, SWAps have received considerable support by the international donor community due to their potential to reduce inefficiencies through alignment to common procedures and hence to increase development effectiveness. Evaluating development cooperation in the context of a SWAp, however, translates into methodological challenges for evaluators who have to disentangle the cumulative effects in strongly donor-aligned, complex sector environments. In this article the authors discussed the application of a methodological strategy for evaluating development interventions in complex settings – for example in the context of a SWAp –and reflected the suitability of the approach. The authors conducted a contribution analysis, a theory-based approach to evaluation, and exemplified the approach for an intervention of performance-based financing for Rwandan health workers supported by the Rwanda-German cooperation. The findings suggested that the Rwandan system of performance based financing increased service orientation and outputs of health professionals, but also indicated that negative motivational side effects and resource constraints are real. With regard to the methodological approach, the authors conclude that contribution analysis has a high potential to evaluate development cooperation in the context of a SWAp dueto its high flexibility to use different data collection tools and its capability to assess risks and rival explanations. Challenges can be identified with regard to the efficiency of the evaluation strategy and a remaining trade-off between scope and causal strength ofevidence.

  19. Performance analysis of a threshold-based parallel multiple beam selection scheme for WDM-based systems for Gamma-Gamma distributions

    KAUST Repository

    Nam, Sung Sik

    2017-03-02

    In this paper, we statistically analyze the performance of a threshold-based parallel multiple beam selection scheme (TPMBS) for a free-space optical (FSO) based system with wavelength division multiplexing (WDM), taking pointing errors into account for practical consideration, over independent identically distributed (i.i.d.) Gamma-Gamma fading conditions. Specifically, we analyze the operating characteristics under the conventional heterodyne detection (HD) scheme for both the adaptive modulation (AM) case and the non-AM case (i.e., coherent/non-coherent binary modulation). Then, based on the statistically derived results, we evaluate the outage probability (CDF) of a selected beam, the average spectral efficiency (ASE), the average number of selected beams (ANSB), and the average bit error rate (BER). Selected results show that we can obtain higher spectral efficiency and simultaneously reduce the potential increase in implementation complexity caused by applying the beam selection scheme, without a considerable performance loss.
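
    The sketch below is a minimal Monte Carlo illustration of threshold-based selection of parallel beams over i.i.d. Gamma-Gamma fading; the fading parameters, threshold, and number of beams are arbitrary assumptions, and the simulation ignores pointing errors and the detection details analyzed in the paper.

```python
# Minimal Monte Carlo sketch: threshold-based selection of parallel beams over
# i.i.d. Gamma-Gamma fading. All parameter values are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
alpha, beta = 4.0, 2.0        # Gamma-Gamma fading parameters (assumed)
n_beams, n_trials = 8, 200_000
gamma_th = 0.5                # selection/outage threshold on normalized SNR (assumed)

# A unit-mean Gamma-Gamma variate is the product of two independent Gamma variates.
irradiance = (rng.gamma(alpha, 1.0 / alpha, (n_trials, n_beams))
              * rng.gamma(beta, 1.0 / beta, (n_trials, n_beams)))
snr = irradiance  # normalized SNR taken proportional to irradiance (coarse simplification)

selected = snr >= gamma_th                      # beams whose SNR exceeds the threshold
avg_selected_beams = selected.sum(axis=1).mean()
outage_prob = np.mean(~selected.any(axis=1))    # no beam satisfies the threshold

print(f"average number of selected beams: {avg_selected_beams:.2f}")
print(f"outage probability (all beams below threshold): {outage_prob:.4f}")
```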

  20. Performance and safety analysis of WP-cave concept

    International Nuclear Information System (INIS)

    Skagius, K.; Svemar, C.

    1989-08-01

    The report presents a performance, safety, and cost analysis of the WP-cave, WPC, concept. In the performance analysis, questions specific to the WPC that have been identified as requiring more detailed study are addressed. Based on the outcome of this analysis, a safety analysis has been made which comprises the modeling and calculation of radionuclide transport from the repository to the biosphere and the resulting dose exposure to man. The result of the safety analysis indicates that the present design of a WPC repository may give unacceptably high doses. By improving the properties of the bentonite/sand barrier such that the hydraulic conductivity is reduced, or by changing the short-lived steel canisters to more long-lived canisters, e.g. copper canisters, it is judged possible to achieve a sufficiently low level of dose exposure rates to man. The cost for a WPC repository of the studied design is significantly higher than for a KBS-3 repository under Swedish conditions and for the Swedish amount of spent fuel. The major costs are connected to the excavation and backfilling of the bentonite/sand barrier. The potential for cost savings is high, but it is not judged possible to account for savings in such a way that the WPC concept shows lower cost than the KBS-3 concept. (34 figs., 33 tabs., 29 refs.)

  1. Cost and performance analysis of physical security systems

    International Nuclear Information System (INIS)

    Hicks, M.J.; Yates, D.; Jago, W.H.; Phillips, A.W.

    1998-04-01

    Analysis of cost and performance of physical security systems can be a complex, multi-dimensional problem. There are a number of point tools that address various aspects of cost and performance analysis. Increased interest in cost tradeoffs of physical security alternatives has motivated development of an architecture called Cost and Performance Analysis (CPA), which takes a top-down approach to aligning cost and performance metrics. CPA incorporates results generated by existing physical security system performance analysis tools, and utilizes an existing cost analysis tool. The objective of this architecture is to offer comprehensive visualization of complex data to security analysts and decision-makers

  2. Nitride fuels irradiation performance data base

    International Nuclear Information System (INIS)

    Brozak, D.E.; Thomas, J.K.; Peddicord, K.L.

    1987-01-01

    An irradiation performance data base for nitride fuels has been developed from an extensive literature search and review that emphasized uranium nitride, but also included performance data for mixed nitrides [(U,Pu)N] and carbonitrides [(U,Pu)C,N] to increase the quantity and depth of pin data available. This work represents a very extensive effort to systematically collect and organize irradiation data for nitride-based fuels. The data base has many potential applications. First, it can facilitate parametric studies of nitride-based fuels to be performed using a wide range of pin designs and operating conditions. This should aid in the identification of important parameters and design requirements for multimegawatt and SP-100 fuel systems. Secondly, the data base can be used to evaluate fuel performance models. For detailed studies, it can serve as a guide to selecting a small group of pin specimens for extensive characterization. Finally, the data base will serve as an easily accessible and expandable source of irradiation performance information for nitride fuels

  3. Comparison of Supermarket Retail Service Quality Using Competitive Zone of Tolerance Based and Importance-Performance Analysis

    Directory of Open Access Journals (Sweden)

    Arfan Bakhtiar

    2017-07-01

    Full Text Available The growth of the retail sector has had an impact on the Indonesian economy. Carrefour, an international company, competes with a local company, Hypermart. To let both companies know their competitive position, benchmarking was done between them using the CZIPA (Competitive Zone of Tolerance based Importance-Performance Analysis) method. The CZIPA method is used to determine the improvement priorities of each supermarket. The dimensions used are the retail service dimensions of the RSQS (Retail Service Quality Scale). The goal is to identify the superior service quality indicators of Carrefour and Hypermart and to prioritize the services to be improved in both supermarkets using the RSQS dimensions. Data were collected through a questionnaire administered to 133 respondents selected by purposive sampling. The research finds that the main problem facing Carrefour is a store layout that does not help consumers find the desired product. At Hypermart, the main problem is that the products in the catalog are not always available.
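
    As a simple illustration of how an importance-performance analysis (IPA) matrix is formed, the sketch below places a few hypothetical service attributes into the four classic IPA quadrants using mean importance and performance scores; the attribute names and scores are invented and do not come from the Carrefour/Hypermart data.

```python
# Minimal IPA sketch: classify attributes into the four importance-performance
# quadrants. Attribute names and scores below are hypothetical examples.
import numpy as np

attributes = ["layout", "product availability", "staff courtesy", "checkout speed"]
importance  = np.array([4.6, 4.8, 4.1, 3.9])   # mean importance ratings (1-5)
performance = np.array([3.2, 3.5, 4.4, 4.2])   # mean performance ratings (1-5)

imp_cut, perf_cut = importance.mean(), performance.mean()  # grand-mean crosshairs

def quadrant(imp, perf):
    if imp >= imp_cut and perf < perf_cut:
        return "Concentrate here (high importance, low performance)"
    if imp >= imp_cut and perf >= perf_cut:
        return "Keep up the good work"
    if imp < imp_cut and perf < perf_cut:
        return "Low priority"
    return "Possible overkill"

for name, imp, perf in zip(attributes, importance, performance):
    print(f"{name:22s} -> {quadrant(imp, perf)}")
```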

  4. Risk-based decision analysis for groundwater operable units

    International Nuclear Information System (INIS)

    Chiaramonte, G.R.

    1995-01-01

    This document proposes a streamlined approach and methodology for performing risk assessment in support of interim remedial measure (IRM) decisions involving the remediation of contaminated groundwater on the Hanford Site. This methodology, referred to as ''risk-based decision analysis,'' also supports the specification of target cleanup volumes and provides a basis for design and operation of the groundwater remedies. The risk-based decision analysis can be completed within a short time frame and concisely documented. The risk-based decision analysis is more versatile than the qualitative risk assessment (QRA), because it not only supports the need for IRMs, but also provides criteria for defining the success of the IRMs and provides the risk-basis for decisions on final remedies. For these reasons, it is proposed that, for groundwater operable units, the risk-based decision analysis should replace the more elaborate, costly, and time-consuming QRA

  5. TACO: fuel pin performance analysis

    International Nuclear Information System (INIS)

    Stoudt, R.H.; Buchanan, D.T.; Buescher, B.J.; Losh, L.L.; Wilson, H.W.; Henningson, P.J.

    1977-08-01

    The thermal performance of fuel in an LWR during its operational lifetime must be described for LOCA analysis as well as for other safety analyses. The determination of stored energy in the LOCA analysis, for example, requires a conservative fuel pin thermal performance model that is capable of calculating fuel and cladding behavior, including the gap conductance between the fuel and cladding, as a function of burnup. Parameters that affect the fuel and cladding performance, such as fuel densification, fission gas release, cladding dimensional changes, fuel relocation, and thermal expansion, should be accounted for in the model. Babcock and Wilcox (B and W) has submitted a topical report, BAW-10087P, December 1975, which describes its thermal performance model TACO. A summary of the elements that comprise the TACO model and an evaluation are presented

  6. Studies on Dairy Cattle Reproduction Performances in Morocco Based on Analysis of Artificial Insemination Data

    Directory of Open Access Journals (Sweden)

    Sraïri, MT.

    2001-01-01

    Full Text Available The main objective of this study is to assess dairy cattle reproduction performance from an artificial insemination (AI) database, using inseminators' records from 1992 to 1998, in three AI circuits established in Settat province in Morocco. Simultaneously, a field survey was conducted in the same region, from January to April 1999, to determine the main structural parameters of dairy farms which influence AI. Analysis of the data set has shown an increase in the total number of AIs performed, from an average of 160 to 640 per circuit. The average conception rate was 48.1%, with a continuous increase from 44.3 to 58.6%, despite the growing number of AIs performed. Statistical analysis reveals a significant variation of the conception rate between years, in agreement with previous work on cattle reproduction performance in harsh conditions. The mean calving interval was 404.8 days. It was significantly different between circuits (P < 0.05). This result was explained by the AI history in the three circuits (date of implementation) and by their structural characteristics (number of cows and length in km). The overall improvement of AI activity (more AIs performed and a better conception rate) could be explained by greater adaptation of the inseminators to their working environment, combined with the progressive elimination of farms with poor dairy cattle reproduction management. This trend was confirmed by discriminant analysis of the field survey results, as cattle breeders with a real specialisation in milk production (more than 65% of total land devoted to forages and few sheep) have been found to be keen AI demanders, whereas farms with more interest in cereals and sheep often stop AI. These observations show that continuous evaluation of AI programmes is needed, in order to select dairy breeders who are really interested in the technique and to avoid wasting the inseminators' limited time and resources.

  7. MO-FG-202-06: Improving the Performance of Gamma Analysis QA with Radiomics-Based Image Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Wootton, L; Nyflot, M; Ford, E [University of Washington Department of Radiation Oncology, Seattle, WA (United States); Chaovalitwongse, A [University of Washington Department of Industrial and Systems Engineering, Seattle, Washington (United States); University of Washington Department of Radiology, Seattle, WA (United States); Li, N [University of Washington Department of Industrial and Systems Engineering, Seattle, Washington (United States)

    2016-06-15

    Purpose: The use of gamma analysis for IMRT quality assurance has well-known limitations. Traditionally, a simple thresholding technique is used to evaluate passing criteria. However, like any image, the gamma distribution is rich in information, most of which thresholding discards. We therefore propose a novel method of analyzing gamma images that uses quantitative image features borrowed from radiomics, with the goal of improving error detection. Methods: 368 gamma images were generated from 184 clinical IMRT beams. For each beam the dose to a phantom was measured with EPID dosimetry and compared to the TPS dose calculated with and without normally distributed (2 mm sigma) errors in MLC positions. Seventeen intensity-histogram and size-zone radiomic features were derived from each image. The features that differed most significantly between image sets were determined with ROC analysis. A linear machine-learning model was trained on these features, using 180 gamma images, to classify images as with or without errors. The model was then applied to an independent validation set of 188 additional gamma distributions, half with and half without errors. Results: The most significant features for detecting errors were histogram kurtosis (p=0.007) and three size-zone metrics (p<1e-6 for each). The size-zone metrics detected clusters of high gamma-value pixels under mispositioned MLCs. The model applied to the validation set had an AUC of 0.8, compared to 0.56 for traditional gamma analysis with the decision threshold restricted to 98% or less. Conclusion: A radiomics-based image analysis method was developed that is more effective in detecting errors than traditional gamma analysis. Though the pilot study here considers only MLC position errors, radiomics-based methods for other error types are being developed, which may provide better error detection and useful information on the source of detected errors. This work was partially supported by a grant from the Agency for
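
    The sketch below illustrates the general workflow described in this abstract — training a linear classifier on image features and scoring it with ROC AUC on a held-out set — using synthetic feature data; the feature values, the logistic-regression choice, and the sample split are assumptions, not the authors' actual radiomics pipeline.

```python
# Minimal sketch: train a linear classifier on radiomics-style features and report
# ROC AUC on a held-out set. Feature data here are synthetic stand-ins.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
n_images, n_features = 368, 4          # e.g. kurtosis + three size-zone metrics
has_error = rng.integers(0, 2, n_images)            # 1 = MLC error introduced
X = rng.normal(0, 1, (n_images, n_features))
X[has_error == 1] += 0.8               # error cases shift the feature distribution

X_train, X_test, y_train, y_test = train_test_split(
    X, has_error, test_size=0.5, random_state=0, stratify=has_error)

clf = LogisticRegression().fit(X_train, y_train)    # linear model stand-in
auc = roc_auc_score(y_test, clf.predict_proba(X_test)[:, 1])
print(f"validation ROC AUC: {auc:.2f}")
```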

  8. Simultaneous Analysis of Ursolic Acid and Oleanolic Acid in Guava Leaves Using QuEChERS-Based Extraction Followed by High-Performance Liquid Chromatography

    OpenAIRE

    Xu, Chang; Liao, Yiyi; Fang, Chunyan; Tsunoda, Makoto; Zhang, Yingxia; Song, Yanting; Deng, Shiming

    2017-01-01

    In this paper, a novel method of QuEChERS-based extraction coupled with high-performance liquid chromatography has been developed for the simultaneous determination of ursolic acid (UA) and oleanolic acid (OA) in guava leaves. The QuEChERS-based extraction parameters, including the amount of added salt, vortex-assisted extraction time, and absorbent amount, and the chromatographic conditions were investigated for the analysis of UA and OA in guava leaves. Under the optimized conditions, the m...

  9. Diagnostic performance of 18F-dihydroxyphenylalanine positron emission tomography in patients with paraganglioma: a meta-analysis

    International Nuclear Information System (INIS)

    Treglia, Giorgio; Cocciolillo, Fabrizio; Castaldi, Paola; Rufini, Vittoria; Giordano, Alessandro; De Waure, Chiara; Di Nardo, Francesco; Gualano, Maria Rosaria

    2012-01-01

    The aim of this study was to systematically review and conduct a meta-analysis of published data about the diagnostic performance of 18F-dihydroxyphenylalanine (DOPA) positron emission tomography (PET) in patients with paraganglioma (PG). A comprehensive computer literature search of studies published through 30 June 2011 regarding 18F-DOPA PET or PET/computed tomography (PET/CT) in patients with PG was performed in PubMed/MEDLINE, Embase and Scopus databases. Pooled sensitivity and specificity of 18F-DOPA PET or PET/CT in patients with PG on a per patient- and on a per lesion-based analysis were calculated. The area under the receiver-operating characteristic (ROC) curve was calculated to measure the accuracy of 18F-DOPA PET or PET/CT in patients with PG. Furthermore, a sub-analysis taking into account the different genetic mutations in PG patients was also performed. Eleven studies comprising 275 patients with suspected PG were included in this meta-analysis. The pooled sensitivity of 18F-DOPA PET and PET/CT in detecting PG was 91% [95% confidence interval (CI) 87-94%] on a per patient-based analysis and 79% (95% CI 76-81%) on a per lesion-based analysis. The pooled specificity of 18F-DOPA PET and PET/CT in detecting PG was 95% (95% CI 86-99%) on a per patient-based analysis and 95% (95% CI 84-99%) on a per lesion-based analysis. The area under the ROC curve was 0.95 on a per patient- and 0.94 on a per lesion-based analysis. Heterogeneity between the studies about sensitivity of 18F-DOPA PET or PET/CT was found. A significant increase in sensitivity of 18F-DOPA PET or PET/CT was observed when a sub-analysis excluding patients with succinate dehydrogenase subunit B (SDHB) gene mutations was performed. In patients with suspected PG 18F-DOPA PET or PET/CT demonstrated high sensitivity and specificity. 18F-DOPA PET or PET/CT are accurate methods in this setting. Nevertheless, possible sources of false-negative results should be kept in mind. Furthermore
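
    To illustrate the kind of pooled-estimate computation this meta-analysis relies on, the sketch below pools per-study sensitivities with a simple fixed-effect, inverse-variance approach on the logit scale; the per-study counts are invented, and published meta-analyses may instead use random-effects or bivariate models.

```python
# Minimal sketch: fixed-effect pooling of per-study sensitivity on the logit scale.
# The TP/FN counts below are invented for illustration, not the studies pooled here.
import numpy as np

tp = np.array([28, 15, 40, 22, 31])     # true positives per study (hypothetical)
fn = np.array([ 3,  1,  4,  2,  3])     # false negatives per study (hypothetical)

sens = tp / (tp + fn)
logit = np.log(sens / (1 - sens))
var = 1 / tp + 1 / fn                   # approx. variance of the logit sensitivity
w = 1 / var                             # inverse-variance weights

pooled_logit = np.sum(w * logit) / np.sum(w)
se = 1 / np.sqrt(np.sum(w))
lo, hi = pooled_logit - 1.96 * se, pooled_logit + 1.96 * se

expit = lambda x: 1 / (1 + np.exp(-x))  # back-transform from the logit scale
print(f"pooled sensitivity: {expit(pooled_logit):.1%} "
      f"(95% CI {expit(lo):.1%}-{expit(hi):.1%})")
```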

  10. Diagnostic performance of 18F-dihydroxyphenylalanine positron emission tomography in patients with paraganglioma: a meta-analysis.

    Science.gov (United States)

    Treglia, Giorgio; Cocciolillo, Fabrizio; de Waure, Chiara; Di Nardo, Francesco; Gualano, Maria Rosaria; Castaldi, Paola; Rufini, Vittoria; Giordano, Alessandro

    2012-07-01

    The aim of this study was to systematically review and conduct a meta-analysis of published data about the diagnostic performance of (18)F-dihydroxyphenylalanine (DOPA) positron emission tomography (PET) in patients with paraganglioma (PG). A comprehensive computer literature search of studies published through 30 June 2011 regarding (18)F-DOPA PET or PET/computed tomography (PET/CT) in patients with PG was performed in PubMed/MEDLINE, Embase and Scopus databases. Pooled sensitivity and specificity of (18)F-DOPA PET or PET/CT in patients with PG on a per patient- and on a per lesion-based analysis were calculated. The area under the receiver-operating characteristic (ROC) curve was calculated to measure the accuracy of (18)F-DOPA PET or PET/CT in patients with PG. Furthermore, a sub-analysis taking into account the different genetic mutations in PG patients was also performed. Eleven studies comprising 275 patients with suspected PG were included in this meta-analysis. The pooled sensitivity of (18)F-DOPA PET and PET/CT in detecting PG was 91% [95% confidence interval (CI) 87-94%] on a per patient-based analysis and 79% (95% CI 76-81%) on a per lesion-based analysis. The pooled specificity of (18)F-DOPA PET and PET/CT in detecting PG was 95% (95% CI 86-99%) on a per patient-based analysis and 95% (95% CI 84-99%) on a per lesion-based analysis. The area under the ROC curve was 0.95 on a per patient- and 0.94 on a per lesion-based analysis. Heterogeneity between the studies about sensitivity of (18)F-DOPA PET or PET/CT was found. A significant increase in sensitivity of (18)F-DOPA PET or PET/CT was observed when a sub-analysis excluding patients with succinate dehydrogenase subunit B (SDHB) gene mutations was performed. In patients with suspected PG (18)F-DOPA PET or PET/CT demonstrated high sensitivity and specificity. (18)F-DOPA PET or PET/CT are accurate methods in this setting. Nevertheless, possible sources of false-negative results should be kept in mind

  11. Inertial Sensor Technology for Elite Swimming Performance Analysis: A Systematic Review

    Science.gov (United States)

    Mooney, Robert; Corley, Gavin; Godfrey, Alan; Quinlan, Leo R; ÓLaighin, Gearóid

    2015-01-01

    Technical evaluation of swimming performance is an essential factor of elite athletic preparation. Novel methods of analysis, incorporating body worn inertial sensors (i.e., Microelectromechanical systems, or MEMS, accelerometers and gyroscopes), have received much attention recently from both research and commercial communities as an alternative to video-based approaches. This technology may allow for improved analysis of stroke mechanics, race performance and energy expenditure, as well as real-time feedback to the coach, potentially enabling more efficient, competitive and quantitative coaching. The aim of this paper is to provide a systematic review of the literature related to the use of inertial sensors for the technical analysis of swimming performance. This paper focuses on providing an evaluation of the accuracy of different feature detection algorithms described in the literature for the analysis of different phases of swimming, specifically starts, turns and free-swimming. The consequences associated with different sensor attachment locations are also considered for both single and multiple sensor configurations. Additional information such as this should help practitioners to select the most appropriate systems and methods for extracting the key performance related parameters that are important to them for analysing their swimmers’ performance and may serve to inform both applied and research practices. PMID:26712760

  12. Performance Analysis of a Threshold-Based Parallel Multiple Beam Selection Scheme for WDM FSO Systems

    KAUST Repository

    Nam, Sung Sik; Alouini, Mohamed-Slim; Ko, Young-Chai

    2018-01-01

    In this paper, we statistically analyze the performance of a threshold-based parallel multiple beam selection scheme for a free-space optical (FSO) based system with wavelength division multiplexing (WDM) in cases where a pointing error has occurred

  13. Grading the Metrics: Performance-Based Funding in the Florida State University System

    Science.gov (United States)

    Cornelius, Luke M.; Cavanaugh, Terence W.

    2016-01-01

    A policy analysis of Florida's 10-factor Performance-Based Funding system for state universities. The focus of the article is on the system of performance metrics developed by the state Board of Governors and their impact on institutions and their missions. The paper also discusses problems and issues with the metrics, their ongoing evolution, and…

  14. Objective evaluation of analyzer performance based on a retrospective meta-analysis of instrument validation studies: point-of-care hematology analyzers.

    Science.gov (United States)

    Cook, Andrea M; Moritz, Andreas; Freeman, Kathleen P; Bauer, Natali

    2017-06-01

    Information on quality requirements and objective evaluation of performance of veterinary point-of-care analyzers (POCAs) is scarce. The study was aimed at assessing observed total errors (TE obs ) for veterinary hematology POCAs via meta-analysis and comparing TE obs to allowable total error (TE a ) specifications based on experts' opinions. The TE obs for POCAs (impedance and laser-based) was calculated based on data from instrument validation studies published between 2006 and 2013 as follows: TE obs = 2 × CV [%] + bias [%]. The CV was taken from published studies; the bias was estimated from the regression equation at 2 different concentration levels of measurands. To fulfill quality requirements, TE obs should be below TE a . For most hematology variables, 60% of analyzers showed TE obs below TE a . For the CBC, TE obs was below TE a for most measurands (data from 3 analyzers). This meta-analysis is considered a pilot study. Experts' requirements (TE obs < TE a ) were fulfilled for most measurands except HGB (due to instrument-related bias for the ADVIA 2120) and platelet counts. Available data on the WBC differential count suggest an analytic bias, so nonstatistical quality control is recommended. © 2017 American Society for Veterinary Clinical Pathology.
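
    As a small worked example of the observed-total-error check quoted in the abstract (TE obs = 2 × CV [%] + bias [%], compared against TE a ), the sketch below computes TE obs for a few measurands; the CV, bias, and TE a values are made up for illustration and are not the meta-analysis data.

```python
# Minimal sketch of the observed total error check: TE_obs = 2*CV[%] + bias[%],
# compared against an allowable total error TE_a. All numbers are hypothetical.
measurands = {
    # name: (CV %, bias %, allowable TE_a %)
    "HCT": (1.5, 2.0, 10.0),
    "HGB": (1.0, 6.5, 7.0),
    "PLT": (4.0, 18.0, 25.0),
}

for name, (cv, bias, te_a) in measurands.items():
    te_obs = 2 * cv + abs(bias)
    verdict = "meets" if te_obs < te_a else "exceeds"
    print(f"{name}: TE_obs = {te_obs:.1f}% {verdict} TE_a = {te_a:.1f}%")
```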

  15. Performance analysis of visual tracking algorithms for motion-based user interfaces on mobile devices

    Science.gov (United States)

    Winkler, Stefan; Rangaswamy, Karthik; Tedjokusumo, Jefry; Zhou, ZhiYing

    2008-02-01

    Determining the self-motion of a camera is useful for many applications. A number of visual motion-tracking algorithms have been developed to date, each with its own advantages and restrictions. Some of them have also made their foray into the mobile world, powering augmented reality-based applications on phones with built-in cameras. In this paper, we compare the performance of three feature- or landmark-guided motion tracking algorithms, namely marker-based tracking with MXRToolkit, face tracking based on CamShift, and MonoSLAM. We analyze and compare the complexity, accuracy, sensitivity, robustness and restrictions of each of the above methods. Our performance tests are conducted over two stages: the first stage of testing uses video sequences created with simulated camera movements along the six degrees of freedom in order to compare tracking accuracy, while the second stage analyzes the robustness of the algorithms by testing for manipulative factors like image scaling and frame-skipping.

  16. Quantitative analysis of regional myocardial performance in coronary artery disease

    Science.gov (United States)

    Stewart, D. K.; Dodge, H. T.; Frimer, M.

    1975-01-01

    Findings from a group of subjects with significant coronary artery stenosis and from a group of controls are presented, determined by use of a quantitative method for the study of regional myocardial performance based on frame-by-frame analysis of biplane left ventricular angiograms. Particular emphasis was placed upon the analysis of wall motion in terms of normalized segment dimensions, and the timing and velocity of contraction. The results were compared with the method of subjective assessment used clinically.

  17. Evaluation of new control rooms by operator performance analysis

    International Nuclear Information System (INIS)

    Mori, M; Tomizawa, T.; Tai, I.; Monta, K.; Yoshimura, S.; Hattori, Y.

    1987-01-01

    An advanced supervisory and control system called PODIA TM (Plant Operation by Displayed Information and Automation) was developed by Toshiba. Since this system utilizes computer driven CRTs as a main device for information transfer to operators, thorough system integration tests were performed at the factory and evaluations were made of operators' assessment from the initial experience of the system. The PODIA system is currently installed at two BWR power plants. Based on the experiences from the development of PODIA, a more advanced man-machine interface, Advanced-PODIA (A-PODIA), is developed. A-PODIA enhances the capabilities of PODIA in automation, diagnosis, operational guidance and information display. A-PODIA has been validated by carrying out systematic experiments with a full-scope simulator developed for the validation. The results of the experiments have been analyzed by the method of operator performance analysis and applied to further improvement of the A-PODIA system. As a feedback from actual operational experience, operator performance data in simulator training is an important source of information to evaluate human factors of a control room. To facilitate analysis of operator performance, a performance evaluation system has been developed by applying AI techniques. The knowledge contained in the performance evaluation system was elicited from operator training experts and represented as rules. The rules were implemented by employing an object-oriented paradigm to facilitate knowledge management. In conclusion, it is stated that the feedback from new control room operation can be obtained at an early stage by validation tests and also continuously by comprehensive evaluation (with the help of automated tools) of operator performance in simulator training. The results of operator performance analysis can be utilized for improvement of system design as well as operator training. (author)

  18. Performance Analysis of the Consensus-Based Distributed LMS Algorithm

    Directory of Open Access Journals (Sweden)

    Gonzalo Mateos

    2009-01-01

    Full Text Available Low-cost estimation of stationary signals and reduced-complexity tracking of nonstationary processes are well-motivated tasks that can be accomplished using ad hoc wireless sensor networks (WSNs). To this end, a fully distributed least mean-square (D-LMS) algorithm is developed in this paper, in which sensors exchange messages with single-hop neighbors to consent on the network-wide estimates adaptively. The novel approach does not require a Hamiltonian cycle or a special bridge subset of sensors, while communications among sensors are allowed to be noisy. A mean-square error (MSE) performance analysis of D-LMS is conducted in the presence of a time-varying parameter vector, which adheres to a first-order autoregressive model. For sensor observations that are related to the parameter vector of interest via a linear Gaussian model, and after adopting simplifying independence assumptions, exact closed-form expressions are derived for the global and sensor-level MSE evolution as well as its steady-state (s.s.) values. Mean- and MSE-sense stability of D-LMS are also established. Interestingly, extensive numerical tests demonstrate that for small step sizes the results accurately extend to the pragmatic setting whereby sensors acquire temporally correlated, not necessarily Gaussian data.
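
    The sketch below gives a highly simplified, consensus-style distributed LMS simulation on a small ring network, just to convey the combine-and-adapt idea; the combiner weights, step size, topology, and data model are assumptions, and this is not the exact D-LMS recursion, noise model, or stability analysis of the paper.

```python
# Simplified sketch of consensus-style distributed LMS on a ring of sensors.
# Each sensor averages neighbor estimates, then takes a local LMS step.
# Step size, topology, and data model are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(3)
n_sensors, dim, n_iter, mu = 8, 4, 2000, 0.02
theta_true = rng.normal(0, 1, dim)                  # parameter to estimate

# Ring topology: each sensor exchanges with its two single-hop neighbors.
neighbors = {k: [(k - 1) % n_sensors, (k + 1) % n_sensors] for k in range(n_sensors)}
est = np.zeros((n_sensors, dim))

for t in range(n_iter):
    new_est = np.empty_like(est)
    for k in range(n_sensors):
        # Combine: average own estimate with neighbors' estimates.
        combined = np.mean(est[[k] + neighbors[k]], axis=0)
        # Adapt: local LMS step using this sensor's new observation.
        h = rng.normal(0, 1, dim)                   # local regressor
        x = h @ theta_true + rng.normal(0, 0.1)     # noisy local observation
        new_est[k] = combined + mu * (x - h @ combined) * h
    est = new_est

mse = np.mean(np.sum((est - theta_true) ** 2, axis=1))
print(f"network-average MSE after {n_iter} iterations: {mse:.4f}")
```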

  19. Environmental performance, mechanical and microstructure analysis of concrete containing oil-based drilling cuttings pyrolysis residues of shale gas.

    Science.gov (United States)

    Wang, Chao-Qiang; Lin, Xiao-Yan; He, Ming; Wang, Dan; Zhang, Si-Lan

    2017-09-15

    The overall objective of this research project is to investigate the feasibility of incorporating oil-based drilling cuttings pyrolysis residues (ODPR) and fly ash as replacements for fine aggregates and cementitious materials in concrete. Mechanical and physical properties, detailed environmental performance, and microstructure analyses were carried out. Meanwhile, the early hydration process and hydrated products of ODPR concrete were analyzed with X-ray diffraction (XRD), Fourier transform infrared spectroscopy (FT-IR), scanning electron microscopy (SEM) and energy-dispersive X-ray spectroscopy (EDX). The results indicated that ODPR could not be categorized as hazardous waste. ODPR had a specific pozzolanic character, and its use had a certain influence on the slump and compressive strength of concrete. The best workability and the optimal compressive strength were achieved with 35% ODPR. The environmental performance tests led to the conclusion that, from a technical perspective, using ODPR as recycled aggregate and admixture for the preparation of concrete caused only minor environmental contamination. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. Ca analysis: an Excel based program for the analysis of intracellular calcium transients including multiple, simultaneous regression analysis.

    Science.gov (United States)

    Greensmith, David J

    2014-01-01

    Here I present an Excel-based program for the analysis of intracellular Ca transients recorded using fluorescent indicators. The program can perform all the necessary steps to convert recorded raw voltage changes into meaningful physiological information. The program performs two fundamental processes. (1) It can prepare the raw signal by several methods. (2) It can then be used to analyze the prepared data to provide information such as absolute intracellular Ca levels. Also, the rates of change of Ca can be measured using multiple, simultaneous regression analysis. I demonstrate that this program performs as well as commercially available software, but has numerous advantages, namely creating a simplified, self-contained analysis workflow. Copyright © 2013 The Author. Published by Elsevier Ireland Ltd. All rights reserved.
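
    The sketch below illustrates two of the processing steps the abstract describes — converting a fluorescence transient to an estimate of intracellular Ca concentration and estimating a decay rate by linear regression — for a single-wavelength indicator; the Kd, Fmin/Fmax values, and the synthetic transient are assumptions, and the Excel program's actual equations and workflow may differ.

```python
# Minimal sketch: convert a single-wavelength fluorescence transient to [Ca] and
# estimate the decay rate by linear regression on the log-transformed decay phase.
# Indicator constants and the synthetic transient are hypothetical.
import numpy as np

kd, f_min, f_max = 1100.0, 50.0, 5000.0     # nM and arbitrary fluorescence units
t = np.arange(0, 1.0, 0.002)                # time in seconds
# Synthetic transient: rise at t = 0.05 s, then exponential decay back to rest.
f = f_min + 900.0 * np.exp(-np.clip(t - 0.05, 0, None) / 0.15) * (t >= 0.05) + 150.0

# Standard single-wavelength conversion: [Ca] = Kd * (F - Fmin) / (Fmax - F)
ca = kd * (f - f_min) / (f_max - f)

# Decay rate: straight-line fit to log([Ca] - baseline) over the decay phase.
baseline = ca[t < 0.05].mean()
decay = (t > 0.1) & (ca > baseline * 1.05)
slope, intercept = np.polyfit(t[decay], np.log(ca[decay] - baseline), 1)
print(f"estimated decay rate constant: {-slope:.2f} 1/s (tau = {-1/slope*1e3:.0f} ms)")
```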

  1. The effect of problem-based and lecture-based instructional strategies on learner problem solving performance, problem solving processes, and attitudes

    Science.gov (United States)

    Visser, Yusra Laila

    This study compared the effect of lecture-based instruction to that of problem-based instruction on learner performance (on near-transfer and far-transfer problems), problem solving processes (reasoning strategy usage and reasoning efficiency), and attitudes (overall motivation and learner confidence) in a Genetics course. The study also analyzed the effect of self-regulatory skills and prior-academic achievement on performance for both instructional strategies. Sixty 11th grade students at a public math and science academy were assigned to either a lecture-based instructional strategy or a problem-based instructional strategy. Both treatment groups received 18 weeks of Genetics instruction through the assigned instructional strategy. In terms of problem solving performance, results revealed that the lecture-based group performed significantly better on near-transfer post-test problems. The problem-based group performed significantly better on far-transfer post-test problems. In addition, results indicated the learners in the lecture-based instructional treatment were significantly more likely to employ data-driven reasoning in the solving of problems, whereas learners in the problem-based instructional treatment were significantly more likely to employ hypothesis-driven reasoning in problem solving. No significant differences in reasoning efficiency were uncovered between treatment groups. Preliminary analysis of the motivation data suggested that there were no significant differences in motivation between treatment groups. However, a post-research exploratory analysis suggests that overall motivation was significantly higher in the lecture-based instructional treatment than in the problem-based instructional treatment. Learner confidence was significantly higher in the lecture-based group than in the problem-based group. A significant positive correlation was detected between self-regulatory skills scores and problem solving performance scores in the problem-based

  2. Synthesis of Enterprise and Value-Based Methods for Multiattribute Risk Analysis

    International Nuclear Information System (INIS)

    Kenley, C. Robert; Collins, John W.; Beck, John M.; Heydt, Harold J.; Garcia, Chad B.

    2001-01-01

    This paper describes a method for performing multiattribute decision analysis to prioritize approaches to handling risks during the development and operation of complex socio-technical systems. The method combines risk categorization based on enterprise views, risk prioritization of the categories based on the Analytic Hierarchy Process (AHP), and more standard probability-consequence ratings schemes. We also apply value-based testing methods used in software development to prioritize risk-handling approaches. We describe a tool that synthesizes the methods and performs a multiattribute analysis of the technical and programmatic risks on the Next Generation Nuclear Plant (NGNP) enterprise.
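
    Since the abstract names the Analytic Hierarchy Process (AHP) as the prioritization step, the sketch below shows the standard AHP calculation of a priority vector and consistency ratio from a pairwise comparison matrix; the category names and comparison values are hypothetical and unrelated to the NGNP analysis.

```python
# Minimal AHP sketch: priority weights from a pairwise comparison matrix via the
# principal eigenvector, plus the consistency ratio. Comparisons are hypothetical.
import numpy as np

categories = ["technical", "programmatic", "safety"]
# A[i, j] = how much more important category i is than category j (Saaty scale).
A = np.array([[1.0, 3.0, 1/2],
              [1/3, 1.0, 1/4],
              [2.0, 4.0, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                      # principal eigenvalue
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                         # normalized priority vector

n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)             # consistency index
ri = 0.58                                        # Saaty's random index for n = 3
print({c: round(w, 3) for c, w in zip(categories, weights)})
print(f"consistency ratio: {ci / ri:.3f} (below 0.1 is usually acceptable)")
```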

  3. Fabrication and performance analysis of MEMS-based Variable Emissivity Radiator for Space Applications

    International Nuclear Information System (INIS)

    Lee, Changwook; Oh, Hyung-Ung; Kim, Taegyu

    2014-01-01

    The louver has typically been the representative thermal control device. However, it is not well suited to small satellites because of its weight and volume. A MEMS-based variable emissivity radiator was therefore developed to overcome these disadvantages of the louver and was designed for satellite thermal control, owing to its immediate response and low power consumption. Because the MEMS-based variable emissivity radiator is miniaturized by the MEMS process, it solves the problem of increased weight and volume, and it offers high reliability and an immediate response through electrical control. In this study, operational validation of the MEMS radiator was carried out, showing that the emissivity could be controlled. A numerical model was also developed to predict the thermal control performance of the MEMS-based variable emissivity radiator

  4. Some Linguistic-based and temporal analysis on Wikipedia

    International Nuclear Information System (INIS)

    Yasseri, T.

    2010-01-01

    Wikipedia, as a web-based, collaborative, multilingual encyclopaedia project, is a very suitable field for research on social dynamics and for investigating the complex concepts of conflict, collaboration, competition, dispute, etc. in a large community (∼26 million) of Wikipedia users. The other face of Wikipedia as a productive society is its output, consisting of ∼17 million articles written, without supervision, by non-professional editors in more than 270 different languages. In this talk we report some analyses performed on Wikipedia with two different approaches: temporal analysis to characterize disputes and controversies among users, and linguistic analysis to characterize linguistic features of English texts in Wikipedia. (author)

  5. An enhanced performance through agent-based secure approach for mobile ad hoc networks

    Science.gov (United States)

    Bisen, Dhananjay; Sharma, Sanjeev

    2018-01-01

    This paper proposes an agent-based secure enhanced performance approach (AB-SEP) for mobile ad hoc network. In this approach, agent nodes are selected through optimal node reliability as a factor. This factor is calculated on the basis of node performance features such as degree difference, normalised distance value, energy level, mobility and optimal hello interval of node. After selection of agent nodes, a procedure of malicious behaviour detection is performed using fuzzy-based secure architecture (FBSA). To evaluate the performance of the proposed approach, comparative analysis is done with conventional schemes using performance parameters such as packet delivery ratio, throughput, total packet forwarding, network overhead, end-to-end delay and percentage of malicious detection.

  6. LMI-based stability and performance conditions for continuous-time nonlinear systems in Takagi-Sugeno's form.

    Science.gov (United States)

    Lam, H K; Leung, Frank H F

    2007-10-01

    This correspondence presents the stability analysis and performance design of the continuous-time fuzzy-model-based control systems. The idea of the nonparallel-distributed-compensation (non-PDC) control laws is extended to the continuous-time fuzzy-model-based control systems. A nonlinear controller with non-PDC control laws is proposed to stabilize the continuous-time nonlinear systems in Takagi-Sugeno's form. To produce the stability-analysis result, a parameter-dependent Lyapunov function (PDLF) is employed. However, two difficulties are usually encountered: 1) the time-derivative terms produced by the PDLF will complicate the stability analysis and 2) the stability conditions are not in the form of linear-matrix inequalities (LMIs) that aid the design of feedback gains. To tackle the first difficulty, the time-derivative terms are represented by some weighted-sum terms in some existing approaches, which will increase the number of stability conditions significantly. In view of the second difficulty, some positive-definitive terms are added in order to cast the stability conditions into LMIs. In this correspondence, the favorable properties of the membership functions and nonlinear control laws, which allow the introduction of some free matrices, are employed to alleviate the two difficulties while retaining the favorable properties of PDLF-based approach. LMI-based stability conditions are derived to ensure the system stability. Furthermore, based on a common scalar performance index, LMI-based performance conditions are derived to guarantee the system performance. Simulation examples are given to illustrate the effectiveness of the proposed approach.
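
    To make the LMI machinery concrete, the sketch below checks a much simpler condition than the one derived in the correspondence: a common quadratic Lyapunov function for two local linear models of a Takagi-Sugeno system, posed as an LMI feasibility problem in CVXPY. The system matrices are invented, and this is not the non-PDC/PDLF design of the paper.

```python
# Minimal LMI sketch: find a common P > 0 with Ai'P + P*Ai < 0 for all local models
# of a Takagi-Sugeno system (simple quadratic stability, not the paper's PDLF/non-PDC
# conditions). The local system matrices below are hypothetical.
import cvxpy as cp
import numpy as np

A1 = np.array([[0.0, 1.0], [-2.0, -1.0]])
A2 = np.array([[0.0, 1.0], [-4.0, -0.5]])
eps = 1e-6

P = cp.Variable((2, 2), symmetric=True)
constraints = [P >> eps * np.eye(2)]
for A in (A1, A2):
    constraints.append(A.T @ P + P @ A << -eps * np.eye(2))

prob = cp.Problem(cp.Minimize(0), constraints)
prob.solve(solver=cp.SCS)
print("status:", prob.status)           # 'optimal' means a common P was found
print("P =\n", P.value)
```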

  7. Performance analysis of switch-based multiuser scheduling schemes with adaptive modulation in spectrum sharing systems

    KAUST Repository

    Qaraqe, Marwa

    2014-04-01

    This paper focuses on the development of multiuser access schemes for spectrum sharing systems whereby secondary users are allowed to share the spectrum with primary users under the condition that the interference observed at the primary receiver is below a predetermined threshold. In particular, two scheduling schemes are proposed for selecting a user among those that satisfy the interference constraint and achieve an acceptable signal-to-noise ratio level. The first scheme focuses on optimizing the average spectral efficiency by selecting the user that reports the best channel quality. In order to alleviate the relatively high feedback required by the first scheme, a second scheme based on the concept of switched diversity is proposed, where the base station (BS) scans the secondary users in a sequential manner until a user whose channel quality is above an acceptable predetermined threshold is found. We develop expressions for the statistics of the signal-to-interference and noise ratio as well as the average spectral efficiency, average feedback load, and the delay at the secondary BS. We then present numerical results for the effect of the number of users and the interference constraint on the optimal switching threshold and the system performance and show that our analysis results are in perfect agreement with the numerical results. © 2014 John Wiley & Sons, Ltd.
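
    The sketch below is a toy Monte Carlo of the second (switched-diversity) scheme described above: the base station scans secondary users in sequence and stops at the first whose channel quality exceeds a switching threshold, trading a small loss in selected SNR for a lower feedback load. The channel model, thresholds, and user count are assumptions, and the interference constraint is ignored.

```python
# Toy sketch of switched-diversity user scanning: scan users in order and pick the
# first whose SNR exceeds a threshold; fall back to the last user scanned otherwise.
# Rayleigh-type fading, thresholds, and user count are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(4)
n_users, n_slots, snr_th, mean_snr = 6, 100_000, 1.5, 2.0

snr = rng.exponential(mean_snr, (n_slots, n_users))   # i.i.d. exponential SNRs

# Index of the first user above the threshold (or the last user if none qualifies).
above = snr >= snr_th
first = np.where(above.any(axis=1), above.argmax(axis=1), n_users - 1)
picked_snr = snr[np.arange(n_slots), first]

feedback = first + 1                                   # users polled per slot
best_snr = snr.max(axis=1)                             # full-feedback benchmark
print(f"avg feedback load: {feedback.mean():.2f} of {n_users} users")
print(f"avg selected SNR: {picked_snr.mean():.2f} vs best-user {best_snr.mean():.2f}")
```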

  8. Importance-Performance Analysis of Service Attributes based on Customers Segmentation with a Data Mining Approach: a Study in the Mobile Telecommunication Market in Yazd Province

    Directory of Open Access Journals (Sweden)

    Seyed Yaghoub Hosseini

    2012-12-01

    Full Text Available In customer relationship management (CRM) systems, the importance and performance of the attributes that define a service are very important. Importance-Performance Analysis is an effective tool for prioritizing service attributes based on customer needs and expectations, and also for identifying the strengths and weaknesses of an organization in the market. In this study, with the purpose of increasing the reliability and accuracy of the results, customers are segmented based on their demographic characteristics and their perception of service attribute performance, and then individual IPA matrices are developed for each segment. A Self-Organizing Map (SOM) has been used for segmentation, and a feed-forward neural network has been used to estimate the importance of the attributes. Research findings show that mobile subscribers in Yazd province can be categorized into three segments. Individual IPA matrices have been provided for each of these segments. Based on these results, recommendations are offered to companies providing mobile phone services.

  9. Parameter Selection and Performance Analysis of Mobile Terminal Models Based on Unity3D

    Institute of Scientific and Technical Information of China (English)

    KONG Li-feng; ZHAO Hai-ying; XU Guang-mei

    2014-01-01

    The mobile platform is now widely seen as a promising multimedia service with a favorable user group and market prospects. To study the influence of mobile terminal models on the quality of scene roaming, this paper establishes a parameter-setting platform for mobile terminal models to select parameters and performance indexes on different mobile platforms. The test platform is built on the model optimality principle, analyzing the performance curves of mobile terminals in different scene models and then deducing the external parameters for model establishment. Simulation results show that the established test platform is able to analyze the parameter and performance matching list of a mobile terminal model.

  10. PERFORMANCE INDICATORS: A COMPARATIVE ANALYSIS BETWEEN PUBLIC AND PRIVATE COLLEGES IN BRAZIL

    Directory of Open Access Journals (Sweden)

    Átila de Melo Lira

    2015-06-01

    Full Text Available A comparative analysis of the use of performance indicators in public and private organizations has always been required to examine the scenario related to both. This study seeks to analyze the use of the Balanced Scorecard (BSC) to identify and understand the main differences and similarities between public and private higher education institutions (HEIs) in Brazil in relation to the use of organizational performance indicators. A quantitative and exploratory approach was adopted, using analysis of institutional documents. Data were collected from the websites of Brazilian public and private higher education organizations in order to carry out this comparative analysis. The results showed that, even though few public institutions were reviewed, their use of performance indicators appears to be more efficient than that of the private ones. Private universities should observe and improve their processes and performance indicators based on those used in Brazilian public universities. This initial research opens a horizon for other studies to be developed within this stream of thought.

  11. Three-dimensional analysis of free-electron laser performance using brightness scaled variables

    Directory of Open Access Journals (Sweden)

    M. Gullans

    2008-06-01

    Full Text Available A three-dimensional analysis of radiation generation in a free-electron laser (FEL is performed in the small signal regime. The analysis includes beam conditioning, harmonic generation, flat beams, and a new scaling of the FEL equations using the six-dimensional beam brightness. The six-dimensional beam brightness is an invariant under Liouvillian flow; therefore, any nondissipative manipulation of the phase space, performed, for example, in order to optimize FEL performance, must conserve this brightness. This scaling is more natural than the commonly used scaling with the one-dimensional growth rate. The brightness-scaled equations allow for the succinct characterization of the optimal FEL performance under various additional constraints. The analysis allows for the simple evaluation of gain enhancement schemes based on beam phase space manipulations such as emittance exchange and conditioning. An example comparing the gain in the first and third harmonics of round or flat and conditioned or unconditioned beams is presented.

  12. Performance Analysis using Coloured Petri Nets

    DEFF Research Database (Denmark)

    Wells, Lisa Marie

    Performance is often a central issue in the design, development, and configuration of systems. It is not always enough to know that systems work properly, they must also work effectively. There are numerous studies, e.g. in the areas of computer and telecommunication systems, manufacturing......, military, health care, and transportation, that have shown that time, money, and even lives can be saved if the performance of a system is improved. Performance analysis studies are conducted to evaluate existing or planned systems, to compare alternative configurations, or to find an optimal configuration...... of a system. There are three alternative techniques for analysing the performance of a system: measurement, analytical models, and simulation models. This dissertation focuses on the the use of coloured Petri nets for simulationbased performance analysis of industrial-sized systems. Coloured Petri nets...

  13. Policy design and performance of emissions trading markets: an adaptive agent-based analysis.

    Science.gov (United States)

    Bing, Zhang; Qinqin, Yu; Jun, Bi

    2010-08-01

    Emissions trading is considered to be a cost-effective environmental economic instrument for pollution control. However, the pilot emissions trading programs in China have failed to bring remarkable success in the campaign for pollution control. The policy design of an emissions trading program is found to have a decisive impact on its performance. In this study, an artificial market for sulfur dioxide (SO2) emissions trading applying the agent-based model was constructed. The performance of the Jiangsu SO2 emissions trading market under different policy design scenario was also examined. Results show that the market efficiency of emissions trading is significantly affected by policy design and existing policies. China's coal-electricity price system is the principal factor influencing the performance of the SO2 emissions trading market. Transaction costs would also reduce market efficiency. In addition, current-level emissions discharge fee/tax and banking mechanisms do not distinctly affect policy performance. Thus, applying emissions trading in emission control in China should consider policy design and interaction with other existing policies.

  14. Reliability and validity of match performance analysis in soccer : a multidimensional qualitative evaluation of opponent interaction

    OpenAIRE

    Tenga, Albin

    2010-01-01

    Dissertation (doctoral degree) – Norges idrettshøgskole, 2010. Match performance analysis is widely used as a method for studying technical, tactical and physical aspects of player and team performance in a soccer match. Therefore, ensuring the validity and reliability of the collected data is important for match performance analysis to meet its intents and purposes effectively. However, most studies on soccer match performance use unidimensional frequency data based on analyses done ...

  15. Base Station Performance Model

    OpenAIRE

    Walsh, Barbara; Farrell, Ronan

    2005-01-01

    At present the testing of power amplifiers within base station transmitters is limited to testing at component level as opposed to testing at the system level. While the detection of catastrophic failure is possible, that of performance degradation is not. This paper proposes a base station model with respect to transmitter output power with the aim of introducing system level monitoring of the power amplifier behaviour within the base station. Our model reflects the expe...

  16. PERFORMANCE ANALYSIS OF DISTINCT SECURED AUTHENTICATION PROTOCOLS USED IN THE RESOURCE CONSTRAINED PLATFORM

    Directory of Open Access Journals (Sweden)

    S. Prasanna

    2014-03-01

    Full Text Available Most of the e-commerce and m-commerce applications in the current e-business world have adopted asymmetric key cryptography techniques in their authentication protocols to provide efficient authentication of the involved parties. This paper presents a performance analysis of distinct authentication protocols which implement public key cryptography such as RSA, ECC and HECC. The comparison is made based on the key generation, signature generation and signature verification processes. The results show that the performance achieved through the HECC-based authentication protocol is better than that of the ECC- and RSA-based authentication protocols.
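
    A rough way to reproduce this kind of comparison for RSA and ECC is sketched below with the Python cryptography package, timing key generation, signing, and verification; the key sizes, curve, and iteration count are arbitrary choices, HECC is not available in this library and is omitted, and the paper's protocols and measurement setup are not reproduced.

```python
# Rough timing sketch for RSA vs. ECC key generation, signing, and verification
# using the 'cryptography' package. Key sizes, curve, and loop counts are arbitrary;
# HECC is not available in this library and is therefore omitted.
import time
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, ec, padding

msg = b"performance analysis of authentication primitives"

def bench(label, gen, sign, verify, n=20):
    t0 = time.perf_counter(); keys = [gen() for _ in range(n)]; t1 = time.perf_counter()
    sigs = [sign(k) for k in keys]; t2 = time.perf_counter()
    for k, s in zip(keys, sigs):
        verify(k, s)
    t3 = time.perf_counter()
    print(f"{label}: keygen {1e3*(t1-t0)/n:.1f} ms, "
          f"sign {1e3*(t2-t1)/n:.1f} ms, verify {1e3*(t3-t2)/n:.1f} ms")

bench("RSA-2048",
      lambda: rsa.generate_private_key(public_exponent=65537, key_size=2048),
      lambda k: k.sign(msg, padding.PKCS1v15(), hashes.SHA256()),
      lambda k, s: k.public_key().verify(s, msg, padding.PKCS1v15(), hashes.SHA256()))

bench("ECDSA P-256",
      lambda: ec.generate_private_key(ec.SECP256R1()),
      lambda k: k.sign(msg, ec.ECDSA(hashes.SHA256())),
      lambda k, s: k.public_key().verify(s, msg, ec.ECDSA(hashes.SHA256())))
```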

  17. Power and performance software analysis and optimization

    CERN Document Server

    Kukunas, Jim

    2015-01-01

    Power and Performance: Software Analysis and Optimization is a guide to solving performance problems in modern Linux systems. Power-efficient chips are no help if the software those chips run on is inefficient. Starting with the necessary architectural background as a foundation, the book demonstrates the proper usage of performance analysis tools in order to pinpoint the cause of performance problems, and includes best practices for handling common performance issues those tools identify. Provides expert perspective from a key member of Intel's optimization team on how processors and memory

  18. Performance measurement in transport sector analysis

    Directory of Open Access Journals (Sweden)

    M. Išoraitė

    2004-06-01

    Full Text Available The article analyses the following issues: 1. Performance measurement in the literature. Performance measurement has an important role to play in the efficient and effective management of organizations. Kaplan and Johnson highlighted the failure of financial measures to reflect changes in the competitive circumstances and strategies of modern organizations. Many authors have focused attention on how organizations can design more appropriate measurement systems. Based on literature, consultancy experience and action research, numerous processes have been developed that organizations can follow in order to design and implement systems. Many frameworks have been proposed that support these processes. The objective of such frameworks is to help organizations define a set of measures that reflect their objectives and assess their performance appropriately. 2. Measuring transport sector performance and its impacts. The purpose of transport measurement is to identify opportunities for enhancing transport performance. Successful transport sector management requires a system to analyze its efficiency and effectiveness as well as plan interventions if transport sector performance needs improvement. Transport impacts must be measurable and monitorable so that the person responsible for the project intervention can decide when and how to influence them. Performance indicators provide a means to measure and monitor impacts. These indicators essentially reflect quantitative and qualitative aspects of impacts at given times and places. 3. Transport sector output and input. Transport sector inputs are the resources required to deliver transport sector outputs. Transport sector inputs are typically: human resources, particularly skilled resources (including specialist consulting inputs); technology processes such as equipment and work; and finance, both public and private. 4. Transport sector policy and institutional framework; 5. Cause – effect linkages; 6

  19. Safety analysis of MOX fuels by fuel performance code

    Energy Technology Data Exchange (ETDEWEB)

    Suzuki, Motoe [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    2002-12-01

    Performance of plutonium-rich mixed oxide fuels specified for the Reduced-Moderation Water Reactor (RMWR) has been analysed by a modified fuel performance code. Thermodynamic properties of these fuels up to 120 GWd/t burnup have not been measured and were estimated using existing uranium fuel models. Fission product release, the pressure rise inside fuel rods, and the mechanical loads on fuel cans due to internal pressure have been preliminarily assessed based on an assumed axial power distribution history, and the results show the integrity of the fuel performance. A detailed evaluation of fuel-cladding interactions due to thermal expansion or swelling of fuel pellets at high burnup will be required for the safety analysis of mixed oxide fuels. The thermal conductivity and swelling of plutonium-rich mixed oxide fuels shall be taken into consideration. (T. Tanaka)

  20. Analysis of the ability of junior high school students’ performance in science in STEM project-based learning

    Science.gov (United States)

    Suryana, A.; Sinaga, P.; Suwarma, I. R.

    2018-05-01

    The challenges of the 21st century demand high competitiveness. Thinking skills, working skills, and the ability to choose instruments are among the skills needed in the 21st century. These competences can be supported by learning that involves student performance skills. A preliminary study at one junior high school in Bandung found that learning involving performance skills is rarely practised. This is supported by data from respondents on how often they receive the opportunity to devise a sketch during learning, especially learning based on practices or projects: 75% of students said rarely and 18.75% said never. Regarding student activities in project-based learning in class, 68.75% of students said these were infrequent and 6.25% said never. Therefore, this study was conducted to uncover the performance profile of junior high school students in the design process and in the performance process on the topic of optics, using STEM project-based learning. The research obtained an average class score in the design-process activities of 2.49, or 62.41%, which is in the good category, and an average class score in the performance-process activities of 3.13, or 78.28%, which is also in the good category.

  1. Diagnostic performance of {sup 18}F-dihydroxyphenylalanine positron emission tomography in patients with paraganglioma: a meta-analysis

    Energy Technology Data Exchange (ETDEWEB)

    Treglia, Giorgio; Cocciolillo, Fabrizio; Castaldi, Paola; Rufini, Vittoria; Giordano, Alessandro [Catholic University of the Sacred Heart, Institute of Nuclear Medicine, Rome (Italy); De Waure, Chiara; Di Nardo, Francesco; Gualano, Maria Rosaria [Catholic University of the Sacred Heart, Institute of Hygiene, Rome (Italy)

    2012-07-15

    The aim of this study was to systematically review and conduct a meta-analysis of published data about the diagnostic performance of {sup 18}F-dihydroxyphenylalanine (DOPA) positron emission tomography (PET) in patients with paraganglioma (PG). A comprehensive computer literature search of studies published through 30 June 2011 regarding {sup 18}F-DOPA PET or PET/computed tomography (PET/CT) in patients with PG was performed in PubMed/MEDLINE, Embase and Scopus databases. Pooled sensitivity and specificity of {sup 18}F-DOPA PET or PET/CT in patients with PG on a per patient- and on a per lesion-based analysis were calculated. The area under the receiver-operating characteristic (ROC) curve was calculated to measure the accuracy of {sup 18}F-DOPA PET or PET/CT in patients with PG. Furthermore, a sub-analysis taking into account the different genetic mutations in PG patients was also performed. Eleven studies comprising 275 patients with suspected PG were included in this meta-analysis. The pooled sensitivity of {sup 18}F-DOPA PET and PET/CT in detecting PG was 91% [95% confidence interval (CI) 87-94%] on a per patient-based analysis and 79% (95% CI 76-81%) on a per lesion-based analysis. The pooled specificity of {sup 18}F-DOPA PET and PET/CT in detecting PG was 95% (95% CI 86-99%) on a per patient-based analysis and 95% (95% CI 84-99%) on a per lesion-based analysis. The area under the ROC curve was 0.95 on a per patient- and 0.94 on a per lesion-based analysis. Heterogeneity between the studies about sensitivity of {sup 18}F-DOPA PET or PET/CT was found. A significant increase in sensitivity of {sup 18}F-DOPA PET or PET/CT was observed when a sub-analysis excluding patients with succinate dehydrogenase subunit B (SDHB) gene mutations was performed. In patients with suspected PG {sup 18}F-DOPA PET or PET/CT demonstrated high sensitivity and specificity. {sup 18}F-DOPA PET or PET/CT are accurate methods in this setting. Nevertheless, possible sources of false

  2. Mindfulness-based and acceptance-based interventions in sport and performance contexts.

    Science.gov (United States)

    Gardner, Frank L; Moore, Zella E

    2017-08-01

    Since mindfulness-based and acceptance-based practice models were first conceptualized and applied in sport in an attempt to enhance performance and overall well-being of athletes and performers, these state-of-the-art theoretical and practice models have substantially broadened our knowledge base and have been successfully incorporated into sport and performance practice domains worldwide. Evolving from a sound empirical foundation, mindfulness-based and acceptance-based models in sport psychology have accumulated a strong basic and applied empirical foundation. In the nearly 20 years since their incorporation in the context of sport, empirical findings have demonstrated efficacious outcomes associated with performance and personal well-being, as well as supported their theorized mechanisms of change. Particularly as sport and performance environments increasingly call upon practitioners to provide more comprehensive care to clientele, including a range of care from performance enhancement and maintenance, to general personal well-being, to subclinical and clinical issues, mindfulness-based and acceptance-based practitioners have the tools to offer robust, empirically informed interventions that can enhance skills and quality of life, and/or ameliorate personal struggles. Copyright © 2017 Elsevier Ltd. All rights reserved.

  3. System based practice: a concept analysis

    Directory of Open Access Journals (Sweden)

    SHAHRAM YAZDANI

    2016-04-01

    Introduction: Systems-Based Practice (SBP) is one of the six competencies introduced by the ACGME for physicians to provide high-quality care, and also the most challenging of them in performance, training, and evaluation of medical students. This concept analysis clarifies the concept of SBP by identifying its components to make it possible to differentiate it from other similar concepts. For proper training of SBP and to ensure these competencies in physicians, it is necessary to have an operational definition, and SBP’s components must be precisely defined in order to provide valid and reliable assessment tools. Methods: Walker & Avant’s approach to concept analysis was performed in eight stages: choosing a concept, determining the purpose of analysis, identifying all uses of the concept, defining attributes, identifying a model case, identifying borderline, related, and contrary cases, identifying antecedents and consequences, and defining empirical referents. Results: Based on the analysis undertaken, the attributes of SBP include knowledge of the system, balanced decisions between patients’ needs and system goals, effective role playing in the interprofessional health care team, system-level health advocacy, and acting for system improvement. Systems thinking and a functional system are antecedents, and system goals are consequences. A model case, as well as borderline, related, and contrary cases of SBP, have been introduced. Conclusion: The identification of SBP attributes in this study contributes to the body of knowledge in SBP and reduces the ambiguity of this concept, making it possible to apply it in the training of different medical specialties. It would also be possible to develop and use more precise tools to evaluate SBP competency by using the empirical referents of the analysis.

  4. Space Launch System Base Heating Test: Sub-Scale Rocket Engine/Motor Design, Development & Performance Analysis

    Science.gov (United States)

    Mehta, Manish; Seaford, Mark; Kovarik, Brian; Dufrene, Aaron; Solly, Nathan

    2014-01-01

    ATA-002 Technical Team has successfully designed, developed, tested and assessed the SLS Pathfinder propulsion systems for the Main Base Heating Test Program. Major outcomes of the Pathfinder Test Program: reached 90% of full-scale chamber pressure; achieved all engine/motor design parameter requirements; reached steady plume flow behavior in less than 35 msec; held steady chamber pressure for 60 to 100 msec during engine/motor operation; demonstrated model engine/motor performance similar to the full-scale SLS system; mitigated nozzle throat and combustor thermal erosion; test data show good agreement with numerical prediction codes. Next phase of the ATA-002 Test Program: design and development of the SLS OML for the Main Base Heating Test; tweak the BSRM design to optimize performance; tweak the CS-REM design to increase robustness. MSFC Aerosciences and CUBRC have the capability to develop sub-scale propulsion systems to meet desired performance requirements for short-duration testing.

  5. Performance Metrics Development Analysis for Information and Communications Technology Outsourcing: A Case Study

    Science.gov (United States)

    Travis, James L., III

    2014-01-01

    This study investigated how and to what extent the development and use of the OV-5a operational architecture decomposition tree (OADT) from the Department of Defense (DoD) Architecture Framework (DoDAF) affects requirements analysis with respect to complete performance metrics for performance-based services acquisition of ICT under rigid…

  6. Human performance analysis in the frame of probabilistic safety assessment of research reactors

    International Nuclear Information System (INIS)

    Farcasiu, Mita; Nitoi, Mirela; Apostol, Minodora; Turcu, I.; Florescu, Gh.

    2005-01-01

    Full text: The analysis of operating experience has identified the importance of human performance in the reliability and safety of research reactors. In Probabilistic Safety Assessment (PSA) of nuclear facilities, human performance analysis (HPA) is used in order to estimate the human error contribution to the failure of system components or functions. HPA is a qualitative and quantitative analysis of human actions identified for error-likely or accident-prone situations. Qualitative analysis is used to identify all man-machine interfaces that can lead to an accident, the types of human interactions which may mitigate or exacerbate the accident, the types of human errors, and the performance shaping factors. Quantitative analysis is used to develop estimates of human error probability reflecting the effects of human performance on reliability and safety. The goal of this paper is to accomplish an HPA within the PSA framework for research reactors. Human error probabilities estimated from the analysis of human actions can be included in system event trees and/or system fault trees. The sensitivity analyses performed determine how sensitive human performance is to systematic variations both in the level of dependency between human actions and in the operator stress level. The necessary information was obtained from the operating experience of the TRIGA research reactor at INR Pitesti. The required data were obtained from generic databases. (authors)

  7. Process-based interpretation of conceptual hydrological model performance using a multinational catchment set

    Science.gov (United States)

    Poncelet, Carine; Merz, Ralf; Merz, Bruno; Parajka, Juraj; Oudin, Ludovic; Andréassian, Vazken; Perrin, Charles

    2017-08-01

    Most previous assessments of hydrologic model performance are fragmented, based on small numbers of catchments, different methods or time periods, and do not link the results to landscape or climate characteristics. This study uses large-sample hydrology to identify major catchment controls on daily runoff simulations. It is based on a conceptual lumped hydrological model (GR6J), a collection of 29 catchment characteristics, a multinational set of 1103 catchments located in Austria, France, and Germany and four runoff model efficiency criteria. Two analyses are conducted to assess how features and criteria are linked: (i) a one-dimensional analysis based on the Kruskal-Wallis test and (ii) a multidimensional analysis based on regression trees and investigating the interplay between features. The catchment features most affecting model performance are the flashiness of precipitation and streamflow (computed as the ratio of absolute day-to-day fluctuations to the total amount in a year), the seasonality of evaporation, the catchment area, and the catchment aridity. Nonflashy, nonseasonal, large, and nonarid catchments show the best performance for all the tested criteria. We argue that this higher performance is due to fewer nonlinear responses (higher correlation between precipitation and streamflow) and lower input and output variability for such catchments. Finally, we show that, compared to national sets, multinational sets increase results transferability because they explore a wider range of hydroclimatic conditions.
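
    The flashiness measure defined in parentheses above (the ratio of summed absolute day-to-day fluctuations to the annual total, in the spirit of the Richards-Baker index) can be computed directly; the short Python sketch below uses synthetic daily values purely for illustration.

        # Sketch: flashiness = sum of |day-to-day changes| divided by the total over the period.
        def flashiness(daily_values):
            changes = sum(abs(b - a) for a, b in zip(daily_values, daily_values[1:]))
            return changes / sum(daily_values)

        rain = [0.0, 5.2, 0.0, 12.4, 3.1, 0.0, 0.0, 8.9]   # hypothetical daily precipitation (mm)
        flow = [1.1, 1.3, 1.2, 2.8, 2.0, 1.6, 1.4, 1.9]    # hypothetical daily streamflow (mm)
        print(flashiness(rain), flashiness(flow))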

  8. Optimal Sizing and Performance Evaluation of a Renewable Energy Based Microgrid in Future Seaports

    DEFF Research Database (Denmark)

    Baizura Binti Ahamad, Nor; Othman @ Marzuki, Muzaidi Bin; Quintero, Juan Carlos Vasquez

    2018-01-01

    This paper presents the optimal design, specifies the dimensions and energy planning, and evaluates the performance of a microgrid supplying electricity to the load by using an integrated microgrid. The integrated system consists of PV, a wind turbine and a battery for grid-connected operation. This paper also analyzes the performance of the designed system based on a seaport located in Copenhagen, Denmark as a case study. The analysis is performed by using the Hybrid Optimization Model for Electric Renewables (HOMER) software, which includes optimization and sensitivity analysis results. The simulation result indicates...

  9. PERFORMANCE BASED PAY AS A DETERMINANT OF JOB SATISFACTION: A STUDY IN MALAYSIA GIATMARA CENTERS

    Directory of Open Access Journals (Sweden)

    Azman ISMAIL

    2011-01-01

    Compensation management literature highlights that performance based pay has two major characteristics: participation in pay systems and adequacy of pay. The ability of management to properly implement such pay systems may lead to increased job satisfaction in organizations. Though the nature of this relationship is interesting, little is known about the influence of performance based pay on job satisfaction in the compensation management literature. Therefore, this study was conducted to examine the relationship between pay for performance and job satisfaction in Malaysian GIATMARA centers. The results of exploratory factor analysis confirmed that the measurement scales used in this study satisfactorily met the standards of validity and reliability analyses. An outcome of stepwise regression analysis shows that the determinant of job satisfaction is performance based pay. Further, this result confirms that pay for performance is an important antecedent for job satisfaction in the studied organizations.

  10. Comparative Analysis of Reward and Employee Performance Based on Gender at Central Bank of Bank Sulut

    OpenAIRE

    Pandowo, Merinda; Lapian, S.L.H.V Joyce; Oroh, Ryan Vitaly

    2014-01-01

    Employees are a major asset of an organization, and employee performance has a large effect on organizational performance. One way to encourage employee performance is to reward employees. Rewards can be tangible, such as money and bonuses (extrinsic rewards), or intangible, such as promotions and holidays (intrinsic rewards). The purpose of this research is to analyze whether there are significant differences in employee performance based on reward between male and female employees. To achieve the objectives researcher ...

  11. Positioning performance analysis of the time sum of arrival algorithm with error features

    Science.gov (United States)

    Gong, Feng-xun; Ma, Yan-qiu

    2018-03-01

    The theoretical positioning accuracy of multilateration (MLAT) with the time difference of arrival (TDOA) algorithm is very high. However, there are some problems in practical applications. Here we analyze the location performance of the time sum of arrival (TSOA) algorithm in terms of the root mean square error (RMSE) and geometric dilution of precision (GDOP) in an additive white Gaussian noise (AWGN) environment. The TSOA localization model is constructed and the distribution of the location ambiguity region is presented for 4 base stations. The location performance analysis then starts from the 4-base-station case by calculating the variation of RMSE and GDOP. Subsequently, when the location parameters are changed, such as the number of base stations and the base station layout, the resulting changes in the performance of the TSOA location algorithm are shown, revealing the TSOA location characteristics and performance. The trends of RMSE and GDOP demonstrate the anti-noise performance and robustness of the TSOA localization algorithm. This anti-noise performance can be used to reduce the blind zone and the false location rate of MLAT systems.
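
    A generic sketch of the GDOP calculation mentioned above is shown below for a range-sum (TSOA-style) geometry with four base stations: the gradient of each range-sum measurement is the sum of two unit vectors, and GDOP is the square root of the trace of (H^T H)^-1. The station layout, reference-station convention and target position are illustrative assumptions, not the paper's configuration.

        # Sketch: GDOP for a range-sum (TSOA-style) geometry with 4 base stations.
        import numpy as np

        stations = np.array([[0.0, 0.0], [10.0, 0.0], [10.0, 10.0], [0.0, 10.0]])  # km
        reference = stations[0]          # station whose range is summed with each other station
        target = np.array([3.0, 4.0])

        def unit(v):
            return v / np.linalg.norm(v)

        # Each measurement is d(target, reference) + d(target, station_i);
        # its gradient with respect to the target position is the sum of two unit vectors.
        H = np.array([unit(target - reference) + unit(target - s) for s in stations[1:]])
        Q = np.linalg.inv(H.T @ H)       # covariance shape factor for unit measurement noise
        gdop = np.sqrt(np.trace(Q))
        print(f"GDOP at {target}: {gdop:.2f}")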

  12. Practical Switching-Based Hybrid FSO/RF Transmission and Its Performance Analysis

    KAUST Repository

    Usman, Muneer

    2014-10-01

    Hybrid free-space optical (FSO)/radio-frequency (RF) systems have emerged as a promising solution for high-data-rate wireless backhaul. We present and analyze a switching-based transmission scheme for the hybrid FSO/RF system. Specifically, either the FSO or RF link will be active at a certain time instance, with the FSO link enjoying a higher priority. We considered both a single-threshold case and a dual-threshold case for FSO link operation. Analytical expressions have been obtained for the outage probability, average bit error rate, and ergodic capacity for the resulting system. Numerical examples are presented to compare the performance of the hybrid scheme with the FSO-only scenario.
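
    A rough Monte Carlo sketch of the single-threshold switching logic described above is given below: the FSO link is used whenever its SNR exceeds the switching threshold, the RF link otherwise, and an outage is declared when the active link falls below its own requirement. The lognormal/Rayleigh fading models and all thresholds are simplifying assumptions for illustration, not the paper's exact channel models, which were treated analytically.

        # Sketch: outage probability of single-threshold FSO/RF switching (Monte Carlo).
        import numpy as np

        rng = np.random.default_rng(7)
        n = 1_000_000

        fso_mean_db, fso_sigma_db = 12.0, 4.0       # lognormal turbulence (dB), assumed
        rf_mean_snr = 10 ** (8.0 / 10)              # average RF SNR (linear), assumed
        fso_threshold = 10 ** (9.0 / 10)            # switch to RF below this FSO SNR
        rf_outage_snr = 10 ** (5.0 / 10)            # RF cannot support the rate below this

        fso_snr = 10 ** (rng.normal(fso_mean_db, fso_sigma_db, n) / 10)
        rf_snr = rng.exponential(rf_mean_snr, n)    # Rayleigh fading -> exponential SNR

        use_fso = fso_snr >= fso_threshold
        outage = ~use_fso & (rf_snr < rf_outage_snr)
        print(f"FSO active {use_fso.mean():.1%} of the time, outage probability {outage.mean():.2e}")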

  13. Practical Switching-Based Hybrid FSO/RF Transmission and Its Performance Analysis

    KAUST Repository

    Usman, Muneer; Hong-Chuan Yang; Alouini, Mohamed-Slim

    2014-01-01

    Hybrid free-space optical (FSO)/radio-frequency (RF) systems have emerged as a promising solution for high-data-rate wireless backhaul. We present and analyze a switching-based transmission scheme for the hybrid FSO/RF system. Specifically, either the FSO or RF link will be active at a certain time instance, with the FSO link enjoying a higher priority. We considered both a single-threshold case and a dual-threshold case for FSO link operation. Analytical expressions have been obtained for the outage probability, average bit error rate, and ergodic capacity for the resulting system. Numerical examples are presented to compare the performance of the hybrid scheme with the FSO-only scenario.

  14. Mass and performance optimization of an airplane wing leading edge structure against bird strike using Taguchi-based grey relational analysis

    Directory of Open Access Journals (Sweden)

    Hassan Pahange

    2016-08-01

    Collisions between birds and aircraft are one of the most dangerous threats to flight safety. In this study, the smoothed particle hydrodynamics (SPH) method is used for simulating a bird strike on an airplane wing leading edge structure. In order to verify the model, an experiment of a bird strike on a flat aluminum plate is first simulated, and then the bird impact on an airplane wing leading edge structure is investigated. After that, considering the dimensions of wing internal structural components such as the ribs, skin and spar as design variables, we try to minimize structural mass and wing skin deformation simultaneously. To do this, bird strike simulations of 18 different wing structures are run based on Taguchi’s L18 factorial design of experiments. Grey relational analysis is then used to minimize structural mass and wing skin deformation due to the bird strike. The analysis of variance (ANOVA) is also applied, and it is concluded that the most significant parameter for the performance of the wing structure against impact is the skin thickness. Finally, a validation simulation is conducted under the optimal condition to show the improvement in performance of the wing structure.
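
    The grey relational analysis step mentioned above can be sketched in a few lines of Python: normalize the two smaller-the-better responses, form grey relational coefficients with a distinguishing coefficient of 0.5, and average them into a grade per run. The response values below are invented purely to show the mechanics, not results from the study.

        # Sketch: grey relational analysis for two smaller-the-better responses
        # (structural mass, skin deformation); rows are runs of an orthogonal array.
        import numpy as np

        responses = np.array([
            [12.1, 30.5],
            [11.4, 34.2],
            [13.0, 27.8],
            [12.6, 29.9],
        ])

        # Smaller-the-better normalization to [0, 1]
        norm = (responses.max(axis=0) - responses) / (responses.max(axis=0) - responses.min(axis=0))

        delta = 1.0 - norm                       # deviation from the ideal sequence
        zeta = 0.5                               # distinguishing coefficient
        grc = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())

        grade = grc.mean(axis=1)                 # equal-weight grey relational grade
        print("best run:", int(grade.argmax()) + 1, grade.round(3))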

  15. Performance optimisations for distributed analysis in ALICE

    CERN Document Server

    Betev, L; Gheata, M; Grigoras, C; Hristov, P

    2014-01-01

    Performance is a critical issue in a production system accommodating hundreds of analysis users. Compared to a local session, distributed analysis is exposed to services and network latencies, remote data access and heterogeneous computing infrastructure, creating a more complex performance and efficiency optimization matrix. During the last 2 years, ALICE analysis shifted from a fast development phase to more mature and stable code. At the same time, the frameworks and tools for deployment, monitoring and management of large productions have evolved considerably too. The ALICE Grid production system is currently used by a fair share of organized and individual user analysis, consuming up to 30% of the available resources and ranging from fully I/O-bound analysis code to CPU-intensive correlations or resonances studies. While the intrinsic analysis performance is unlikely to improve by a large factor during the LHC long shutdown (LS1), the overall efficiency of the system has still to be improved by a...

  16. Design and Performance Analysis of Laser Displacement Sensor Based on Position Sensitive Detector (PSD)

    International Nuclear Information System (INIS)

    Song, H X; Wang, X D; Ma, L Q; Cai, M Z; Cao, T Z

    2006-01-01

    By using a PSD as the sensitive element and a laser diode as the emitting element, laser displacement sensors based on the triangulation method have been widely used. The sensor and its performance were studied from the design point of view. Two different sensor configurations are described, and the determination of their dimensions and sensing resolution, together with a comparison of the two configurations, is presented. The factors affecting the performance of the laser displacement sensor are discussed, and two methods which can eliminate the effects of dark current and ambient light are proposed.

  17. Performance-based, cost- and time-effective PCB analytical methodology

    International Nuclear Information System (INIS)

    Alvarado, J. S.

    1998-01-01

    Laboratory applications for the analysis of PCBs (polychlorinated biphenyls) in environmental matrices such as soil/sediment/sludge and oil/waste oil were evaluated for potential reduction in waste, source reduction, and alternative techniques for final determination. As a consequence, new procedures were studied for solvent substitution, miniaturization of extraction and cleanups, minimization of reagent consumption, reduction of cost per analysis, and reduction of time. These new procedures provide adequate data that meet all the performance requirements for the determination of PCBs. Use of the new procedures reduced costs for all sample preparation techniques. Time and cost were also reduced by combining the new sample preparation procedures with the power of fast gas chromatography. Separation of Aroclor 1254 was achieved in less than 6 min by using DB-1 and SPB-608 columns. With the greatly shortened run times, reproducibility can be tested quickly and consequently with low cost. With performance-based methodology, the applications presented here can be applied now, without waiting for regulatory approval

  18. Permanent Magnet Flux-Switching Machine, Optimal Design and Performance Analysis

    Directory of Open Access Journals (Sweden)

    Liviu Emilian Somesan

    2013-01-01

    In this paper an analytical sizing-design procedure for a typical permanent magnet flux-switching machine (PMFSM) with 12 stator and 10 rotor poles is presented. An optimal design, based on the Hooke-Jeeves method with the objective function of maximum torque density, is performed. The results were validated via two-dimensional finite element analysis (2D-FEA) applied to the optimized structure. The influence of the permanent magnet (PM) dimensions and type, and of the rotor pole shape, on the machine performance was also studied via 2D-FEA.

  19. Comprehensive analysis of transport aircraft flight performance

    Science.gov (United States)

    Filippone, Antonio

    2008-04-01

    This paper reviews the state-of-the art in comprehensive performance codes for fixed-wing aircraft. The importance of system analysis in flight performance is discussed. The paper highlights the role of aerodynamics, propulsion, flight mechanics, aeroacoustics, flight operation, numerical optimisation, stochastic methods and numerical analysis. The latter discipline is used to investigate the sensitivities of the sub-systems to uncertainties in critical state parameters or functional parameters. The paper discusses critically the data used for performance analysis, and the areas where progress is required. Comprehensive analysis codes can be used for mission fuel planning, envelope exploration, competition analysis, a wide variety of environmental studies, marketing analysis, aircraft certification and conceptual aircraft design. A comprehensive program that uses the multi-disciplinary approach for transport aircraft is presented. The model includes a geometry deck, a separate engine input deck with the main parameters, a database of engine performance from an independent simulation, and an operational deck. The comprehensive code has modules for deriving the geometry from bitmap files, an aerodynamics model for all flight conditions, a flight mechanics model for flight envelopes and mission analysis, an aircraft noise model and engine emissions. The model is validated at different levels. Validation of the aerodynamic model is done against the scale models DLR-F4 and F6. A general model analysis and flight envelope exploration are shown for the Boeing B-777-300 with GE-90 turbofan engines with intermediate passenger capacity (394 passengers in 2 classes). Validation of the flight model is done by sensitivity analysis on the wetted area (or profile drag), on the specific air range, the brake-release gross weight and the aircraft noise. A variety of results is shown, including specific air range charts, take-off weight-altitude charts, payload-range performance

  20. Evaluation of Contamination Inspection and Analysis Methods through Modeling System Performance

    Science.gov (United States)

    Seasly, Elaine; Dever, Jason; Stuban, Steven M. F.

    2016-01-01

    Contamination is usually identified as a risk on the risk register for sensitive space systems hardware. Despite detailed, time-consuming, and costly contamination control efforts during assembly, integration, and test of space systems, contaminants are still found during visual inspections of hardware. Improved methods are needed to gather information during systems integration to catch potential contamination issues earlier and manage contamination risks better. This research explores evaluation of contamination inspection and analysis methods to determine optical system sensitivity to minimum detectable molecular contamination levels based on IEST-STD-CC1246E non-volatile residue (NVR) cleanliness levels. Potential future degradation of the system is modeled given chosen modules representative of optical elements in an optical system, minimum detectable molecular contamination levels for a chosen inspection and analysis method, and determining the effect of contamination on the system. By modeling system performance based on when molecular contamination is detected during systems integration and at what cleanliness level, the decision maker can perform trades amongst different inspection and analysis methods and determine if a planned method is adequate to meet system requirements and manage contamination risk.

  1. Evaluating Service Quality from Patients' Perceptions: Application of Importance-performance Analysis Method.

    Science.gov (United States)

    Mohebifar, Rafat; Hasani, Hana; Barikani, Ameneh; Rafiei, Sima

    2016-08-01

    Providing high service quality is one of the main functions of health systems. Measuring service quality is the basic prerequisite for improving quality. The aim of this study was to evaluate the quality of service in teaching hospitals using an importance-performance analysis matrix. A descriptive-analytic study was conducted through a cross-sectional method in six academic hospitals of Qazvin, Iran, in 2012. A total of 360 patients contributed to the study. The sampling technique was stratified random sampling. Required data were collected based on a standard questionnaire (SERVQUAL). Data analysis was done through SPSS version 18 statistical software and the importance-performance analysis matrix. The results showed a significant gap between importance and performance in all five dimensions of service quality; "responsiveness" had the lowest gap (1.97). Also, according to the findings, reliability and assurance were in Quadrant (I), empathy was in Quadrant (II), and tangibles and responsiveness were in Quadrant (IV) of the importance-performance matrix. The negative gap in all dimensions of quality shows that quality improvement is necessary in all dimensions. Using quality and diagnosis measurement instruments such as importance-performance analysis will help hospital managers with planning service quality improvement and achieving long-term goals.
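
    A minimal sketch of the importance-performance analysis described above follows: compute the gap (importance minus performance) per dimension and place each dimension in a quadrant by comparing its scores with the grand means. The scores are hypothetical, and the quadrant labels follow one common IPA convention, which may not match the numbering used in the study.

        # Sketch: importance-performance analysis (IPA) for SERVQUAL-style dimensions.
        dimensions = {
            # name: (mean importance, mean performance) -- hypothetical scores
            "tangibles":      (4.4, 3.1),
            "reliability":    (4.8, 2.9),
            "responsiveness": (4.2, 3.3),
            "assurance":      (4.7, 3.0),
            "empathy":        (4.1, 3.2),
        }

        imp_mean = sum(i for i, _ in dimensions.values()) / len(dimensions)
        perf_mean = sum(p for _, p in dimensions.values()) / len(dimensions)

        for name, (imp, perf) in dimensions.items():
            gap = imp - perf                      # service quality gap
            if imp >= imp_mean and perf < perf_mean:
                quadrant = "concentrate here"
            elif imp >= imp_mean and perf >= perf_mean:
                quadrant = "keep up the good work"
            elif imp < imp_mean and perf < perf_mean:
                quadrant = "low priority"
            else:
                quadrant = "possible overkill"
            print(f"{name:14s} gap={gap:.2f}  {quadrant}")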

  2. Development of Performance Analysis Program for an Axial Compressor with Meanline Analysis

    International Nuclear Information System (INIS)

    Park, Jun Young; Park, Moo Ryong; Choi, Bum Suk; Song, Je Wook

    2009-01-01

    The axial-flow compressor is one of the most important parts of gas turbine units, along with the axial turbine and combustor. Therefore, precise prediction of performance is very important for the development of a new compressor or the modification of an existing one. Meanline analysis is a simple, fast and powerful method for performance prediction of axial-flow compressors with different geometries, and is therefore frequently used in the preliminary design stage and for performance analysis of a given geometry. Many correlations for meanline analysis have been developed theoretically and experimentally over a long time for estimating the various types of losses and the flow deviation angle. In the present study, a meanline analysis program was developed to estimate compressor losses, incidence angles, deviation angles, and stall and surge conditions with many correlations. Performance prediction of single-stage axial compressors is conducted with this meanline analysis program. The comparison between experimental and numerical results shows good agreement. This meanline analysis program can be used for various types of single-stage axial-flow compressors with different geometries, as well as for multistage axial-flow compressors.

  3. Comparative Visual Analysis of Structure-Performance Relations in Complex Bulk-Heterojunction Morphologies

    KAUST Repository

    Aboulhassan, A.

    2017-07-04

    The structure of Bulk-Heterojunction (BHJ) materials, the main component of organic photovoltaic solar cells, is very complex, and the relationship between structure and performance is still largely an open question. Overall, there is a wide spectrum of fabrication configurations resulting in different BHJ morphologies and correspondingly different performances. Current state-of-the-art methods for assessing the performance of BHJ morphologies are either based on global quantification of morphological features or simply on visual inspection of the morphology based on experimental imaging. This makes finding optimal BHJ structures very challenging. Moreover, finding the optimal fabrication parameters to get an optimal structure is still an open question. In this paper, we propose a visual analysis framework to help answer these questions through comparative visualization and parameter space exploration for local morphology features. With our approach, we enable scientists to explore multivariate correlations between local features and performance indicators of BHJ morphologies. Our framework is built on shape-based clustering of local cubical regions of the morphology that we call patches. This enables correlating the features of clusters with intuition-based performance indicators computed from geometrical and topological features of charge paths.

  4. Comparative Visual Analysis of Structure-Performance Relations in Complex Bulk-Heterojunction Morphologies

    KAUST Repository

    Aboulhassan, A.; Sicat, R.; Baum, D.; Wodo, O.; Hadwiger, Markus

    2017-01-01

    The structure of Bulk-Heterojunction (BHJ) materials, the main component of organic photovoltaic solar cells, is very complex, and the relationship between structure and performance is still largely an open question. Overall, there is a wide spectrum of fabrication configurations resulting in different BHJ morphologies and correspondingly different performances. Current state-of-the-art methods for assessing the performance of BHJ morphologies are either based on global quantification of morphological features or simply on visual inspection of the morphology based on experimental imaging. This makes finding optimal BHJ structures very challenging. Moreover, finding the optimal fabrication parameters to get an optimal structure is still an open question. In this paper, we propose a visual analysis framework to help answer these questions through comparative visualization and parameter space exploration for local morphology features. With our approach, we enable scientists to explore multivariate correlations between local features and performance indicators of BHJ morphologies. Our framework is built on shape-based clustering of local cubical regions of the morphology that we call patches. This enables correlating the features of clusters with intuition-based performance indicators computed from geometrical and topological features of charge paths.

  5. Commissioning and Performance Analysis of WhisperGen Stirling Engine

    Science.gov (United States)

    Pradip, Prashant Kaliram

    Stirling engine based cogeneration systems have the potential to reduce energy consumption and greenhouse gas emissions, due to their high cogeneration efficiency and the emission control afforded by steady external combustion. To date, most studies on this unit have focused on performance based on both experimentation and computer models, and lack experimental data for diversified operating ranges. This thesis starts with the commissioning of a WhisperGen Stirling engine with components and instrumentation to evaluate the power and thermal performance of the system. Next, a parametric study on primary engine variables, including air, diesel, and coolant flowrates and temperatures, was carried out to further understand their effect on engine power and efficiency. This trend was then validated with the thermodynamic model developed for the energy analysis of a Stirling cycle. Finally, the energy balance of the Stirling engine was compared without and with heat recovery from the engine block and the combustion chamber exhaust.

  6. PERFORMANCE ANALYSIS OF AI BASED QOS SCHEDULER FOR MOBILE WIMAX

    Directory of Open Access Journals (Sweden)

    D. David Neels Pon Kumar

    2012-09-01

    Interest in broadband wireless access (BWA) has been growing due to increased user mobility and the need for data access at all times. IEEE 802.16e based WiMAX networks promise the best available quality of experience for mobile data service users. WiMAX networks incorporate several Quality of Service (QoS) mechanisms at the Media Access Control (MAC) level to guarantee services for multimedia, viz. data, voice and video. The problem of assuring QoS is how to allocate available resources among users to meet QoS criteria such as delay, delay jitter, fairness and throughput requirements. The IEEE standard does not include a standard scheduling mechanism and leaves it to implementers to differentiate themselves. Although many real-time and non-real-time packet scheduling schemes have been proposed, they need to be modified to apply to the Mobile WiMAX system, which supports five kinds of service classes. In this paper, we propose a novel priority-based scheduling scheme that uses Artificial Intelligence to support various services by considering the QoS constraints of each class. The simulation results show that slow mobility does not affect performance, whereas faster mobility and an increase in the number of users beyond a particular load influence the average throughput, average per-user throughput, fairness index, average end-to-end delay and average delay jitter. Nevertheless, the results are encouraging in that the proposed scheme provides QoS support for each class efficiently.

  7. Preliminary Analysis of Remote Monitoring and Robotic Concepts for Performance Confirmation

    International Nuclear Information System (INIS)

    McAffee, D.A.

    1997-01-01

    main Performance Confirmation monitoring needs and requirements during the post-emplacement preclosure period. This includes radiological, non-radiological, host rock, and infrastructure performance monitoring needs. It also includes monitoring for possible off-normal events. (Presented in Section 7.3). (3) Identify general approaches and methods for obtaining performance information from within the emplacement drifts for Performance Confirmation. (Presented in Section 7.4) (4)Review and discuss available technologies and design strategies that may permit the use of remotely operated systems within the hostile thermal and radiation environment expected within the emplacement drifts. (Presented in Section 7.5). (5) Based on Performance Confirmation monitoring needs and available technologies, identify potential application areas for remote systems and robotics for post-emplacement preclosure Performance Confirmation activities (Presented in Section 7.6). (6) Develop preliminary remote monitoring and robotic concepts for post-emplacement, preclosure Performance Confirmation activities. (Presented in Section 7.7) This analysis is being performed very early in the systems engineering cycle, even as issues related to the Performance Confirmation program planning phase are being formulated and while the associated needs, constraints and objectives are yet to be fully determined and defined. This analysis is part of an issue formulation effort and is primarily concerned with identification and description of key issues related to remotely monitoring repository performance for Performance Confirmation. One of the purposes of this analysis is to provide an early investigation of potential design challenges that may have a high impact on future design concepts. This analysis can be used to guide future concept development and help access what is feasible and achievable by application of remote systems technology. Future design and systems engineering analysis with applicable

  8. Performance Analysis of Receive Diversity in Wireless Sensor Networks over GBSBE Models

    Science.gov (United States)

    Goel, Shivali; Abawajy, Jemal H.; Kim, Tai-hoon

    2010-01-01

    Wireless sensor networks have attracted a lot of attention recently. In this paper, we develop a channel model based on the elliptical model for multipath components involving randomly placed scatterers in the scattering region with sensors deployed on a field. We verify that in a sensor network, the use of receive diversity techniques improves the performance of the system. Extensive performance analysis of the system is carried out for both single and multiple antennas with the applied receive diversity techniques. Performance analyses based on variations in receiver height, maximum multipath delay and transmit power have been performed considering different numbers of antenna elements present in the receiver array. Our results show that increasing the number of antenna elements for a wireless sensor network does indeed improve the BER that can be obtained. PMID:22163510
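
    To illustrate the receive-diversity effect reported above, the Python sketch below runs a Monte Carlo estimate of the BPSK bit error rate with maximal-ratio combining over independent Rayleigh branches; the paper's GBSBE elliptical channel is replaced here by flat Rayleigh fading purely for brevity, so the numbers are illustrative only.

        # Sketch: BER of BPSK with maximal-ratio combining over Rayleigh fading branches.
        import numpy as np

        rng = np.random.default_rng(1)

        def ber_mrc(n_antennas, snr_db, n_bits=200_000):
            snr = 10 ** (snr_db / 10)
            bits = rng.integers(0, 2, n_bits)
            s = 2 * bits - 1                                            # BPSK symbols
            h = (rng.normal(size=(n_antennas, n_bits)) +
                 1j * rng.normal(size=(n_antennas, n_bits))) / np.sqrt(2)
            n = (rng.normal(size=h.shape) + 1j * rng.normal(size=h.shape)) / np.sqrt(2 * snr)
            r = h * s + n
            combined = np.sum(np.conj(h) * r, axis=0)                   # maximal-ratio combining
            return np.mean((combined.real > 0).astype(int) != bits)

        for m in (1, 2, 4):
            print(f"{m} antenna(s): BER ~ {ber_mrc(m, snr_db=10):.4f}")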

  9. How reliable are geometry-based building indices as thermal performance indicators?

    International Nuclear Information System (INIS)

    Rodrigues, Eugénio; Amaral, Ana Rita; Gaspar, Adélio Rodrigues; Gomes, Álvaro

    2015-01-01

    Highlights: • Geometry-based building indices are tested in different European climate regions. • Building design programs are used to randomly generate sets of simulation models. • Some indices correlate in specific climates and design programs. • Shape-based Relative Compactness presented the best correlation of all indices. • Window-to-Surface Ratio was the window-based index with best correlation. - Abstract: Architects and urban planners have been relying on geometry-based indices to design more energy efficient buildings for years. The advantage of such indices is their ease of use and capability to capture the relation of a few geometric variables with the building’s performance. However, such relation is usually found using only a few simple building models and considering only a few climate regions. This paper presents the analysis of six geometry-based building indices to determine their adequacy in eight different climate regions in Europe. For each location, three residential building design programs were used as building specifications. Two algorithms were employed to randomly generate and assess the thermal performance of three sets of 500 alternative building models. The results show that geometry-based indices only correlate with the buildings’ thermal performance according to specific climate regions and building design programs
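
    Two of the geometry-based indices discussed above can be computed directly from a building's envelope, as in the Python sketch below for a simple box-shaped volume. The dimensions are arbitrary, and the index definitions used here (Relative Compactness referenced to a cube of equal volume, window area over total envelope area) are the usual ones and may differ in detail from the study's.

        # Sketch: Relative Compactness (RC) and Window-to-Surface Ratio (WSR) for a box.
        length, width, height = 20.0, 10.0, 6.0     # m, arbitrary
        window_area = 90.0                          # m^2, arbitrary

        volume = length * width * height
        envelope = 2 * (length * width + length * height + width * height)

        cube_side = volume ** (1.0 / 3.0)           # cube with the same volume
        cube_surface = 6 * cube_side ** 2

        relative_compactness = cube_surface / envelope   # 1.0 for a cube, smaller otherwise
        window_to_surface = window_area / envelope
        print(f"RC = {relative_compactness:.3f}, WSR = {window_to_surface:.3f}")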

  10. Performance Analysis: Control of Hazardous Energy

    Energy Technology Data Exchange (ETDEWEB)

    De Grange, Connie E. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Freeman, Jeff W. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kerr, Christine E. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2010-10-06

    LLNL experienced 26 occurrences related to the control of hazardous energy from January 1, 2008 through August 2010. These occurrences were 17% of the total number of reported occurrences during this 32-month period. The Performance Analysis and Reporting Section of the Contractor Assurance Office (CAO) routinely analyzes reported occurrences and issues looking for patterns that may indicate changes in LLNL’s performance and early indications of performance trends. It became apparent through these analyses that LLNL might have experienced a change in the control of hazardous energy and that these occurrences should be analyzed in more detail to determine if the perceived change in performance was real, whether that change is significant and if the causes of the occurrences are similar. This report documents the results of this more detailed analysis.

  11. CONPAS 1.0 (CONtainment Performance Analysis System). User's manual

    International Nuclear Information System (INIS)

    Ahn, Kwang Il; Jin, Young Ho

    1996-04-01

    CONPAS (CONtainment Performance Analysis System) is a verified computer code package that integrates the numerical, graphical, and results-operation aspects of Level 2 probabilistic safety assessments (PSA) for nuclear power plants automatically under a PC window environment. Compared with the existing DOS-based computer codes for Level 2 PSA, the most important merit of the window-based code is that the user can easily describe and quantify the accident progression models and manipulate the resultant outputs in a variety of ways. As the main logic for accident progression analysis, CONPAS employs the concept of a small containment phenomenological event tree (CPET), helpful for visually tracing out individual accident progressions, and of a large supporting event tree (LSET) for its detailed quantification. For the integrated analysis of Level 2 PSA, the code utilizes four distinct but closely related modules: (1) ET Editor for construction of the event tree models describing the accident progressions, (2) Computer for quantification of the constructed event trees and graphical display of the resultant outputs, (3) Text Editor for preparation of input decks for quantification and utilization of calculational results, and (4) Mechanistic Code Plotter for utilization of results obtained from severe accident analysis codes. Compared with other existing computer codes for Level 2 PSA, the CONPAS code provides several advanced features: computational aspects including systematic uncertainty analysis, importance analysis, sensitivity analysis and data interpretation, and reporting aspects including tabulation and graphics as well as a user-friendly interface. 10 refs. (Author)
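
    As a hedged illustration of the event-tree quantification that tools of this kind automate, the Python sketch below multiplies an assumed initiating-event frequency by the branch probabilities along every path of a small containment event tree; the top events and all numbers are invented for the example and do not come from CONPAS.

        # Sketch: quantifying a small containment event tree by enumerating its paths.
        from itertools import product

        init_freq = 1.0e-5                     # initiating event frequency (/yr), assumed
        branch_success = {                     # probability that each top event succeeds, assumed
            "no early containment failure": 0.95,
            "debris cooled":                0.80,
            "no late overpressure failure": 0.90,
        }

        for outcomes in product([True, False], repeat=len(branch_success)):
            p = init_freq
            for (name, p_success), ok in zip(branch_success.items(), outcomes):
                p *= p_success if ok else (1.0 - p_success)
            label = ", ".join(("OK: " if ok else "FAIL: ") + n
                              for n, ok in zip(branch_success, outcomes))
            print(f"{p:.2e}/yr  {label}")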

  12. Structure-based capacitance modeling and power loss analysis for the latest high-performance slant field-plate trench MOSFET

    Science.gov (United States)

    Kobayashi, Kenya; Sudo, Masaki; Omura, Ichiro

    2018-04-01

    Field-plate trench MOSFETs (FP-MOSFETs), with the features of ultralow on-resistance and very low gate–drain charge, are currently the mainstream of high-performance applications, and their advancement as low-voltage silicon power devices is continuing. However, owing to their structure, their output capacitance (Coss), which leads to the main power loss, remains a problem, especially in megahertz switching. In this study, we propose a structure-based capacitance model of FP-MOSFETs for calculating power loss easily under various conditions. Appropriate equations were modeled for the Coss curves as three divided components. The output charge (Qoss) and stored energy (Eoss) calculated using the model corresponded well to technology computer-aided design (TCAD) simulation, and we validated the accuracy of the model quantitatively. In the power loss analysis of FP-MOSFETs, turn-off loss was sufficiently suppressed; however, the Qoss loss increased with switching frequency. This analysis reveals that Qoss may become a significant issue in next-generation high-efficiency FP-MOSFETs.

  13. Identification of human operator performance models utilizing time series analysis

    Science.gov (United States)

    Holden, F. M.; Shinners, S. M.

    1973-01-01

    The results of an effort performed by Sperry Systems Management Division for AMRL in applying time series analysis as a tool for modeling the human operator are presented. This technique is utilized for determining the variation of the human transfer function under various levels of stress. The human operator's model is determined based on actual input and output data from a tracking experiment.

  14. Knowledge-base for the new human reliability analysis method, A Technique for Human Error Analysis (ATHEANA)

    International Nuclear Information System (INIS)

    Cooper, S.E.; Wreathall, J.; Thompson, C.M., Drouin, M.; Bley, D.C.

    1996-01-01

    This paper describes the knowledge base for the application of the new human reliability analysis (HRA) method, 'A Technique for Human Error Analysis' (ATHEANA). Since application of ATHEANA requires the identification of previously unmodeled human failure events, especially errors of commission, and associated error-forcing contexts (i.e., combinations of plant conditions and performance shaping factors), this knowledge base is an essential aid for the HRA analyst.

  15. Performance-based planning and programming guidebook.

    Science.gov (United States)

    2013-09-01

    "Performance-based planning and programming (PBPP) refers to the application of performance management principles within the planning and programming processes of transportation agencies to achieve desired performance outcomes for the multimodal tran...

  16. Thermodynamic performance analysis and algorithm model of multi-pressure heat recovery steam generators (HRSG) based on heat exchangers layout

    International Nuclear Information System (INIS)

    Feng, Hongcui; Zhong, Wei; Wu, Yanling; Tong, Shuiguang

    2014-01-01

    Highlights: • A general model of multi-pressure HRSG based on heat exchanger layout is built. • The minimum temperature difference is introduced to replace pinch point analysis. • Effects of layout on dual-pressure HRSG thermodynamic performance are analyzed. - Abstract: Changes in the heat exchanger layout of a heat recovery steam generator (HRSG) modify the amount of waste heat recovered from the flue gas; this motivates optimization of the HRSG design. In this paper a model of a multi-pressure HRSG is built, and an instance of a dual-pressure HRSG under three different layouts from Taihu Boiler Co., Ltd. is discussed. With specified values of the flue gas inlet temperature, mass flow rate and composition, and of the water/steam parameters such as temperature and pressure, the steam mass flow rate and heat efficiency of the different HRSG heat exchanger layouts are analyzed. The analysis is based on the laws of thermodynamics and is incorporated into the energy balance equations for the heat exchangers. In the conclusion, the steam mass flow rates and heat efficiencies obtained for the three heat exchanger layouts are compared. The results show that optimization of the heat exchanger layout of HRSGs is of great significance for waste heat recovery and energy conservation.
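
    As an illustration of the energy-balance step underlying the model above, the short Python sketch below equates the gas-side heat release of one evaporator section with the water/steam-side enthalpy rise to back out a steam mass flow; all temperatures, flows and enthalpies are invented for the example and are not the paper's data.

        # Sketch: heat-exchanger energy balance for one HRSG evaporator section.
        gas_flow = 500.0        # kg/s flue gas, assumed
        cp_gas = 1.1            # kJ/(kg K), assumed constant
        t_gas_in = 600.0        # degC entering the section
        t_gas_out = 320.0       # degC leaving the section (set by the minimum temperature difference)

        h_steam_out = 2800.0    # kJ/kg, steam leaving the section, assumed
        h_water_in = 700.0      # kJ/kg, feedwater entering the section, assumed

        q_gas = gas_flow * cp_gas * (t_gas_in - t_gas_out)   # heat released by the gas, kW
        steam_flow = q_gas / (h_steam_out - h_water_in)      # steam generated, kg/s
        print(f"recovered heat = {q_gas / 1000:.1f} MW, steam flow = {steam_flow:.1f} kg/s")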

  17. Modeling and performance analysis of cambered wing-based piezoaeroelastic energy harvesters

    International Nuclear Information System (INIS)

    Abdelkefi, Abdessattar; Nuhait, Abdullah O

    2013-01-01

    We investigate the effects of aerodynamic loads on the performance of wing-based piezoaeroelastic energy harvesters. The rigid airfoil consists of pitch and plunge degrees of freedom supported by flexural and torsional springs with a piezoelectric coupling attached to the plunge degree of freedom. The effects of aerodynamic loads are investigated by considering a camber in the airfoil. A two-dimensional unsteady vortex-lattice method (UVLM) is used to model the unsteady aerodynamic loads. An iterative scheme based on Hamming’s fourth-order predictor–corrector method is employed to solve the governing equations simultaneously and interactively. The effects of varying the camber, its location, and the nonlinear torsional spring coefficient are determined. The results show that, for small values of the camber location, the flutter speed changes greatly on increasing the camber of the airfoil. On the other hand, for large values of the camber location, the variation of the flutter speed when changing the camber is very negligible. We demonstrate that the symmetric airfoil case is the best configuration to design enhanced wing-based piezoaeroelastic energy harvesters. Furthermore, the results show that an increase in the camber results in a decrease in the level of the harvested power. For cambered airfoils, we demonstrate that an increase in the camber location leads to an increase in the level of the harvested power. The results show that an increase in the airfoil camber delays the appearance of a secondary Hopf bifurcation. (paper)

  18. Frequency scanning-based stability analysis method for grid-connected inverter system

    DEFF Research Database (Denmark)

    Wang, Yanbo; Wang, Xiongfei; Blaabjerg, Frede

    2017-01-01

    This paper proposes a frequency scanning-based impedance analysis for stability assessment of grid-connected inverter systems, which is able to perform stability assessment without using system mathematical models and inherits the superior feature of the impedance-based stability criterion with consideration of the inverter nonlinearities. A small current disturbance is injected into the grid-connected inverter system over a particular frequency range, the impedance is computed from the harmonic-frequency response using Fourier analysis, and the stability is then predicted on the basis of the impedance stability criterion. The stability issues of grid-connected inverters with grid-current feedback and converter-current feedback are addressed using the proposed method. The results obtained from simulation and experiments validate the effectiveness of the method.
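
    The Python sketch below illustrates the frequency-scanning idea in the abstract: inject a small sinusoidal perturbation, record the response, and take the ratio of the Fourier components at the injected frequency to obtain the impedance there. The inverter/grid interface is replaced by a simple series R-L branch, and all parameter values are assumptions made for the example.

        # Sketch: impedance estimation by frequency scanning and Fourier analysis.
        import numpy as np

        R, L = 0.5, 2e-3                 # stand-in impedance parameters (ohm, H), assumed
        fs, T = 20_000, 0.2              # sampling rate (Hz) and record length (s)
        t = np.arange(0, T, 1 / fs)

        def impedance_at(f_inj, amp=0.1):
            i = amp * np.sin(2 * np.pi * f_inj * t)            # injected current perturbation
            v = R * i + L * np.gradient(i, 1 / fs)             # voltage response of the R-L branch
            ref = np.exp(-2j * np.pi * f_inj * t)              # Fourier component at f_inj
            V = 2 / len(t) * np.sum(v * ref)
            I = 2 / len(t) * np.sum(i * ref)
            return V / I

        for f in (50, 200, 800):
            z = impedance_at(f)
            print(f"{f:4d} Hz: |Z| = {abs(z):.3f} ohm, angle = {np.degrees(np.angle(z)):.1f} deg")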

  19. Crew Exploration Vehicle Launch Abort Controller Performance Analysis

    Science.gov (United States)

    Sparks, Dean W., Jr.; Raney, David L.

    2007-01-01

    This paper covers the simulation and evaluation of a controller design for the Crew Module (CM) Launch Abort System (LAS), to measure its ability to meet the abort performance requirements. The controller used in this study is a hybrid design, including features developed by the Government and the Contractor. Testing is done using two separate 6-degree-of-freedom (DOF) computer simulation implementations of the LAS/CM throughout the ascent trajectory: 1) executing a series of abort simulations along a nominal trajectory for the nominal LAS/CM system; and 2) using a series of Monte Carlo runs with perturbed initial flight conditions and perturbed system parameters. The performance of the controller is evaluated against a set of criteria, which is based upon the current functional requirements of the LAS. Preliminary analysis indicates that the performance of the present controller meets (with the exception of a few cases) the evaluation criteria mentioned above.

  20. Performance Analysis of Segmentation of Hyperspectral Images Based on Color Image Segmentation

    Directory of Open Access Journals (Sweden)

    Praveen Agarwal

    2017-06-01

    Image segmentation is a fundamental approach in the field of image processing whose use depends on the user's application. This paper proposes an original and simple segmentation strategy based on the EM approach that resolves many informatics problems concerning hyperspectral images observed by airborne sensors. In a first step, the input color textured image is simplified into a color image without texture. The final segmentation is then achieved by a spatial color segmentation using a feature vector built from the set of color values contained around the pixel to be classified, with some mathematical equations. The spatial constraint allows taking into account the inherent spatial relationships of any image and its color. This approach provides an effective PSNR for the segmented image. The results show better performance when the segmented images are compared with the Watershed and Region Growing algorithms, and the approach provides effective segmentation for spectral images and medical images.
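
    Since the abstract scores results by PSNR, a minimal PSNR computation is sketched below; the 8-bit arrays are synthetic stand-ins for a reference image and its segmented/reconstructed counterpart.

        # Sketch: peak signal-to-noise ratio between a reference and a test image.
        import numpy as np

        def psnr(reference, test, peak=255.0):
            mse = np.mean((reference.astype(float) - test.astype(float)) ** 2)
            return float("inf") if mse == 0 else 10 * np.log10(peak ** 2 / mse)

        rng = np.random.default_rng(0)
        ref = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
        seg = np.clip(ref + rng.normal(0, 5, ref.shape), 0, 255).astype(np.uint8)
        print(f"PSNR = {psnr(ref, seg):.1f} dB")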

  1. Development of performance analysis support modules in Younggwang NPP unit 3 and 4

    International Nuclear Information System (INIS)

    Heo, G. Y.; Lee, S. J.; Jang, S. H.; Choi, S. S.; Choi, Q. H.; Gee, M. H.; Heo, I.

    2003-01-01

    This paper covers the unmeasured parameter estimation module, the performance degradation estimation module, and the performance degradation diagnosis module, which are the supporting modules of the thermal performance analysis program of Younggwang nuclear power plant units 3 and 4, PERUPS (PERformance UPgrade System), developed by KHNP, KEPRI, Hoseo University, and ACT. The unmeasured parameter estimation module generates estimated values for parameters that are necessary for the analysis but for which no sensors are installed. The performance degradation estimation module shows the impact on electric gain and loss of the parameters related to turbine cycle performance. The performance degradation diagnosis module provides the belief in each degradation cause, considering the measurement uncertainty, analysis uncertainty, and correlation among parameters. Reference data for the development of each module were prepared with a turbine cycle simulation tool, PEPSE. The unmeasured parameter estimation module and the performance degradation estimation module derive their estimation correlations by regression analysis using the reference data. In the performance degradation diagnosis module, a Bayesian network is used for the modeling of uncertainty and the knowledge base. The developed modules were validated with test data generated by PEPSE and then again with actual plant data.

  2. TAP 2, Performance-Based Training Manual

    Energy Technology Data Exchange (ETDEWEB)

    1991-07-01

    Training programs at DOE nuclear facilities should provide well-trained, qualified personnel to safely and efficiently operate the facilities in accordance with DOE requirements. A need has been identified for guidance regarding analysis, design, development, implementation, and evaluation of consistent and reliable performance-based training programs. Accreditation of training programs at Category A reactors and high-hazard and selected moderate-hazard nonreactor facilities will assure consistent, appropriate, and cost-effective training of personnel responsible for the operation, maintenance, and technical support of these facilities. Training programs that are designed and based on a systematic analysis of job requirements, instead of subjective estimation of trainee needs, yield training activities that are consistent and develop or improve knowledge, skills, and abilities that can be directly related to the work setting. Because the training is job-related, the content of these programs more efficiently and effectively meets the needs of the employee. Besides a better trained work force, a greater level of operational reactor safety can be realized. This manual is intended to provide an overview of the accreditation process and a brief description of the elements necessary to construct and maintain training programs that are based on the requirements of the job. Two comparison manuals provide additional information to assist contractors in their efforts to accredit training programs.

  3. TAP 2, Performance-Based Training Manual

    International Nuclear Information System (INIS)

    1991-07-01

    Training programs at DOE nuclear facilities should provide well-trained, qualified personnel to safely and efficiently operate the facilities in accordance with DOE requirements. A need has been identified for guidance regarding analysis, design, development, implementation, and evaluation of consistent and reliable performance-based training programs. Accreditation of training programs at Category A reactors and high-hazard and selected moderate-hazard nonreactor facilities will assure consistent, appropriate, and cost-effective training of personnel responsible for the operation, maintenance, and technical support of these facilities. Training programs that are designed and based on a systematic analysis of job requirements, instead of subjective estimation of trainee needs, yield training activities that are consistent and develop or improve knowledge, skills, and abilities that can be directly related to the work setting. Because the training is job-related, the content of these programs more efficiently and effectively meets the needs of the employee. Besides a better trained work force, a greater level of operational reactor safety can be realized. This manual is intended to provide an overview of the accreditation process and a brief description of the elements necessary to construct and maintain training programs that are based on the requirements of the job. Two comparison manuals provide additional information to assist contractors in their efforts to accredit training programs.

  4. Do Performance-Based Codes Support Universal Design in Architecture?

    DEFF Research Database (Denmark)

    Grangaard, Sidse; Frandsen, Anne Kathrine

    2016-01-01

    – Universal Design (UD). The empirical material consists of input from six workshops to which all 700 Danish Architectural firms were invited, as well as eight group interviews. The analysis shows that the current prescriptive requirements are criticized for being too homogenous and possibilities...... for differentiation and zoning are required. Therefore, a majority of professionals are interested in a performance-based model because they think that such a model will support ‘accessibility zoning’, achieving flexibility because of different levels of accessibility in a building due to its performance. The common...... of educational objectives is suggested as a tool for such a boost. The research project has been financed by the Danish Transport and Construction Agency....

  5. Incorporating Traffic Control and Safety Hardware Performance Functions into Risk-based Highway Safety Analysis

    Directory of Open Access Journals (Sweden)

    Zongzhi Li

    2017-04-01

    Full Text Available Traffic control and safety hardware such as traffic signs, lighting, signals, pavement markings, guardrails, barriers, and crash cushions form an important and inseparable part of highway infrastructure affecting safety performance. Significant progress has been made in recent decades to develop safety performance functions and crash modification factors for site-specific crash predictions. However, the existing models and methods lack rigorous treatments of the safety impacts of time-deteriorating conditions of traffic control and safety hardware. This study introduces a refined method for computing the Safety Index (SI) as a means of crash prediction for a highway segment that incorporates traffic control and safety hardware performance functions into the analysis. The proposed method is applied in a computational experiment using five-year data on nearly two hundred rural and urban highway segments. The root-mean-square error (RMSE), Chi-square, Spearman’s rank correlation, and Mann-Whitney U tests are employed for validation.

  6. Diagnostic performance of FDG PET or PET/CT in prosthetic infection after arthroplasty: a meta-analysis

    International Nuclear Information System (INIS)

    Jin, H.; Yuan, L.; Li, C.; Kan, Y.; Yang, J.; Hao, R.

    2014-01-01

    The purpose of this study was to systematically review and perform a meta-analysis of published data regarding the diagnostic performance of positron emission tomography (PET) or PET/computed tomography (PET/CT) in prosthetic infection after arthroplasty. A comprehensive computer literature search of studies published through May 31, 2012 regarding PET or PET/CT in patients with suspected prosthetic infection was performed in the PubMed/MEDLINE, Embase and Scopus databases. Pooled sensitivity and specificity of PET or PET/CT in patients with suspected prosthetic infection were calculated on a per-prosthesis basis. The area under the receiver-operating characteristic (ROC) curve was calculated to measure the accuracy of PET or PET/CT in patients with suspected prosthetic infection. Fourteen studies comprising 838 prostheses with suspected prosthetic infection after arthroplasty were included in this meta-analysis. The pooled sensitivity of PET or PET/CT in detecting prosthetic infection was 86% (95% confidence interval [CI] 82-90%) on a per-prosthesis basis. The pooled specificity of PET or PET/CT in detecting prosthetic infection was 86% (95% CI 83-89%) on a per-prosthesis basis. The area under the ROC curve was 0.93 on a per-prosthesis basis. In patients with suspected prosthetic infection, FDG PET or PET/CT demonstrated high sensitivity and specificity and is an accurate method in this setting. Nevertheless, possible sources of false-positive results and influencing factors should be kept in mind.

  7. Diagnostic performance of FDG PET or PET/CT in prosthetic infection after arthroplasty: a meta-analysis.

    Science.gov (United States)

    Jin, H; Yuan, L; Li, C; Kan, Y; Hao, R; Yang, J

    2014-03-01

    The purpose of this study was to systematically review and perform a meta-analysis of published data regarding the diagnostic performance of positron emission tomography (PET) or PET/computed tomography (PET/CT) in prosthetic infection after arthroplasty. A comprehensive computer literature search of studies published through May 31, 2012 regarding PET or PET/CT in patients with suspected prosthetic infection was performed in the PubMed/MEDLINE, Embase and Scopus databases. Pooled sensitivity and specificity of PET or PET/CT in patients with suspected prosthetic infection were calculated on a per-prosthesis basis. The area under the receiver-operating characteristic (ROC) curve was calculated to measure the accuracy of PET or PET/CT in patients with suspected prosthetic infection. Fourteen studies comprising 838 prostheses with suspected prosthetic infection after arthroplasty were included in this meta-analysis. The pooled sensitivity of PET or PET/CT in detecting prosthetic infection was 86% (95% confidence interval [CI] 82-90%) on a per-prosthesis basis. The pooled specificity of PET or PET/CT in detecting prosthetic infection was 86% (95% CI 83-89%) on a per-prosthesis basis. The area under the ROC curve was 0.93 on a per-prosthesis basis. In patients with suspected prosthetic infection, FDG PET or PET/CT demonstrated high sensitivity and specificity and is an accurate method in this setting. Nevertheless, possible sources of false-positive results and influencing factors should be kept in mind.
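    The abstract reports pooled sensitivity and specificity with 95% confidence intervals but does not state the pooling model used. As a hedged illustration of how such pooling is commonly done, the Python sketch below applies fixed-effect inverse-variance pooling on the logit scale to hypothetical per-study counts; the counts, the continuity correction and the fixed-effect choice are assumptions for illustration, not taken from the meta-analysis.

```python
import numpy as np

# Hypothetical per-study counts (true positives, false negatives) for sensitivity pooling;
# the actual studies and pooling model used in the meta-analysis are not given in the abstract.
tp = np.array([25, 40, 18, 33])
fn = np.array([4, 6, 2, 5])

# Logit-transform each study's sensitivity with a 0.5 continuity correction.
sens = (tp + 0.5) / (tp + fn + 1.0)
logit = np.log(sens / (1.0 - sens))
var = 1.0 / (tp + 0.5) + 1.0 / (fn + 0.5)   # approximate variance of the logit

# Fixed-effect inverse-variance pooling on the logit scale.
w = 1.0 / var
pooled_logit = np.sum(w * logit) / np.sum(w)
se = np.sqrt(1.0 / np.sum(w))
ci = pooled_logit + np.array([-1.96, 1.96]) * se

expit = lambda x: 1.0 / (1.0 + np.exp(-x))
print(f"pooled sensitivity: {expit(pooled_logit):.3f} "
      f"(95% CI {expit(ci[0]):.3f}-{expit(ci[1]):.3f})")
```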

  8. Customer Satisfaction Analysis Based on SERVQUAL Dimensions Using the Importance Performance Analysis Method (A Study of IndiHome Witel Bandung)

    Directory of Open Access Journals (Sweden)

    Irma Mardiana

    2017-07-01

    Full Text Available PT. Telkom is the largest telecommunications company in Indonesia and plays an important role in the development of national telecommunications. In 2012, PT Telkom launched IndiHome (Indonesia Digital Home), a multi-product bundle consisting of landline, internet and interactive television (USee TV) services. As the largest telecommunications company in Indonesia, PT. Telkom should provide satisfaction through service quality in accordance with customer expectations. The purpose of this study is to determine how satisfied IndiHome Witel Bandung customers are by comparing customer expectations with IndiHome Witel Bandung's performance levels based on the SERVQUAL method. Data were collected by distributing questionnaires to 100 IndiHome Witel Bandung customers with currently active (existing) subscriptions. The questionnaire data were processed with gap analysis to determine the level of customer satisfaction, and then with Importance Performance Analysis (IPA), also called Cartesian diagram analysis, to identify which attributes or dimensions IndiHome Witel Bandung should prioritize for improved performance and which attributes or dimensions the company should retain. Based on the results, the customers' appraisals of 21 performance attributes and 21 expectation attributes indicate that every attribute shows a gap between customers' perception of IndiHome's performance and their expectations. The level of conformity obtained is still below 100%, namely 79.5%, meaning that IndiHome's performance has not yet met customer expectations. Based on the Cartesian diagram of the Importance Performance Analysis (IPA), there are eight indicators and two dimensions, the reliability and responsiveness dimensions, on which IndiHome should focus for immediate improvement.
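    As a rough illustration of the gap analysis, level-of-conformity and IPA quadrant steps described above, the following Python sketch uses hypothetical mean scores for a handful of attributes; the 21 actual SERVQUAL attributes and their values are not reproduced in the abstract.

```python
import numpy as np

# Hypothetical mean scores for a few SERVQUAL attributes (1-5 Likert scale);
# the 21 actual attributes are not listed in the abstract.
performance = np.array([3.8, 3.5, 4.0, 3.2, 3.9])
expectation = np.array([4.5, 4.4, 4.3, 4.6, 4.2])

# Gap analysis: negative gaps mean performance falls short of expectations.
gap = performance - expectation

# Level of conformity (performance as a percentage of expectation).
conformity = 100.0 * performance.sum() / expectation.sum()

# Importance-Performance Analysis: quadrant I (high expectation, low performance)
# marks the attributes to prioritise for improvement.
prioritise = (expectation > expectation.mean()) & (performance < performance.mean())

print("gaps:", np.round(gap, 2))
print(f"conformity level: {conformity:.1f}%")
print("priority attributes (quadrant I):", np.where(prioritise)[0])
```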

  9. Performance Analysis of Unsupervised Clustering Methods for Brain Tumor Segmentation

    Directory of Open Access Journals (Sweden)

    Tushar H Jaware

    2013-10-01

    Full Text Available Medical image processing is one of the most challenging and emerging fields of neuroscience. The ultimate goal of medical image analysis in brain MRI is to extract important clinical features that would improve methods of diagnosis and treatment of disease. This paper focuses on methods to detect and extract brain tumours from brain MR images. MATLAB is used to design a software tool for locating brain tumours based on unsupervised clustering methods. The K-means clustering algorithm is implemented and tested on a database of 30 images. A performance evaluation of the unsupervised clustering methods is presented.
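    The record describes a MATLAB tool built on unsupervised clustering; as a language-neutral illustration of the K-means step, a minimal Python sketch (using scikit-learn and a synthetic image in place of real brain MR data) is shown below. Treating the brightest cluster as the tumour candidate is an assumption made here for illustration only.

```python
import numpy as np
from sklearn.cluster import KMeans

def segment_brain_mri(image: np.ndarray, n_clusters: int = 4) -> np.ndarray:
    """Cluster MR image intensities into tissue classes; the brightest
    cluster is taken as a crude tumour candidate (illustrative only)."""
    pixels = image.reshape(-1, 1).astype(float)
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(pixels)
    labels = km.labels_.reshape(image.shape)
    tumour_label = int(np.argmax(km.cluster_centers_))   # hyperintense cluster
    return labels == tumour_label

# Example with a synthetic image; real use would load a brain MR slice instead.
synthetic = np.random.randint(0, 255, size=(128, 128))
mask = segment_brain_mri(synthetic)
print("candidate tumour pixels:", int(mask.sum()))
```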

  10. Benchmarking energy performance of residential buildings using two-stage multifactor data envelopment analysis with degree-day based simple-normalization approach

    International Nuclear Information System (INIS)

    Wang, Endong; Shen, Zhigang; Alp, Neslihan; Barry, Nate

    2015-01-01

    Highlights: • Two-stage DEA model is developed to benchmark building energy efficiency. • Degree-day based simple normalization is used to neutralize the climatic noise. • Results of a real case study validated the benefits of this new model. - Abstract: Being able to identify detailed meta factors of energy performance is essential for creating effective residential energy-retrofitting strategies. Compared to other benchmarking methods, nonparametric multifactor DEA (data envelopment analysis) is capable of discriminating scale factors from management factors to reveal more details to better guide retrofitting practices. A two-stage DEA energy benchmarking method is proposed in this paper. This method includes (1) first-stage meta DEA which integrates the common degree day metrics for neutralizing noise energy effects of exogenous climatic variables; and (2) second-stage Tobit regression for further detailed efficiency analysis. A case study involving 3-year longitudinal panel data of 189 residential buildings indicated the proposed method has advantages over existing methods in terms of its efficiency in data processing and results interpretation. The results of the case study also demonstrated high consistency with existing linear regression based DEA.
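    The exact degree-day normalization used in the first stage is not spelled out in the abstract; the sketch below shows one common simple normalization, dividing metered energy by floor area and by the sum of heating and cooling degree-days before the DEA stage, using hypothetical building data.

```python
import numpy as np

# Hypothetical annual data for three buildings; the study's actual variables are not listed here.
energy_kwh = np.array([210_000.0, 185_000.0, 240_000.0])   # metered consumption
hdd = np.array([3100.0, 2950.0, 3300.0])                    # heating degree-days
cdd = np.array([450.0, 520.0, 400.0])                       # cooling degree-days
floor_area_m2 = np.array([4200.0, 3900.0, 5100.0])

# Simple degree-day normalization: strip the climatic "noise" before the DEA stage
# by expressing use per unit area and per degree-day of weather exposure.
normalized = energy_kwh / (floor_area_m2 * (hdd + cdd))
print("climate-normalized energy use intensity:", np.round(normalized, 4))
```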

  11. Thermodynamic performance analysis of ramjet engine at wide working conditions

    Science.gov (United States)

    Ou, Min; Yan, Li; Tang, Jing-feng; Huang, Wei; Chen, Xiao-qian

    2017-03-01

    Although ramjet has the advantages of high-speed flying and higher specific impulse, the performance parameters will decline seriously with the increase of flight Mach number and flight height. Therefore, the investigation on the thermodynamic performance of ramjet is very crucial for broadening the working range. In the current study, a typical ramjet model has been employed to investigate the performance characteristics at wide working conditions. First of all, the compression characteristic analysis is carried out based on the Brayton cycle. The obtained results show that the specific cross-section area (A2 and A5) and the air-fuel ratio (f) have a great influence on the ramjet performance indexes. Secondly, the thermodynamic calculation process of ramjet is given from the view of the pneumatic thermal analysis. Then, the variable trends of the ramjet performance indexes with the flow conditions, the air-fuel ratio (f), the specific cross-sectional area (A2 and A5) under the fixed operating condition, equipotential dynamic pressure condition and variable dynamic pressure condition have been discussed. Finally, the optimum value of the specific cross-sectional area (A5) and the air-fuel ratio (f) of the ramjet model at a fixed work condition (Ma=3.5, H=12 km) are obtained.
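    As a worked illustration of the Brayton-cycle basis of the compression characteristic analysis, the sketch below evaluates the ideal ram-compression stagnation temperature ratio and the corresponding ideal thermal efficiency for a calorically perfect gas; the paper's full model additionally accounts for the cross-sectional areas (A2, A5) and the air-fuel ratio (f), which this simplification ignores.

```python
def ideal_ramjet_thermal_efficiency(mach: float, gamma: float = 1.4) -> float:
    """Ideal Brayton-cycle efficiency with pure ram compression:
    eta = 1 - T0/Tt0, where Tt0/T0 = 1 + (gamma - 1)/2 * M^2."""
    stagnation_ratio = 1.0 + 0.5 * (gamma - 1.0) * mach ** 2
    return 1.0 - 1.0 / stagnation_ratio

for mach in (2.0, 3.5, 5.0):
    eta = ideal_ramjet_thermal_efficiency(mach)
    print(f"Ma = {mach}: ideal thermal efficiency = {eta:.3f}")
```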

  12. Global sensitivity analysis using emulators, with an example analysis of large fire plumes based on FDS simulations

    Energy Technology Data Exchange (ETDEWEB)

    Kelsey, Adrian [Health and Safety Laboratory, Harpur Hill, Buxton (United Kingdom)

    2015-12-15

    Uncertainty in model predictions of the behaviour of fires is an important issue in fire safety analysis in nuclear power plants. A global sensitivity analysis can help identify the input parameters or sub-models that have the most significant effect on model predictions. However, to perform a global sensitivity analysis using Monte Carlo sampling might require thousands of simulations to be performed and therefore would not be practical for an analysis based on a complex fire code using computational fluid dynamics (CFD). An alternative approach is to perform a global sensitivity analysis using an emulator. Gaussian process emulators can be built using a limited number of simulations and once built a global sensitivity analysis can be performed on an emulator, rather than using simulations directly. Typically reliable emulators can be built using ten simulations for each parameter under consideration, therefore allowing a global sensitivity analysis to be performed, even for a complex computer code. In this paper we use an example of a large scale pool fire to demonstrate an emulator based approach to global sensitivity analysis. In that work an emulator based global sensitivity analysis was used to identify the key uncertain model inputs affecting the entrainment rates and flame heights in large Liquefied Natural Gas (LNG) fire plumes. The pool fire simulations were performed using the Fire Dynamics Simulator (FDS) software. Five model inputs were varied: the fire diameter, burn rate, radiative fraction, computational grid cell size and choice of turbulence model. The ranges used for these parameters in the analysis were determined from experiment and literature. The Gaussian process emulators used in the analysis were created using 127 FDS simulations. The emulators were checked for reliability, and then used to perform a global sensitivity analysis and uncertainty analysis. Large-scale ignited releases of LNG on water were performed by Sandia National
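    A minimal sketch of the emulator-based workflow is given below, assuming a cheap stand-in function instead of FDS runs and a crude binning estimator of first-order sensitivity; the study itself used 127 FDS simulations, five inputs, and a full Gaussian process sensitivity and uncertainty analysis, so this is illustrative only.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(0)

# Stand-in for expensive CFD runs: a cheap test function of two normalised inputs
# (e.g. fire diameter and burn rate). Real use would read FDS results instead.
def simulator(x):
    return np.sin(3 * x[:, 0]) + 0.3 * x[:, 1] ** 2

# Roughly ten training runs per input parameter, as suggested in the abstract.
X_train = rng.uniform(0, 1, size=(20, 2))
y_train = simulator(X_train)

gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF([0.2, 0.2]),
                              normalize_y=True).fit(X_train, y_train)

# Crude first-order sensitivity: variance of conditional means estimated
# by binning a large Monte Carlo sample of emulator predictions.
X_mc = rng.uniform(0, 1, size=(20_000, 2))
y_mc = gp.predict(X_mc)
total_var = y_mc.var()
for j in range(X_mc.shape[1]):
    edges = np.quantile(X_mc[:, j], np.linspace(0, 1, 21))
    idx = np.clip(np.digitize(X_mc[:, j], edges) - 1, 0, 19)
    cond_means = np.array([y_mc[idx == b].mean() for b in range(20)])
    print(f"input {j}: first-order sensitivity ~ {cond_means.var() / total_var:.2f}")
```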

  13. Sprinting performance and resistance based training interventions: A systematic review with meta-analysis

    OpenAIRE

    Bolger, Richard; Kenny, Ian; Lyons, Mark; Harrison, Andrew J.

    2014-01-01

    peer-reviewed Introduction Much of the research which focuses on improving sprinting performance has been carried out with team sport athletes or endurance athletes (Berryman, Maurel, & Bosquet, 2010; Esteve-Lanao, Rhea, Fleck, & Lucia, 2008; Hanon, Bernard, Rabate, & Claire, 2012; Rhea, Kenn, & Dermody, 2009; Shalfawi, Haugen, Jakobsen, Enoksen, & Tønnessen, 2013; West et al., 2013). There is little consensus with the prescription of resistance based training within this body of resea...

  14. The Effects of Performance-Based Assessment Criteria on Student Performance and Self-Assessment Skills

    Science.gov (United States)

    Fastre, Greet Mia Jos; van der Klink, Marcel R.; van Merrienboer, Jeroen J. G.

    2010-01-01

    This study investigated the effect of performance-based versus competence-based assessment criteria on task performance and self-assessment skills among 39 novice secondary vocational education students in the domain of nursing and care. In a performance-based assessment group students are provided with a preset list of performance-based…

  15. Multi-master profibus dp modelling and worst case analysis-based evaluation

    OpenAIRE

    Salvatore Monforte; Eduardo Tovar; Francisco Vasques; Salvatore Cavalieri

    2002-01-01

    This paper provides an analysis of the real-time behaviour of the multi-master Profibus DP network. The analysis is based on the evaluation of the worst-case message response time and the results obtained are compared with those present in literature, pointing out its capability to perform a more accurate evaluation of the performance of the Profibus network. Copyright © 2002 IFAC.

  16. Energy Saving Performance Analysis of An Inverter-based Regenerative Power Re-utilization Device for Urban Rail Transit

    Science.gov (United States)

    Li, Jin; Qiu, Zhiling; Hu, Leilei

    2018-04-01

    Inverter-based regenerative braking power utilization devices can re-utilize regenerative energy and thus reduce the energy consumption of urban rail transit. In this paper the power absorption principle of the inverter-based device is introduced, and the key factors influencing its energy-saving performance are analyzed based on the absorption model. Field operation data verified that the control DC voltage plays an important role and that a lower control DC voltage yields more energy saving. One year of energy-saving performance data from an inverter-based re-utilization device on the Nanjing S8 line is also provided; more than 1.2 million kWh of energy was recovered over the one-year operation.

  17. Comparative performance analysis of Thyristor and IGBT based ...

    African Journals Online (AJOL)

    The paper systematically investigates and compares the characteristics of a variable voltage fed induction motor drive for two different types of soft starters; one based on IGBT and another based on Thyristor. Experimental validation is done using ... Keywords: Variable speed electric drives, Soft starter, Thyristor, IGBT ...

  18. Evaluation of Road Performance Based on International Roughness Index and Falling Weight Deflectometer

    Science.gov (United States)

    Hasanuddin; Setyawan, A.; Yulianto, B.

    2018-03-01

    Assessment of road pavement performance is necessary to improve the quality of road maintenance and rehabilitation management. This research evaluates a road both functionally and structurally and recommends appropriate treatments. The functional evaluation of the pavement is based on the IRI (International Roughness Index), derived among others from NAASRA roughness readings, which is analyzed to recommend road treatments. The structural evaluation of the pavement is done by analyzing deflection values from FWD (Falling Weight Deflectometer) data, which yield SN (Structural Number) values. The analysis produces SNeff (effective Structural Number) and SNf (future Structural Number) values; comparing SNeff with SNf leads to the SCI (Structural Condition Index) value, which indicates the recommended pavement treatment. The study of the Simpang Tuan-Batas Kota Jambi road, based on the functional analysis, divided the road into 12 segments, of which segments 1, 3, 5, 7, 9, and 11 require regular maintenance, segments 2, 4, 8, 10, and 12 require periodic maintenance, and segment 6 requires rehabilitation. The structural analysis resulted in 8 segments, of which segments 1 and 2 are recommended for regular maintenance, segments 3, 4, 5, and 7 for functional overlay, and segments 6 and 8 for structural overlay.
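    A small sketch of the SCI step described above: SCI is taken as the ratio of SNeff to SNf, and the treatment thresholds used here are illustrative assumptions, not the study's actual decision criteria.

```python
def structural_condition_index(sn_eff: float, sn_f: float) -> float:
    """SCI is the ratio of the effective Structural Number (from FWD
    deflections) to the Structural Number required for future traffic."""
    return sn_eff / sn_f

def recommend_treatment(sci: float) -> str:
    # Illustrative thresholds only; the study's actual decision criteria
    # are not reproduced in the abstract.
    if sci >= 1.0:
        return "routine maintenance"
    if sci >= 0.8:
        return "functional overlay"
    return "structural overlay"

for sn_eff, sn_f in [(4.2, 3.9), (3.1, 3.6), (2.4, 3.5)]:
    sci = structural_condition_index(sn_eff, sn_f)
    print(f"SNeff={sn_eff}, SNf={sn_f}: SCI={sci:.2f} -> {recommend_treatment(sci)}")
```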

  19. Comparative Analysis of Performance Measures for Network Screening: A Case Study of Brazilian Urban Areas

    Directory of Open Access Journals (Sweden)

    Vanessa Jamille Mesquita Xavier

    2017-01-01

    Full Text Available The overall effectiveness of the roadway safety management process relies on a robust method for identifying and ranking sites with major potential for safety improvements. In Brazil, guidelines for hotspot identification are usually based only on crash frequency and Crash Rate as safety performance measures. This study presents a comparative analysis of safety performance measures, considering their limitations of applicability, for a sample of signalized intersections from the city of Fortaleza, Brazil. The performance of each measure in ranking the sample intersections was obtained through the rank difference between each safety performance measure and the Excess Expected Average Crash Frequency with EB Adjustment (EEB). In addition, a temporal analysis was conducted based on the consistency of the safety performance measures during subsequent time periods. The results suggest a reasonable match between the most comprehensive safety performance measure (EEB) and very simple safety performance measures such as crash frequency and Crash Rate. It is recommended to investigate the consistency of the results for longer observation periods as well as for different jurisdictions in Brazil.

  20. Data-base tools for enhanced analysis of TMX-U data

    International Nuclear Information System (INIS)

    Stewart, M.E.; Carter, M.R.; Casper, T.A.; Meyer, W.H.; Perkins, D.E.; Whitney, D.M.

    1986-01-01

    The authors use a commercial data-base software package to create several data-base products that enhance the ability of experimental physicists to analyze data from the TMX-U experiment. This software resides on a DEC-20 computer in M-Division's user service center (USC), where data can be analyzed separately from the main acquisition computers. When these data-base tools are combined with interactive data analysis programs, physicists can perform automated (batch-style) processing or interactive data analysis on the computers in the USC or on the supercomputers of the NMFECC, in addition to the normal processing done on the acquisition system. One data-base tool provides highly reduced data for searching and correlation analysis of several diagnostic signals for a single shot or many shots. A second data-base tool provides retrieval and storage of unreduced data for detailed analysis of one or more diagnostic signals. The authors report how these data-base tools form the core of an evolving off-line data-analysis environment on the USC computers.

  1. Performance Analysis of Receive Diversity in Wireless Sensor Networks over GBSBE Models

    Directory of Open Access Journals (Sweden)

    Tai-hoon Kim

    2010-12-01

    Full Text Available Wireless sensor networks have attracted a lot of attention recently. In this paper, we develop a channel model based on the elliptical model for multipath components involving randomly placed scatterers in the scattering region, with sensors deployed on a field. We verify that in a sensor network the use of receive diversity techniques improves the performance of the system. Extensive performance analysis of the system is carried out for both single and multiple antennas with the applied receive diversity techniques. Performance analyses based on variations in receiver height, maximum multipath delay and transmit power have been performed considering different numbers of antenna elements present in the receiver array. Our results show that increasing the number of antenna elements in a wireless sensor network does indeed improve the achievable BER.

  2. Verification and Performance Analysis for Embedded Systems

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand

    2009-01-01

    This talk provides a thorough tutorial of the UPPAAL tool suite for modeling, simulation, verification, optimal scheduling, synthesis, testing and performance analysis of embedded and real-time systems.

  3. Energy-Saving Performance of Flap-Adjustment-Based Centrifugal Fan

    Directory of Open Access Journals (Sweden)

    Genglin Chen

    2018-01-01

    Full Text Available This paper focuses on finding a more appropriate way to enhance fan performance at off-design conditions. A centrifugal fan (CF) based on flap adjustment (FA) has been investigated through theoretical, experimental, and finite element methods. To determine which adjustment yields the better CF performance, we carried out a comparative analysis of FA and leading-adjustment (LA) in terms of aerodynamic performance, including the adjusted blade angle, total pressure, efficiency, system efficiency, adjustment efficiency, and energy-saving rate. The contribution of this paper is the integrated performance curve of the CF. The results showed that the effects of FA and LA on the economic performance and energy savings of the fan varied with the blade angle. FA proved feasible and is more sensitive than LA. Moreover, the CF with FA offered a wider flow range of high economic performance in comparison with LA. Finally, as the operating flow range extends, the energy-saving rate of the fan with FA improves.

  4. OpenMSI: A High-Performance Web-Based Platform for Mass Spectrometry Imaging

    Energy Technology Data Exchange (ETDEWEB)

    Rubel, Oliver; Greiner, Annette; Cholia, Shreyas; Louie, Katherine; Bethel, E. Wes; Northen, Trent R.; Bowen, Benjamin P.

    2013-10-02

    Mass spectrometry imaging (MSI) enables researchers to directly probe endogenous molecules within the architecture of the biological matrix. Unfortunately, efficient access, management, and analysis of the data generated by MSI approaches remain major challenges to this rapidly developing field. Despite the availability of numerous dedicated file formats and software packages, it is a widely held viewpoint that the biggest challenge is simply opening, sharing, and analyzing a file without loss of information. Here we present OpenMSI, a software framework and platform that addresses these challenges via an advanced, high-performance, extensible file format and Web API for remote data access (http://openmsi.nersc.gov). The OpenMSI file format supports storage of raw MSI data, metadata, and derived analyses in a single, self-describing format based on HDF5 and is supported by a large range of analysis software (e.g., Matlab and R) and programming languages (e.g., C++, Fortran, and Python). Careful optimization of the storage layout of MSI data sets using chunking, compression, and data replication accelerates common, selective data access operations while minimizing data storage requirements, and is a critical enabler of rapid data I/O. The OpenMSI file format has been shown to provide a >2000-fold improvement for image access operations, enabling spectrum and image retrieval in less than 0.3 s across the Internet even for 50 GB MSI data sets. To make remote high-performance compute resources accessible for analysis and to facilitate data sharing and collaboration, we describe an easy-to-use yet powerful Web API, enabling fast and convenient access to MSI data, metadata, and derived analysis results stored remotely, to facilitate high-performance data analysis and enable implementation of Web-based data sharing, visualization, and analysis.
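    The following sketch is not the actual OpenMSI storage layout, but illustrates the underlying HDF5 mechanisms named in the abstract, chunking and compression, using h5py on a toy MSI cube; the chunk shape, compression level and file name are assumptions made here.

```python
import numpy as np
import h5py

# Synthetic MSI cube: 100 x 100 pixels, 2000 m/z bins (real data sets are far larger).
data = np.random.rand(100, 100, 2000).astype("float32")

with h5py.File("msi_example.h5", "w") as f:
    # Chunk along small spatial tiles and the full m/z axis so that a single-spectrum
    # read touches one chunk; OpenMSI additionally replicates data with complementary
    # layouts to speed up ion-image reads as well.
    dset = f.create_dataset("msi/data", data=data,
                            chunks=(10, 10, 2000),
                            compression="gzip", compression_opts=4)
    dset.attrs["description"] = "toy MSI cube (x, y, m/z)"

with h5py.File("msi_example.h5", "r") as f:
    spectrum = f["msi/data"][5, 7, :]      # one pixel's spectrum
    ion_image = f["msi/data"][:, :, 1250]  # one m/z slice as an image
    print(spectrum.shape, ion_image.shape)
```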

  5. Modeling and performance analysis of CCHP (combined cooling, heating and power) system based on co-firing of natural gas and biomass gasification gas

    International Nuclear Information System (INIS)

    Wang, Jiangjiang; Mao, Tianzhi; Sui, Jun; Jin, Hongguang

    2015-01-01

    Co-firing biomass and fossil energy is a cost-effective and reliable way to use renewable energy and offers advantages in flexibility, conversion efficiency and commercial viability. This study proposes a co-fired CCHP (combined cooling, heating and power) system based on natural gas and biomass gasification gas that contains a down-draft gasifier, an ICE (internal combustion engine), an absorption chiller and heat exchangers. Thermodynamic models are constructed based on a modified gasification thermochemical equilibrium model and a co-fired ICE model for electricity and heat recovery. The performance analysis indicates that the energy and exergy efficiencies are improved by 9.5% and 13.7%, respectively, as the volumetric mixture ratio of natural gas and product gas increases from 0 to 1.0. Furthermore, the costs of the multiple products, including electricity, chilled water and hot water, are analyzed by exergoeconomic analysis and discussed with respect to the influences of the mixture ratio of the two gas fuels, the investment cost and the biomass cost. - Highlights: • Propose a co-fired CCHP system using natural gas and biomass gasification gas. • Modify biomass gasification and co-fired ICE models. • Present the thermodynamic analysis of the volumetric mixture ratios of the two gas fuels. • Energy and exergy efficiencies are improved by 9.5% and 13.7%. • Discuss the multi-product costs as influenced by investment and fuel costs.

  6. Examination of the Properties of a Spent Fuel based Electricity Generation System - Scintillator Performance Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Haneol; Yim, Man-Sung [KAIST, Daejeon (Korea, Republic of)

    2016-05-15

    The gammavoltaic concept was proposed by Karl Scharf in 1960. Its low efficiency meant that it was mainly used as a radiation detector. In the 1990s the efficiency of gammavoltaic devices increased through the use of a scintillator, and in the 2000s gammavoltaics were further studied as a power source for spent fuel transportation and as a nuclear battery. Haneol Lee and Man-Sung Yim also suggested an electricity generation system based on spent fuel stored inside the spent fuel pool of a nuclear power plant. That study proposed the systematic design of an electricity conversion system using a CsI(Tl) scintillator and an a-Si photovoltaic cell, and it is used here as the reference design. The results indicate a self-absorption effect in the reference model; this effect is negligible, whereas irradiation-induced degradation has to be considered. Two main ways to reduce radiation-induced degradation are scintillator shielding and replacing the scintillator material with one having higher radiation resistance. The analysis of the scintillator used in the 'electricity generation system using gamma radiation from spent fuel' was performed to evaluate the ideal electricity generation of the reference design.

  7. Examination of the Properties of a Spent Fuel based Electricity Generation System - Scintillator Performance Analysis

    International Nuclear Information System (INIS)

    Lee, Haneol; Yim, Man-Sung

    2016-01-01

    The gammavoltaic concept was proposed by Karl Scharf in 1960. Its low efficiency meant that it was mainly used as a radiation detector. In the 1990s the efficiency of gammavoltaic devices increased through the use of a scintillator, and in the 2000s gammavoltaics were further studied as a power source for spent fuel transportation and as a nuclear battery. Haneol Lee and Man-Sung Yim also suggested an electricity generation system based on spent fuel stored inside the spent fuel pool of a nuclear power plant. That study proposed the systematic design of an electricity conversion system using a CsI(Tl) scintillator and an a-Si photovoltaic cell, and it is used here as the reference design. The results indicate a self-absorption effect in the reference model; this effect is negligible, whereas irradiation-induced degradation has to be considered. Two main ways to reduce radiation-induced degradation are scintillator shielding and replacing the scintillator material with one having higher radiation resistance. The analysis of the scintillator used in the 'electricity generation system using gamma radiation from spent fuel' was performed to evaluate the ideal electricity generation of the reference design.

  8. EBprot: Statistical analysis of labeling-based quantitative proteomics data.

    Science.gov (United States)

    Koh, Hiromi W L; Swa, Hannah L F; Fermin, Damian; Ler, Siok Ghee; Gunaratne, Jayantha; Choi, Hyungwon

    2015-08-01

    Labeling-based proteomics is a powerful method for detection of differentially expressed proteins (DEPs). The current data analysis platform typically relies on protein-level ratios, which are obtained by summarizing peptide-level ratios for each protein. In shotgun proteomics, however, some proteins are quantified with more peptides than others, and this reproducibility information is not incorporated into the differential expression (DE) analysis. Here, we propose a novel probabilistic framework, EBprot, that directly models the peptide-protein hierarchy and rewards proteins with reproducible evidence of DE over multiple peptides. To evaluate its performance with known DE states, we conducted a simulation study showing that the peptide-level analysis of EBprot provides better receiver-operating characteristics and more accurate estimation of the false discovery rates than methods based on protein-level ratios. We also demonstrate the superior classification performance of peptide-level EBprot analysis in a spike-in dataset. To illustrate the wide applicability of EBprot in different experimental designs, we applied EBprot to a dataset for lung cancer subtype analysis with biological replicates and another dataset for time-course phosphoproteome analysis of EGF-stimulated HeLa cells with multiplexed labeling. Through these examples, we show that the peptide-level analysis of EBprot is a robust alternative to the existing statistical methods for the DE analysis of labeling-based quantitative datasets. The software suite is freely available on the Sourceforge website http://ebprot.sourceforge.net/. All MS data have been deposited in ProteomeXchange with identifier PXD001426 (http://proteomecentral.proteomexchange.org/dataset/PXD001426/). © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  9. Performance analysis of NOAA tropospheric signal delay model

    International Nuclear Information System (INIS)

    Ibrahim, Hassan E; El-Rabbany, Ahmed

    2011-01-01

    Tropospheric delay is one of the dominant global positioning system (GPS) errors, which degrades the positioning accuracy. Recent development in tropospheric modeling relies on implementation of more accurate numerical weather prediction (NWP) models. In North America one of the NWP-based tropospheric correction models is the NOAA Tropospheric Signal Delay Model (NOAATrop), which was developed by the US National Oceanic and Atmospheric Administration (NOAA). Because of its potential to improve the GPS positioning accuracy, the NOAATrop model became the focus of many researchers. In this paper, we analyzed the performance of the NOAATrop model and examined its effect on ionosphere-free-based precise point positioning (PPP) solution. We generated 3 year long tropospheric zenith total delay (ZTD) data series for the NOAATrop model, Hopfield model, and the International GNSS Services (IGS) final tropospheric correction product, respectively. These data sets were generated at ten IGS reference stations spanning Canada and the United States. We analyzed the NOAATrop ZTD data series and compared them with those of the Hopfield model. The IGS final tropospheric product was used as a reference. The analysis shows that the performance of the NOAATrop model is a function of both season (time of the year) and geographical location. However, its performance was superior to the Hopfield model in all cases. We further investigated the effect of implementing the NOAATrop model on the ionosphere-free-based PPP solution convergence and accuracy. It is shown that the use of the NOAATrop model improved the PPP solution convergence by 1%, 10% and 15% for the latitude, longitude and height components, respectively

  10. Visual Analysis of Cloud Computing Performance Using Behavioral Lines.

    Science.gov (United States)

    Muelder, Chris; Zhu, Biao; Chen, Wei; Zhang, Hongxin; Ma, Kwan-Liu

    2016-02-29

    Cloud computing is an essential technology to Big Data analytics and services. A cloud computing system is often comprised of a large number of parallel computing and storage devices. Monitoring the usage and performance of such a system is important for efficient operations, maintenance, and security. Tracing every application on a large cloud system is untenable due to scale and privacy issues. But profile data can be collected relatively efficiently by regularly sampling the state of the system, including properties such as CPU load, memory usage, network usage, and others, creating a set of multivariate time series for each system. Adequate tools for studying such large-scale, multidimensional data are lacking. In this paper, we present a visual based analysis approach to understanding and analyzing the performance and behavior of cloud computing systems. Our design is based on similarity measures and a layout method to portray the behavior of each compute node over time. When visualizing a large number of behavioral lines together, distinct patterns often appear suggesting particular types of performance bottleneck. The resulting system provides multiple linked views, which allow the user to interactively explore the data by examining the data or a selected subset at different levels of detail. Our case studies, which use datasets collected from two different cloud systems, show that this visual based approach is effective in identifying trends and anomalies of the systems.

  11. Performance Analysis of Waste Heat Driven Pressurized Adsorption Chiller

    KAUST Repository

    LOH, Wai Soong

    2010-01-01

    This article presents the transient modeling and performance of waste heat driven pressurized adsorption chillers for refrigeration in subzero applications. This innovative adsorption chiller employs pitch-based activated carbon of type Maxsorb III (adsorbent) with refrigerant R134a as the adsorbent-adsorbate pair. It consists of an evaporator, a condenser and two adsorber/desorber beds, and it utilizes a low-grade heat source to power the batch-operated cycle. The heat source temperatures range from 55 to 90°C, whilst the cooling water temperature needed to reject heat is 30°C. A parametric analysis is presented in which the effects of inlet temperature, adsorption/desorption cycle time and switching time on the system performance are reported in terms of cooling capacity and coefficient of performance. © 2010 by JSME.

  12. An analysis of HANARO operating performance of the year 2001

    International Nuclear Information System (INIS)

    Yoon, D. B.; Choi, H. Y.; Lim, I. C.; Hwang, S. Y.

    2002-01-01

    For the evaluation of the operating performance of HANARO, operation data from the year 2001 were analyzed. Power output and the delay times for full-power arrival and shutdown were considered as representative measures of operating performance. The analysis results show that the total thermal power output was 3770 MWD, the best record since the startup of HANARO. The mean delay times for full-power arrival and shutdown were 3.56 hours and 2.49 hours, respectively. The major causes of these delays were found to be delays in fuel inspection and unscheduled maintenance and experimental work. In order to enhance the operating performance, based on the analysis results, a biweekly plan of work and experiments will be prepared in advance. The reactor start time has also been moved up by one hour so that full power is reached before the scheduled time. In addition, we will make an effort to reduce the number of fuels that have to be inspected.

  13. Performance Improvement of Power Analysis Attacks on AES with Encryption-Related Signals

    Science.gov (United States)

    Lee, You-Seok; Lee, Young-Jun; Han, Dong-Guk; Kim, Ho-Won; Kim, Hyoung-Nam

    A power analysis attack is a well-known side-channel attack, but its efficiency is frequently degraded by power components unrelated to the encryption that are included in the signals used for the attack. To enhance the performance of the power analysis attack, we propose a preprocessing method based on extracting the encryption-related parts of the measured power signals. Experimental results show that attacks with the preprocessed signals detect correct keys with far fewer signals compared to conventional power analysis attacks.
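    The abstract does not detail how the encryption-related parts are extracted, so the sketch below shows one generic preprocessing idea under that caveat: keep only the time samples whose power varies most across traces, on the assumption that data-dependent (encryption-related) activity is what drives that variation. This is an illustrative stand-in, not the authors' method.

```python
import numpy as np

def extract_encryption_related(traces: np.ndarray, keep_ratio: float = 0.2) -> np.ndarray:
    """Keep the time samples with the largest variance across traces.
    Samples whose power hardly changes between encryptions carry little
    key-dependent information and mostly add noise to the attack."""
    variance = traces.var(axis=0)
    n_keep = max(1, int(keep_ratio * traces.shape[1]))
    selected = np.sort(np.argsort(variance)[-n_keep:])
    return traces[:, selected]

# Toy example: 1000 traces of 5000 samples each (real traces would come from a scope).
traces = np.random.normal(size=(1000, 5000))
reduced = extract_encryption_related(traces)
print("reduced trace length:", reduced.shape[1])
```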

  14. Research on the comparison of performance-based concept and force-based concept

    Science.gov (United States)

    Wu, Zeyu; Wang, Dongwei

    2011-03-01

    There are two philosophies of structural design: the force-based concept and the performance-based concept. Generally, if the structure operates within the elastic stage, the two philosophies attain the same results. Beyond that stage, however, the shortcomings of the force-based method are exposed and the merits of the performance-based method are displayed. The pros and cons of each strategy are listed herein, and the structures best suited to each method are analyzed. Finally, a real structure is evaluated with an adaptive pushover method to verify that the performance-based method is better than the force-based method.

  15. Comparative study on DuPont analysis and DEA models for measuring stock performance using financial ratio

    Science.gov (United States)

    Arsad, Roslah; Shaari, Siti Nabilah Mohd; Isa, Zaidi

    2017-11-01

    Determining stock performance using financial ratios is challenging for many investors and researchers. Financial ratios can indicate the strengths and weaknesses of a company's stock performance. There are five categories of financial ratios, namely liquidity, efficiency, leverage, profitability and market ratios. It is important to interpret the ratios correctly for proper financial decision making. The purpose of this study is to compare the performance of listed companies in Bursa Malaysia using Data Envelopment Analysis (DEA) and DuPont analysis models. The study was conducted in 2015 and involves 116 consumer products companies listed in Bursa Malaysia. The Data Envelopment Analysis estimation method computes efficiency scores and ranks the companies accordingly. Alirezaee and Afsharian's method of analysis, based on the Charnes, Cooper and Rhodes (CCR) model with Constant Returns to Scale (CRS), is employed. DuPont analysis is a traditional tool for measuring the operating performance of companies. In this study, DuPont analysis is used to evaluate three different aspects: profitability, efficiency of asset utilization and financial leverage. Return on Equity (ROE) is also calculated in the DuPont analysis. This study finds that the two analysis models provide different rankings of the selected samples. Hypothesis testing based on Pearson's correlation indicates that there is no correlation between the rankings produced by DEA and DuPont analysis. The DEA ranking model proposed by Alirezaee and Afsharian is unstable; the method cannot provide a complete ranking because the Balance Index values are equal and zero.
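    As a worked example of the DuPont side of the comparison, the sketch below computes the three-factor decomposition ROE = net profit margin × asset turnover × equity multiplier for hypothetical figures; the listed companies' actual financials are not reproduced here.

```python
def dupont_roe(net_income: float, revenue: float, total_assets: float, equity: float):
    """Three-factor DuPont decomposition:
    ROE = net profit margin * asset turnover * equity multiplier."""
    net_margin = net_income / revenue          # profitability
    asset_turnover = revenue / total_assets    # efficiency of asset utilization
    equity_multiplier = total_assets / equity  # financial leverage
    roe = net_margin * asset_turnover * equity_multiplier
    return net_margin, asset_turnover, equity_multiplier, roe

# Hypothetical figures (in millions) for a single listed company.
margin, turnover, leverage, roe = dupont_roe(net_income=120, revenue=1500,
                                             total_assets=2000, equity=800)
print(f"margin={margin:.3f}, turnover={turnover:.3f}, "
      f"leverage={leverage:.3f}, ROE={roe:.3f}")
```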

  16. Gait Correlation Analysis Based Human Identification

    Directory of Open Access Journals (Sweden)

    Jinyan Chen

    2014-01-01

    Full Text Available Human gait identification aims to identify people from a sequence of walking images. Compared with fingerprint- or iris-based identification, the most important advantage of gait identification is that it can be done at a distance. In this paper, a silhouette correlation analysis based human identification approach is proposed. Using a background subtraction algorithm, the moving silhouette can be extracted from the walking image sequence. Every pixel in the silhouette has three dimensions: the horizontal axis (x), the vertical axis (y), and the temporal axis (t). By moving every pixel in the silhouette image along these three dimensions, we obtain a new silhouette. The correlation between the original silhouette and the new one can be used as the raw feature of human gait. The discrete Fourier transform is used to extract features from this correlation result. These features are then normalized to minimize the effect of noise, and principal component analysis is used to reduce the feature dimensions. An experiment based on the CASIA database shows that this method has an encouraging recognition performance.
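    A simplified re-interpretation of the pipeline described above (correlation of the silhouette sequence with a shifted copy, DFT-based features, normalisation) is sketched below for a shift along the temporal axis only; the paper shifts along all three axes and applies PCA afterwards, so this is an illustrative sketch rather than the authors' exact method.

```python
import numpy as np
from numpy.fft import fft

def gait_features(silhouettes: np.ndarray, shift: int = 4, n_coeffs: int = 16) -> np.ndarray:
    """Correlate each binary silhouette with a copy shifted along the
    temporal axis, take the magnitude of the first DFT coefficients of the
    per-frame correlation signal, and normalise the result."""
    shifted = np.roll(silhouettes, shift, axis=0)                  # shift along t
    corr = (silhouettes * shifted).sum(axis=(1, 2)).astype(float)  # per-frame overlap
    spectrum = np.abs(fft(corr))[:n_coeffs]
    return spectrum / (np.linalg.norm(spectrum) + 1e-12)           # noise-robust scaling

# Toy sequence of 64 binary silhouettes of size 64 x 48.
seq = (np.random.rand(64, 64, 48) > 0.7).astype(np.uint8)
features = gait_features(seq)
print("feature vector length:", features.shape[0])
# A real pipeline would stack such vectors for many subjects and apply PCA
# (e.g. sklearn.decomposition.PCA) before nearest-neighbour matching.
```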

  17. Performance-based shape optimization of continuum structures

    International Nuclear Information System (INIS)

    Liang Qingquan

    2010-01-01

    This paper presents a performance-based optimization (PBO) method for optimal shape design of continuum structures with stiffness constraints. Performance-based design concepts are incorporated in the shape optimization theory to achieve optimal designs. In the PBO method, the traditional shape optimization problem of minimizing the weight of a continuum structure with displacement or mean compliance constraints is transformed to the problem of maximizing the performance of the structure. The optimal shape of a continuum structure is obtained by gradually eliminating inefficient finite elements from the structure until its performance is maximized. Performance indices are employed to monitor the performance of optimized shapes in an optimization process. Performance-based optimality criteria are incorporated in the PBO method to identify the optimum from the optimization process. The PBO method is used to produce optimal shapes of plane stress continuum structures and plates in bending. Benchmark numerical results are provided to demonstrate the effectiveness of the PBO method for generating the maximum stiffness shape design of continuum structures. It is shown that the PBO method developed overcomes the limitations of traditional shape optimization methods in optimal design of continuum structures. Performance-based optimality criteria presented can be incorporated in any shape and topology optimization methods to obtain optimal designs of continuum structures.

  18. THE PERFORMANCE ANALYSIS OF A UAV BASED MOBILE MAPPING SYSTEM PLATFORM

    Directory of Open Access Journals (Sweden)

    M. L. Tsai

    2013-08-01

    Full Text Available To facilitate applications such as environment detection or disaster monitoring, the development of rapid low cost systems for collecting near real-time spatial information is very critical. Rapid spatial information collection has become an emerging trend for remote sensing and mapping applications. This study develops a Direct Georeferencing (DG) based fixed-wing Unmanned Aerial Vehicle (UAV) photogrammetric platform where an Inertial Navigation System (INS)/Global Positioning System (GPS) integrated Positioning and Orientation System (POS) system is implemented to provide the DG capability of the platform. The performance verification indicates that the proposed platform can capture aerial images successfully. A flight test is performed to verify the positioning accuracy in DG mode without using Ground Control Points (GCP). The preliminary results illustrate that horizontal DG positioning accuracies in the x and y axes are around 5 m with 300 m flight height. The positioning accuracy in the z axis is less than 10 m. Such accuracy is good for near real-time disaster relief. The DG ready function of proposed platform guarantees mapping and positioning capability even in GCP free environments, which is very important for rapid urgent response for disaster relief. Generally speaking, the data processing time for the DG module, including POS solution generalization, interpolation, Exterior Orientation Parameters (EOP) generation, and feature point measurements, is less than one hour.

  19. The Performance Analysis of a Uav Based Mobile Mapping System Platform

    Science.gov (United States)

    Tsai, M. L.; Chiang, K. W.; Lo, C. F.; Ch, C. H.

    2013-08-01

    To facilitate applications such as environment detection or disaster monitoring, the development of rapid low cost systems for collecting near real-time spatial information is very critical. Rapid spatial information collection has become an emerging trend for remote sensing and mapping applications. This study develops a Direct Georeferencing (DG) based fixed-wing Unmanned Aerial Vehicle (UAV) photogrammetric platform where an Inertial Navigation System (INS)/Global Positioning System (GPS) integrated Positioning and Orientation System (POS) system is implemented to provide the DG capability of the platform. The performance verification indicates that the proposed platform can capture aerial images successfully. A flight test is performed to verify the positioning accuracy in DG mode without using Ground Control Points (GCP). The preliminary results illustrate that horizontal DG positioning accuracies in the x and y axes are around 5 m with 300 m flight height. The positioning accuracy in the z axis is less than 10 m. Such accuracy is good for near real-time disaster relief. The DG ready function of proposed platform guarantees mapping and positioning capability even in GCP free environments, which is very important for rapid urgent response for disaster relief. Generally speaking, the data processing time for the DG module, including POS solution generalization, interpolation, Exterior Orientation Parameters (EOP) generation, and feature point measurements, is less than one hour.

  20. Performance analysis of multi-primary color display based on OLEDs/PLEDs

    Science.gov (United States)

    Xiong, Yan; Deng, Fei; Xu, Shan; Gao, Shufang

    2017-09-01

    A multi-primary color display, such as the six-primary color format, is a solution in expanding the color gamut of a full-color flat panel display. The performance of a multi-primary color display based on organic/polymer light-emitting diodes was analyzed in this study using the fitting curves of the characteristics of devices (i.e., current density, voltage, luminance). A white emitter was introduced into a six-primary color format to form a seven-primary color format that contributes to energy saving, and the ratio of power efficiency of a seven-primary color display to that of a six-primary color display would increase from 1.027 to 1.061 by using emitting diodes with different electroluminescent efficiencies. Different color matching schemes of the seven-primary color format display were compared in a uniform color space, and the scheme of the color reproduction did not significantly affect the display performance. Although seven- and six-primary color format displays benefit a full-color display with higher quality, they are less efficient than three-primary (i.e., red (R), green (G), and blue (B), RGB) and four-primary (i.e., RGB+white, RGBW) color format displays. For the seven-primary color formats considered in this study, the advantages of white-primary-added display with efficiently developed light-emitting devices were more evident than the format without a white primary.

  1. Reliability-Based Stability Analysis of Rock Slopes Using Numerical Analysis and Response Surface Method

    Science.gov (United States)

    Dadashzadeh, N.; Duzgun, H. S. B.; Yesiloglu-Gultekin, N.

    2017-08-01

    While advanced numerical techniques in slope stability analysis are successfully used in deterministic studies, they have so far found limited use in probabilistic analyses due to their high computation cost. The first-order reliability method (FORM) is one of the most efficient probabilistic techniques to perform probabilistic stability analysis by considering the associated uncertainties in the analysis parameters. However, it is not possible to directly use FORM in numerical slope stability evaluations as it requires definition of a limit state performance function. In this study, an integrated methodology for probabilistic numerical modeling of rock slope stability is proposed. The methodology is based on response surface method, where FORM is used to develop an explicit performance function from the results of numerical simulations. The implementation of the proposed methodology is performed by considering a large potential rock wedge in Sumela Monastery, Turkey. The accuracy of the developed performance function to truly represent the limit state surface is evaluated by monitoring the slope behavior. The calculated probability of failure is compared with Monte Carlo simulation (MCS) method. The proposed methodology is found to be 72% more efficient than MCS, while the accuracy is decreased with an error of 24%.
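    A minimal sketch of the response-surface idea is given below, assuming a cheap stand-in function in place of the numerical slope model: a quadratic surface is fitted to a few "simulation" results and the probability of failure is then estimated by Monte Carlo on that surrogate (FORM would instead locate the design point on the same explicit performance function). All coefficients, distributions and design points are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical "numerical model": factor of safety of the wedge as a function of
# joint cohesion c (kPa) and friction angle phi (deg). Real runs would come from
# a numerical slope-stability code instead of this stand-in.
def fos_numerical(c, phi):
    return 0.4 + 0.004 * c + 0.018 * phi

# Design points for building the response surface.
c_pts, phi_pts = np.meshgrid(np.linspace(50, 150, 5), np.linspace(20, 40, 5))
C, PHI = c_pts.ravel(), phi_pts.ravel()
FOS = fos_numerical(C, PHI)

# Quadratic response surface g(c, phi) = FoS - 1 fitted by least squares.
X = np.column_stack([np.ones_like(C), C, PHI, C**2, PHI**2, C * PHI])
coef, *_ = np.linalg.lstsq(X, FOS, rcond=None)

def g(c, phi):   # explicit limit-state (performance) function: failure when g < 0
    x = np.column_stack([np.ones_like(c), c, phi, c**2, phi**2, c * phi])
    return x @ coef - 1.0

# Monte Carlo estimate of the probability of failure on the cheap surrogate.
c_s = rng.normal(100, 25, 200_000)
phi_s = rng.normal(30, 4, 200_000)
pf = np.mean(g(c_s, phi_s) < 0.0)
print(f"estimated probability of failure: {pf:.4f}")
```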

  2. Evaluation of Performance of Investment Funds Based on Decision Models (DEA

    Directory of Open Access Journals (Sweden)

    Alireza Samet

    2016-12-01

    Full Text Available Selection of a suitable investment fund is very important from the investors' point of view and may have a significant impact on the profit or loss of the investment. Therefore, the evaluation of investment fund performance, in order to choose the most suitable fund, is given special emphasis. One of the newer techniques for evaluating the performance of funds based on efficiency is Data Envelopment Analysis. Accordingly, the present study aims to analyze and evaluate the performance of investment funds in the capital market of Iran using the efficiency evaluation technique of data envelopment analysis (DEA). This is a descriptive-applied study; to analyze efficiency and effectiveness, 53 investment funds in the capital market of Iran in 2013 were considered as the sample. Data envelopment analysis (DEA) is used to analyze the efficiency of these funds. The findings showed that in 2013, of the 53 examined funds, 11 funds were efficient and the other 42 funds were inefficient. The reference funds and virtual composite funds of all inefficient funds were also evaluated.
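    A compact sketch of the input-oriented CCR (constant returns to scale) envelopment model, solved per fund with SciPy's linear programming routine, is given below; the input/output choices and the four funds' figures are hypothetical, not the study's 53-fund data set.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(inputs: np.ndarray, outputs: np.ndarray) -> np.ndarray:
    """Input-oriented CCR (constant returns to scale) efficiency scores.
    inputs: (n_dmu, n_in) matrix, outputs: (n_dmu, n_out) matrix."""
    n, m = inputs.shape
    s = outputs.shape[1]
    scores = np.zeros(n)
    for o in range(n):
        # Decision variables: [theta, lambda_1 ... lambda_n]
        c = np.zeros(1 + n)
        c[0] = 1.0                                   # minimise theta
        A_ub = np.zeros((m + s, 1 + n))
        b_ub = np.zeros(m + s)
        A_ub[:m, 0] = -inputs[o]                     # -theta * x_o ...
        A_ub[:m, 1:] = inputs.T                      # ... + sum_j lambda_j x_j <= 0
        A_ub[m:, 1:] = -outputs.T                    # -sum_j lambda_j y_j <= -y_o
        b_ub[m:] = -outputs[o]
        res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                      bounds=[(0, None)] * (1 + n), method="highs")
        scores[o] = res.x[0]
    return scores

# Hypothetical fund data: inputs = [management fee %, return volatility %],
# outputs = [annual return %].
x = np.array([[1.5, 12.0], [2.0, 15.0], [1.2, 10.0], [1.8, 14.0]])
y = np.array([[18.0], [20.0], [15.0], [16.0]])
print(np.round(ccr_efficiency(x, y), 3))   # efficient funds score 1.0
```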

  3. Automatic Video-based Analysis of Human Motion

    DEFF Research Database (Denmark)

    Fihl, Preben

    The human motion contains valuable information in many situations and people frequently perform an unconscious analysis of the motion of other people to understand their actions, intentions, and state of mind. An automatic analysis of human motion will facilitate many applications and thus has...... received great interest from both industry and research communities. The focus of this thesis is on video-based analysis of human motion and the thesis presents work within three overall topics, namely foreground segmentation, action recognition, and human pose estimation. Foreground segmentation is often...... the first important step in the analysis of human motion. By separating foreground from background the subsequent analysis can be focused and efficient. This thesis presents a robust background subtraction method that can be initialized with foreground objects in the scene and is capable of handling...

  4. Performance of a gaseous detector based energy dispersive X-ray fluorescence imaging system: Analysis of human teeth treated with dental amalgam

    International Nuclear Information System (INIS)

    Silva, A.L.M.; Figueroa, R.; Jaramillo, A.; Carvalho, M.L.; Veloso, J.F.C.A.

    2013-01-01

    Energy dispersive X-ray fluorescence (EDXRF) imaging systems are of great interest in many applications of different areas, once they allow us to get images of the spatial elemental distribution in the samples. The detector system used in this study is based on a micro patterned gas detector, named Micro-Hole and Strip Plate. The full field of view system, with an active area of 28 × 28 mm² presents some important features for EDXRF imaging applications, such as a position resolution below 125 μm, an intrinsic energy resolution of about 14% full width at half maximum for 5.9 keV X-rays, and a counting rate capability of 0.5 MHz. In this work, analysis of human teeth treated by dental amalgam was performed by using the EDXRF imaging system mentioned above. The goal of the analysis is to evaluate the system capabilities in the biomedical field by measuring the drift of the major constituents of a dental amalgam, Zn and Hg, throughout the tooth structures. The elemental distribution pattern of these elements obtained during the analysis suggests diffusion of these elements from the amalgam to teeth tissues. - Highlights: • Demonstration of an EDXRF imaging system based on a 2D-MHSP detector for biological analysis • Evaluation of the drift of the dental amalgam constituents, throughout the teeth • Observation of Hg diffusion, due to hydroxyapatite crystal defects that compose the teeth tissues

  5. Performance of a gaseous detector based energy dispersive X-ray fluorescence imaging system: Analysis of human teeth treated with dental amalgam

    Energy Technology Data Exchange (ETDEWEB)

    Silva, A.L.M. [I3N, Physics Dept, University of Aveiro, 3810-193 Aveiro (Portugal); Figueroa, R.; Jaramillo, A. [Physics Department, Universidad de La Frontera, Temuco (Chile); Carvalho, M.L. [Atomic Physics Centre, University of Lisbon, 1649-03 Lisboa (Portugal); Veloso, J.F.C.A., E-mail: joao.veloso@ua.pt [I3N, Physics Dept, University of Aveiro, 3810-193 Aveiro (Portugal)

    2013-08-01

    Energy dispersive X-ray fluorescence (EDXRF) imaging systems are of great interest in many applications of different areas, once they allow us to get images of the spatial elemental distribution in the samples. The detector system used in this study is based on a micro patterned gas detector, named Micro-Hole and Strip Plate. The full field of view system, with an active area of 28 × 28 mm² presents some important features for EDXRF imaging applications, such as a position resolution below 125 μm, an intrinsic energy resolution of about 14% full width at half maximum for 5.9 keV X-rays, and a counting rate capability of 0.5 MHz. In this work, analysis of human teeth treated by dental amalgam was performed by using the EDXRF imaging system mentioned above. The goal of the analysis is to evaluate the system capabilities in the biomedical field by measuring the drift of the major constituents of a dental amalgam, Zn and Hg, throughout the tooth structures. The elemental distribution pattern of these elements obtained during the analysis suggests diffusion of these elements from the amalgam to teeth tissues. - Highlights: • Demonstration of an EDXRF imaging system based on a 2D-MHSP detector for biological analysis • Evaluation of the drift of the dental amalgam constituents, throughout the teeth • Observation of Hg diffusion, due to hydroxyapatite crystal defects that compose the teeth tissues.

  6. Cost savings from performance-based maintenance contracting

    NARCIS (Netherlands)

    Straub, A.

    2009-01-01

    New procurement approaches combined with performance-based building approaches should reduce costs, but empirical qualitative and quantitative studies are lacking. Performance-based maintenance contracts give maintenance suppliers incentives to improve their way of working. Innovative,

  7. Optimal Sizing and Performance Evaluation of a Renewable Energy Based Microgrid in Future Seaports

    DEFF Research Database (Denmark)

    Baizura Binti Ahamad, Nor; Othman @ Marzuki, Muzaidi Bin; Quintero, Juan Carlos Vasquez

    2018-01-01

    This paper presents the optimal design of an integrated microgrid, specifying its dimensions and energy planning and evaluating its performance in supplying electricity to the load. The integrated system consists of PV, a wind turbine and a battery in a grid-connected configuration. The paper also analyzes the performance of the designed system for a seaport located in Copenhagen, Denmark, as a case study. The analysis is performed using the Hybrid Optimization Model for Electric Renewables (HOMER) software, which provides optimization and sensitivity analysis results. The simulation results indicate that the implementation of microgrid technologies would be a convenient solution to supply electricity to the load application (shipboard)...

  8. Evaluation of analytical performance based on partial order methodology.

    Science.gov (United States)

    Carlsen, Lars; Bruggemann, Rainer; Kenessova, Olga; Erzhigitov, Erkin

    2015-01-01

    Classical measurements of performance are typically based on linear scales. However, in analytical chemistry a simple scale may not be sufficient to assess analytical performance appropriately. Here partial order methodology can be helpful. Within the context described here, partial order analysis can be seen as an ordinal analysis of data matrices, used especially to simplify the relative comparison of objects based on their data profiles (the ordered set of values an object has). Hence, partial order methodology offers a unique possibility to evaluate analytical performance. In the present work, data as provided by laboratories through, e.g., interlaboratory comparisons or proficiency testing are used as an illustrative example. However, the presented scheme is likewise applicable to the comparison of analytical methods, or simply as a tool for optimizing an analytical method. The methodology can be applied without presumptions about, or pretreatment of, the analytical data provided, and evaluates the analytical performance taking all indicators into account simultaneously, thus elucidating a "distance" from the true value. In the present illustrative example it is assumed that the laboratories analyze a given sample several times and subsequently report the mean value, the standard deviation and the skewness, which are used simultaneously for the evaluation of the analytical performance. The analysis yields (1) a partial ordering of the laboratories, (2) a "distance" to the reference laboratory and (3) a classification based on the concept of "peculiar points". Copyright © 2014 Elsevier B.V. All rights reserved.
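
    As a minimal, purely illustrative sketch of the ordinal comparison described above, the snippet below partially orders a few hypothetical laboratories by a profile of three smaller-is-better indicators (absolute bias, standard deviation, absolute skewness); the laboratory names and values are invented and are not taken from the study.

```python
# Minimal sketch of a partial-order (dominance) comparison of laboratories.
# Each lab is described by an indicator profile where smaller is better for
# every indicator, e.g. (|mean - reference value|, standard deviation, |skewness|).
# All lab names and numbers below are hypothetical illustrations.

profiles = {
    "Lab A": (0.10, 0.05, 0.2),
    "Lab B": (0.05, 0.08, 0.1),
    "Lab C": (0.20, 0.12, 0.4),
    "Ref":   (0.00, 0.03, 0.0),   # reference laboratory
}

def dominates(p, q):
    """p <= q in the partial order: p is at least as good on every indicator."""
    return all(a <= b for a, b in zip(p, q)) and p != q

labs = list(profiles)
for x in labs:
    for y in labs:
        if x != y and dominates(profiles[x], profiles[y]):
            print(f"{x} performs at least as well as {y} on all indicators")

# Labs A and B above are incomparable (each is better on some indicator),
# which is exactly the situation a single linear score would hide.
```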

  9. Analysis of Performance Measures in the Banking System

    Directory of Open Access Journals (Sweden)

    Angelica STRATULAT

    2013-08-01

    Full Text Available The complex and delicate character of the problem of banking performance, in a context of harsh competition and the emergence of multiple risks, imposes on banks the permanent evaluation of their behaviour and the analysis of internal activity. Given the fast changes that have taken place in national economies lately, the starting points towards a new banking order must be based on new models of banking management. Macroeconomic risk factors may have a significant impact on the performance of a banking institution, with direct implications for the quality of the credit portfolio, for profitability and its use, and, finally, for the entire banking system. A bank's profitability is evaluated with the aid of banking performance indicators, which reflect a multitude of aspects regarding the degree of profit realization and managerial and operational efficiency. The most important objective for banks that want to run a successful practice is the identification of market needs and the choice of a strategic position in that market, in the context of banking performance and minimal risk.

  10. Performance Analysis of Millimeter-Wave Multi-hop Machine-to-Machine Networks Based on Hop Distance Statistics

    Directory of Open Access Journals (Sweden)

    Haejoon Jung

    2018-01-01

    Full Text Available As an intrinsic part of the Internet of Things (IoT) ecosystem, machine-to-machine (M2M) communications are expected to provide ubiquitous connectivity between machines. Millimeter-wave (mmWave) communication is another promising technology for the future communication systems to alleviate the pressure of scarce spectrum resources. For this reason, in this paper, we consider multi-hop M2M communications, where a machine-type communication (MTC) device with the limited transmit power relays to help other devices using mmWave. To be specific, we focus on hop distance statistics and their impacts on system performances in multi-hop wireless networks (MWNs) with directional antenna arrays in mmWave for M2M communications. Different from microwave systems, in mmWave communications, wireless channel suffers from blockage by obstacles that heavily attenuate line-of-sight signals, which may result in limited per-hop progress in MWNs. We consider two routing strategies aiming at different types of applications and derive the probability distributions of their hop distances. Moreover, we provide their baseline statistics assuming the blockage-free scenario to quantify the impact of blockages. Based on the hop distance analysis, we propose a method to estimate the end-to-end performances (e.g., outage probability, hop count, and transmit energy) of the mmWave MWNs, which provides important insights into mmWave MWN design without time-consuming and repetitive end-to-end simulation.

  11. Performance Analysis of Millimeter-Wave Multi-hop Machine-to-Machine Networks Based on Hop Distance Statistics.

    Science.gov (United States)

    Jung, Haejoon; Lee, In-Ho

    2018-01-12

    As an intrinsic part of the Internet of Things (IoT) ecosystem, machine-to-machine (M2M) communications are expected to provide ubiquitous connectivity between machines. Millimeter-wave (mmWave) communication is another promising technology for the future communication systems to alleviate the pressure of scarce spectrum resources. For this reason, in this paper, we consider multi-hop M2M communications, where a machine-type communication (MTC) device with the limited transmit power relays to help other devices using mmWave. To be specific, we focus on hop distance statistics and their impacts on system performances in multi-hop wireless networks (MWNs) with directional antenna arrays in mmWave for M2M communications. Different from microwave systems, in mmWave communications, wireless channel suffers from blockage by obstacles that heavily attenuate line-of-sight signals, which may result in limited per-hop progress in MWNs. We consider two routing strategies aiming at different types of applications and derive the probability distributions of their hop distances. Moreover, we provide their baseline statistics assuming the blockage-free scenario to quantify the impact of blockages. Based on the hop distance analysis, we propose a method to estimate the end-to-end performances (e.g., outage probability, hop count, and transmit energy) of the mmWave MWNs, which provides important insights into mmWave MWN design without time-consuming and repetitive end-to-end simulation.
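
    The role of blockage in limiting per-hop progress can be illustrated with a crude Monte Carlo sketch. The snippet below is not the authors' analytical derivation; it assumes an exponential line-of-sight blockage model, a fixed coverage radius, and evenly spaced candidate relays, all of which are invented parameters used only to show the qualitative effect on end-to-end outage and hop count.

```python
import math
import random

# Hypothetical parameters (not taken from the paper).
COVERAGE_RADIUS = 100.0   # max per-hop distance in metres under LOS
BLOCKAGE_DENSITY = 0.006  # per-metre blockage rate for the exponential LOS model
E2E_DISTANCE = 500.0      # source-destination distance in metres
MAX_HOPS = 12
TRIALS = 20000

def per_hop_progress():
    """Sample one hop's progress: the farthest candidate relay within coverage
    that still has a line-of-sight link (LOS probability decays with distance)."""
    best = 0.0
    for d in [10.0 * k for k in range(1, int(COVERAGE_RADIUS // 10) + 1)]:
        p_los = math.exp(-BLOCKAGE_DENSITY * d)   # exponential blockage model
        if random.random() < p_los:
            best = d
    return best

outages, hops_used = 0, []
for _ in range(TRIALS):
    covered, hops = 0.0, 0
    while covered < E2E_DISTANCE and hops < MAX_HOPS:
        step = per_hop_progress()
        if step == 0.0:          # no reachable LOS relay: the route fails here
            break
        covered += step
        hops += 1
    if covered >= E2E_DISTANCE:
        hops_used.append(hops)
    else:
        outages += 1

print(f"estimated end-to-end outage probability: {outages / TRIALS:.3f}")
if hops_used:
    print(f"mean hop count over successful routes: {sum(hops_used) / len(hops_used):.2f}")
```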

  12. Breath analysis based on micropreconcentrator for early cancer diagnosis

    Science.gov (United States)

    Lee, Sang-Seok

    2018-02-01

    We are developing micropreconcentrators based on micro/nanotechnology to detect trace levels of volatile organic compound (VOC) gases contained in human and canine exhaled breath. The possibility of using exhaled VOC gases as biomarkers for various cancer diagnoses has been discussed previously. For early cancer diagnosis, detection of trace levels of VOC gas is indispensable, and micropreconcentrators based on MEMS technology or nanotechnology are very promising for this purpose. A micropreconcentrator-based breath analysis technique also has advantages in terms of cost performance and availability for the diagnosis of various cancers. In this paper, we introduce the design, fabrication and evaluation results of our MEMS- and nanotechnology-based micropreconcentrators. In the MEMS-based device, we propose a flower-leaf-type Si microstructure, whose shape and configuration are optimized quantitatively by finite element method simulation. The nanotechnology-based micropreconcentrator consists of carbon nanotube (CNT) structures. As a result, we achieve ppb-level VOC gas detection by combining our micropreconcentrators with a standard gas chromatography system that by itself detects VOCs in gas samples at the ppm level. In the performance evaluation, we also confirm that the CNT-based micropreconcentrator shows a concentration ratio 115 times better than that of the Si-based micropreconcentrator. Moreover, we discuss a commercialization idea for new cancer diagnosis using breath analysis. Future work and preliminary clinical testing in dogs are also discussed.

  13. Performance Analysis of Thermoelectric Based Automotive Waste Heat Recovery System with Nanofluid Coolant

    Directory of Open Access Journals (Sweden)

    Zhi Li

    2017-09-01

    Full Text Available Output performance of a thermoelectric-based automotive waste heat recovery system with a nanofluid coolant is analyzed in this study. Comparison between a Cu-Ethylene glycol (Cu-EG) nanofluid coolant and an ethylene glycol with water (EG-W) coolant under equal mass flow rate indicates that Cu-EG nanofluid as a coolant can effectively improve power output and thermoelectric conversion efficiency for the system. The power output enhancement for a 3% concentration of nanofluid is 2.5–8 W (12.65–13.95%) compared to EG-W when the inlet temperature of the exhaust varies within 500–710 K. Increasing the nanofluid concentration within a realizable range (up to 6%) has a positive effect on the output performance of the system. Study of the relationship between the total area of the thermoelectric modules (TEMs) and the output performance of the system indicates that an optimal total area of TEMs exists for maximizing the output performance of the system. Cu-EG nanofluid as coolant can decrease the optimal total area of TEMs compared with EG-W, which brings significant advantages for the optimization and arrangement of TEMs whether the system space is sufficient or not. Moreover, the power output enhancement under the Cu-EG nanofluid coolant is larger than that of the EG-W coolant due to the increase of the hot side heat transfer coefficient of the TEMs.

  14. Reliability-Based Robustness Analysis for a Croatian Sports Hall

    DEFF Research Database (Denmark)

    Čizmar, Dean; Kirkegaard, Poul Henning; Sørensen, John Dalsgaard

    2011-01-01

    This paper presents a probabilistic approach for structural robustness assessment of a timber structure built a few years ago. The robustness analysis is based on a structural reliability based framework for robustness and a simplified mechanical system modelling of a timber truss system. A complex timber structure with a large number of failure modes is modelled with only a few dominant failure modes. First, a component based robustness analysis is performed based on the reliability indices of the remaining elements after the removal of selected critical elements. The robustness is expressed and evaluated by a robustness index. Next, the robustness is assessed using system reliability indices, where the probabilistic failure model is modelled by a series system of parallel systems.

  15. Optimum Performance-Based Seismic Design Using a Hybrid Optimization Algorithm

    Directory of Open Access Journals (Sweden)

    S. Talatahari

    2014-01-01

    Full Text Available A hybrid optimization method is presented for the optimum seismic design of steel frames considering four performance levels. These performance levels are used to determine the optimum design of structures and to reduce the structural cost. A pushover analysis of steel building frameworks subjected to equivalent-static earthquake loading is utilized. The algorithm is based on the concepts of the charged system search, in which each agent is affected by the local and global best positions stored in the charged memory, following the governing laws of electrical physics. Comparison of the results of the hybrid algorithm with those of other metaheuristic algorithms shows the efficiency of the hybrid algorithm.

  16. An advanced probabilistic structural analysis method for implicit performance functions

    Science.gov (United States)

    Wu, Y.-T.; Millwater, H. R.; Cruse, T. A.

    1989-01-01

    In probabilistic structural analysis, the performance or response functions usually are implicitly defined and must be solved by numerical analysis methods such as finite element methods. In such cases, the most commonly used probabilistic analysis tool is the mean-based, second-moment method which provides only the first two statistical moments. This paper presents a generalized advanced mean value (AMV) method which is capable of establishing the distributions to provide additional information for reliability design. The method requires slightly more computations than the second-moment method but is highly efficient relative to the other alternative methods. In particular, the examples show that the AMV method can be used to solve problems involving non-monotonic functions that result in truncated distributions.
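
    A minimal sketch of the 'mean-based, second-moment' baseline that the AMV method improves upon is given below, using an explicit toy response function in place of a finite element solve; the function and input statistics are invented, and a brute-force Monte Carlo run is included only as a reference point.

```python
import numpy as np

# Toy response function (in practice this would be an implicit, expensive
# finite element solve); an explicit surrogate is used purely for illustration.
def g(x1, x2):
    return x1 ** 2 / (1.0 + x2)

# Hypothetical input statistics (independent normals).
mu = np.array([2.0, 0.5])
sigma = np.array([0.2, 0.1])

# Mean-value, first-order second-moment (MVFOSM) estimate:
# linearize g about the mean and propagate variances through the gradient.
eps = 1e-5
grad = np.array([
    (g(mu[0] + eps, mu[1]) - g(mu[0] - eps, mu[1])) / (2 * eps),
    (g(mu[0], mu[1] + eps) - g(mu[0], mu[1] - eps)) / (2 * eps),
])
mean_fosm = g(*mu)
std_fosm = np.sqrt(np.sum((grad * sigma) ** 2))

# Brute-force Monte Carlo reference (what AMV-type corrections try to approach
# at a fraction of the cost when the response function is expensive).
rng = np.random.default_rng(0)
samples = g(rng.normal(mu[0], sigma[0], 100_000), rng.normal(mu[1], sigma[1], 100_000))

print(f"MVFOSM  mean={mean_fosm:.3f}  std={std_fosm:.3f}")
print(f"MC      mean={samples.mean():.3f}  std={samples.std():.3f}")
```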

  17. Determining the energy performance of manually controlled solar shades: A stochastic model based co-simulation analysis

    International Nuclear Information System (INIS)

    Yao, Jian

    2014-01-01

    Highlights: • The driving factor for adjustment of manually controlled solar shades was determined. • A stochastic model for manual solar shades was constructed using the Markov method. • Co-simulation with EnergyPlus was carried out in BCVTB. • External shading, even manually controlled, should be used in preference to LOW-E windows. • Previous studies on manual solar shades may overestimate energy savings. - Abstract: Solar shading devices play a significant role in reducing building energy consumption and maintaining a comfortable indoor environment. In this paper, a typical office building with internal roller shades in the hot summer and cold winter zone was selected to determine the driving factor of the control behavior of manual solar shades. Solar radiation was identified as the major factor driving solar shading adjustment, based on field measurements and logit analysis, and a stochastic model for manually adjusted solar shades was then constructed using the Markov method. This model was used in BCVTB for co-simulation with EnergyPlus to determine the impact of the control behavior of solar shades on energy performance. The results show that manually adjusted solar shades, whether located inside or outside, have a higher energy saving performance than clear-pane windows, while only external shades perform better than regularly used LOW-E windows. The simulations also indicate that using an idealized assumption of solar shade adjustment, as most building simulation studies do, may lead to an overestimation of energy savings by about 16–30%. There is a need to improve occupants' actions on shades so that they respond more effectively to outdoor conditions and lower energy consumption, and this improvement can be achieved by using simple strategies as a guide for controlling manual solar shades.
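
    The kind of stochastic shade model described above can be sketched as a two-state Markov chain whose hourly transition probabilities depend on solar radiation through a logit relationship; the coefficients and radiation profile below are invented placeholders, not the fitted values or the actual BCVTB/EnergyPlus coupling from the study.

```python
import math
import random

# Two shade states: 0 = retracted (open), 1 = deployed (closed).
# Transition probabilities depend on solar radiation through a logit model;
# the coefficients below are hypothetical placeholders, not fitted values.
def p_deploy(radiation_w_m2):     # probability of closing an open shade this hour
    z = -4.0 + 0.008 * radiation_w_m2
    return 1.0 / (1.0 + math.exp(-z))

def p_retract(radiation_w_m2):    # probability of opening a closed shade this hour
    z = 1.0 - 0.006 * radiation_w_m2
    return 1.0 / (1.0 + math.exp(-z))

def simulate_day(hourly_radiation, seed=1):
    """Return the hourly shade-state sequence for one day (Markov simulation)."""
    rng = random.Random(seed)
    state, states = 0, []
    for rad in hourly_radiation:
        if state == 0 and rng.random() < p_deploy(rad):
            state = 1
        elif state == 1 and rng.random() < p_retract(rad):
            state = 0
        states.append(state)
    return states

# Crude bell-shaped radiation profile for a sunny day (W/m2), purely illustrative.
radiation = [0, 0, 0, 0, 0, 50, 150, 300, 450, 600, 700, 750,
             750, 700, 600, 450, 300, 150, 50, 0, 0, 0, 0, 0]
print(simulate_day(radiation))
# In the co-simulation described above, each hourly state would be passed to the
# building simulation as the shading control signal for that timestep.
```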

  18. Uniformity testing: assessment of a centralized web-based uniformity analysis system.

    Science.gov (United States)

    Klempa, Meaghan C

    2011-06-01

    Uniformity testing is performed daily to ensure adequate camera performance before clinical use. The aim of this study is to assess the reliability of Beth Israel Deaconess Medical Center's locally built, centralized, Web-based uniformity analysis system by examining the differences between manufacturer and Web-based National Electrical Manufacturers Association integral uniformity calculations measured in the useful field of view (FOV) and the central FOV. Manufacturer and Web-based integral uniformity calculations measured in the useful FOV and the central FOV were recorded over a 30-d period for 4 cameras from 3 different manufacturers. These data were then statistically analyzed. The differences between the uniformity calculations were computed, in addition to the means and the SDs of these differences for each head of each camera. There was a correlation between the manufacturer and Web-based integral uniformity calculations in the useful FOV and the central FOV over the 30-d period. The average differences between the manufacturer and Web-based useful FOV calculations ranged from -0.30 to 0.099, with SD ranging from 0.092 to 0.32. For the central FOV calculations, the average differences ranged from -0.163 to 0.055, with SD ranging from 0.074 to 0.24. Most of the uniformity calculations computed by this centralized Web-based uniformity analysis system are comparable to the manufacturers' calculations, suggesting that this system is reasonably reliable and effective. This finding is important because centralized Web-based uniformity analysis systems are advantageous in that they test camera performance in the same manner regardless of the manufacturer.
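
    For orientation, NEMA-style integral uniformity is commonly computed as (max − min)/(max + min) × 100 over the pixels of the flood-field image; the sketch below applies that definition to a hypothetical acquisition and omits the smoothing and masking steps of the full NEMA procedure, so it is not the centre's Web-based implementation.

```python
import numpy as np

def integral_uniformity(counts):
    """Integral uniformity (%) = (max - min) / (max + min) * 100, computed over
    the pixels supplied (e.g. the useful or central FOV). The full NEMA procedure
    also applies a smoothing filter and edge masking, omitted in this sketch."""
    cmax, cmin = counts.max(), counts.min()
    return (cmax - cmin) / (cmax + cmin) * 100.0

# Hypothetical 64x64 flood-field acquisition (Poisson counts around 10k/pixel).
rng = np.random.default_rng(42)
flood = rng.poisson(10_000, size=(64, 64)).astype(float)

ufov = flood                      # useful FOV: the full matrix in this toy example
cfov = flood[8:56, 8:56]          # central FOV: inner 75% of each dimension

print(f"UFOV integral uniformity: {integral_uniformity(ufov):.2f}%")
print(f"CFOV integral uniformity: {integral_uniformity(cfov):.2f}%")
```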

  19. Key Sustainability Performance Indicator Analysis for Czech Breweries

    Directory of Open Access Journals (Sweden)

    Edward Kasem

    2015-01-01

    Full Text Available Sustainability performance can be described as the ability of an organization to remain productive over time and to preserve its potential for long-term profitability. Since the brewery sector is one of the most important and leading markets in the foodstuff industry of the Czech Republic, this study depicts the Czech breweries’ formal entry into sustainability reporting and performance. The purpose of this paper is to provide an efficiency evaluation representing the level of corporate performance of Czech breweries. For this purpose, Data Envelopment Analysis (DEA) is introduced. In order to apply it, we utilize a set of key performance indicators (KPIs) based on two international standard frameworks: the Global Reporting Initiative (GRI) and its GRI 4 guidelines, and the guideline KPIs for ESG 3.0 published by the DVFA Society. Four sustainability dimensions (economic, environmental, social and governance) are covered, making it possible to adequately evaluate sustainability performance in Czech breweries. The main output is not only the efficiency score of each company but also the input weights. These weights are used to determine the contribution of particular criteria to the breweries’ achieved score. According to the achieved efficiency results for Czech breweries, the percentage of women supervising the company does not affect the sustainability performance.
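
    The input-oriented CCR formulation of DEA, which also returns the input and output weights mentioned above, can be sketched as a small linear program; the brewery inputs and outputs below are invented illustrations, not the KPI data from the study.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical data: 4 breweries, 2 inputs (energy use, water use) and
# 2 outputs (production volume, composite social/governance score).
inputs  = np.array([[120.0, 300.0],
                    [100.0, 250.0],
                    [150.0, 400.0],
                    [ 90.0, 260.0]])
outputs = np.array([[500.0, 60.0],
                    [450.0, 75.0],
                    [520.0, 55.0],
                    [400.0, 80.0]])

def ccr_efficiency(k):
    """Input-oriented CCR multiplier model for DMU k, solved as a linear program.
    Decision variables are the output weights u and the input weights v."""
    n, s = outputs.shape
    m = inputs.shape[1]
    c = np.concatenate([-outputs[k], np.zeros(m)])          # maximize u . y_k
    A_ub = np.hstack([outputs, -inputs])                     # u . y_j - v . x_j <= 0
    b_ub = np.zeros(n)
    A_eq = np.concatenate([np.zeros(s), inputs[k]])[None, :] # v . x_k = 1
    b_eq = np.array([1.0])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * (s + m), method="highs")
    return -res.fun, res.x[:s], res.x[s:]                    # efficiency, u, v

for k in range(len(inputs)):
    eff, u, v = ccr_efficiency(k)
    print(f"Brewery {k + 1}: efficiency = {eff:.3f}, output weights = {np.round(u, 4)}")
```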

  20. Performance Issues in High Performance Fortran Implementations of Sensor-Based Applications

    Directory of Open Access Journals (Sweden)

    David R. O'hallaron

    1997-01-01

    Full Text Available Applications that get their inputs from sensors are an important and often overlooked application domain for High Performance Fortran (HPF). Such sensor-based applications typically perform regular operations on dense arrays, and often have latency and throughput requirements that can only be achieved with parallel machines. This article describes a study of sensor-based applications, including the fast Fourier transform, synthetic aperture radar imaging, narrowband tracking radar processing, multibaseline stereo imaging, and medical magnetic resonance imaging. The applications are written in a dialect of HPF developed at Carnegie Mellon, and are compiled by the Fx compiler for the Intel Paragon. The main results of the study are that (1) it is possible to realize good performance for realistic sensor-based applications written in HPF and (2) the performance of the applications is determined by the performance of three core operations: independent loops (i.e., loops with no dependences between iterations), reductions, and index permutations. The article discusses the implications for HPF implementations and introduces some simple tests that implementers and users can use to measure the efficiency of the loops, reductions, and index permutations generated by an HPF compiler.

  1. Ongoing Analysis of Rocket Based Combined Cycle Engines by the Applied Fluid Dynamics Analysis Group at Marshall Space Flight Center

    Science.gov (United States)

    Ruf, Joseph; Holt, James B.; Canabal, Francisco

    1999-01-01

    This paper presents the status of analyses on three Rocket Based Combined Cycle configurations underway in the Applied Fluid Dynamics Analysis Group (TD64). TD64 is performing computational fluid dynamics analysis on a Penn State RBCC test rig, the proposed Draco axisymmetric RBCC engine and the Trailblazer engine. The intent of the analysis on the Penn State test rig is to benchmark the Finite Difference Navier Stokes code for ejector mode fluid dynamics. The Draco engine analysis is a trade study to determine the ejector mode performance as a function of three engine design variables. The Trailblazer analysis is to evaluate the nozzle performance in scramjet mode. Results to date of each analysis are presented.

  2. Performance Analysis of Information Services in a Grid Environment

    Directory of Open Access Journals (Sweden)

    Giovanni Aloisio

    2004-10-01

    Full Text Available The Information Service is a fundamental component in a grid environment. It has to meet many requirements, such as access to static and dynamic information related to grid resources, efficient and secure access to dynamic data, decentralized maintenance, fault tolerance, etc., in order to achieve better performance, scalability, security and extensibility. Currently there are two major approaches: one is based on a directory infrastructure and the other on a novel approach that exploits a relational DBMS. In this paper we present a performance comparison analysis between the Grid Resource Information Service (GRIS) and the Local Dynamic Grid Catalog relational information service (LDGC), also providing information about two projects (iGrid and Grid Relational Catalog) in the grid data management area.

  3. Uncertainty analysis and validation of environmental models. The empirically based uncertainty analysis

    International Nuclear Information System (INIS)

    Monte, Luigi; Hakanson, Lars; Bergstroem, Ulla; Brittain, John; Heling, Rudie

    1996-01-01

    The principles of Empirically Based Uncertainty Analysis (EBUA) are described. EBUA is based on the evaluation of 'performance indices' that express the level of agreement between the model and sets of empirical independent data collected in different experimental circumstances. Some of these indices may be used to evaluate the confidence limits of the model output. The method is based on the statistical analysis of the distribution of the index values and on the quantitative relationship of these values with the ratio 'experimental data/model output'. Some performance indices are described in the present paper. Among these, the so-called 'functional distance' (d) between the logarithm of model output and the logarithm of the experimental data, defined as d² = Σ_{i=1..n} (ln M_i − ln O_i)² / n, where M_i is the i-th experimental value, O_i the corresponding model evaluation and n the number of couplets 'experimental value, predicted value', is an important tool for the EBUA method. From the statistical distribution of this performance index, it is possible to infer the characteristics of the distribution of the ratio 'experimental data/model output' and, consequently, to evaluate the confidence limits for the model predictions. This method was applied to calculate the uncertainty level of a model developed to predict the migration of radiocaesium in lacustrine systems. Unfortunately, performance indices are affected by the uncertainty of the experimental data used in validation. Indeed, measurement results of environmental levels of contamination are generally associated with large uncertainty due to the measurement and sampling techniques and to the large variability in space and time of the measured quantities. It is demonstrated that this non-desired effect, in some circumstances, may be corrected by means of simple formulae.
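
    The functional distance defined above is straightforward to compute; the sketch below uses invented paired values and, under the rough assumption that ln(M/O) is approximately normal with spread d, turns d into a multiplicative band for the ratio 'experimental data/model output'.

```python
import math

# Hypothetical paired values: experimental data M_i and model predictions O_i.
M = [1.2, 0.8, 2.5, 3.1, 0.6, 1.9]   # experimental values
O = [1.0, 1.0, 2.0, 3.5, 0.5, 2.2]   # corresponding model outputs

n = len(M)
log_ratios = [math.log(m / o) for m, o in zip(M, O)]

# Functional distance d as defined above: d^2 = sum((ln M_i - ln O_i)^2) / n
d = math.sqrt(sum(r * r for r in log_ratios) / n)

# If ln(M/O) is treated as roughly normal with spread d, an approximate 68%
# band for the ratio "experimental / model" is a multiplicative factor exp(d).
factor = math.exp(d)
print(f"functional distance d = {d:.3f}")
print(f"model output is, roughly, within a factor of {factor:.2f} of the data")
```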

  4. Performance analysis of a Miller cycle engine by an indirect analysis method with sparking and knock in consideration

    International Nuclear Information System (INIS)

    Wang, Yuanfeng; Zu, Bingfeng; Xu, Yuliang; Wang, Zhen; Liu, Jie

    2016-01-01

    Highlights: • A quasi-dimensional model was adopted to study the Miller cycle engine’s performance. • A new indirect performance analysis method was proposed. • The definition of effective compression ratio was modified. • The modified effective compression ratio takes the trapped mixture mass into account. • The factors limiting fuel economy in the Miller cycle engine were identified. - Abstract: In this paper, a full-factorial design of experiments was applied to thoroughly investigate the effects of compression ratio, intake valve closing retardation angle, and engine speed on the fuel consumption performance and power performance of a Miller cycle engine, based on a quasi-dimensional simulation model. A new indirect analysis method based on formula derivation and main effect analysis was proposed to simplify the complex relationship between the design factors and the performance parameters. The definition of effective compression ratio was modified to take account of the actual mass of mixture trapped in the cylinder. The results show that the distributions of brake mean effective pressure and brake specific fuel consumption can be regarded as re-organizations of the distributions of volumetric efficiency and indicated efficiency. The intake valve closing retardation angle has a strong negative correlation with volumetric efficiency. The modified effective compression ratio is approximately the product of the compression ratio and the volumetric efficiency, and has a marked effect on the distribution of the indicated efficiency. Therefore the combustion process co-evolves with the intake process in a Miller cycle engine. Further improvement of brake specific fuel consumption is mainly limited by four factors, i.e., the back flow loss, the exergy loss, the incomplete expansion loss, and the combustion loss. The improvement of fuel consumption performance comes at a cost to power performance, and the trade-off between the two essentially

  5. ADMINISTRATOR’S ROLE IN PERFORMANCE BASED REWARD AS A DETERMINANT OF EMPLOYEE OUTCOMES

    Directory of Open Access Journals (Sweden)

    Azman ISMAIL

    2015-06-01

    Full Text Available According to the recent literature pertaining to workplace compensation programs, administrators often play two important roles in planning and implementing performance based reward: communication and performance appraisal. Recent studies in this field highlight that the ability of administrators to appropriately communicate pay information and appraise employee performance may have a significant impact on employee outcomes, especially job satisfaction and organizational commitment. Therefore, this study was conducted to assess the relationship between the administrator’s role in performance based reward and employee outcomes, using self-administered questionnaires collected from employees at a district council in Malaysia. The outcomes of the SmartPLS path model analysis showed that pay communication does not act as an important determinant of job satisfaction, but performance appraisal does act as an important determinant of job satisfaction. Conversely, both pay communication and performance appraisal act as important determinants of organizational commitment. The discussion, implications and conclusions are then elaborated.

  6. Knowledge construction about port performance evaluation: An international literature analysis

    Directory of Open Access Journals (Sweden)

    Karine Somensi

    2017-10-01

    Full Text Available Purpose: This study aims at identifying and analyzing the characteristics of the international scientific research that addresses the literature fragment on Port Performance Evaluation, in order to identify whether the notion of Performance Evaluation, as an area of knowledge, is theoretically aligned with its practical application, Port Performance Evaluation. Design/Methodology/Approach: To approach the problem, this paper makes use of qualitative research, since it analyzes the characteristics of the Bibliographical Portfolio related to Port Performance Evaluation. The strategy adopted was action research, in which the authors, through their analysis and interpretation, selected the Bibliographical Portfolio. Findings: From the analyzed literature fragment it was possible to identify some misalignment between what is pointed out in the literature and management practices in the port sector. This discrepancy refers to management practices that are ignored by port managers, which implies lost opportunities and may even jeopardize the organization's performance. Research limitations/implications: (i) the literature search was restricted to articles written in English and published in scientific journals indexed in the selected databases; (ii) the time limit restricting articles to those published after the year 2000; (iii) the generation of knowledge based on the characteristics selected by the researchers; and (iv) the analysis of the Bibliographical Portfolio articles by the judgment and interpretation of the authors of this research. It is suggested that future work expand this research to other databases, other languages and other features, and continue this investigation with the development of the "systemic analysis" and "identifying research opportunities" stages of ProKnow-C. Originality/value: Although two similar works have been developed in the same area of research in 2015, the results achieved have

  7. Performance analysis of photovoltaic thermal (PVT) water collectors

    International Nuclear Information System (INIS)

    Fudholi, Ahmad; Sopian, Kamaruzzaman; Yazdi, Mohammad H.; Ruslan, Mohd Hafidz; Ibrahim, Adnan; Kazem, Hussein A.

    2014-01-01

    Highlights: • Performance analysis of PVT collectors based on energy efficiencies. • New absorber designs of PVT collectors were presented. • Comparison of the present study with other absorber collector designs was presented. • High efficiencies were obtained for the spiral flow absorber. - Abstract: The electrical and thermal performances of photovoltaic thermal (PVT) water collectors were determined under 500–800 W/m² solar radiation levels. At each solar radiation level, mass flow rates ranging from 0.011 kg/s to 0.041 kg/s were introduced. The PVT collectors were tested with respect to PV efficiency, thermal efficiency, and a combination of both (PVT efficiency). The results show that the spiral flow absorber exhibited the highest performance at a solar radiation level of 800 W/m² and a mass flow rate of 0.041 kg/s. This absorber produced a PVT efficiency of 68.4%, a PV efficiency of 13.8%, and a thermal efficiency of 54.6%. It also produced a primary-energy saving efficiency ranging from 79% to 91% at mass flow rates of 0.011–0.041 kg/s.
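
    The reported spiral-flow figures can be cross-checked against the usual definitions: PVT efficiency is the sum of the thermal and electrical efficiencies, and the primary-energy saving efficiency divides the electrical term by a conventional power-plant conversion factor. The factor of 0.38 used below is a common assumption in the PVT literature, not a value stated in the abstract.

```python
# Cross-check of the reported spiral-flow figures using the usual definitions.
# The power-plant conversion factor of 0.38 is an assumed, commonly used value,
# not one quoted in the abstract.
eta_pv = 0.138        # electrical (PV) efficiency
eta_thermal = 0.546   # thermal efficiency
power_plant_factor = 0.38

eta_pvt = eta_pv + eta_thermal
eta_primary_saving = eta_thermal + eta_pv / power_plant_factor

print(f"PVT efficiency: {eta_pvt:.1%}")                               # ~68.4%
print(f"primary-energy saving efficiency: {eta_primary_saving:.1%}")  # ~91%
```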

  8. Performance Evaluation of IP Based Multimedia Services in UMTS

    Directory of Open Access Journals (Sweden)

    Riri Fitri SARI

    2008-01-01

    Full Text Available This paper presents our work on the performance evaluation of a UMTS network based on simulation. Enhanced UMTS Radio Access Network Extensions for NS-2 (EURANE), developed by SEACORN, has brought UMTS simulation in third-generation wireless telecommunication systems to a higher phase. Wireless 3G is designed to deliver various kinds of multimedia packages through an IP network, for easier interconnection of the fixed network with various existing multimedia services. Multimedia services, with their bandwidth consumption characteristics, can be sent through a UMTS network thanks to High Speed Downlink Packet Access (HSDPA) in Release 5. Quality of Service (QoS) is a major concern for multimedia services. This paper presents the performance analysis of a number of multimedia services and their QoS using HSDPA in UMTS. The experiments were based on the EURANE extension for NS-2. From the simulations conducted, we found that Unacknowledged Mode (UM) in Radio Link Control (RLC) performs better for QoS classes 1 (VoIP) and 2 (Video Streaming), while Acknowledged Mode (AM) is more suitable for QoS classes 3 (web server) and 4 (FTP).

  9. Performance Analysis of Receive Diversity in Wireless Sensor Networks over GBSBE Models

    OpenAIRE

    Goel, Shivali; Abawajy, Jemal H.; Kim, Tai-hoon

    2010-01-01

    Wireless sensor networks have attracted a lot of attention recently. In this paper, we develop a channel model based on the elliptical model for multipath components involving randomly placed scatterers in the scattering region with sensors deployed on a field. We verify that in a sensor network, the use of receive diversity techniques improves the performance of the system. Extensive performance analysis of the system is carried out for both single and multiple antennas with the applied rece...

  10. Analysis of multiuser mixed RF/FSO relay networks for performance improvements in Cloud Computing-Based Radio Access Networks (CC-RANs)

    Science.gov (United States)

    Alimi, Isiaka A.; Monteiro, Paulo P.; Teixeira, António L.

    2017-11-01

    The key paths toward meeting the fifth generation (5G) network requirements are centralized processing and small-cell densification systems implemented on cloud computing-based radio access networks (CC-RANs). The increasing recognition of CC-RANs can be attributed to their valuable features regarding system performance optimization and cost-effectiveness. Nevertheless, realization of the stringent requirements of the fronthaul that connects the network elements is highly demanding. In this paper, considering small-cell network architectures, we present multiuser mixed radio-frequency/free-space optical (RF/FSO) relay networks as feasible technologies for alleviating the stringent fronthaul requirements in CC-RANs. In this study, we use the end-to-end (e2e) outage probability, average symbol error probability (ASEP), and ergodic channel capacity as the performance metrics in our analysis. Simulation results show the suitability of deploying mixed RF/FSO schemes in real-life scenarios.

  11. Twenty-fifth water reactor safety information meeting: Proceedings. Volume 2: Human reliability analysis and human performance evaluation; Technical issues related to rulemakings; Risk-informed, performance-based initiatives; High burn-up fuel research

    International Nuclear Information System (INIS)

    Monteleone, S.

    1998-03-01

    This three-volume report contains papers presented at the conference. The papers are printed in the order of their presentation in each session and describe progress and results of programs in nuclear safety research conducted in this country and abroad. Foreign participation in the meeting included papers presented by researchers from France, Japan, Norway, and Russia. The titles of the papers and the names of the authors have been updated and may differ from those that appeared in the final program of the meeting. This volume contains the following: (1) human reliability analysis and human performance evaluation; (2) technical issues related to rulemakings; (3) risk-informed, performance-based initiatives; and (4) high burn-up fuel research

  12. Identifying Importance-Performance Matrix Analysis (IPMA) of ...

    African Journals Online (AJOL)

    Identifying Importance-Performance Matrix Analysis (IPMA) of intellectual capital and Islamic work ethics in Malaysian SMEs. ... capital and Islamic work ethics significantly influenced business performance. ...

  13. Speciation analysis of cobalt in foods by high-performance liquid chromatography and neutron activation analysis

    International Nuclear Information System (INIS)

    Muto, Toshio; Koyama, Motoko

    1994-01-01

    A combined method coupling high-performance liquid chromatography (HPLC, as a separation method) with neutron activation analysis (as a detection method) has been applied to the speciation analysis of cobalt in daily foods (e.g. egg, fish and milk). Cobalt species including free cobalt, vitamin B12 and protein-bound cobalt were separated with a preparative HPLC and a centrifuge. Subsequently, the determination of cobalt in the separated species was made by neutron activation analysis. The results showed that the total cobalt content in the foods was in the range 0.4–11 ng/g (0.4–11 ppb) based on wet weight. The proportions of free cobalt, vitamin B12 and protein-bound cobalt ranged over 16–43%, 55–73% and 2.3–17%, respectively. This experimental evidence suggests that the combination of HPLC and neutron activation analysis is expected to be a useful tool for speciation analysis of trace elements in biological as well as environmental materials. (author)

  14. Code structure for U-Mo fuel performance analysis in high performance research reactor

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, Gwan Yoon; Cho, Tae Won; Lee, Chul Min; Sohn, Dong Seong [Ulsan National Institute of Science and Technology, Ulsan (Korea, Republic of); Lee, Kyu Hong; Park, Jong Man [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-05-15

    A performance analysis model applicable to research reactor fuel is being developed using available models that describe the fuel performance phenomena observed in in-pile tests. We established the calculation algorithm and scheme to best predict fuel performance using a radio-thermo-mechanically coupled system that considers fuel swelling, interaction layer growth, pore formation in the fuel meat, creep fuel deformation, mass relocation, etc. In this paper, we present the general structure of the performance analysis code for typical research reactor fuel and advanced features such as a model to predict fuel failure induced by the combination of breakaway swelling and pore growth in the fuel meat. A thermo-mechanical code dedicated to the modeling of U-Mo dispersion fuel plates is under development in Korea to satisfy the demand for advanced performance analysis and safety assessment of the plates. The major physical phenomena during irradiation considered in the code are interaction layer formation by fuel-matrix interdiffusion, fission-induced swelling of the fuel particles, mass relocation by fission-induced stress, and pore formation at the interface between the reaction product and the Al matrix.

  15. Developing safety performance functions incorporating reliability-based risk measures.

    Science.gov (United States)

    Ibrahim, Shewkar El-Bassiouni; Sayed, Tarek

    2011-11-01

    Current geometric design guides provide deterministic standards where the safety margin of the design output is generally unknown and there is little knowledge of the safety implications of deviating from these standards. Several studies have advocated probabilistic geometric design where reliability analysis can be used to account for the uncertainty in the design parameters and to provide a risk measure of the implication of deviation from design standards. However, there is currently no link between measures of design reliability and the quantification of safety using collision frequency. The analysis presented in this paper attempts to bridge this gap by incorporating a reliability-based quantitative risk measure such as the probability of non-compliance (P(nc)) in safety performance functions (SPFs). Establishing this link will allow admitting reliability-based design into traditional benefit-cost analysis and should lead to a wider application of the reliability technique in road design. The present application is concerned with the design of horizontal curves, where the limit state function is defined in terms of the available (supply) and stopping (demand) sight distances. A comprehensive collision and geometric design database of two-lane rural highways is used to investigate the effect of the probability of non-compliance on safety. The reliability analysis was carried out using the First Order Reliability Method (FORM). Two Negative Binomial (NB) SPFs were developed to compare models with and without the reliability-based risk measures. It was found that models incorporating the P(nc) provided a better fit to the data set than the traditional (without risk) NB SPFs for total, injury and fatality (I+F) and property damage only (PDO) collisions. Copyright © 2011 Elsevier Ltd. All rights reserved.
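
    The probability of non-compliance for the sight-distance limit state, P(nc) = P(demand > supply), can be illustrated with a crude Monte Carlo sketch; the distributions and parameter values below are invented and stand in for the FORM analysis actually used in the study.

```python
import numpy as np

rng = np.random.default_rng(7)
N = 200_000

# Hypothetical random inputs for a horizontal curve (all values illustrative).
speed = rng.normal(90.0, 8.0, N)                       # operating speed, km/h
reaction_time = rng.lognormal(np.log(1.5), 0.25, N)    # perception-reaction time, s
decel = rng.normal(3.4, 0.5, N)                        # deceleration rate, m/s^2
supply_ssd = rng.normal(170.0, 10.0, N)                # available sight distance, m

v = speed / 3.6                                        # convert to m/s
# Stopping sight distance demand: reaction distance plus braking distance.
demand_ssd = v * reaction_time + v ** 2 / (2 * np.clip(decel, 1.0, None))

# Limit state: g = supply - demand; non-compliance when demand exceeds supply.
p_nc = np.mean(demand_ssd > supply_ssd)
print(f"estimated probability of non-compliance P(nc) = {p_nc:.4f}")
# In the study, P(nc) obtained from FORM is then entered as a covariate in the
# negative binomial safety performance functions.
```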

  16. CONPAS 1.0 (CONtainment Performance Analysis System). User's manual

    Energy Technology Data Exchange (ETDEWEB)

    Ahn, Kwang Il; Jin, Young Ho [Korea Atomic Energy Research Institute, Daeduk (Korea, Republic of)

    1996-04-01

    CONPAS (CONtainment Performance Analysis System) is a verified computer code package that integrates the numerical, graphical, and results-handling aspects of Level 2 probabilistic safety assessments (PSA) for nuclear power plants automatically under a PC Windows environment. Compared with the existing DOS-based computer codes for Level 2 PSA, the most important merit of the Windows-based code is that the user can easily describe and quantify the accident progression models and manipulate the resultant outputs in a variety of ways. As the main logic for accident progression analysis, CONPAS employs the concept of a small containment phenomenological event tree (CPET), helpful for visually tracing individual accident progressions, and a large supporting event tree (LSET) for their detailed quantification. For the integrated analysis of Level 2 PSA, the code utilizes four distinct but closely related modules: (1) ET Editor for construction of the event tree models describing the accident progressions, (2) Computer for quantification of the constructed event trees and graphical display of the resultant outputs, (3) Text Editor for preparation of input decks for quantification and utilization of calculational results, and (4) Mechanistic Code Plotter for utilization of results obtained from severe accident analysis codes. Compared with other existing computer codes for Level 2 PSA, the CONPAS code provides several advanced features: computational features including systematic uncertainty analysis, importance analysis, sensitivity analysis and data interpretation; reporting features including tables and graphics; and a user-friendly interface. 10 refs. (Author)

  17. Performance based fault diagnosis

    DEFF Research Database (Denmark)

    Niemann, Hans Henrik

    2002-01-01

    Different aspects of fault detection and fault isolation in closed-loop systems are considered. It is shown that using the standard setup known from feedback control, it is possible to formulate fault diagnosis problems based on a performance index in this general standard setup. It is also shown...

  18. Performance analysis of electricity generation by the medium temperature geothermal resources: Velika Ciglena case study

    International Nuclear Information System (INIS)

    Rašković, Predrag; Guzović, Zvonimir; Cvetković, Svetislav

    2013-01-01

    During the last decade, the design of an energy-efficient and cost-effective geothermal plant has represented a significant, ongoing technical challenge in all the Western Balkan countries. In the Republic of Croatia, the geothermal field Velika Ciglena is identified as one of the most valuable geothermal heat sources and probably the location where the first geothermal plant in the Western Balkan area will be built. The purpose of this work is the conceptual design and performance analysis of two binary plants, one operating under the Organic Rankine Cycle (ORC) and the other under the Kalina (KLN) cycle, which can be used for geothermal energy utilization in Velika Ciglena. The conceptual plant design is performed with an equation-oriented modelling approach and supported by two steady-state spreadsheet simulators. The performance analysis of all design solutions is conducted through energy and exergy analysis, and by the estimated total cost of the operating units in the plant. The results of the analysis indicate that the plant design based on the ORC cycle has a higher thermodynamic efficiency and a lower cost of equipment, and consequently is more suitable for the future geothermal plant in Velika Ciglena. - Highlights: ► Paper presents the analysis of binary geothermal plant for the utilization of recourses in Velika Ciglena field (Croatia). ► Thermodynamic and economical parameters of both cycles are calculated by the spreadsheet simulation software. ► The results of performance analysis indicate the advantage of electricity production based on ORC cycle

  19. Tourists' Satisfaction at Trijuginarayan, India: An Importance-Performance Analysis

    Directory of Open Access Journals (Sweden)

    Satish Chandra BAGRI

    2015-12-01

    Full Text Available Satisfaction is an excellent predictor of tourist behaviour as it influences the choice of destination, consumption of products and services, the decision to return and maintain lasting relationships. This paper analyzes the level of tourist satisfaction with destination attributes using the Importance-Performance Analysis (IPA), based on the information obtained from 200 domestic tourists visiting Trijuginarayan, an emerging spiritual and adventure tourist destination located in Garhwal Himalaya in Uttarakhand state of India. The results obtained show that attributes related to tourism product of spiritual and cultural nature, atmosphere and climate, a variety of tourist activities, hospitality and safety are significant factors in determining tourist satisfaction, whereas basic facilities such as accommodation, transportation, tourism infrastructure and hygiene and sanitation at destination are of significant importance in satisfaction evaluation. Findings also reveal that tourists were satisfied with the core products, but were dissatisfied with basic tourist facilities offered at the destination. Based on the results, the paper concludes that tourism stakeholders must outline effective strategies for holistic development and improving performance of attributes in the given destination.

  20. Meta-analysis of the relationship between TQM and Business Performance

    International Nuclear Information System (INIS)

    Ahmad M F; Zakuan N; Jusoh A; Tasir Z; Takala J

    2013-01-01

    A meta-analysis has been conducted based on 20 previous works covering 4,040 firms in 16 countries across Asia, Europe and America. Through this meta-analysis, the paper reviews the relationships between TQM and business performance across the regions. The meta-analysis concludes that the average rc is 0.47: Asia (rc = 0.54), America (rc = 0.43) and Europe (rc = 0.38). The analysis also shows that developed countries in Asia exhibit the greatest impact of TQM (rc = 0.56). However, ANOVA and t-test analyses show that there is no significant difference between country types (developed and developing countries) or among regions at p = 0.05. In addition, the average rc² is 0.24: Asia (rc² = 0.33), America (rc² = 0.22) and Europe (rc² = 0.15). Meanwhile, rc² in developing countries (rc² = 0.28) is higher than in developed countries (rc² = 0.21).
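
    The aggregation mechanics behind such a meta-analytic average can be sketched as a sample-size-weighted mean correlation in the Hunter-Schmidt style; the individual study correlations and sample sizes below are invented and do not reproduce the 20 works analysed.

```python
# Sample-size-weighted mean correlation (bare-bones Hunter-Schmidt style).
# The individual study correlations and sample sizes are invented; only the
# aggregation mechanics are illustrated.
studies = [  # (correlation between TQM and business performance, sample size)
    (0.54, 310), (0.43, 220), (0.38, 180), (0.51, 400), (0.47, 260),
]

total_n = sum(n for _, n in studies)
r_bar = sum(r * n for r, n in studies) / total_n
print(f"weighted mean correlation r = {r_bar:.3f}")
print(f"mean variance explained r^2 = {r_bar ** 2:.3f}")
```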

  1. Performance analysis of Supply Chain Management with Supply Chain Operation reference model

    Science.gov (United States)

    Hasibuan, Abdurrozzaq; Arfah, Mahrani; Parinduri, Luthfi; Hernawati, Tri; Suliawati; Harahap, Bonar; Rahmah Sibuea, Siti; Krianto Sulaiman, Oris; purwadi, Adi

    2018-04-01

    This research was conducted at PT. Shamrock Manufacturing Corpora, where the company is required to think creatively in implementing its competitive strategy by producing goods/services that are of higher quality and cheaper. It is therefore necessary to measure the performance of Supply Chain Management in order to improve competitiveness, and the company is required to optimize its production output to meet export quality standards. This research begins with the creation of initial dimensions based on the Supply Chain Management processes, i.e., Plan, Source, Make, Delivery, and Return, with a hierarchy based on the Supply Chain Operation Reference attributes of Reliability, Responsiveness, Agility, Cost, and Asset. Key Performance Indicator identification provides the benchmark for performance measurement, whereas Snorm De Boer normalization serves to equalize the Key Performance Indicator values. The Analytical Hierarchy Process is applied to assist in determining the priority criteria. Measurement of Supply Chain Management performance at PT. Shamrock Manufacturing Corpora shows that SC Responsiveness (0.649) has a higher weight (priority) than the other alternatives. The performance analysis using the Supply Chain Operation Reference model indicates that Supply Chain Management performance at PT. Shamrock Manufacturing Corpora is good, since the monitored score lies in the 50–100 range, which is classified as good.
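
    The Snorm De Boer normalization mentioned above maps each KPI onto a 0–100 scale before weights (here taken from an AHP-style prioritization) are applied; the sketch below uses invented KPI values, ranges and weights, with the larger-is-better form (actual − min)/(max − min) × 100 and its smaller-is-better mirror.

```python
# Snorm De Boer normalization of KPIs onto a 0-100 scale, followed by a
# weighted aggregation with hypothetical AHP-style weights. All numbers invented.
def snorm_larger_is_better(actual, s_min, s_max):
    return (actual - s_min) / (s_max - s_min) * 100.0

def snorm_smaller_is_better(actual, s_min, s_max):
    return (s_max - actual) / (s_max - s_min) * 100.0

kpis = {
    # name: (actual, min, max, weight, larger_is_better)
    "on-time delivery %":        (92.0, 75.0, 100.0, 0.35, True),
    "order fulfilment lead time": (6.0,  3.0,  12.0, 0.30, False),
    "production defect rate %":   (1.8,  0.5,   5.0, 0.20, False),
    "inventory turnover":         (8.0,  4.0,  15.0, 0.15, True),
}

score = 0.0
for name, (actual, lo, hi, weight, larger) in kpis.items():
    norm = (snorm_larger_is_better if larger else snorm_smaller_is_better)(actual, lo, hi)
    score += weight * norm
    print(f"{name:28s} normalized = {norm:5.1f}")

# Following the convention cited in the abstract, a score in the 50-100 range
# is read as "good".
print(f"aggregate SCM performance score = {score:.1f}")
```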

  2. Comprehensive assessment of firm financial performance using financial ratios and linguistic analysis of annual reports

    OpenAIRE

    Renáta Myšková; Petr Hájek

    2017-01-01

    Indicators of financial performance, especially financial ratio analysis, have become important financial decision-support information used by firm management and other stakeholders to assess financial stability and growth potential. However, additional information may be hidden in management communication. The article deals with the analysis of the annual reports of U.S. firms from both points of view, a financial one based on a set of financial ratios, and a linguistic one based on the anal...

  3. FINANCIAL PERFORMANCE ANALYSIS BASED ON THE FINANCIAL STATEMENTS FOR THE COMPANIES LOCATED IN THE BIHOR - HAJDU BIHAR EUROREGION

    Directory of Open Access Journals (Sweden)

    Droj Laurentiu

    2012-12-01

    Full Text Available This paper will be later used within the Doctoral thesis: “The Mechanism of Financing Investment Projects by Usage of European Structural Funds”, which is currently under development at the University Babeș Bolyai Cluj Napoca, Faculty of Economics and Business Management, under the coordination of the prof. univ. dr. Ioan Trenca. This paper comes also as a result of the European Funded project PERINPRO “Cross-Border Research Programme - Performance Indicators of the Economic Entities from Bihor-Hajdu Bihar Euroregion”. The goal of the project was to identify of a set of common indicators that characterizes companies in the Bihor-Hajdu Bihar Euroregion and which will be used to analyze the financial health of the economic entities in the Euroregion of Hajdu-Bihar- Bihor. The first chapter of the paper will introduce the research and also will present the literature review and the methodological framework: by establishing a common set of indicators for the financial analysis of the companies located in the Bihor-Hajdu Bihar Euroregion. Seven of these indicators considered to be highly important will also briefly described and defined. Some of these indicators are used for the first time in a trans-national analysis over companies located in the Romanian-Hungarian cross border area. In the second chapter the research will be focused over establishing a common ground for usage of the financial reporting documents as basis for the analysis. Several characteristics which differentiate the financial reporting documents from Romania and Hungary will be identified and measures for correction of the values of the indicators will be proposed. This comparative study can be considered an innovation, as well, in the cross-border area since in the past no other studies of this types were performed between Romania and Hungary. The third chapter will be focused over the application of seven identified common indicators to companies based

  4. Model Performance Evaluation and Scenario Analysis (MPESA)

    Science.gov (United States)

    Model Performance Evaluation and Scenario Analysis (MPESA) assesses the performance with which models predict time series data. The tool was developed for the Hydrological Simulation Program-Fortran (HSPF) and the Stormwater Management Model (SWMM)

  5. Analysis of excess reactivity of JOYO MK-III performance test core

    International Nuclear Information System (INIS)

    Maeda, Shigetaka; Yokoyama, Kenji

    2003-10-01

    JOYO is currently being upgraded to the high-performance irradiation bed JOYO MK-III core. The MK-III core is divided into two fuel regions with different plutonium contents. To obtain a higher neutron flux, the active core height was reduced from 55 cm to 50 cm. The reflector subassemblies were replaced by shielding subassemblies in the outer two rows. Twenty of the MK-III outer core fuel subassemblies in the performance test core were partially burned in the transition core. Four irradiation test rigs, which do not contain any fuel material, were loaded in the center of the performance test core. In order to evaluate the excess reactivity of the MK-III performance test core accurately, we evaluated it by applying not only the JOYO MK-II core management code system MAGI, but also the MK-III core management code system HESTIA, the JUPITER standard analysis method and the Monte Carlo method with the JFS-3-J3.2R content set. The excess reactivity evaluations obtained by the JUPITER standard analysis method were corrected to results based on transport theory with zero mesh-size in space and angle. A bias factor based on the MK-II 35th core, whose sensitivity was similar to that of the MK-III performance test core, was also applied, except in the case where an adjusted nuclear cross-section library was used. Exact three-dimensional, pin-by-pin geometry and continuous-energy cross sections were used in the Monte Carlo calculation. The estimated error components associated with cross-sections, method correction factors and the bias factor were combined based on Takeda's theory. The independently calculated values agree well and range from 2.8 to 3.4%Δk/kk'. The calculation result of the MK-III core management code system HESTIA was 3.13%Δk/kk'. The estimated errors for the bias method range from 0.1 to 0.2%Δk/kk'. The error in the case using the adjusted cross-sections was 0.3%Δk/kk'. (author)

  6. Similarity-based pattern analysis and recognition

    CERN Document Server

    Pelillo, Marcello

    2013-01-01

    This accessible text/reference presents a coherent overview of the emerging field of non-Euclidean similarity learning. The book presents a broad range of perspectives on similarity-based pattern analysis and recognition methods, from purely theoretical challenges to practical, real-world applications. The coverage includes both supervised and unsupervised learning paradigms, as well as generative and discriminative models. Topics and features: explores the origination and causes of non-Euclidean (dis)similarity measures, and how they influence the performance of traditional classification alg

  7. Performance-based inspection and maintenance strategies

    International Nuclear Information System (INIS)

    Vesely, W.E.

    1995-01-01

    Performance-based inspection and maintenance strategies utilize measures of equipment performance to help guide inspection and maintenance activities. A relevant measure of performance for safety system components is component unavailability. The component unavailability can also be input into a plant risk model such as a Probabilistic Risk Assessment (PRA) to determine the associated plant risk performance. Based on the present and projected unavailability performance, or the present and projected risk performance, the effectiveness of current maintenance activities can be evaluated and this information can be used to plan future maintenance activities. A significant amount of information other than downtimes or failure times is collected, or can be collected, when an inspection or maintenance is conducted, and this information can be used to estimate the component unavailability. This information generally involves observations on the condition or state of the component or component piecepart. The information can be detailed, such as the amount of corrosion buildup, or general, such as the overall state of the component described as 'high degradation', 'moderate degradation', or 'low degradation'. Much of the information collected in maintenance logs is qualitative and fuzzy. As part of an NRC research program on performance-based engineering modeling, approaches have been developed to apply Fuzzy Set Theory to information collected on the state of the component to determine the implied component or component piecepart unavailability. Demonstrations of the applications of Fuzzy Set Theory are presented utilizing information from plant maintenance logs. The demonstrations show the power of Fuzzy Set Theory in translating engineering information to reliability and risk implications.

  8. Statistical performance and information content of time lag analysis and redundancy analysis in time series modeling.

    Science.gov (United States)

    Angeler, David G; Viedma, Olga; Moreno, José M

    2009-11-01

    Time lag analysis (TLA) is a distance-based approach used to study temporal dynamics of ecological communities by measuring community dissimilarity over increasing time lags. Despite its increased use in recent years, its performance in comparison with other more direct methods (i.e., canonical ordination) has not been evaluated. This study fills this gap using extensive simulations and real data sets from experimental temporary ponds (true zooplankton communities) and landscape studies (landscape categories as pseudo-communities) that differ in community structure and anthropogenic stress history. Modeling time with a principal coordinate of neighborhood matrices (PCNM) approach, the canonical ordination technique (redundancy analysis; RDA) consistently outperformed the other statistical tests (i.e., TLAs, Mantel test, and RDA based on linear time trends) using all real data. In addition, the RDA-PCNM revealed different patterns of temporal change, and the strength of each individual time pattern, in terms of adjusted variance explained, could be evaluated. It also identified species contributions to these patterns of temporal change. This additional information is not provided by distance-based methods. The simulation study revealed better Type I error properties of the canonical ordination techniques compared with the distance-based approaches when no deterministic component of change was imposed on the communities. The simulation also revealed that strong emphasis on uniform deterministic change and low variability at other temporal scales is needed to result in decreased statistical power of the RDA-PCNM approach relative to the other methods. Based on the statistical performance of and information content provided by RDA-PCNM models, this technique serves ecologists as a powerful tool for modeling temporal change of ecological (pseudo-) communities.
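
    As a minimal illustration of the time lag analysis idea summarized above, the sketch below regresses Bray-Curtis community dissimilarity on the square root of the time lag for a community-by-time abundance matrix. The dissimilarity metric, the square-root transformation of the lag and the toy data are common conventions assumed here for illustration, not details taken from this study.

      import numpy as np

      def time_lag_analysis(abundance):
          """Time lag analysis (TLA) sketch.

          abundance: 2-D array, rows = sampling dates in temporal order,
          columns = species. Returns slope and intercept of community
          dissimilarity regressed on the square root of the time lag."""
          n = abundance.shape[0]
          lags, dissims = [], []
          for lag in range(1, n):
              for t in range(n - lag):
                  a, b = abundance[t], abundance[t + lag]
                  denom = (a + b).sum()
                  bray_curtis = np.abs(a - b).sum() / denom if denom > 0 else 0.0
                  lags.append(np.sqrt(lag))
                  dissims.append(bray_curtis)
          slope, intercept = np.polyfit(lags, dissims, 1)   # ordinary least-squares fit
          return slope, intercept

      # Toy example: 10 sampling dates, 5 species with a weak directional trend.
      rng = np.random.default_rng(0)
      data = rng.poisson(5, size=(10, 5)) + np.arange(10)[:, None]
      print(time_lag_analysis(data))

    A positive, significant slope is usually read as directional community change, while a flat slope suggests stochastic fluctuation around a stable mean.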

  9. High Performance Processing and Analysis of Geospatial Data Using CUDA on GPU

    Directory of Open Access Journals (Sweden)

    STOJANOVIC, N.

    2014-11-01

    Full Text Available In this paper, the high-performance processing of massive geospatial data on a many-core GPU (Graphics Processing Unit) is presented. We use the CUDA (Compute Unified Device Architecture) programming framework to implement parallel processing of common Geographic Information Systems (GIS) algorithms, such as viewshed analysis and map-matching. Experimental evaluation indicates the improvement in performance with respect to CPU-based solutions and shows the feasibility of using GPUs and CUDA for parallel implementation of GIS algorithms over large-scale geospatial datasets.
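
    To make the viewshed computation mentioned above concrete, the following is a toy, CPU-side sketch of the line-of-sight test that a GPU implementation would run in parallel over many rays and cells; the terrain profile and observer height are made-up illustrative values.

      import numpy as np

      def visible_along_ray(elevations, observer_height=1.7):
          """Line-of-sight viewshed test along one ray of terrain samples.

          elevations[0] is the observer's cell; cell i is visible when its
          elevation angle from the observer exceeds every angle seen so far."""
          eye = elevations[0] + observer_height
          visible = np.zeros(len(elevations), dtype=bool)
          visible[0] = True
          max_angle = -np.inf
          for i in range(1, len(elevations)):
              angle = (elevations[i] - eye) / i      # slope towards cell i
              if angle > max_angle:
                  visible[i] = True
                  max_angle = angle
          return visible

      terrain = np.array([100.0, 102, 101, 105, 103, 104, 110, 108])
      print(visible_along_ray(terrain))

    On a GPU, each ray (or each target cell) would typically be assigned to its own thread, which is what lets the analysis scale to large rasters.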

  10. Performance analysis of large-scale applications based on wavefront algorithms

    International Nuclear Information System (INIS)

    Hoisie, A.; Lubeck, O.; Wasserman, H.

    1998-01-01

    The authors introduced a performance model for parallel, multidimensional, wavefront calculations with machine performance characterized using the LogGP framework. The model accounts for overlap in the communication and computation components. The agreement with experimental data is very good under a variety of model sizes, data partitioning, blocking strategies, and on three different parallel architectures. Using the model, the authors analyzed performance of a deterministic transport code on a hypothetical 100 Tflops future parallel system of interest to ASCI
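
    As a rough illustration of how such wavefront models are structured (this simplified pipeline-fill form is a generic approximation, not the LogGP-based model of the paper): for a sweep over a P_x x P_y logical processor array with N_steps work units per processor, per-step computation time t_c and per-step communication time t_m,

      T_{\mathrm{sweep}} \;\approx\; \left(N_{\mathrm{steps}} + P_x + P_y - 2\right)\left(t_c + t_m\right),

    where the P_x + P_y - 2 term is the pipeline fill/drain delay before the wavefront reaches the last processor; in a LogGP-style model the communication term t_m is expressed through the latency, overhead and gap parameters.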

  11. Performance Analysis of Wavelet Channel Coding in COST207-based Channel Models on Simulated Radio-over-Fiber Systems at the W-Band

    DEFF Research Database (Denmark)

    Cavalcante, Lucas Costa Pereira; Silveira, Luiz F. Q.; Rommel, Simon

    2016-01-01

    Millimeter wave communications based on photonic technologies have gained increased attention to provide optic fiber-like capacity in wireless environments. However, the new hybrid fiber-wireless channel represents new challenges in terms of signal transmission performance analysis. Traditionally, such systems use diversity schemes in combination with digital signal processing (DSP) techniques to overcome effects such as fading and inter-symbol interference (ISI). Wavelet Channel Coding (WCC) has emerged as a technique to minimize the fading effects of wireless channels, which is a major challenge in systems operating in the millimeter wave regime. This work takes WCC one step further by evaluating its performance in terms of bit error probability over time-varying, frequency-selective multipath Rayleigh fading channels. The adopted propagation model follows the COST207 norm, the main international...

  12. Nuclear power company activity based costing management analysis

    International Nuclear Information System (INIS)

    Xu Dan

    2012-01-01

    As the nuclear energy industry develops, nuclear power companies face continual pressure to improve internal management in support of sustainable commercial operation. It is therefore imperative that nuclear power companies raise their cost management capability and build a low-cost competitive advantage grounded in nuclear safety. Activity based costing management (ABCM) shifts the emphasis of cost management from the 'product' to the 'activity', using value chain analysis, cost driver analysis and related methods. By analyzing detailed activities and value chains, unnecessary activities can be eliminated, the resource consumption of necessary activities can be reduced, and costs can be managed at their source, thereby reducing cost, boosting efficiency and realizing management value. The conclusions are drawn from a detailed analysis of nuclear power company procedures and activities, together with a 'pieces analysis' of selected important cost-related projects in the nuclear power company. The conclusion is that the activities of a nuclear power company exhibit clear characteristics that make them amenable to management by the ABC method, and that managing procedures and activities in this way helps realize a nuclear safety based low-cost competitive advantage in the nuclear power company. (author)

  13. Improved metamodel-based importance sampling for the performance assessment of radioactive waste repositories

    International Nuclear Information System (INIS)

    Cadini, F.; Gioletta, A.; Zio, E.

    2015-01-01

    In the context of a probabilistic performance assessment of a radioactive waste repository, the estimation of the probability of exceeding the dose threshold set by a regulatory body is a fundamental task. This may become difficult when the probabilities involved are very small, since the classically used sampling-based Monte Carlo methods may become computationally impractical. This issue is further complicated by the fact that the computer codes typically adopted in this context require large computational efforts, both in terms of time and memory. This work proposes an original use of a Monte Carlo-based algorithm for (small) failure probability estimation in the context of the performance assessment of a near surface radioactive waste repository. The algorithm, developed within the context of structural reliability, makes use of an estimated optimal importance density and a surrogate, kriging-based metamodel approximating the system response. On the basis of an accurate analytic analysis of the algorithm, a modification is proposed which allows further reducing the computational effort by a more effective training of the metamodel. - Highlights: • We tackle uncertainty propagation in a radwaste repository performance assessment. • We improve a kriging-based importance sampling for estimating failure probabilities. • We justify the modification by an analytic, comparative analysis of the algorithms. • The probability of exceeding dose thresholds in radwaste repositories is estimated. • The algorithm is further improved reducing the number of its free parameters
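
    A toy sketch of the plain importance sampling estimator that such algorithms refine is given below; the limit-state function and the hand-picked Gaussian importance density are hypothetical stand-ins for the kriging metamodel and the estimated optimal density discussed in the paper.

      import numpy as np

      rng = np.random.default_rng(1)

      def limit_state(x):
          """Hypothetical performance function: failure when g(x) <= 0."""
          return 6.0 - x[:, 0] - x[:, 1]

      # Nominal input density: independent standard normals.
      # Importance density: unit-variance normals shifted towards the failure region.
      shift = np.array([2.5, 2.5])
      n = 20_000
      x = rng.normal(loc=shift, scale=1.0, size=(n, 2))

      # Weights w = f(x)/q(x) for the two unit-variance Gaussian densities.
      log_w = -0.5 * (x ** 2).sum(axis=1) + 0.5 * ((x - shift) ** 2).sum(axis=1)
      w = np.exp(log_w)

      fail = limit_state(x) <= 0.0
      p_fail = np.mean(fail * w)
      std_err = np.std(fail * w, ddof=1) / np.sqrt(n)
      print(f"estimated failure probability: {p_fail:.3e} +/- {std_err:.1e}")

    In the metamodel-based variant, limit_state would be replaced by a kriging surrogate trained on a small number of expensive code runs, which is where the computational saving comes from.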

  14. FIRE SAFETY IN NUCLEAR POWER PLANTS: A RISK-INFORMED AND PERFORMANCE-BASED APPROACH

    International Nuclear Information System (INIS)

    AZARM, M.A.; TRAVIS, R.J.

    1999-01-01

    The consideration of risk in regulatory decision-making has long been a part of NRC's policy and practice. Initially, these considerations were qualitative and were based on risk insights. The early regulations relied on good practices, past insights, and accepted standards. As a result, most NRC regulations were prescriptive and were applied uniformly to all areas within the regulatory scope. Risk technology is changing regulations by prioritizing the areas within regulatory scope based on risk, thereby focusing on the risk-important areas. Performance technology, on the other hand, is changing the regulations by allowing requirements to be adjusted based on the specific performance expected and manifested, rather than a prior prescriptive requirement. Consistent with the objectives of risk-informed and performance-based regulatory requirements, BNL evaluated the feasibility of applying risk- and performance-technologies to modifying NRC's current regulations on fire protection for nuclear power plants. This feasibility study entailed several case studies (trial applications). This paper describes the results of two of them. Besides the case studies, the paper discusses an overall evaluation of methodologies for fire-risk analysis to support the risk-informed regulation. It identifies some current shortcomings and proposes some near-term solutions

  15. Application of monitoring, diagnosis, and prognosis in thermal performance analysis for nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hyeong Min; Heo, Gyun Young [Kyung Hee University, Yongin (Korea, Republic of); Na, Man Gyun [Chosun University, Gwangju (Korea, Republic of)

    2014-12-15

    As condition-based maintenance (CBM) has risen as a new trend, there has been an active movement to apply information technology for effective implementation of CBM in power plants. This motivation is widespread in operations and maintenance, including monitoring, diagnosis, prognosis, and decision-making on asset management. Thermal efficiency analysis in nuclear power plants (NPPs) is a longstanding concern being updated with new methodologies in an advanced IT environment. It is also a prominent way to differentiate competitiveness in terms of operations and maintenance costs. Although thermal performance tests implemented using industrial codes and standards can provide officially trustworthy results, they are essentially resource-consuming and maybe even a hind-sighted technique rather than a foresighted one, considering their periodicity. Therefore, if more accurate performance monitoring can be achieved using advanced data analysis techniques, we can expect more optimized operations and maintenance. This paper proposes a framework and describes associated methodologies for in-situ thermal performance analysis, which differs from conventional performance monitoring. The methodologies are effective for monitoring, diagnosis, and prognosis in pursuit of CBM. Our enabling techniques cover the intelligent removal of random and systematic errors, deviation detection between a best condition and a currently measured condition, degradation diagnosis using a structured knowledge base, and prognosis for decision-making about maintenance tasks. We also discuss how our new methods can be incorporated with existing performance tests. We provide guidance and directions for developers and end-users interested in in-situ thermal performance management, particularly in NPPs with large steam turbines.

  16. Application of monitoring, diagnosis, and prognosis in thermal performance analysis for nuclear power plants

    International Nuclear Information System (INIS)

    Kim, Hyeong Min; Heo, Gyun Young; Na, Man Gyun

    2014-01-01

    As condition-based maintenance (CBM) has risen as a new trend, there has been an active movement to apply information technology for effective implementation of CBM in power plants. This motivation is widespread in operations and maintenance, including monitoring, diagnosis, prognosis, and decision-making on asset management. Thermal efficiency analysis in nuclear power plants (NPPs) is a longstanding concern being updated with new methodologies in an advanced IT environment. It is also a prominent way to differentiate competitiveness in terms of operations and maintenance costs. Although thermal performance tests implemented using industrial codes and standards can provide officially trustworthy results, they are essentially resource-consuming and maybe even a hind-sighted technique rather than a foresighted one, considering their periodicity. Therefore, if more accurate performance monitoring can be achieved using advanced data analysis techniques, we can expect more optimized operations and maintenance. This paper proposes a framework and describes associated methodologies for in-situ thermal performance analysis, which differs from conventional performance monitoring. The methodologies are effective for monitoring, diagnosis, and prognosis in pursuit of CBM. Our enabling techniques cover the intelligent removal of random and systematic errors, deviation detection between a best condition and a currently measured condition, degradation diagnosis using a structured knowledge base, and prognosis for decision-making about maintenance tasks. We also discuss how our new methods can be incorporated with existing performance tests. We provide guidance and directions for developers and end-users interested in in-situ thermal performance management, particularly in NPPs with large steam turbines.

  17. Performance Comparison of Grid-Faulty Control Schemes for Inverter-Based Industrial Microgrids

    Directory of Open Access Journals (Sweden)

    Antonio Camacho

    2017-12-01

    Full Text Available Several control schemes specifically designed to operate inverter-based industrial microgrids during voltage sags have been recently proposed. This paper first classifies these control schemes in three categories and then performs a comparative analysis of them. Representative control schemes of each category are selected, described and used to identify the main features and performance of the considered category. The comparison is based on the evaluation of several indexes, which measure the power quality of the installation and utility grid during voltage sags, including voltage regulation, reactive current injection and transient response. The paper includes selected simulation results from a 500 kVA industrial microgrid to validate the expected features of the considered control schemes. Finally, in view of the obtained results, the paper proposes an alternative solution to cope with voltage sags, which includes the use of a static compensator in parallel with the microgrid. The novelty of this proposal is the suitable selection of the control schemes for both the microgrid and the static compensator. The superior performance of the proposal is confirmed by the analysis of the quality indexes. Its practical limitations are also revealed, showing that the topic studied in this paper is still open for further research.

  18. Reliability analysis of digital based I and C system

    Energy Technology Data Exchange (ETDEWEB)

    Kang, I. S.; Cho, B. S.; Choi, M. J. [KOPEC, Yongin (Korea, Republic of)

    1999-10-01

    Digital technology is rapidly and widely being applied to replace analog components installed in existing plants and in the design of control and monitoring systems for new nuclear power plants, in Korea as well as abroad. Despite the many merits of digital technology, it faces a new problem of reliability assurance. Studies to solve this problem are being pursued vigorously in foreign countries. The reliability of the KNGR Engineered Safety Features Component Control System (ESF-CCS), a digital based I and C system, was analyzed to verify fulfillment of the ALWR EPRI-URD requirement for reliability analysis and to eliminate hazards in a design applying new technology. A qualitative analysis using FMEA and a quantitative analysis using reliability block diagrams were performed. The results of the analyses are shown in this paper.

  19. Ca analysis: An Excel based program for the analysis of intracellular calcium transients including multiple, simultaneous regression analysis☆

    Science.gov (United States)

    Greensmith, David J.

    2014-01-01

    Here I present an Excel based program for the analysis of intracellular Ca transients recorded using fluorescent indicators. The program can perform all the necessary steps which convert recorded raw voltage changes into meaningful physiological information. The program performs two fundamental processes. (1) It can prepare the raw signal by several methods. (2) It can then be used to analyze the prepared data to provide information such as absolute intracellular Ca levels. Also, the rates of change of Ca can be measured using multiple, simultaneous regression analysis. I demonstrate that this program performs as well as commercially available software, but has numerous advantages, namely creating a simplified, self-contained analysis workflow. PMID:24125908

  20. Rapid quantitative analysis of individual anthocyanin content based on high-performance liquid chromatography with diode array detection with the pH differential method.

    Science.gov (United States)

    Wang, Huayin

    2014-09-01

    A new quantitative technique for the simultaneous quantification of individual anthocyanins based on the pH differential method and high-performance liquid chromatography with diode array detection is proposed in this paper. The six individual anthocyanins (cyanidin 3-glucoside, cyanidin 3-rutinoside, petunidin 3-glucoside, petunidin 3-rutinoside, and malvidin 3-rutinoside) from mulberry (Morus rubra) and Liriope platyphylla were used for demonstration and validation. The elution of anthocyanins was performed using a C18 column with stepwise gradient elution, and individual anthocyanins were identified by high-performance liquid chromatography with tandem mass spectrometry. Based on the pH differential method, the high-performance liquid chromatography peak areas at the maximum and reference absorption wavelengths of the anthocyanin extracts were used to quantify the individual anthocyanins. The calibration curves for these anthocyanins were linear within the range of 10-5500 mg/L. The correlation coefficients (r²) all exceeded 0.9972, and the limits of detection were in the range of 1-4 mg/L at a signal-to-noise ratio ≥5 for these anthocyanins. The proposed quantitative analysis was reproducible, with good accuracy for all individual anthocyanins ranging from 96.3 to 104.2% and relative recoveries in the range 98.4-103.2%. The proposed technique is performed without anthocyanin standards and is a simple, rapid, accurate, and economical method to determine individual anthocyanin contents. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
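
    For reference, the pH differential relations that the peak-area quantification builds on are, in the usual formulation (symbols are the generic ones of the method, not notation taken from the paper):

      A = \left(A_{\lambda_{\max}} - A_{700}\right)_{\mathrm{pH}\,1.0} - \left(A_{\lambda_{\max}} - A_{700}\right)_{\mathrm{pH}\,4.5},
      \qquad
      c\;(\mathrm{mg/L}) = \frac{A \cdot MW \cdot DF \cdot 1000}{\varepsilon \cdot l},

    where MW is the molecular weight of the reference anthocyanin, DF the dilution factor, ε the molar absorptivity and l the path length; the HPLC peak areas measured at the maximum and reference wavelengths are then used to apportion this total among the individual anthocyanins.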

  1. Multiple Criteria and Multiple Periods Performance Analysis: The Comparison of North African Railways

    Science.gov (United States)

    Sabri, Karim; Colson, Gérard E.; Mbangala, Augustin M.

    2008-10-01

    Multi-period differences of technical and financial performances are analysed by comparing five North African railways over the period 1990-2004. A first approach is based on the Malmquist DEA TFP index for measuring the total factor productivity change, decomposed into technical efficiency change and technological change. A multiple criteria analysis is also performed using the PROMETHEE II method and the software ARGOS. These methods provide complementary detailed information, especially by discriminating the technological and management progress (Malmquist) and the two dimensions of performance (PROMETHEE): the service to the community and the enterprise performance, which are often in conflict.
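
    For context, the output-oriented Malmquist TFP index between periods t and t+1 and its standard decomposition (the generic Färe et al. form, which may differ in detail from the software used in the study) read:

      M\left(x^{t+1}, y^{t+1}, x^{t}, y^{t}\right)
      = \left[\frac{D^{t}\!\left(x^{t+1}, y^{t+1}\right)}{D^{t}\!\left(x^{t}, y^{t}\right)}
        \cdot \frac{D^{t+1}\!\left(x^{t+1}, y^{t+1}\right)}{D^{t+1}\!\left(x^{t}, y^{t}\right)}\right]^{1/2}
      = \underbrace{\frac{D^{t+1}\!\left(x^{t+1}, y^{t+1}\right)}{D^{t}\!\left(x^{t}, y^{t}\right)}}_{\text{efficiency change}}
        \cdot
        \underbrace{\left[\frac{D^{t}\!\left(x^{t+1}, y^{t+1}\right)}{D^{t+1}\!\left(x^{t+1}, y^{t+1}\right)}
        \cdot \frac{D^{t}\!\left(x^{t}, y^{t}\right)}{D^{t+1}\!\left(x^{t}, y^{t}\right)}\right]^{1/2}}_{\text{technological change}}

    where D^t denotes the distance function evaluated against the period-t frontier; values of M greater than one indicate total factor productivity growth.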

  2. Paramedir: A Tool for Programmable Performance Analysis

    Science.gov (United States)

    Jost, Gabriele; Labarta, Jesus; Gimenez, Judit

    2004-01-01

    Performance analysis of parallel scientific applications is time consuming and requires great expertise in areas such as programming paradigms, system software, and computer hardware architectures. In this paper we describe a tool that facilitates the programmability of performance metric calculations thereby allowing the automation of the analysis and reducing the application development time. We demonstrate how the system can be used to capture knowledge and intuition acquired by advanced parallel programmers in order to be transferred to novice users.

  3. Techniques for Automated Performance Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Marcus, Ryan C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-09-02

    The performance of a particular HPC code depends on a multitude of variables, including compiler selection, optimization flags, OpenMP pool size, file system load, memory usage, MPI configuration, etc. As a result of this complexity, current predictive models have limited applicability, especially at scale. We present a formulation of scientific codes, nodes, and clusters that reduces complex performance analysis to well-known mathematical techniques. Building accurate predictive models and enhancing our understanding of scientific codes at scale is an important step towards exascale computing.

  4. Performance analysis of a Principal Component Analysis ensemble classifier for Emotiv headset P300 spellers.

    Science.gov (United States)

    Elsawy, Amr S; Eldawlatly, Seif; Taher, Mohamed; Aly, Gamal M

    2014-01-01

    The current trend to use Brain-Computer Interfaces (BCIs) with mobile devices mandates the development of efficient EEG data processing methods. In this paper, we demonstrate the performance of a Principal Component Analysis (PCA) ensemble classifier for P300-based spellers. We recorded EEG data from multiple subjects using the Emotiv neuroheadset in the context of a classical oddball P300 speller paradigm. We compare the performance of the proposed ensemble classifier to the performance of traditional feature extraction and classifier methods. Our results demonstrate the capability of the PCA ensemble classifier to classify P300 data recorded using the Emotiv neuroheadset with an average accuracy of 86.29% on cross-validation data. In addition, offline testing of the recorded data reveals an average classification accuracy of 73.3% that is significantly higher than that achieved using traditional methods. Finally, we demonstrate the effect of the parameters of the P300 speller paradigm on the performance of the method.
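
    The sketch below shows one plausible reading of a PCA ensemble classifier for P300 epochs: each member is a PCA-plus-linear-classifier pipeline trained on a bootstrap resample, and the members' target probabilities are averaged. The feature layout, classifier choice and synthetic data are illustrative assumptions, not the authors' exact pipeline.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.linear_model import LogisticRegression
      from sklearn.pipeline import make_pipeline

      rng = np.random.default_rng(2)

      # Hypothetical epochs: 600 trials x (14 channels * 64 time samples),
      # label 1 = target stimulus (P300 present), 0 = non-target.
      X = rng.normal(size=(600, 14 * 64))
      y = rng.integers(0, 2, size=600)
      X[y == 1, :200] += 0.3               # crude synthetic "P300" deflection

      def train_pca_ensemble(X, y, n_members=5, n_components=20):
          members = []
          for _ in range(n_members):
              idx = rng.choice(len(X), size=len(X), replace=True)   # bootstrap resample
              clf = make_pipeline(PCA(n_components=n_components),
                                  LogisticRegression(max_iter=1000))
              clf.fit(X[idx], y[idx])
              members.append(clf)
          return members

      def predict_ensemble(members, X):
          # Average the predicted target probabilities over the ensemble members.
          probs = np.mean([m.predict_proba(X)[:, 1] for m in members], axis=0)
          return (probs > 0.5).astype(int)

      members = train_pca_ensemble(X[:500], y[:500])
      accuracy = np.mean(predict_ensemble(members, X[500:]) == y[500:])
      print(f"hold-out accuracy: {accuracy:.2f}")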

  5. Investment Performance of PT. Gresik Migas Based on Enterpreneur

    Directory of Open Access Journals (Sweden)

    Abdul Hamid

    2014-12-01

    Full Text Available To know a company's ability to manage the capital invested by investors, its financial performance needs to be measured. This also applies to companies in the regional or local government sphere (Regional Owned Enterprises). The focus of this study is therefore: (1) the performance-based profile of the regionally owned PT Gresik Migas; (2) the entrepreneur-based performance improvement strategy of PT Gresik Migas within the scope of the Provincial Government of East Java. The results show that: (1) the performance of PT Gresik Migas in East Java province, measured by the conventional method (ratio analysis), indicates good results; (2) there are four strategies to improve the performance of the enterprise, namely: (a) the capability of the enterprise's human resource managers, including strengthening the entrepreneurial spirit; (b) clarity and firmness of the legal basis for establishing the rules of the enterprise; (c) the financial management aspects of the public enterprise; and (d) the feasibility and sustainability of the business or business units, in both the product and service sectors, measured on internal and external performance. To improve the performance of PT Gresik Migas in East Java province, in addition to implementing the four strategies that have been set, several suggestions are made: (1) the local government should have the courage and firmness to minimize the forms, practices and patterns that raise political costs, prepare clear SOPs for enterprise resource recruitment, consistently encourage more independent and professional enterprises without intervention, and reward managers who succeed in bringing enterprises to go public; (2) the management of the public enterprise should create a more conducive working environment, remain oriented towards its tasks and the future, and foster leadership and managers of

  6. Theoretical analysis of ejector refrigeration system performance under overall modes

    International Nuclear Information System (INIS)

    Chen, Weixiong; Shi, Chaoyin; Zhang, Shuangping; Chen, Huiqiang; Chong, Daotong; Yan, Junjie

    2017-01-01

    Highlights: • A real gas theoretical model is used to obtain ejector performance at critical/sub-critical modes. • The model has better accuracy against the experimental results compared to the ideal gas model. • The overall performances of two refrigerants are analyzed based on the parameter analysis. - Abstract: Ejector refrigeration integrated in the air-conditioning system is a promising technology, because it can be driven by low grade energy. In the present study, a theoretical calculation based on real gas properties is put forward to estimate the ejector refrigeration system performance under overall modes (critical/sub-critical modes). Experimental data from the literature are applied to validate the proposed model. The findings show that the proposed model has higher accuracy compared to the model using the ideal gas law, especially when the ejector operates at sub-critical mode. Then, the performances of the ejector refrigeration cycle using different refrigerants are analyzed. R290 and R134a are selected as typical refrigerants by considering the aspects of COP, environmental impact, safety and economy. Finally, the ejector refrigeration performance is investigated under variable operation conditions with R290 and R134a as refrigerants. The results show that the R290 ejector cycle has higher COP under critical mode and can operate at low evaporator temperature. However, its performance decreases rapidly at high condenser temperature. The performance of the R134a ejector cycle is the opposite, with relatively lower COP overall but higher COP at high condenser temperature compared to R290.
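
    In cycle analyses of this kind the key quantities are usually the entrainment ratio and the COP; the generic textbook relations below (pump work neglected) indicate their form, though the paper's real-gas model evaluates the enthalpies with real-fluid properties:

      \omega = \frac{\dot m_{\mathrm{evap}}}{\dot m_{\mathrm{gen}}},
      \qquad
      \mathrm{COP} = \frac{\dot Q_{\mathrm{evap}}}{\dot Q_{\mathrm{gen}}}
      \approx \omega \, \frac{h_{\mathrm{evap,out}} - h_{\mathrm{evap,in}}}{h_{\mathrm{gen,out}} - h_{\mathrm{gen,in}}},

    so the sharp COP drop at sub-critical operation follows directly from the decrease of the entrainment ratio once the secondary flow is no longer choked.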

  7. Entransy analysis on the thermal performance of flat plate solar air collectors

    Institute of Scientific and Technical Information of China (English)

    Jie Deng; Xudong Yang; Yupeng Xu; Ming Yang

    2017-01-01

    Based on the thermo-electric analogy (the so-called thermal entransy analysis), the unified air-side convective heat transfer coefficient for different sorts of flat plate solar air collectors (FPSACs) is identified in terms of collector aperture area. In addition, the collector thermodynamic characteristic matching coefficient is defined to depict how well the collector air-side heat transfer matches the total heat losses. It is found that the air-side convective heat transfer coefficient can be determined experimentally by the collector thermal performance test method, allowing the air-side thermal performances of FPSACs with different types of airflow structures to be compared. Moreover, the smaller the collector thermodynamic characteristic matching coefficient, the better the thermodynamic perfection of the FPSAC. The minimum limit value of the collector thermodynamic matching coefficient is close to zero, but it cannot vanish in practical engineering. A parameter sensitivity analysis on the total entransy dissipation and the entransy increment of a general FPSAC is also undertaken. The results indicate that the effective way to decrease total entransy dissipation and enhance the useful entransy increment is to improve the efficiency intercept of the FPSAC. This is consistent with the conclusions of conventional thermal analysis. However, the evaluation indices identified by the thermal entransy analysis cannot be obtained from thermal analysis alone.

  8. An Analysis of the Effect of Operations Management Practices on Performance

    Directory of Open Access Journals (Sweden)

    Elisa Battistoni; Andrea Bonacelli

    2013-09-01

    Full Text Available In this paper we investigate the possible relationships between some optimization techniques used in Operations Management and the performance of SMEs that operate in the manufacturing sector. A model based on the Structural Equation Modelling (SEM) approach is used to analyse a dataset of small and medium-sized Italian enterprises. The model is expressed by a system of simultaneous equations and is solved through regression analysis. Taking advantage of the contributions presented previously, we focus our research on the Italian economy, highlighting the importance of Operations Management practices, which are relevant drivers of these firms’ performances.

  9. Design-time performance analysis of component-based real-time systems

    NARCIS (Netherlands)

    Bondarev, E.

    2009-01-01

    In current real-time systems, performance metrics are one of the most challenging properties to specify, predict and measure. Performance properties depend on various factors, like environmental context, load profile, middleware, operating system, hardware platform and sharing of internal resources.

  10. Comparison of overhead line lightning performance based on two different tower geometries

    DEFF Research Database (Denmark)

    Ebdrup, Thomas; Olason, Daniel; Bak, Claus Leth

    2014-01-01

    of the substation and transmission line is of great importance as it is a part of the 400 kV backbone between Sweden, Norway, Germany and the offshore wind farms in Horns Rev, Denmark. The new Eagle pylon has been designed with the focus of minimizing the visual impact of overhead lines. A detailed lightning performance analysis of the existing Donau and the new Eagle pylon is therefore important in order to assess the risk of failure. The lightning strike analysis is based on the number of strikes expected to terminate on the line and an investigation of how many of these may be expected to cause ... better protected from direct stroke than the phase conductors on the Donau pylon. Furthermore, with respect to a backflash, the Eagle has a better performance than the Donau pylon. It is therefore concluded that the Eagle has a better lightning performance than the Donau.

  11. Error and Performance Analysis of MEMS-based Inertial Sensors with a Low-cost GPS Receiver

    Directory of Open Access Journals (Sweden)

    Yang Gao

    2008-03-01

    Full Text Available Global Navigation Satellite Systems (GNSS), such as the Global Positioning System (GPS), have been widely utilized and their applications are becoming popular, not only in military or commercial applications, but also for everyday life. Although GPS measurements are the essential information for currently developed land vehicle navigation systems (LVNS), GPS signals are often unavailable or unreliable due to signal blockages under certain environments such as urban canyons. This situation must be compensated in order to provide continuous navigation solutions. To overcome the problems of unavailability and unreliability using GPS and to be cost and size effective as well, Micro Electro Mechanical Systems (MEMS) based inertial sensor technology has been pushing for the development of low-cost integrated navigation systems for land vehicle navigation and guidance applications. This paper will analyze the characterization of MEMS based inertial sensors and the performance of an integrated system prototype of MEMS based inertial sensors, a low-cost GPS receiver and a digital compass. The influence of the stochastic variation of sensors will be assessed and modeled by two different methods, namely Gauss-Markov (GM) and AutoRegressive (AR) models, with GPS signal blockage of different lengths. Numerical results from kinematic testing have been used to assess the performance of different modeling schemes.
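
    The first-order Gauss-Markov model mentioned above is commonly written as a discrete AR(1) recursion; the sketch below simulates such a process for a slowly varying sensor bias, with an arbitrary correlation time and standard deviation chosen purely for illustration (not values identified in the paper).

      import numpy as np

      def simulate_gauss_markov(n, dt, tau, sigma, seed=0):
          """First-order Gauss-Markov process:
              b[k+1] = exp(-dt/tau) * b[k] + w[k],   w[k] ~ N(0, q),
          with q chosen so the steady-state standard deviation is sigma."""
          rng = np.random.default_rng(seed)
          phi = np.exp(-dt / tau)
          q = sigma ** 2 * (1.0 - phi ** 2)
          b = np.zeros(n)
          for k in range(n - 1):
              b[k + 1] = phi * b[k] + rng.normal(0.0, np.sqrt(q))
          return b

      # e.g. a gyro bias with a 300 s correlation time, sampled at 100 Hz
      bias = simulate_gauss_markov(n=60_000, dt=0.01, tau=300.0, sigma=0.05)
      print(bias[:5])

    A higher-order AR model generalizes this recursion by regressing on several past samples, which is the main difference between the two stochastic modeling schemes compared in the paper.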

  12. Benefits of Exergy-Based Analysis for Aerospace Engineering Applications—Part I

    Directory of Open Access Journals (Sweden)

    John H. Doty

    2009-01-01

    Full Text Available This paper compares the analysis of systems from two different perspectives: an energy-based focus and an exergy-based focus. A complex system was simply modeled as interacting thermodynamic systems to illustrate the differences in analysis methodologies and results. The energy-based analysis produced combinations of calculated states that are infeasible. On the other hand, the exergy-based analyses only allow feasible states. More importantly, the exergy-based analyses provide clearer insight into the combination of operating conditions for optimum system-level performance. The results strongly suggest changing the analysis/design paradigm used in aerospace engineering from energy-based to exergy-based. This methodology shift is even more critical in exploratory research and development where previous experience may not be available to provide guidance. Although the models used herein may appear simplistic, the message is very powerful and extensible to higher-fidelity models: the 1st Law is only a necessary condition for design, whereas the 1st and 2nd Laws provide the sufficiency condition.

  13. Choosing a heuristic and root node for edge ordering in BDD-based network reliability analysis

    International Nuclear Information System (INIS)

    Mo, Yuchang; Xing, Liudong; Zhong, Farong; Pan, Zhusheng; Chen, Zhongyu

    2014-01-01

    In the Binary Decision Diagram (BDD)-based network reliability analysis, heuristics have been widely used to obtain a reasonably good ordering of edge variables. Orderings generated using different heuristics can lead to dramatically different sizes of BDDs, and thus dramatically different running times and memory usages for the analysis of the same network. Unfortunately, due to the nature of the ordering problem (i.e., being an NP-complete problem) no formal guidelines or rules are available for choosing a good heuristic or for choosing a high-performance root node to perform edge searching using a particular heuristic. In this work, we make novel contributions by proposing heuristic and root node selection methods based on the concept of boundary sets for the BDD-based network reliability analysis. Empirical studies show that the proposed selection methods can help to generate high-performance edge ordering for most of studied cases, enabling the efficient BDD-based reliability analysis of large-scale networks. The proposed methods are demonstrated on different types of networks, including square lattice networks, torus lattice networks and de Bruijn networks
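
    As a concrete example of the kind of heuristic being chosen among, the sketch below orders edges by a breadth-first traversal from a chosen root node; this is only a generic baseline and does not reproduce the boundary-set-based selection rules proposed in the paper.

      from collections import deque

      def bfs_edge_ordering(adjacency, root):
          """Order network edges by breadth-first traversal from `root`.

          adjacency: dict mapping node -> iterable of neighbour nodes.
          Returns edges (as frozensets) in visiting order, usable as a
          BDD variable ordering."""
          order, seen_edges, seen_nodes = [], set(), {root}
          queue = deque([root])
          while queue:
              u = queue.popleft()
              for v in adjacency[u]:
                  e = frozenset((u, v))
                  if e not in seen_edges:
                      seen_edges.add(e)
                      order.append(e)
                  if v not in seen_nodes:
                      seen_nodes.add(v)
                      queue.append(v)
          return order

      net = {1: [2, 3], 2: [1, 3, 4], 3: [1, 2, 4], 4: [2, 3]}
      print(bfs_edge_ordering(net, root=1))

    Running the same heuristic from different root nodes generally yields different orderings, and hence different BDD sizes, which is exactly the root-node selection problem the paper addresses.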

  14. Balanced scorecard-based performance evaluation of Chinese county hospitals in underdeveloped areas.

    Science.gov (United States)

    Gao, Hongda; Chen, He; Feng, Jun; Qin, Xianjing; Wang, Xuan; Liang, Shenglin; Zhao, Jinmin; Feng, Qiming

    2018-05-01

    Objective Since the Guangxi government implemented public county hospital reform in 2009, there have been no studies of county hospitals in this underdeveloped area of China. This study aimed to establish an evaluation indicator system for Guangxi county hospitals and to generate recommendations for hospital development and policymaking. Methods A performance evaluation indicator system was developed based on balanced scorecard theory. Opinions were elicited from 25 experts from administrative units, universities and hospitals and the Delphi method was used to modify the performance indicators. The indicator system and the Topsis method were used to evaluate the performance of five county hospitals randomly selected from the same batch of 2015 Guangxi reform pilots. Results There were 4 first-level indicators, 9 second-level indicators and 36 third-level indicators in the final performance evaluation indicator system that showed good consistency, validity and reliability. The performance rank of the hospitals was B > E > A > C > D. Conclusions The performance evaluation indicator system established using the balanced scorecard is practical and scientific. Analysis of the results based on this indicator system identified several factors affecting hospital performance, such as resource utilisation efficiency, medical service price, personnel structure and doctor-patient relationships.
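
    A compact sketch of the Topsis ranking step used in the evaluation is given below, applied to a made-up decision matrix of hospitals by indicators; the weights, criteria and scores are illustrative only, not the study's data.

      import numpy as np

      def topsis(matrix, weights, benefit):
          """Rank alternatives with the TOPSIS method.

          matrix  : alternatives x criteria score matrix
          weights : criterion weights (summing to 1)
          benefit : True where larger is better, False for cost criteria"""
          m = np.asarray(matrix, dtype=float)
          v = m / np.linalg.norm(m, axis=0) * np.asarray(weights)   # normalize and weight
          ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
          anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
          d_plus = np.linalg.norm(v - ideal, axis=1)
          d_minus = np.linalg.norm(v - anti, axis=1)
          return d_minus / (d_plus + d_minus)    # closeness coefficient; higher = better

      # Hypothetical: three hospitals scored on three indicators.
      scores = [[82, 0.61, 4.1], [75, 0.70, 3.8], [90, 0.55, 4.5]]
      print(topsis(scores, weights=[0.5, 0.2, 0.3], benefit=[True, False, True]))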

  15. Business sustainability performance measurement: Eco-ratio analysis

    Directory of Open Access Journals (Sweden)

    Collins C. Ngwakwe

    2016-12-01

    Full Text Available Eco-aware customers and stakeholders are demanding a measurement that links environmental performance with other business operations. To bridge this seeming measurement gap, this paper suggests ‘Eco-Ratio Analysis’ and proposes an approach for conducting eco-ratio analysis. It is argued that, since accounting ratios function as a tool for evaluating corporate financial viability by management and investors, eco-ratio analysis should be brought to the fore to provide a succinct measurement of the linkage between environmental performance and conventional business performance. It is hoped that this suggestion will usher in a nuanced debate and approach in the teaching, research and practice of environmental management and sustainability accounting.

  16. Performance Based Islamic Performance Index (Study on the Bank Muamalat Indonesia and Bank Syariah Mandiri

    Directory of Open Access Journals (Sweden)

    Siti Aisjah

    2015-09-01

    Full Text Available The development of Islamic banks in Indonesia in recent years shows rapid growth. The main challenge for Islamic banks is how to build trust among their stakeholders. Stakeholder expectations of Islamic banks differ from those of conventional banks, since Islamic banks are built on the basic principles of Islamic economics. Therefore, a tool is needed to evaluate and measure the performance of Islamic banks. The Islamicity Performance Index is a method which can evaluate the performance of Islamic banks not only financially but also against the principles of justice, halal (lawfulness), and tazkiyah (sanctification). Six financial ratios are measured in the Islamicity Performance Index: profit sharing ratio, zakat performance ratio, equitable distribution ratio, directors-employees welfare ratio, Islamic investment versus non-Islamic investment ratio, and Islamic income versus non-Islamic income ratio. This research is intended to assess the performance of Islamic banks in Indonesia based on the Islamicity Performance Index. The samples are Bank Muamalat Indonesia and Bank Syariah Mandiri. The sources of data are the financial reports of Bank Muamalat Indonesia and Bank Syariah Mandiri for the 2009-2010 period. The results show that the financial performance of Islamic banks in Indonesia during the 2009-2010 period reached a 'quite satisfactory' level of valuation. However, there are two unsatisfactory ratios: the zakat performance ratio and the directors-employees welfare ratio. This shows that the zakat issued by Islamic banks in Indonesia is still low and that the contrast between director and employee welfare is still large.

  17. Performance Analysis of On-Demand Routing Protocols in Wireless Mesh Networks

    Directory of Open Access Journals (Sweden)

    Arafatur RAHMAN

    2009-01-01

    Full Text Available Wireless Mesh Networks (WMNs) have recently gained a lot of popularity due to their rapid deployment and instant communication capabilities. WMNs are dynamically self-organizing, self-configuring and self-healing, with the nodes in the network automatically establishing an ad hoc network and preserving the mesh connectivity. Designing a routing protocol for WMNs requires several aspects to be considered, such as wireless networks, fixed applications, mobile applications, scalability, better performance metrics, efficient routing within the infrastructure, load balancing, throughput enhancement, interference, robustness, etc. To support communication, various routing protocols have been designed for various networks (e.g. ad hoc, sensor, wired, etc.). However, all these protocols are not suitable for WMNs because of the architectural differences among the networks. In this paper, a detailed simulation-based performance study and analysis is performed on reactive routing protocols to verify their suitability for such networks. Ad Hoc On-Demand Distance Vector (AODV), Dynamic Source Routing (DSR) and Dynamic MANET On-demand (DYMO) routing protocols are considered as representatives of reactive routing protocols. The performance differentials are investigated using varying traffic load and number of sources. Based on the simulation results, recommendations are also made on how the performance of each protocol can be improved.

  18. Cage-based performance capture

    CERN Document Server

    Savoye, Yann

    2014-01-01

    Nowadays, highly-detailed animations of live-actor performances are increasingly easier to acquire, and 3D Video has attracted considerable attention in visual media production. In this book, we address the problem of extracting or acquiring and then reusing non-rigid parametrization for video-based animations. At first sight, a crucial challenge is to reproduce plausible boneless deformations while preserving global and local captured properties of dynamic surfaces with a limited number of controllable, flexible and reusable parameters. To solve this challenge, we directly rely on a skin-detached dimension reduction thanks to the well-known cage-based paradigm. First, we achieve Scalable Inverse Cage-based Modeling by transposing the inverse kinematics paradigm onto surfaces. Thus, we introduce a cage inversion process with user-specified screen-space constraints. Secondly, we convert non-rigid animated surfaces into a sequence of optimal cage parameters via Cage-based Animation Conversion. Building upon this re...

  19. Evaluating the Effect of Virtual Reality Temporal Bone Simulation on Mastoidectomy Performance: A Meta-analysis.

    Science.gov (United States)

    Lui, Justin T; Hoy, Monica Y

    2017-06-01

    Background The increasing prevalence of virtual reality simulation in temporal bone surgery warrants an investigation to assess training effectiveness. Objectives To determine if temporal bone simulator use improves mastoidectomy performance. Data Sources Ovid Medline, Embase, and PubMed databases were systematically searched per the PRISMA guidelines. Review Methods Inclusion criteria were peer-reviewed publications that utilized quantitative data of mastoidectomy performance following the use of a temporal bone simulator. The search was restricted to human studies published in English. Studies were excluded if they were in non-peer-reviewed format, were descriptive in nature, or failed to provide surgical performance outcomes. Meta-analysis calculations were then performed. Results A meta-analysis based on the random-effects model revealed an improvement in overall mastoidectomy performance following training on the temporal bone simulator. A standardized mean difference of 0.87 (95% CI, 0.38-1.35) was generated in the setting of a heterogeneous study population (I² = 64.3%). Conclusions Across the included virtual reality simulation temporal bone surgery studies, meta-analysis calculations demonstrate an improvement in trainee mastoidectomy performance with virtual simulation training.

  20. Analysis of performance for centrifugal steam compressor

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Seung Hwan; Ryu, Chang Kook; Ko, Han Seo [Sungkyunkwan University, Suwon (Korea, Republic of)

    2016-12-15

    In this study, mean streamline and computational fluid dynamics (CFD) analyses were performed to investigate the performance of a small centrifugal steam compressor using a latent heat recovery technology. The results from both analysis methods showed good agreement. The compression ratio and efficiency of steam were found to be related to those of air by comparing the compression performances of both gases. Thus, the compression performance of steam could be predicted from the compression performance of air using the developed dimensionless parameters.
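
    The dimensionless groups typically used for this kind of air-to-steam scaling are the corrected (similarity) parameters of compressible turbomachinery, for example (generic form; the specific parameters developed in the paper may differ):

      \frac{p_{02}}{p_{01}},\ \eta \;=\; f\!\left(\frac{\dot m \sqrt{R\,T_{01}}}{D^{2}\,p_{01}},\ \frac{N D}{\sqrt{\gamma R\,T_{01}}},\ Re,\ \gamma\right),

    so that, at matched values of the non-dimensional mass flow and blade Mach number, the pressure ratio and efficiency measured with air can be mapped to steam apart from Reynolds-number and gas-property (γ) corrections.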

  1. Analysis of performance for centrifugal steam compressor

    International Nuclear Information System (INIS)

    Kang, Seung Hwan; Ryu, Chang Kook; Ko, Han Seo

    2016-01-01

    In this study, mean streamline and computational fluid dynamics (CFD) analyses were performed to investigate the performance of a small centrifugal steam compressor using a latent heat recovery technology. The results from both analysis methods showed good agreement. The compression ratio and efficiency of steam were found to be related to those of air by comparing the compression performances of both gases. Thus, the compression performance of steam could be predicted from the compression performance of air using the developed dimensionless parameters

  2. Performance analysis in saber.

    Science.gov (United States)

    Aquili, Andrea; Tancredi, Virginia; Triossi, Tamara; De Sanctis, Desiree; Padua, Elvira; DʼArcangelo, Giovanna; Melchiorri, Giovanni

    2013-03-01

    Fencing is a sport practiced by both men and women, which uses 3 weapons: foil, épée, and saber. In general, there are few scientific studies available in international literature; they are limited to the performance analysis of fencing bouts, yet there is nothing about saber. There are 2 kinds of competitions in the World Cup for both men and women: the "FIE GP" and "A." The aim of this study was to carry out a saber performance analysis to gain useful indicators for the definition of a performance model. In addition, it is expected to verify if it could be influenced by the type of competition and if there are differences between men and women. Sixty bouts: 33 FIE GP and 27 "A" competitions (35 men's and 25 women's saber bouts) were analyzed. The results indicated that most actions are offensive (55% for men and 49% for women); the central area of the piste is mostly used (72% for men and 67% for women); the effective fighting time is 13.6% for men and 17.1% for women, and the ratio between the action and break times is 1:6.5 for men and 1:5.1 for women. A lunge is carried out every 23.9 seconds by men and every 20 seconds by women, and a direction change is carried out every 65.3 seconds by men and every 59.7 seconds by women. The data confirm the differences between the saber and the other 2 weapons. There is no significant difference between the data of the 2 different kinds of competitions.

  3. Measuring the performance of Islamic banks using maqāsid based model

    Directory of Open Access Journals (Sweden)

    Mustafa Omar Mohammed

    2015-12-01

    Full Text Available The vision and mission of Islamic banks were supposed to reflect the adherence of their activities and aspiration to Maqāṣid al-Sharī‘ah. However, there are contentions that Islamic banks have been converging towards the conventional banking system. Efforts have been expended to reverse the tide and harmonise Islamic banking with its Sharī‘ah objectives. Hitherto, the existing conventional yardsticks have failed to measure the impact of the harmonisation exercise on Islamic banks' performance. Therefore, using a maqāṣid based yardstick to measure the performance of Islamic banks becomes imperative. This study has made use of al-Imām al-Ghazālī's theory of Maqāṣid al-Sharī‘ah and Ibn ‘Āshūr's reinterpretation, adopting content analysis and Sekaran's (2000) behavioral science methods to develop a Maqāṣid Based Performance Evaluation Model (MPEM) to measure the performance of Islamic banks. Experts' opinions have validated the model and its acceptability. Suggestions are provided to policy makers and future research.

  4. Control and communication co-design: analysis and practice on performance improvement in distributed measurement and control system based on fieldbus and Ethernet.

    Science.gov (United States)

    Liang, Geng

    2015-01-01

    In this paper, improving the control performance of a networked control system by reducing DTD from a different perspective was investigated. Two different network architectures for system implementation were presented. Analysis and improvement dealing with DTD for the experimental control system were expounded. Effects of control scheme configuration on DTD in the form of FB were investigated, and corresponding improvements by reallocation of FB and re-arrangement of the schedule table are proposed. Issues of DTD in a hybrid network were investigated, and corresponding approaches to improve performance, including (1) reducing DTD in PLC or PAC by way of IEC61499 and (2) cascade Smith predictive control with BPNN-based identification, were proposed and investigated. Control effects under the proposed methodologies were also given. Experimental and field practices validated these methodologies. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.

  5. Performance analysis of EM-based blind detection for ON-OFF keying modulation over atmospheric optical channels

    Science.gov (United States)

    Dabiri, Mohammad Taghi; Sadough, Seyed Mohammad Sajad

    2018-04-01

    In free-space optical (FSO) links, atmospheric turbulence leads to scintillation in the received signal. Due to its ease of implementation, intensity modulation with direct detection (IM/DD) based on ON-OFF keying (OOK) is a popular signaling scheme in these systems. To detect OOK symbols blindly over a turbulence channel, i.e., without sending pilot symbols, an expectation-maximization (EM)-based detection method was recently proposed in the literature on free-space optical (FSO) communication. However, the performance of EM-based detection methods depends strongly on the length of the observation interval (Ls). To choose the optimum values of Ls at the target bit error rates (BERs) of FSO communications, which are commonly lower than 10^-9, Monte-Carlo simulations would be very cumbersome and require a very long processing time. To facilitate performance evaluation, in this letter we derive analytic expressions for the BER and outage probability. Numerical results validate the accuracy of our derived analytic expressions. Our results may serve to evaluate the optimum value of Ls without resorting to time-consuming Monte-Carlo simulations.
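
    The sketch below illustrates the kind of EM iteration involved: over one observation block of Ls samples, a two-component Gaussian mixture is fitted and symbols are decided from the posterior responsibilities. The plain Gaussian-mixture channel model and the parameter choices are simplifications for illustration, not the exact algorithm or turbulence statistics analyzed in the letter.

      import numpy as np

      def em_ook_detect(r, n_iter=30):
          """Blind OOK detection over one observation block r (length Ls)."""
          mu0, mu1 = r.min(), r.max()     # initial 'off'/'on' level guesses
          var = r.var() / 4.0
          pi1 = 0.5
          for _ in range(n_iter):
              # E-step: posterior probability that each sample is an 'on' symbol.
              p1 = pi1 * np.exp(-(r - mu1) ** 2 / (2 * var))
              p0 = (1 - pi1) * np.exp(-(r - mu0) ** 2 / (2 * var))
              gamma = p1 / (p0 + p1)
              # M-step: update mixture weight, levels and shared noise variance.
              pi1 = gamma.mean()
              mu1 = (gamma * r).sum() / gamma.sum()
              mu0 = ((1 - gamma) * r).sum() / (1 - gamma).sum()
              var = (gamma * (r - mu1) ** 2 + (1 - gamma) * (r - mu0) ** 2).mean()
          return (gamma > 0.5).astype(int), (mu0, mu1, var)

      # Toy block: Ls = 200 OOK symbols in additive Gaussian noise.
      rng = np.random.default_rng(3)
      bits = rng.integers(0, 2, 200)
      received = bits * 1.0 + rng.normal(0.0, 0.25, 200)
      detected, params = em_ook_detect(received)
      print("bit errors:", int(np.sum(detected != bits)))

    The dependence of detection quality on the block length Ls is what makes the analytic BER and outage expressions derived in the letter useful for choosing Ls without simulation.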

  6. Storage element performance optimization for CMS analysis jobs

    International Nuclear Information System (INIS)

    Behrmann, G; Dahlblom, J; Guldmyr, J; Happonen, K; Lindén, T

    2012-01-01

    Tier-2 computing sites in the Worldwide Large Hadron Collider Computing Grid (WLCG) host CPU resources (Compute Element, CE) and storage resources (Storage Element, SE). The vast amount of data that needs to be processed from the Large Hadron Collider (LHC) experiments requires good and efficient use of the available resources. Achieving good CPU efficiency for the end users' analysis jobs requires that the performance of the storage system is able to scale with I/O requests from hundreds or even thousands of simultaneous jobs. In this presentation we report on the work on improving the SE performance at the Helsinki Institute of Physics (HIP) Tier-2 used for the Compact Muon Solenoid (CMS) experiment at the LHC. Statistics from CMS grid jobs are collected and stored in the CMS Dashboard for further analysis, which allows for easy performance monitoring by the sites and by the CMS collaboration. As part of the monitoring framework CMS uses the JobRobot, which sends 100 analysis jobs to each site every four hours. CMS also uses the HammerCloud tool for site monitoring and stress testing, and it has replaced the JobRobot. The performance of the analysis workflow submitted with JobRobot or HammerCloud can be used to track the performance due to site configuration changes, since the analysis workflow is kept the same for all sites and for months at a time. The CPU efficiency of the JobRobot jobs at HIP was increased by approximately 50% to more than 90%, by tuning the SE and through improvements in the CMSSW and dCache software. The performance of the CMS analysis jobs improved significantly too. Similar work has been done at other CMS Tier sites, since on average the CPU efficiency for CMSSW jobs has increased during 2011. Better monitoring of the SE allows faster detection of problems, so that the performance level can be kept high. The next storage upgrade at HIP consists of SAS disk enclosures which can be stress tested on demand with HammerCloud workflows, to make sure that the I/O-performance

  7. Improving Eastern Bluebird nest box performance using computer analysis of satellite images

    Directory of Open Access Journals (Sweden)

    Sarah Svatora

    2012-06-01

    Full Text Available Bird conservationists have been introducing man-made boxes in an effort to increase the bluebird population. In this study we use computer analysis of satellite images to show that the performance of the boxes used by Eastern Bluebirds (Sialia sialis) in Michigan can be improved by about 48%. The analysis is based on a strong correlation found between the edge directionality measured in the satellite image of the area around the box, and the preferences of the birds when selecting their nesting site. The method is based on satellite images taken from Google Earth, and can be used by conservationists to select a box placement strategy that will optimize the efficacy of the boxes deployed in a given area.

  8. Deep learning—Accelerating Next Generation Performance Analysis Systems?

    Directory of Open Access Journals (Sweden)

    Heike Brock

    2018-02-01

    Full Text Available Deep neural network architectures show superior performance in recognition and prediction tasks of the image, speech and natural language domains. The success of such multi-layered networks encourages their implementation in further application scenarios, such as the retrieval of relevant motion information for performance enhancement in sports. However, to date deep learning is only seldom applied to activity recognition problems in the human motion domain. Therefore, its use for sports data analysis might remain abstract to many practitioners. This paper provides a survey on recent works in the field of high-performance motion data and examines relevant technologies for subsequent deployment in real training systems. In particular, it discusses aspects of data acquisition, processing and network modeling. The analysis suggests the advantage of deep neural networks under difficult and noisy data conditions. However, further research is necessary to confirm the benefit of deep learning for next generation performance analysis systems.

  9. Optical ensemble analysis of intraocular lens performance through a simulated clinical trial with ZEMAX.

    Science.gov (United States)

    Zhao, Huawei

    2009-01-01

    A ZEMAX model was constructed to simulate a clinical trial of intraocular lenses (IOLs) based on a clinically oriented Monte Carlo ensemble analysis using postoperative ocular parameters. The purpose of this model is to test the feasibility of streamlining and optimizing both the design process and the clinical testing of IOLs. This optical ensemble analysis (OEA) is also validated. Simulated pseudophakic eyes were generated by using the tolerancing and programming features of ZEMAX optical design software. OEA methodology was verified by demonstrating that the results of clinical performance simulations were consistent with previously published clinical performance data using the same types of IOLs. From these results we conclude that the OEA method can objectively simulate the potential clinical trial performance of IOLs.

  10. UMTS base station-like exposure, well-being, and cognitive performance.

    Science.gov (United States)

    Regel, Sabine J; Negovetic, Sonja; Röösli, Martin; Berdiñas, Veronica; Schuderer, Jürgen; Huss, Anke; Lott, Urs; Kuster, Niels; Achermann, Peter

    2006-08-01

    Radio-frequency electromagnetic fields (RF EMF) of mobile communication systems are widespread in the living environment, yet their effects on humans are uncertain despite a growing body of literature. We investigated the influence of a Universal Mobile Telecommunications System (UMTS) base station-like signal on well-being and cognitive performance in subjects with and without self-reported sensitivity to RF EMF. We performed a controlled exposure experiment (45 min at an electric field strength of 0, 1, or 10 V/m, incident with a polarization of 45 degrees from the left back side of the subject, weekly intervals) in a randomized, double-blind crossover design. A total of 117 healthy subjects (33 self-reported sensitive, 84 nonsensitive subjects) participated in the study. We assessed well-being, perceived field strength, and cognitive performance with questionnaires and cognitive tasks and conducted statistical analyses using linear mixed models. Organ-specific and brain tissue-specific dosimetry including uncertainty and variation analysis was performed. In both groups, well-being and perceived field strength were not associated with actual exposure levels. We observed no consistent condition-induced changes in cognitive performance except for two marginal effects. At 10 V/m we observed a slight effect on speed in one of six tasks in the sensitive subjects and an effect on accuracy in another task in nonsensitive subjects. Both effects disappeared after multiple end point adjustment. In contrast to a recent Dutch study, we could not confirm a short-term effect of UMTS base station-like exposure on well-being. The reported effects on brain functioning were marginal and may have occurred by chance. Peak spatial absorption in brain tissue was considerably smaller than during use of a mobile phone. No conclusions can be drawn regarding short-term effects of cell phone exposure or the effects of long-term base station-like exposure on human health.

  11. FY01 Supplemental Science and Performance Analysis: Volume 1, Scientific Bases and Analyses

    International Nuclear Information System (INIS)

    Bodvarsson, G.S.; Dobson, David

    2001-01-01

    The U.S. Department of Energy (DOE) is considering the possible recommendation of a site at Yucca Mountain, Nevada, for development as a geologic repository for the disposal of high-level radioactive waste and spent nuclear fuel. To facilitate public review and comment, in May 2001 the DOE released the Yucca Mountain Science and Engineering Report (S and ER) (DOE 2001 [DIRS 153849]), which presents technical information supporting the consideration of the possible site recommendation. The report summarizes the results of more than 20 years of scientific and engineering studies. A decision to recommend the site has not been made: the DOE has provided the S and ER and its supporting documents as an aid to the public in formulating comments on the possible recommendation. When the S and ER (DOE 2001 [DIRS 153849]) was released, the DOE acknowledged that technical and scientific analyses of the site were ongoing. Therefore, the DOE noted in the Federal Register Notice accompanying the report (66 FR 23013 [DIRS 155009], p. 2) that additional technical information would be released before the dates, locations, and times for public hearings on the possible recommendation were announced. This information includes: (1) the results of additional technical studies of a potential repository at Yucca Mountain, contained in this FY01 Supplemental Science and Performance Analyses: Vol. 1, Scientific Bases and Analyses; and FY01 Supplemental Science and Performance Analyses: Vol. 2, Performance Analyses (McNeish 2001 [DIRS 155023]) (collectively referred to as the SSPA) and (2) a preliminary evaluation of the Yucca Mountain site's preclosure and postclosure performance against the DOE's proposed site suitability guidelines (10 CFR Part 963 [64 FR 67054 [DIRS 124754

  12. FY01 Supplemental Science and Performance Analysis: Volume 1, Scientific Bases and Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Bodvarsson, G.S.; Dobson, David

    2001-05-30

    The U.S. Department of Energy (DOE) is considering the possible recommendation of a site at Yucca Mountain, Nevada, for development as a geologic repository for the disposal of high-level radioactive waste and spent nuclear fuel. To facilitate public review and comment, in May 2001 the DOE released the Yucca Mountain Science and Engineering Report (S&ER) (DOE 2001 [DIRS 153849]), which presents technical information supporting the consideration of the possible site recommendation. The report summarizes the results of more than 20 years of scientific and engineering studies. A decision to recommend the site has not been made: the DOE has provided the S&ER and its supporting documents as an aid to the public in formulating comments on the possible recommendation. When the S&ER (DOE 2001 [DIRS 153849]) was released, the DOE acknowledged that technical and scientific analyses of the site were ongoing. Therefore, the DOE noted in the Federal Register Notice accompanying the report (66 FR 23013 [DIRS 155009], p. 2) that additional technical information would be released before the dates, locations, and times for public hearings on the possible recommendation were announced. This information includes: (1) the results of additional technical studies of a potential repository at Yucca Mountain, contained in this FY01 Supplemental Science and Performance Analyses: Vol. 1, Scientific Bases and Analyses; and FY01 Supplemental Science and Performance Analyses: Vol. 2, Performance Analyses (McNeish 2001 [DIRS 155023]) (collectively referred to as the SSPA) and (2) a preliminary evaluation of the Yucca Mountain site's preclosure and postclosure performance against the DOE's proposed site suitability guidelines (10 CFR Part 963 [64 FR 67054 [DIRS 124754

  13. TRAC analysis of passive containment cooling system performance

    International Nuclear Information System (INIS)

    Arai, Kenji; Kataoka, Kazuyoshi; Nagasaka, Hideo

    1993-01-01

    A passive containment cooling system (PCCS) is a promising concept to improve the reliability of decay heat removal during an accident. Toshiba has carried out analytical studies for PCCS development in addition to experimental studies, using a best estimate thermal hydraulic computer code TRAC. In order to establish an analytical model for the PCCS performance analysis, it is necessary for the analytical model to be qualified against experimental results and thoroughly address the phenomena important for PCCS performance analysis. In this paper, the TRAC qualification for PCCS application is reported. A TRAC model has been verified against a drain line break simulation test conducted at the PCCS integral test facility, GIRAFFE. The result shows that the TRAC model can accurately predict the major system response and the PCCS performance in the drain line break test. In addition, the results of several sensitivity analyses, showing various points concerning the modeling in the PCCS performance analysis, have been reported. The analyses have been carried out for the SBWR and the analytical points are closely related to important phenomena which can affect PCCS performance

  14. Analysis of the low-frequency magnetoelectric performance in three-phase laminate composites with Fe-based nanocrystalline ribbon

    International Nuclear Information System (INIS)

    Chen, Lei; Li, Ping; Wen, Yumei; Zhu, Yong

    2013-01-01

    The theoretical analysis of magnetoelectric (ME) performance in a three-phase Terfenol-D/PZT/FeCuNbSiB (MPF) laminate composite is presented in this paper. The ME couplings at low frequency for ideal and less-than-ideal interface couplings are studied, and our analysis predicts that (i) the ME voltage coefficient for ideal interface coupling increases with the number of layers (n) of the Fe-based nanocrystalline ribbon FeCuNbSiB (Fe73.5Cu1Nb3Si13.5B9) while the sizes of the PZT (Pb(Zr1−xTix)O3) and Terfenol-D (Tb1−xDyxFe2−y) layers are kept constant, and then tends to a constant when the number of FeCuNbSiB layers exceeds 100; (ii) by introducing the interface coupling factor k and considering the degradation of d33m,f with n, the ME voltage coefficient for a less-than-ideal interface condition is predicted: as the number of FeCuNbSiB layers increases, it first increases, reaches a maximum value, and then slowly decreases. Various MPF laminates are fabricated and tested. It is found that the theoretical predictions accounting for the actual boundary conditions at the interface are in agreement with the experimental observations. This study plays a guiding role for the design of MPF composites in real applications. (paper)

  15. Exploring the relationships among performance-based functional ability, self-rated disability, perceived instrumental support, and depression: a structural equation model analysis.

    Science.gov (United States)

    Weil, Joyce; Hutchinson, Susan R; Traxler, Karen

    2014-11-01

    Data from the Women's Health and Aging Study were used to test a model of factors explaining depressive symptomology. The primary purpose of the study was to explore the association between performance-based measures of functional ability and depression and to examine the role of self-rated physical difficulties and perceived instrumental support in mediating the relationship between performance-based functioning and depression. The inclusion of performance-based measures allows for the testing of functional ability as a clinical precursor to disability and depression: a critical, but rarely examined, association in the disablement process. Structural equation modeling supported the overall fit of the model and found an indirect relationship between performance-based functioning and depression, with perceived physical difficulties serving as a significant mediator. Our results highlight the complementary nature of performance-based and self-rated measures and the importance of including perception of self-rated physical difficulties when examining depression in older persons. © The Author(s) 2014.

  16. FIRE SAFETY IN NUCLEAR POWER PLANTS: A RISK-INFORMED AND PERFORMANCE-BASED APPROACH

    Energy Technology Data Exchange (ETDEWEB)

    AZARM,M.A.; TRAVIS,R.J.

    1999-11-14

    The consideration of risk in regulatory decision-making has long been a part of NRC's policy and practice. Initially, these considerations were qualitative and were based on risk insights. The early regulations relied on good practices, past insights, and accepted standards. As a result, most NRC regulations were prescriptive and were applied uniformly to all areas within the regulatory scope. Risk technology is changing regulations by prioritizing the areas within regulatory scope based on risk, thereby focusing on the risk-important areas. Performance technology, on the other hand, is changing the regulations by allowing requirements to be adjusted based on the specific performance expected and manifested, rather than an a priori prescriptive requirement. Consistent with the objectives of risk-informed and performance-based regulatory requirements, BNL evaluated the feasibility of applying risk and performance technologies to modifying NRC's current regulations on fire protection for nuclear power plants. This feasibility study entailed several case studies (trial applications). This paper describes the results of two of them. Besides the case studies, the paper discusses an overall evaluation of methodologies for fire-risk analysis to support risk-informed regulation. It identifies some current shortcomings and proposes some near-term solutions.

  17. Policy and Validity Prospects for Performance-Based Assessment.

    Science.gov (United States)

    Baker, Eva L.; And Others

    1994-01-01

    This article describes performance-based assessment as expounded by its proponents, comments on these conceptions, reviews evidence regarding the technical quality of performance-based assessment, and considers its validity under various policy options. (JDD)

  18. NeoAnalysis: a Python-based toolbox for quick electrophysiological data processing and analysis.

    Science.gov (United States)

    Zhang, Bo; Dai, Ji; Zhang, Tao

    2017-11-13

    In a typical electrophysiological experiment, especially one that includes studying animal behavior, the data collected normally contain spikes, local field potentials, behavioral responses and other associated data. In order to obtain informative results, the data must be analyzed simultaneously with the experimental settings. However, most open-source toolboxes currently available for data analysis were developed to handle only a portion of the data and did not take into account the sorting of experimental conditions. Additionally, these toolboxes require that the input data be in a specific format, which can be inconvenient to users. Therefore, the development of a highly integrated toolbox that can process multiple types of data regardless of input data format and perform basic analysis for general electrophysiological experiments is incredibly useful. Here, we report the development of a Python-based open-source toolbox, referred to as NeoAnalysis, to be used for quick electrophysiological data processing and analysis. The toolbox can import data from different data acquisition systems regardless of their formats and automatically combine different types of data into a single file with a standardized format. In cases where additional spike sorting is needed, NeoAnalysis provides a module to perform efficient offline sorting with a user-friendly interface. Then, NeoAnalysis can perform regular analog signal processing, spike train and local field potential analysis, and behavioral response (e.g., saccade) detection and extraction, with several options available for data plotting and statistics. In particular, it can automatically generate sorted results without requiring users to manually sort data beforehand. In addition, NeoAnalysis can organize all of the relevant data into an informative table on a trial-by-trial basis for data visualization. Finally, NeoAnalysis supports analysis at the population level. With the multitude of general-purpose functions provided
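
    To make the trial-based spike analysis described above concrete, the sketch below computes a simple peri-stimulus time histogram (PSTH) with NumPy. It is only an illustration of the kind of spike-train summary such a toolbox automates; the spike times, trial onsets and bin width are hypothetical, and the code does not use NeoAnalysis's actual API.

```python
import numpy as np

def psth(spike_times, trial_onsets, window=(-0.5, 1.0), bin_width=0.05):
    """Peri-stimulus time histogram: mean firing rate (Hz) around each trial onset."""
    edges = np.arange(window[0], window[1] + bin_width, bin_width)
    counts = np.zeros(len(edges) - 1)
    for onset in trial_onsets:
        aligned = spike_times - onset                   # align spikes to this trial's onset
        counts += np.histogram(aligned, bins=edges)[0]  # accumulate counts per bin
    rate = counts / (len(trial_onsets) * bin_width)     # convert counts to firing rate
    return edges[:-1], rate

# Hypothetical data: spike times and trial onsets in seconds
spikes = np.sort(np.random.uniform(0, 100, 2000))
onsets = np.arange(5, 95, 5.0)
centers, rate = psth(spikes, onsets)
print(rate.round(1))
```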

  19. Performance Recognition for Sulphur Flotation Process Based on Froth Texture Unit Distribution

    Directory of Open Access Journals (Sweden)

    Mingfang He

    2013-01-01

    Full Text Available As an important indicator of flotation performance, froth texture is believed to be related to the operational condition in the sulphur flotation process. A novel fault detection method based on froth texture unit distribution (TUD) is proposed to recognize the fault condition of sulphur flotation in real time. The froth texture unit number is calculated based on the texture spectrum, and the probability density function (PDF) of the froth texture unit number is defined as the texture unit distribution, which can describe the actual textural feature more accurately than the grey level dependence matrix approach. As the type of the froth TUD is unknown, a nonparametric kernel estimation method based on a fixed kernel basis is proposed, which overcomes the difficulty that different TUDs under various conditions cannot be compared using the traditional varying kernel basis. By transforming the nonparametric description into dynamic kernel weight vectors, a principal component analysis (PCA) model is established to reduce the dimensionality of the vectors. Then a threshold criterion determined by the TQ statistic based on the PCA model is proposed to realize the performance recognition. The industrial application results show that accurate performance recognition of froth flotation can be achieved by using the proposed method.
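
    The monitoring idea described above can be illustrated with a minimal PCA-based detection sketch: a PCA model is fitted to feature vectors from normal operation, and a squared-prediction-error (Q) control limit flags abnormal samples. The synthetic vectors, the number of components and the 99% empirical limit are assumptions for illustration, not the paper's actual kernel-weight features or threshold.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
normal = rng.normal(size=(200, 16))   # stand-in for kernel-weight vectors under normal operation
faulty = normal[-20:] + 3.0           # shifted vectors imitating a fault condition

pca = PCA(n_components=4).fit(normal)

def q_statistic(X):
    """Squared prediction error (Q): residual after projecting onto the PCA subspace."""
    recon = pca.inverse_transform(pca.transform(X))
    return np.sum((X - recon) ** 2, axis=1)

limit = np.percentile(q_statistic(normal), 99)   # simple empirical 99% control limit
print("alarms:", np.sum(q_statistic(faulty) > limit), "of", len(faulty))
```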

  20. Hydrophilic Pt nanoflowers: synthesis, crystallographic analysis and catalytic performance.

    Science.gov (United States)

    Mourdikoudis, Stefanos; Altantzis, Thomas; Liz-Marzán, Luis M; Bals, Sara; Pastoriza-Santos, Isabel; Pérez-Juste, Jorge

    2016-05-21

    Water-soluble Pt nanoflowers (NFs) were prepared by diethylene glycol-mediated reduction of Pt acetylacetonate (Pt(acac)2) in the presence of polyethylenimine. Advanced electron microscopy analysis showed that the NFs consist of multiple branches with a truncated cubic morphology and different crystallographic orientations. We demonstrate that the nature of the solvent strongly influences the resulting morphology. The catalytic performance of the Pt NFs in 4-nitrophenol reduction was found to be superior to that of other nanoparticle-based catalysts. Additionally, the Pt NFs display good catalytic reusability with no loss of activity after five consecutive cycles.

  1. Availability of thermodynamic system with multiple performance parameters based on vector-universal generating function

    International Nuclear Information System (INIS)

    Cai Qi; Shang Yanlong; Chen Lisheng; Zhao Yuguang

    2013-01-01

    A vector-universal generating function was presented to analyze the availability of a thermodynamic system with multiple performance parameters. The vector-universal generating function of a component's performance was defined, an arithmetic model based on it was derived for the thermodynamic system, and a calculation method was given for the state probabilities of multi-state components. With stochastic simulation of the degradation trend of the multiple factors, the system availability with multiple performance parameters was obtained under composite factors. It is shown by an example that the availability results obtained by the binary availability analysis method are somewhat conservative, while the results considering parameter failure based on the vector-universal generating function better reflect the operating characteristics of the thermodynamic system. (authors)
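
    As a rough illustration of the generating-function idea, the sketch below uses a scalar universal generating function: each component is represented as a probability distribution over performance levels, and components are composed with a structure operator. The two-pump example, its state probabilities and the 100% demand level are hypothetical, and the scalar form is a simplification of the vector-valued function used in the paper.

```python
from collections import defaultdict

def compose(u1, u2, op):
    """Combine two UGFs {performance: probability} with a structure operator."""
    out = defaultdict(float)
    for g1, p1 in u1.items():
        for g2, p2 in u2.items():
            out[op(g1, g2)] += p1 * p2
    return dict(out)

# Hypothetical two-pump feedwater train: performance = flow capacity (% of demand)
pump_a = {0: 0.05, 50: 0.15, 100: 0.80}
pump_b = {0: 0.10, 100: 0.90}

parallel = compose(pump_a, pump_b, lambda a, b: a + b)          # capacities add in parallel
availability = sum(p for g, p in parallel.items() if g >= 100)  # demand: 100% flow
print(round(availability, 4))
```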

  2. Twenty-fifth water reactor safety information meeting: Proceedings. Volume 2: Human reliability analysis and human performance evaluation; Technical issues related to rulemakings; Risk-informed, performance-based initiatives; High burn-up fuel research

    Energy Technology Data Exchange (ETDEWEB)

    Monteleone, S. [comp.] [Brookhaven National Lab., Upton, NY (United States)]

    1998-03-01

    This three-volume report contains papers presented at the conference. The papers are printed in the order of their presentation in each session and describe progress and results of programs in nuclear safety research conducted in this country and abroad. Foreign participation in the meeting included papers presented by researchers from France, Japan, Norway, and Russia. The titles of the papers and the names of the authors have been updated and may differ from those that appeared in the final program of the meeting. This volume contains the following: (1) human reliability analysis and human performance evaluation; (2) technical issues related to rulemakings; (3) risk-informed, performance-based initiatives; and (4) high burn-up fuel research. Selected papers have been indexed separately for inclusion in the Energy Science and Technology Database.

  3. Photovoltaic thermal module concepts and their performance analysis: A review

    International Nuclear Information System (INIS)

    Hasan, M. Arif; Sumathy, K.

    2010-01-01

    This paper presents a review of the available literature covering the latest module aspects of different photovoltaic/thermal (PV/T) collectors and their performance in terms of electrical as well as thermal output. The review covers a detailed description of flat-plate and concentrating PV/T systems, using liquid or air as the working fluid, numerical model analysis, experimental work and qualitative evaluation of thermal and electrical output. An in-depth review of performance parameters such as optimum mass flow rate, PV/T dimensions and air channel geometry is also presented. Based on this thorough review, it is clear that PV/T modules are very promising devices and there is considerable scope to further improve their performance. Appropriate recommendations are made which will help PV/T systems improve their efficiency and reduce their cost, making them more competitive in the present market. (author)

  4. Photovoltaic thermal module concepts and their performance analysis: A review

    Energy Technology Data Exchange (ETDEWEB)

    Hasan, M. Arif; Sumathy, K. [Department of Mechanical Engineering, North Dakota State University, Fargo, ND (United States)]

    2010-09-15

    This paper presents a review of the available literature covering the latest module aspects of different photovoltaic/thermal (PV/T) collectors and their performance in terms of electrical as well as thermal output. The review covers a detailed description of flat-plate and concentrating PV/T systems, using liquid or air as the working fluid, numerical model analysis, experimental work and qualitative evaluation of thermal and electrical output. An in-depth review of performance parameters such as optimum mass flow rate, PV/T dimensions and air channel geometry is also presented. Based on this thorough review, it is clear that PV/T modules are very promising devices and there is considerable scope to further improve their performance. Appropriate recommendations are made which will help PV/T systems improve their efficiency and reduce their cost, making them more competitive in the present market. (author)

  5. Dynamic Channel Slot Allocation Scheme and Performance Analysis of Cyclic Quorum Multichannel MAC Protocol

    Directory of Open Access Journals (Sweden)

    Xing Hu

    2017-01-01

    Full Text Available In high-diversity node situations, a multichannel MAC protocol can improve frequency efficiency, owing to fewer collisions compared with a single-channel MAC protocol. Among such protocols, the performance of the cyclic quorum-based multichannel (CQM) MAC protocol is outstanding. Based on a cyclic quorum system and channel slot allocation, it avoids the bottleneck that other protocols suffer from and can be easily realized with only one transceiver. To obtain the accurate performance of the CQM MAC protocol, a Markov chain model is proposed that combines the channel-hopping strategy of the CQM protocol with the IEEE 802.11 distributed coordination function (DCF). Numerical analysis shows that the optimal performance of the CQM protocol is obtained in the saturation-bound situation, and the saturation bound of the CQM system is then obtained by the bird swarm algorithm. In addition, to improve the performance of the CQM protocol in unsaturated situations, a dynamic channel slot allocation CQM (DCQM) protocol is proposed, based on a wavelet neural network. Finally, the performance of the CQM and DCQM protocols is simulated on the Qualnet platform. The simulation results show that the analytic and simulation results match very well, and that DCQM performs better in unsaturated situations.
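
    The analytical step of solving a Markov chain for its steady-state probabilities can be sketched generically as below; the three-state transition matrix is a toy example, not the paper's actual chain combining channel hopping with the 802.11 DCF back-off states.

```python
import numpy as np

def stationary(P):
    """Stationary distribution of a discrete-time Markov chain: solve pi P = pi with sum(pi) = 1."""
    n = P.shape[0]
    A = np.vstack([P.T - np.eye(n), np.ones(n)])   # (P^T - I) pi = 0 plus the normalization row
    b = np.zeros(n + 1)
    b[-1] = 1.0
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pi

# Toy 3-state chain (e.g., idle / back-off / transmit), purely illustrative
P = np.array([[0.6, 0.3, 0.1],
              [0.2, 0.5, 0.3],
              [0.5, 0.1, 0.4]])
print(stationary(P).round(4))
```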

  6. Statistical evaluation of diagnostic performance: topics in ROC analysis

    CERN Document Server

    Zou, Kelly H; Bandos, Andriy I; Ohno-Machado, Lucila; Rockette, Howard E

    2016-01-01

    Statistical evaluation of diagnostic performance in general and Receiver Operating Characteristic (ROC) analysis in particular are important for assessing the performance of medical tests and statistical classifiers, as well as for evaluating predictive models or algorithms. This book presents innovative approaches in ROC analysis, which are relevant to a wide variety of applications, including medical imaging, cancer research, epidemiology, and bioinformatics. Statistical Evaluation of Diagnostic Performance: Topics in ROC Analysis covers areas including monotone-transformation techniques in parametric ROC analysis, ROC methods for combined and pooled biomarkers, Bayesian hierarchical transformation models, sequential designs and inferences in the ROC setting, predictive modeling, multireader ROC analysis, and free-response ROC (FROC) methodology. The book is suitable for graduate-level students and researchers in statistics, biostatistics, epidemiology, public health, biomedical engineering, radiology, medi...
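
    A minimal empirical ROC computation of the kind the book builds on can be sketched with scikit-learn; the diagnostic scores below are simulated from two Gaussian populations and are purely illustrative.

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

# Hypothetical diagnostic scores: higher score = more suspicious for disease
rng = np.random.default_rng(1)
labels = np.r_[np.zeros(100), np.ones(100)]                  # 0 = healthy, 1 = diseased
scores = np.r_[rng.normal(0, 1, 100), rng.normal(1.2, 1, 100)]

fpr, tpr, thresholds = roc_curve(labels, scores)             # empirical ROC curve points
print("AUC =", round(roc_auc_score(labels, scores), 3))      # area under the curve
```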

  7. Comparative exergetic performance analysis for certain thermal power plants in Serbia

    Directory of Open Access Journals (Sweden)

    Mitrović Dejan M.

    2016-01-01

    Full Text Available Traditional methods of analysis and calculation of complex thermal systems are based on the first law of thermodynamics. These methods use energy balance for a system. In general, energy balances do not provide any information about internal losses. In contrast, the second law of thermodynamics introduces the concept of exergy, which is useful in the analysis of thermal systems. Exergy is a measure for assessing the quality of energy, and allows one to determine the location, cause, and real size of losses incurred as well as residues in a thermal process. The purpose of this study is to comparatively analyze the performance of four thermal power plants from the energetic and exergetic viewpoint. Thermodynamic models of the plants are developed based on the first and second law of thermodynamics. The primary objectives of this paper are to analyze the system components separately and to identify and quantify the sites having largest energy and exergy losses. Finally, by means of these analyses, the main sources of thermodynamic inefficiencies as well as a reasonable comparison of each plant to others are identified and discussed. As a result, the outcomes of this study can provide a basis for the improvement of plant performance for the considered thermal power plants.
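
    The exergy concept invoked above reduces, for a flowing stream, to the familiar expression e = (h - h0) - T0(s - s0) relative to the dead state. The snippet below evaluates it for round-number steam properties chosen only for illustration; they are not values from the analyzed plants.

```python
# Specific flow exergy of a steam stream relative to the dead state:
#   e = (h - h0) - T0 * (s - s0)
# Property values below are illustrative round numbers, not taken from the paper.
T0 = 298.15            # dead-state temperature [K]
h, s = 3475.0, 6.72    # live steam enthalpy [kJ/kg] and entropy [kJ/(kg K)] (approx. 10 MPa / 540 C)
h0, s0 = 104.9, 0.367  # dead-state properties of liquid water at 25 C

e = (h - h0) - T0 * (s - s0)
print(f"specific exergy ~ {e:.0f} kJ/kg")
```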

  8. Performance analysis of manufacturing systems: queueing approximations and algorithms

    NARCIS (Netherlands)

    Vuuren, van M.

    2007-01-01

    Performance Analysis of Manufacturing Systems Queueing Approximations and Algorithms This thesis is concerned with the performance analysis of manufacturing systems. Manufacturing is the application of tools and a processing medium to the transformation of raw materials into finished goods for sale.

  9. Carbon nanotube based VLSI interconnects: analysis and design

    CERN Document Server

    Kaushik, Brajesh Kumar

    2015-01-01

    The brief primarily focuses on the performance analysis of CNT-based interconnects in the current research scenario. Different CNT structures are modeled on the basis of transmission line theory. Performance comparison for different CNT structures illustrates that CNTs are more promising than Cu or other materials used in global VLSI interconnects. The brief is organized into five chapters which mainly discuss: (1) an overview of the current research scenario and basics of interconnects; (2) unique crystal structures and the basics of physical properties of CNTs, and the production, purification and applications of CNTs; (3) a brief technical review, the geometry and equivalent RLC parameters for different single and bundled CNT structures; (4) a comparative analysis of crosstalk and delay for different single and bundled CNT structures; and (5) various unique mixed CNT bundle structures and their equivalent electrical models.

  10. Receiver operating characteristic analysis of age-related changes in lineup performance.

    Science.gov (United States)

    Humphries, Joyce E; Flowe, Heather D

    2015-04-01

    In the basic face memory literature, support has been found for the late maturation hypothesis, which holds that face recognition ability is not fully developed until at least adolescence. Support for the late maturation hypothesis in the criminal lineup identification literature, however, has been equivocal because of the analytic approach that has been used to examine age-related changes in identification performance. Recently, receiver operating characteristic (ROC) analysis was applied for the first time in the adult eyewitness memory literature to examine whether memory sensitivity differs across different types of lineup tests. ROC analysis allows for the separation of memory sensitivity from response bias in the analysis of recognition data. Here, we have made the first ROC-based comparison of adults' and children's (5- and 6-year-olds and 9- and 10-year-olds) memory performance on lineups by reanalyzing data from Humphries, Holliday, and Flowe (2012). In line with the late maturation hypothesis, memory sensitivity was significantly greater for adults compared with young children. Memory sensitivity for older children was similar to that for adults. The results indicate that the late maturation hypothesis can be generalized to account for age-related performance differences on an eyewitness memory task. The implications for developmental eyewitness memory research are discussed. Copyright © 2014 Elsevier Inc. All rights reserved.

  11. Performance of a gaseous detector based energy dispersive X-ray fluorescence imaging system: Analysis of human teeth treated with dental amalgam

    Science.gov (United States)

    Silva, A. L. M.; Figueroa, R.; Jaramillo, A.; Carvalho, M. L.; Veloso, J. F. C. A.

    2013-08-01

    Energy dispersive X-ray fluorescence (EDXRF) imaging systems are of great interest in many applications across different areas, since they allow us to obtain images of the spatial elemental distribution in samples. The detector system used in this study is based on a micro-patterned gas detector, named the Micro-Hole and Strip Plate. The full field of view system, with an active area of 28 × 28 mm², presents some important features for EDXRF imaging applications, such as a position resolution below 125 μm, an intrinsic energy resolution of about 14% full width at half maximum for 5.9 keV X-rays, and a counting rate capability of 0.5 MHz. In this work, analysis of human teeth treated with dental amalgam was performed using the EDXRF imaging system mentioned above. The goal of the analysis is to evaluate the system's capabilities in the biomedical field by measuring the drift of the major constituents of a dental amalgam, Zn and Hg, throughout the tooth structures. The elemental distribution patterns obtained during the analysis suggest diffusion of these elements from the amalgam to the tooth tissues.

  12. Cardboard Based Packaging Materials as Renewable Thermal Insulation of Buildings: Thermal and Life Cycle Performance

    OpenAIRE

    Čekon, Miroslav; Struhala, Karel; Slávik, Richard

    2017-01-01

    Cardboard based packaging components represent a material with a significant potential of renewable exploitation in buildings. This study presents the results of thermal and environmental analysis of existing packaging materials compared with standard conventional thermal insulations. Experimental measurements were performed to identify the thermal performance of studied cardboard packaging materials. Real-size samples were experimentally tested in laboratory measurements. The thermal resi...

  13. The Performance-based Funding Scheme of Universities

    Directory of Open Access Journals (Sweden)

    Juha KETTUNEN

    2016-05-01

    Full Text Available The purpose of this study is to analyse the effectiveness of the performance-based funding scheme of the Finnish universities that was adopted at the beginning of 2013. The political decision-makers expect that the funding scheme will create incentives for the universities to improve performance, but these funding schemes have largely failed in many other countries, primarily because public funding is only a small share of the total funding of universities. This study is interesting because Finnish universities have no tuition fees, unlike in many other countries, and the state allocates funding based on the objectives achieved. The empirical evidence of the graduation rates indicates that graduation rates increased when a new scheme was adopted, especially among male students, who have more room for improvement than female students. The new performance-based funding scheme allocates the funding according to the output-based indicators and limits the scope of strategic planning and the autonomy of the university. The performance-based funding scheme is transformed to the strategy map of the balanced scorecard. The new funding scheme steers universities in many respects but leaves the research and teaching skills to the discretion of the universities. The new scheme has also diminished the importance of the performance agreements between the university and the Ministry. The scheme increases the incentives for universities to improve the processes and structures in order to attain as much public funding as possible. It is optimal for the central administration of the university to allocate resources to faculties and other organisational units following the criteria of the performance-based funding scheme. The new funding scheme has made the universities compete with each other, because the total funding to the universities is allocated to each university according to the funding scheme. There is a tendency that the funding schemes are occasionally

  14. Performance Analysis of Blind Subspace-Based Signature Estimation Algorithms for DS-CDMA Systems with Unknown Correlated Noise

    Science.gov (United States)

    Zarifi, Keyvan; Gershman, Alex B.

    2006-12-01

    We analyze the performance of two popular blind subspace-based signature waveform estimation techniques proposed by Wang and Poor and Buzzi and Poor for direct-sequence code division multiple-access (DS-CDMA) systems with unknown correlated noise. Using the first-order perturbation theory, analytical expressions for the mean-square error (MSE) of these algorithms are derived. We also obtain simple high SNR approximations of the MSE expressions which explicitly clarify how the performance of these techniques depends on the environmental parameters and how it is related to that of the conventional techniques that are based on the standard white noise assumption. Numerical examples further verify the consistency of the obtained analytical results with simulation results.

  15. Analysis of survival data with dependent censoring: copula-based approaches

    CERN Document Server

    Emura, Takeshi

    2018-01-01

    This book introduces readers to copula-based statistical methods for analyzing survival data involving dependent censoring. Primarily focusing on likelihood-based methods performed under copula models, it is the first book solely devoted to the problem of dependent censoring. The book demonstrates the advantages of the copula-based methods in the context of medical research, especially with regard to cancer patients’ survival data. Needless to say, the statistical methods presented here can also be applied to many other branches of science, especially in reliability, where survival analysis plays an important role. The book can be used as a textbook for graduate coursework or a short course aimed at (bio-) statisticians. To deepen readers’ understanding of copula-based approaches, the book provides an accessible introduction to basic survival analysis and explains the mathematical foundations of copula-based survival models.
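
    To illustrate the dependent-censoring setting the book addresses, the sketch below simulates survival and censoring times whose dependence is induced by a Clayton copula (sampled by the conditional-inverse method) with exponential margins. The parameter values and margins are arbitrary choices for illustration, not an example taken from the book.

```python
import numpy as np

rng = np.random.default_rng(2)
theta = 2.0       # Clayton dependence parameter (Kendall's tau = theta / (theta + 2) = 0.5)
n = 10_000

# Sample (U, V) from a Clayton copula via the conditional-inverse method
u = rng.uniform(size=n)
w = rng.uniform(size=n)
v = (u ** (-theta) * (w ** (-theta / (theta + 1)) - 1) + 1) ** (-1 / theta)

# Hypothetical exponential margins: survival time T and dependent censoring time C
T = -np.log(1 - u) / 0.10
C = -np.log(1 - v) / 0.05
time = np.minimum(T, C)           # observed follow-up time
event = (T <= C).astype(int)      # 1 = event observed, 0 = censored
print("observed event rate:", event.mean().round(3))
```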

  16. Analysis of Air Force Wartime Contracted Construction Project Performance

    Science.gov (United States)

    2015-03-26

    KPIs) identified in current literature. Chan's meta-analysis of KPIs found that time and cost are the primary objective indicators of a successful ... Smith, Currie & Hancock, 2009). Key performance indicators (KPI) were also used as input factors to analyze differences between contracts ... Chan, et al. (2002) performed a meta-analysis of KPIs, as determined by construction researchers. They found that the most predictive performance

  17. An Analysis of the Effect of Operations Management Practices on Performance

    OpenAIRE

    Battistoni, Elisa; Bonacelli, Andra; Fronzetti Colladon, Andrea; Schiraldi, Massimiliano M.

    2013-01-01

    In this paper we investigate the possible relationships among some optimization techniques used in Operations Management and the performance of SMEs that operate in the manufacturing sector. A model based on the Structural Equation Modelling (SEM) approach is used to analyse a dataset of small and medium-sized Italian enterprises. The model is expressed by a system of simultaneous equations and is solved through regression analysis. Taking advantage of the contributions presented previously, ...

  18. Talent in Female Gymnastics: a Survival Analysis Based upon Performance Characteristics.

    Science.gov (United States)

    Pion, J; Lenoir, M; Vandorpe, B; Segers, V

    2015-11-01

    This study investigated the link between the anthropometric, physical and motor characteristics assessed during talent identification and dropout in young female gymnasts. Three cohorts of female gymnasts (n=243; 6-9 years) completed a test battery for talent identification. Performance levels were monitored over 5 years of competition. Kaplan-Meier and Cox Proportional Hazards analyses were conducted to determine the survival rate and the characteristics that influence dropout, respectively. Kaplan-Meier analysis indicated that only 18% of the female gymnasts who passed the baseline talent identification test survived at the highest competition level 5 years later. The Cox Proportional Hazards model indicated that gymnasts with a score in the best quartile for a specific characteristic had significantly increased chances of survival, by 45-129%. These characteristics were: basic motor skills (129%), shoulder strength (96%), leg strength (53%) and 3 gross motor coordination items (45-73%). These results suggest that test batteries commonly used for talent identification in young female gymnasts may also provide valuable insights into future dropout. Therefore, multidimensional test batteries deserve a prominent place in the selection process. The individual test results should encourage trainers to invest in the early development of basic physical and motor characteristics to prevent attrition. © Georg Thieme Verlag KG Stuttgart · New York.
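
    The survival-analysis workflow described above (Kaplan-Meier survival curves plus a Cox proportional hazards model) can be sketched with the lifelines package; the simulated dropout data and the single covariate below are hypothetical stand-ins for the gymnasts' test scores.

```python
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

# Hypothetical gymnast data: years until dropout, dropout indicator, one baseline test score
rng = np.random.default_rng(3)
n = 243
score = rng.normal(size=n)                                  # e.g., standardized motor-coordination score
years = np.minimum(rng.exponential(scale=np.exp(0.5 * score) * 3), 5.0)
dropped = (years < 5.0).astype(int)                         # followed for at most 5 seasons
df = pd.DataFrame({"years": years, "dropped": dropped, "score": score})

kmf = KaplanMeierFitter().fit(df["years"], event_observed=df["dropped"])
print(kmf.survival_function_.iloc[-1])                      # share still competing at year 5

cph = CoxPHFitter().fit(df, duration_col="years", event_col="dropped")
print(cph.hazard_ratios_)                                   # hazard ratio per unit of test score
```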

  19. Analysis of Wind Data for Sports Performance Design: A Case Study for Sailing Sports

    Directory of Open Access Journals (Sweden)

    Alessandro Pezzoli

    2014-11-01

    Full Text Available Environmental conditions affect outdoor sports performance. This is particularly true in some sports, especially in the sport of sailing, where environmental parameters are extremely influential as they interact directly with strategic analysis of the race area and then with strategic analysis of the performance itself. For these reasons, this research presents an innovative methodology for the strategic analysis of the race course that is based on the integrated assessment of meteorological data measured on the ground, meteorological data measured at sea during the training activities and the results of the CALMET model in hindcasting over a limited scale. The results obtained by the above analysis are then integrated into a graphical representation that provides coaches and athletes with the main strategic directions of the race course in a simple and easy-to-use way. The authors believe that the innovative methodology that has been adopted in the present research may represent a new approach to the integrated analysis of meteorological data on coastal environments. On the other hand, the results of this analysis, if presented with an appropriate technique of meta‑communication adapted to the sport sectors, can be used effectively for the improvement of athletes' performances.

  20. PERFORMANCE CHARACTERISTIC MEMS-BASED IMUs FOR UAVs NAVIGATION

    Directory of Open Access Journals (Sweden)

    H. A. Mohamed

    2015-08-01

    Full Text Available Accurate 3D reconstruction has become essential for non-traditional mapping applications such as urban planning, mining industry, environmental monitoring, navigation, surveillance, pipeline inspection, infrastructure monitoring, landslide hazard analysis, indoor localization, and military simulation. The needs of these applications cannot be satisfied by traditional mapping, which is based on dedicated data acquisition systems designed for mapping purposes. Recent advances in hardware and software development have made it possible to conduct accurate 3D mapping without using costly and high-end data acquisition systems. Low-cost digital cameras, laser scanners, and navigation systems can provide accurate mapping if they are properly integrated at the hardware and software levels. Unmanned Aerial Vehicles (UAVs) are emerging as a mobile mapping platform that can provide additional economical and practical advantages. However, such economical and practical requirements need navigation systems that can provide uninterrupted navigation solution. Hence, testing the performance characteristics of Micro-Electro-Mechanical Systems (MEMS) or low cost navigation sensors for various UAV applications is important research. This work focuses on studying the performance characteristics under different manoeuvres using inertial measurements integrated with single point positioning, Real-Time-Kinematic (RTK), and additional navigational aiding sensors. Furthermore, the performance of the inertial sensors is tested during Global Positioning System (GPS) signal outage.

  1. Performance Characteristic Mems-Based IMUs for UAVs Navigation

    Science.gov (United States)

    Mohamed, H. A.; Hansen, J. M.; Elhabiby, M. M.; El-Sheimy, N.; Sesay, A. B.

    2015-08-01

    Accurate 3D reconstruction has become essential for non-traditional mapping applications such as urban planning, mining industry, environmental monitoring, navigation, surveillance, pipeline inspection, infrastructure monitoring, landslide hazard analysis, indoor localization, and military simulation. The needs of these applications cannot be satisfied by traditional mapping, which is based on dedicated data acquisition systems designed for mapping purposes. Recent advances in hardware and software development have made it possible to conduct accurate 3D mapping without using costly and high-end data acquisition systems. Low-cost digital cameras, laser scanners, and navigation systems can provide accurate mapping if they are properly integrated at the hardware and software levels. Unmanned Aerial Vehicles (UAVs) are emerging as a mobile mapping platform that can provide additional economical and practical advantages. However, such economical and practical requirements need navigation systems that can provide uninterrupted navigation solution. Hence, testing the performance characteristics of Micro-Electro-Mechanical Systems (MEMS) or low cost navigation sensors for various UAV applications is important research. This work focuses on studying the performance characteristics under different manoeuvres using inertial measurements integrated with single point positioning, Real-Time-Kinematic (RTK), and additional navigational aiding sensors. Furthermore, the performance of the inertial sensors is tested during Global Positioning System (GPS) signal outage.

  2. Key Concept Identification: A Comprehensive Analysis of Frequency and Topical Graph-Based Approaches

    Directory of Open Access Journals (Sweden)

    Muhammad Aman

    2018-05-01

    Full Text Available Automatic key concept extraction from text is the main challenging task in information extraction, information retrieval and digital libraries, ontology learning, and text analysis. The statistical frequency and topical graph-based ranking are the two kinds of potentially powerful and leading unsupervised approaches in this area, devised to address the problem. To utilize the potential of these approaches and improve key concept identification, a comprehensive performance analysis of these approaches on datasets from different domains is needed. The objective of the study presented in this paper is to perform a comprehensive empirical analysis of selected frequency and topical graph-based algorithms for key concept extraction on three different datasets, to identify the major sources of error in these approaches. For experimental analysis, we have selected TF-IDF, KP-Miner and TopicRank. Three major sources of error, i.e., frequency errors, syntactical errors and semantical errors, and the factors that contribute to these errors are identified. Analysis of the results reveals that performance of the selected approaches is significantly degraded by these errors. These findings can help us develop an intelligent solution for key concept extraction in the future.
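
    As a small illustration of the frequency-based side of the comparison, the sketch below ranks candidate terms of one document by TF-IDF weight using scikit-learn; the toy corpus and the n-gram range are assumptions for illustration and are unrelated to the datasets used in the paper.

```python
from sklearn.feature_extraction.text import TfidfVectorizer

# Toy corpus standing in for domain documents; candidate concepts are uni- and bi-grams
docs = [
    "key concept extraction from scientific text",
    "graph based ranking of topical concepts",
    "frequency based statistics for information retrieval",
]

vec = TfidfVectorizer(ngram_range=(1, 2), stop_words="english")
X = vec.fit_transform(docs)

# Rank candidate terms of the first document by TF-IDF weight
terms = vec.get_feature_names_out()
weights = X[0].toarray().ravel()
top = sorted(zip(terms, weights), key=lambda t: -t[1])[:5]
print(top)
```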

  3. 48 CFR 970.1100-1 - Performance-based contracting.

    Science.gov (United States)

    2010-10-01

    ... Federal Procurement Policy's Seven Steps to Performance-Based Acquisition located at Web site http://www...) performance standards and objectives and quality assurance surveillance plans; provide performance incentives... planning processes. Measurable performance criteria, objective measures, and where appropriate, performance...

  4. Column Selection for Biomedical Analysis Supported by Column Classification Based on Four Test Parameters.

    Science.gov (United States)

    Plenis, Alina; Rekowska, Natalia; Bączek, Tomasz

    2016-01-21

    This article focuses on correlating the column classification obtained from the method created at the Katholieke Universiteit Leuven (KUL), with the chromatographic resolution attained in biomedical separation. In the KUL system, each column is described with four parameters, which enables estimation of the FKUL value characterising similarity of those parameters to the selected reference stationary phase. Thus, a ranking list based on the FKUL value can be calculated for the chosen reference column, then correlated with the results of the column performance test. In this study, the column performance test was based on analysis of moclobemide and its two metabolites in human plasma by liquid chromatography (LC), using 18 columns. The comparative study was performed using traditional correlation of the FKUL values with the retention parameters of the analytes describing the column performance test. In order to deepen the comparative assessment of both data sets, factor analysis (FA) was also used. The obtained results indicated that the stationary phase classes, closely related according to the KUL method, yielded comparable separation for the target substances. Therefore, the column ranking system based on the FKUL-values could be considered supportive in the choice of the appropriate column for biomedical analysis.

  5. TOF plotter - a program to perform routine analysis of time-of-flight mass spectral data

    International Nuclear Information System (INIS)

    Knippel, Brad C.; Padgett, Clifford W.; Marcus, R. Kenneth

    2004-01-01

    The main article discusses the operation and application of the program to mass spectral data files. This laboratory has recently reported the construction and characterization of a linear time-of-flight mass spectrometer (ToF-MS) utilizing a radio frequency glow discharge ionization source. Data acquisition and analysis were performed using a digital oscilloscope and Microsoft Excel, respectively. Presently, no software package is available that is specifically designed for time-of-flight mass spectral analysis and is not instrument dependent. While spreadsheet applications such as Excel offer tremendous utility, they can be cumbersome when repeatedly performing tasks which are too complex or too user-intensive for macros to be viable. To address this situation and make data analysis a faster, simpler task, our laboratory has developed a Microsoft Windows-based software program coded in Microsoft Visual Basic. This program enables the user to rapidly perform routine data analysis tasks such as mass calibration, plotting and smoothing on x-y data sets. In addition to a suite of tools for data analysis, a number of calculators are built into the software to simplify routine calculations pertaining to linear ToF-MS. These include mass resolution, ion kinetic energy and single peak identification calculators. A detailed description of the software and its associated functions is presented, followed by a characterization of its performance in the analysis of several representative ToF-MS spectra obtained from different GD-ToF-MS systems.
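
    One of the routine tasks mentioned above, mass calibration for a linear ToF instrument, typically rests on the relation t = a·sqrt(m/z) + b. The sketch below fits the two constants from two hypothetical calibrant peaks and converts an unknown flight time to m/z; the numbers are illustrative, not taken from the article.

```python
import numpy as np

# Linear ToF-MS calibration: flight time t = a*sqrt(m/z) + b, so m/z = ((t - b)/a)**2.
# Two calibrant peaks (hypothetical flight times in microseconds for known m/z values):
mz_ref = np.array([63.0, 107.0])
t_ref = np.array([11.90, 15.45])

a, b = np.polyfit(np.sqrt(mz_ref), t_ref, 1)   # straight-line fit in sqrt(m/z)

def time_to_mz(t_us):
    return ((t_us - b) / a) ** 2

print(round(time_to_mz(13.5), 1))              # m/z of an unknown peak at 13.5 us
```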

  6. IMMAN: free software for information theory-based chemometric analysis.

    Science.gov (United States)

    Urias, Ricardo W Pino; Barigye, Stephen J; Marrero-Ponce, Yovani; García-Jacas, César R; Valdes-Martiní, José R; Perez-Gimenez, Facundo

    2015-05-01

    The features and theoretical background of a new and free computational program for chemometric analysis denominated IMMAN (acronym for Information theory-based CheMoMetrics ANalysis) are presented. This is multi-platform software developed in the Java programming language, designed with a remarkably user-friendly graphical interface for the computation of a collection of information-theoretic functions adapted for rank-based unsupervised and supervised feature selection tasks. A total of 20 feature selection parameters are presented, with the unsupervised and supervised frameworks represented by 10 approaches in each case. Several information-theoretic parameters traditionally used as molecular descriptors (MDs) are adapted for use as unsupervised rank-based feature selection methods. On the other hand, a generalization scheme for the previously defined differential Shannon's entropy is discussed, as well as the introduction of Jeffreys information measure for supervised feature selection. Moreover, well-known information-theoretic feature selection parameters, such as information gain, gain ratio, and symmetrical uncertainty are incorporated to the IMMAN software ( http://mobiosd-hub.com/imman-soft/ ), following an equal-interval discretization approach. IMMAN offers data pre-processing functionalities, such as missing values processing, dataset partitioning, and browsing. Moreover, single parameter or ensemble (multi-criteria) ranking options are provided. Consequently, this software is suitable for tasks like dimensionality reduction, feature ranking, as well as comparative diversity analysis of data matrices. Simple examples of applications performed with this program are presented. A comparative study between IMMAN and WEKA feature selection tools using the Arcene dataset was performed, demonstrating similar behavior. In addition, it is revealed that the use of IMMAN unsupervised feature selection methods improves the performance of both IMMAN and WEKA
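
    One of the supervised criteria mentioned above, information gain, can be sketched as follows for a continuous descriptor discretized into equal-interval bins; the synthetic feature/label data and the number of bins are assumptions for illustration, not IMMAN's implementation.

```python
import numpy as np

def entropy(labels):
    """Shannon entropy H(Y) in bits."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def information_gain(feature, labels, bins=4):
    """IG(Y; X) = H(Y) - H(Y | X) after equal-interval discretization of X."""
    x = np.digitize(feature, np.histogram_bin_edges(feature, bins=bins)[1:-1])
    h_cond = 0.0
    for v in np.unique(x):
        mask = x == v
        h_cond += mask.mean() * entropy(labels[mask])
    return entropy(labels) - h_cond

rng = np.random.default_rng(4)
y = rng.integers(0, 2, 500)                   # binary class labels
informative = y + rng.normal(0, 0.5, 500)     # descriptor correlated with the class
noise = rng.normal(size=500)                  # irrelevant descriptor
print(round(information_gain(informative, y), 3), round(information_gain(noise, y), 3))
```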

  7. Economic structure and performance of forest-based industries

    International Nuclear Information System (INIS)

    O'Laughlin, J.

    1989-01-01

    This paper reports on the economic structure, conduct, and performance of industries dependent on the nation's forests, which are topics of special importance for research. A major challenge to research involving industrial organization of forest-based industries is to link descriptions of structure, conduct, and industrial performance in ways that facilitate public and private policy making. Not to be overlooked is the need to continue efforts to monitor changes in structure and conduct dimensions at the national level and to conduct baseline studies of industry structure-conduct-performance at regional, state, and local levels. Specifically needed is research that will improve understanding of restructuring within the wood-based industry; definitions of the wood-based industry and segments thereof; linkages between structure and regional economic development; timberland as a managerial and economic variable; structural consequences of technological innovations; corporate strategies as related to performance; structural dimensions in an international setting; and structure and performance of nonwood-based forest industries. Economics research focused in such directions will go far toward improving the manner in which the nation's many forest industries organize and conduct their activities.

  8. 48 CFR 52.232-32 - Performance-Based Payments.

    Science.gov (United States)

    2010-10-01

    ... Contracting Officer, such excess shall be credited as a reduction in the unliquidated performance-based... adjustments. (e) Reduction or suspension of performance-based payments. The Contracting Officer may reduce or... sound and generally accepted accounting principles and practices: (i) Parts, materials, inventories, and...

  9. Two-Stage Performance Engineering of Container-based Virtualization

    Directory of Open Access Journals (Sweden)

    Zheng Li

    2018-02-01

    Full Text Available Cloud computing has become a compelling paradigm built on compute and storage virtualization technologies. The current virtualization solution in the Cloud widely relies on hypervisor-based technologies. Given the recent boom of the container ecosystem, container-based virtualization is starting to receive more attention as a promising alternative. Although container technologies are generally considered to be lightweight, no virtualization solution is ideally resource-free, and the corresponding performance overheads will lead to negative impacts on the quality of Cloud services. To facilitate understanding container technologies from the performance engineering perspective, we conducted two-stage performance investigations into Docker containers as a concrete example. At the first stage, we used a physical machine with “just-enough” resources as a baseline to investigate the performance overhead of a standalone Docker container against a standalone virtual machine (VM). With findings contrary to the related work, our evaluation results show that the virtualization's performance overhead could vary not only on a feature-by-feature basis but also on a job-to-job basis. Moreover, the hypervisor-based technology does not come with higher performance overhead in every case. For example, Docker containers particularly exhibit lower QoS in terms of storage transaction speed. At the ongoing second stage, we employed a physical machine with “fair-enough” resources to implement a container-based MapReduce application and tried to optimize its performance. In fact, this machine could not accommodate VM-based MapReduce clusters at the same scale. The performance tuning results show that the effects of different optimization strategies could largely be related to the data characteristics. For example, LZO compression can bring the most significant performance improvement when dealing with text data in our case.
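
    A feature-by-feature comparison of the kind described above can be reproduced with small, portable micro-benchmarks run unchanged on bare metal, in a VM, and in a container. The fsync-bound write loop below is a hypothetical stand-in for the storage-transaction workload; it is not the benchmark suite used in the study.

```python
import os, time, tempfile

def fsync_write_benchmark(n_ops=200, block=4096):
    """Time small synchronous writes; a rough stand-in for 'storage transaction speed'."""
    buf = os.urandom(block)
    with tempfile.NamedTemporaryFile(dir=".", delete=True) as f:
        start = time.perf_counter()
        for _ in range(n_ops):
            f.write(buf)
            f.flush()
            os.fsync(f.fileno())       # force each transaction to stable storage
        elapsed = time.perf_counter() - start
    return n_ops / elapsed             # transactions per second

# Run the same script on bare metal, inside a VM, and inside a container to compare overheads
print(f"{fsync_write_benchmark():.0f} synced writes/s")
```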

  10. Structural Design of HRA Database using generic task for Quantitative Analysis of Human Performance

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Seung Hwan; Kim, Yo Chan; Choi, Sun Yeong; Park, Jin Kyun; Jung Won Dea [KAERI, Daejeon (Korea, Republic of)

    2016-05-15

    This paper describes the design of a generic-task-based HRA database for the quantitative analysis of human performance, intended to estimate the number of task conductions. An estimation method that obtains the total number of task conductions by direct counting is not easy to realize, nor is its data collection framework easy to maintain. To resolve this problem, this paper suggests an indirect method and a database structure using generic tasks that enables the total number of conductions to be estimated from the instructions of the operating procedures of nuclear power plants. In order to reduce human errors, all information on the human errors made by operators in the power plant should be systematically collected and examined. The Korea Atomic Energy Research Institute (KAERI) is carrying out research to develop a data collection framework to establish a Human Reliability Analysis (HRA) database that could be employed as a technical basis to generate human error probabilities (HEPs) and performance shaping factors (PSFs). As a result of the study, the essential table schema was designed for the generic task database, which stores generic tasks, procedure lists, task tree structures, and other supporting tables. Estimating the number of task conductions from the operating procedures for HEP estimation was enabled through the generic task database and framework. To verify the framework's applicability, a case study of the simulated experiments was performed and analyzed using graphical user interfaces developed in this study.

  11. Structural Design of HRA Database using generic task for Quantitative Analysis of Human Performance

    International Nuclear Information System (INIS)

    Kim, Seung Hwan; Kim, Yo Chan; Choi, Sun Yeong; Park, Jin Kyun; Jung Won Dea

    2016-01-01

    This paper describes the design of a generic-task-based HRA database for quantitative analysis of human performance, intended to estimate the number of task conductions. Estimating the total number of task conductions by direct counting is not easy to realize, and its data collection framework is hard to maintain. To resolve this problem, this paper suggests an indirect method and a database structure using generic tasks that makes it possible to estimate the total number of conductions based on the instructions of nuclear power plant operating procedures. In order to reduce human errors, all information on the human errors made by operators in the power plant should be systematically collected and examined. Korea Atomic Energy Research Institute (KAERI) is carrying out research to develop a data collection framework for a Human Reliability Analysis (HRA) database that could be employed as a technical basis to generate human error probabilities (HEPs) and performance shaping factors (PSFs). As a result of the study, the essential table schema was designed for the generic task database, which stores generic tasks, procedure lists, task tree structures, and other supporting tables. Estimation of the number of task conductions based on operating procedures, as required for HEP estimation, was enabled through the generic task database and framework. To verify the applicability of the framework, a case study of simulated experiments was performed and analyzed using graphic user interfaces developed in this study.

  12. The Analysis of Vertical Transaction Behavior and Performance Based on Automobile Brand Trust in Supply Chain

    Directory of Open Access Journals (Sweden)

    Guanglan Zhou

    2016-01-01

    Full Text Available Non-trust behaviors among automobile supply chain members lead to a trust crisis. Under such circumstances, this paper studies the mutual influences of trust, enterprise behavior, and transaction performance in the independent-brand automobile supply chain. A business behavior concept consisting of information sharing, joint action, and specific asset investment is proposed. The paper then tests the reliability and validity of the collected data through Structural Equation Modeling (SEM). Through empirical testing and analysis of the mutual relationships among vertical transaction enterprise behaviors, trust, and transaction performance, the vertical transaction enterprise behaviors can be regulated so as to improve the efficiency of the independent-brand automobile supply chain.

  13. Techno-Economic Analysis of Biofuels Production Based on Gasification

    Energy Technology Data Exchange (ETDEWEB)

    Swanson, R. M.; Platon, A.; Satrio, J. A.; Brown, R. C.; Hsu, D. D.

    2010-11-01

    This study compares capital and production costs of two biomass-to-liquid production plants based on gasification. The first biorefinery scenario is an oxygen-fed, low-temperature (870°C), non-slagging, fluidized bed gasifier. The second scenario is an oxygen-fed, high-temperature (1,300°C), slagging, entrained flow gasifier. Both are followed by catalytic Fischer-Tropsch synthesis and hydroprocessing to naphtha-range (gasoline blend stock) and distillate-range (diesel blend stock) liquid fractions. Process modeling software (Aspen Plus) is utilized to organize the mass and energy streams and cost estimation software is used to generate equipment costs. Economic analysis is performed to estimate the capital investment and operating costs. Results show that the total capital investment required for nth plant scenarios is $610 million and $500 million for high-temperature and low-temperature scenarios, respectively. Product value (PV) for the high-temperature and low-temperature scenarios is estimated to be $4.30 and $4.80 per gallon of gasoline equivalent (GGE), respectively, based on a feedstock cost of $75 per dry short ton. Sensitivity analysis is also performed on process and economic parameters. This analysis shows that total capital investment and feedstock cost are among the most influential parameters affecting the PV.

  14. Performance-based Pareto optimal design

    NARCIS (Netherlands)

    Sariyildiz, I.S.; Bittermann, M.S.; Ciftcioglu, O.

    2008-01-01

    A novel approach for performance-based design is presented, where Pareto optimality is pursued. Design requirements may contain linguistic information, which is difficult to bring into computation or to estimate consistently and impartially from case to case. Fuzzy logic and soft computing are

  15. FMEA team performance in health care: A qualitative analysis of team member perceptions.

    Science.gov (United States)

    Wetterneck, Tosha B; Hundt, Ann Schoofs; Carayon, Pascale

    2009-06-01

    Failure mode and effects analysis (FMEA) is a commonly used prospective risk assessment approach in health care. Failure mode and effects analyses are time consuming and resource intensive, and team performance is crucial for FMEA success. We evaluate FMEA team members' perceptions of FMEA team performance to provide recommendations to improve the FMEA process in health care organizations. Structured interviews and survey questionnaires were administered to team members of 2 FMEA teams at a Midwest hospital to evaluate team member perceptions of FMEA team performance and factors influencing team performance. Interview transcripts underwent content analysis, and descriptive statistics were performed on questionnaire results to identify and quantify FMEA team performance. Theme-based nodes were categorized using the input-process-outcome model for team performance. Twenty-eight interviews and questionnaires were completed by 24 team members. Four persons participated on both teams. There were significant differences between the 2 teams regarding perceptions of team functioning and overall team effectiveness that are explained by differences in team inputs and process (e.g., leadership/facilitation, team objectives, attendance of process owners). Evaluation of team members' perceptions of team functioning produced useful insights that can be used to model future team functioning. Guidelines for FMEA team success are provided.

  16. Pin-wise Reactor Analysis Based on the Generalized Equivalence Theory

    Energy Technology Data Exchange (ETDEWEB)

    Yu, Hwan Yeal; Heo, Woong; Kim, Yong Hee [KAIST, Daejeon (Korea, Republic of)

    2016-05-15

    In this paper, a pin-wise reactor analysis is performed based on the generalized equivalence theory. From the conventional fuel assembly lattice calculations, pin-wise 2-group cross sections and pin DFs are generated. Based on the numerical results on a small PWR benchmark, it is observed that the pin-wise core analysis provides quite accurate predictions of the effective multiplication factor, and the peak pin power error is bounded by about 3% in peripheral fuel assemblies facing the baffle-reflector. Also, it was found that relatively large pin power errors occur along the interface between clearly different fuel assemblies. It is expected that the GET-based pin-by-pin core calculation can be further developed as an advanced method for reactor analysis via improving the group constants and discontinuity factors. Recently, high-fidelity multi-dimensional analysis tools are gaining more attention because of their accurate prediction of local parameters for core design and safety assessment. In terms of accuracy, direct whole-core transport is quite promising. However, it is clear that it is still very costly in terms of the computing time and memory requirements. Another possible solution is the pin-by-pin core analysis in which only the small fuel pins are homogenized and the 3-D core analysis is still performed using a low-order operator such as the diffusion theory. In this paper, a pin-by-pin core analysis is performed using the hybrid CMFD (HCMFD) method. Hybrid CMFD is a new global-local iteration method that has been developed for efficient parallel calculation of pin-by-pin heterogeneous core analysis. For the HCMFD method, the one-node CMFD scheme is combined with a local two-node CMFD method in a non-linear way. Since the SPH method is iterative and SPH factors are not direction dependent, it is clear that the SPH method incurs more computing cost and cannot take into account the different heterogeneity and transport effects at each pin interface. Unlike the SPH
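
    The abstract refers to diffusion-theory core solvers and CMFD-type coupling without giving equations. As generic background only, the sketch below solves a one-group, one-dimensional finite-difference diffusion eigenvalue problem by power iteration; it is not the HCMFD scheme or the GET/SPH treatment of the paper, and the cross sections are invented.

    ```python
    import numpy as np

    def diffusion_keff(D, siga, nusigf, dx, nx):
        """Generic 1-D, one-group finite-difference diffusion eigenvalue solve
        by power iteration (background sketch only, not the paper's HCMFD).
        Zero flux is assumed in the ghost cells just outside the slab."""
        A = np.zeros((nx, nx))               # loss operator: -D d2/dx2 + siga
        for i in range(nx):
            A[i, i] = 2 * D / dx**2 + siga
            if i > 0:
                A[i, i - 1] = -D / dx**2
            if i < nx - 1:
                A[i, i + 1] = -D / dx**2
        F = nusigf * np.eye(nx)              # fission production operator

        phi, k = np.ones(nx), 1.0
        for _ in range(200):                 # power iteration on A phi = (1/k) F phi
            phi_new = np.linalg.solve(A, F @ phi / k)
            k *= phi_new.sum() / phi.sum()   # update k from the fission-source ratio
            phi = phi_new
        return k, phi / phi.max()

    # Invented homogeneous slab data, roughly thermal-reactor-like in magnitude.
    k_eff, flux = diffusion_keff(D=1.2, siga=0.03, nusigf=0.035, dx=1.0, nx=50)
    print(round(k_eff, 4))
    ```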

  17. Performance of advanced self-shielding models in DRAGON Version4 on analysis of a high conversion light water reactor lattice

    International Nuclear Information System (INIS)

    Karthikeyan, Ramamoorthy; Hebert, Alain

    2008-01-01

    A high conversion light water reactor lattice has been analysed using the code DRAGON Version4. This analysis was performed to test the performance of the advanced self-shielding models incorporated in DRAGON Version4. The self-shielding models are broadly classified into two groups - 'equivalence in dilution' and 'subgroup approach'. Under the 'equivalence in dilution' approach we have analysed the generalized Stamm'ler model with and without Nordheim model and Riemann integration. These models have been analysed also using the Livolant-Jeanpierre normalization. Under the 'subgroup approach', we have analysed the Statistical self-shielding model based on physical probability tables and the Ribon extended self-shielding model based on mathematical probability tables. This analysis will help in understanding the performance of advanced self-shielding models for a lattice that is tight and has a large fraction of fissions happening in the resonance region. The nuclear data for the analysis was generated in-house. NJOY99.90 was used for generating libraries in DRAGLIB format for analysis using DRAGON and A Compact ENDF libraries for analysis using MCNP5. The evaluated datafiles were chosen based on the recommendations of the IAEA Co-ordinated Research Project on the WIMS Library Update Project. The reference solution for the problem was obtained using the Monte Carlo code MCNP5. It was found that the Ribon extended self-shielding model based on mathematical probability tables using the correlation model performed better than all other models.

  18. Performance, labour flexibility and migrant workers in hotels: An establishment and departmental level analysis

    OpenAIRE

    Yaduma, N; Williams, A; Lockwood, A; Park, S

    2015-01-01

    This paper analyses flexible working, and the employment of migrants, as determinants of performance in hotels, utilising a highly disaggregated data set of actual hours worked and outputs, on a monthly basis, over an 8 year period for 25 establishments within a single firm. It examines not only inter-establishment, but also intra-establishment (departmental) variations in performance. The analysis also systematically compares the findings based on financial versus physical measures, ...

  19. Sub-pattern based multi-manifold discriminant analysis for face recognition

    Science.gov (United States)

    Dai, Jiangyan; Guo, Changlu; Zhou, Wei; Shi, Yanjiao; Cong, Lin; Yi, Yugen

    2018-04-01

    In this paper, we present a Sub-pattern based Multi-manifold Discriminant Analysis (SpMMDA) algorithm for face recognition. Unlike the existing Multi-manifold Discriminant Analysis (MMDA) approach, which is based on the holistic information of the face image for recognition, SpMMDA operates on sub-images partitioned from the original face image and then extracts the discriminative local feature from the sub-images separately. Moreover, the structure information of different sub-images from the same face image is considered in the proposed method with the aim of further improving the recognition performance. Extensive experiments on three standard face databases (Extended YaleB, CMU PIE and AR) demonstrate that the proposed method is effective and outperforms some other sub-pattern based face recognition methods.
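
    As a minimal sketch of the sub-pattern idea only (not the full SpMMDA algorithm), the code below partitions each face image into non-overlapping sub-images and learns one discriminant projection per sub-pattern, concatenating the projected local features. The 4x4 partition, the toy random data, and the use of scikit-learn's LDA in place of the paper's multi-manifold discriminant analysis are assumptions for illustration.

    ```python
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    def split_into_subpatterns(img, rows=4, cols=4):
        """Partition a 2-D face image into rows*cols non-overlapping sub-images,
        returned as flattened vectors (one per sub-pattern)."""
        h, w = img.shape
        bh, bw = h // rows, w // cols
        return [img[r*bh:(r+1)*bh, c*bw:(c+1)*bw].ravel()
                for r in range(rows) for c in range(cols)]

    # Toy data: 20 random 32x32 "faces" from 2 classes (stand-in for a real database).
    rng = np.random.default_rng(0)
    images = rng.random((20, 32, 32))
    labels = np.repeat([0, 1], 10)

    # Collect the same sub-pattern across all images, then learn one local
    # discriminant projection per sub-pattern and concatenate the projections.
    n_sub = 16
    sub_vectors = [[] for _ in range(n_sub)]
    for img in images:
        for k, vec in enumerate(split_into_subpatterns(img)):
            sub_vectors[k].append(vec)

    projected = []
    for k in range(n_sub):
        lda = LinearDiscriminantAnalysis(n_components=1)
        projected.append(lda.fit_transform(np.asarray(sub_vectors[k]), labels))
    features = np.hstack(projected)      # per-image concatenated local features
    print(features.shape)                # (20, 16)
    ```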

  20. FINANCIAL PERFORMANCE ANALYSIS BASED ON THE PROFIT AND LOSS STATEMENT

    Directory of Open Access Journals (Sweden)

    Ludmila PROFIR

    2017-07-01

    Full Text Available Financial performance is often difficult for economic entities to achieve, especially in the current economic context. The successful models of some companies constitute examples of good practice for aspirants. The profit and loss statement, part of the annual reports, is a synthesis accounting document that shows the result of the company's activity and thus measures the firm's performance during a year. The purpose of this paper is to analyze the impact of the operating result on financial performance through the net income. The target population of the study was the companies listed and traded on the Bucharest Stock Exchange during 2012-2016. The results of this study showed that the operating result contributed significantly to the net income, and that the companies listed and traded on the Bucharest Stock Exchange successfully overcame the negative effects of the crisis and the recession.

  1. Observability analysis for model-based fault detection and sensor selection in induction motors

    International Nuclear Information System (INIS)

    Nakhaeinejad, Mohsen; Bryant, Michael D

    2011-01-01

    Sensors in different types and configurations provide information on the dynamics of a system. For a specific task, the question is whether measurements have enough information or whether the sensor configuration can be changed to improve the performance or to reduce costs. Observability analysis may answer these questions. This paper presents a general algorithm of nonlinear observability analysis with application to model-based diagnostics and sensor selection in three-phase induction motors. A bond graph model of the motor is developed and verified with experiments. A nonlinear observability matrix based on Lie derivatives is obtained from state equations. An observability index based on the singular value decomposition of the observability matrix is obtained. Singular values and singular vectors are used to identify the most and least observable configurations of sensors and parameters. A complex step derivative technique is used in the calculation of Jacobians to improve the computational performance of the observability analysis. The proposed observability analysis algorithm can be applied to any nonlinear system to select the best configuration of sensors for model-based diagnostics and observer-based control, or to determine the level of sensor redundancy. Observability analysis on induction motors provides various sensor configurations with corresponding observability indices. Results show the redundancy levels for different sensors and provide a sensor selection guideline for model-based diagnostics and for observer-based controllers. The results can also be used for sensor fault detection and to improve the reliability of the system by increasing the redundancy level in measurements.
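
    The abstract gives no formulas; the sketch below illustrates the general recipe for the linear case: assemble an observability matrix, take its singular value decomposition, and use the ratio of smallest to largest singular value as an observability index for a candidate sensor configuration. The toy system matrices are invented, and the Lie-derivative construction for the nonlinear induction-motor model is not reproduced.

    ```python
    import numpy as np

    def observability_index(A, C):
        """Observability index of the pair (A, C): build the linear observability
        matrix [C; CA; ...; CA^(n-1)], then return sigma_min / sigma_max of its SVD.
        Values near zero indicate a poorly observable sensor configuration."""
        n = A.shape[0]
        blocks = [C]
        for _ in range(n - 1):
            blocks.append(blocks[-1] @ A)
        O = np.vstack(blocks)
        s = np.linalg.svd(O, compute_uv=False)
        return s[-1] / s[0]

    # Toy 4-state system; each row of C_full is one candidate sensor.
    A = np.array([[0.0, 1.0, 0.0, 0.0],
                  [-2.0, -0.5, 1.0, 0.0],
                  [0.0, 0.0, 0.0, 1.0],
                  [1.0, 0.0, -3.0, -0.2]])
    C_full = np.eye(4)

    # Rank single-sensor configurations by their observability index.
    for i in range(4):
        print(f"sensor {i}: index = {observability_index(A, C_full[i:i+1]):.3e}")
    ```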

  2. Scaling earthquake ground motions for performance-based assessment of buildings

    Science.gov (United States)

    Huang, Y.-N.; Whittaker, A.S.; Luco, N.; Hamburger, R.O.

    2011-01-01

    The impact of alternate ground-motion scaling procedures on the distribution of displacement responses in simplified structural systems is investigated. Recommendations are provided for selecting and scaling ground motions for performance-based assessment of buildings. Four scaling methods are studied, namely, (1) geometric-mean scaling of pairs of ground motions, (2) spectrum matching of ground motions, (3) first-mode-period scaling to a target spectral acceleration, and (4) scaling of ground motions per the distribution of spectral demands. Data were developed by nonlinear response-history analysis of a large family of nonlinear single degree-of-freedom (SDOF) oscillators that could represent fixed-base and base-isolated structures. The advantages and disadvantages of each scaling method are discussed. The relationship between spectral shape and a ground-motion randomness parameter is presented. A scaling procedure that explicitly considers spectral shape is proposed.
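
    As a minimal sketch of the third method listed (first-mode-period scaling), the code below interpolates a record's response spectrum at the structure's first-mode period and computes the factor that matches the target spectral acceleration there. The spectra are toy arrays standing in for a design spectrum and a computed 5%-damped record spectrum.

    ```python
    import numpy as np

    def first_mode_scale_factor(periods, record_sa, target_sa, T1):
        """Scale factor that makes the record's spectral acceleration at the
        first-mode period T1 match the target spectral acceleration at T1."""
        sa_record = np.interp(T1, periods, record_sa)
        sa_target = np.interp(T1, periods, target_sa)
        return sa_target / sa_record

    # Toy spectra (period in s, Sa in g); a real application would use the
    # design spectrum and the record's computed response spectrum.
    periods = np.linspace(0.05, 4.0, 80)
    target_sa = 1.2 * np.exp(-periods)           # stand-in design spectrum
    record_sa = 0.9 * np.exp(-1.2 * periods)     # stand-in record spectrum

    k = first_mode_scale_factor(periods, record_sa, target_sa, T1=1.0)
    scaled_record_sa = k * record_sa             # the ground motion itself is scaled by k
    print(round(k, 3))
    ```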

  3. Performance Analysis of Web-Based PPP Services with Different Visibility Conditions

    Science.gov (United States)

    Albayrak, M.; Erkaya, H.; Ozludemir, M. T.; Ocalan, T.

    2016-12-01

    GNSS is currently used effectively for precise positioning in many surveying and geodetic applications. These systems, including their post-processing options, are growing in number, quality, and features, and many different techniques have been developed to determine position. Deriving precise positions requires user experience and scientific or commercial software with costly license fees. However, in recent years important alternatives to such software, which are user friendly and offer free web-based online precise point positioning services, have become widely used in geodetic applications. The aim of this study is to test the performance of PPP techniques on ground control points with different visibility conditions. Within this framework, static observations were carried out for three hours a day, repeatedly for six days, on three different ground control points on the YTU Davutpasa Campus. The locations of these stations were selected by taking into account the impact of natural (trees, etc.) and artificial (buildings, etc.) obstacles. In order to compare the obtained GPS observations with PPP performance, the accurate coordinates of the control points were first computed with the relative positioning technique in connection with the IGS stations using Bernese v5.0 software. Afterwards, three different web-based positioning services (CSRS-PPP, magicGNSS, GAPS) were used to analyze the GPS observations via the PPP technique. To compare all of the obtained results, ITRF2008 datum measurement-epoch coordinates were preferred, taking the service result criteria into consideration. In the coordinate comparison, for the first station, located near a building and possibly subject to multipath effects, horizontal discrepancies vary between 2-14.5 cm while vertical differences are between 3.5-16 cm. For the second point, located partly in a forested area, the discrepancies have been obtained as 1.5-8 cm and 2-10 cm for horizontal and

  4. AR(p) -based detrended fluctuation analysis

    Science.gov (United States)

    Alvarez-Ramirez, J.; Rodriguez, E.

    2018-07-01

    Autoregressive models are commonly used for modeling time-series from nature, economics and finance. This work explored simple autoregressive AR(p) models to remove long-term trends in detrended fluctuation analysis (DFA). Crude oil prices and bitcoin exchange rate were considered, with the former corresponding to a mature market and the latter to an emergent market. Results showed that AR(p) -based DFA performs similar to traditional DFA. However, the former DFA provides information on stability of long-term trends, which is valuable for understanding and quantifying the dynamics of complex time series from financial systems.
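
    The abstract describes the method only at a high level. The sketch below is one plausible reading: standard DFA in which the polynomial trend of each window is replaced by the in-sample fit of an AR(p) model estimated by least squares. The order p, the window sizes, and this specific detrending choice are assumptions and may differ from the authors' implementation.

    ```python
    import numpy as np

    def ar_fit_residuals(y, p):
        """Fit an AR(p) model with intercept to segment y by least squares and
        return the residuals y[p:] minus the fitted values (the detrended segment)."""
        n = len(y)
        X = np.column_stack([np.ones(n - p)] +
                            [y[p - i - 1:n - i - 1] for i in range(p)])
        coef, *_ = np.linalg.lstsq(X, y[p:], rcond=None)
        return y[p:] - X @ coef

    def ar_dfa(x, p=2, scales=(16, 32, 64, 128, 256)):
        """DFA with AR(p) detrending: cumulative profile, AR(p) fit per window,
        RMS fluctuation per scale, and the log-log slope as the scaling exponent."""
        y = np.cumsum(x - np.mean(x))
        F = []
        for s in scales:
            segs = [y[i:i + s] for i in range(0, len(y) - s + 1, s)]
            rms = [np.sqrt(np.mean(ar_fit_residuals(seg, p) ** 2)) for seg in segs]
            F.append(np.mean(rms))
        return np.polyfit(np.log(scales), np.log(F), 1)[0]

    # Example: estimate the exponent for two synthetic series (noise and its cumulative sum).
    rng = np.random.default_rng(1)
    noise = rng.standard_normal(4096)
    print(ar_dfa(noise))
    print(ar_dfa(np.cumsum(noise)))
    ```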

  5. Economic Performance Analysis of National Research and Development Project

    Energy Technology Data Exchange (ETDEWEB)

    Kim, S. S.; Yun, S. W.; Kim, S. E. [KAERI, Daejeon (Korea, Republic of)

    2016-05-15

    There are many differences between these two evaluation programs in terms of their main objectives, assessment items, and evaluation methods by item. When considering the recent evaluation trend of being more concerned with objective and scientifically well-founded bases of judgment than with qualitative results data, there seem to be many points to supplement and improve in both evaluation programs. Firstly, the MSIP's evaluation program, known as 'The performance analysis of national R and D program in Korea', applies the principle of ex-post evaluation to the overall performance of R and D activities, focusing on scientific and technological outputs, economic effects, and social performance such as the training of science and engineering personnel. Its report has been prepared and published through the collaboration of MSIP and KISTEP (Korea Institute of Science and Technology Evaluation and Planning). There seems to be a trend that the economic contributions of national R and D projects to the national economy and to industry have been underestimated, because it is difficult to present reliable quantitative effects properly, even though these projects have contributed not only to the real economy and economic growth but also to industrial production and public benefits. The key reasons for this might be a lack of awareness of evaluation tools and of methodology development, together with the inherent difficulty of evaluating R and D performance. In particular, the evaluation results for national R and D projects could affect investment decisions on long-term national R and D programs, based on investment efficiency or on the necessity and urgency represented by the evaluation results.

  6. Economic Performance Analysis of National Research and Development Project

    International Nuclear Information System (INIS)

    Kim, S. S.; Yun, S. W.; Kim, S. E.

    2016-01-01

    There are many differences between these two evaluation programs in terms of their main objectives, assessment items, and evaluation methods by item. When considering the recent evaluation trend of being more concerned with objective and scientifically well-founded bases of judgment than with qualitative results data, there seem to be many points to supplement and improve in both evaluation programs. Firstly, the MSIP's evaluation program, known as 'The performance analysis of national R and D program in Korea', applies the principle of ex-post evaluation to the overall performance of R and D activities, focusing on scientific and technological outputs, economic effects, and social performance such as the training of science and engineering personnel. Its report has been prepared and published through the collaboration of MSIP and KISTEP (Korea Institute of Science and Technology Evaluation and Planning). There seems to be a trend that the economic contributions of national R and D projects to the national economy and to industry have been underestimated, because it is difficult to present reliable quantitative effects properly, even though these projects have contributed not only to the real economy and economic growth but also to industrial production and public benefits. The key reasons for this might be a lack of awareness of evaluation tools and of methodology development, together with the inherent difficulty of evaluating R and D performance. In particular, the evaluation results for national R and D projects could affect investment decisions on long-term national R and D programs, based on investment efficiency or on the necessity and urgency represented by the evaluation results.

  7. A Performance Survey on Stack-based and Register-based Virtual Machines

    OpenAIRE

    Fang, Ruijie; Liu, Siqi

    2016-01-01

    Virtual machines have been widely adopted for high-level programming language implementations and for providing a degree of platform neutrality. As the overall use and adoption of virtual machines grow, the overall performance of virtual machines has become a widely discussed topic. In this paper, we present a survey on the performance differences of the two most widely adopted types of virtual machines - the stack-based virtual machine and the register-based virtual machine - using various...
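
    As a toy illustration of the two designs being compared (not the benchmarks of the survey), the sketch below evaluates the same expression on a minimal stack machine and a minimal register machine and counts dispatched instructions; both instruction sets are invented for illustration.

    ```python
    # Toy stack-based vs register-based VM computing (x + y) * z for locals x, y, z.
    # Both instruction sets are invented for illustration only.

    def run_stack(program, locals_):
        """Stack VM: operands are pushed onto and popped from an implicit stack."""
        stack, dispatched = [], 0
        for op, *args in program:
            dispatched += 1
            if op == "load":                 # push a local variable
                stack.append(locals_[args[0]])
            elif op == "add":
                b, a = stack.pop(), stack.pop()
                stack.append(a + b)
            elif op == "mul":
                b, a = stack.pop(), stack.pop()
                stack.append(a * b)
        return stack.pop(), dispatched

    def run_register(program, regs):
        """Register VM: locals live in registers and are addressed explicitly,
        so fewer (but wider) instructions are dispatched."""
        regs, dispatched = list(regs), 0
        for op, dst, a, b in program:
            dispatched += 1
            if op == "add":
                regs[dst] = regs[a] + regs[b]
            elif op == "mul":
                regs[dst] = regs[a] * regs[b]
        return regs[program[-1][1]], dispatched

    locals_ = [2, 3, 4]                                   # x, y, z
    stack_prog = [("load", 0), ("load", 1), ("add",), ("load", 2), ("mul",)]
    reg_prog = [("add", 3, 0, 1), ("mul", 3, 3, 2)]       # registers 0..2 hold x, y, z

    print(run_stack(stack_prog, locals_))                 # -> (20, 5): five instructions
    print(run_register(reg_prog, [*locals_, 0]))          # -> (20, 2): two instructions
    ```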

  8. Voxel-Based Morphometry ALE meta-analysis of Bipolar Disorder

    Science.gov (United States)

    Magana, Omar; Laird, Robert

    2012-03-01

    A meta-analysis was performed independently to examine changes in gray matter (GM) in patients with bipolar disorder (BP). The meta-analysis was conducted in Talairach space using GingerALE to determine the voxels and their permutation. For data acquisition, published experiments and similar research studies were uploaded onto the online Voxel-Based Morphometry (VBM) database. Coordinates of activation locations were then extracted from bipolar-disorder-related journals utilizing Sleuth. Once the coordinates of the selected experiments were imported into GingerALE, a Gaussian was applied to all foci points to create the concentration points of GM in BP patients. The results included volume reductions and variations of GM between normal healthy controls and patients with bipolar disorder. A significant number of GM clusters were obtained in normal healthy controls over BP patients in the right precentral gyrus, the right anterior cingulate, and the left inferior frontal gyrus. In future research, more published journals could be uploaded onto the database and another VBM meta-analysis could be performed including more activation coordinates or a variation of age groups.

  9. A theory of planned behaviour-based analysis of TIMSS 2011 to determine factors influencing inquiry teaching practices in high-performing countries

    Science.gov (United States)

    Pongsophon, Pongprapan; Herman, Benjamin C.

    2017-07-01

    Given the abundance of literature describing the strong relationship between inquiry-based teaching and student achievement, more should be known about the factors impacting science teachers' classroom inquiry implementation. This study utilises the theory of planned behaviour to propose and validate a causal model of inquiry-based teaching through analysing data relating to high-performing countries retrieved from the 2011 Trends in International Mathematics and Science Study assessments. Data analysis was completed through structural equation modelling using a polychoric correlation matrix for data input and diagonally weighted least squares estimation. Adequate fit of the full model to the empirical data was realised. The model demonstrates that the extent the teachers participated in academic collaborations was positively related to their occupational satisfaction, confidence in teaching inquiry, and classroom inquiry practices. Furthermore, the teachers' confidence with implementing inquiry was positively related to their classroom inquiry implementation and occupational satisfaction. However, perceived student-generated constraints demonstrated a negative relationship with the teachers' confidence with implementing inquiry and occupational satisfaction. Implications from this study include supporting teachers through promoting collaborative opportunities that facilitate inquiry-based practices and occupational satisfaction.

  10. Performance and Complexity Analysis of Blind FIR Channel Identification Algorithms Based on Deterministic Maximum Likelihood in SIMO Systems

    DEFF Research Database (Denmark)

    De Carvalho, Elisabeth; Omar, Samir; Slock, Dirk

    2013-01-01

    We analyze two algorithms that have been introduced previously for Deterministic Maximum Likelihood (DML) blind estimation of multiple FIR channels. The first one is a modification of the Iterative Quadratic ML (IQML) algorithm. IQML gives biased estimates of the channel and performs poorly at low...... to the initialization. Its asymptotic performance does not reach the DML performance though. The second strategy, called Pseudo-Quadratic ML (PQML), is naturally denoised. The denoising in PQML is furthermore more efficient than in DIQML: PQML yields the same asymptotic performance as DML, as opposed to DIQML......, but requires a consistent initialization. We furthermore compare DIQML and PQML to the strategy of alternating minimization w.r.t. symbols and channel for solving DML (AQML). An asymptotic performance analysis, a complexity evaluation and simulation results are also presented. The proposed DIQML and PQML...

  11. How Game Location Affects Soccer Performance: T-Pattern Analysis of Attack Actions in Home and Away Matches

    Directory of Open Access Journals (Sweden)

    Barbara Diana

    2017-08-01

    Full Text Available The influence of game location on performance has been widely examined in sport contexts. Concerning soccer, game location positively affects the secondary and tertiary levels of performance; however, there is less evidence about its effect on game structure (primary level of performance). This study aimed to detect the effect of game location on a primary level of performance in soccer. In particular, the objective was to reveal the hidden structures underlying the attack actions, in both home and away matches played by a top club (Serie A 2012/2013—First Leg). The methodological approach was based on systematic observation, supported by digital recordings and T-pattern analysis. Data were analyzed with THEME 6.0 software. A quantitative analysis, with nonparametric Mann–Whitney test and descriptive statistics, was carried out to test the hypotheses. A qualitative analysis on complex patterns was performed to get in-depth information on the game structure. This study showed that game tactics were significantly different, with home matches characterized by a more structured and varied game than away matches. In particular, a higher number of different patterns, with a higher level of complexity and including more unique behaviors was detected in home matches than in the away ones. No significant differences were found in the number of events coded per game between the two conditions. THEME software, and the corresponding T-pattern detection algorithm, enhance research opportunities by going further than frequency-based analyses, making this method an effective tool in supporting sport performance analysis and training.

  12. How Game Location Affects Soccer Performance: T-Pattern Analysis of Attack Actions in Home and Away Matches.

    Science.gov (United States)

    Diana, Barbara; Zurloni, Valentino; Elia, Massimiliano; Cavalera, Cesare M; Jonsson, Gudberg K; Anguera, M Teresa

    2017-01-01

    The influence of game location on performance has been widely examined in sport contexts. Concerning soccer, game location positively affects the secondary and tertiary levels of performance; however, there is less evidence about its effect on game structure (primary level of performance). This study aimed to detect the effect of game location on a primary level of performance in soccer. In particular, the objective was to reveal the hidden structures underlying the attack actions, in both home and away matches played by a top club (Serie A 2012/2013-First Leg). The methodological approach was based on systematic observation, supported by digital recordings and T-pattern analysis. Data were analyzed with THEME 6.0 software. A quantitative analysis, with nonparametric Mann-Whitney test and descriptive statistics, was carried out to test the hypotheses. A qualitative analysis on complex patterns was performed to get in-depth information on the game structure. This study showed that game tactics were significantly different, with home matches characterized by a more structured and varied game than away matches. In particular, a higher number of different patterns, with a higher level of complexity and including more unique behaviors was detected in home matches than in the away ones. No significant differences were found in the number of events coded per game between the two conditions. THEME software, and the corresponding T-pattern detection algorithm, enhance research opportunities by going further than frequency-based analyses, making this method an effective tool in supporting sport performance analysis and training.

  13. Hospital Value-Based Purchasing Performance: Do Organizational and Market Characteristics Matter?

    Science.gov (United States)

    Spaulding, Aaron; Edwardson, Nick; Zhao, Mei

    The hospital value-based purchasing (HVBP) program of the Centers for Medicare & Medicaid Services challenges hospitals to deliver high-quality care or face a reduction in Medicare payments. How do different organizational structures and market characteristics enable or inhibit successful transition to this new model of value-based care? To address that question, this study employs an institutional theory lens to test whether certain organizational structures and market characteristics mediate hospitals' ability to perform across HVBP domains. Data are drawn from the 2014 American Hospital Association Annual Survey Database, the Area Health Resource File, and the Medicare Hospital Compare Database, and the association between external environment and hospital performance is assessed through multiple regression analysis. Results indicate that hospitals that belong to a system are more likely than independent hospitals to score highly on the domains associated with the HVBP incentive arrangement. However, varying and sometimes counterintuitive market influences bring different dimensions to the HVBP program. A hospital's ability to score well in this new value arrangement may be heavily based on the organization's ability to learn from others, implement change, and apply the appropriate amount of control in various markets.

  14. Mixed movements/performance-based drawing

    DEFF Research Database (Denmark)

    Brabrand, Helle

    2011-01-01

    Mixed Movements is a research project engaged in performance-based architectural drawing. As one in a series working with architectonic implementation in relation to body and movements, the actual project relates body-movement and dynamic drawing and presents the material as interactive ‘space-time-tables’....

  15. Performance Comparison of Assorted Color Spaces for Multilevel Block Truncation Coding based Face Recognition

    OpenAIRE

    H.B. Kekre; Sudeep Thepade; Karan Dhamejani; Sanchit Khandelwal; Adnan Azmi

    2012-01-01

    The paper presents a performance analysis of Multilevel Block Truncation Coding based Face Recognition among widely used color spaces. In [1], Multilevel Block Truncation Coding was applied on the RGB color space up to four levels for face recognition. Better results were obtained when the proposed technique was implemented using Kekre’s LUV (K’LUV) color space [25]. This was the motivation to test the proposed technique using assorted color spaces. For experimental analysis, two face databas...
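
    As background for the multilevel, multi-color-space variants compared in the paper, the sketch below applies single-level Block Truncation Coding to one grayscale block: threshold at the block mean, then reconstruct with two levels that preserve the block mean and standard deviation. The block size and toy data are illustrative, not the authors' exact parameters.

    ```python
    import numpy as np

    def btc_block(block):
        """Single-level Block Truncation Coding of one grayscale block:
        threshold at the block mean, then reconstruct with two levels that
        preserve the block's mean and standard deviation."""
        mean, std = block.mean(), block.std()
        bitmap = block >= mean                 # 1-bit plane kept per pixel
        q, m = int(bitmap.sum()), block.size
        if q in (0, m):                        # flat block: a single level suffices
            return np.full_like(block, mean, dtype=float), bitmap
        low = mean - std * np.sqrt(q / (m - q))
        high = mean + std * np.sqrt((m - q) / q)
        return np.where(bitmap, high, low), bitmap

    # Toy 4x4 block; in the face-recognition setting the per-channel levels
    # produced at each BTC level would be collected into the feature vector.
    rng = np.random.default_rng(0)
    block = rng.integers(0, 256, (4, 4)).astype(float)
    recon, bitmap = btc_block(block)
    print(np.round(recon, 1))
    print(bitmap.astype(int))
    ```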

  16. Performance analysis of CRF-based learning for processing WoT application requests expressed in natural language.

    Science.gov (United States)

    Yoon, Young

    2016-01-01

    In this paper, we investigate the effectiveness of a CRF-based learning method for identifying necessary Web of Things (WoT) application components that would satisfy the users' requests issued in natural language. For instance, a user request such as "archive all sports breaking news" can be satisfied by composing a WoT application that consists of ESPN breaking news service and Dropbox as a storage service. We built an engine that can identify the necessary application components by recognizing a main act (MA) or named entities (NEs) from a given request. We trained this engine with the descriptions of WoT applications (called recipes) that were collected from the IFTTT WoT platform. IFTTT hosts over 300 WoT entities that offer thousands of functions referred to as triggers and actions. There are more than 270,000 publicly-available recipes composed with those functions by real users. Therefore, the set of these recipes is well-qualified for the training of our MA and NE recognition engine. We share our unique experience of generating the training and test set from these recipe descriptions and assess the performance of the CRF-based learning method. Based on the performance evaluation, we introduce further research directions.
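
    As a minimal sketch of how a linear-chain CRF could be trained to tag main-act (MA) and named-entity (NE) tokens in request sentences, the code below uses the third-party sklearn-crfsuite package; the tiny training set, the label scheme, and the features are invented stand-ins for the IFTTT-derived corpus described in the paper, and the package choice itself is an assumption rather than the authors' implementation.

    ```python
    import sklearn_crfsuite

    # Invented toy examples standing in for IFTTT-recipe-derived training data.
    # Labels: MA = main act, NE = named entity (service/content), O = other.
    train = [
        (["archive", "all", "sports", "breaking", "news"], ["MA", "O", "NE", "NE", "NE"]),
        (["save", "new", "photos", "to", "Dropbox"],       ["MA", "O", "NE", "O", "NE"]),
        (["tweet", "my", "latest", "blog", "post"],        ["MA", "O", "O", "NE", "NE"]),
    ]

    def token_features(sent, i):
        """Simple per-token features: the word, its neighbours, and shape cues."""
        word = sent[i]
        return {
            "word.lower": word.lower(),
            "word.istitle": word.istitle(),
            "is_first": i == 0,
            "prev.lower": sent[i - 1].lower() if i > 0 else "<BOS>",
            "next.lower": sent[i + 1].lower() if i < len(sent) - 1 else "<EOS>",
        }

    X = [[token_features(s, i) for i in range(len(s))] for s, _ in train]
    y = [labels for _, labels in train]

    crf = sklearn_crfsuite.CRF(algorithm="lbfgs", c1=0.1, c2=0.1, max_iterations=50)
    crf.fit(X, y)

    # Tag a new request; predicted MA/NE spans would then be mapped to candidate
    # triggers, actions, and services when composing the WoT application.
    test = ["archive", "new", "photos"]
    print(crf.predict_single([token_features(test, i) for i in range(len(test))]))
    ```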

  17. High-performance cement-based grouts for use in a nuclear waste disposal facility

    International Nuclear Information System (INIS)

    Onofrei, M.; Gray, M.N.

    1992-12-01

    National and international agencies have identified cement-based materials as prime candidates for sealing vaults that would isolate nuclear fuel wastes from the biosphere. Insufficient information is currently available to allow a reasonable analysis of the long-term performance of these sealing materials in a vault. A combined laboratory and modelling research program was undertaken to provide the necessary information for a specially developed high-performance cement grout. The results indicate that acceptable performance is likely for at least thousands of years and probably for much longer periods. The materials, which have been proven to be effective in field applications, are shown to be virtually impermeable and highly leach resistant under vault conditions. Special plasticizing additives used in the material formulation enhance the physical characteristics of the grout without detriment to its chemical durability. Neither modelling nor laboratory testing have yet provided a definitive assessment of the grout's longevity. However, none of the results of these studies has contraindicated the use of high-performance cement-based grouts in vault sealing applications. (Author) (24 figs., 6 tabs., 21 refs.)

  18. Performance Analysis of 20MW gas turbine power plant by Energy and Exergy Methods

    International Nuclear Information System (INIS)

    Lebele-Alawa, B. T.; Asuo, J. M.

    2013-01-01

    Energy and exergy analyses were conducted to evaluate the optimal performance of a 20 MW gas turbine power plant. The energy analysis was based on the First Law of Thermodynamics, while the exergy method used both the First and Second Laws of Thermodynamics. The locations and magnitudes of the losses that inhibited the performance of the power plant were identified by system balance equations. The internal losses associated with each plant component were estimated so that improvements could be made to those components for maximum power output. The energy efficiency was 20.73 %, while the exergetic efficiency was 16.39 %; the exergy loss of 38.62 % in the combustor was the largest among the plant components. (au)

  19. Work-family conflict, emotional exhaustion and performance-based self-esteem: reciprocal relationships.

    Science.gov (United States)

    Richter, Anne; Schraml, Karin; Leineweber, Constanze

    2015-01-01

    The three constructs of work-family conflict, emotional exhaustion and performance-based self-esteem are all related to tremendous negative consequences for the individual, the organization as well as for society. Even though there are studies that connect two of those constructs, the prospective relations between all three of them have not been studied yet. We explored the prospective relations between the three constructs in a large Swedish data set representative of the Swedish workforce. Gender differences in the relations were investigated. Longitudinal data with a 2-year time lag were gathered from 3,387 working men and women who responded to the 2006 and 2008 waves of the Swedish Longitudinal Occupational Survey of Health. Four different cross-lagged models were analysed. In the best fitting model, higher levels of work-family conflict at time 1 were associated with an increased level of performance-based self-esteem at time 2, but not with emotional exhaustion, after controlling for having children, gender, education and age. Also, relationships between emotional exhaustion at time 1 and work-family conflict and performance-based self-esteem at time 2 could be established. Furthermore, relationships between performance-based self-esteem time 1 and work-family conflict and emotional exhaustion time 2 were found. Multiple-group analysis did not show any differences in the relations of the tested constructs over time for either men or women. We conclude that the three constructs are interrelated and best understood through a reciprocal model. No differences were found between men and women.

  20. A knowledge-based T2-statistic to perform pathway analysis for quantitative proteomic data.

    Science.gov (United States)

    Lai, En-Yu; Chen, Yi-Hau; Wu, Kun-Pin

    2017-06-01

    Approaches to identify significant pathways from high-throughput quantitative data have been developed in recent years. Still, the analysis of proteomic data remains difficult because of limited sample size. This limitation also leads to the common practice of using a competitive null, which fundamentally treats genes or proteins as independent units. The independence assumption ignores the associations among biomolecules with similar functions or cellular localization, as well as the interactions among them manifested as changes in expression ratios. Consequently, these methods often underestimate the associations among biomolecules and cause false positives in practice. Some studies incorporate the sample covariance matrix into the calculation to address this issue. However, the sample covariance may not be a precise estimate if the sample size is very limited, which is usually the case for data produced by mass spectrometry. In this study, we introduce a multivariate test under a self-contained null to perform pathway analysis for quantitative proteomic data. The covariance matrix used in the test statistic is constructed from the confidence scores retrieved from the STRING database or the HitPredict database. We also design an integrating procedure to retain pathways with sufficient evidence as a pathway group. The performance of the proposed T2-statistic is demonstrated using five published experimental datasets: the T-cell activation, the cAMP/PKA signaling, the myoblast differentiation, and the effect of dasatinib on the BCR-ABL pathway are proteomic datasets produced by mass spectrometry; and the protective effect of myocilin via the MAPK signaling pathway is a gene expression dataset of limited sample size. Compared with other popular statistics, the proposed T2-statistic yields more accurate descriptions in agreement with the discussion of the original publication. We implemented the T2-statistic in an R package, T2GA, which is available at https
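
    As a numerical sketch of the core idea (a Hotelling-style T2 test under a self-contained null with an externally supplied covariance), the code below scales a knowledge-based correlation matrix by the observed standard deviations and compares the statistic against a chi-square reference. The toy data, the way confidence scores become a correlation matrix, and the chi-square reference are assumptions for illustration, not the construction used in the published T2GA package.

    ```python
    import numpy as np
    from scipy import stats

    def knowledge_based_T2(logratios, conf):
        """Hotelling-style T2 for one pathway under a self-contained null (mean = 0).
        logratios : (n_samples, p) protein log-ratios for the pathway members.
        conf      : (p, p) symmetric interaction confidence scores in [0, 1],
                    used here (illustratively) as the correlation structure."""
        n, p = logratios.shape
        xbar = logratios.mean(axis=0)
        sd = logratios.std(axis=0, ddof=1)
        corr = np.array(conf, dtype=float)
        np.fill_diagonal(corr, 1.0)
        sigma = corr * np.outer(sd, sd)        # scale knowledge-based correlations
        t2 = n * xbar @ np.linalg.solve(sigma, xbar)
        # With a covariance taken as known (not estimated), T2 is referred to chi-square(p).
        return t2, stats.chi2.sf(t2, df=p)

    rng = np.random.default_rng(0)
    expr = rng.normal(0.4, 0.5, size=(4, 5))   # 4 replicates, 5 pathway proteins
    conf = np.full((5, 5), 0.6)                # toy STRING-like confidence scores
    print(knowledge_based_T2(expr, conf))
    ```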