WorldWideScience

Sample records for identify application performance

  1. Identifying performance gaps in hydrogen safety sensor technology for automotive and stationary applications

    International Nuclear Information System (INIS)

    Boon-Brett, L.; Bousek, J.; Black, G.; Moretto, P.; Castello, P.; Huebert, T.; Banach, U.

    2010-01-01

    A market survey has been performed of commercially available hydrogen safety sensors, resulting in a total sample size of 53 sensors from 21 manufacturers. The technical specifications, as provided by the manufacturers, have been collated and are displayed herein as a function of sensor working principle. These specifications comprise measuring range, response and recovery times, ambient temperature, pressure and relative humidity, power consumption and lifetime. They are then compared against known performance targets for both automotive and stationary applications in order to establish to what extent current technology satisfies the requirements of sensor end users. Gaps in the performance of hydrogen sensing technologies are thus identified and areas recommended for future research and development. (author)

  2. Identifying trace evidence in data wiping application software

    Directory of Open Access Journals (Sweden)

    Gregory H. Carlton

    2012-06-01

    One area of particular concern for computer forensics examiners involves situations in which someone utilized software applications to destroy evidence. There are products available in the marketplace that are relatively inexpensive and advertised as being able to destroy targeted portions of data stored within a computer system. This study was undertaken to identify these tools and analyze them to determine the extent to which each of the evaluated data wiping applications performs its tasks and to identify trace evidence, if any, left behind on disk media after executing these applications. We evaluated five Windows 7 compatible software products whose advertised features include the ability for users to wipe targeted files, folders, or evidence of selected activities. We conducted a series of experiments that involved executing each application on systems with identical data, and we then analyzed the results and compared the before and after images for each application. We identified information for each application that is beneficial to forensics examiners when faced with similar situations. This paper describes our application selection process, our application evaluation methodology, and our findings. Following this, we describe limitations of this study and suggest areas of additional research that will benefit the study of digital forensics.

  3. Risk and Performance Technologies: Identifying the Keys to Successful Implementation

    International Nuclear Information System (INIS)

    McClain, Lynn; Smith, Art; O'Regan, Patrick

    2002-01-01

    The nuclear power industry has been utilizing risk and performance based technologies for over thirty years. Applications of these technologies have included risk assessment (e.g. Individual Plant Examinations), burden reduction (e.g. Risk-Informed Inservice Inspection, RI-ISI) and risk management (Maintenance Rule, 10CFR50.65). Over the last five to ten years the number of risk-informed (RI) burden reduction initiatives has increased. Unfortunately, the efficiencies of some of these applications have been questionable. This paper investigates those attributes necessary to support successful, cost-effective RI-applications. The premise of this paper is that by understanding the key attributes that support one successful application, insights can be gleaned that will streamline and coordinate future RI-applications. This paper is an extension of a paper presented at the Pressure Vessel and Piping (PVP-2001) Conference. In that paper, a number of issues and opportunities were identified that needed to be assessed in order to support future (and efficient) RI-applications. It was noted in the paper that a proper understanding and resolution of these issues will facilitate implementation of risk and performance technology in the operation, maintenance and design disciplines. In addition, it will provide the foundation necessary to support regulatory review and approval. (authors)

  4. Identifying and weighting of key performance indicators of knowledge management 2.0 in organizations

    Directory of Open Access Journals (Sweden)

    Saeed Khalilazar

    2016-03-01

    The main purpose of this research is identifying and weighting key performance indicators (KPIs) of knowledge management 2.0 (KM2.0) in organizations. Given the widespread permeation of technology, especially social media, into different organizational dimensions, and the functional view of this phenomenon in knowledge management, measuring the performance of this kind of media against organizational goals seems necessary. The KM2.0 key performance indicators in this article have been identified and weighted through the Delphi methodology, via a questionnaire administered in three rounds. The KPIs identified and weighted here are applicable in organizations that are eager to implement a KM2.0 initiative and want to measure the performance of their KM2.0 activities, so this research is applicable in a goal-oriented approach. According to the results, the KM2.0 participation process consists of 3 stages and 8 steps: the first stage, presence, consists of 3 steps (registration, visit and download); the second stage, feedback, consists of 3 steps (conversation, applause and amplification); and the third stage, creation, consists of 2 steps (codification and personalization). The ultimate contribution of this research is identifying and weighting the KPIs of KM2.0 within a conceptual framework of KM2.0. By developing a conceptual framework and participation process for KM2.0 and listing the related KPIs as an applicable solution for measuring and improving the performance of organizational social media, this research offers an innovation not found in related articles.

  5. Non-identifier based adaptive control in mechatronics theory and application

    CERN Document Server

    Hackl, Christoph M

    2017-01-01

    This book introduces non-identifier-based adaptive control (with and without internal model) and its application to the current, speed and position control of mechatronic systems such as electrical synchronous machines, wind turbine systems, industrial servo systems, and rigid-link, revolute-joint robots. In mechatronics, there is often only rough knowledge of the system. Due to parameter uncertainties, nonlinearities and unknown disturbances, model-based control strategies can reach their performance or stability limits without iterative controller design and performance evaluation, or system identification and parameter estimation. The non-identifier-based adaptive control presented is an alternative that neither identifies the system nor estimates its parameters but ensures stability. The adaptive controllers are easy to implement, compensate for disturbances and are inherently robust to parameter uncertainties and nonlinearities. For controller implementation only structural system knowledge (like relativ...

  6. On the importance of identifying, characterizing, and predicting fundamental phenomena towards microbial electrochemistry applications.

    Science.gov (United States)

    Torres, César Iván

    2014-06-01

    The development of microbial electrochemistry research toward technological applications has increased significantly in the past years, leading to many process configurations. This short review focuses on the need to identify and characterize the fundamental phenomena that control the performance of microbial electrochemical cells (MXCs). Specifically, it discusses the importance of recent efforts to discover and characterize novel microorganisms for MXC applications, as well as recent developments to understand transport limitations in MXCs. As we increase our understanding of how MXCs operate, it is imperative to continue modeling efforts in order to effectively predict their performance, design efficient MXC technologies, and implement them commercially. Thus, the success of MXC technologies largely depends on the path of identifying, understanding, and predicting fundamental phenomena that determine MXC performance. Copyright © 2013 Elsevier Ltd. All rights reserved.

  7. Mobile Application to Identify Indonesian Flowers on Android Platform

    Directory of Open Access Journals (Sweden)

    Tita Karlita

    2013-12-01

    Although many people love flowers, they do not know their names, and many people do not recognize local flowers in particular. To find a flower image, we can use a search engine such as Google, but it does not give much help in finding the name of a local flower; sometimes Google cannot show the correct name of local flowers. This study proposes an application to identify Indonesian flowers that runs on the Android platform for easy use anywhere. Flower recognition is based on color features using the Hue-Index, a shape feature using Centroid Contour Distance (CCD), and similarity measurement using Entropy calculations. The outputs of this application are information about the inputted flower image, including Latin name, local name, description, distribution and ecology. Based on tests performed on 44 types of flowers with 181 images in the database, the best similarity percentage is 97.72%. With this application, people are expected to know more about Indonesian flowers. Keywords: Indonesian flowers, android, hue-index, CCD, entropy
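
    As a concrete illustration of the pipeline above, the sketch below computes a hue histogram, a Centroid Contour Distance (CCD) signature, and a KL-style "entropy" divergence for matching. This is a minimal sketch under our own assumptions: the function names, bin counts and sample sizes are illustrative, not the authors' implementation.

    import numpy as np

    def hue_histogram(hsv_pixels, bins=36):
        """Normalized histogram of the hue channel (hue assumed in [0, 360))."""
        hist, _ = np.histogram(hsv_pixels[:, 0], bins=bins, range=(0.0, 360.0))
        return hist / max(hist.sum(), 1)

    def centroid_contour_distance(contour, samples=64):
        """CCD signature: distances from the centroid to points sampled
        evenly along the contour, normalized for scale invariance."""
        centroid = contour.mean(axis=0)
        idx = np.linspace(0, len(contour) - 1, samples).astype(int)
        d = np.linalg.norm(contour[idx] - centroid, axis=1)
        return d / max(d.max(), 1e-9)

    def entropy_divergence(p, q, eps=1e-9):
        """KL-style divergence, one plausible reading of the paper's
        entropy-based similarity measurement (an assumption on our part)."""
        p, q = np.asarray(p, float) + eps, np.asarray(q, float) + eps
        return float(np.sum(p * np.log(p / q)))

    # Toy usage: a circular contour gives a flat CCD signature.
    contour = np.array([[np.cos(t), np.sin(t)] for t in np.linspace(0, 2 * np.pi, 100)])
    print(centroid_contour_distance(contour)[:5])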

  8. Identifying High Performance ERP Projects

    OpenAIRE

    Stensrud, Erik; Myrtveit, Ingunn

    2002-01-01

    Learning from high performance projects is crucial for software process improvement. Therefore, we need to identify outstanding projects that may serve as role models. It is common to measure productivity as an indicator of performance. It is vital that productivity measurements deal correctly with variable returns to scale and multivariate data. Software projects generally exhibit variable returns to scale, and the output from ERP projects is multivariate. We propose to use Data Envelopment ...

  9. Identifying Importance-Performance Matrix Analysis (IPMA) of ...

    African Journals Online (AJOL)

    Identifying Importance-Performance Matrix Analysis (IPMA) of intellectual capital and Islamic work ethics in Malaysian SMEs. … capital and Islamic work ethics significantly influenced business performance. …

  10. Measuring individual work performance: identifying and selecting indicators.

    Science.gov (United States)

    Koopmans, Linda; Bernaards, Claire M; Hildebrandt, Vincent H; de Vet, Henrica C W; van der Beek, Allard J

    2014-01-01

    Theoretically, individual work performance (IWP) can be divided into four dimensions: task performance, contextual performance, adaptive performance, and counterproductive work behavior. However, there is no consensus on the indicators used to measure these dimensions. This study was designed to (1) identify indicators for each dimension, (2) select the most relevant indicators, and (3) determine the relative weight of each dimension in ratings of work performance. IWP indicators were identified from multiple research disciplines, via literature, existing questionnaires, and expert interviews. Subsequently, experts selected the most relevant indicators per dimension and scored the relative weight of each dimension in ratings of IWP. In total, 128 unique indicators were identified. Twenty-three of these indicators were selected by experts as most relevant for measuring IWP. Task performance determined 36% of the work performance rating, while the other three dimensions respectively determined 22%, 20% and 21% of the rating. Notable consensus was found on relevant indicators of IWP, reducing the number from 128 to 23 relevant indicators. This provides an important step towards the development of a standardized, generic and short measurement instrument for assessing IWP.
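
    As a worked example of how the reported dimension weights (36%, 22%, 20% and 21%; the sum of 99% reflects rounding) would combine per-dimension scores into one IWP rating, consider the short sketch below. The 0-10 scores are invented inputs; only the weights come from the study.

    # Hypothetical per-dimension scores (0-10); weights from the abstract above.
    weights = {"task": 0.36, "contextual": 0.22, "adaptive": 0.20,
               "counterproductive": 0.21}
    scores = {"task": 8.0, "contextual": 6.5, "adaptive": 7.0,
              "counterproductive": 9.0}  # higher = less counterproductive behavior
    iwp = sum(weights[d] * scores[d] for d in weights) / sum(weights.values())
    print(round(iwp, 2))  # weighted overall rating, here 7.68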

  11. Identifying and Ranking the Determinants of Tourism Performance

    DEFF Research Database (Denmark)

    Assaf, A.George; Josiassen, Alexander

    2012-01-01

    … their tourism industries, and tourism businesses seek to improve the performance of the tourism industry and its constituents by vigorously promoting themselves to international tourists, cutting costs, and identifying synergies in their tourism endeavors. In seeking to improve the tourism industry … the determinants that affect tourism performance are of key interest to the stakeholders. A key obstacle toward improving performance is the multitude of determinants that can affect tourism performance. The literature has yet to provide concrete insights into the determinants of tourism performance … and their relative importance. The present study addresses this important gap. We identify and rank the determinants of tourism performance. We also provide performance measures of international tourism destinations. The results are derived using Data Envelopment Analysis (DEA) and bootstrap truncated regression …

  12. How to identify, assess and utilise mobile medical applications in clinical practice.

    Science.gov (United States)

    Aungst, T D; Clauson, K A; Misra, S; Lewis, T L; Husain, I

    2014-02-01

    There are thousands of medical applications for mobile devices targeting use by healthcare professionals. However, several factors related to the structure of the existing market for medical applications create significant barriers preventing practitioners from effectively identifying mobile medical applications for individual professional use. To define existing market factors relevant to selection of medical applications and describe a framework to empower clinicians to identify, assess and utilise mobile medical applications in their own practice. Resources available on the Internet regarding mobile medical applications, guidelines and published research on mobile medical applications. Mobile application stores (e.g. iTunes, Google Play) are not effective means of identifying mobile medical applications. Users of mobile devices that desire to implement mobile medical applications into practice need to carefully assess individual applications prior to utilisation. Searching and identifying mobile medical applications requires clinicians to utilise multiple references to determine what application is best for their individual practice methods. This can be done with a cursory exploration of mobile application stores and then moving onto other available resources published in the literature or through Internet resources (e.g. blogs, medical websites, social media). Clinicians must also take steps to ensure that an identified mobile application can be integrated into practice after carefully reviewing it themselves. Clinicians seeking to identify mobile medical application for use in their individual practice should use a combination of app stores, published literature, web-based resources, and personal review to ensure safe and appropriate use. © 2014 John Wiley & Sons Ltd.

  13. Predictive Performance Tuning of OpenACC Accelerated Applications

    KAUST Repository

    Siddiqui, Shahzeb

    2014-05-04

    Graphics Processing Units (GPUs) are gradually becoming mainstream in supercomputing as their capabilities to significantly accelerate a large spectrum of scientific applications have been clearly identified and proven. Moreover, with the introduction of high-level programming models such as OpenACC [1] and OpenMP 4.0 [2], these devices are becoming more accessible and practical to use by a larger scientific community. However, performance optimization of OpenACC accelerated applications usually requires in-depth knowledge of the hardware and software specifications. We suggest a prediction-based performance tuning mechanism [3] to quickly tune OpenACC parameters for a given application and to dynamically adapt to the execution environment on a given system. This approach is applied to a finite difference kernel to tune the OpenACC gang and vector clauses for mapping the compute kernels onto the underlying accelerator architecture. Our experiments show a significant performance improvement over the default compiler parameters and tuning that is faster by an order of magnitude compared to brute-force search tuning.

  14. Improving applicant selection: identifying qualities of the unsuccessful otolaryngology resident.

    Science.gov (United States)

    Badran, Karam W; Kelley, Kanwar; Conderman, Christian; Mahboubi, Hossein; Armstrong, William B; Bhandarkar, Naveen D

    2015-04-01

    To identify the prevalence and management of problematic residents, and to identify the factors associated with successful remediation of unsuccessful otolaryngology residents. Self-reported Internet and paper-based survey. An anonymous survey was distributed to 152 current and former program directors (PDs) in 2012. The factors associated with unsuccessful otolaryngology residents and those associated with the successful remediation of problematic residents were investigated. An unsuccessful resident is defined as one who quit or was removed from the program for any reason, or one whose actions resulted in criminal action or citation against their medical license after graduation from residency. Remediation is defined as an individualized program implemented to correct documented weaknesses. The overall response rate was 26% (40 PDs). Seventy-three unsuccessful or problematic residents were identified. Sixty-six problematic or unsuccessful residents were identified during residency, with 58 of 66 (88%) undergoing remediation. Thirty-one (47%) residents did not graduate. The most commonly identified factors of an unsuccessful resident were: change in specialty (21.5%), interpersonal and communication skills with health professionals (13.9%), and clinical judgment (10.1%). Characteristics of those residents who underwent successful remediation include poor performance on the in-training examination (17%, P < …) … otolaryngology PDs in this sample identified at least one unsuccessful resident. Improved methods of applicant screening may assist in optimizing otolaryngology resident selection. © 2014 The American Laryngological, Rhinological and Otological Society, Inc.

  15. Family and academic performance: identifying high school student profiles

    Directory of Open Access Journals (Sweden)

    Alicia Aleli Chaparro Caso López

    2016-01-01

    The objective of this study was to identify profiles of high school students, based on variables related to academic performance, socioeconomic status, cultural capital and family organization. A total of 21,724 high school students, from the five municipalities of the state of Baja California, took part. A K-means cluster analysis was performed to identify the profiles. The analyses identified two clearly-defined clusters: Cluster 1 grouped together students with high academic performance and who achieved higher scores for socioeconomic status, cultural capital and family involvement, whereas Cluster 2 brought together students with low academic achievement, and who also obtained lower scores for socioeconomic status and cultural capital, and had less family involvement. It is concluded that the family variables analyzed form student profiles that can be related to academic achievement.
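
    A minimal sketch of this kind of profile analysis follows: standardize the variables and fit K-means with two clusters. The column semantics and synthetic data are our assumptions; the study used 21,724 real student records.

    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    # Columns: [academic performance, socioeconomic status, cultural capital, family involvement]
    X = rng.normal(size=(500, 4))
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(
        StandardScaler().fit_transform(X))
    print(np.bincount(labels))  # size of each student profile cluster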

  16. Generating Performance Models for Irregular Applications

    Energy Technology Data Exchange (ETDEWEB)

    Friese, Ryan D.; Tallent, Nathan R.; Vishnu, Abhinav; Kerbyson, Darren J.; Hoisie, Adolfy

    2017-05-30

    Many applications have irregular behavior --- non-uniform input data, input-dependent solvers, irregular memory accesses, unbiased branches --- that cannot be captured using today's automated performance modeling techniques. We describe new hierarchical critical path analyses for the Palm model generation tool. To create a model's structure, we capture tasks along representative MPI critical paths. We create a histogram of critical tasks with parameterized task arguments and instance counts. To model each task, we identify hot instruction-level sub-paths and model each sub-path based on data flow, instruction scheduling, and data locality. We describe application models that generate accurate predictions for strong scaling when varying CPU speed, cache speed, memory speed, and architecture. We present results for the Sweep3D neutron transport benchmark; Page Rank on multiple graphs; Support Vector Machine with pruning; and PFLOTRAN's reactive flow/transport solver with domain-induced load imbalance.

  17. An Application Of Receptor Modeling To Identify Airborne Particulate ...

    African Journals Online (AJOL)

    An Application Of Receptor Modeling To Identify Airborne Particulate Sources In Lagos, Nigeria. FS Olise, OK Owoade, HB Olaniyi. Abstract. There have been no clear demarcations between industrial and residential areas of Lagos, with focus on industry as the major source. There is a need to identify potential source types in ...

  18. Expert system applications to nuclear power for enhancement of productivity and performance

    International Nuclear Information System (INIS)

    Naser, J.A.; Cain, D.G.; Sun, B.K.H.; Colley, R.W.; Hirota, N.S.; Gelhaus, F.E.

    1989-01-01

    Expert system technology has matured enough to offer a great deal of promise for a number of application areas in the electric utility industry. These applications can enhance productivity and aid in decision-making. Two parallel efforts are being performed at the Electric Power Research Institute (EPRI) to help the electric utility industry take advantage of expert system technology. The first effort is the development of expert system building tools which are tailored to electric utility industry applications. The second effort is the development of expert system applications. These two efforts complement each other. The application development tests the tools and identifies additional tool capabilities which are required. The tool development helps define the applications which can be successfully developed. The purpose of this paper is to describe some of the tool and application development work which is being performed at EPRI for the electric utility industry. (orig.)

  19. Heat pumps for geothermal applications: availability and performance. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Reistad, G.M.; Means, P.

    1980-05-01

    A study of the performance and availability of water-source heat pumps was carried out. The primary purposes were to obtain the necessary basic information required for proper evaluation of the role of water-source heat pumps in geothermal energy utilization and/or to identify the research needed to provide this information. The Search of Relevant Literature considers the historical background, applications, achieved and projected performance evaluations and performance improvement techniques. The commercial water-source heat pump industry is considered in regard to both the present and projected availability and performance of units. Performance evaluations are made for units that use standard components but are redesigned for use in geothermal heating.

  20. Identifying Architectural Technical Debt in Android Applications through Compliance Checking

    NARCIS (Netherlands)

    Verdecchia, R.

    By considering the fast pace at which mobile applications need to evolve, Architectural Technical Debt results to be a crucial yet implicit factor of success. In this research we present an approach to automatically identify Architectural Technical Debt in Android applications. The approach takes

  1. Performance Issues in High Performance Fortran Implementations of Sensor-Based Applications

    Directory of Open Access Journals (Sweden)

    David R. O'Hallaron

    1997-01-01

    Applications that get their inputs from sensors are an important and often overlooked application domain for High Performance Fortran (HPF). Such sensor-based applications typically perform regular operations on dense arrays, and often have latency and throughput requirements that can only be achieved with parallel machines. This article describes a study of sensor-based applications, including the fast Fourier transform, synthetic aperture radar imaging, narrowband tracking radar processing, multibaseline stereo imaging, and medical magnetic resonance imaging. The applications are written in a dialect of HPF developed at Carnegie Mellon, and are compiled by the Fx compiler for the Intel Paragon. The main results of the study are that (1) it is possible to realize good performance for realistic sensor-based applications written in HPF and (2) the performance of the applications is determined by the performance of three core operations: independent loops (i.e., loops with no dependences between iterations), reductions, and index permutations. The article discusses the implications for HPF implementations and introduces some simple tests that implementers and users can use to measure the efficiency of the loops, reductions, and index permutations generated by an HPF compiler.

  2. Nuclide identifier and grat data reader application for ORIGEN output file

    International Nuclear Information System (INIS)

    Arif Isnaeni

    2011-01-01

    ORIGEN is a one-group depletion and radioactive decay computer code developed at the Oak Ridge National Laboratory (ORNL). ORIGEN performs one-group neutronics calculations providing various nuclear material characteristics (the buildup, decay and processing of radioactive materials). ORIGEN output is a text-based file that contains only numbers in the form of grouped data per nuclide: nuclide identifier and grat. This application was created to facilitate collection of the nuclide identifier and grat data; it also has a function to acquire mass number data and calculate the mass (grams) of each nuclide. Output from this application can be used as data input for neutronics computer codes such as MCNP. (author)

  3. Performance Evaluation Model for Application Layer Firewalls.

    Science.gov (United States)

    Xuan, Shichang; Yang, Wu; Dong, Hui; Zhang, Jiangchuan

    2016-01-01

    Application layer firewalls protect the trusted area network against information security risks. However, firewall performance may affect user experience. Therefore, performance analysis plays a significant role in the evaluation of application layer firewalls. This paper presents an analytic model of the application layer firewall, based on a system analysis to evaluate the capability of the firewall. In order to enable users to improve the performance of the application layer firewall with limited resources, resource allocation was evaluated to obtain the optimal resource allocation scheme in terms of throughput, delay, and packet loss rate. The proposed model employs the Erlangian queuing model to analyze the performance parameters of the system with regard to the three layers (network, transport, and application layers). Then, the analysis results of all the layers are combined to obtain the overall system performance indicators. A discrete event simulation method was used to evaluate the proposed model. Finally, limited service desk resources were allocated to obtain the values of the performance indicators under different resource allocation scenarios in order to determine the optimal allocation scheme. Under limited resource allocation, this scheme enables users to maximize the performance of the application layer firewall.
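
    A minimal sketch of the per-layer calculation such a queuing model performs is shown below: the Erlang-C waiting probability and mean time in system for an M/M/c service desk at each layer. The desk counts and arrival/service rates are invented for illustration and do not reproduce the paper's parameterization.

    from math import factorial

    def erlang_c(c, lam, mu):
        """P(wait) for an M/M/c queue: arrival rate lam, service rate mu per desk."""
        a = lam / mu                     # offered load
        rho = a / c                      # utilization (must be < 1 for stability)
        s = sum(a**k / factorial(k) for k in range(c))
        top = a**c / (factorial(c) * (1 - rho))
        return top / (s + top)

    def mean_time_in_system(c, lam, mu):
        """Expected queueing wait plus service time."""
        return erlang_c(c, lam, mu) / (c * mu - lam) + 1 / mu

    for layer, (c, lam, mu) in {"network": (4, 300, 100),
                                "transport": (3, 200, 90),
                                "application": (2, 80, 50)}.items():
        print(layer, round(mean_time_in_system(c, lam, mu), 4))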

  4. Performance Evaluation Model for Application Layer Firewalls.

    Directory of Open Access Journals (Sweden)

    Shichang Xuan

    Application layer firewalls protect the trusted area network against information security risks. However, firewall performance may affect user experience. Therefore, performance analysis plays a significant role in the evaluation of application layer firewalls. This paper presents an analytic model of the application layer firewall, based on a system analysis to evaluate the capability of the firewall. In order to enable users to improve the performance of the application layer firewall with limited resources, resource allocation was evaluated to obtain the optimal resource allocation scheme in terms of throughput, delay, and packet loss rate. The proposed model employs the Erlangian queuing model to analyze the performance parameters of the system with regard to the three layers (network, transport, and application layers). Then, the analysis results of all the layers are combined to obtain the overall system performance indicators. A discrete event simulation method was used to evaluate the proposed model. Finally, limited service desk resources were allocated to obtain the values of the performance indicators under different resource allocation scenarios in order to determine the optimal allocation scheme. Under limited resource allocation, this scheme enables users to maximize the performance of the application layer firewall.

  5. I/O Performance Characterization of Lustre and NASA Applications on Pleiades

    Science.gov (United States)

    Saini, Subhash; Rappleye, Jason; Chang, Johnny; Barker, David Peter; Biswas, Rupak; Mehrotra, Piyush

    2012-01-01

    In this paper we study the performance of the Lustre file system using five scientific and engineering applications representative of NASA workloads on large-scale supercomputing systems such as NASA's Pleiades. In order to facilitate the collection of Lustre performance metrics, we have developed a software tool that exports a wide variety of client- and server-side metrics using SGI's Performance Co-Pilot (PCP), and generates a human-readable report on key metrics at the end of a batch job. These performance metrics are (a) amount of data read and written, (b) number of files opened and closed, and (c) remote procedure call (RPC) size distribution (4 KB to 1024 KB, in powers of 2) for I/O operations. RPC size distribution measures the efficiency of the Lustre client and can pinpoint problems such as small write sizes, disk fragmentation, etc. These extracted statistics are useful in determining the I/O pattern of the application and can assist in identifying possible improvements for users' applications. Information on the number of file operations enables a scientist to optimize the I/O performance of their applications. The amount of I/O data helps users choose the optimal stripe size and stripe count to enhance I/O performance. In this paper, we demonstrate the usefulness of this tool on Pleiades for five production-quality NASA scientific and engineering applications. We compare the latency of read and write operations under Lustre to that with NFS by tracing system calls and signals. We also investigate the read and write policies and study the effect of page cache size on I/O operations. We examine the performance impact of Lustre stripe size and stripe count along with performance evaluation of file-per-process and a single shared file accessed by all the processes for NASA workloads using the parameterized IOR benchmark.

  6. Integrated multi sensors and camera video sequence application for performance monitoring in archery

    Science.gov (United States)

    Taha, Zahari; Arif Mat-Jizat, Jessnor; Amirul Abdullah, Muhammad; Muazu Musa, Rabiu; Razali Abdullah, Mohamad; Fauzi Ibrahim, Mohamad; Hanafiah Shaharudin, Mohd Ali

    2018-03-01

    This paper explains the development of a comprehensive archery performance monitoring software consisting of three camera views and five body sensors. The five body sensors evaluate biomechanics-related variables: flexor and extensor muscle activity, heart rate, postural sway and bow movement during archery performance. The three camera views and the five body sensors are integrated into a single computer application which enables the user to view all the data in a single user interface. The five body sensors' data are displayed in numerical and graphical form in real time. The information transmitted by the body sensors is computed with an embedded algorithm that automatically produces a summary of the athlete's biomechanical performance and displays it in the application interface. This performance is later compared to the pre-computed psycho-fitness performance from the data prefilled into the application. All the data (camera views, body sensors, performance computations) are recorded for further analysis by a sports scientist. Our developed application serves as a powerful tool for assisting the coach and athletes to observe and identify any wrong technique employed during training, which gives room for correction and re-evaluation to improve overall performance in the sport of archery.

  7. Performance Analysis Tool for HPC and Big Data Applications on Scientific Clusters

    Energy Technology Data Exchange (ETDEWEB)

    Yoo, Wucherl [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Koo, Michelle [Univ. of California, Berkeley, CA (United States); Cao, Yu [California Inst. of Technology (CalTech), Pasadena, CA (United States); Sim, Alex [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Nugent, Peter [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Univ. of California, Berkeley, CA (United States); Wu, Kesheng [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2016-09-17

    Big data is prevalent in HPC computing. Many HPC projects rely on complex workflows to analyze terabytes or petabytes of data. These workflows often require running over thousands of CPU cores and performing simultaneous data accesses, data movements, and computation. It is challenging to analyze the performance involving terabytes or petabytes of workflow data or measurement data of the executions, from complex workflows over a large number of nodes and multiple parallel task executions. To help identify performance bottlenecks or debug the performance issues in large-scale scientific applications and scientific clusters, we have developed a performance analysis framework, using state-of-the-art open-source big data processing tools. Our tool can ingest system logs and application performance measurements to extract key performance features, and apply the most sophisticated statistical tools and data mining methods on the performance data. It utilizes an efficient data processing engine to allow users to interactively analyze a large amount of different types of logs and measurements. To illustrate the functionality of the big data analysis framework, we conduct case studies on the workflows from an astronomy project known as the Palomar Transient Factory (PTF) and the job logs from the genome analysis scientific cluster. Our study processed many terabytes of system logs and application performance measurements collected on the HPC systems at NERSC. The implementation of our tool is generic enough to be used for analyzing the performance of other HPC systems and big data workflows.

  8. Identifying the connective strength between model parameters and performance criteria

    Directory of Open Access Journals (Sweden)

    B. Guse

    2017-11-01

    In hydrological models, parameters are used to represent the time-invariant characteristics of catchments and to capture different aspects of hydrological response. Hence, model parameters need to be identified based on their role in controlling the hydrological behaviour. For the identification of meaningful parameter values, multiple and complementary performance criteria are used that compare modelled and measured discharge time series. The reliability of the identification of hydrologically meaningful model parameter values depends on how distinctly a model parameter can be assigned to one of the performance criteria. To investigate this, we introduce the new concept of connective strength between model parameters and performance criteria. The connective strength assesses the intensity in the interrelationship between model parameters and performance criteria in a bijective way. In our analysis of connective strength, model simulations are carried out based on a Latin hypercube sampling. Ten performance criteria, including Nash–Sutcliffe efficiency (NSE), Kling–Gupta efficiency (KGE) and its three components (alpha, beta and r), as well as RSR (the ratio of the root mean square error to the standard deviation) for different segments of the flow duration curve (FDC), are calculated. With a joint analysis of two regression tree (RT) approaches, we derive how a model parameter is connected to different performance criteria. At first, RTs are constructed using each performance criterion as the target variable to detect the most relevant model parameters for each performance criterion. Secondly, RTs are constructed using each parameter as the target variable to detect which performance criteria are impacted by changes in the values of one distinct model parameter. Based on this, appropriate performance criteria are identified for each model parameter. In this study, a high bijective connective strength between model parameters and performance criteria …
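
    The criteria named above can be computed directly from paired observed and simulated discharge series; the compact sketch below does so for NSE and for KGE with its r, alpha and beta components, using short synthetic series in place of real discharge data.

    import numpy as np

    def nse(obs, sim):
        """Nash-Sutcliffe efficiency."""
        return 1 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

    def kge(obs, sim):
        """Kling-Gupta efficiency and its three components."""
        r = np.corrcoef(obs, sim)[0, 1]   # correlation component
        alpha = sim.std() / obs.std()     # variability component
        beta = sim.mean() / obs.mean()    # bias component
        return 1 - np.sqrt((r - 1)**2 + (alpha - 1)**2 + (beta - 1)**2), r, alpha, beta

    obs = np.array([1.0, 2.0, 4.0, 3.0, 2.5, 1.5])
    sim = np.array([1.2, 1.8, 3.6, 3.1, 2.2, 1.4])
    print(nse(obs, sim), kge(obs, sim))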

  9. Computer performance optimization systems, applications, processes

    CERN Document Server

    Osterhage, Wolfgang W

    2013-01-01

    Computing power performance was important at times when hardware was still expensive, because hardware had to be put to the best use. Later on this criterion was no longer critical, since hardware had become inexpensive. Meanwhile, however, people have realized that performance again plays a significant role, because of the major drain on system resources involved in developing complex applications. This book distinguishes between three levels of performance optimization: the system level, application level and business processes level. On each, optimizations can be achieved and cost-cutting p

  10. Identifying influential factors of business process performance using dependency analysis

    Science.gov (United States)

    Wetzstein, Branimir; Leitner, Philipp; Rosenberg, Florian; Dustdar, Schahram; Leymann, Frank

    2011-02-01

    We present a comprehensive framework for identifying influential factors of business process performance. In particular, our approach combines monitoring of process events and Quality of Service (QoS) measurements with dependency analysis to effectively identify influential factors. The framework uses data mining techniques to construct tree structures to represent dependencies of a key performance indicator (KPI) on process and QoS metrics. These dependency trees allow business analysts to determine how process KPIs depend on lower-level process metrics and QoS characteristics of the IT infrastructure. The structure of the dependencies enables a drill-down analysis of single factors of influence to gain a deeper knowledge why certain KPI targets are not met.
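
    The first of the two steps described above can be sketched in a few lines: fit a regression tree with the KPI as the target and the process/QoS metrics as features, then inspect which metrics the tree splits on. The feature semantics and data here are hypothetical, not taken from the paper.

    import numpy as np
    from sklearn.tree import DecisionTreeRegressor

    rng = np.random.default_rng(1)
    # Columns: [service latency, queue time, error rate] -- assumed QoS metrics
    qos = rng.uniform(size=(300, 3))
    kpi = 2.0 * qos[:, 0] + 0.5 * qos[:, 1] + rng.normal(0, 0.05, 300)

    tree = DecisionTreeRegressor(max_depth=3).fit(qos, kpi)
    print(tree.feature_importances_)  # relative influence of each metric on the KPI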

  11. Performance-Oriented packaging: A guide to identifying, procuring, and using

    International Nuclear Information System (INIS)

    O'Brien, J.H.

    1992-09-01

    This document guides users through the process of correctly identifying, obtaining, and using performance-oriented packaging. Almost all hazardous material shipments can be made in commercially available performance-oriented packaging. To cover the remaining shipments requiring specially designed packaging, a design guide is being developed. The design guide is scheduled to be issued 1 year after this procurement guide

  12. Diagnostic performance of BMI percentiles to identify adolescents with metabolic syndrome.

    Science.gov (United States)

    Laurson, Kelly R; Welk, Gregory J; Eisenmann, Joey C

    2014-02-01

    To compare the diagnostic performance of the Centers for Disease Control and Prevention (CDC) and FITNESSGRAM (FGram) BMI standards for quantifying metabolic risk in youth. Adolescents in the NHANES (n = 3385) were measured for anthropometric variables and metabolic risk factors. BMI percentiles were calculated, and youth were categorized by weight status (using CDC and FGram thresholds). Participants were also categorized by presence or absence of metabolic syndrome. The CDC and FGram standards were compared by prevalence of metabolic abnormalities, various diagnostic criteria, and odds of metabolic syndrome. Receiver operating characteristic curves were also created to identify optimal BMI percentiles to detect metabolic syndrome. The prevalence of metabolic syndrome in obese youth was 19% to 35%, compared with <2% in the normal-weight groups. The odds of metabolic syndrome for obese boys and girls were 46 to 67 and 19 to 22 times greater, respectively, than for normal-weight youth. The receiver operating characteristic analyses identified optimal thresholds similar to the CDC standards for boys and the FGram standards for girls. Overall, BMI thresholds were more strongly associated with metabolic syndrome in boys than in girls. Both the CDC and FGram standards are predictive of metabolic syndrome. The diagnostic utility of the CDC thresholds outperformed the FGram values for boys, whereas FGram standards were slightly better thresholds for girls. The use of a common set of thresholds for school and clinical applications would provide advantages for public health and clinical research and practice.
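
    One standard way to extract an "optimal threshold" from such an ROC analysis is Youden's J statistic (sensitivity + specificity - 1); the sketch below applies it to synthetic BMI-percentile data, which stands in for the NHANES sample and is not the study's data.

    import numpy as np
    from sklearn.metrics import roc_curve

    rng = np.random.default_rng(2)
    # Synthetic BMI percentiles: 900 youth without and 100 with metabolic syndrome.
    bmi_pct = np.concatenate([rng.normal(70, 15, 900),
                              rng.normal(93, 5, 100)]).clip(1, 99)
    met_syn = np.r_[np.zeros(900), np.ones(100)]

    fpr, tpr, thresholds = roc_curve(met_syn, bmi_pct)
    best = thresholds[np.argmax(tpr - fpr)]  # threshold maximizing Youden's J
    print(round(float(best), 1))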

  13. Performance of the lot quality assurance sampling method compared to surveillance for identifying inadequately-performing areas in Matlab, Bangladesh.

    Science.gov (United States)

    Bhuiya, Abbas; Hanifi, S M A; Roy, Nikhil; Streatfield, P Kim

    2007-03-01

    This paper compared the performance of the lot quality assurance sampling (LQAS) method in identifying inadequately-performing health work-areas with that of using health and demographic surveillance system (HDSS) data and examined the feasibility of applying the method by field-level programme supervisors. The study was carried out in Matlab, the field site of ICDDR,B, where a HDSS has been in place for over 30 years. The LQAS method was applied in 57 work-areas of community health workers in ICDDR,B-served areas in Matlab during July-September 2002. The performance of the LQAS method in identifying work-areas with adequate and inadequate coverage of various health services was compared with those of the HDSS. The health service-coverage indicators included coverage of DPT, measles, BCG vaccination, and contraceptive use. It was observed that the difference in the proportion of work-areas identified to be inadequately performing using the LQAS method with less than 30 respondents, and the HDSS was not statistically significant. The consistency between the LQAS method and the HDSS in identifying work-areas was greater for adequately-performing areas than inadequately-performing areas. It was also observed that the field managers could be trained to apply the LQAS method in monitoring their performance in reaching the target population.
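
    The core of an LQAS decision rule is small enough to state in code: sample n individuals per work-area and classify the area as inadequately performing if fewer than d are covered. The n = 19, d = 13 pair below is a classic LQAS choice used only for illustration; the Matlab study's actual parameters may differ.

    def lqas_classify(covered_in_sample, n=19, d=13):
        """Classify a work-area from a sample of n: 'adequate' if at least
        d sampled individuals received the service, else 'inadequate'."""
        assert 0 <= covered_in_sample <= n
        return "adequate" if covered_in_sample >= d else "inadequate"

    print(lqas_classify(15))  # adequate
    print(lqas_classify(11))  # inadequate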

  14. Performability Modelling Tools, Evaluation Techniques and Applications

    NARCIS (Netherlands)

    Haverkort, Boudewijn R.H.M.

    1990-01-01

    This thesis deals with three aspects of quantitative evaluation of fault-tolerant and distributed computer and communication systems: performability evaluation techniques, performability modelling tools, and performability modelling applications. Performability modelling is a relatively new

  15. Identifying The Most Applicable Renewable Energy Systems Of Iran

    Directory of Open Access Journals (Sweden)

    Nasibeh Mousavi

    2017-03-01

    In recent years, because of the energy crisis, every country has tried to find new ways to reduce energy consumption and obtain maximum use of renewable energy, and Iran is no exception to this progress. Renewable energy is energy that is provided by renewable sources such as the sun or wind. In general, renewable energies are not adaptable to every single community. Because of the location and special climate conditions of Iran, the most applicable renewable energy systems in Iran are solar and wind energy. The main purpose of this paper is to review and identify the most applicable renewable energy systems of Iran and also to review the traditional and current methods utilized to obtain maximum use of these renewable energies.

  16. Identifying the most significant indicators of the total road safety performance index.

    Science.gov (United States)

    Tešić, Milan; Hermans, Elke; Lipovac, Krsto; Pešić, Dalibor

    2018-04-01

    The review of the national and international literature dealing with the assessment of the road safety level has shown great efforts of the authors who tried to define the methodology for calculating the composite road safety index on a territory (region, state, etc.). The procedure for obtaining a road safety composite index of an area has been largely harmonized. The question that has not been fully resolved yet concerns the selection of indicators. There is a wide range of road safety indicators used to show a road safety situation on a territory. A road safety performance index (RSPI) obtained on the basis of a larger number of safety performance indicators (SPIs) enables decision makers to define earlier goal-oriented actions more precisely. Moreover, recording a broader comprehensive set of SPIs helps identify the strengths and weaknesses of a country's road safety system. Providing high quality national and international databases that would include comparable SPIs seems to be difficult, since a larger number of countries dispose of only a small number of identical indicators available for use. Therefore, there is a need for calculating a road safety performance index with a limited number of indicators (RSPI ln n) which will provide a comparison of sufficient quality for as many countries as possible. The application of the Data Envelopment Analysis (DEA) method and correlative analysis has helped to check whether the RSPI ln n is likely to be of sufficient quality. A strong correlation between the RSPI ln n and the RSPI has been identified using the proposed methodology. Based on this, the most contributing indicators and methodologies for gradual monitoring of SPIs have been defined for each country analyzed. The indicator monitoring phases in the analyzed countries have been defined in the following way: Phase 1 - the indicators relating to alcohol, speed and protective systems; Phase 2 - the indicators relating to roads; and Phase 3 - the indicators relating to …

  17. Performance test of a dual-purpose disc agrochemical applicator ...

    African Journals Online (AJOL)

    The performance test of a dual-purpose disc agrochemical applicator for field crop was conducted with view to assess the distribution patterns/droplet sizes and uniformity of spreading and or spraying for the agrochemical application. The equipment performances for both granular and liquid chemical application were ...

  18. Total System Performance Assessment - License Application Methods and Approach

    International Nuclear Information System (INIS)

    McNeish, J.

    2003-01-01

    "Total System Performance Assessment-License Application (TSPA-LA) Methods and Approach" provides the top-level method and approach for conducting the TSPA-LA model development and analyses. The method and approach are responsive to the criteria set forth in Total System Performance Assessment Integration (TSPAI) Key Technical Issues (KTIs) identified in agreements with the U.S. Nuclear Regulatory Commission, the "Yucca Mountain Review Plan", "Final Report" (NRC 2003 [163274]), and the NRC final rule 10 CFR Part 63 (NRC 2002 [156605]). This introductory section provides an overview of the TSPA-LA, the projected TSPA-LA documentation structure, and the goals of the document. It also provides a brief discussion of the regulatory framework, the approach to risk management of the development and analysis of the model, and the overall organization of the document. The section closes with some important conventions that are used in this document.

  19. Benchmarking road safety performance: Identifying a meaningful reference (best-in-class).

    Science.gov (United States)

    Chen, Faan; Wu, Jiaorong; Chen, Xiaohong; Wang, Jianjun; Wang, Di

    2016-01-01

    For road safety improvement, comparing and benchmarking performance are widely advocated as the emerging and preferred approaches. However, there is currently no universally agreed upon approach for the process of road safety benchmarking, and performing the practice successfully is by no means easy. This is especially true for the two core activities: (1) developing a set of road safety performance indicators (SPIs) and combining them into a composite index; and (2) identifying a meaningful reference (best-in-class), one which has already attained outstanding road safety practices. To this end, a scientific technique that can combine the multi-dimensional safety performance indicators (SPIs) into an overall index, and subsequently identify the 'best-in-class', is urgently required. In this paper, the Entropy-embedded RSR (Rank-sum ratio), an innovative, scientific and systematic methodology, is investigated with the aim of conducting the above two core tasks in an integrative and concise procedure, more specifically in a 'one-stop' way. Using a combination of results from other methods (e.g. the SUNflower approach) and other measures (e.g. Human Development Index) as a relevant reference, a given set of European countries are robustly ranked and grouped into several classes based on the composite Road Safety Index. Within each class the 'best-in-class' is then identified. By benchmarking road safety performance, the results serve to promote best practice, encourage the adoption of successful road safety strategies and measures and, more importantly, inspire the kind of political leadership needed to create a road transport system that maximizes safety. Copyright © 2015 Elsevier Ltd. All rights reserved.
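
    The entropy-weighting step at the heart of such a composite index can be sketched briefly: indicators whose values are more dispersed across countries carry more information and receive more weight. The three-country, three-SPI matrix below is invented for illustration.

    import numpy as np

    # Rows: countries; columns: SPIs, all oriented so that higher = safer.
    spi = np.array([[0.82, 0.40, 0.65],
                    [0.55, 0.70, 0.50],
                    [0.90, 0.60, 0.80]])

    p = spi / spi.sum(axis=0)                        # each country's share per SPI
    entropy = -(p * np.log(p)).sum(axis=0) / np.log(len(spi))
    weights = (1 - entropy) / (1 - entropy).sum()    # more dispersion -> more weight
    composite = spi @ weights                        # composite road safety index
    print(weights, composite.argsort()[::-1])        # SPI weights; ranking, best first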

  20. The application of particle swarm optimization to identify gamma spectrum with neural network

    International Nuclear Information System (INIS)

    Shi Dongsheng; Di Yuming; Zhou Chunlin

    2006-01-01

    Aiming at the shortcomings of the BP algorithm in the application of neural networks to gamma-spectrum identification, namely that it is usually trapped in a local optimum and has a low speed of convergence, and exploiting the global optimal searching ability of particle swarm optimization, this paper puts forward a new algorithm for neural network training that combines the BP algorithm and particle swarm optimization: the mixed PSO-BP algorithm. In the application to gamma-spectrum identification, the new algorithm overcomes the shortcoming that the BP algorithm is usually trapped in a local optimum, and the neural network trained by it has a high ability of generalization, with an identification result of one hundred percent correct. A practical example shows that the mixed PSO-BP algorithm can be used effectively and reliably to identify gamma spectra. (authors)
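
    The hybrid idea can be illustrated with a toy regressor: a PSO stage searches globally over weight vectors, then plain gradient (BP-style) steps refine the best particle. This is a sketch of the general PSO-BP scheme on synthetic data, not the authors' gamma-spectrum network.

    import numpy as np

    rng = np.random.default_rng(3)
    X = rng.normal(size=(200, 8))
    y = X @ rng.normal(size=8)                     # synthetic linear task

    def loss(w):
        return float(np.mean((X @ w - y) ** 2))

    # PSO stage: global search over candidate weight vectors.
    pos = rng.normal(size=(30, 8))
    vel = np.zeros_like(pos)
    pbest, pbest_f = pos.copy(), np.array([loss(w) for w in pos])
    for _ in range(50):
        gbest = pbest[pbest_f.argmin()]
        r1, r2 = rng.uniform(size=(2, 30, 8))
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos += vel
        f = np.array([loss(w) for w in pos])
        better = f < pbest_f
        pbest[better], pbest_f[better] = pos[better], f[better]

    # BP stage: local gradient refinement starting from the PSO solution.
    w = pbest[pbest_f.argmin()].copy()
    for _ in range(200):
        w -= 0.01 * (2 / len(y)) * X.T @ (X @ w - y)
    print(loss(w))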

  1. Plasma proteomics to identify biomarkers - Application to cardiovascular diseases

    DEFF Research Database (Denmark)

    Beck, Hans Christian; Overgaard, Martin; Melholt Rasmussen, Lars

    2015-01-01

    There is an unmet need for new cardiovascular biomarkers. Despite this, only a few biomarkers for the diagnosis or screening of cardiovascular diseases have been implemented in the clinic. Thousands of proteins can be analysed in plasma by mass spectrometry-based proteomics technologies, so this technology may identify new biomarkers that have not previously been associated with cardiovascular diseases. In this review, we summarize the key challenges and considerations, including strategies, recent discoveries and clinical applications in cardiovascular proteomics, that may lead …

  2. Identifying the Gifted Child Humorist.

    Science.gov (United States)

    Fern, Tami L.

    1991-01-01

    This study attempted to identify gifted child humorists among 1,204 children in grades 3-6. Final identification of 13 gifted child humorists was determined through application of such criteria as funniness, originality, and exemplary performance or product. The influence of intelligence, development, social factors, sex differences, family…

  3. A prediction model to identify hospitalised, older adults with reduced physical performance

    DEFF Research Database (Denmark)

    Bruun, Inge H; Maribo, Thomas; Nørgaard, Birgitte

    2017-01-01

    BACKGROUND: Identifying older adults with reduced physical performance at the time of hospital admission can significantly affect patient management and trajectory. For example, such patients could receive targeted hospital interventions such as routine mobilisation. Furthermore, at the time of discharge, health systems could offer these patients additional therapy to maintain or improve health and prevent institutionalisation or readmission. The principal aim of this study was to identify predictors for persisting, reduced physical performance in older adults following acute hospitalisation … admission, falls, physical activity level, self-rated health, use of a walking aid before admission, number of prescribed medications, 30s-CST, and the De Morton Mobility Index. RESULTS: A total of 78 (67%) patients improved in physical performance in the interval between admission and follow-up assessment …

  4. Performance analysis of CRF-based learning for processing WoT application requests expressed in natural language.

    Science.gov (United States)

    Yoon, Young

    2016-01-01

    In this paper, we investigate the effectiveness of a CRF-based learning method for identifying the Web of Things (WoT) application components needed to satisfy users' requests issued in natural language. For instance, a user request such as "archive all sports breaking news" can be satisfied by composing a WoT application that consists of the ESPN breaking news service and Dropbox as a storage service. We built an engine that can identify the necessary application components by recognizing a main act (MA) or named entities (NEs) in a given request. We trained this engine with the descriptions of WoT applications (called recipes) that were collected from the IFTTT WoT platform. IFTTT hosts over 300 WoT entities that offer thousands of functions referred to as triggers and actions. There are more than 270,000 publicly available recipes composed with those functions by real users. Therefore, the set of these recipes is well qualified for training our MA and NE recognition engine. We share our unique experience of generating the training and test sets from these recipe descriptions and assess the performance of the CRF-based learning method. Based on the performance evaluation, we introduce further research directions.
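
    A hedged sketch of such a tagger follows, using sklearn-crfsuite (our choice of toolkit; the paper does not name one) with toy word features and two training sentences standing in for the IFTTT recipe corpus. Labels mark the main act (MA) and named entities (NE).

    import sklearn_crfsuite

    def feats(sent, i):
        """Minimal per-token feature dict; real systems would use richer features."""
        return {"w": sent[i].lower(), "is_first": i == 0,
                "prev": sent[i - 1].lower() if i else "<s>"}

    train = [(["archive", "all", "sports", "breaking", "news"],
              ["MA", "O", "NE", "NE", "NE"]),
             (["save", "new", "photos", "to", "Dropbox"],
              ["MA", "O", "NE", "O", "NE"])]

    X = [[feats(s, i) for i in range(len(s))] for s, _ in train]
    y = [tags for _, tags in train]
    crf = sklearn_crfsuite.CRF(algorithm="lbfgs", max_iterations=50).fit(X, y)

    test = ["archive", "all", "tech", "breaking", "news"]
    print(crf.predict([[feats(test, i) for i in range(len(test))]]))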

  5. Solving Enterprise Applications Performance Puzzles Queuing Models to the Rescue

    CERN Document Server

    Grinshpan, Leonid

    2012-01-01

    A groundbreaking scientific approach to solving enterprise applications performance problems Enterprise applications are the information backbone of today's corporations, supporting vital business functions such as operational management, supply chain maintenance, customer relationship administration, business intelligence, accounting, procurement logistics, and more. Acceptable performance of enterprise applications is critical for a company's day-to-day operations as well as for its profitability. Unfortunately, troubleshooting poorly performing enterprise applications has traditionally

  6. 12 CFR 228.29 - Effect of CRA performance on applications.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 3 2010-01-01 2010-01-01 false Effect of CRA performance on applications. 228... account the record of performance under the CRA of: (1) Each applicant bank for the: (i) Establishment of... approval of application. A bank's record of performance may be the basis for denying or conditioning...

  7. 12 CFR 25.29 - Effect of CRA performance on applications.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 1 2010-01-01 2010-01-01 false Effect of CRA performance on applications. 25... takes into account the record of performance under the CRA of each applicant bank in considering an... application. A bank's record of performance may be the basis for denying or conditioning approval of an...

  8. Total System Performance Assessment - License Application Methods and Approach

    Energy Technology Data Exchange (ETDEWEB)

    J. McNeish

    2003-12-08

    "Total System Performance Assessment-License Application (TSPA-LA) Methods and Approach" provides the top-level method and approach for conducting the TSPA-LA model development and analyses. The method and approach are responsive to the criteria set forth in Total System Performance Assessment Integration (TSPAI) Key Technical Issues (KTIs) identified in agreements with the U.S. Nuclear Regulatory Commission, the "Yucca Mountain Review Plan" (YMRP), "Final Report" (NRC 2003 [163274]), and the NRC final rule 10 CFR Part 63 (NRC 2002 [156605]). This introductory section provides an overview of the TSPA-LA, the projected TSPA-LA documentation structure, and the goals of the document. It also provides a brief discussion of the regulatory framework, the approach to risk management in the development and analysis of the model, and the overall organization of the document. The section closes with some important conventions that are used in this document.

  9. Expert system applications to nuclear plant for enhancement of productivity and performance

    International Nuclear Information System (INIS)

    Sun, B.; Cain, D.; Naser, J.; Colley, R.; Hirota, N.

    1988-01-01

    Expert systems, a major strand of artificial intelligence (AI) technology, are computer software and hardware systems designed to capture and emulate the knowledge, reasoning, and judgment of human experts and to store their expertise. Two parallel efforts are being performed at the Electric Power Research Institute (EPRI) to help the electric utility industry take advantage of expert system technology. The first effort is the development of expert system building tools which are tailored to electric utility industry applications. The second effort is the development of expert system application prototypes. These two efforts complement each other. The application development tests the tools and identifies additional tool capabilities which are required. The tool development helps define the applications which can be successfully developed. This paper summarizes a number of research projects which are being performed at EPRI in both the areas of expert system building tool development and expert system applications to operations and maintenance. The AI technology as demonstrated by these developments is being established as a credible technological tool for the electric utility industry. A challenge in transferring expert systems technology to the utility industry is to gain utility users' acceptance of this modern information technology. To achieve successful technology transfer, the technology developers need to (1) understand the problems which can be addressed successfully using AI technology, (2) involve users throughout the development and testing phases, and (3) demonstrate the benefits of the technology to the users.

  10. Screening applicants for risk of poor academic performance: a novel scoring system using preadmission grade point averages and graduate record examination scores.

    Science.gov (United States)

    Luce, David

    2011-01-01

    The purpose of this study was to develop an effective screening tool for identifying physician assistant (PA) program applicants at highest risk for poor academic performance. Prior to reviewing applications for the class of 2009, a retrospective analysis of preadmission data took place for the classes of 2006, 2007, and 2008. A single composite score was calculated for each student who matriculated (number of subjects, N=228) incorporating the total undergraduate grade point average (UGPA), the science GPA (SGPA), and the three component Graduate Record Examination (GRE) scores: verbal (GRE-V), quantitative (GRE-Q), analytical (GRE-A). Individual applicant scores for each of the five parameters were ranked in descending quintiles. Each applicant's five quintile scores were then added, yielding a total quintile score ranging from 25, which indicated an excellent performance, to 5, which indicated poorer performance. Thirteen of the 228 students had academic difficulty (dismissal, suspension, or one-quarter on academic warning or probation). Twelve of the 13 students having academic difficulty had a preadmission total quintile score of ≤ 12 (range, 6-14). In response to this descriptive analysis, when selecting applicants for the class of 2009, the admissions committee used the total quintile score for screening applicants for interviews. Analysis of correlations in preadmission, graduate, and postgraduate performance data for the classes of 2009-2013 will continue and may help identify those applicants at risk for academic difficulty. Establishing a threshold total quintile score of applicant GPA and GRE scores may significantly decrease the number of entering PA students at risk for poor academic performance.
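
    As a minimal sketch of the scoring scheme described above (assuming pandas; the applicant data and column names are hypothetical), each of the five parameters is ranked in quintiles and the five quintile ranks are summed:

      import pandas as pd

      # Hypothetical applicant records; column names are illustrative.
      applicants = pd.DataFrame({
          "ugpa":  [3.9, 3.2, 3.6, 2.9, 3.4],
          "sgpa":  [3.8, 3.0, 3.5, 2.8, 3.3],
          "gre_v": [640, 500, 560, 480, 520],
          "gre_q": [680, 490, 580, 470, 530],
          "gre_a": [4.5, 3.0, 4.0, 3.0, 3.5],
      })

      for col in list(applicants.columns):
          # Rank in quintiles (1 = bottom fifth, 5 = top fifth); ranking first
          # keeps qcut happy when applicants share identical values.
          applicants[col + "_q"] = pd.qcut(
              applicants[col].rank(method="first"), 5, labels=[1, 2, 3, 4, 5]
          ).astype(int)

      q_cols = [c for c in applicants.columns if c.endswith("_q")]
      applicants["total_quintile_score"] = applicants[q_cols].sum(axis=1)
      print(applicants["total_quintile_score"])  # ranges from 5 to 25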

  11. Identifying context factors explaining physician's low performance in communication assessment: an explorative study in general practice.

    Science.gov (United States)

    Essers, Geurt; van Dulmen, Sandra; van Weel, Chris; van der Vleuten, Cees; Kramer, Anneke

    2011-12-13

    Communication is a key competence for health care professionals. Analysis of registrar and GP communication performance in daily practice, however, suggests a suboptimal application of communication skills. The influence of context factors could reveal why communication performance levels, on average, do not appear adequate. The context of daily practice may require different skills or specific ways of handling these skills, whereas communication skills are mostly treated as generic. So far no empirical analysis of the context has been made. Our aim was to identify context factors that could be related to GP communication. A purposive sample of real-life videotaped GP consultations was analyzed (N = 17). As a frame of reference we chose the MAAS-Global, a widely used assessment instrument for medical communication. By inductive reasoning, we analyzed the GP behaviour in the consultation leading to poor item scores on the MAAS-Global. In these cases we looked for the presence of an intervening context factor, and how this might explain the actual GP communication behaviour. We reached saturation after having viewed 17 consultations. We identified 19 context factors that could potentially explain the deviation from generic recommendations on communication skills. These context factors can be categorized into doctor-related, patient-related, and consultation-related factors. Several context factors seem to influence doctor-patient communication, requiring the GP to apply communication skills differently from recommendations on communication. From this study we conclude that there is a need to explicitly account for context factors in the assessment of GP (and GP registrar) communication performance. The next step is to validate our findings.

  12. APM Best Practices Realizing Application Performance Management

    CERN Document Server

    Sydor, Michael J

    2011-01-01

    The objective of APM Best Practices: Realizing Application Performance Management is to establish reliable application performance management (APM) practices - to demonstrate value, to do it quickly, and to adapt to the client circumstances. It's important to balance long-term goals with short-term deliverables, but without compromising usefulness or correctness. The successful strategy is to establish a few reasonable goals, achieve them quickly, and then iterate over the same topics two more times, with each successive iteration expanding the skills and capabilities of the APM team. This strategy…

  13. High-performance silicon photonics technology for telecommunications applications.

    Science.gov (United States)

    Yamada, Koji; Tsuchizawa, Tai; Nishi, Hidetaka; Kou, Rai; Hiraki, Tatsurou; Takeda, Kotaro; Fukuda, Hiroshi; Ishikawa, Yasuhiko; Wada, Kazumi; Yamamoto, Tsuyoshi

    2014-04-01

    By way of a brief review of Si photonics technology, we show that significant improvements in device performance are necessary for practical telecommunications applications. In order to improve device performance in Si photonics, we have developed a Si-Ge-silica monolithic integration platform, on which compact Si-Ge-based modulators/detectors and silica-based high-performance wavelength filters are monolithically integrated. The platform features low-temperature silica film deposition, which cannot damage Si-Ge-based active devices. Using this platform, we have developed various integrated photonic devices for broadband telecommunications applications.

  14. High-performance silicon photonics technology for telecommunications applications

    International Nuclear Information System (INIS)

    Yamada, Koji; Tsuchizawa, Tai; Nishi, Hidetaka; Kou, Rai; Hiraki, Tatsurou; Takeda, Kotaro; Fukuda, Hiroshi; Yamamoto, Tsuyoshi; Ishikawa, Yasuhiko; Wada, Kazumi

    2014-01-01

    By way of a brief review of Si photonics technology, we show that significant improvements in device performance are necessary for practical telecommunications applications. In order to improve device performance in Si photonics, we have developed a Si-Ge-silica monolithic integration platform, on which compact Si-Ge–based modulators/detectors and silica-based high-performance wavelength filters are monolithically integrated. The platform features low-temperature silica film deposition, which cannot damage Si-Ge–based active devices. Using this platform, we have developed various integrated photonic devices for broadband telecommunications applications. (review)

  15. High-performance silicon photonics technology for telecommunications applications

    Science.gov (United States)

    Yamada, Koji; Tsuchizawa, Tai; Nishi, Hidetaka; Kou, Rai; Hiraki, Tatsurou; Takeda, Kotaro; Fukuda, Hiroshi; Ishikawa, Yasuhiko; Wada, Kazumi; Yamamoto, Tsuyoshi

    2014-04-01

    By way of a brief review of Si photonics technology, we show that significant improvements in device performance are necessary for practical telecommunications applications. In order to improve device performance in Si photonics, we have developed a Si-Ge-silica monolithic integration platform, on which compact Si-Ge-based modulators/detectors and silica-based high-performance wavelength filters are monolithically integrated. The platform features low-temperature silica film deposition, which cannot damage Si-Ge-based active devices. Using this platform, we have developed various integrated photonic devices for broadband telecommunications applications.

  16. Application of identifying transmission spheres for spherical surface testing

    Science.gov (United States)

    Han, Christopher B.; Ye, Xin; Li, Xueyuan; Wang, Quanzhao; Tang, Shouhong; Han, Sen

    2017-06-01

    We developed a new application using Microsoft Foundation Classes (MFC) to identify correct transmission spheres (TS) for Spherical Surface Testing (SST). Spherical surfaces are important optical surfaces, and the wide application and high production rate of spherical surfaces necessitate an accurate and highly reliable measuring device. A Fizeau Interferometer is an appropriate tool for SST due to its subnanometer accuracy. It measures the contour of a spherical surface using a common path, which is insensitive to the surrounding circumstances. The Fizeau Interferometer transmits a wide laser beam, creating interference fringes from re-converging light from the transmission sphere and the test surface. To make a successful measurement, the application calculates and determines the appropriate transmission sphere for the test surface. There are 3 main inputs from the test surfaces that are utilized to determine the optimal sizes and F-numbers of the transmission spheres: (1) the curvatures (concave or convex), (2) the Radii of Curvature (ROC), and (3) the aperture sizes. The application first calculates the F-number (i.e. ROC divided by aperture) of the test surface, second determines the correct aperture size for a convex surface, third verifies that the ROC of the test surface is shorter than the ROC of the transmission sphere's reference surface, and last calculates the percentage of the test surface area that will be measured. However, when measuring large spherical surfaces the number of interferometers and transmission spheres should be optimized, to avoid requiring a large inventory of interferometers and transmission spheres for each test surface. Current measuring practices involve tedious and potentially inaccurate manual calculations. This smart application eliminates human calculation errors, optimizes the selection of transmission spheres (including the least number required) and interferometer sizes, and increases efficiency.
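
    A minimal sketch of the screening logic described above; the helper names are hypothetical and the transmission-sphere catalogue is illustrative, not a real product list:

      def f_number(roc_mm, aperture_mm):
          return abs(roc_mm) / aperture_mm

      def ts_suitable(surf_roc, surf_aperture, ts_f_number, ts_ref_roc):
          # The TS must be at least as "fast" (smaller F-number) as the test
          # surface, and the surface ROC must be shorter than the ROC of the
          # TS reference surface.
          return (ts_f_number <= f_number(surf_roc, surf_aperture)
                  and abs(surf_roc) < abs(ts_ref_roc))

      # Example: concave surface, ROC 120 mm, clear aperture 40 mm (F/3).
      catalogue = {"F/3.3 TS": (3.3, 150.0), "F/1.5 TS": (1.5, 130.0)}
      for name, (fnum, ref_roc) in catalogue.items():
          print(name, "suitable:", ts_suitable(-120.0, 40.0, fnum, ref_roc))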

  17. A systematic literature search to identify performance measure outcomes used in clinical studies of racehorses.

    Science.gov (United States)

    Wylie, C E; Newton, J R

    2018-05-01

    Racing performance is often used as a measurable outcome variable in research studies investigating clinical diagnoses or interventions. However, the use of many different performance measures largely precludes conduct of meaningful comparative studies and, to date, those being used have not been collated. To systematically review the veterinary scientific literature for the use of racing performance as a measurable outcome variable in clinical studies of racehorses, collate and identify those most popular, and identify their advantages and disadvantages. Systematic literature search. The search criteria "((racing AND performance) AND (horses OR equidae))" were adapted for both MEDLINE and CAB Abstracts databases. Data were collected in standardised recording forms for binary, categorical and quantitative measures, and the use of performance indices. In total, 217 studies that described racing performance were identified, contributing 117 different performance measures. No one performance measure was used in all studies, despite 90.3% using more than one variable. Data regarding race starts and earnings were used most commonly, with 88.0% and 54.4% of studies including at least one measure of starts and earnings, respectively. Seventeen variables were used 10 times or more, with the top five comprising: 'return to racing', 'number of starts', 'days to first start', 'earnings per period of time' and 'earnings per start'. The search strategies may not have identified all relevant papers, introducing bias to the review. Performance indices have been developed to improve assessment of interventions; however, they are not widely adopted in the scientific literature. Use of the two most commonly identified measures, whether the horse returned to racing and number of starts over a defined period of time, would best facilitate future systematic reviews and meta-analyses in advance of the development of a gold-standard measure of race performance outcome. © 2017 EVJ Ltd.

  18. Optical design applications for enhanced illumination performance

    Science.gov (United States)

    Gilray, Carl; Lewin, Ian

    1995-08-01

    Nonimaging optical design techniques have been applied in the illumination industry for many years. Recently however, powerful software has been developed which allows accurate simulation and optimization of illumination devices. Wide experience has been obtained in using such design techniques for practical situations. These include automotive lighting where safety is of greatest importance, commercial lighting systems designed for energy efficiency, and numerous specialized applications. This presentation will discuss the performance requirements of a variety of illumination devices. It will further cover design methodology and present a variety of examples of practical applications for enhanced system performance.

  19. High Performance Computing Software Applications for Space Situational Awareness

    Science.gov (United States)

    Giuliano, C.; Schumacher, P.; Matson, C.; Chun, F.; Duncan, B.; Borelli, K.; Desonia, R.; Gusciora, G.; Roe, K.

    The High Performance Computing Software Applications Institute for Space Situational Awareness (HSAI-SSA) has completed its first full year of applications development. The emphasis of our work in this first year was on improving space surveillance sensor models and image enhancement software. These applications are the Space Surveillance Network Analysis Model (SSNAM), the Air Force Space Fence simulation (SimFence), and the physically constrained iterative de-convolution (PCID) image enhancement software tool. Specifically, we have demonstrated an order-of-magnitude speed-up in those codes running on the latest Cray XD-1 Linux supercomputer (Hoku) at the Maui High Performance Computing Center. The software application improvements that HSAI-SSA has made have had a significant impact on the warfighter and have fundamentally changed the role of high performance computing in SSA.

  20. Identifiability and Identification of Trace Continuous Pollutant Source

    Directory of Open Access Journals (Sweden)

    Hongquan Qu

    2014-01-01

    Accidental pollution events often threaten people's health and lives, and prompt identification of the pollutant source is necessary so that remedial actions can be taken. In this paper, a trace continuous pollutant source identification method is developed to identify a sudden continuous emission source in an enclosed space. The location probability model is set up first, and the identification method is then realized by searching for the global optimum of the location probability. To examine the identifiability performance of the presented method, the concept of a synergy degree of velocity fields is introduced to quantitatively analyze the impact of the velocity field on identification performance. Based on this concept, several simulation cases were conducted, and the application conditions of the method were obtained from the simulation studies. To verify the presented method, we designed an experiment and identified an unknown source appearing in the experimental space. The result showed that the method can identify a sudden trace continuous source when the studied situation satisfies the application conditions.

  1. Real-Time Application Performance Steering and Adaptive Control

    National Research Council Canada - National Science Library

    Reed, Daniel

    2002-01-01

    .... The objective of the Real-time Application Performance Steering and Adaptive Control project is to replace ad hoc, post-mortem performance optimization with an extensible, portable, and distributed...

  2. Cache Performance Optimization for SoC Video Applications

    OpenAIRE

    Lei Li; Wei Zhang; HuiYao An; Xing Zhang; HuaiQi Zhu

    2014-01-01

    Chip Multiprocessors (CMPs) have been adopted by industry to deal with the speed limits of single processors, but memory access has become the performance bottleneck, especially in multimedia applications. In this paper, a set of management policies is proposed to improve cache performance for an SoC video-application platform. By analyzing the behavior of the Video Engine, memory-friendly writeback and efficient prefetch policies are adopted. The experiment platform is simulated by ...

  3. LFK, FORTRAN Application Performance Test

    International Nuclear Information System (INIS)

    McMahon, F.H.

    1991-01-01

    1 - Description of program or function: LFK, the Livermore FORTRAN Kernels, is a computer performance test that measures a realistic floating-point performance range for FORTRAN applications. Informally known as the Livermore Loops test, the LFK test may be used as a computer performance test, as a test of compiler accuracy (via checksums) and efficiency, or as a hardware endurance test. The LFK test, which focuses on FORTRAN as used in computational physics, measures the joint performance of the computer CPU, the compiler, and the computational structures in units of Mega-flops/sec or Mflops. A C language version of subroutine KERNEL is also included which executes 24 samples of C numerical computation. The 24 kernels are a hydrodynamics code fragment, a fragment from an incomplete Cholesky conjugate gradient code, the standard inner product function of linear algebra, a fragment from a banded linear equations routine, a segment of a tridiagonal elimination routine, an example of a general linear recurrence equation, an equation of state fragment, part of an alternating direction implicit integration code, an integrate predictor code, a difference predictor code, a first sum, a first difference, a fragment from a two-dimensional particle-in-cell code, a part of a one-dimensional particle-in-cell code, an example of how casually FORTRAN can be written, a Monte Carlo search loop, an example of an implicit conditional computation, a fragment of a two-dimensional explicit hydrodynamics code, a general linear recurrence equation, part of a discrete ordinates transport program, a simple matrix calculation, a segment of a Planck distribution procedure, a two-dimensional implicit hydrodynamics fragment, and determination of the location of the first minimum in an array. 2 - Method of solution: CPU performance rates depend strongly on the maturity of FORTRAN compiler machine code optimization. The LFK test-bed executes the set of 24 kernels three times, resetting the DO
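
    The LFK source itself is FORTRAN (with a C version of subroutine KERNEL); purely as an illustration of what one kernel measures, the following sketch re-expresses kernel 1, the hydrodynamics fragment, in Python with a crude Mflops estimate. An interpreted loop is of course far slower than the compiled code the benchmark actually times, and the loop bound is illustrative.

      import time

      n, q, r, t = 1001, 0.5, 1.5, 0.25
      y = [0.1] * (n + 11)
      z = [0.2] * (n + 11)
      x = [0.0] * n

      start = time.perf_counter()
      for k in range(n):
          # LFK kernel 1: x(k) = q + y(k) * (r * z(k+10) + t * z(k+11))
          x[k] = q + y[k] * (r * z[k + 10] + t * z[k + 11])
      elapsed = time.perf_counter() - start

      flops = 5 * n  # 3 multiplies and 2 adds per iteration
      print("Mflops: %.3f" % (flops / elapsed / 1e6))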

  4. Optical Thermal Characterization Enables High-Performance Electronics Applications

    Energy Technology Data Exchange (ETDEWEB)

    2016-02-01

    NREL developed a modeling and experimental strategy to characterize thermal performance of materials. The technique provides critical data on thermal properties with relevance for electronics packaging applications. Thermal contact resistance and bulk thermal conductivity were characterized for new high-performance materials such as thermoplastics, boron-nitride nanosheets, copper nanowires, and atomically bonded layers. The technique is an important tool for developing designs and materials that enable power electronics packaging with small footprint, high power density, and low cost for numerous applications.

  5. Identifying Key Performance Indicators for Holistic Hospital Management with a Modified DEMATEL Approach.

    Science.gov (United States)

    Si, Sheng-Li; You, Xiao-Yue; Liu, Hu-Chen; Huang, Jia

    2017-08-19

    Performance analysis is an important way for hospitals to achieve higher efficiency and effectiveness in providing services to their customers. The performance of the healthcare system can be measured by many indicators, but it is difficult to improve them simultaneously due to the limited resources. A feasible way is to identify the central and influential indicators to improve healthcare performance in a stepwise manner. In this paper, we propose a hybrid multiple criteria decision making (MCDM) approach to identify key performance indicators (KPIs) for holistic hospital management. First, through integrating evidential reasoning approach and interval 2-tuple linguistic variables, various assessments of performance indicators provided by healthcare experts are modeled. Then, the decision making trial and evaluation laboratory (DEMATEL) technique is adopted to build an interactive network and visualize the causal relationships between the performance indicators. Finally, an empirical case study is provided to demonstrate the proposed approach for improving the efficiency of healthcare management. The results show that "accidents/adverse events", "nosocomial infection", "incidents/errors", "number of operations/procedures" are significant influential indicators. Also, the indicators of "length of stay", "bed occupancy" and "financial measures" play important roles in performance evaluation of the healthcare organization. The proposed decision making approach could be considered as a reference for healthcare administrators to enhance the performance of their healthcare institutions.
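
    A minimal sketch of the core DEMATEL computation (assuming NumPy; the 4x4 direct-influence matrix is illustrative, not data from the study):

      import numpy as np

      # Illustrative direct-influence matrix (rows influence columns).
      A = np.array([[0, 3, 2, 1],
                    [1, 0, 3, 2],
                    [2, 1, 0, 3],
                    [1, 2, 1, 0]], dtype=float)

      # Normalise by the largest row sum (one common convention), then form the
      # total-relation matrix T = D (I - D)^-1, which accumulates both direct
      # and indirect influence along all paths.
      D = A / A.sum(axis=1).max()
      T = D @ np.linalg.inv(np.eye(len(D)) - D)

      prominence = T.sum(axis=1) + T.sum(axis=0)  # importance of each indicator
      relation = T.sum(axis=1) - T.sum(axis=0)    # net cause (+) or effect (-)
      print(prominence, relation)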

  6. DURIP: High Performance Computing in Biomathematics Applications

    Science.gov (United States)

    2017-05-10

    The goal of this award was to enhance the capabilities of the Department of Applied Mathematics and Statistics (AMS) at the University of California, Santa Cruz (UCSC) to conduct research and research-related education in areas of high performance computing in biomathematics applications.

  7. Performance Analysis: Work Control Events Identified January - August 2010

    Energy Technology Data Exchange (ETDEWEB)

    De Grange, C E; Freeman, J W; Kerr, C E; Holman, G; Marsh, K; Beach, R

    2011-01-14

    elements of an institutional work planning and control system. By the end of that year this system was documented and implementation had begun. In 2009, training of the workforce began and as of the time of this report more than 50% of authorized Integration Work Sheets (IWS) use the activity-based planning process. In 2010, LSO independently reviewed the work planning and control process and confirmed to the Laboratory that the Integrated Safety Management (ISM) System was implemented. LLNL conducted a cross-directorate management self-assessment of work planning and control and is developing actions to respond to the issues identified. Ongoing efforts to strengthen the work planning and control process and to improve the quality of LLNL work packages are in progress: completion of remaining actions in response to the 2009 DOE Office of Health, Safety, and Security (HSS) evaluation of LLNL's ISM System; scheduling more than 14 work planning and control self-assessments in FY11; continuing to align subcontractor work control with the Institutional work planning and control system; and continuing to maintain the electronic IWS application. The 24 events included in this analysis were caused by errors in the first four of the five ISMS functions. The most frequent cause was errors in analyzing the hazards (Function 2). The second most frequent cause was errors occurring when defining the work (Function 1), followed by errors during the performance of work (Function 4). Interestingly, very few errors in developing controls (Function 3) resulted in events. This leads one to conclude that if improvements are made to defining the scope of work and analyzing the potential hazards, LLNL may reduce the frequency or severity of events. Analysis of the 24 events resulted in the identification of ten common causes. Some events had multiple causes, resulting in the mention of 39 causes being identified for the 24 events. The most frequent cause was workers, supervisors, or experts

  8. A Cross-Platform Infrastructure for Scalable Runtime Application Performance Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Jack Dongarra; Shirley Moore; Bart Miller, Jeffrey Hollingsworth; Tracy Rafferty

    2005-03-15

    The purpose of this project was to build an extensible cross-platform infrastructure to facilitate the development of accurate and portable performance analysis tools for current and future high performance computing (HPC) architectures. Major accomplishments include tools and techniques for multidimensional performance analysis, as well as improved support for dynamic performance monitoring of multithreaded and multiprocess applications. Previous performance tool development has been limited by the burden of having to re-write a platform-dependent low-level substrate for each architecture/operating system pair in order to obtain the necessary performance data from the system. Manual interpretation of performance data is not scalable for large-scale long-running applications. The infrastructure developed by this project provides a foundation for building portable and scalable performance analysis tools, with the end goal being to provide application developers with the information they need to analyze, understand, and tune the performance of terascale applications on HPC architectures. The backend portion of the infrastructure provides runtime instrumentation capability and access to hardware performance counters, with thread-safety for shared memory environments and a communication substrate to support instrumentation of multiprocess and distributed programs. Front-end interfaces provide tool developers with a well-defined, platform-independent set of calls for requesting performance data. End-user tools have been developed that demonstrate runtime data collection, on-line and off-line analysis of performance data, and multidimensional performance analysis. The infrastructure is based on two underlying performance instrumentation technologies. These technologies are the PAPI cross-platform library interface to hardware performance counters and the cross-platform Dyninst library interface for runtime modification of executable images. The Paradyn and KOJAK

  9. Key Issues in Empirically Identifying Chronically Low-Performing and Turnaround Schools

    Science.gov (United States)

    Hansen, Michael

    2012-01-01

    One of the US Department of Education's key priorities is turning around the nation's persistently low-achieving schools, yet exactly how to identify low-performing schools is a task left to state policy makers, and a myriad of definitions have been utilized. In addition, exactly how to recognize when a school begins to turn around is not well…

  10. Scientific Applications Performance Evaluation on Burst Buffer

    KAUST Repository

    Markomanolis, George S.

    2017-10-19

    Parallel I/O is an integral component of modern high performance computing, especially for storing and processing very large datasets, as in seismic imaging, CFD, combustion, and weather modeling. The storage hierarchy nowadays includes additional layers, the latest being the use of SSD-based storage as a Burst Buffer for I/O acceleration. We present an in-depth analysis of how to use the Burst Buffer for specific cases and of how the internal MPI I/O aggregators operate according to the options the user provides at job submission. We analyze the performance of a range of I/O-intensive scientific applications at various scales on a large installation of the Lustre parallel file system compared to an SSD-based Burst Buffer. Our results show a performance improvement over Lustre when using the Burst Buffer. Moreover, we show results from a data hierarchy library which indicate that standard I/O approaches are not enough to get the expected performance from this technology. The performance gain in total execution time of the studied applications is between 1.16 and 3 times compared to Lustre. One of the test cases achieved an impressive I/O throughput of 900 GB/s on the Burst Buffer.
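
    The aggregator options referred to above are typically passed to the MPI-IO layer as MPI-Info hints at file-open time. A minimal sketch using mpi4py follows; the hint names are ROMIO conventions and the values are illustrative, not the paper's settings.

      from mpi4py import MPI
      import numpy as np

      comm = MPI.COMM_WORLD
      info = MPI.Info.Create()
      info.Set("cb_nodes", "8")             # number of I/O aggregators
      info.Set("romio_cb_write", "enable")  # force collective buffering on writes

      # Each rank writes its own 4 KiB block with a collective write.
      buf = np.full(1024, comm.Get_rank(), dtype="i4")
      fh = MPI.File.Open(comm, "out.dat",
                         MPI.MODE_CREATE | MPI.MODE_WRONLY, info)
      fh.Write_at_all(buf.nbytes * comm.Get_rank(), buf)
      fh.Close()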

  11. Applications Performance on NAS Intel Paragon XP/S - 15#

    Science.gov (United States)

    Saini, Subhash; Simon, Horst D.; Copper, D. M. (Technical Monitor)

    1994-01-01

    The Numerical Aerodynamic Simulation (NAS) Systems Division received an Intel Touchstone Sigma prototype model Paragon XP/S-15 in February, 1993. The i860 XP microprocessor with an integrated floating point unit and operating in dual-instruction mode gives peak performance of 75 million floating point operations (MFLOPS) per second for 64 bit floating point arithmetic. It is used in the Paragon XP/S-15 which has been installed at NAS, NASA Ames Research Center. The NAS Paragon has 208 nodes and its peak performance is 15.6 GFLOPS. Here, we will report on early experience using the Paragon XP/S-15. We have tested its performance using both kernels and applications of interest to NAS. We have measured the performance of BLAS 1, 2 and 3 both assembly-coded and Fortran coded on NAS Paragon XP/S-15. Furthermore, we have investigated the performance of a single node one-dimensional FFT, a distributed two-dimensional FFT and a distributed three-dimensional FFT. Finally, we measured the performance of NAS Parallel Benchmarks (NPB) on the Paragon and compare it with the performance obtained on other highly parallel machines, such as CM-5, CRAY T3D, IBM SP I, etc. In particular, we investigated the following issues, which can strongly affect the performance of the Paragon: a. Impact of the operating system: Intel currently uses as a default an operating system OSF/1 AD from the Open Software Foundation. The paging of Open Software Foundation (OSF) server at 22 MB to make more memory available for the application degrades the performance. We found that when the limit of 26 MB per node out of 32 MB available is reached, the application is paged out of main memory using virtual memory. When the application starts paging, the performance is considerably reduced. We found that dynamic memory allocation can help applications performance under certain circumstances. b. Impact of data cache on the i860/XP: We measured the performance of the BLAS both assembly coded and Fortran

  12. Assessing the Performance of a Machine Learning Algorithm in Identifying Bubbles in Dust Emission

    Science.gov (United States)

    Xu, Duo; Offner, Stella S. R.

    2017-12-01

    Stellar feedback created by radiation and winds from massive stars plays a significant role in both physical and chemical evolution of molecular clouds. This energy and momentum leaves an identifiable signature (“bubbles”) that affects the dynamics and structure of the cloud. Most bubble searches are performed “by eye,” which is usually time-consuming, subjective, and difficult to calibrate. Automatic classifications based on machine learning make it possible to perform systematic, quantifiable, and repeatable searches for bubbles. We employ a previously developed machine learning algorithm, Brut, and quantitatively evaluate its performance in identifying bubbles using synthetic dust observations. We adopt magnetohydrodynamics simulations, which model stellar winds launching within turbulent molecular clouds, as an input to generate synthetic images. We use a publicly available three-dimensional dust continuum Monte Carlo radiative transfer code, HYPERION, to generate synthetic images of bubbles in three Spitzer bands (4.5, 8, and 24 μm). We designate half of our synthetic bubbles as a training set, which we use to train Brut along with citizen-science data from the Milky Way Project (MWP). We then assess Brut’s accuracy using the remaining synthetic observations. We find that Brut’s performance after retraining increases significantly, and it is able to identify yellow bubbles, which are likely associated with B-type stars. Brut continues to perform well on previously identified high-score bubbles, and over 10% of the MWP bubbles are reclassified as high-confidence bubbles, which were previously marginal or ambiguous detections in the MWP data. We also investigate the influence of the size of the training set, dust model, evolutionary stage, and background noise on bubble identification.
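
    Brut is built around a random forest classifier; as a minimal sketch of the retrain-and-score idea (assuming scikit-learn, with the image-to-feature step reduced to hypothetical stand-in arrays):

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier

      rng = np.random.default_rng(0)
      # Hypothetical stand-ins: each row is an image, columns are features
      # extracted from the three Spitzer bands (4.5, 8, and 24 micron).
      X_synthetic = rng.normal(size=(200, 64))    # synthetic dust observations
      y_synthetic = rng.integers(0, 2, size=200)  # 1 = bubble, 0 = non-bubble

      clf = RandomForestClassifier(n_estimators=300, random_state=0)
      clf.fit(X_synthetic, y_synthetic)

      # Score previously unseen candidates; high probabilities correspond to
      # high-confidence bubble detections.
      X_candidates = rng.normal(size=(10, 64))
      scores = clf.predict_proba(X_candidates)[:, 1]
      print(scores)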

  13. Identifying and overcoming barriers to technology implementation

    International Nuclear Information System (INIS)

    Bailey, M.; Warren, S.; McCune, M.

    1996-01-01

    In a recent General Accounting Office report, the Department of Energy's (DOE) Office of Environmental Management was found to be ineffective in integrating their environmental technology development efforts with the cleanup actions. As a result of these findings, a study of remediation documents was performed by the Technology Applications Team within DOE's Office of Environmental Restoration (EM-40) to validate this finding and to understand why it was occurring. A second initiative built on the foundation of the remediation document study and evaluated solutions to the ineffective implementation of improved technologies. The Technology Applications Team examined over 50 remediation documents (17 projects) which included nearly 600 proposed remediation technologies. It was determined that very few technologies are reaching the Records of Decision documents. In fact, most are eliminated in the early stages of consideration. These observations stem from regulators' and stakeholders' uncertainties in cost and performance of the technology and the inability of the technology to meet site specific conditions. The Technology Applications Team also set out to identify and evaluate solutions to barriers to implementing innovative technology into the DOE's environmental management activities. Through the combined efforts of DOE and the Hazardous Waste Action Coalition (HWAC), a full day workshop was conducted at the annual HWAC meeting in June 1995 to solve barriers to innovative technology implementation. Three barriers were identified as widespread throughout the DOE complex and industry. Identified barriers included a lack of verified or certified cost and performance data for innovative technologies; risk of failure to reach cleanup goals using innovative technologies; and communication barriers that are present at virtually every stage of the characterization/remediation process from development through implementation

  14. Identifying poor performance among doctors in NHS organizations.

    Science.gov (United States)

    Locke, Rachel; Scallan, Samantha; Leach, Camilla; Rickenbach, Mark

    2013-10-01

    The aim was to account for the means by which poor performance among career doctors is identified by National Health Service organizations, whether the tools are considered effective, and how these processes may be strengthened in the light of revalidation and the requirement for doctors to demonstrate their fitness to practice. This study sought to look beyond the 'doctor as individual': as well as considering the typical approaches to managing the practice of an individual, the systems within which the doctor is working were reviewed, as these are also relevant to standards of performance. A qualitative review was undertaken consisting of a literature review of current practice, a policy review of current documentation from 15 trusts in one deanery locality, and 14 semi-structured interviews with respondents who had an overview of the processes in use. The framework for the analysis of the data considered tools at three levels: individual, team and organizational. Tools are, in the main, reactive, with an individual focus. They rely on colleagues and others to speak out, so their effectiveness is hindered by a reluctance to do so. Tools can lack an evidence base for their use, and there is limited linking of data across contexts and tools. There is more work to be done in evaluating current tools and developing stronger processes. Linkage between data sources needs to be improved, and proactive tools at the organizational level need further development to help with the early identification of performance issues. This would also assist in balancing a wider systems approach with the current over-emphasis on individual doctors. © 2012 John Wiley & Sons Ltd.

  15. Application of objective clinical human reliability analysis (OCHRA) in assessment of technical performance in laparoscopic rectal cancer surgery.

    Science.gov (United States)

    Foster, J D; Miskovic, D; Allison, A S; Conti, J A; Ockrim, J; Cooper, E J; Hanna, G B; Francis, N K

    2016-06-01

    Laparoscopic rectal resection is technically challenging, with outcomes dependent upon technical performance. No robust objective assessment tool exists for laparoscopic rectal resection surgery. This study aimed to investigate the application of the objective clinical human reliability analysis (OCHRA) technique for assessing technical performance of laparoscopic rectal surgery and to explore the validity and reliability of this technique. Laparoscopic rectal cancer resection operations were described in the format of a hierarchical task analysis. Potential technical errors were defined. The OCHRA technique was used to identify technical errors enacted in videos of twenty consecutive laparoscopic rectal cancer resection operations from a single site. The procedural task, spatial location, and circumstances of all identified errors were logged. Clinical validity was assessed through correlation with clinical outcomes; reliability was assessed by test-retest. A total of 335 execution errors were identified, with a median of 15 per operation. More errors were observed during pelvic tasks than during abdominal tasks … demonstrating the applicability of OCHRA for assessing the technical performance of laparoscopic rectal surgery.

  16. Fragrance contact allergens in 5588 cosmetic products identified through a novel smartphone application

    DEFF Research Database (Denmark)

    Bennike, N H; Oturai, N B; Müller, S

    2018-01-01

    … in leave-on and 100 ppm or above in wash-off cosmetics. OBJECTIVE: To examine exposure, based on ingredient labelling, to the 26 fragrances in a sample of 5588 fragranced cosmetic products. METHODS: The investigated products were identified through a novel, non-profit smartphone application (app), designed to provide...

  17. Wireless ad hoc and sensor networks management, performance, and applications

    CERN Document Server

    He, Jing

    2013-01-01

    Although wireless sensor networks (WSNs) have been employed across a wide range of applications, there are very few books that emphasize the algorithm description, performance analysis, and applications of network management techniques in WSNs. Filling this need, Wireless Ad Hoc and Sensor Networks: Management, Performance, and Applications summarizes not only traditional and classical network management techniques, but also state-of-the-art techniques in this area. The articles presented are expository, but scholarly in nature, including the appropriate history background, a review of current

  18. 40 CFR 141.723 - Requirements to respond to significant deficiencies identified in sanitary surveys performed by EPA.

    Science.gov (United States)

    2010-07-01

    ... deficiencies identified in sanitary surveys performed by EPA. 141.723 Section 141.723 Protection of Environment... performed by EPA, systems must respond in writing to significant deficiencies identified in sanitary survey... will address significant deficiencies noted in the survey. (d) Systems must correct significant...

  19. Patient’s Cross-border Mobility Directive: Application, Performance and Perceptions Two Years after Transposition

    Directory of Open Access Journals (Sweden)

    Riedel Rafał

    2016-10-01

    This paper seeks to analyse the directive on the application of patients' rights in cross-border healthcare. Two years after its transposition, it is time for a first evaluation of its application, performance and perception. The analysis consists of three major elements: a reconstruction of the legal scope and subject matter of the new legislation; the conclusions of the evaluative reports monitoring its implementation and performance; and public opinion polls revealing EU citizens' perception of its details. These three components combined deliver a picture of the state of play of pan-European cross-border patient mobility. The bottom-line conclusions do not support the supposition, present in some earlier literature on patients' cross-border mobility, that the directive has a transformative potential leading towards the creation of a truly competitive pan-European medical market. After two years of its operation, no increase in patient mobility across the EU's internal borders has been observed. As regards the future, only some weak symptoms have been identified, and they may result in intensified cross-border mobility for healthcare.

  20. Optimisation of multiplet identifier processing on a PLAYSTATION® 3

    Science.gov (United States)

    Hattori, Masami; Mizuno, Takashi

    2010-02-01

    To enable high-performance computing (HPC) for applications with large datasets using a Sony® PLAYSTATION® 3 (PS3™) video game console, we configured a hybrid system consisting of a Windows® PC and a PS3™. To validate this system, we implemented the real-time multiplet identifier (RTMI) application, which identifies multiplets of microearthquakes in terms of the similarity of their waveforms. The cross-correlation computation, which is a core algorithm of the RTMI application, was optimised for the PS3™ platform, while the rest of the computation, including data input and output, remained on the PC. With this configuration, the core part of the algorithm ran 69 times faster than the original program, accelerating total computation speed more than five times. As a result, the system processed up to 2100 total microseismic events, whereas the original implementation had a limit of 400 events. These results indicate that this system enables high-performance computing for large datasets using the PS3™, as long as data transfer time is negligible compared with computation time.
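
    The offloaded core is a waveform cross-correlation; a minimal NumPy sketch of the similarity measure follows (the synthetic waveforms and the 0.9 threshold are illustrative, not the RTMI implementation):

      import numpy as np

      def max_norm_xcorr(a, b):
          # Peak of the normalised cross-correlation over all lags (max 1.0).
          a = (a - a.mean()) / (a.std() * len(a))
          b = (b - b.mean()) / b.std()
          return np.correlate(a, b, mode="full").max()

      t = np.linspace(0.0, 1.0, 500)
      w1 = np.sin(40 * t) * np.exp(-3 * t)                # template event
      w2 = np.roll(w1, 25) + 0.05 * np.random.randn(500)  # shifted noisy repeat

      # Multiplet criterion: similarity above a chosen threshold, e.g. 0.9.
      print("similarity:", max_norm_xcorr(w1, w2))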

  1. Application of multi-locus analytical methods to identify interacting loci in case-control studies.

    NARCIS (Netherlands)

    Vermeulen, S.; Heijer, M. den; Sham, P.; Knight, J.

    2007-01-01

    To identify interacting loci in genetic epidemiological studies, the application of multi-locus methods of analysis is warranted. Several more advanced classification methods have been developed in recent years, including multiple logistic regression, sum statistics, logic regression, and the
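
    As a minimal sketch of the first of these methods, multiple logistic regression with a two-locus interaction term (assuming statsmodels; the simulated genotypes and effect sizes are illustrative):

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(1)
      n = 500
      df = pd.DataFrame({
          "snp1": rng.integers(0, 3, n),  # genotypes coded 0/1/2
          "snp2": rng.integers(0, 3, n),
      })
      # Simulate case status with main effects plus an interaction effect.
      logit_p = -1.0 + 0.2 * df.snp1 + 0.1 * df.snp2 + 0.5 * df.snp1 * df.snp2
      df["case"] = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit_p))).astype(int)

      # "snp1 * snp2" expands to both main effects plus the snp1:snp2 term.
      model = smf.logit("case ~ snp1 * snp2", data=df).fit(disp=False)
      print(model.params)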

  2. Calibration Modeling Methodology to Optimize Performance for Low Range Applications

    Science.gov (United States)

    McCollum, Raymond A.; Commo, Sean A.; Parker, Peter A.

    2010-01-01

    Calibration is a vital process in characterizing the performance of an instrument in an application environment and seeks to obtain acceptable accuracy over the entire design range. Often, project requirements specify a maximum total measurement uncertainty, expressed as a percent of full-scale. However, in some applications enhanced performance is sought at the low end of the range; expressing the accuracy as a percent of reading should therefore be considered as a modeling strategy. For example, it is common to want to use a force balance in multiple facilities or regimes, often well below its designed full-scale capacity. This paper presents a general statistical methodology for optimizing calibration mathematical models based on a percent-of-reading accuracy requirement, which has broad application in all types of transducer applications where low-range performance is required. A case study illustrates the proposed methodology for the Mars Entry Atmospheric Data System, which employs seven strain-gage-based pressure transducers mounted on the heatshield of the Mars Science Laboratory mission.
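
    A minimal sketch of the idea (assuming NumPy; the synthetic transducer data are illustrative): weighting the fit residuals by the reciprocal of the reading makes the least-squares objective penalise relative rather than absolute error, shifting accuracy toward the low range.

      import numpy as np

      applied = np.linspace(1.0, 100.0, 25)  # reference loads
      reading = 1.02 * applied + 0.3 + np.random.normal(0.0, 0.02 * applied)

      # np.polyfit minimises sum(w_i^2 * r_i^2): w = 1/reading targets
      # percent-of-reading error; the unweighted fit targets full-scale error.
      pct_fit = np.polyfit(reading, applied, 2, w=1.0 / reading)
      fs_fit = np.polyfit(reading, applied, 2)

      for name, fit in [("1/y-weighted", pct_fit), ("unweighted", fs_fit)]:
          corrected = np.polyval(fit, reading)
          err = 100.0 * np.max(np.abs(corrected - applied) / applied)
          print(name, "max %% of reading error: %.3f" % err)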

  3. Bimanual Psychomotor Performance in Neurosurgical Resident Applicants Assessed Using NeuroTouch, a Virtual Reality Simulator.

    Science.gov (United States)

    Winkler-Schwartz, Alexander; Bajunaid, Khalid; Mullah, Muhammad A S; Marwa, Ibrahim; Alotaibi, Fahad E; Fares, Jawad; Baggiani, Marta; Azarnoush, Hamed; Zharni, Gmaan Al; Christie, Sommer; Sabbagh, Abdulrahman J; Werthner, Penny; Del Maestro, Rolando F

    Current selection methods for neurosurgical residents fail to include objective measurements of bimanual psychomotor performance. Advancements in computer-based simulation provide opportunities to assess cognitive and psychomotor skills in surgically naive populations during complex simulated neurosurgical tasks in risk-free environments. This pilot study was designed to answer 3 questions: (1) What are the differences in bimanual psychomotor performance among neurosurgical residency applicants using NeuroTouch? (2) Are there exceptionally skilled medical students in the applicant cohort? and (3) Is there an influence of previous surgical exposure on surgical performance? Participants were instructed to remove 3 simulated brain tumors with identical visual appearance, stiffness, and random bleeding points. Validated tier 1, tier 2, and advanced tier 2 metrics were used to assess bimanual psychomotor performance. Demographic data included weeks of neurosurgical elective and prior operative exposure. This pilot study was carried out at the McGill Neurosurgical Simulation Research and Training Center immediately following neurosurgical residency interviews at McGill University, Montreal, Canada. All 17 medical students interviewed were asked to participate, of whom 16 agreed. Performances clustered into definable top, middle, and bottom groups, with significant differences for all metrics. Increased time spent playing music, higher applicant self-evaluated technical skills, high self-ratings of confidence, and more skin closures statistically influenced performance on univariate analysis. In multivariate analysis, a trend was seen linking both higher self-rated operating room confidence and more weeks of neurosurgical exposure to increased blood loss. Simulation technology identifies neurosurgical residency applicants with differing levels of technical ability. These results provide information for longitudinal studies being developed on the

  4. An Application-Based Performance Evaluation of NASA's Nebula Cloud Computing Platform

    Science.gov (United States)

    Saini, Subhash; Heistand, Steve; Jin, Haoqiang; Chang, Johnny; Hood, Robert T.; Mehrotra, Piyush; Biswas, Rupak

    2012-01-01

    The high performance computing (HPC) community has shown tremendous interest in exploring cloud computing, given its high potential. In this paper, we examine the feasibility, performance, and scalability of production-quality scientific and engineering applications of interest to NASA on NASA's cloud computing platform, called Nebula, hosted at Ames Research Center. This work presents a comprehensive evaluation of Nebula using NUTTCP, HPCC, NPB, I/O, and MPI function benchmarks as well as four applications representative of the NASA HPC workload. Specifically, we compare Nebula performance on some of these benchmarks and applications to that of NASA's Pleiades supercomputer, a traditional HPC system. We also investigate the impact of virtIO and jumbo frames on interconnect performance. Overall results indicate that on Nebula (i) virtIO and jumbo frames improve network bandwidth by a factor of 5x, (ii) there is a significant virtualization layer overhead of about 10% to 25%, (iii) write performance is lower by a factor of 25x, (iv) latency for short MPI messages is very high, and (v) overall performance is 15% to 48% lower than that on Pleiades for NASA HPC applications. We also comment on the usability of the cloud platform.

  5. Can 3D ultrasound identify trochlea dysplasia in newborns? Evaluation and applicability of a technique

    Energy Technology Data Exchange (ETDEWEB)

    Kohlhof, Hendrik, E-mail: Hendrik.Kohlhof@ukb.uni-bonn.de [Clinic for Orthopedics and Trauma Surgery, University Hospital Bonn, Sigmund-Freud-Str. 25, 53127 Bonn (Germany); Heidt, Christoph, E-mail: Christoph.heidt@kispi.uzh.ch [Department of Orthopedic Surgery, University Children' s Hospital Zurich, Steinwiesstrasse 74, 8032 Switzerland (Switzerland); Bähler, Alexandrine, E-mail: Alexandrine.baehler@insel.ch [Department of Pediatric Radiology, University Children' s Hospital Berne, Freiburgstrasse 18, 3010 Berne (Switzerland); Kohl, Sandro, E-mail: sandro.kohl@insel.ch [Department of Orthopedic Surgery, University Hospital Berne, Freiburgstrasse 18, 3010 Berne (Switzerland); Gravius, Sascha, E-mail: sascha.gravius@ukb.uni-bonn.de [Clinic for Orthopedics and Trauma Surgery, University Hospital Bonn, Sigmund-Freud-Str. 25, 53127 Bonn (Germany); Friedrich, Max J., E-mail: Max.Friedrich@ukb.uni-bonn.de [Clinic for Orthopedics and Trauma Surgery, University Hospital Bonn, Sigmund-Freud-Str. 25, 53127 Bonn (Germany); Ziebarth, Kai, E-mail: kai.ziebarth@insel.ch [Department of Orthopedic Surgery, University Hospital Berne, Freiburgstrasse 18, 3010 Berne (Switzerland); Stranzinger, Enno, E-mail: Enno.Stranzinger@insel.ch [Department of Pediatric Radiology, University Children' s Hospital Berne, Freiburgstrasse 18, 3010 Berne (Switzerland)

    2015-06-15

    Highlights: • We evaluated a possible screening method for trochlea dysplasia. • 3D ultrasound was used to perform the measurements on standardized axial planes. • The evaluation of the technique showed comparable results to other studies. • This technique may be used as a screening technique as it is quick and easy to perform. - Abstract: Femoro-patellar dysplasia is considered a significant risk factor for patellar instability. Different studies suggest that the shape of the trochlea is already developed in early childhood. Therefore early identification of a dysplastic configuration might be relevant information for the treating physician. An easily applicable routine screening of the trochlea is not yet available. The purpose of this study was to establish and evaluate a screening method for femoro-patellar dysplasia using 3D ultrasound. From 2012 to 2013 we prospectively imaged 160 consecutive femoro-patellar joints in 80 newborns from the 36th to 61st gestational week that underwent a routine hip sonography (Graf). All ultrasounds were performed by a pediatric radiologist with only minimal additional time beyond the routine hip ultrasound. In 30° flexion of the knee, axial, coronal, and sagittal reformats were used to standardize a reconstructed axial plane through the femoral condyle and the mid-patella. The sulcus angle, the lateral-to-medial facet ratio of the trochlea and the shape of the patella (Wiberg Classification) were evaluated. In all examinations reconstruction of the standardized axial plane was achieved, the mean trochlea angle was 149.1° (SD 4.9°), the lateral-to-medial facet ratio of the trochlea was 1.3 (SD 0.22), and a Wiberg type I patella was found in 95% of the newborns. No statistical difference was detected between boys and girls. Using standardized reconstructions of the axial plane allows measurements to be made with lower operator dependency and higher accuracy in a short time. Therefore 3D ultrasound is an easy

  6. Can 3D ultrasound identify trochlea dysplasia in newborns? Evaluation and applicability of a technique

    International Nuclear Information System (INIS)

    Kohlhof, Hendrik; Heidt, Christoph; Bähler, Alexandrine; Kohl, Sandro; Gravius, Sascha; Friedrich, Max J.; Ziebarth, Kai; Stranzinger, Enno

    2015-01-01

    Highlights: • We evaluated a possible screening method for trochlea dysplasia. • 3D ultrasound was used to perform the measurements on standardized axial planes. • The evaluation of the technique showed comparable results to other studies. • This technique may be used as a screening technique as it is quick and easy to perform. - Abstract: Femoro-patellar dysplasia is considered a significant risk factor for patellar instability. Different studies suggest that the shape of the trochlea is already developed in early childhood. Therefore early identification of a dysplastic configuration might be relevant information for the treating physician. An easily applicable routine screening of the trochlea is not yet available. The purpose of this study was to establish and evaluate a screening method for femoro-patellar dysplasia using 3D ultrasound. From 2012 to 2013 we prospectively imaged 160 consecutive femoro-patellar joints in 80 newborns from the 36th to 61st gestational week that underwent a routine hip sonography (Graf). All ultrasounds were performed by a pediatric radiologist with only minimal additional time beyond the routine hip ultrasound. In 30° flexion of the knee, axial, coronal, and sagittal reformats were used to standardize a reconstructed axial plane through the femoral condyle and the mid-patella. The sulcus angle, the lateral-to-medial facet ratio of the trochlea and the shape of the patella (Wiberg Classification) were evaluated. In all examinations reconstruction of the standardized axial plane was achieved, the mean trochlea angle was 149.1° (SD 4.9°), the lateral-to-medial facet ratio of the trochlea was 1.3 (SD 0.22), and a Wiberg type I patella was found in 95% of the newborns. No statistical difference was detected between boys and girls. Using standardized reconstructions of the axial plane allows measurements to be made with lower operator dependency and higher accuracy in a short time. Therefore 3D ultrasound is an easy

  7. Identifying key performance indicators for nursing and midwifery care using a consensus approach.

    Science.gov (United States)

    McCance, Tanya; Telford, Lorna; Wilson, Julie; Macleod, Olive; Dowd, Audrey

    2012-04-01

    The aim of this study was to gain consensus on key performance indicators that are appropriate and relevant for nursing and midwifery practice in the current policy context. There is continuing demand to demonstrate effectiveness and efficiency in health and social care and to communicate this at boardroom level. Whilst there is substantial literature on the use of clinical indicators and nursing metrics, there is less evidence relating to indicators that reflect the patient experience. A consensus approach was used to identify relevant key performance indicators. A nominal group technique was used comprising two stages: a workshop involving all grades of nursing and midwifery staff in two HSC trusts in Northern Ireland (n = 50); followed by a regional Consensus Conference (n = 80). During the workshop, potential key performance indicators were identified. This was used as the basis for the Consensus Conference, which involved two rounds of consensus. Analysis was based on aggregated scores that were then ranked. Stage one identified 38 potential indicators and stage two prioritised the eight top-ranked indicators as a core set for nursing and midwifery. The relevance and appropriateness of these indicators were confirmed with nurses and midwives working in a range of settings and from the perspective of service users. The eight indicators identified do not conform to the majority of other nursing metrics generally reported in the literature. Furthermore, they are strategically aligned to work on the patient experience and are reflective of the fundamentals of nursing and midwifery practice, with the focus on person-centred care. Nurses and midwives have a significant contribution to make in determining the extent to which these indicators are achieved in practice. Furthermore, measurement of such indicators provides an opportunity to evidence the unique impact of nursing/midwifery care on the patient experience. © 2011 Blackwell Publishing Ltd.
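
    The second-round scoring described above reduces to a simple aggregate-and-rank step; a minimal sketch, with hypothetical indicator names and scores (the study's actual indicators are not listed in this record):

        from collections import defaultdict

        # (indicator, score) pairs from consensus-conference participants (hypothetical)
        votes = [("dignity respected", 9), ("pain managed", 8),
                 ("dignity respected", 8), ("time to respond to call", 6),
                 ("pain managed", 9)]

        totals = defaultdict(int)
        for indicator, score in votes:
            totals[indicator] += score          # aggregate scores per indicator

        core_set = sorted(totals, key=totals.get, reverse=True)[:8]  # top-ranked set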

  8. Automatic Energy Schemes for High Performance Applications

    Energy Technology Data Exchange (ETDEWEB)

    Sundriyal, Vaibhav [Iowa State Univ., Ames, IA (United States)

    2013-01-01

    Although high-performance computing traditionally focuses on the efficient execution of large-scale applications, both energy and power have become critical concerns when approaching exascale. Drastic increases in the power consumption of supercomputers significantly affect their operating costs and failure rates. In modern microprocessor architectures, equipped with dynamic voltage and frequency scaling (DVFS) and CPU clock modulation (throttling), the power consumption may be controlled in software. Additionally, the network interconnect, such as InfiniBand, may be exploited to maximize energy savings, while application performance loss and frequency-switching overheads must be carefully balanced. This work first studies two important collective communication operations, all-to-all and allgather, and proposes energy-saving strategies on a per-call basis. Next, it targets point-to-point communications to group them into phases and apply frequency scaling to them to save energy by exploiting the architectural and communication stalls. Finally, it proposes an automatic runtime system which combines both collective and point-to-point communications into phases, and applies throttling to them in addition to DVFS to maximize energy savings. The experimental results are presented for NAS parallel benchmark problems as well as for realistic parallel electronic structure calculations performed by the widely used quantum chemistry package GAMESS. Close to the maximum energy savings were obtained with a substantially low performance loss on the given platform.
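
    As a rough sketch of how per-phase frequency scaling can be driven from user space on Linux (the sysfs path, the userspace governor, and root privileges are all assumptions here; this is not the thesis' runtime system):

        CPUFREQ = "/sys/devices/system/cpu/cpu{cpu}/cpufreq/scaling_setspeed"

        def set_frequency(cpu: int, khz: int) -> None:
            # requires the "userspace" governor and root privileges
            with open(CPUFREQ.format(cpu=cpu), "w") as f:
                f.write(str(khz))

        def comm_phase_at_low_freq(comm_phase, cpu=0, low=1_200_000, high=2_400_000):
            set_frequency(cpu, low)        # the core mostly stalls on the network here
            try:
                comm_phase()               # e.g., an all-to-all or allgather call
            finally:
                set_frequency(cpu, high)   # restore before computation resumes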

  9. Instruction-level performance modeling and characterization of multimedia applications

    Energy Technology Data Exchange (ETDEWEB)

    Luo, Y. [Los Alamos National Lab., NM (United States). Scientific Computing Group; Cameron, K.W. [Louisiana State Univ., Baton Rouge, LA (United States). Dept. of Computer Science

    1999-06-01

    One of the challenges for characterizing and modeling realistic multimedia applications is the lack of access to source codes. On-chip performance counters effectively resolve this problem by monitoring run-time behaviors at the instruction level. This paper presents a novel technique of characterizing and modeling workloads at the instruction level for realistic multimedia applications using hardware performance counters. A variety of instruction counts are collected from some multimedia applications, such as RealPlayer, GSM Vocoder, MPEG encoder/decoder, and speech synthesizer. These instruction counts can be used to form a set of abstract characteristic parameters directly related to a processor's architectural features. Based on microprocessor architectural constraints and these calculated abstract parameters, the architectural performance bottleneck for a specific application can be estimated. Meanwhile, the bottleneck estimation can provide suggestions about viable architectural/functional improvement for certain workloads. The biggest advantage of this new characterization technique is a better understanding of processor utilization efficiency and architectural bottleneck for each application. This technique also provides predictive insight into future architectural enhancements and their effect on current codes. In this paper the authors also attempt to model architectural effect on processor utilization without memory influence. They derive formulas for calculating CPI_0, the CPI without memory effect, and they quantify utilization of architectural parameters. These equations are architecturally diagnostic and predictive in nature. Results provide promise in code characterization, and empirical/analytical modeling.
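
    The decomposition referred to here follows the standard form used in counter-based performance modeling; as a hedged, generic sketch (not necessarily the authors' exact equations):

        \mathrm{CPI} = \mathrm{CPI}_0 + \sum_i m_i \, p_i

    where $m_i$ is the number of misses per instruction at memory level $i$ and $p_i$ the corresponding miss penalty in cycles, so measuring the overall CPI and the $m_i$ with hardware counters lets $\mathrm{CPI}_0$, the memory-free component, be estimated by subtraction.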

  10. Identifying context factors explaining physician's low performance in communication assessment: an explorative study in general practice.

    NARCIS (Netherlands)

    Essers, G.; Dulmen, S. van; Weel, C. van; Vleuten, C. van der; Kramer, A.

    2011-01-01

    BACKGROUND: Communication is a key competence for health care professionals. Analysis of registrar and GP communication performance in daily practice, however, suggests a suboptimal application of communication skills. The influence of context factors could reveal why communication performance

  11. Review of the performance assessment in the WIPP draft compliance application

    International Nuclear Information System (INIS)

    Lee, W.W.L.

    1996-01-01

    On March 31, 1995, the U.S. Department of Energy (USDOE) filed a draft compliance certification application (DCCA) with the U.S. Environmental Protection Agency (USEPA) to show the Waste Isolation Pilot Plant's compliance with the USEPA's environmental standards for disposal of high-level and transuranic waste. Demonstration of compliance is by a performance assessment. This paper is an early review of the performance assessment in the draft application by the Environmental Evaluation Group, an oversight group. The performance assessment in the draft application is incomplete. Not all relevant scenarios have been analyzed. The calculation of potential consequences often does not use experimental data but rather estimates by workers developing additional data. The final compliance application, scheduled for October 1996, needs to consider additional scenarios and be fully based on experimental data.

  12. Using Modeling and Simulation to Analyze Application and Network Performance at the Radioactive Waste and Nuclear Material Disposition Facility

    International Nuclear Information System (INIS)

    LIFE, ROY A.; MAESTAS, JOSEPH H.; BATEMAN, DENNIS B.

    2003-01-01

    Telecommunication services customers at the Radioactive Waste and Nuclear Material Disposition Facility (RWNMDF) have endured regular service outages that seem to be associated with a custom Microsoft Access database. In addition, the same customers have noticed periods when application response times are noticeably worse than at others. To the customers, the two events appear to be correlated. Although many network design activities can be accomplished using trial-and-error methods, there are as many, if not more, occasions where computerized analysis is necessary to verify the benefits of implementing one design alternative versus another. This is particularly true when network design is performed with application flows and response times in mind. More often than not, it is unclear whether upgrading certain aspects of the network will provide sufficient benefit to justify the corresponding costs, and network modeling tools can be used to help staff make these decisions. This report summarizes our analysis of the situation at the RWNMDF, in which computerized analysis was used to accomplish four objectives: (1) identify the source of the problem; (2) identify areas where improvements make the most sense; (3) evaluate various scenarios ranging from upgrading the network infrastructure, installing an additional fiber trunk as a way to improve local network performance, and relocating the RWNMDF database onto corporate servers; and (4) demonstrate a methodology for network design using actual application response times to predict, select, and implement the design alternatives that provide the best performance and cost benefits.

  13. Experiential knowledge of expert coaches can help identify informational constraints on performance of dynamic interceptive actions.

    Science.gov (United States)

    Greenwood, Daniel; Davids, Keith; Renshaw, Ian

    2014-01-01

    Coordination of dynamic interceptive movements is predicated on cyclical relations between an individual's actions and information sources from the performance environment. To identify dynamic informational constraints, which are interwoven with individual and task constraints, coaches' experiential knowledge provides a complementary source to support empirical understanding of performance in sport. In this study, 15 expert coaches from 3 sports (track and field, gymnastics and cricket) participated in a semi-structured interview process to identify potential informational constraints which they perceived to regulate action during run-up performance. Expert coaches' experiential knowledge revealed multiple information sources which may constrain performance adaptations in such locomotor pointing tasks. In addition to the locomotor pointing target, coaches' knowledge highlighted two other key informational constraints: vertical reference points located near the locomotor pointing target and a check mark located prior to the locomotor pointing target. This study highlights opportunities for broadening the understanding of perception and action coupling processes, and the identified information sources warrant further empirical investigation as potential constraints on athletic performance. Integration of experiential knowledge of expert coaches with theoretically driven empirical knowledge represents a promising avenue to drive future applied science research and pedagogical practice.

  14. How to Identify Possible Applications of Product Configuration Systems in Engineer-to-Order Companies

    DEFF Research Database (Denmark)

    Kristjansdottir, Katrin; Shafiee, Sara; Hvam, Lars

    2017-01-01

    Product configuration systems (PCS) play an essential role when providing customised and engineered products efficiently. Literature in the field describes numerous strategies to develop PCS but neglects to identify different application areas. This topic is particularly important for engineer-to-order (ETO) companies that support gradual implementation of PCS due to large product variety and, several times, higher complexity of products and processes. The overall PCS process can thereby be broken down, and the risk minimised. This paper provides a three-step framework to identify different…

  15. Assessing Confidence in Performance Assessments Using an Evidence Support Logic Methodology: An Application of Tesla

    International Nuclear Information System (INIS)

    Egan, M.; Paulley, A.; Lehman, L.; Lowe, J.; Rochette, E.; Baker, St.

    2009-01-01

    The assessment of uncertainties and their implications is a key requirement when undertaking performance assessment (PA) of radioactive waste facilities. Decisions based on the outcome of such assessments become translated into judgments about confidence in the information they provide. This confidence, in turn, depends on uncertainties in the underlying evidence. Even if there is a large amount of information supporting an assessment, it may be only partially relevant, incomplete or less than completely reliable. In order to develop a measure of confidence in the outcome, sources of uncertainty need to be identified and adequately addressed in the development of the PA, or in any overarching strategic decision-making processes. This paper describes a trial application of the technique of Evidence Support Logic (ESL), which has been designed for application in support of 'high stakes' decisions, where important aspects of system performance are subject to uncertainty. The aims of ESL are to identify the amount of uncertainty or conflict associated with evidence relating to a particular decision, and to guide understanding of how evidence combines to support confidence in judgments. Elicitation techniques are used to enable participants in the process to develop a logical hypothesis model that best represents the relationships between different sources of evidence to the proposition under examination. The aim is to identify key areas of subjectivity and other sources of potential bias in the use of evidence (whether for or against the proposition) to support judgments of confidence. Propagation algorithms are used to investigate the overall implications of the logic according to the strength of the underlying evidence and associated uncertainties. (authors)

  16. Performance of the Lot Quality Assurance Sampling Method Compared to Surveillance for Identifying Inadequately-performing Areas in Matlab, Bangladesh

    OpenAIRE

    Bhuiya, Abbas; Hanifi, S.M.A.; Roy, Nikhil; Streatfield, P. Kim

    2007-01-01

    This paper compared the performance of the lot quality assurance sampling (LQAS) method in identifying inadequately-performing health work-areas with that of using health and demographic surveillance system (HDSS) data and examined the feasibility of applying the method by field-level programme supervisors. The study was carried out in Matlab, the field site of ICDDR,B, where a HDSS has been in place for over 30 years. The LQAS method was applied in 57 work-areas of community health workers i...
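
    The LQAS decision rule itself is a small binomial calculation; a generic sketch (the classic n = 19 design shown here is illustrative, not necessarily the parameters used in the Matlab study):

        from scipy.stats import binom

        def lqas_risks(n, d, p_hi, p_lo):
            """Classify an area as adequately performing if at least d of n
            sampled respondents show the desired outcome; return the two
            misclassification risks."""
            provider_risk = binom.cdf(d - 1, n, p_hi)        # good area rejected
            consumer_risk = 1.0 - binom.cdf(d - 1, n, p_lo)  # poor area accepted
            return provider_risk, consumer_risk

        # e.g., the widely used n = 19, d = 13 rule for an 80% coverage target
        # against a 50% lower threshold keeps both risks under roughly 10%
        print(lqas_risks(19, 13, 0.80, 0.50))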

  17. Common genetic variants associated with cognitive performance identified using the proxy-phenotype method

    NARCIS (Netherlands)

    C.A. Rietveld (Niels); T. Esko (Tõnu); G. Davies (Gail); T.H. Pers (Tune); P. Turley (Patrick); B. Benyamin (Beben); C.F. Chabris (Christopher F.); V. Emilsson (Valur); A.D. Johnson (Andrew); J.J. Lee (James J.); C. de Leeuw (Christiaan); R.E. Marioni (Riccardo); S.E. Medland (Sarah Elizabeth); M. Miller (Mike); O. Rostapshova (Olga); S.J. van der Lee (Sven); A.A.E. Vinkhuyzen (Anna A.); N. Amin (Najaf); D. Conley (Dalton); J. Derringer; C.M. van Duijn (Cornelia); R.S.N. Fehrmann (Rudolf); L. Franke (Lude); E.L. Glaeser (Edward L.); N.K. Hansell (Narelle); C. Hayward (Caroline); W.G. Iacono (William); C.A. Ibrahim-Verbaas (Carla); V.W.V. Jaddoe (Vincent); J. Karjalainen (Juha); D. Laibson (David); P. Lichtenstein (Paul); D.C. Liewald (David C.); P.K. Magnusson (Patrik); N.G. Martin (Nicholas); M. McGue (Matt); G. Mcmahon (George); N.L. Pedersen (Nancy); S. Pinker (Steven); D.J. Porteous (David J.); D. Posthuma (Danielle); F. Rivadeneira Ramirez (Fernando); B.H. Smith (Blair H.); J.M. Starr (John); H.W. Tiemeier (Henning); N.J. Timpson (Nicholas J.); M. Trzaskowski (Maciej); A.G. Uitterlinden (André); F.C. Verhulst (Frank); M.E. Ward (Mary); M.J. Wright (Margaret); G.D. Smith; I.J. Deary (Ian J.); M. Johannesson (Magnus); R. Plomin (Robert); P.M. Visscher (Peter); D.J. Benjamin (Daniel J.); D. Cesarini (David); P.D. Koellinger (Philipp)

    2014-01-01

    We identify common genetic variants associated with cognitive performance using a two-stage approach, which we call the proxy-phenotype method. First, we conduct a genome-wide association study of educational attainment in a large sample (n = 106,736), which produces a set of 69

  18. Performance of Student Software Development Teams: The Influence of Personality and Identifying as Team Members

    Science.gov (United States)

    Monaghan, Conal; Bizumic, Boris; Reynolds, Katherine; Smithson, Michael; Johns-Boast, Lynette; van Rooy, Dirk

    2015-01-01

    One prominent approach in the exploration of the variations in project team performance has been to study two components of the aggregate personalities of the team members: conscientiousness and agreeableness. A second line of research, known as self-categorisation theory, argues that identifying as team members and the team's performance norms…

  19. Identifying Knowledge Gaps in Clinicians Who Evaluate and Treat Vocal Performing Artists in College Health Settings.

    Science.gov (United States)

    McKinnon-Howe, Leah; Dowdall, Jayme

    2018-05-01

    The goal of this study was to identify knowledge gaps in clinicians who evaluate and treat performing artists for illnesses and injuries that affect vocal function in college health settings. This pilot study utilized a web-based cross-sectional survey design incorporating common clinical scenarios to test knowledge of evaluation and management strategies in the vocal performing artist. A web-based survey was administered to a purposive sample of 28 clinicians to identify the approach utilized to evaluate and treat vocal performing artists in college health settings, and factors that might affect knowledge gaps and influence referral patterns to voice specialists. Twenty-eight clinicians were surveyed, with 36% of respondents incorrectly identifying appropriate vocal hygiene measures, 56% of respondents failing to identify symptoms of vocal fold hemorrhage, 84% failing to identify other indications for referral to a voice specialist, 96% of respondents acknowledging unfamiliarity with the Voice Handicap Index and the Singers Voice Handicap Index, and 68% acknowledging unfamiliarity with the Reflux Symptom Index. The data elucidated specific knowledge gaps in college health providers who are responsible for evaluating and treating common illnesses that affect vocal function, and triaging and referring students experiencing symptoms of potential vocal emergencies. Future work is needed to improve the standard of care for this population. Copyright © 2018 The Voice Foundation. Published by Elsevier Inc. All rights reserved.

  20. Simulation-based Assessment to Reliably Identify Key Resident Performance Attributes.

    Science.gov (United States)

    Blum, Richard H; Muret-Wagstaff, Sharon L; Boulet, John R; Cooper, Jeffrey B; Petrusa, Emil R; Baker, Keith H; Davidyuk, Galina; Dearden, Jennifer L; Feinstein, David M; Jones, Stephanie B; Kimball, William R; Mitchell, John D; Nadelberg, Robert L; Wiser, Sarah H; Albrecht, Meredith A; Anastasi, Amanda K; Bose, Ruma R; Chang, Laura Y; Culley, Deborah J; Fisher, Lauren J; Grover, Meera; Klainer, Suzanne B; Kveraga, Rikante; Martel, Jeffrey P; McKenna, Shannon S; Minehart, Rebecca D; Mitchell, John D; Mountjoy, Jeremi R; Pawlowski, John B; Pilon, Robert N; Shook, Douglas C; Silver, David A; Warfield, Carol A; Zaleski, Katherine L

    2018-04-01

    Obtaining reliable and valid information on resident performance is critical to patient safety and training program improvement. The goals were to characterize important anesthesia resident performance gaps that are not typically evaluated, and to further validate scores from a multiscenario simulation-based assessment. Seven high-fidelity scenarios reflecting core anesthesiology skills were administered to 51 first-year residents (CA-1s) and 16 third-year residents (CA-3s) from three residency programs. Twenty trained attending anesthesiologists rated resident performances using a seven-point behaviorally anchored rating scale for five domains: (1) formulate a clear plan, (2) modify the plan under changing conditions, (3) communicate effectively, (4) identify performance improvement opportunities, and (5) recognize limits. A second rater assessed 10% of encounters. Scores and variances for each domain, each scenario, and the total were compared. Low domain ratings (1, 2) were examined in detail. Interrater agreement was 0.76; reliability of the seven-scenario assessment was r = 0.70. CA-3s had a significantly higher average total score (4.9 ± 1.1 vs. 4.6 ± 1.1, P = 0.01, effect size = 0.33). CA-3s significantly outscored CA-1s for five of seven scenarios and domains 1, 2, and 3. CA-1s had a significantly higher proportion of worrisome ratings than CA-3s (chi-square = 24.1, P < 0.01, effect size = 1.50). Ninety-eight percent of residents rated the simulations more educational than an average day in the operating room. Sensitivity of the assessment to CA-1 versus CA-3 performance differences for most scenarios and domains supports validity. No differences, by experience level, were detected for two domains associated with reflective practice. Smaller score variances for CA-3s likely reflect a training effect; however, worrisome performance scores for both CA-1s and CA-3s suggest room for improvement.

  1. Medical School Applicant Characteristics Associated With Performance in Multiple Mini-Interviews Versus Traditional Interviews: A Multi-Institutional Study.

    Science.gov (United States)

    Henderson, Mark C; Kelly, Carolyn J; Griffin, Erin; Hall, Theodore R; Jerant, Anthony; Peterson, Ellena M; Rainwater, Julie A; Sousa, Francis J; Wofsy, David; Franks, Peter

    2017-10-31

    To examine applicant characteristics associated with multiple mini-interview (MMI) or traditional interview (TI) performance at five California public medical schools. Of the five California Longitudinal Evaluation of Admissions Practices (CA-LEAP) consortium schools, three used TIs and two used MMIs. Schools provided the following retrospective data on all 2011-2013 admissions cycle interviewees: age, gender, race/ethnicity (under-represented in medicine [UIM] or not), self-identified disadvantaged (DA) status, undergraduate GPA, Medical College Admission Test (MCAT) score, and interview score (standardized as z-score, mean = 0, SD = 1). Adjusted linear regression analyses, stratified by interview type, examined associations with interview performance. The 4,993 applicants who completed 7,516 interviews included 931 (18.6%) UIM and 962 (19.3%) DA individuals; 3,226 (64.6%) had one interview. Mean age was 24.4 (SD = 2.7); mean GPA and MCAT score were 3.72 (SD = 0.22) and 33.6 (SD = 3.7), respectively. Older age, female gender, and number of prior interviews were associated with better performance on both MMIs and TIs. Higher GPA was associated with lower MMI scores (z-score, per unit GPA = -0.26, 95% CI [-0.45, -0.06]), but unrelated to TI scores. DA applicants had higher TI scores (z-score = 0.17, 95% CI [0.07, 0.28]), but lower MMI scores (z-score = -0.18, 95% CI [-0.28, -0.08]) than non-DA applicants. Neither UIM status nor MCAT score were associated with interview performance. These findings have potentially important workforce implications, particularly regarding DA applicants, and illustrate the need for other multi-institutional studies of medical school admissions processes.
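
    The score standardization is a plain z-transform, after which adjusted associations are estimated by linear regression; a minimal sketch with hypothetical data (not the CA-LEAP dataset):

        import numpy as np

        def zscore(x):
            x = np.asarray(x, dtype=float)
            return (x - x.mean()) / x.std(ddof=1)   # mean = 0, SD = 1

        raw_scores = np.array([62.0, 71.0, 55.0, 80.0, 67.0])  # hypothetical interview scores
        z = zscore(raw_scores)

        # regress standardized scores on applicant characteristics (hypothetical covariates)
        age    = np.array([24, 26, 23, 29, 25])
        priors = np.array([0, 1, 0, 2, 1])                      # number of prior interviews
        X = np.column_stack([np.ones_like(z), age, priors])     # intercept + covariates
        beta, *_ = np.linalg.lstsq(X, z, rcond=None)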

  2. Enhancing Application Performance Using Mini-Apps: Comparison of Hybrid Parallel Programming Paradigms

    Science.gov (United States)

    Lawson, Gary; Sosonkina, Masha; Baurle, Robert; Hammond, Dana

    2017-01-01

    In many fields, real-world applications for High Performance Computing have already been developed. For these applications to stay up-to-date, new parallel strategies must be explored to yield the best performance; however, restructuring or modifying a real-world application may be daunting depending on the size of the code. In this case, a mini-app may be employed to quickly explore such options without modifying the entire code. In this work, several mini-apps have been created to enhance a real-world application performance, namely the VULCAN code for complex flow analysis developed at the NASA Langley Research Center. These mini-apps explore hybrid parallel programming paradigms with Message Passing Interface (MPI) for distributed memory access and either Shared MPI (SMPI) or OpenMP for shared memory accesses. Performance testing shows that MPI+SMPI yields the best execution performance, while requiring the largest number of code changes. A maximum speedup of 23 was measured for MPI+SMPI, but only 11 was measured for MPI+OpenMP.

  3. Identifiability of PBPK Models with Applications to ...

    Science.gov (United States)

    Any statistical model should be identifiable in order for estimates and tests using it to be meaningful. We consider statistical analysis of physiologically-based pharmacokinetic (PBPK) models in which parameters cannot be estimated precisely from available data, and discuss different types of identifiability that occur in PBPK models and give reasons why they occur. We particularly focus on how the mathematical structure of a PBPK model and lack of appropriate data can lead to statistical models in which it is impossible to estimate at least some parameters precisely. Methods are reviewed which can determine whether a purely linear PBPK model is globally identifiable. We propose a theorem which determines when identifiability at a set of finite and specific values of the mathematical PBPK model (global discrete identifiability) implies identifiability of the statistical model. However, we are unable to establish conditions that imply global discrete identifiability, and conclude that the only safe approach to analysis of PBPK models involves Bayesian analysis with truncated priors. Finally, computational issues regarding posterior simulations of PBPK models are discussed. The methodology is very general and can be applied to numerous PBPK models which can be expressed as linear time-invariant systems. A real data set of a PBPK model for exposure to dimethyl arsinic acid (DMA(V)) is presented to illustrate the proposed methodology.
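
    For the linear time-invariant systems mentioned here, structural identifiability is commonly posed on the state-space form; a textbook sketch (notation is generic, not the paper's):

        \dot{x}(t) = A(\theta)\,x(t) + B(\theta)\,u(t), \qquad y(t) = C(\theta)\,x(t)

    The parameter vector $\theta$ is globally identifiable when the transfer function $H(s;\theta) = C(\theta)\,(sI - A(\theta))^{-1}B(\theta)$ determines it uniquely, i.e. $H(s;\theta_1) \equiv H(s;\theta_2)$ implies $\theta_1 = \theta_2$; local and discrete notions of identifiability relax this uniqueness requirement.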

  4. The Relationship between Cost Leadership Strategy, Total Quality Management Applications and Financial Performance

    Directory of Open Access Journals (Sweden)

    Ali KURT

    2016-03-01

    Firms need to implement competition strategies and total quality management (TQM) applications to overcome fierce competition. The purpose of this study is to show the relationship between cost leadership strategy, total quality management applications and firms' financial performance through a literature review and empirical analysis. 449 questionnaires were administered to the managers of 142 big firms. The data gathered were assessed with AMOS. As a result, a relationship between cost leadership strategy, total quality management applications and firms' financial performance was found. In addition, a relationship between TQM applications and financial performance was also found.

  5. School Correlates of Academic Behaviors and Performance among McKinney-Vento Identified Youth

    Science.gov (United States)

    Stone, Susan; Uretsky, Mathew

    2016-01-01

    We utilized a pooled sample of elementary, middle, and high school-aged children identified as homeless via definitions set forth by McKinney-Vento legislation in a large urban district in California to estimate the extent to which school factors contributed to student attendance, suspensions, test-taking behaviors, and performance on state…

  6. Performance modeling of hybrid MPI/OpenMP scientific applications on large-scale multicore supercomputers

    KAUST Repository

    Wu, Xingfu; Taylor, Valerie

    2013-01-01

    In this paper, we present a performance modeling framework based on memory bandwidth contention time and a parameterized communication model to predict the performance of OpenMP, MPI and hybrid applications with weak scaling on three large-scale multicore supercomputers: IBM POWER4, POWER5+ and BlueGene/P, and analyze the performance of these MPI, OpenMP and hybrid applications. We use STREAM memory benchmarks and Intel's MPI benchmarks to provide initial performance analysis and model validation of MPI and OpenMP applications on these multicore supercomputers because the measured sustained memory bandwidth can provide insight into the memory bandwidth that a system should sustain on scientific applications with the same amount of workload per core. In addition to using these benchmarks, we also use a weak-scaling hybrid MPI/OpenMP large-scale scientific application: Gyrokinetic Toroidal Code (GTC) in magnetic fusion to validate our performance model of the hybrid application on these multicore supercomputers. The validation results for our performance modeling method show less than 7.77% error rate in predicting the performance of hybrid MPI/OpenMP GTC on up to 512 cores on these multicore supercomputers. © 2013 Elsevier Inc.
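
    The bandwidth-contention idea admits a simple analytic form; as a hedged, generic sketch (not the authors' exact model), the runtime of a weak-scaled hybrid code with $p$ cores per node can be written

        T(p) \approx T_{\mathrm{comp}} + \frac{V_{\mathrm{mem}}}{BW(p)} + T_{\mathrm{comm}}(p)

    where $V_{\mathrm{mem}}$ is the data volume each core moves and $BW(p)$ is the per-core sustained bandwidth (measured, e.g., with STREAM) when $p$ cores contend for the socket's memory system.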

  7. Performance modeling of hybrid MPI/OpenMP scientific applications on large-scale multicore supercomputers

    KAUST Repository

    Wu, Xingfu

    2013-12-01

    In this paper, we present a performance modeling framework based on memory bandwidth contention time and a parameterized communication model to predict the performance of OpenMP, MPI and hybrid applications with weak scaling on three large-scale multicore supercomputers: IBM POWER4, POWER5+ and BlueGene/P, and analyze the performance of these MPI, OpenMP and hybrid applications. We use STREAM memory benchmarks and Intel's MPI benchmarks to provide initial performance analysis and model validation of MPI and OpenMP applications on these multicore supercomputers because the measured sustained memory bandwidth can provide insight into the memory bandwidth that a system should sustain on scientific applications with the same amount of workload per core. In addition to using these benchmarks, we also use a weak-scaling hybrid MPI/OpenMP large-scale scientific application: Gyrokinetic Toroidal Code (GTC) in magnetic fusion to validate our performance model of the hybrid application on these multicore supercomputers. The validation results for our performance modeling method show less than 7.77% error rate in predicting the performance of hybrid MPI/OpenMP GTC on up to 512 cores on these multicore supercomputers. © 2013 Elsevier Inc.

  8. Predictive Performance Tuning of OpenACC Accelerated Applications

    KAUST Repository

    Siddiqui, Shahzeb; Feki, Saber

    2014-01-01

    , with the introduction of high level programming models such as OpenACC [1] and OpenMP 4.0 [2], these devices are becoming more accessible and practical to use by a larger scientific community. However, performance optimization of OpenACC accelerated applications usually

  9. An application of data mining in district heating substations for improving energy performance

    Science.gov (United States)

    Xue, Puning; Zhou, Zhigang; Chen, Xin; Liu, Jing

    2017-11-01

    Automatic meter reading systems are capable of collecting and storing a huge volume of district heating (DH) data. However, the data obtained are rarely fully utilized. Data mining is a promising technology for discovering potentially interesting knowledge in vast data sets. This paper applies data mining methods to analyse the massive data for improving the energy performance of a DH substation. The technical approach contains three steps: data selection, cluster analysis and association rule mining (ARM). Two heating seasons of data from a substation are used for the case study. Cluster analysis identifies six distinct heating patterns based on the primary heat of the substation. ARM reveals that secondary pressure difference and secondary flow rate have a strong correlation. Using the discovered rules, a fault occurring in a remote flow meter installed in the secondary network is detected accurately. The application demonstrates that data mining techniques can effectively extract potentially useful knowledge to better understand substation operation strategies and improve substation energy performance.
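
    The clustering step can be reproduced in outline with any off-the-shelf k-means implementation; a minimal sketch with stand-in data (the paper's exact features and preprocessing are not specified in this record):

        import numpy as np
        from sklearn.cluster import KMeans

        # rows = days, columns = 24 hourly primary-heat readings (stand-in data)
        profiles = np.random.default_rng(0).random((300, 24))

        km = KMeans(n_clusters=6, n_init=10, random_state=0).fit(profiles)
        labels = km.labels_               # which heating pattern each day follows
        patterns = km.cluster_centers_    # the six typical daily load shapes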

  10. Clean Technology Application: Kupola Model Burner for Increasing the Performance of Spent Accu Recycle

    International Nuclear Information System (INIS)

    Titiresmi

    2000-01-01

    Recycling of used batteries to recover lead, done by small household/small-scale industries, has been identified as a source of air pollution, especially by heavy metals (Pb). This condition has an adverse impact on workers and society. The technology used is one of the causes: the process applies an open system, so a lot of energy, as well as dust, is wasted to the air without prior treatment. To overcome this condition, a closed system utilizing a cupola furnace is offered as an alternative clean technology application to increase the recovery performance and achieve an effective and efficient result. (author)

  11. Multiphase pumping: indoor performance test and oilfield application

    Science.gov (United States)

    Kong, Xiangling; Zhu, Hongwu; Zhang, Shousen; Li, Jifeng

    2010-03-01

    Multiphase pumping is essentially a means of adding energy to the unprocessed effluent which enables the liquid and gas mixture to be transported over long distances without prior separation. The production infrastructure, such as separation equipment and offshore platforms, can be reduced, consolidated, or eliminated, so that fields can be developed more economically. Multiphase pumping has also successfully lowered the backpressure of wells, revived dead wells and improved the production and efficiency of oilfields. This paper reviews the issues related to an indoor performance test and an oilfield application of the helico-axial multiphase pump designed by China University of Petroleum (Beijing). The pump specification and its hydraulic design are given. Results of performance testing under different conditions, such as operational speed and gas volume fraction (GVF), are presented. Experimental studies combined with theoretical analysis showed that the multiphase pump satisfies the similitude rule, which can be used in the development of new MPP designs and performance prediction. Test results showed that raising the rotation speed and suction pressure improved its performance: the pressure boost improved, the high-efficiency zone expanded and the flow rate at the optimum working condition increased. The pump worked unstably as the GVF increased beyond a certain extent, and slip occurred between the two phases in the pump, creating surging and gas lock at high GVF. A case of application in the Nanyang oilfield is also studied.

  12. Males Perform Better in Identifying Voices During Menstruation Than Females: A Pilot Study.

    Science.gov (United States)

    Wang, Xue; Xu, Xin; Liu, Yangyang

    2016-10-01

    The objective of the present study is to investigate gender differences in the ability to identify females' voices during menstruation. In Study 1, 55 male participants (M age = 19.6 years, SD = 1.0) were asked to listen to vocal samples from women during both ovulation and menstruation and to identify which recordings featured menstruating women. The results showed that the accuracy of men's responses (M = 56.73%, SD = 0.21) was significantly higher than 50%. In Study 2, 118 female students (M age = 19.4 years, SD = 1.6) completed the same task. The results indicated that the accuracy of women's performance was nearly 50%. These preliminary findings suggest that men are better able to identify women's voices during menstruation than women are. Future work could consider several significant variables for the purpose of validating the results. © The Author(s) 2016.

  13. Web-based application on employee performance assessment using exponential comparison method

    Science.gov (United States)

    Maryana, S.; Kurnia, E.; Ruyani, A.

    2017-02-01

    Employee performance assessment, also called a performance review, performance evaluation, or employee appraisal, is an effort to assess staff achievements with the aim of increasing the productivity of employees and companies. This application helps in the assessment of employee performance using five criteria: presence, quality of work, quantity of work, discipline, and teamwork. The system uses the exponential comparison method and Eckenrode weighting. Calculation results are presented using graphs to show the assessment of each employee. The system is written with Notepad++ and uses a MySQL database. Testing showed that the application corresponds with its design and runs properly. The tests conducted were structural testing, functional testing, validation, sensitivity analysis, and SUMI testing.
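
    The exponential comparison method, as commonly described, scores each alternative by summing its criterion ratings raised to the power of the criterion weights; a minimal sketch with hypothetical weights (not the paper's Eckenrode-derived values):

        CRITERIA = ["presence", "work quality", "work quantity", "discipline", "teamwork"]
        WEIGHTS  = [5, 4, 4, 3, 3]                 # hypothetical integer importance weights

        def ecm_score(ratings):                    # ratings on, e.g., a 1-9 scale
            return sum(r ** w for r, w in zip(ratings, WEIGHTS))

        employees = {"A": [8, 7, 6, 9, 7], "B": [7, 8, 8, 6, 8]}
        ranking = sorted(employees, key=lambda e: ecm_score(employees[e]),
                         reverse=True)             # best-performing employee first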

  14. High-performance insulator structures for accelerator applications

    International Nuclear Information System (INIS)

    Sampayan, S.E.; Caporaso, G.J.; Sanders, D.M.; Stoddard, R.D.; Trimble, D.O.; Elizondo, J.; Krogh, M.L.; Wieskamp, T.F.

    1997-05-01

    A new, high gradient insulator technology has been developed for accelerator systems. The concept involves the use of alternating layers of conductors and insulators with periods of order 1 mm or less. These structures perform many times better (about 1.5 to 4 times higher breakdown electric field) than conventional insulators in long pulse, short pulse, and alternating polarity applications. We describe our ongoing studies investigating the degradation of the breakdown electric field resulting from alternate fabrication techniques, the effect of gas pressure, the effect of the insulator-to-electrode interface gap spacing, and the performance of the insulator structure under bi-polar stress

  15. Develop feedback system for intelligent dynamic resource allocation to improve application performance.

    Energy Technology Data Exchange (ETDEWEB)

    Gentile, Ann C.; Brandt, James M.; Tucker, Thomas (Open Grid Computing, Inc., Austin, TX); Thompson, David

    2011-09-01

    This report provides documentation for the completion of the Sandia Level II milestone 'Develop feedback system for intelligent dynamic resource allocation to improve application performance'. This milestone demonstrates the use of a scalable data collection analysis and feedback system that enables insight into how an application is utilizing the hardware resources of a high performance computing (HPC) platform in a lightweight fashion. Further we demonstrate utilizing the same mechanisms used for transporting data for remote analysis and visualization to provide low latency run-time feedback to applications. The ultimate goal of this body of work is performance optimization in the face of the ever increasing size and complexity of HPC systems.

  16. Performance-oriented packaging: A guide to identifying and designing. Identifying and designing hazardous materials packaging for compliance with post HM-181 DOT Regulations

    International Nuclear Information System (INIS)

    1994-08-01

    With the initial publication of Docket HM-181 (hereafter referred to as HM-181), the U.S. Department of Energy (DOE), Headquarters, Transportation Management Division decided to produce guidance to help the DOE community transition to performance-oriented packagings (POP). As only a few individuals were familiar with the new requirements, elementary guidance was desirable. The decision was to prepare the guidance at a level easily understood by a novice to regulatory requirements. This document identifies design development strategies for use in obtaining performance-oriented packagings that are not readily available commercially. These design development strategies will be part of the methodologies for compliance with post-HM-181 U.S. Department of Transportation (DOT) packaging regulations. This information was prepared for use by the DOE and its contractors. The document provides guidance for making decisions associated with designing performance-oriented packaging, and not for identifying specific material or fabrication design details. It does provide some specific design considerations. Having a copy of the regulations handy when reading this document is recommended to permit a fuller understanding of the requirements impacting the design effort. While this document is not written for the packaging specialist, it does contain guidance important to those not familiar with the new POP requirements.

  17. Persistent Identifier Practice for Big Data Management at NCI

    Directory of Open Access Journals (Sweden)

    Jingbo Wang

    2017-04-01

    The National Computational Infrastructure (NCI) manages over 10 PB of research data, which is co-located with a high performance computer (Raijin) and an HPC-class, 3000-core OpenStack cloud system (Tenjin). In support of this integrated High Performance Computing/High Performance Data (HPC/HPD) infrastructure, NCI's data management practices include building catalogues, DOI minting, data curation, data publishing, and data delivery through a variety of data services. The metadata catalogues, DOIs, THREDDS, and vocabularies all use different Uniform Resource Locator (URL) styles. A Persistent IDentifier (PID) service provides an important utility to manage URLs in a consistent, controlled and monitored manner to support the robustness of our national 'Big Data' infrastructure. In this paper we demonstrate NCI's approach of utilising the PID Service to consistently manage its persistent identifiers with various applications.

  18. Identifying blood biomarkers and physiological processes that distinguish humans with superior performance under psychological stress.

    Directory of Open Access Journals (Sweden)

    Amanda M Cooksey

    2009-12-01

    Attrition of students from aviation training is a serious financial and operational concern for the U.S. Navy. Each late-stage navy aviator training failure costs the taxpayer over $1,000,000 and ultimately results in decreased operational readiness of the fleet. Currently, potential aviators are selected based on the Aviation Selection Test Battery (ASTB), which is a series of multiple-choice tests that evaluate basic and aviation-related knowledge and ability. However, the ASTB does not evaluate a person's response to stress. This is important because operating sophisticated aircraft demands exceptional performance and causes high psychological stress. Some people are more resistant to this type of stress, and consequently better able to cope with the demands of naval aviation, than others. Although many psychological studies have examined psychological stress resistance, none have taken advantage of the human genome sequence. Here we use high-throughput -omic biology methods and a novel statistical data normalization method to identify plasma proteins associated with human performance under psychological stress. We identified proteins involved in four basic physiological processes: innate immunity, cardiac function, coagulation and plasma lipid physiology. The proteins identified here further elucidate the physiological response to psychological stress and suggest a hypothesis that stress-susceptible pilots may be more prone to shock. This work also provides potential biomarkers for screening humans for capability of superior performance under stress.

  19. An applicable approach for performance auditing in ERP

    Directory of Open Access Journals (Sweden)

    Wan Jian Guo

    2016-01-01

    This paper addresses the practical problem of performance auditing in an ERP environment. Traditional performance auditing methods and existing approaches for evaluating ERP implementations do not work well, because they are either difficult to apply or contain subjective elements. This paper proposes an applicable performance auditing approach for SAP ERP based on quantitative analysis. The approach consists of three parts: system utilization, data quality, and the effectiveness of system control. For each part, we describe the main process for conducting the audit, in particular how to calculate the online settlement rate of the SAP system. This approach has played an important role in practical auditing work. A practical case is provided at the end of this paper to demonstrate the effectiveness of the approach.

  20. Performance Modeling of Hybrid MPI/OpenMP Scientific Applications on Large-scale Multicore Cluster Systems

    KAUST Repository

    Wu, Xingfu; Taylor, Valerie

    2011-01-01

    In this paper, we present a performance modeling framework based on memory bandwidth contention time and a parameterized communication model to predict the performance of OpenMP, MPI and hybrid applications with weak scaling on three large-scale multicore clusters: IBM POWER4, POWER5+ and Blue Gene/P, and analyze the performance of these MPI, OpenMP and hybrid applications. We use STREAM memory benchmarks to provide initial performance analysis and model validation of MPI and OpenMP applications on these multicore clusters because the measured sustained memory bandwidth can provide insight into the memory bandwidth that a system should sustain on scientific applications with the same amount of workload per core. In addition to using these benchmarks, we also use a weak-scaling hybrid MPI/OpenMP large-scale scientific application: Gyrokinetic Toroidal Code in magnetic fusion to validate our performance model of the hybrid application on these multicore clusters. The validation results for our performance modeling method show less than 7.77% error rate in predicting the performance of hybrid MPI/OpenMP GTC on up to 512 cores on these multicore clusters. © 2011 IEEE.

  1. Performance Modeling of Hybrid MPI/OpenMP Scientific Applications on Large-scale Multicore Cluster Systems

    KAUST Repository

    Wu, Xingfu

    2011-08-01

    In this paper, we present a performance modeling framework based on memory bandwidth contention time and a parameterized communication model to predict the performance of OpenMP, MPI and hybrid applications with weak scaling on three large-scale multicore clusters: IBM POWER4, POWER5+ and Blue Gene/P, and analyze the performance of these MPI, OpenMP and hybrid applications. We use STREAM memory benchmarks to provide initial performance analysis and model validation of MPI and OpenMP applications on these multicore clusters because the measured sustained memory bandwidth can provide insight into the memory bandwidth that a system should sustain on scientific applications with the same amount of workload per core. In addition to using these benchmarks, we also use a weak-scaling hybrid MPI/OpenMP large-scale scientific application: Gyrokinetic Toroidal Code in magnetic fusion to validate our performance model of the hybrid application on these multicore clusters. The validation results for our performance modeling method show less than 7.77% error rate in predicting the performance of hybrid MPI/OpenMP GTC on up to 512 cores on these multicore clusters. © 2011 IEEE.

  2. Thermodynamic performance assessment of wind energy systems: An application

    International Nuclear Information System (INIS)

    Redha, Adel Mohammed; Dincer, Ibrahim; Gadalla, Mohamed

    2011-01-01

    In this paper, the performance of a wind energy system is assessed thermodynamically, from resource and technology perspectives. The thermodynamic characteristics of wind through energy and exergy analyses are considered and both energetic and exergetic efficiencies are studied. Wind speed is affected by air temperature and pressure and has a subsequent effect on wind turbine performance based on the wind reference temperature and Bernoulli's equation. A VESTAS V52 wind turbine is selected for the Sharjah (UAE) site. Energy and exergy efficiency equations for wind energy systems are further developed for practical applications. The results show that there are noticeable differences between energy and exergy efficiencies and that exergetic efficiency reflects the actual performance. Finally, exergy analysis has been proven to be the right tool for the design, simulation, and performance evaluation of all renewable energy systems. -- Highlights: → In this research the performance of a wind energy system is assessed thermodynamically, from resource and technology perspectives. → Energy and exergy equations for wind energy systems are further developed for practical applications. → Thermodynamic characteristics of wind turbine systems through energetic and exergetic efficiencies are evaluated from January till March 2010. → Exergy efficiency describes the system irreversibility, and the minimum irreversibility exists when the wind speed reaches 11 m/s. → The power production during March was about 17% higher than in February and 66% higher than in January.
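
    As a generic illustration of the two efficiencies being contrasted (textbook definitions, not necessarily the paper's exact formulation):

        \eta = \frac{P_{\mathrm{el}}}{\tfrac{1}{2}\,\rho A V^{3}}, \qquad \psi = \frac{P_{\mathrm{el}}}{\dot{Ex}_{\mathrm{wind}}}

    where $\rho$ is the air density (itself a function of temperature and pressure), $A$ the rotor swept area and $V$ the wind speed; the wind exergy rate $\dot{Ex}_{\mathrm{wind}}$ augments the kinetic term with enthalpy and entropy differences relative to the reference (dead-state) temperature, which is why the two efficiencies diverge.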

  3. Investigating And Evaluating Of Network Failures And Performance Over Distributed WAN In Application Protocol Layer

    Directory of Open Access Journals (Sweden)

    Enoch Okoh Kofi

    2015-08-01

    The experiment was done to find the relationship between network failures and application performance over a distributed wide area network (WAN). In order to access related applications over the cloud there must be internet connectivity, which allows the respective workstations to reach the remote server hosting the applications deployed over the network. Bandwidth improvement helps in reducing utilization over the network and it also helps in improving the efficiency of these applications in terms of response time. Routers were configured under the Enhanced Interior Gateway Routing Protocol (EIGRP) to reduce utilization and to ensure load sharing over the network. Three scenarios were modeled and their performance efficiency was evaluated. A computer network was modeled with and without a failed router under different scenarios, and the network was simulated with emphasis on application performance. The experiment was done for fifty workstations under three scenarios, and these scenarios were assessed and evaluated experimentally using the Riverbed modeler to show the effect on application and network performance. The performance results show that increasing the bandwidth reduces utilization, and also that with the failure of one communication link users can still access network applications at a minimal cost.
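
    The finding that added bandwidth lowers utilization and response time matches the standard single-queue approximation (a generic illustration, not the model used in the experiment): for offered load $\lambda$ bit/s on a link of capacity $C$ bit/s carrying packets of mean length $L$ bits,

        \rho = \frac{\lambda}{C}, \qquad T \approx \frac{L/C}{1 - \rho}

    so raising $C$ both reduces the utilization $\rho$ and shrinks the M/M/1 response time $T$.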

  4. Performance of ceramics in ring/cylinder applications

    International Nuclear Information System (INIS)

    Dufrane, K.F.; Glaeser, W.A.

    1987-01-01

    In support of efforts to apply ceramics to advanced heat engines, a study is being performed of the performance of ceramics at the ring/cylinder interface of advanced (low-heat-rejection) engines. The objective of the study, managed by the Oak Ridge National Laboratory, is to understand the basic mechanisms controlling the wear of ceramics and thereby identify means for applying ceramics effectively. Attempts to operate three different zirconias, silicon carbide, silicon nitride, and plasma-sprayed ceramic coatings without lubrication have not been successful because of excessive friction and high wear rates. Silicon carbide and silicon nitride perform well at ambient temperatures with fully formulated mineral oil lubrication, but are limited to temperatures of 500 °F because of the lack of suitable liquid lubricants for higher temperatures.

  5. Evaluation of performance of silicon photomultipliers in lidar applications

    Science.gov (United States)

    Vinogradov, Sergey L.

    2017-05-01

    Silicon Photomultipliers (SiPMs) are a well-recognized new generation of photon-number-resolving avalanche photodetectors. Many advantages - a high gain with ultra-low excess noise of multiplication, a multi-pixel architecture, and a relatively low operating voltage - make SiPMs very competitive in a growing number of applications. The challenging demands of LIDAR applications for a receiver - high sensitivity down to single photons, superior time-of-flight resolution, robustness including survival of bright light flashes, solid-state compactness and more - are expected to be feasible for SiPMs. Despite some known drawbacks, namely crosstalk, afterpulsing, dark noise and limited dynamic range, SiPMs are already considered promising substitutes for conventional APDs and PMTs in LIDAR applications. However, these initial considerations are based on a rather simplified representation of the SiPM as a generic LIDAR receiver described by generic expressions. This study is focused on a comprehensive evaluation of SiPM potential, considering essential features of this new technology which could affect the applicability and performance of SiPMs as LIDAR receivers. Namely, the excess noise due to the correlated processes of crosstalk and afterpulsing is taken into account utilizing the well-established framework of analytical probabilistic models. The analysis of SiPM performance in terms of photon number and time resolution clarifies their competitiveness over conventional APDs and PMTs and anticipates the development of next SiPM generations.

  6. Application of FEPs analysis to identify research priorities relevant to the safety case for an Australian radioactive waste facility

    International Nuclear Information System (INIS)

    Payne, T.E.; McGlinn, P.J.

    2007-01-01

    The Australian Nuclear Science and Technology Organisation (ANSTO) has established a project to undertake research relevant to the safety case for the proposed Australian radioactive waste facility. This facility will comprise a store for intermediate level radioactive waste, and either a store or a near-surface repository for low-level waste. In order to identify the research priorities for this project, a structured analysis of the features, events and processes (FEPs) relevant to the performance of the facility was undertaken. This analysis was based on the list of 137 FEPs developed by the IAEA project on 'Safety Assessment Methodologies for Near Surface Disposal Facilities' (ISAM). A number of key research issues were identified, and some factors which differ in significance for the store, compared to the repository concept, were highlighted. For example, FEPs related to long-term groundwater transport of radionuclides are considered to be of less significance for a store than a repository. On the other hand, structural damage from severe weather, accident or human interference is more likely for a store. The FEPs analysis has enabled the scientific research skills required for the inter-disciplinary project team to be specified. The outcomes of the research will eventually be utilised in developing the design, and assessing the performance, of the future facility. It is anticipated that a more detailed application of the FEPs methodology will be undertaken to develop the safety case for the proposed radioactive waste management facility. (authors)

  7. Evaluation of Software Quality to Improve Application Performance Using Mc Call Model

    Directory of Open Access Journals (Sweden)

    Inda D Lestantri

    2018-04-01

    The existence of software should have more value than its primary function of automation: it should improve the performance of the organization. Before being implemented in an operational environment, software must pass staged testing to ensure that it functions properly, meets user needs and is convenient to use. This testing was performed on a web-based application, taking as a test case the e-SAP application. e-SAP is an application used to monitor teaching and learning activities at a university in Jakarta. To measure software quality, testing can be done on randomly selected users. The user sample selected for this test comprised users aged 18 to 25 years with an information technology background. The test was conducted on 30 respondents using the McCall model. The McCall testing model consists of 11 dimensions grouped into 3 categories. This paper describes the testing with reference to the product operation category, which includes 5 dimensions: correctness, usability, efficiency, reliability, and integrity. This paper discusses testing on each dimension to measure software quality as an effort to improve performance. The result of the research is that the e-SAP application has good quality, with a product operation value of 85.09%. This indicates that the e-SAP application is of high quality, so the application deserves to be examined in the next stage in an operational environment.
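
    The product-operation figure is a weighted aggregate of the five dimension scores; a hedged sketch of the arithmetic (the weights and metric values below are hypothetical, not the paper's):

        DIMENSIONS = {
            # dimension: (metric score 0-1, weight; weights sum to 1)
            "correctness": (0.90, 0.25),
            "usability":   (0.85, 0.25),
            "efficiency":  (0.80, 0.20),
            "reliability": (0.88, 0.15),
            "integrity":   (0.82, 0.15),
        }

        quality = sum(score * weight for score, weight in DIMENSIONS.values())
        print(f"product operation quality: {quality:.2%}")   # ~85% with these inputs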

  8. Development and application of a methodology for identifying and characterising scenarios

    International Nuclear Information System (INIS)

    Billington, D.; Bailey, L.

    1998-01-01

    interval along each timeline. This report presents illustrative examples of the application of the above methodology to achieve this aim. The results of risk calculations and assigned weights are plotted on a 'weight-risk diagram', which is used to judge the relative significance of the different variant scenarios in relation to the base scenario and the regulatory risk target. The application of this methodology is consistent with a staged approach to performance assessment, in which effort is focused initially on scoping calculations of conditional risk. Only those variant scenarios giving a higher conditional risk than the base scenario are subject to more detailed evaluation, including the assignment of an appropriate weight. From the limited trialling that has been undertaken, the indications are that a tractable approach, consistent with the objectives of comprehensiveness, traceability and clarity, has been achieved. (author)
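
    A minimal Python sketch of the screening rule described above, under the assumption that a variant scenario is carried forward (and assigned a weight) only when its conditional risk exceeds the base scenario's; the scenario names, risk values and target are invented.

        # Screen variant scenarios against the base scenario and the
        # regulatory risk target (all values illustrative).
        base_risk = 1e-7
        regulatory_target = 1e-6
        variants = {"seismic": 4e-7, "borehole": 2e-6, "erosion": 5e-8}

        for name, risk in variants.items():
            if risk > base_risk:
                status = "above" if risk > regulatory_target else "below"
                print(f"{name}: conditional risk {risk:.1e} -> evaluate in "
                      f"detail and assign weight ({status} regulatory target)")
            else:
                print(f"{name}: screened out (risk not above base scenario)")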

  9. 12 CFR 563e.29 - Effect of CRA performance on applications.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 5 2010-01-01 2010-01-01 false Effect of CRA performance on applications. 563e.29 Section 563e.29 Banks and Banking OFFICE OF THRIFT SUPERVISION, DEPARTMENT OF THE TREASURY COMMUNITY REINVESTMENT Standards for Assessing Performance § 563e.29 Effect of CRA performance on...

  10. Application of multiple tracers (SF6 and chloride) to identify the transport characteristics of contaminants at two separate contaminated sites

    Science.gov (United States)

    Lee, K. K.; Lee, S. S.; Kim, H. H.; Koh, E. H.; Kim, M. O.; Lee, K.; Kim, H. J.

    2016-12-01

    Multiple tracers were applied for source and pathway detection at two different sites. CO2 gas injected into the subsurface for a shallow-depth CO2 injection and leak test can be regarded as a potential contaminant source, so it is necessary to identify the migration pattern of the CO2 gas. Likewise, at a DNAPL-contaminated site, it is important to characterize plume evolution from the source zone. In this study, multiple tracers (SF6 and chloride) were used to evaluate the applicability of volatile and non-volatile tracers and to identify the characteristics of contaminant transport at a CO2 injection and leak test site and at a DNAPL-contaminated site. First, at the CO2 test site, the multiple tracers were used to perform single-well push-drift-pull tracer tests at three specific depth zones. The volatile and non-volatile tracers showed different mass recovery percentages: most of the chloride mass was recovered, but less than half of the SF6 mass was recovered owing to its volatility, indicating that gaseous SF6 leaked out into the unsaturated zone. However, the breakthrough curves of both tracers indicated similar peak times, effective porosities, and regional groundwater velocities. Natural gradient tracer tests with the multiple tracers were also performed at both contaminated sites. Their results confirmed the applicability of multiple tracers and, through long-term monitoring, clarified contaminant transport in highly heterogeneous aquifer systems. Acknowledgement: financial support was provided by the "R&D Project on Environmental Management of Geologic CO2 Storage" from the KEITI (Project Number: 2014001810003) and by the Korea Ministry of Environment as "The GAIA Project" (2014000540010).

  11. The Use of a Performance Assessment for Identifying Gifted Lebanese Students: Is DISCOVER Effective?

    Science.gov (United States)

    Sarouphim, Ketty M.

    2009-01-01

    The purpose of this study was to investigate the effectiveness of DISCOVER, a performance-based assessment, in identifying gifted Lebanese students. The sample consisted of 248 students (121 boys, 127 girls) from Grades 3-5 at two private schools in Beirut, Lebanon. Students were administered DISCOVER and the Raven Standard Progressive Matrices…

  12. Ultrasonically identified seals for safeguards and physical protection purposes

    International Nuclear Information System (INIS)

    Crutzen, S.

    The paper provides a general review of an ultrasonic technique available for sealing, marking or otherwise identifying material in such a way that its recognition and guarantee of integrity are unequivocally ensured. Development work on several types of seals and their ultrasonic identification has been performed at Ispra in collaboration with external companies for application to MTRs, HWRs and FBRs

  13. Application of performance assessment as a tool for guiding project work

    International Nuclear Information System (INIS)

    McCombie, C.; Zuidema, P.

    1992-01-01

    The ultimate aim of the performance assessment methodology developed over the last 10-15 years is to predict quantitatively the behavior of disposal systems over periods of time far into the future. The methodology can, however, also be applied to a range of tasks during repository development, and in many programmes it is used as a tool for improving or optimizing the design of subsystem components or for guiding the course of project planning. In the Swiss waste management program, there are several examples of the use of performance assessment as a tool in the manner mentioned above. The interaction between research models, assessment models and simplified models is considered to be of key importance, and corresponding measures are taken to properly structure the process and to track the data: first, the results of all applications of the models are included in a consistent manner in the scenario analyses for the different sites and systems; second, consistency in the underlying assumptions and in the data used in the different model calculations is assured by the consistent application of a configuration data management system (CDM). Although almost all applications of performance assessment appear in Swiss work, only two examples have been selected for this paper: applications of performance assessment in both the HLW and the LLW programs, and acceptance of specific waste types and their allocation to an appropriate repository on the basis of simplified safety analyses

  14. Identifying context factors explaining physician's low performance in communication assessment: an explorative study in general practice.

    NARCIS (Netherlands)

    Essers, G.T.J.M.; Dulmen, A.M. van; Weel, C. van; Vleuten, C.P.M. van der; Kramer, A.W.

    2011-01-01

    ABSTRACT: BACKGROUND: Communication is a key competence for health care professionals. Analysis of registrar and GP communication performance in daily practice, however, suggests a suboptimal application of communication skills. The influence of context factors could reveal why communication

  15. Performance Evaluation of UML2-Modeled Embedded Streaming Applications with System-Level Simulation

    Directory of Open Access Journals (Sweden)

    Arpinen Tero

    2009-01-01

    Full Text Available This article presents an efficient method to capture an abstract performance model of streaming-data real-time embedded systems (RTESs). Unified Modeling Language version 2 (UML2) is used for the performance modeling and as a front-end for a tool framework that enables simulation-based performance evaluation and design-space exploration. The adopted application meta-model in UML resembles the Kahn Process Network (KPN) model and is targeted at simulation-based performance evaluation. The application workload modeling is done using UML2 activity diagrams, and the platform is described with structural UML2 diagrams and model elements. These concepts are defined using a subset of the profile for Modeling and Analysis of Real-Time and Embedded (MARTE) systems from OMG and custom stereotype extensions. The goal of the performance modeling and simulation is to achieve early estimates of task response times and of processing element, memory, and on-chip network utilizations, among other information that is used for design-space exploration. As a case study, a video codec application on multiple processors is modeled, evaluated, and explored. In comparison to related work, this is the first proposal that defines a transformation between UML activity diagrams and streaming-data application workload meta-models and successfully adopts it for RTES performance evaluation.
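
    The kind of early estimate such a simulation produces can be illustrated with a toy Python sketch: tasks with per-frame workloads are mapped to processing elements, and utilization follows from cycle counts and clock rates. All names and numbers are invented, and the sketch ignores contention and scheduling, which the article's simulation framework models.

        # Toy static estimate of processing-element utilization for a
        # streaming application (illustrative numbers only).
        tasks = {"decode": 4e6, "filter": 2e6, "encode": 6e6}  # cycles/frame
        mapping = {"decode": "pe0", "filter": "pe0", "encode": "pe1"}
        clock_hz = {"pe0": 200e6, "pe1": 400e6}
        frame_period = 1 / 30.0                                # 30 fps

        busy = {pe: 0.0 for pe in clock_hz}
        for task, cycles in tasks.items():
            busy[mapping[task]] += cycles / clock_hz[mapping[task]]

        for pe, t in busy.items():
            print(pe, f"utilization = {t / frame_period:.1%}")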

  16. Measuring individual work performance: Identifying and selecting indicators

    NARCIS (Netherlands)

    Koopmans, L.; Bernaards, C.M.; Hildebrandt, V.H.; de Vet, H.C.W.; van der Beek, A.J.

    2014-01-01

    BACKGROUND: Theoretically, individual work performance (IWP) can be divided into four dimensions: task performance, contextual performance, adaptive performance, and counterproductive work behavior. However, there is no consensus on the indicators used to measure these dimensions.

  17. Students' Performance When Aurally Identifying Musical Harmonic Intervals: Experimentation of a Teaching Innovation Proposal

    Science.gov (United States)

    Ponsatí, Imma; Miranda, Joaquim; Amador, Miquel; Godall, Pere

    2016-01-01

    The aim of the study was to measure the performance reached by students (N = 138) when aurally identifying musical harmonic intervals (from m2 to P8) after having experienced a teaching innovation proposal for the Music Conservatories of Catalonia (Spain) based on observational methodology. Its design took into account several issues, which had…

  18. Measuring individual work performance: identifying and selecting indicators

    NARCIS (Netherlands)

    Koopmans, L.; Bernaards, C.M.; Hildebrandt, V.H.; Vet, H.C de; Beek, A.J. van der

    2014-01-01

    BACKGROUND: Theoretically, individual work performance (IWP) can be divided into four dimensions: task performance, contextual performance, adaptive performance, and counterproductive work behavior. However, there is no consensus on the indicators used to measure these dimensions. OBJECTIVE: This

  19. How physicians identify with predetermined personalities and links to perceived performance and wellness outcomes: a cross-sectional study.

    Science.gov (United States)

    Lemaire, Jane B; Wallace, Jean E

    2014-11-29

    Certain personalities are ascribed to physicians. This research aims to measure the extent to which physicians identify with three predetermined personalities (workaholic, Type A and control freak) and to explore links to perceptions of professional performance and wellness outcomes. This is a cross-sectional study using a mail-out questionnaire sent to all practicing physicians (2957 eligible, 1178 responses, 40% response rate) in a geographical health region within a western Canadian province. Survey items were used to assess the extent to which participants felt they were somewhat of a workaholic, Type A and/or control freak, and whether they believed that having these personalities makes one a better doctor. Participants' wellness outcomes were also measured. Zero-order correlations were used to determine the relationships between physicians identifying with a personality and feeling it makes one a better doctor. T-tests were used to compare measures of physician wellness for those who identified with the personality versus those who did not. 53% of participants identified with the workaholic personality, 62% with the Type A, and 36% with the control freak. Identifying with any one of the personalities was correlated with feeling it makes one a better physician. There were statistically significant differences in several wellness outcomes comparing participants who identified with the personalities versus those who did not. These included higher levels of emotional exhaustion (workaholic, Type A and control freak), higher levels of anxiety (Type A and control freak) and higher levels of depression, poorer mental health and lower levels of job satisfaction (control freak). Participants who identified with the workaholic personality versus those who did not reported higher levels of job satisfaction, rewarding patient experiences and career commitment. Most participants identified with at least one of the three personalities. The beliefs of some participants that

  20. Survey of computer codes applicable to waste facility performance evaluations

    International Nuclear Information System (INIS)

    Alsharif, M.; Pung, D.L.; Rivera, A.L.; Dole, L.R.

    1988-01-01

    This study is an effort to review existing information that is useful for developing an integrated model for predicting the performance of a radioactive waste facility. A summary description of 162 computer codes is given. The identified computer programs address the performance of waste packages, waste transport and equilibrium geochemistry, hydrological processes in unsaturated and saturated zones, and general waste facility performance assessment. Some programs also deal with thermal analysis, structural analysis, and special purposes. A number of these computer programs are being used by the US Department of Energy, the US Nuclear Regulatory Commission, and their contractors to analyze various aspects of waste package performance. Fifty-five of these codes were identified as being potentially useful in the analysis of low-level radioactive waste facilities located above the water table. The code summaries include authors, identification data, model types, and pertinent references. 14 refs., 5 tabs

  1. Identifying Domain-General and Domain-Specific Predictors of Low Mathematics Performance: A Classification and Regression Tree Analysis

    Directory of Open Access Journals (Sweden)

    David J. Purpura

    2017-12-01

    Full Text Available Many children struggle to successfully acquire early mathematics skills. Theoretical and empirical evidence has pointed to deficits in domain-specific skills (e.g., non-symbolic mathematics skills) or domain-general skills (e.g., executive functioning and language) as underlying low mathematical performance. In the current study, we assessed a sample of 113 three- to five-year-old preschool children on a battery of domain-specific and domain-general factors in the fall and spring of their preschool year to identify Time 1 (fall) factors associated with low performance in mathematics knowledge at Time 2 (spring). We used the exploratory approach of classification and regression tree analyses, a strategy that uses step-wise partitioning to create subgroups from a larger sample using multiple predictors, to identify the factors that were the strongest classifiers of low performance for younger and older preschool children. Results indicated that the most consistent classifier of low mathematics performance at Time 2 was children's Time 1 mathematical language skills. Further, other distinct classifiers of low performance emerged for younger and older children. These findings suggest that risk classification for low mathematics performance may differ depending on children's age.
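
    A hedged sketch of a classification-tree analysis in the same spirit, using scikit-learn's DecisionTreeClassifier in place of whatever CART software the study used; the feature names and data below are synthetic stand-ins, not the study's battery.

        import numpy as np
        from sklearn.tree import DecisionTreeClassifier, export_text

        rng = np.random.default_rng(0)
        n = 113
        X = rng.normal(size=(n, 3))  # synthetic predictor battery
        # 1 = low mathematics performance (synthetic rule for illustration)
        y = (X[:, 0] + 0.3 * rng.normal(size=n) < -0.5).astype(int)

        tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=10,
                                      random_state=0)
        tree.fit(X, y)
        print(export_text(tree, feature_names=["math_language",
                                               "exec_function",
                                               "non_symbolic"]))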

  2. Application of data mining in performance measures

    Science.gov (United States)

    Chan, Michael F. S.; Chung, Walter W.; Wong, Tai Sun

    2001-10-01

    This paper proposes a structured framework for exploiting data mining applications for performance measures, and illustrates its use in the context of an airline company. The framework considers how a knowledge worker interacts with performance information at the enterprise level, supporting informed decisions in managing the effectiveness of operations. A case study of applying data mining technology to performance data in an airline company is presented, in which performance measures are applied specifically to assist the aircraft delay management process. The increasingly dispersed and complex nature of airline operations puts considerable strain on knowledge workers as they search for, acquire and analyze the information needed to manage performance. One major problem knowledge workers face is the identification of root causes of performance deficiency: the large number of factors involved makes root-cause analysis time-consuming, and the objective of applying data mining technology is to reduce the time and resources this process requires. Increasing market competition for better performance management in various industries gives rise to the need for intelligent use of data. The framework proposed here is therefore readily generalizable to industries such as manufacturing, and could assist knowledge workers who constantly look for ways to improve operational effectiveness through new initiatives, where effort must be applied quickly to gain competitive advantage in the marketplace.

  3. Roofline Analysis in the Intel® Advisor to Deliver Optimized Performance for applications on Intel® Xeon Phi™ Processor

    Energy Technology Data Exchange (ETDEWEB)

    Koskela, Tuomas S.; Lobet, Mathieu; Deslippe, Jack; Matveev, Zakhar

    2017-05-23

    In this session we show, in two case studies, how the roofline feature of Intel Advisor has been utilized to optimize the performance of kernels of the XGC1 and PICSAR codes in preparation for the Intel Knights Landing architecture. The impact of the implemented optimizations and the benefits of using the automatic roofline feature of Intel Advisor to study the performance of large applications will be presented. This demonstrates an effective optimization strategy that has enabled these science applications to achieve up to 4.6 times speed-up and prepare for future exascale architectures. Goal/relevance of session: The roofline model [1,2] is a powerful tool for analyzing the performance of applications with respect to the theoretical peak achievable on a given computer architecture. It allows one to graphically represent the performance of an application in terms of operational intensity, i.e. the ratio of flops performed to bytes moved from memory, in order to guide optimization efforts. Given the scale and complexity of modern science applications, it can often be a tedious task for the user to perform the analysis at the level of functions or loops to identify where performance gains can be made. With new Intel tools, it is now possible to automate this task, as well as to base the estimates of peak performance on measurements rather than vendor specifications. The goal of this session is to demonstrate how the roofline feature of Intel Advisor can be used to balance memory-related versus computation-related optimization efforts and effectively identify performance bottlenecks. A series of typical optimization techniques, illustrated by the kernel cases, will be addressed: cache blocking, structure refactoring, data alignment, and vectorization. Description of the codes: The XGC1 code [3] is a magnetic fusion Particle-In-Cell code that uses an unstructured mesh for its Poisson solver, which allows it to accurately resolve the edge plasma of a magnetic fusion device. After
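
    The roofline bound itself is compact enough to state in a few lines of Python. A minimal sketch, with placeholder peak-performance and bandwidth numbers rather than measured Knights Landing values:

        # Attainable performance is capped by the lower of the compute roof
        # and the memory roof (operational intensity x bandwidth).
        def attainable_gflops(oi_flops_per_byte,
                              peak_gflops=3000.0, bw_gbytes_s=400.0):
            return min(peak_gflops, oi_flops_per_byte * bw_gbytes_s)

        for oi in (0.5, 2.0, 7.5, 20.0):
            print(f"OI={oi:5.1f} flop/byte -> "
                  f"{attainable_gflops(oi):7.1f} GFLOP/s attainable")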

  4. Orion: a web-based application designed to monitor resident and fellow performance on-call.

    Science.gov (United States)

    Itri, Jason N; Kim, Woojin; Scanlon, Mary H

    2011-10-01

    Radiology residency and fellowship training provides a unique opportunity to evaluate trainee performance and determine the impact of various educational interventions. We have developed a simple software application (Orion) using open-source tools to facilitate the identification and monitoring of resident and fellow discrepancies in on-call preliminary reports. Over a 6-month period, 19,200 on-call studies were interpreted by 20 radiology residents, and 13,953 on-call studies were interpreted by 25 board-certified radiology fellows representing eight subspecialties. Using standard review macros during faculty interpretation, each of these reports was classified as "agreement", "minor discrepancy", or "major discrepancy" based on its potential to impact patient management or outcome. Major discrepancy rates were used to establish benchmarks for resident and fellow performance by year of training, modality, and subspecialty, and to identify residents and fellows demonstrating a significantly higher major discrepancy rate than their classmates. Trends in discrepancies were used to identify subspecialty-specific areas of increased major discrepancy rates in an effort to tailor the didactic and case-based curriculum. A series of missed-case conferences was developed based on trends in discrepancies, and the impact of these conferences is currently being evaluated. Orion is a powerful information technology tool that can be used by residency program directors, fellowship program directors, residents, and fellows to improve radiology education and training.
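
    A minimal sketch of the discrepancy-rate bookkeeping such a tool performs; the record format, the flagging rule (more than twice the group mean) and the data are assumptions for illustration, not Orion's actual logic.

        from collections import Counter

        reports = [("res_a", "agreement"), ("res_a", "major discrepancy"),
                   ("res_b", "agreement"), ("res_b", "minor discrepancy")]

        totals, majors = Counter(), Counter()
        for trainee, verdict in reports:
            totals[trainee] += 1
            majors[trainee] += (verdict == "major discrepancy")

        rates = {t: majors[t] / totals[t] for t in totals}
        benchmark = sum(rates.values()) / len(rates)   # group mean rate
        flagged = [t for t, r in rates.items() if r > 2 * benchmark]
        print(rates, "flagged:", flagged)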

  5. Evaluation of Smartphone Inertial Sensor Performance for Cross-Platform Mobile Applications

    Science.gov (United States)

    Kos, Anton; Tomažič, Sašo; Umek, Anton

    2016-01-01

    Smartphone sensors are being increasingly used in mobile applications. The performance of sensors varies considerably among different smartphone models and the development of a cross-platform mobile application might be a very complex and demanding task. A publicly accessible resource containing real-life-situation smartphone sensor parameters could be of great help for cross-platform developers. To address this issue we have designed and implemented a pilot participatory sensing application for measuring, gathering, and analyzing smartphone sensor parameters. We start with smartphone accelerometer and gyroscope bias and noise parameters. The application database presently includes sensor parameters of more than 60 different smartphone models of different platforms. It is a modest, but important start, offering information on several statistical parameters of the measured smartphone sensors and insights into their performance. The next step, a large-scale cloud-based version of the application, is already planned. The large database of smartphone sensor parameters may prove particularly useful for cross-platform developers. It may also be interesting for individual participants who would be able to check up on and compare their smartphone sensors against a large number of similar or identical models. PMID:27049391

  6. Evaluation of Smartphone Inertial Sensor Performance for Cross-Platform Mobile Applications

    Directory of Open Access Journals (Sweden)

    Anton Kos

    2016-04-01

    Full Text Available Smartphone sensors are being increasingly used in mobile applications. The performance of sensors varies considerably among different smartphone models and the development of a cross-platform mobile application might be a very complex and demanding task. A publicly accessible resource containing real-life-situation smartphone sensor parameters could be of great help for cross-platform developers. To address this issue we have designed and implemented a pilot participatory sensing application for measuring, gathering, and analyzing smartphone sensor parameters. We start with smartphone accelerometer and gyroscope bias and noise parameters. The application database presently includes sensor parameters of more than 60 different smartphone models of different platforms. It is a modest, but important start, offering information on several statistical parameters of the measured smartphone sensors and insights into their performance. The next step, a large-scale cloud-based version of the application, is already planned. The large database of smartphone sensor parameters may prove particularly useful for cross-platform developers. It may also be interesting for individual participants who would be able to check up on and compare their smartphone sensors against a large number of similar or identical models.

  7. Associations between Otolaryngology Applicant Characteristics and Future Performance in Residency or Practice: A Systematic Review.

    Science.gov (United States)

    Bowe, Sarah N; Laury, Adrienne M; Gray, Stacey T

    2017-06-01

    Objective This systematic review aims to evaluate which applicant characteristics available to an otolaryngology selection committee are associated with future performance in residency or practice. Data Sources PubMed, Scopus, ERIC, Health Business, Psychology and Behavioral Sciences Collection, and SocINDEX. Review Methods Study eligibility screening was performed by 2 independent investigators in accordance with the PRISMA protocol (Preferred Reporting Items for Systematic Reviews and Meta-analyses). Data obtained from each article included research questions, study design, predictors, outcomes, statistical analysis, and results/findings. Study bias was assessed with the Quality in Prognosis Studies tool. Results The initial search identified 439 abstracts. Six articles fulfilled all inclusion and exclusion criteria. All studies were retrospective cohort studies (level 4). Overall, the studies yielded relatively few criteria that correlated with residency success, with generally conflicting results. Most studies were found to have a high risk of bias. Conclusion Previous resident selection research has lacked a theoretical background, predisposing this work to inconsistent results and a high risk of bias. The included studies provide historical insight into the predictors and criteria (eg, outcomes) previously deemed pertinent by the otolaryngology field. Additional research is needed, possibly integrating aspects of personnel selection, to engage in an evidence-based approach to identifying highly qualified candidates who will succeed as future otolaryngologists.

  8. MAPLE research reactor beam-tube performance

    International Nuclear Information System (INIS)

    Lee, A.G.; Lidstone, R.F.; Gillespie, G.E.

    1989-05-01

    Atomic Energy of Canada Limited (AECL) has been developing the MAPLE (Multipurpose Applied Physics Lattice Experimental) reactor concept as a medium-flux neutron source to meet contemporary research reactor applications. This paper gives a brief description of the MAPLE reactor and presents some results of computer simulations used to analyze the neutronic performance. The computer simulations were performed to identify how the MAPLE reactor may be adapted to beam-tube applications such as neutron radiography

  9. Development and application of the Safe Performance Index as a risk-based methodology for identifying major hazard-related safety issues in underground coal mines

    Science.gov (United States)

    Kinilakodi, Harisha

    The underground coal mining industry has been under constant watch due to the high risk involved in its activities, and scrutiny increased because of the disasters that occurred in 2006-07. In the aftermath of the incidents, the U.S. Congress passed the Mine Improvement and New Emergency Response Act of 2006 (MINER Act), which strengthened the existing regulations and mandated new laws to address the various issues related to a safe working environment in the mines. Risk analysis in any form should be done on a regular basis to tackle the possibility of unwanted major hazard-related events such as explosions, outbursts, airbursts, inundations, spontaneous combustion, and roof fall instabilities. One of the responses by the Mine Safety and Health Administration (MSHA) in 2007 involved a new pattern of violations (POV) process to target mines with a poor safety performance, specifically to improve their safety. However, the 2010 disaster (worst in 40 years) gave an impression that the collective effort of the industry, federal/state agencies, and researchers to achieve the goal of zero fatalities and serious injuries has gone awry. The Safe Performance Index (SPI) methodology developed in this research is a straight-forward, effective, transparent, and reproducible approach that can help in identifying and addressing some of the existing issues while targeting (poor safety performance) mines which need help. It combines three injury and three citation measures that are scaled to have an equal mean (5.0) in a balanced way with proportionate weighting factors (0.05, 0.15, 0.30) and overall normalizing factor (15) into a mine safety performance evaluation tool. It can be used to assess the relative safety-related risk of mines, including by mine-size category. Using 2008 and 2009 data, comparisons were made of SPI-associated, normalized safety performance measures across mine-size categories, with emphasis on small-mine safety performance as compared to large- and
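
    The abstract gives the SPI ingredients (three injury and three citation measures scaled to a common mean of 5.0, weighting factors 0.05/0.15/0.30, normalizing factor 15) but not the exact combination rule, so the following Python sketch assumes a weighted sum; treat it as a reading of the abstract, not the dissertation's formula.

        def safe_performance_index(injury, citation,
                                   weights=(0.05, 0.15, 0.30), norm=15.0):
            # injury, citation: three measures each, pre-scaled to mean 5.0
            weighted = sum(w * m for w, m in zip(weights, injury)) \
                     + sum(w * m for w, m in zip(weights, citation))
            return norm * weighted

        # A mine sitting exactly at the scaled mean on every measure:
        print(safe_performance_index((5.0, 5.0, 5.0), (5.0, 5.0, 5.0)))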

  10. 45 CFR 305.33 - Determination of applicable percentages based on performance levels.

    Science.gov (United States)

    2010-10-01

    ..., DEPARTMENT OF HEALTH AND HUMAN SERVICES PROGRAM PERFORMANCE MEASURES, STANDARDS, FINANCIAL INCENTIVES, AND PENALTIES § 305.33 Determination of applicable percentages based on performance levels. (a) A State's... performance levels. 305.33 Section 305.33 Public Welfare Regulations Relating to Public Welfare OFFICE OF...

  11. Performance of fertigation technique for phosphorus application in cotton

    Directory of Open Access Journals (Sweden)

    M. Aslam

    2009-05-01

    Full Text Available Low native soil phosphorus availability, coupled with poor utilization of added phosphorus, is one of the major constraints limiting crop productivity. To address this issue, field studies were conducted during 2005-2006 to compare the relative efficacy of broadcast and fertigation techniques for phosphorus application, using cotton as a test crop. Two methods of phosphorus application, i.e. broadcast and fertigation, were evaluated using five levels of P2O5 (0, 30, 45, 60 and 75 kg P2O5 ha-1). Fertigation showed an edge over the broadcast method at all levels of phosphorus application. The highest seed cotton yield was recorded with 75 kg P2O5 ha-1. Fertilizer phosphorus applied at the rate of 60 kg ha-1 through fertigation produced 3.4 tons ha-1 of seed cotton yield, which was statistically identical to the 3.3 tons recorded with 75 kg ha-1 of broadcast phosphorus. The agronomic performance of phosphorus was influenced considerably by the method of fertilizer application: the seed cotton yield per kg of fertigation phosphorus was 48% higher than with the corresponding broadcast application. The results of these studies showed that fertigation was the more efficient method of phosphorus application compared with conventional broadcast application of fertilizers.

  12. RAPPORT: running scientific high-performance computing applications on the cloud.

    Science.gov (United States)

    Cohen, Jeremy; Filippis, Ioannis; Woodbridge, Mark; Bauer, Daniela; Hong, Neil Chue; Jackson, Mike; Butcher, Sarah; Colling, David; Darlington, John; Fuchs, Brian; Harvey, Matt

    2013-01-28

    Cloud computing infrastructure is now widely used in many domains, but one area where there has been more limited adoption is research computing, in particular for running scientific high-performance computing (HPC) software. The Robust Application Porting for HPC in the Cloud (RAPPORT) project took advantage of existing links between computing researchers and application scientists in the fields of bioinformatics, high-energy physics (HEP) and digital humanities, to investigate running a set of scientific HPC applications from these domains on cloud infrastructure. In this paper, we focus on the bioinformatics and HEP domains, describing the applications and target cloud platforms. We conclude that, while there are many factors that need consideration, there is no fundamental impediment to the use of cloud infrastructure for running many types of HPC applications and, in some cases, there is potential for researchers to benefit significantly from the flexibility offered by cloud platforms.

  13. Business intelligence and performance management theory, systems and industrial applications

    CERN Document Server

    2013-01-01

    This book covers all the basic concepts of business intelligence and performance management including strategic support, business applications, methodologies and technologies from the field, and thoroughly explores the benefits, issues and challenges of each.

  14. Application essays and future performance in medical school: are they related?

    Science.gov (United States)

    Dong, Ting; Kay, Allen; Artino, Anthony R; Gilliland, William R; Waechter, Donna M; Cruess, David; DeZee, Kent J; Durning, Steven J

    2013-01-01

    There is a paucity of research on whether application essays are a valid indicator of medical students' future performance. The goal is to score medical school application essays systematically and examine the correlations between these essay scores and several indicators of student performance during medical school and internship. A journalist created a scoring rubric based on the journalism literature and scored 2 required essays of students admitted to our university in 1 year (N = 145). We picked 7 indicators of medical school and internship performance and correlated these measures with overall essay scores: preclinical medical school grade point average (GPA), clinical medical school GPA, cumulative medical school GPA, U.S. Medical Licensing Exam (USMLE) Step 1 and 2 scores, and scores on a program director's evaluation measuring intern professionalism and expertise. We then examined the Pearson and Spearman correlations between essay scores and the outcomes. Essay scores did not vary widely. American Medical College Application Service essay scores ranged from 3.3 to 4.5 (M = 4.11, SD = 0.15), and Uniformed Services University of the Health Sciences essay scores ranged from 2.9 to 4.5 (M = 4.09, SD = 0.17). None of the medical school or internship performance indicators was significantly correlated with the essay scores. These findings raise questions about the utility of matriculation essays, a resource-intensive admission requirement.

  15. Where to from here? Future applications of mental models of complex performance

    International Nuclear Information System (INIS)

    Hahn, H.A.; Nelson, W.R.; Blackman, H.S.

    1988-01-01

    The purpose of this paper is to raise issues for discussion regarding the applications of mental models in the study of complex performance. Applications for training, expert systems and decision aids, job selection, workstation design, and other complex environments are considered. 1 ref

  16. Predicting performance at medical school: can we identify at-risk students?

    Directory of Open Access Journals (Sweden)

    Shaban S

    2011-05-01

    Full Text Available Sami Shaban, Michelle McLean; Department of Medical Education, Faculty of Medicine and Health Sciences, United Arab Emirates University, Al Ain, United Arab Emirates. Background: The purpose of this study was to examine the predictive potential of multiple indicators (eg, preadmission scores, unit, module and clerkship grades, course and examination scores) on academic performance at medical school, with a view to identifying students at risk. Methods: An analysis was undertaken of medical student grades in a 6-year medical school program at the Faculty of Medicine and Health Sciences, United Arab Emirates University, Al Ain, United Arab Emirates, over the past 14 years. Results: While high school scores were significantly (P < 0.001) correlated with the final integrated examination, predictability was only 6.8%. Scores for the United Arab Emirates university placement assessment (Common Educational Proficiency Assessment) were only slightly more promising as predictors, with 14.9% predictability for the final integrated examination. Each unit or module in the first four years was highly correlated with the next unit or module, with 25%-60% predictability. Course examination scores (end of years 2, 4, and 6) were significantly correlated (P < 0.001) with the average scores in that 2-year period (59.3%, 64.8%, and 55.8% predictability, respectively). Final integrated examination scores were significantly correlated (P < 0.001) with National Board of Medical Examiners scores (35% predictability). Multivariate linear regression identified key grades with the greatest predictability of the final integrated examination score at three stages in the program. Conclusion: This study has demonstrated that it may be possible to identify "at-risk" students relatively early in their studies through continuous data archiving and regular analysis. The data analysis techniques used in this study are not unique to this institution. Keywords: at-risk students, grade
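
    The "predictability" percentages quoted above read as shared variance, i.e. the squared correlation times 100; on that assumption, a short Python sketch with synthetic scores reproduces a figure of the same order as the 6.8% reported for high school scores.

        import numpy as np

        def predictability_pct(x, y):
            # Percentage of variance in y shared with x (100 * r^2).
            r = np.corrcoef(x, y)[0, 1]
            return 100 * r ** 2

        rng = np.random.default_rng(4)
        high_school = rng.normal(size=300)
        final_exam = 0.26 * high_school + rng.normal(size=300)  # weak link
        print(f"{predictability_pct(high_school, final_exam):.1f}% predictability")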

  17. Performance of student software development teams: the influence of personality and identifying as team members

    Science.gov (United States)

    Monaghan, Conal; Bizumic, Boris; Reynolds, Katherine; Smithson, Michael; Johns-Boast, Lynette; van Rooy, Dirk

    2015-01-01

    One prominent approach in the exploration of the variations in project team performance has been to study two components of the aggregate personalities of the team members: conscientiousness and agreeableness. A second line of research, known as self-categorisation theory, argues that identifying as team members and the team's performance norms should substantially influence the team's performance. This paper explores the influence of both these perspectives in university software engineering project teams. Eighty students worked to complete a piece of software in small project teams during 2007 or 2008. To reduce limitations in statistical analysis, Monte Carlo simulation techniques were employed to extrapolate from the results of the original sample to a larger simulated sample (2043 cases, within 319 teams). The results emphasise the importance of taking into account personality (particularly conscientiousness), and both team identification and the team's norm of performance, in order to cultivate higher levels of performance in student software engineering project teams.

  18. Performance testing to identify climate-ready trees

    Science.gov (United States)

    E.Gregory McPherson; Alison M. Berry; Natalie S. van Doorn

    2018-01-01

    Urban forests produce ecosystem services that can benefit city dwellers, but are especially vulnerable to climate change stressors such as heat, drought, extreme winds and pests. Tree selection is an important decision point for managers wanting to transition to a more stable and resilient urban forest structure. This study describes a five-step process to identify and...

  19. Applications of Earth Remote Sensing for Identifying Tornado and Severe Weather Damage

    Science.gov (United States)

    Schultz, Lori; Molthan, Andrew; Burks, Jason E.; Bell, Jordan; McGrath, Kevin; Cole, Tony

    2016-01-01

    NASA SPoRT (Short-term Prediction Research and Transition Center) provided MODIS (Moderate Resolution Imaging Spectrometer) and ASTER (Advanced Spaceborne Thermal Emission and Reflection Radiometer) imagery to WFOs (Weather Forecast Offices) in Alabama to support damage assessments across the state after the April 27, 2011 severe weather outbreak. SPoRT was awarded a NASA Applied Science: Disasters feasibility award to investigate the applicability of including remote sensing imagery and derived products in the NOAA/NWS (National Oceanic and Atmospheric Administration/National Weather Service) Damage Assessment Toolkit (DAT). The proposal team was awarded the 3-year project to implement a web mapping service and associated data feeds from the USGS (U.S. Geological Survey) to provide satellite imagery and derived products directly to the NWS through the DAT. In the United States, NOAA/NWS is charged with performing damage assessments when storm or tornado damage is suspected after a severe weather event. This has led to the development of the Damage Assessment Toolkit (DAT), an application for smartphones, tablets, and web browsers that allows for the collection, geo-location, and aggregation of various damage indicators collected during storm surveys.

  20. Performance of modified blood pressure-to-height ratio for identifying hypertension in Chinese and American children.

    Science.gov (United States)

    Zhang, Yuanyuan; Ma, Chuanwei; Yang, Lili; Bovet, Pascal; Xi, Bo

    2018-04-06

    Blood pressure-to-height ratio (BPHR) has been reported to perform well for identifying hypertension (HTN) in adolescents but not in young children. Our study aimed to evaluate the performance of BPHR and modified BPHR (MBPHR) for screening HTN in children. A total of 5268 Chinese children (boys: 53.1%) aged 6-12 years and 5024 American children (boys: 48.1%) aged 8-12 years were included in the present study. BPHR was calculated as BP/height (mmHg/cm). MBPHR7 was calculated as BP/(height + 7*(13-age)). MBPHR3 was calculated as BP/(height + 3*(13-age)). We used receiver-operating characteristic curve analysis to assess the performance of the three ratios for identifying HTN in children, with the 2017 U.S. clinical guideline as the "gold standard". The prevalence of HTN in Chinese and American children was 9.4% and 5.4%, respectively, based on the 2017 U.S. guideline. The area under the curve (AUC) was larger for MBPHR3 than for BPHR and MBPHR7. All three ratios had optimal negative predictive value (~100%). The positive predictive value (PPV) was higher for MBPHR3 than for BPHR in both Chinese (43.9% vs. 37.9%) and American (39.1% vs. 26.3%) children. In contrast, the PPV was higher for MBPHR7 than for BPHR in Chinese children (47.4% vs. 37.9%) but not in American children (24.8% vs. 26.3%). In summary, MBPHR3 overall performed better than MBPHR7 and BPHR for identifying HTN in children. However, all three ratios had low PPV (<50%) as compared to the 2017 U.S. guideline, which makes them of limited use for HTN screening in children.
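
    The three ratios are defined explicitly in the abstract, so they translate directly into a short Python sketch; the example values below are invented.

        def bphr(bp_mmHg, height_cm):
            return bp_mmHg / height_cm                      # mmHg/cm

        def mbphr7(bp_mmHg, height_cm, age_years):
            return bp_mmHg / (height_cm + 7 * (13 - age_years))

        def mbphr3(bp_mmHg, height_cm, age_years):
            return bp_mmHg / (height_cm + 3 * (13 - age_years))

        # Example: a 9-year-old, 135 cm tall, systolic BP 112 mmHg.
        print(round(bphr(112, 135), 3),
              round(mbphr3(112, 135, 9), 3),
              round(mbphr7(112, 135, 9), 3))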

  1. MaMR: High-performance MapReduce programming model for material cloud applications

    Science.gov (United States)

    Jing, Weipeng; Tong, Danyu; Wang, Yangang; Wang, Jingyuan; Liu, Yaqiu; Zhao, Peng

    2017-02-01

    With the increasing data size in materials science, existing programming models no longer satisfy the application requirements. MapReduce is a programming model that enables the easy development of scalable parallel applications to process big data on cloud computing systems. However, this model does not directly support the processing of multiple related datasets, and its processing performance does not reflect the advantages of cloud computing. To enhance the capability of workflow applications in material data processing, we defined a programming model for material cloud applications, called MaMR, that supports multiple different Map and Reduce functions running concurrently, based on a hybrid shared-memory BSP model. An optimized data-sharing strategy to supply the shared data to the different Map and Reduce stages was also designed. We added a new merge phase to MapReduce that can efficiently merge data from the map and reduce modules. Experiments showed that the model and framework deliver effective performance improvements compared to previous work.
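
    The dataflow MaMR describes (independent map/reduce chains whose outputs meet in a merge phase) can be illustrated with a toy Python sketch; this shows only the shape of the pipeline, not the framework's shared-memory BSP machinery.

        from collections import defaultdict

        def map_reduce(records, map_fn, reduce_fn):
            # Group mapped (key, value) pairs, then reduce each group.
            groups = defaultdict(list)
            for rec in records:
                for key, value in map_fn(rec):
                    groups[key].append(value)
            return {k: reduce_fn(vs) for k, vs in groups.items()}

        lines = ["a b", "b c"]
        counts = map_reduce(lines, lambda l: [(w, 1) for w in l.split()], sum)
        sizes = map_reduce(lines, lambda l: [(w, len(w)) for w in l.split()], max)
        merged = {k: (counts[k], sizes.get(k)) for k in counts}  # merge phase
        print(merged)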

  2. Identifying Dyscalculia Symptoms Related to Magnocellular Reasoning Using Smartphones.

    Science.gov (United States)

    Knudsen, Greger Siem; Babic, Ankica

    2016-01-01

    This paper presents a study that developed a mobile software application for assisting diagnosis of learning disabilities in mathematics, called dyscalculia, and for measuring correlations between dyscalculia symptoms and magnocellular reasoning. Software aids for dyscalculic individuals are usually focused on both assisting diagnosis and teaching the material; the software developed in this study, however, maintains a specific focus on the former, and in the process attempts to capture alleged correlations between dyscalculia symptoms and possible underlying causes of the condition. Classification of symptoms is performed by a k-Nearest Neighbor algorithm classifying five parameters that evaluate the user's skills, returning the calculated performance in each category as well as the correlation strength between detected symptoms and magnocellular reasoning abilities. Expert evaluations have found the application to be appropriate and productive for its intended purpose, showing that mobile software is a suitable and valuable tool for assisting dyscalculia diagnosis and identifying root causes of developing the condition.
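
    A hedged sketch of the classification step, using scikit-learn's KNeighborsClassifier on five skill scores; the training data, class rule and choice of k are invented, since the record does not give them.

        import numpy as np
        from sklearn.neighbors import KNeighborsClassifier

        rng = np.random.default_rng(1)
        X_train = rng.uniform(0, 100, size=(60, 5))  # five skill parameters
        y_train = (X_train.mean(axis=1) < 45).astype(int)  # 1 = symptoms

        knn = KNeighborsClassifier(n_neighbors=5)
        knn.fit(X_train, y_train)

        user_scores = [[38, 52, 41, 30, 47]]
        print("symptom class:", knn.predict(user_scores)[0],
              "P(symptoms):", knn.predict_proba(user_scores)[0][1])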

  3. Developing and testing an instrument for identifying performance incentives in the Greek health care sector.

    Science.gov (United States)

    Paleologou, Victoria; Kontodimopoulos, Nick; Stamouli, Aggeliki; Aletras, Vassilis; Niakas, Dimitris

    2006-09-13

    In the era of cost containment, managers are constantly pursuing increased organizational performance and productivity by aiming at the obvious target, i.e. the workforce. The health care sector, in which production processes are more complicated compared to other industries, is not an exception. In light of recent legislation in Greece in which efficiency improvement and achievement of specific performance targets are identified as undisputable health system goals, the purpose of this study was to develop a reliable and valid instrument for investigating the attitudes of Greek physicians, nurses and administrative personnel towards job-related aspects, and the extent to which these motivate them to improve performance and increase productivity. A methodological exploratory design was employed in three phases: a) content development and assessment, which resulted in a 28-item instrument, b) pilot testing (N = 74) and c) field testing (N = 353). Internal consistency reliability was tested via Cronbach's alpha coefficient and factor analysis was used to identify the underlying constructs. Tests of scaling assumptions, according to the Multitrait-Multimethod Matrix, were used to confirm the hypothesized component structure. Four components, referring to intrinsic individual needs and external job-related aspects, were revealed and explain 59.61% of the variability. They were subsequently labeled: job attributes, remuneration, co-workers and achievement. Nine items not meeting item-scale criteria were removed, resulting in a 19-item instrument. Scale reliability ranged from 0.782 to 0.901 and internal item consistency and discriminant validity criteria were satisfied. Overall, the instrument appears to be a promising tool for hospital administrations in their attempt to identify job-related factors which motivate their employees. The psychometric properties were good and warrant administration to a larger sample of employees in the Greek healthcare system.
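
    Cronbach's alpha, used above for the reliability testing, has a standard closed form: alpha = k/(k-1) * (1 - sum of item variances / variance of the summed scale). A self-contained Python sketch on synthetic item data:

        import numpy as np

        def cronbach_alpha(items: np.ndarray) -> float:
            # rows = respondents, columns = items of one scale
            k = items.shape[1]
            item_vars = items.var(axis=0, ddof=1).sum()
            total_var = items.sum(axis=1).var(ddof=1)
            return (k / (k - 1)) * (1 - item_vars / total_var)

        rng = np.random.default_rng(2)
        latent = rng.normal(size=(353, 1))                # shared construct
        items = latent + 0.8 * rng.normal(size=(353, 5))  # 5 related items
        print(round(cronbach_alpha(items), 3))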

  4. Developing and testing an instrument for identifying performance incentives in the Greek health care sector

    Directory of Open Access Journals (Sweden)

    Paleologou Victoria

    2006-09-01

    Full Text Available Abstract Background In the era of cost containment, managers are constantly pursuing increased organizational performance and productivity by aiming at the obvious target, i.e. the workforce. The health care sector, in which production processes are more complicated compared to other industries, is not an exception. In light of recent legislation in Greece in which efficiency improvement and achievement of specific performance targets are identified as undisputable health system goals, the purpose of this study was to develop a reliable and valid instrument for investigating the attitudes of Greek physicians, nurses and administrative personnel towards job-related aspects, and the extent to which these motivate them to improve performance and increase productivity. Methods A methodological exploratory design was employed in three phases: (a) content development and assessment, which resulted in a 28-item instrument, (b) pilot testing (N = 74) and (c) field testing (N = 353). Internal consistency reliability was tested via Cronbach's alpha coefficient and factor analysis was used to identify the underlying constructs. Tests of scaling assumptions, according to the Multitrait-Multimethod Matrix, were used to confirm the hypothesized component structure. Results Four components, referring to intrinsic individual needs and external job-related aspects, were revealed and explain 59.61% of the variability. They were subsequently labeled: job attributes, remuneration, co-workers and achievement. Nine items not meeting item-scale criteria were removed, resulting in a 19-item instrument. Scale reliability ranged from 0.782 to 0.901 and internal item consistency and discriminant validity criteria were satisfied. Conclusion Overall, the instrument appears to be a promising tool for hospital administrations in their attempt to identify job-related factors, which motivate their employees. The psychometric properties were good and warrant administration to a larger

  5. Wireless sensor network performance metrics for building applications

    Energy Technology Data Exchange (ETDEWEB)

    Jang, W.S. (Department of Civil Engineering Yeungnam University 214-1 Dae-Dong, Gyeongsan-Si Gyeongsangbuk-Do 712-749 South Korea); Healy, W.M. [Building and Fire Research Laboratory, 100 Bureau Drive, Gaithersburg, MD 20899-8632 (United States)

    2010-06-15

    Metrics are investigated to help assess the performance of wireless sensors in buildings. Wireless sensor networks present tremendous opportunities for energy savings and improvement in occupant comfort in buildings by making data about conditions and equipment more readily available. A key barrier to their adoption, however, is the uncertainty among users regarding the reliability of the wireless links through building construction. Tests were carried out that examined three performance metrics as a function of transmitter-receiver separation distance, transmitter power level, and obstruction type. These tests demonstrated, via the packet delivery rate, a clear transition from reliable to unreliable communications at different separation distances. While the packet delivery rate is difficult to measure in actual applications, the received signal strength indication correlated well with the drop in packet delivery rate in the relatively noise-free environment used in these tests. The concept of an equivalent distance was introduced to translate the range of reliability in open field operation to that seen in a typical building, thereby providing wireless system designers a rough estimate of the necessary spacing between sensor nodes in building applications. It is anticipated that the availability of straightforward metrics on the range of wireless sensors in buildings will enable more widespread sensing in buildings for improved control and fault detection. (author)

  6. Summary report on the FHWA LTBP Workshop to identify bridge substructure performance issues : March 4-6, 2010, in Orlando, FL.

    Science.gov (United States)

    2013-01-01

    The Long-Term Bridge Performance (LTBP) program was created to identify, collect, and analyze research-quality data on the most critical aspects of bridge performance. To complete a thorough investigation of bridge performance issues, the Federal ...

  7. The circular electrical mobility spectrometer; theory, performances and applications

    International Nuclear Information System (INIS)

    Mesbah, Boualem

    1995-04-01

    A new type of electrical mobility spectrometer (S.M.E.C.) has been designed in the Service d'Etudes et de Recherches en Aerocontamination et en Confinement (CEA) laboratories. It differs from classical electrical mobility spectrometers in its planar circular geometry and its radial flow, which offers some advantages and the possibility of new applications. The theories that we derive for the different versions of this device are confirmed by experimental results obtained using aerosol particles of known electrical mobility. The S.M.E.C.'s performance is tested in several applications: controlled surface contamination, monodisperse aerosol production, and fine and ultrafine aerosol sizing. (author) [fr

  8. BurstMem: A High-Performance Burst Buffer System for Scientific Applications

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Teng [Auburn University, Auburn, Alabama; Oral, H Sarp [ORNL; Wang, Yandong [Auburn University, Auburn, Alabama; Settlemyer, Bradley W [ORNL; Atchley, Scott [ORNL; Yu, Weikuan [Auburn University, Auburn, Alabama

    2014-01-01

    The growth of computing power on large-scale systems requires commensurate high-bandwidth I/O systems. Many parallel file systems are designed to provide fast, sustainable I/O in response to applications' soaring requirements. To meet this need, a novel system is imperative to temporarily buffer the bursty I/O and gradually flush datasets to long-term parallel file systems. In this paper, we introduce the design of BurstMem, a high-performance burst buffer system. BurstMem provides a storage framework with efficient storage and communication management strategies. Our experiments demonstrate that BurstMem is able to speed up the I/O performance of scientific applications by up to 8.5 times on leadership computer systems.

  9. A Fast Optimization Method for Reliability and Performance of Cloud Services Composition Application

    Directory of Open Access Journals (Sweden)

    Zhao Wu

    2013-01-01

    Full Text Available At present, cloud computing is one of the newest trends in distributed computation and is propelling another important revolution in the software industry. Cloud services composition is one of the key techniques in software development, and the optimization of reliability and performance for cloud services composition applications, a typical stochastic optimization problem, is confronted with severe challenges due to its randomness and long transactions, as well as characteristics of cloud computing resources such as openness and dynamism. Traditional reliability and performance optimization techniques, for example Markov models and state space analysis, have defects such as being too time-consuming, being prone to state-space explosion, and failing to satisfy the assumption of component execution independence. To overcome these defects, we propose a fast optimization method for the reliability and performance of cloud services composition applications based on the universal generating function and a genetic algorithm. First, a reliability and performance model for cloud service composition applications based on multi-state system theory is presented. Then a reliability and performance definition based on the universal generating function is proposed. Based on this, a fast reliability and performance optimization algorithm is presented. Finally, illustrative examples are given.
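
    The universal generating function idea is concise enough to sketch: each service is a distribution over (performance, probability) states, and composition operators combine distributions. The services, states and min-throughput structure function below are illustrative assumptions, not the paper's model.

        from itertools import product

        def compose(u1, u2, op):
            # Combine two UGFs under a structure function op.
            out = {}
            for (g1, p1), (g2, p2) in product(u1.items(), u2.items()):
                g = op(g1, g2)
                out[g] = out.get(g, 0.0) + p1 * p2
            return out

        a = {100: 0.9, 0: 0.1}       # service A: throughput 100 w.p. 0.9
        b = {80: 0.95, 0: 0.05}      # service B
        series = compose(a, b, min)  # sequential composition
        print(series, "P(throughput >= 80):",
              sum(p for g, p in series.items() if g >= 80))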

  10. Correlation of Behavioral Interviewing Performance With Obstetrics and Gynecology Residency Applicant Characteristics.

    Science.gov (United States)

    Breitkopf, Daniel M; Vaughan, Lisa E; Hopkins, Matthew R

    To determine which individual residency applicant characteristics were associated with improved performance on standardized behavioral interviews. Behavioral interviewing has become a common technique for assessing resident applicants, yet few data exist on factors that predict success during the behavioral interview component of the residency application process. Interviewers were trained in behavioral interviewing techniques before each application season, and standardized questions were used. Behavioral interview scores and Electronic Residency Application Service data from residency applicants were collected prospectively for 3 years at the Accreditation Council for Graduate Medical Education-accredited obstetrics-gynecology residency program of a Midwestern academic medical center. Medical students applying to this single obstetrics-gynecology residency program from 2012 to 2014 participated in the study. Data were collected from 104 applicants during 3 successive interview seasons. Applicants' age was associated with higher overall scores on questions about leadership, coping, and conflict management (for applicants aged ≤25, 26-27, or ≥28 years, mean scores were 15.2, 16.0, and 17.2, respectively; p = 0.03), as was a history of employment before medical school (16.8 vs 15.5; p = 0.03). Applicants who participated in collegiate team sports scored lower on questions assessing influence/persuasion, initiative, and relationship management compared with those who did not (mean, 15.5 vs 17.1; p = 0.02). Advanced applicant age and a history of work experience before medical school may improve skills in dealing with difficult situations and offer opportunities in leadership. In the behavioral interview format, having relevant examples from life experience to share during the interviews may improve the quality of the applicant's responses. Increased awareness of the factors predicting interview performance helps inform the selection process and allows program directors to

  11. Can Medical School Performance Predict Residency Performance? Resident Selection and Predictors of Successful Performance in Obstetrics and Gynecology

    Science.gov (United States)

    Stohl, Hindi E.; Hueppchen, Nancy A.; Bienstock, Jessica L.

    2010-01-01

    Background: During the evaluation process, Residency Admissions Committees typically gather data on objective and subjective measures of a medical student's performance through the Electronic Residency Application Service, including medical school grades, standardized test scores, research achievements, nonacademic accomplishments, letters of recommendation, the dean's letter, and personal statements. Using these data to identify which medical students are likely to become successful residents in an academic residency program in obstetrics and gynecology is difficult and, to date, not well studied. Objective: To determine whether objective information in medical students' applications can help predict resident success. Method: We performed a retrospective cohort study of all residents who matched into the Johns Hopkins University residency program in obstetrics and gynecology between 1994 and 2004 and entered the program through the National Resident Matching Program as postgraduate year-1 residents. Residents were independently evaluated by faculty and ranked in 4 groups according to perceived level of success. Applications from residents in the highest and lowest groups were abstracted. Groups were compared using the Fisher exact test and the Student t test. Results: Seventy-five residents met inclusion criteria and 29 residents were ranked in the highest and lowest quartiles (15 in the highest, 14 in the lowest). Univariate analysis identified no variables as consistent predictors of resident success. Conclusion: In a program designed to train academic obstetrician-gynecologists, objective data from medical students' applications did not correlate with successful resident performance in our obstetrics-gynecology residency program. We need to continue our search for evaluation criteria that can accurately and reliably select the medical students best suited for our specialty. PMID:21976076

  12. Optical packet switching in HPC : an analysis of applications performance

    NARCIS (Netherlands)

    Meyer, Hugo; Sancho, Jose Carlos; Mrdakovic, Milica; Miao, Wang; Calabretta, Nicola

    2018-01-01

    Optical Packet Switches (OPS) could provide the low-latency transmissions needed in today's large data centers. OPS can deliver lower latency and higher bandwidth than traditional electrical switches. These features are needed for parallel High Performance Computing (HPC) applications. For this

  13. Novel Application of Statistical Methods to Identify New Urinary Incontinence Risk Factors

    Directory of Open Access Journals (Sweden)

    Theophilus O. Ogunyemi

    2012-01-01

    Full Text Available Longitudinal data for studying urinary incontinence (UI) risk factors are rare. Data from one study, the hallmark Medical, Epidemiological, and Social Aspects of Aging (MESA) study, have been analyzed in the past; however, repeated-measures analyses that are crucial for analyzing longitudinal data have not been applied. We tested a novel application of statistical methods to identify UI risk factors in older women. MESA data were collected at baseline and yearly from a sample of 1955 men and women in the community. Only women responding to the 762 baseline questions and the 559 follow-up questions at one year in each respective survey were examined. To test their utility in mining large data sets, and as a preliminary step toward creating a predictive index for developing UI, logistic regression, generalized estimating equation (GEE), and proportional hazards regression (PHREG) methods were applied to the existing MESA data. The combination of GEE and PHREG identified 15 significant risk factors associated with developing UI; six of these, namely urinary frequency, urgency, any urine loss, urine loss after emptying, subject's anticipation, and doctor's proactivity, were found most highly significant by both methods. These six factors are potential candidates for constructing a future UI predictive index.
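
    The repeated-measures modelling described above can be sketched with the GEE implementation in statsmodels; the data below are simulated and the predictor names merely echo two of the factors listed, so this is an illustration of the method rather than a reproduction of the MESA analysis.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical longitudinal data: one row per subject per yearly survey wave.
rng = np.random.default_rng(0)
n, waves = 200, 3
df = pd.DataFrame({
    "subject": np.repeat(np.arange(n), waves),
    "frequency": rng.integers(0, 2, n * waves),
    "urgency": rng.integers(0, 2, n * waves),
})
logit = -2.0 + 1.2 * df["frequency"] + 0.8 * df["urgency"]
df["ui"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

# GEE with an exchangeable working correlation accounts for repeated
# measures on the same subject, unlike plain logistic regression.
model = smf.gee("ui ~ frequency + urgency", groups="subject", data=df,
                family=sm.families.Binomial(),
                cov_struct=sm.cov_struct.Exchangeable())
print(model.fit().summary())
```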

  14. Quality Analysis of Mobile Applications

    Directory of Open Access Journals (Sweden)

    Ion IVAN

    2011-01-01

    Full Text Available Mobile applications are defined and different types of mobile applications are identified. Quality characteristics are defined and indicators are constructed to measure their levels. Eleven analysis parameters for mobile applications are taken into account and arranged using weights, and the system of weights is analyzed in detail. For the SMSEncrypt application, performance measurement is done using an aggregate indicator based on the obtained system of weights.
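
    How an aggregate indicator is computed from a system of weights can be shown in a few lines; the quality characteristics, scores, and weights below are invented for illustration and are not the paper's eleven parameters.

```python
# A minimal sketch of a weighted aggregate quality indicator; names and
# values are illustrative assumptions, not taken from the paper.
quality_scores = {          # each indicator normalized to [0, 1]
    "reliability": 0.92,
    "usability": 0.78,
    "security": 0.85,
    "response_time": 0.70,
}
weights = {
    "reliability": 0.35,
    "usability": 0.20,
    "security": 0.30,
    "response_time": 0.15,
}
assert abs(sum(weights.values()) - 1.0) < 1e-9  # weights must sum to 1

aggregate = sum(quality_scores[k] * weights[k] for k in quality_scores)
print(f"Aggregate quality indicator: {aggregate:.3f}")
```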

  15. Predicting Resident Performance from Preresidency Factors: A Systematic Review and Applicability to Neurosurgical Training.

    Science.gov (United States)

    Zuckerman, Scott L; Kelly, Patrick D; Dewan, Michael C; Morone, Peter J; Yengo-Kahn, Aaron M; Magarik, Jordan A; Baticulon, Ronnie E; Zusman, Edie E; Solomon, Gary S; Wellons, John C

    2018-02-01

    Neurosurgical educators strive to identify the best applicants, yet formal study of resident selection has proved difficult. We conducted a systematic review to answer the following question: What objective and subjective preresidency factors predict resident success? PubMed, ProQuest, Embase, and the CINAHL databases were queried from 1952 to 2015 for literature reporting the impact of preresidency factors (PRFs) on outcomes of residency success (RS), among neurosurgery and all surgical subspecialties. Due to heterogeneity of specialties and outcomes, a qualitative summary and heat map of significant findings were constructed. From 1489 studies, 21 articles met inclusion criteria, which evaluated 1276 resident applicants across five surgical subspecialties. No neurosurgical studies met the inclusion criteria. Common objective PRFs included standardized testing (76%), medical school performance (48%), and Alpha Omega Alpha (43%). Common subjective PRFs included aggregate rank scores (57%), letters of recommendation (38%), research (33%), interviews (19%), and athletic or musical talent (19%). Outcomes of RS included faculty evaluations, in-training/board exams, chief resident status, and research productivity. Among objective factors, standardized test scores correlated well with in-training/board examinations but poorly correlated with faculty evaluations. Among subjective factors, aggregate rank scores, letters of recommendation, and athletic or musical talent demonstrated moderate correlation with faculty evaluations. Standardized testing most strongly correlated with future examination performance but correlated poorly with faculty evaluations. Moderate predictors of faculty evaluations were aggregate rank scores, letters of recommendation, and athletic or musical talent. The ability to predict success of neurosurgical residents using an evidence-based approach is limited, and few factors have correlated with future resident performance. Given the importance of

  16. Performance and applications of a μ-TPC

    International Nuclear Information System (INIS)

    Miuchi, Kentaro; Kubo, Hidetoshi; Nagayoshi, Tsutomu; Okada, Yoko; Orito, Reiko; Takada, Atsushi; Takeda, Atsushi; Tanimori, Toru; Ueno, Masaru; Bouianov, Oleg; Bouianov, Marina

    2004-01-01

    A μ-TPC, a time projection chamber (TPC) which can detect fine three-dimensional tracks of charged particles, was developed and its performance was measured. We developed a μ-TPC with a detection volume of 10 × 10 × 10 cm³ based on a novel two-dimensional imaging gaseous detector, the μ-PIC. Fine tracks of charged particles with large energy depositions (protons and electrons) were detected with the μ-TPC. With a pressurized gas, tracks of minimum ionizing particles were detected. We demonstrated the principle of its application as a time-resolved neutron imaging detector.

  17. Distributed dynamic simulations of networked control and building performance applications

    NARCIS (Netherlands)

    Yahiaoui, Azzedine

    2018-01-01

    The use of computer-based automation and control systems for smart sustainable buildings, often called Automated Buildings (ABs), has become an effective way to automatically control, optimize, and supervise a wide range of building performance applications over a network while achieving the

  18. The use of application-specific performance targets and engineering considerations to guide hydrogen storage materials development

    Energy Technology Data Exchange (ETDEWEB)

    Stetson, Ned T., E-mail: ned.stetson@ee.doe.gov [U.S. Department of Energy, 1000 Independence Ave., SW, EE-2H, Washington, DC 20585 (United States); Ordaz, Grace; Adams, Jesse; Randolph, Katie [U.S. Department of Energy, 1000 Independence Ave., SW, EE-2H, Washington, DC 20585 (United States); McWhorter, Scott [Savannah River National Laboratory, Aiken, SC 29808 (United States)

    2013-12-15

    Highlights: •Portable power and material handling equipment as early market technology pathways. •Engineering-based system-level storage-materials requirements. •Application-based targets. -- Abstract: The Hydrogen and Fuel Cells Technologies Office, part of the DOE Office of Energy Efficiency and Renewable Energy, maintains a broad portfolio of activities to enable the commercialization of fuel cells across a range of near-, mid- and long-term applications. Improved, advanced hydrogen storage technologies are seen as a critical need for the successful implementation of hydrogen fuel cells in many of these applications. To guide and focus materials development efforts, the DOE develops system performance targets for the specific applications of interest, and carries out systems engineering analyses to determine the system-level performance delivered when the materials are incorporated into a complete system. To meet the needs of applications, it is important to consider the system-level performance, not just the material-level properties. An overview of the DOE's hydrogen storage efforts in developing application-specific performance targets and systems engineering to guide hydrogen storage materials identification and development is herein provided.

  19. The use of application-specific performance targets and engineering considerations to guide hydrogen storage materials development

    International Nuclear Information System (INIS)

    Stetson, Ned T.; Ordaz, Grace; Adams, Jesse; Randolph, Katie; McWhorter, Scott

    2013-01-01

    Highlights: •Portable power and material handling equipment as early market technology pathways. •Engineering-based system-level storage-materials requirements. •Application-based targets. -- Abstract: The Hydrogen and Fuel Cells Technologies Office, part of the DOE Office of Energy Efficiency and Renewable Energy, maintains a broad portfolio of activities to enable the commercialization of fuel cells across a range of near-, mid- and long-term applications. Improved, advanced hydrogen storage technologies are seen as a critical need for the successful implementation of hydrogen fuel cells in many of these applications. To guide and focus materials development efforts, the DOE develops system performance targets for the specific applications of interest, and carries out systems engineering analyses to determine the system-level performance delivered when the materials are incorporated into a complete system. To meet the needs of applications, it is important to consider the system-level performance, not just the material-level properties. An overview of the DOE's hydrogen storage efforts in developing application-specific performance targets and systems engineering to guide hydrogen storage materials identification and development is herein provided.

  20. Application of gene network analysis techniques identifies AXIN1/PDIA2 and endoglin haplotypes associated with bicuspid aortic valve.

    Directory of Open Access Journals (Sweden)

    Eric C Wooten

    2010-01-01

    Full Text Available Bicuspid Aortic Valve (BAV) is a highly heritable congenital heart defect. The low frequency of BAV (1% of the general population) limits our ability to perform genome-wide association studies. We present the application of four a priori SNP selection techniques, reducing the multiple-testing penalty by restricting analysis to SNPs relevant to BAV in a genome-wide SNP dataset from a cohort of 68 BAV probands and 830 control subjects. Two knowledge-based approaches, CANDID and STRING, were used to systematically identify BAV genes, and their SNPs, from the published literature, microarray expression studies and a genome scan. We additionally tested Functionally Interpolating SNPs (fitSNPs) present on the array; the fourth approach consisted of SNPs selected by Random Forests, a machine learning approach. These approaches reduced the multiple-testing penalty by lowering the fraction of the genome probed to 0.19% of the total, while increasing the likelihood of studying SNPs within relevant BAV genes and pathways. Three loci were identified by CANDID, STRING, and fitSNPs. A haplotype within the AXIN1-PDIA2 locus (p-value of 2.926×10⁻⁶) and a haplotype within the Endoglin gene (p-value of 5.881×10⁻⁴) were found to be strongly associated with BAV. The Random Forests approach identified a SNP on chromosome 3 in association with BAV (p-value 5.061×10⁻⁶). The results presented here support an important role for genetic variants in BAV and provide support for additional studies in well-powered cohorts. Further, these studies demonstrate that leveraging existing expression and genomic data in the context of GWAS can identify biologically relevant genes and pathways associated with a congenital heart defect.
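
    The case-control association testing and the reduced multiple-testing penalty described above can be sketched as follows; the allele counts and the number of selected SNPs are fabricated for illustration and do not come from the BAV cohort.

```python
# A minimal sketch of a case-control SNP association test of the kind used
# after a priori SNP selection; counts below are hypothetical.
from scipy.stats import chi2_contingency

# Allele-count table: rows = cases/controls, cols = risk/other allele.
table = [[52, 84],     # 68 probands -> 136 alleles (hypothetical split)
         [390, 1270]]  # 830 controls -> 1660 alleles (hypothetical split)

chi2, p_value, dof, _ = chi2_contingency(table)
print(f"chi2={chi2:.2f}, dof={dof}, p={p_value:.3e}")

# Restricting tests to a priori-selected SNPs shrinks the Bonferroni
# penalty: e.g., 0.05 / 2,000 selected SNPs instead of 0.05 / 1,000,000.
print(f"Bonferroni threshold: {0.05 / 2000:.1e}")
```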

  1. Routing performance analysis and optimization within a massively parallel computer

    Science.gov (United States)

    Archer, Charles Jens; Peters, Amanda; Pinnow, Kurt Walter; Swartz, Brent Allen

    2013-04-16

    An apparatus, program product and method optimize the operation of a massively parallel computer system by, in part, receiving actual performance data concerning an application executed by the plurality of interconnected nodes, and analyzing the actual performance data to identify an actual performance pattern. A desired performance pattern may be determined for the application, and an algorithm may be selected from among a plurality of algorithms stored within a memory, the algorithm being configured to achieve the desired performance pattern based on the actual performance data.

  2. DEVICE TECHNOLOGY. Nanomaterials in transistors: From high-performance to thin-film applications.

    Science.gov (United States)

    Franklin, Aaron D

    2015-08-14

    For more than 50 years, silicon transistors have been continuously shrunk to meet the projections of Moore's law but are now reaching fundamental limits on speed and power use. With these limits at hand, nanomaterials offer great promise for improving transistor performance and adding new applications through the coming decades. With different transistors needed in everything from high-performance servers to thin-film display backplanes, it is important to understand the targeted application needs when considering new material options. Here the distinction between high-performance and thin-film transistors is reviewed, along with the benefits and challenges to using nanomaterials in such transistors. In particular, progress on carbon nanotubes, as well as graphene and related materials (including transition metal dichalcogenides and X-enes), outlines the advances and further research needed to enable their use in transistors for high-performance computing, thin films, or completely new technologies such as flexible and transparent devices. Copyright © 2015, American Association for the Advancement of Science.

  3. Thoughts on identifiers

    CERN Multimedia

    CERN. Geneva

    2005-01-01

    As business processes and information transactions have become inextricably intertwined with the Web, the importance of assignment, registration, discovery, and maintenance of identifiers has increased. In spite of this, integrated frameworks for managing identifiers have been slow to emerge. Instead, identification systems arise (quite naturally) from immediate business needs without consideration for how they fit into larger information architectures. In addition, many legacy identifier systems further complicate the landscape, making it difficult for content managers to select and deploy identifier systems that meet both the business case and long-term information management objectives. This presentation will outline a model for evaluating identifier applications and the functional requirements of the systems necessary to support them. The model is based on a layered analysis of the characteristics of identifier systems, including: * Functional characteristics * Technology * Policy * Business * Social T...

  4. Teaching assistants’ performance at identifying common introductory student difficulties in mechanics revealed by the Force Concept Inventory

    Directory of Open Access Journals (Sweden)

    Alexandru Maries

    2016-05-01

    Full Text Available The Force Concept Inventory (FCI has been widely used to assess student understanding of introductory mechanics concepts by a variety of educators and physics education researchers. One reason for this extensive use is that many of the items on the FCI have strong distractor choices which correspond to students’ alternate conceptions in mechanics. Instruction is unlikely to be effective if instructors do not know the common alternate conceptions of introductory physics students and explicitly take into account students’ initial knowledge states in their instructional design. Here, we discuss research involving the FCI to evaluate one aspect of the pedagogical content knowledge of teaching assistants (TAs: knowledge of introductory student alternate conceptions in mechanics as revealed by the FCI. For each item on the FCI, the TAs were asked to identify the most common incorrect answer choice of introductory physics students. This exercise was followed by a class discussion with the TAs related to this task, including the importance of knowing student difficulties in teaching and learning. Then, we used FCI pretest and post-test data from a large population (∼900 of introductory physics students to assess the extent to which TAs were able to identify alternate conceptions of introductory students related to force and motion. In addition, we carried out think-aloud interviews with graduate students who had more than two semesters of teaching experience in recitations to examine how they reason about the task. We find that while the TAs, on average, performed better than random guessing at identifying introductory students’ difficulties with FCI content, they did not identify many common difficulties that introductory physics students have after traditional instruction. We discuss specific alternate conceptions, the extent to which TAs are able to identify them, and results from the think-aloud interviews that provided valuable information

  5. Video performance for high security applications

    International Nuclear Information System (INIS)

    Connell, Jack C.; Norman, Bradley C.

    2010-01-01

    The complexity of physical protection systems has increased to address modern threats to national security and emerging commercial technologies. A key element of modern physical protection systems is the data presented to the human operator and used for rapid determination of the cause of an alarm, whether false (e.g., caused by an animal, debris, etc.) or real (e.g., a human adversary). Alarm assessment, the human validation of a sensor alarm, relies primarily on imaging technologies and video systems. Developing measures of effectiveness (MOE) that drive the design or evaluation of a video system or technology becomes a challenge, given the subjectivity of the application (e.g., alarm assessment). Sandia National Laboratories has conducted empirical analysis using field test data and mathematical models such as the binomial distribution and Johnson target transfer functions to develop MOEs for video system technologies. Depending on the technology, the task of the security operator and the distance to the target, the probability of assessment (PA) can be determined as a function of a variety of conditions or assumptions. PA used as an MOE allows the systems engineer to conduct trade studies, make informed design decisions, or evaluate new higher-risk technologies. This paper outlines general video system design trade-offs, discusses ways video can be used to increase system performance and lists MOEs for video systems used in subjective applications such as alarm assessment.
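
    One way such a probability-of-assessment MOE can be computed is with the Johnson target transfer probability function; the sketch below uses an assumed N50 and cycle count purely for illustration, not Sandia's field-test parameters.

```python
# A minimal sketch of the Johnson target transfer probability function,
# commonly used to estimate the probability of a visual discrimination task
# from the number of resolvable cycles across the target. The N50 value and
# example numbers are illustrative assumptions.
def johnson_probability(n_cycles: float, n50: float) -> float:
    """Probability of task success given resolvable cycles across the target."""
    ratio = n_cycles / n50
    e = 2.7 + 0.7 * ratio
    return ratio**e / (1.0 + ratio**e)

# Example: a camera resolving 8 cycles across a human target, with an
# assumed N50 of 6.4 cycles for the identification-level task.
p_assess = johnson_probability(8.0, 6.4)
print(f"Estimated probability of assessment: {p_assess:.2f}")
```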

  6. Predicting General Academic Performance and Identifying the Differential Contribution of Participating Variables Using Artificial Neural Networks

    Science.gov (United States)

    Musso, Mariel F.; Kyndt, Eva; Cascallar, Eduardo C.; Dochy, Filip

    2013-01-01

    Many studies have explored the contribution of different factors from diverse theoretical perspectives to the explanation of academic performance. These factors have been identified as having important implications not only for the study of learning processes, but also as tools for improving curriculum designs, tutorial systems, and students'…

  7. Identifying customer-focused performance measures : final report 655.

    Science.gov (United States)

    2010-10-01

    The Arizona Department of Transportation (ADOT) completed a comprehensive customer satisfaction : assessment in July 2009. ADOT commissioned the assessment to acquire statistically valid data from residents : and community leaders to help it identify...

  8. Genome-wide association study identifies three novel genetic markers associated with elite endurance performance

    DEFF Research Database (Denmark)

    Ahmetov, Ii; Kulemin, Na; Popov, Dv

    2015-01-01

    To investigate the association between multiple single-nucleotide polymorphisms (SNPs), aerobic performance and elite endurance athlete status in Russians. Using a GWAS approach, we examined the association between 1,140,419 SNPs and relative maximal oxygen consumption rate (V̇O2max) in 80 international-level Russian endurance athletes (46 males and 34 females). To validate the obtained results, we further performed case-control studies by comparing the frequencies of the most significant SNPs (with P …) between endurance athletes and opposite cohorts (192 Russian controls, 1367 European controls, and 230 Russian power athletes). Initially, six 'endurance alleles' were identified showing discrete associations with V̇O2max both in males and females. Next, the case-control studies resulted in three remaining SNPs (NFIA-AS2 rs1572312, TSHR rs…

  9. Results of data base management system parameterized performance testing related to GSFC scientific applications

    Science.gov (United States)

    Carchedi, C. H.; Gough, T. L.; Huston, H. A.

    1983-01-01

    The results of a variety of tests designed to demonstrate and evaluate the performance of several commercially available data base management system (DBMS) products compatible with the Digital Equipment Corporation VAX 11/780 computer system are summarized. The tests were performed on the INGRES, ORACLE, and SEED DBMS products, employing applications similar to scientific applications under development by NASA. The objectives of this testing included determining the strengths and weaknesses of the candidate systems, the performance trade-offs of various design alternatives, and the impact of some installation and environmental (computer-related) influences.

  10. Distributed dynamic simulations of networked control and building performance applications.

    Science.gov (United States)

    Yahiaoui, Azzedine

    2018-02-01

    The use of computer-based automation and control systems for smart sustainable buildings, often called Automated Buildings (ABs), has become an effective way to automatically control, optimize, and supervise a wide range of building performance applications over a network while achieving the minimum energy consumption possible; this approach is generally referred to as the Building Automation and Control Systems (BACS) architecture. Instead of costly and time-consuming experiments, this paper focuses on using distributed dynamic simulations to analyze the real-time performance of network-based building control systems in ABs and to improve the functions of BACS technology. The paper also presents the development and design of a distributed dynamic simulation environment capable of representing the BACS architecture in simulation by run-time coupling of two or more different software tools over a network. The application and capability of this new dynamic simulation environment are demonstrated by an experimental design in this paper.
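
    Run-time coupling of two simulators over a network, as described above, can be sketched with a simple socket exchange; the toy zone-temperature model, variable names, and port are assumptions for illustration, not the BACS environment itself.

```python
import json
import socket
import threading
import time

HOST, PORT = "127.0.0.1", 50007
STEPS = 5

def building_simulator():
    """Server side: advances a toy zone-temperature model once per coupled step."""
    with socket.create_server((HOST, PORT)) as srv:
        conn, _ = srv.accept()
        with conn:
            temp = 20.0
            for _ in range(STEPS):
                msg = json.loads(conn.recv(1024).decode())  # controller output
                temp += 0.5 * msg["heating_power"] - 0.2    # crude thermal dynamics
                conn.sendall(json.dumps({"zone_temp": temp}).encode())

def controller_simulator():
    """Client side: a toy on/off controller exchanging data every timestep."""
    with socket.create_connection((HOST, PORT)) as conn:
        temp = 20.0
        for _ in range(STEPS):
            power = 1.0 if temp < 21.0 else 0.0
            conn.sendall(json.dumps({"heating_power": power}).encode())
            temp = json.loads(conn.recv(1024).decode())["zone_temp"]
            print(f"zone_temp={temp:.2f}")

server = threading.Thread(target=building_simulator)
server.start()
time.sleep(0.2)               # give the server a moment to start listening
controller_simulator()
server.join()
```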

  11. A Sensitivity Analysis Approach to Identify Key Environmental Performance Factors

    Directory of Open Access Journals (Sweden)

    Xi Yu

    2014-01-01

    Full Text Available Life cycle assessment (LCA) has been widely used in the design phase over the last two decades to reduce a product's environmental impacts across the whole product life cycle (PLC). Traditional LCA is restricted to assessing the environmental impacts of a product, and its results cannot reflect the effects of changes within the life cycle. In order to improve the quality of ecodesign, there is a growing need for an approach that can reflect the relationship between design parameters and a product's environmental impacts. A sensitivity analysis approach based on LCA and ecodesign is proposed in this paper. The key environmental performance factors which have significant influence on the product's environmental impacts can be identified by analyzing the relationship between environmental impacts and design parameters. Users without much environmental knowledge can use this approach to determine which design parameter should be considered first when (re)designing a product. A printed circuit board (PCB) case study is conducted; eight design parameters are chosen to be analyzed by our approach. The result shows that the carbon dioxide emission during PCB manufacture is highly sensitive to the area of the PCB panel.
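
    A one-at-a-time sensitivity analysis of the kind proposed can be sketched briefly; the impact function and parameters below are invented stand-ins, not the paper's PCB model or its eight design parameters.

```python
# A minimal sketch of one-at-a-time (OAT) sensitivity analysis for an
# environmental impact model; all numbers are illustrative assumptions.
def impact(params):
    """Toy CO2-impact model: impact grows with board area and layer count."""
    return (0.8 * params["panel_area_cm2"] + 2.5 * params["layers"]
            + 0.1 * params["copper_mass_g"])

baseline = {"panel_area_cm2": 100.0, "layers": 4, "copper_mass_g": 30.0}
base_impact = impact(baseline)

sensitivities = {}
for name, value in baseline.items():
    perturbed = dict(baseline)
    perturbed[name] = value * 1.10          # +10% perturbation
    # Normalized sensitivity: % change in impact per % change in parameter.
    sensitivities[name] = ((impact(perturbed) - base_impact) / base_impact) / 0.10

for name, s in sorted(sensitivities.items(), key=lambda kv: -abs(kv[1])):
    print(f"{name}: {s:.3f}")
```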

  12. Mobile NBM - Android medical mobile application designed to help in learning how to identify the different regions of interest in the brain's white matter.

    Science.gov (United States)

    Sánchez-Rola, Iskander; Zapirain, Begoña García

    2014-07-18

    One of the most critical tasks when conducting neurological studies is identifying the different regions of interest in the brain's white matter. Currently few programs or applications are available that serve as an interactive guide in this process. This is why a mobile application has been designed and developed in order to teach users how to identify the referred regions of the brain. It also enables users to share the results obtained and take an examination on the knowledge thus learnt. In order to provide direct user-user or user-developer contact, the project includes a website and a Twitter account. An application has been designed with a basic, minimalist look, which anyone can access easily in order to learn to identify a specific region in the brain's white matter. A survey has also been conducted on people who have used it, which has shown that the application is attractive both in the student (final mean satisfaction of 4.2/5) and in the professional (final mean satisfaction of 4.3/5) environment. The response obtained in the online part of the project reflects the high practical value and quality of the application, as shown by the fact that the website has seen a large number of visitors (over 1000 visitors) and the Twitter account has a high number of followers (over 280 followers). Mobile NBM is the first mobile application to be used as a guide in the process of identifying a region of interest in the brain's white matter. Although initially not many areas are available in the application, new ones can be added as required by users in their respective studies. Apart from the application itself, the online resources provided (website and Twitter account) significantly enhance users' experience.

  13. An examination of OLED display application to military equipment

    Science.gov (United States)

    Thomas, J.; Lorimer, S.

    2010-04-01

    OLED display technology has developed sufficiently to support small format commercial applications such as cell-phone main display functions. Revenues seem sufficient to finance both performance improvements and to develop new applications. The situation signifies the possibility that OLED technology is on the threshold of credibility for military applications. This paper will examine both performance and some possible applications for the military ground mobile environment, identifying the advantages and disadvantages of this promising new technology.

  14. Precision ring rolling technique and application in high-performance bearing manufacturing

    Directory of Open Access Journals (Sweden)

    Hua Lin

    2015-01-01

    Full Text Available High-performance bearings have significant applications in many important industrial fields, such as automobiles, precision machine tools and wind power. Precision ring rolling is an advanced rotary forming technique for manufacturing high-performance seamless bearing rings, which can improve the working life of a bearing. In this paper, three kinds of precision ring rolling techniques adapted to different dimensional ranges of bearings are introduced: cold ring rolling for small-scale bearings, hot radial ring rolling for medium-scale bearings and hot radial-axial ring rolling for large-scale bearings. The forming principles, technological features and forming equipment for the three kinds of precision ring rolling techniques are summarized, the technological development and industrial application in China are introduced, and the main trends in technological development are described.

  15. Thermal performance of a PCB embedded pulsating heat pipe for power electronics applications

    International Nuclear Information System (INIS)

    Kearney, Daniel J.; Suleman, Omar; Griffin, Justin; Mavrakis, Georgios

    2016-01-01

    Highlights: • Planar, compact PCB embedded pulsating heat pipe for heat spreading applications. • Embedded heat pipe operates at sub-ambient pressure with environmentally compatible fluids. • Range of optimum operating conditions, orientations and fill ratios identified. - Abstract: Low voltage power electronics applications (<1.2 kV) are pushing the design envelope towards increased functionality, better reliability, low profile and reduced cost. One packaging method to enable these constraints is the integration of active power electronic devices into the printed circuit board, improving electrical and thermal performance. This development requires a reliable passive thermal management solution to mitigate hot spots due to the increased heat flux density. To this end, a 44-channel open-loop pulsating heat pipe (OL-PHP) is experimentally investigated for two independent dielectric working fluids, Novec™ 649 and Novec™ 774, due to their lower-pressure operation and low global warming potential compared to traditional two-phase coolants. The OL-PHP is investigated in the vertical (90°) orientation with fill ratios ranging from 0.30 to 0.70. The results highlight the steady-state operating conditions for each working fluid with instantaneous plots of pressure, temperature, and thermal resistance; the minimum potential bulk thermal resistance for each fill ratio; and the effective thermal conductivity achievable for the OL-PHP.

  16. High performance hybrid magnetic structure for biotechnology applications

    Science.gov (United States)

    Humphries, David E [El Cerrito, CA; Pollard, Martin J [El Cerrito, CA; Elkin, Christopher J [San Ramon, CA

    2009-02-03

    The present disclosure provides a high performance hybrid magnetic structure made from a combination of permanent magnets and ferromagnetic pole materials which are assembled in a predetermined array. The hybrid magnetic structure provides means for separation and other biotechnology applications involving holding, manipulation, or separation of magnetic or magnetizable molecular structures and targets. Also disclosed are further improvements to aspects of the hybrid magnetic structure, including additional elements and for adapting the use of the hybrid magnetic structure for use in biotechnology and high throughput processes.

  17. High performance parallel computing of flows in complex geometries: II. Applications

    International Nuclear Information System (INIS)

    Gourdain, N; Gicquel, L; Staffelbach, G; Vermorel, O; Duchaine, F; Boussuge, J-F; Poinsot, T

    2009-01-01

    Present regulations in terms of pollutant emissions and noise, together with economic constraints, require new approaches and designs in the fields of energy supply and transportation. It is now well established that the next breakthrough will come from a better understanding of unsteady flow effects and from considering the entire system, not only isolated components. However, these aspects are still not well taken into account by numerical approaches, nor well understood, whatever the design stage considered. The main challenge stems largely from the computational requirements of simulating such complex systems on supercomputers. This paper shows how new challenges can be addressed by using parallel computing platforms for distinct elements of more complex systems, as encountered in aeronautical applications. Based on numerical simulations performed with modern aerodynamic and reactive flow solvers, this work underlines the value of high-performance computing for solving flows in complex industrial configurations such as aircraft, combustion chambers and turbomachines. Performance indicators related to parallel computing efficiency are presented, showing that establishing fair criteria is a difficult task for complex industrial applications. Examples of numerical simulations performed on industrial systems are also described, with particular attention to computational time and the potential design improvements obtained with high-fidelity and multi-physics computing methods. These simulations use either unsteady Reynolds-averaged Navier-Stokes methods or large eddy simulation and deal with turbulent unsteady flows, such as coupled flow phenomena (thermo-acoustic instabilities, buffet, etc.). Some examples of the difficulties with grid generation and data analysis are also presented for these complex industrial applications.

  18. Array Manipulation And Matrix-Tree Method To Identify High Concentration Regions HCRs

    Directory of Open Access Journals (Sweden)

    Rachana Arora

    2015-08-01

    Full Text Available Sequence alignment and analysis is one of the most important applications of bioinformatics. It involves aligning two or more sequences with each other and identifying a common pattern that ultimately leads to conclusions of homology or dissimilarity. A number of algorithms that use dynamic programming to perform alignment between sequences are available. One of their main disadvantages is that they involve complicated computations and backtracking methods that are difficult to implement. This paper describes a much simpler method to identify common regions in two sequences and align them based on the density of the common regions identified.
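
    The density-based identification of common regions can be sketched with a sliding window; the window size, threshold, and sequences below are illustrative assumptions rather than the paper's exact matrix-tree procedure.

```python
# A minimal sketch of locating high-concentration regions (HCRs) of matches
# between two sequences with a sliding match-density window.
def match_density_regions(seq_a, seq_b, window=5, threshold=0.8):
    """Return windows where the fraction of position-wise matches is high."""
    length = min(len(seq_a), len(seq_b))
    matches = [seq_a[i] == seq_b[i] for i in range(length)]
    regions = []
    for start in range(length - window + 1):
        density = sum(matches[start:start + window]) / window
        if density >= threshold:
            regions.append((start, start + window, density))
    return regions

a = "ACGTGACCTGAC"
b = "ACGTCACCTGTC"
for start, end, density in match_density_regions(a, b):
    print(f"[{start}, {end}) density={density:.2f} region={a[start:end]}")
```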

  19. The coupling of the neutron transport application RATTLESNAKE to the nuclear fuels performance application BISON under the MOOSE framework

    Energy Technology Data Exchange (ETDEWEB)

    Gleicher, Frederick N.; Williamson, Richard L.; Ortensi, Javier; Wang, Yaqi; Spencer, Benjamin W.; Novascone, Stephen R.; Hales, Jason D.; Martineau, Richard C.

    2014-10-01

    The MOOSE neutron transport application RATTLESNAKE was coupled to the fuels performance application BISON to provide a higher-fidelity tool for fuel performance simulation. This project is motivated by the desire to couple a high-fidelity core analysis program (based on the self-adjoint angular flux equations) to a high-fidelity fuel performance program, both of which can simulate on unstructured meshes. RATTLESNAKE solves the self-adjoint angular flux transport equation and provides sub-pin-level resolution of the multigroup neutron flux with resonance treatment during burnup or a fast transient. BISON solves the coupled thermomechanical equations for the fuel on a sub-millimeter scale. Both applications are able to solve their respective systems on aligned and unaligned unstructured finite element meshes. The power density and local burnup were transferred from RATTLESNAKE to BISON with the MOOSE MultiApp transfer system. Multiple depletion cases were run with one-way data transfer from RATTLESNAKE to BISON. The eigenvalues are shown to agree well with values obtained from the lattice physics code DRAGON. The one-way transfer of power density is shown to agree with the power density obtained from an internal Lassman-style model in BISON.

  20. Performance measurement in the UK construction industry and its role in supporting the application of lean construction concepts

    Directory of Open Access Journals (Sweden)

    Saad Sarhan

    2013-03-01

    Full Text Available Performance measurement has received substantial attention from researchers and the construction industry over the past two decades. This study sought to assess UK practitioners' awareness of the importance of using appropriate performance measures and their role in supporting the application of Lean Construction (LC) concepts. To enable the study to achieve its objectives, a review was conducted of a range of measurements developed to evaluate project performance, including those devoted to supporting LC efforts. Consequently, a questionnaire survey was developed and sent to 198 professionals in the UK construction industry as well as a small sample of academics with an interest in LC. Results indicated that although practitioners recognise the importance of selecting non-financial performance measures, such selection has not been properly or widely implemented. The study identified the most common techniques used by UK construction organisations for performance measurement, and ranked a number of non-financial key performance indicators as significant. Some respondents professed to have embraced the Last Planner System methodology as a means for performance measurement and organisational learning, while further questioning suggested otherwise. It was also suggested that substance thinking amongst professionals could be a significant hidden barrier that militates against the successful implementation of LC.

  1. Study on application and performance of OPPC under ice-phased condition

    Directory of Open Access Journals (Sweden)

    An Yi

    2016-01-01

    Full Text Available The optical fiber composite overhead line (OPPC) combines electricity transmission with information transmission and is used increasingly widely in electric power systems, further broadening the application area of special cables in China. In heavily iced regions, the design parameters and technical requirements must meet higher standards; otherwise the safety and stability of the electric communication system's operation will be directly affected. Changes in OPPC performance under icing conditions should therefore receive sufficient attention. This article proposes a simulation test of OPPC under ice-cladding conditions, on the basis of which the mechanical properties and light transmission performance were calculated and analyzed. It concludes that ice-cladding affects the stress and strain of the optical fiber cable, its optical transmission performance, residual RTS and other related elements. The test results show that the corresponding tension values change with ice thickness and should be taken seriously into consideration when considering the application of OPPC lines in iced regions.

  2. Development of a Web Application: Recording Learners' Mouse Trajectories and Retrieving their Study Logs to Identify the Occurrence of Hesitation in Solving Word-Reordering Problems

    Directory of Open Access Journals (Sweden)

    Mitsumasa Zushi

    2014-04-01

    Full Text Available Most computer marking systems evaluate the answers reached by learners without examining the process by which those answers are produced; this is insufficient for ascertaining learners' level of understanding, because correct answers may well include lucky hunches, that is, accidentally correct but unconfident answers. In order to differentiate these lucky answers from confident correct ones, we have developed a Web application that can record mouse trajectories during the performance of tasks. Mathematical analyses of these trajectories have revealed that some mouse-movement parameters can be useful indicators for identifying the occurrence of hesitation resulting from a lack of knowledge or confidence in solving problems.
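
    Hesitation indicators of the sort described can be derived from a logged trajectory as sketched below; the sample coordinates and thresholds are invented for illustration and are not the authors' parameters.

```python
# A minimal sketch of deriving hesitation indicators from logged mouse
# trajectories; data and thresholds are illustrative assumptions.
import math

# Each sample: (timestamp in seconds, x, y) as a browser might log them.
trajectory = [
    (0.00, 10, 10), (0.05, 40, 12), (0.10, 80, 15),
    (0.15, 81, 15), (0.60, 82, 16),          # long dwell -> possible hesitation
    (0.65, 140, 40), (0.70, 200, 80),
]

def hesitation_features(samples, pause_threshold=0.3, speed_threshold=50.0):
    """Count long pauses and low-speed segments along a mouse trajectory."""
    pauses, slow_segments, total_path = 0, 0, 0.0
    for (t0, x0, y0), (t1, x1, y1) in zip(samples, samples[1:]):
        dt = t1 - t0
        dist = math.hypot(x1 - x0, y1 - y0)
        total_path += dist
        if dt >= pause_threshold:
            pauses += 1
        elif dist / dt < speed_threshold:
            slow_segments += 1
    return {"pauses": pauses, "slow_segments": slow_segments,
            "path_length": total_path}

print(hesitation_features(trajectory))
```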

  3. A Two-Step Method to Identify Positive Deviant Physician Organizations of Accountable Care Organizations with Robust Performance Management Systems.

    Science.gov (United States)

    Pimperl, Alexander F; Rodriguez, Hector P; Schmittdiel, Julie A; Shortell, Stephen M

    2018-06-01

    To identify positive deviant (PD) physician organizations of Accountable Care Organizations (ACOs) with robust performance management systems (PMSYS). Third National Survey of Physician Organizations (NSPO3, n = 1,398). Organizational and external factors from NSPO3 were analyzed. Linear regression estimated the association of internal and contextual factors with PMSYS. Two cutpoints (75th/90th percentiles) identified PDs with the largest residuals and highest PMSYS scores. A total of 65 and 41 PDs were identified using the 75th and 90th percentile cutpoints, respectively. The 90th percentile more strongly differentiated PDs from non-PDs. Having a high proportion of vulnerable patients appears to constrain PMSYS development. Our PD identification method increases the likelihood that PD organizations selected for in-depth inquiry are high-performing organizations that exceed expectations. © Health Research and Educational Trust.
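
    The two-step identification can be sketched as a regression followed by percentile cutpoints on the residuals; the simulated data and covariate names below are illustrative, not NSPO3 survey items.

```python
# A minimal sketch of the two-step positive-deviant identification idea:
# regress the performance score on organizational covariates, then flag
# organizations above a residual percentile cutpoint. Data are simulated.
import numpy as np

rng = np.random.default_rng(42)
n = 500
size = rng.normal(50, 15, n)                  # e.g., number of physicians
vulnerable_share = rng.uniform(0, 1, n)       # share of vulnerable patients
pmsys = 40 + 0.3 * size - 10 * vulnerable_share + rng.normal(0, 5, n)

# Step 1: OLS fit by least squares; residual = unexplained PMSYS.
X = np.column_stack([np.ones(n), size, vulnerable_share])
beta, *_ = np.linalg.lstsq(X, pmsys, rcond=None)
residuals = pmsys - X @ beta

# Step 2: positive deviants exceed both the residual and raw-score cutpoints.
cut = 90  # the stricter of the paper's two percentile cutpoints
is_pd = (residuals > np.percentile(residuals, cut)) & \
        (pmsys > np.percentile(pmsys, cut))
print(f"Identified {is_pd.sum()} positive deviants of {n} organizations")
```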

  4. Assessment of the performance of containment and surveillance equipment part 2: trial application

    International Nuclear Information System (INIS)

    Rezniczek, A.; Richter, B.; Jussofie, A.

    2009-01-01

    The adopted methodological approach for assessing the performance of Containment and Surveillance (C/S) equipment resulted from an account of work performed for and in cooperation with the ESARDA Working Group on C/S. It was applied on a trial basis to a dry storage facility for spent nuclear fuel and consisted of the following steps: (1) Acquisition and analysis of design information and operational characteristics of the facility under consideration, (2) assumptions on diversion and misuse scenarios, (3) assumptions on safeguards approach and definition of safeguards requirements, (4) compilation and characterisation of candidate C/S equipment, (5) performance assessment of C/S equipment. The candidate equipment taken into account was routinely used by the IAEA: DCM14-type camera, Type E capand- wire seal, COBRA fibre optic seal, and VACOSS electronic seal. Four applications were considered: camera mounted in the reception area, seal on secondary lid of transport and storage cask, seal on protective lid, and seal on group of casks. For these applications, requirements were defined and requirement levels were attributed. The assignment of performance levels was carried out by using the technical specifications and design basis tolerances provided by the equipment manufacturers. The results were entered into four performance assessment tables. Although the assessment methodology was not yet fully developed, its trial application yielded promising results with regard to the selection of appropriate C/S equipment.

  5. The Applicability of 360 Degree Feedback Performance Appraisal System: A Brigade Sample

    Directory of Open Access Journals (Sweden)

    Hakan TURGUT

    2012-03-01

    Full Text Available In measuring individual performance, considered one of the fundamental functions of the human resources management process, the 360 Degree Feedback Performance Appraisal System (360 DFPAS) is preferred because it takes more than one perspective into account, providing the opportunity to achieve more objective results. The applicability of this method has arguably not been investigated sufficiently in the public sector, where a large share of employment in Turkey takes place. The purpose of this study is to probe the applicability of the 360 DFPAS to low- and mid-level managers in a brigade. Within this framework, differences between raters (manager, subordinate, peer, client, self-evaluation) were examined and comparisons were made between traditional performance appraisal and the 360 DFPAS. Meaningful differences between the two evaluation methods (traditional manager evaluation versus 360 DFPAS) were found only for the internal-client raters; there were no meaningful differences between traditional performance appraisal and the 360 DFPAS as a whole.

  6. Performance Characteristics of Hybrid MPI/OpenMP Scientific Applications on a Large-Scale Multithreaded BlueGene/Q Supercomputer

    KAUST Repository

    Wu, Xingfu; Taylor, Valerie

    2013-01-01

    In this paper, we investigate the performance characteristics of five hybrid MPI/OpenMP scientific applications (two NAS Parallel Benchmarks Multi-Zone codes SP-MZ and BT-MZ, an earthquake simulation PEQdyna, an aerospace application PMLB and a 3D particle-in-cell application GTC) on a large-scale multithreaded Blue Gene/Q supercomputer at Argonne National Laboratory, and quantify the performance gap resulting from using different numbers of threads per node. We use performance tools and MPI profile and trace libraries available on the supercomputer to analyze and compare the performance of these hybrid scientific applications as the number of OpenMP threads per node increases, and find that increasing the number of threads beyond a certain point saturates or worsens the performance of these hybrid applications. For the strong-scaling hybrid scientific applications SP-MZ, BT-MZ, PEQdyna and PMLB, using 32 threads per node results in much better application efficiency than using 64 threads per node; as the number of threads per node increases, the FPU (Floating Point Unit) percentage decreases, and the MPI percentage (except for PMLB) and IPC (instructions per cycle) per core (except for BT-MZ) increase. For the weak-scaling hybrid scientific application GTC, the performance trend (relative speedup) is very similar with an increasing number of threads per node no matter how many nodes (32, 128, 512) are used. © 2013 IEEE.

  7. Performance Characteristics of Hybrid MPI/OpenMP Scientific Applications on a Large-Scale Multithreaded BlueGene/Q Supercomputer

    KAUST Repository

    Wu, Xingfu

    2013-07-01

    In this paper, we investigate the performance characteristics of five hybrid MPI/OpenMP scientific applications (two NAS Parallel Benchmarks Multi-Zone codes SP-MZ and BT-MZ, an earthquake simulation PEQdyna, an aerospace application PMLB and a 3D particle-in-cell application GTC) on a large-scale multithreaded Blue Gene/Q supercomputer at Argonne National Laboratory, and quantify the performance gap resulting from using different numbers of threads per node. We use performance tools and MPI profile and trace libraries available on the supercomputer to analyze and compare the performance of these hybrid scientific applications as the number of OpenMP threads per node increases, and find that increasing the number of threads beyond a certain point saturates or worsens the performance of these hybrid applications. For the strong-scaling hybrid scientific applications SP-MZ, BT-MZ, PEQdyna and PMLB, using 32 threads per node results in much better application efficiency than using 64 threads per node; as the number of threads per node increases, the FPU (Floating Point Unit) percentage decreases, and the MPI percentage (except for PMLB) and IPC (instructions per cycle) per core (except for BT-MZ) increase. For the weak-scaling hybrid scientific application GTC, the performance trend (relative speedup) is very similar with an increasing number of threads per node no matter how many nodes (32, 128, 512) are used. © 2013 IEEE.

  8. Trends in HFE Methods and Tools and Their Applicability to Safety Reviews

    Energy Technology Data Exchange (ETDEWEB)

    O'Hara, J.M.; Plott, C.; Milanski, J.; Ronan, A.; Scheff, S.; Laux, L.; Bzostek, J.

    2009-09-30

    The U.S. Nuclear Regulatory Commission (NRC) conducts human factors engineering (HFE) safety reviews of applicant submittals for new plants and for changes to existing plants. The reviews include the evaluation of the methods and tools (M&T) used by applicants as part of their HFE program. The technology used to perform HFE activities has been rapidly evolving, resulting in a whole new generation of HFE M&Ts. The objectives of this research were to identify the current trends in HFE methods and tools, determine their applicability to NRC safety reviews, and identify topics for which the NRC may need additional guidance to support its safety reviews. We conducted a survey that identified over 100 new HFE M&Ts. The M&Ts were assessed to identify general trends. Seven trends were identified: Computer Applications for Performing Traditional Analyses, Computer-Aided Design, Integration of HFE Methods and Tools, Rapid Development Engineering, Analysis of Cognitive Tasks, Use of Virtual Environments and Visualizations, and Application of Human Performance Models. We assessed each trend to determine its applicability to the NRC's reviews by considering (1) whether the nuclear industry is making use of M&Ts for each trend, and (2) whether M&Ts reflecting the trend can be reviewed using the current design review guidance. We concluded that M&T trends that are applicable to the commercial nuclear industry and are expected to impact safety reviews may be considered for review guidance development. Three trends fell into this category: Analysis of Cognitive Tasks, Use of Virtual Environments and Visualizations, and Application of Human Performance Models. The other trends do not need to be addressed at this time.

  9. Trends in HFE Methods and Tools and Their Applicability to Safety Reviews

    International Nuclear Information System (INIS)

    O'Hara, J.M.; Plott, C.; Milanski, J.; Ronan, A.; Scheff, S.; Laux, L.; Bzostek, J.

    2009-01-01

    The U.S. Nuclear Regulatory Commission (NRC) conducts human factors engineering (HFE) safety reviews of applicant submittals for new plants and for changes to existing plants. The reviews include the evaluation of the methods and tools (M&T) used by applicants as part of their HFE program. The technology used to perform HFE activities has been rapidly evolving, resulting in a whole new generation of HFE M&Ts. The objectives of this research were to identify the current trends in HFE methods and tools, determine their applicability to NRC safety reviews, and identify topics for which the NRC may need additional guidance to support its safety reviews. We conducted a survey that identified over 100 new HFE M&Ts. The M&Ts were assessed to identify general trends. Seven trends were identified: Computer Applications for Performing Traditional Analyses, Computer-Aided Design, Integration of HFE Methods and Tools, Rapid Development Engineering, Analysis of Cognitive Tasks, Use of Virtual Environments and Visualizations, and Application of Human Performance Models. We assessed each trend to determine its applicability to the NRC's reviews by considering (1) whether the nuclear industry is making use of M&Ts for each trend, and (2) whether M&Ts reflecting the trend can be reviewed using the current design review guidance. We concluded that M&T trends that are applicable to the commercial nuclear industry and are expected to impact safety reviews may be considered for review guidance development. Three trends fell into this category: Analysis of Cognitive Tasks, Use of Virtual Environments and Visualizations, and Application of Human Performance Models. The other trends do not need to be addressed at this time.

  10. Research and Application of New Type of High Performance Titanium Alloy

    Directory of Open Access Journals (Sweden)

    ZHU Zhishou

    2016-06-01

    Full Text Available With the continuous expansion of the quantity and range of titanium alloy applications in fields such as aviation, space, weaponry, marine and chemical industry, ever more demanding requirements have been placed on the comprehensive mechanical properties, low cost and processing characteristics of titanium alloys. Through alloying based on microstructure parameter design, together with comprehensive strengthening and toughening technologies such as fine-grain strengthening, phase transformation and high-toughness process control, a new type of high-performance titanium alloy has been researched and manufactured, with good comprehensive properties of high strength and toughness, fatigue resistance, failure resistance and impact resistance. The new titanium alloy has extended the quantity and level of applications in high-end fields, realized industrial upgrading and transformation, and met the application requirements of next-generation equipment.

  11. ePRO-MP: A Tool for Profiling and Optimizing Energy and Performance of Mobile Multiprocessor Applications

    Directory of Open Access Journals (Sweden)

    Wonil Choi

    2009-01-01

    Full Text Available For mobile multiprocessor applications, achieving high performance with low energy consumption is a challenging task. In order to help programmers meet these design requirements, system development tools play an important role. In this paper, we describe one such development tool, ePRO-MP, which profiles and optimizes both the performance and the energy consumption of multi-threaded applications running on top of Linux on ARM11 MPCore-based embedded systems. One of the key features of ePRO-MP is that it can accurately estimate the energy consumption of multi-threaded applications without requiring power measurement equipment, using a regression-based energy model. We also describe another key benefit of ePRO-MP, an automatic optimization function, using two example problems. Using the automatic optimization function, ePRO-MP can achieve high performance and low power consumption without programmer intervention. Our experimental results show that ePRO-MP improves performance by 6.1% and reduces energy consumption by 4.1% over a baseline version in the co-running applications optimization example. For the producer-consumer application optimization example, ePRO-MP improves performance by 60.5% and reduces energy consumption by 43.3% over a baseline version.
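
    A regression-based energy model of the kind ePRO-MP uses can be sketched as follows; the counter names, calibration data, and coefficients are hypothetical, standing in for a real calibration against a power meter.

```python
# A minimal sketch of a regression-based energy model: energy is estimated
# from hardware performance-counter readings instead of measurement
# equipment. All values are illustrative assumptions.
import numpy as np

# Calibration data: rows = profiled intervals,
# columns = [instructions, cache_misses, bus_accesses] counter deltas.
counters = np.array([
    [1.2e9, 3.0e6, 8.0e6],
    [0.9e9, 5.5e6, 6.1e6],
    [1.6e9, 2.2e6, 9.4e6],
    [0.7e9, 7.0e6, 5.0e6],
])
measured_energy_j = np.array([2.10, 1.95, 2.60, 1.80])  # from a power meter

# Fit E = c0 + c1*instr + c2*miss + c3*bus by least squares.
X = np.column_stack([np.ones(len(counters)), counters])
coef, *_ = np.linalg.lstsq(X, measured_energy_j, rcond=None)

# Later, estimate energy for a new interval without any power meter.
new_interval = np.array([1.0, 1.1e9, 4.0e6, 7.2e6])
print(f"Estimated energy: {new_interval @ coef:.2f} J")
```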

  12. Structural Identifiability of Dynamic Systems Biology Models.

    Science.gov (United States)

    Villaverde, Alejandro F; Barreiro, Antonio; Papachristodoulou, Antonis

    2016-10-01

    A powerful way of gaining insight into biological systems is by creating a nonlinear differential equation model, which usually contains many unknown parameters. Such a model is called structurally identifiable if it is possible to determine the values of its parameters from measurements of the model outputs. Structural identifiability is a prerequisite for parameter estimation, and should be assessed before exploiting a model. However, this analysis is seldom performed due to the high computational cost involved in the necessary symbolic calculations, which quickly becomes prohibitive as the problem size increases. In this paper we show how to analyse the structural identifiability of a very general class of nonlinear models by extending methods originally developed for studying observability. We present results about models whose identifiability had not been previously determined, report unidentifiabilities that had not been found before, and show how to modify those unidentifiable models to make them identifiable. This method helps prevent problems caused by lack of identifiability analysis, which can compromise the success of tasks such as experiment design, parameter estimation, and model-based optimization. The procedure is called STRIKE-GOLDD (STRuctural Identifiability taKen as Extended-Generalized Observability with Lie Derivatives and Decomposition), and it is implemented in a MATLAB toolbox which is available as open source software. The broad applicability of this approach facilitates the analysis of the increasingly complex models used in systems biology and other areas.
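
    The observability-based identifiability test can be sketched with symbolic Lie derivatives; the one-compartment model below is a toy example illustrating the idea behind tools like STRIKE-GOLDD, not the toolbox itself (which is a MATLAB package).

```python
# A minimal sketch of an observability-based structural identifiability
# test using Lie derivatives with SymPy, on the toy model dx/dt = -k*x
# with output y = x; the parameter k is treated as a state with dk/dt = 0.
import sympy as sp

x, k = sp.symbols("x k")
states = sp.Matrix([x, k])          # augmented state: dynamics + parameter
f = sp.Matrix([-k * x, 0])          # augmented vector field
h = sp.Matrix([x])                  # measured output

# Build successive Lie derivatives of the output along f.
rows = [h]
for _ in range(len(states) - 1):
    rows.append(rows[-1].jacobian(states) * f)

# The observability-identifiability matrix: Jacobian of the Lie
# derivatives with respect to states and parameters.
O = sp.Matrix.vstack(*[r.jacobian(states) for r in rows])
print(O, "rank:", O.rank())         # full rank (2) -> k is locally identifiable
```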

  13. Remote Core Locking: Migrating Critical-Section Execution to Improve the Performance of Multithreaded Applications

    OpenAIRE

    Lozi, Jean-Pierre; David, Florian; Thomas, Gaël; Lawall, Julia; Muller, Gilles

    2014-01-01

    National audience; The scalability of multithreaded applications on current multicore systems is hampered by the performance of lock algorithms, due to the costs of access contention and cache misses. In this paper, we propose a new lock algorithm, Remote Core Locking (RCL), that aims to improve the performance of critical sections in legacy applications on multicore architectures. The idea of RCL is to replace lock acquisitions by optimized remote procedure calls to a dedicated server core. ...
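
    The core idea, executing critical sections on a dedicated server core instead of migrating a lock (and its cache lines) between cores, can be illustrated with a toy version using Python threads. This is only a sketch of the concept; the actual RCL work is a C runtime with per-core spinning clients and profiling support.

```python
# Toy illustration of the Remote Core Locking idea: critical sections are
# shipped as closures to one dedicated "server" thread that runs them
# sequentially, replacing lock acquisition with a remote procedure call.
import queue, threading

class RemoteCoreLock:
    def __init__(self):
        self._requests = queue.Queue()
        threading.Thread(target=self._server, daemon=True).start()

    def _server(self):
        # Server loop: critical sections run one at a time, so shared
        # state stays consistent without per-client locking.
        while True:
            func, args, done, result = self._requests.get()
            result.append(func(*args))
            done.set()

    def execute(self, func, *args):
        # Equivalent of lock(); func(); unlock().
        done, result = threading.Event(), []
        self._requests.put((func, args, done, result))
        done.wait()
        return result[0]

counter = 0
def increment():
    global counter
    counter += 1

rcl = RemoteCoreLock()
threads = [threading.Thread(
               target=lambda: [rcl.execute(increment) for _ in range(1000)])
           for _ in range(4)]
for t in threads: t.start()
for t in threads: t.join()
print(counter)  # 4000: no lost updates
```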

  14. Methods for identifying 30 chronic conditions: application to administrative data.

    Science.gov (United States)

    Tonelli, Marcello; Wiebe, Natasha; Fortin, Martin; Guthrie, Bruce; Hemmelgarn, Brenda R; James, Matthew T; Klarenbach, Scott W; Lewanczuk, Richard; Manns, Braden J; Ronksley, Paul; Sargious, Peter; Straus, Sharon; Quan, Hude

    2015-04-17

    Multimorbidity is common and associated with poor clinical outcomes and high health care costs. Administrative data are a promising tool for studying the epidemiology of multimorbidity. Our goal was to derive and apply a new scheme for using administrative data to identify the presence of chronic conditions and multimorbidity. We identified validated algorithms that use ICD-9 CM/ICD-10 data to ascertain the presence or absence of 40 morbidities. Algorithms with both positive predictive value and sensitivity ≥70% were graded as "high validity"; those with positive predictive value ≥70% and sensitivity <70% were graded as "moderate validity". To show proof of concept, we applied identified algorithms with high to moderate validity to inpatient and outpatient claims and utilization data from 574,409 people residing in Edmonton, Canada during the 2008/2009 fiscal year. Of the 40 morbidities, we identified 30 that could be identified with high to moderate validity. Approximately one quarter of participants had identified multimorbidity (2 or more conditions), one quarter had a single identified morbidity and the remaining participants were not identified as having any of the 30 morbidities. We identified a panel of 30 chronic conditions that can be identified from administrative data using validated algorithms, facilitating the study and surveillance of multimorbidity. We encourage other groups to use this scheme, to facilitate comparisons between settings and jurisdictions.
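
    The grading and counting logic described can be sketched in a few lines. The algorithms, validity numbers and ICD-10 codes below are invented placeholders; the paper's actual case definitions also involve claim counts and look-back windows.

```python
# A minimal sketch of validity grading and multimorbidity counting.
def grade(ppv, sensitivity):
    if ppv >= 0.70 and sensitivity >= 0.70:
        return 'high'
    if ppv >= 0.70:
        return 'moderate'
    return 'excluded'

# Hypothetical validation results for three condition algorithms.
algorithms = {
    'diabetes':     {'ppv': 0.95, 'sensitivity': 0.88, 'codes': {'E10', 'E11'}},
    'hypertension': {'ppv': 0.92, 'sensitivity': 0.68, 'codes': {'I10'}},
    'migraine':     {'ppv': 0.55, 'sensitivity': 0.80, 'codes': {'G43'}},
}
usable = {name: a for name, a in algorithms.items()
          if grade(a['ppv'], a['sensitivity']) in ('high', 'moderate')}

def count_conditions(claim_codes):
    # Flag a condition if any claim carries one of its codes (real
    # algorithms also require minimum claim counts within a window).
    return sum(any(c in claim_codes for c in a['codes'])
               for a in usable.values())

n = count_conditions({'E11', 'I10', 'G43'})
print(n, 'multimorbid' if n >= 2 else 'not multimorbid')
```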

  15. Environmental performance of electricity storage systems for grid applications, a life cycle approach

    International Nuclear Information System (INIS)

    Oliveira, L.; Messagie, M.; Mertens, J.; Laget, H.; Coosemans, T.; Van Mierlo, J.

    2015-01-01

    Highlights: • Large energy storage systems: environmental performance under different scenarios. • ReCiPe midpoint and endpoint impact assessment results are analyzed. • Energy storage systems can replace peak power generation units. • Energy storage systems and renewable energy have the best environmental scores. • Environmental performance of storage systems is application dependent. - Abstract: In this paper, the environmental performance of electricity storage technologies for grid applications is assessed. Using a life cycle assessment methodology we analyze the impacts of the construction, disposal/end of life, and usage of each of the systems. Pumped hydro and compressed air storage are studied as mechanical storage, and advanced lead acid, sodium sulfur, lithium-ion and nickel–sodium-chloride batteries are addressed as electrochemical storage systems. Hydrogen production from electrolysis and subsequent usage in a proton exchange membrane fuel cell are also analyzed. The selected electricity storage systems mimic real world installations in terms of capacity, power rating, life time, technology and application. The functional unit is one kW h of energy delivered back to the grid, from the storage system. The environmental impacts assessed are climate change, human toxicity, particulate matter formation, and fossil resource depletion. Different electricity mixes are used in order to exemplify scenarios where the selected technologies meet specific applications. Results indicate that the performance of the storage systems is tied to the electricity feedstocks used during the use stage. Renewable energy sources have lower impacts throughout the use stage of the storage technologies. Using the Belgian electricity mix of 2011 as benchmark, the sodium sulfur battery is shown to be the best performer for all the impacts analyzed. Pumped hydro storage follows in second place. Regarding infrastructure and end of life, results indicate that battery systems
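
    The functional-unit accounting that drives such results can be illustrated with a back-of-the-envelope sketch: per delivered kWh, the infrastructure burden is amortized over lifetime throughput, and the use-stage burden is the charging mix's impact divided by round-trip efficiency. All numbers below are placeholders, not the study's inventory data.

```python
# Back-of-the-envelope LCA accounting per kWh delivered back to the grid.
def impact_per_kwh_delivered(infra_impact, lifetime_kwh_out,
                             grid_impact_per_kwh_in, roundtrip_eff):
    infrastructure = infra_impact / lifetime_kwh_out   # amortized plant burden
    use_stage = grid_impact_per_kwh_in / roundtrip_eff # charging losses included
    return infrastructure + use_stage

# e.g. kg CO2-eq per kWh delivered, for a hypothetical battery system
print(impact_per_kwh_delivered(
    infra_impact=2.0e6,           # kg CO2-eq embodied in the plant
    lifetime_kwh_out=5.0e7,       # kWh delivered over its life
    grid_impact_per_kwh_in=0.25,  # kg CO2-eq per kWh of charging mix
    roundtrip_eff=0.80))
```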

  16. Performance demonstration and evaluation of the synergetic application of vanadium dioxide glazing and phase change material in passive buildings

    International Nuclear Information System (INIS)

    Long, Linshuang; Ye, Hong; Gao, Yanfeng; Zou, Ruqiang

    2014-01-01

    Highlights: • VO2 and PCM were combined in a passive building application for the first time. • Their synergetic performance is demonstrated in a full-size room. • The synergetic application performs better than either material alone. • The materials interact with each other in the synergetic application. • ESI can be used to evaluate the performance of the synergetic application. - Abstract: One of the key methods to improve the energy saving performance of a building is to apply advanced materials or components to the building envelope. However, the two parts of a building’s envelope, the transparent one and the non-transparent one, are usually investigated individually in the existing literature. In this study, vanadium dioxide (VO2) glazing, an advanced energy-efficient element applied to the transparent parts of the building envelope, and phase change material (PCM), a typical thermal storage material used to improve the non-transparent parts of the building envelope, were adopted simultaneously for the first time. The synergetic performance of VO2 glazing and PCM, demonstrated in a full-scale, lightweight, passive room, resulted in a significant improvement in the thermal comfort degree. The Energy Saving Index (ESI) is a simple and effective indicator that can be used to evaluate the passive application performance of a single energy-efficient material or component from a common standpoint. In this work, the index was broadened to evaluate the performance of more than one material, showing that the ESI is feasible and favorable for analyzing the combined application of several building materials and/or components. Using the ESI, the performance of the synergetic application was also compared with those of the sole materials, indicating that the synergetic application performs better during the cooling period. Furthermore, the synergetic application involves an interplay rather than a simple combination of the energy-efficient materials.

  17. Performance of alternative refrigerants for residential air-conditioning applications

    International Nuclear Information System (INIS)

    Park, Ki-Jung; Seo, Taebeom; Jung, Dongsoo

    2007-01-01

    In this study, the performance of two pure hydrocarbons and seven mixtures composed of propylene, propane, HFC152a, and dimethylether was measured with a view to substituting for HCFC22 in residential air-conditioners and heat pumps. Thermodynamic cycle analysis was carried out to determine the optimum compositions before testing, and actual tests were performed in a breadboard-type laboratory heat pump/air-conditioner at evaporation and condensation temperatures of 7 and 45 °C, respectively. Test results show that the coefficient of performance of these mixtures is up to 5.7% higher than that of HCFC22. While propane showed an 11.5% reduction in capacity, most of the fluids had a capacity similar to that of HCFC22. For these fluids, compressor-discharge temperatures were reduced by 11-17 °C. For all fluids tested, the amount of charge was reduced by up to 55% as compared to HCFC22. Overall, these fluids provide good performance with reasonable energy savings without environmental problems and thus can be used as long-term alternatives for residential air-conditioning and heat-pumping applications.
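
    The cycle-analysis comparison behind such results reduces, in its simplest form, to a cooling COP computed from cycle enthalpies at the fixed 7 °C / 45 °C conditions. A minimal sketch with placeholder enthalpies rather than real refrigerant property data:

```python
# Ideal vapor-compression cooling COP from cycle enthalpies (kJ/kg):
# state 1 = evaporator outlet, 2 = compressor outlet, 4 = evaporator inlet
# (isenthalpic expansion from the condenser outlet).
def cooling_cop(h1, h2, h4):
    refrigeration_effect = h1 - h4   # heat absorbed in the evaporator
    compressor_work = h2 - h1        # work input
    return refrigeration_effect / compressor_work

baseline = cooling_cop(410.0, 445.0, 250.0)   # 'HCFC22-like' placeholders
candidate = cooling_cop(415.0, 450.0, 248.0)  # 'mixture-like' placeholders
print(f'COP change: {100 * (candidate / baseline - 1):+.1f}%')  # about +4%
```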

  18. The European ALMA production antennas: new drive applications for better performances and low cost management

    Science.gov (United States)

    Giacomel, L.; Manfrin, C.; Marchiori, G.

    2008-07-01

    From its first application on the VLT telescopes until today, the linear motor has represented the best solution in terms of quality/cost for technological applications in the astronomical field. Its application in the radio-astronomy sector with the ALMA project combines forefront technology, high reliability and minimal maintenance. The adoption of embedded electronics on each motor sector makes the system modular and redundant, with suppression of EMC disturbances.

  19. The application of digital image plane holography technology to identify Chinese herbal medicine

    Science.gov (United States)

    Wang, Huaying; Guo, Zhongjia; Liao, Wei; Zhang, Zhihui

    2012-03-01

    In this paper, the use of digital image plane holography to identify Chinese herbal medicine is studied. An optical experimental system for digital image plane holography, a special case of pre-magnification digital holography, was built. In the recording setup, a plane wave illuminates the object while a spherical wave serves as the reference beam, and a microscope objective is placed behind the object. The additional phase factor introduced by the microscope objective can be eliminated by choosing the proper position of the reference point source when the hologram is recorded with a spherical wave. In the experiment, Lygodium cells and onion cells were used as objects. The results show that digital image plane holography avoids the search for the recording distance by auto-focusing, and that the phase information of the object can be reconstructed more accurately. The technique is therefore well suited to the microscopic imaging of cells and to the identification of Chinese herbal medicine, and it promotes the application of digital holography in practice.

  20. Performance indicator system with application to NPP management

    International Nuclear Information System (INIS)

    Gomez, J.; Roldan, J.

    2001-01-01

    The objective of this paper is to present the work being conducted in the scope of a research project between Cofrentes NPP and the Polytechnic University of Valencia aimed at the development and implementation of a performance indicator system to support plant management. In developing this system, attention is being paid to the areas of safety, production and dependability. The first step in the project was the development of the performance indicator system (PIS), in order to help assess the effectiveness of the different activities in the plant (i.e. maintenance, inspections, tests, etc.). It is suggested to establish the operational indicator set on three levels. The lowest level concerns indicators monitoring the performance and maintenance characteristics of components. The next involves a subset of indicators placed at system level with a similar goal. Finally, the highest level summarizes the impact of the global policy on the whole plant from the safety and performance points of view. The definition of an indicator should comprise, at least, the following items: the indicator's name, performance area, definition and data needed. A strategy should define what, when and how indicators have to be evaluated, analyzed and reported. This article gives an example application of the methodology at the Cofrentes NPP: collective dose as safety indicator, power production as production indicator and the number of work orders as maintenance indicator are considered, and their time evolution is given. (A.C.)
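
    The indicator definition the paper prescribes (name, performance area, definition, data needed) plus the three-level placement maps naturally onto a small data structure. A minimal sketch with illustrative field names and values, not the plant's actual PIS schema:

```python
# Sketch of a three-level performance indicator record.
from dataclasses import dataclass, field

@dataclass
class Indicator:
    name: str               # e.g. 'collective dose'
    area: str               # safety / production / dependability
    level: str              # 'component', 'system' or 'plant'
    definition: str
    data_needed: list = field(default_factory=list)
    history: list = field(default_factory=list)  # (period, value) pairs

    def record(self, period, value):
        # Append one evaluation so the time evolution can be reported.
        self.history.append((period, value))

dose = Indicator('collective dose', 'safety', 'plant',
                 'accumulated collective dose of plant personnel',
                 data_needed=['dosimetry records'])
dose.record('2000-Q1', 0.35)
print(dose)
```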

  1. Application of secondary ion mass spectrometry for the characterization of commercial high performance materials

    International Nuclear Information System (INIS)

    Gritsch, M.

    2000-09-01

    Industry today offers a vast number of high performance materials that have to meet the highest standards. Commercial high performance materials, though often sold in large quantities, still require ongoing research and development to keep up with increasing needs and decreasing tolerances. Furthermore, a variety of materials on the market are not fully understood in terms of their microstructure, the way they react under application conditions, and the mechanisms responsible for their degradation. Secondary Ion Mass Spectrometry (SIMS) is an analytical method that has now been in commercial use for over 30 years. Its main advantages are its very high detection sensitivity (down to ppb), the ability to measure all elements with isotopic sensitivity, the ability to acquire laterally resolved images, and an inherent capability for depth profiling. These features make it an ideal tool for a wide field of applications within advanced materials science. The present work gives an introduction to the principles of SIMS and shows its successful application to the characterization of commercially used high performance materials. Finally, a selected collection of my publications in reviewed journals illustrates the state of the art in applied materials research and development with dynamic SIMS. All publications focus on the application of dynamic SIMS to analytical questions arising during the production and improvement of high-performance materials. (author)

  2. Performance test of a bladeless turbine for geothermal applications

    Energy Technology Data Exchange (ETDEWEB)

    Steidel, R.; Weiss, H.

    1976-03-24

    The Possell bladeless turbine was tested at the LLL Geothermal Test Facility to evaluate its potential for application in the total flow process. Test description and performance data are given for 3000, 3500, 4000, and 4500 rpm. The maximum engine efficiency observed was less than 7 percent. It is concluded that the Possell turbine is not a viable candidate machine for the conversion of geothermal fluids by the total flow process. (LBS)

  3. Identifying optimum performance trade-offs using a cognitively bounded rational analysis model of discretionary task interleaving.

    Science.gov (United States)

    Janssen, Christian P; Brumby, Duncan P; Dowell, John; Chater, Nick; Howes, Andrew

    2011-01-01

    We report the results of a dual-task study in which participants performed a tracking and typing task under various experimental conditions. An objective payoff function was used to provide explicit feedback on how participants should trade off performance between the tasks. Results show that participants' dual-task interleaving strategy was sensitive to changes in the difficulty of the tracking task and resulted in differences in overall task performance. To test the hypothesis that people select strategies that maximize payoff, a Cognitively Bounded Rational Analysis model was developed. This analysis evaluated a variety of dual-task interleaving strategies to identify the optimal strategy for maximizing payoff in each condition. The model predicts that the region of optimum performance is different between experimental conditions. The correspondence between human data and the prediction of the optimal strategy is found to be remarkably high across a number of performance measures. This suggests that participants were honing their behavior to maximize payoff. Limitations are discussed. Copyright © 2011 Cognitive Science Society, Inc.

  4. Bayesian Comparison of Alternative Graded Response Models for Performance Assessment Applications

    Science.gov (United States)

    Zhu, Xiaowen; Stone, Clement A.

    2012-01-01

    This study examined the relative effectiveness of Bayesian model comparison methods in selecting an appropriate graded response (GR) model for performance assessment applications. Three popular methods were considered: deviance information criterion (DIC), conditional predictive ordinate (CPO), and posterior predictive model checking (PPMC). Using…

  5. SELECTION OF ENDOCRINOLOGY SUBSPECIALTY TRAINEES: WHICH APPLICANT CHARACTERISTICS ARE ASSOCIATED WITH PERFORMANCE DURING FELLOWSHIP TRAINING?

    Science.gov (United States)

    Natt, Neena; Chang, Alice Y; Berbari, Elie F; Kennel, Kurt A; Kearns, Ann E

    2016-01-01

    To determine which residency characteristics are associated with performance during endocrinology fellowship training as measured by competency-based faculty evaluation scores and faculty global ratings of trainee performance. We performed a retrospective review of interview applications from endocrinology fellows who graduated from a single academic institution between 2006 and 2013. Performance measures included competency-based faculty evaluation scores and faculty global ratings. The association between applicant characteristics and measures of performance during fellowship was examined by linear regression. The presence of a laudatory comparative statement in the residency program director's letter of recommendation (LoR) or experience as a chief resident was significantly associated with competency-based faculty evaluation scores (β = 0.22, P = .001; and β = 0.24, P = .009, respectively) and faculty global ratings (β = 0.85, P = .006; and β = 0.96, P = .015, respectively). The presence of a laudatory comparative statement in the residency program director's LoR or experience as a chief resident were significantly associated with overall performance during subspecialty fellowship training. Future studies are needed in other cohorts to determine the broader implications of these findings in the application and selection process.

  6. Characterization of high performance silicon-based VMJ PV cells for laser power transmission applications

    Science.gov (United States)

    Perales, Mico; Yang, Mei-huan; Wu, Cheng-liang; Hsu, Chin-wei; Chao, Wei-sheng; Chen, Kun-hsien; Zahuranec, Terry

    2016-03-01

    Continuing improvements in the cost and power of laser diodes have been critical in launching the emerging fields of power over fiber (PoF) and laser power beaming. Laser power is transmitted either over fiber (for PoF) or through free space (power beaming), and is converted to electricity by photovoltaic cells designed to efficiently convert the laser light. MH GoPower's vertical multi-junction (VMJ) PV cell, designed for high intensity photovoltaic applications, is fueling the emergence of this market by enabling unparalleled photovoltaic receiver flexibility in voltage, cell size, and power output. Our research examined the use of the VMJ PV cell for laser power transmission applications. We fully characterized the performance of the VMJ PV cell under various laser conditions, including multiple near-IR wavelengths and light intensities up to tens of watts per cm2. Results indicated VMJ PV cell efficiency over 40% for 9xx nm wavelengths, at laser power densities near 30 W/cm2. We also investigated the impact of the physical dimensions (length, width, and height) of the VMJ PV cell on its performance, showing similarly high performance across a wide range of cell dimensions. We then evaluated the VMJ PV cell performance within the power over fiber application, examining the cell's effectiveness in receiver packages that deliver target voltage, intensity, and power levels. By designing and characterizing multiple receivers, we illustrated techniques for packaging the VMJ PV cell for achieving high performance (> 30%), high power (> 185 W), and target voltages for power over fiber applications.

  7. A forensic application of PIXE analysis

    International Nuclear Information System (INIS)

    Kravchenko, I.I.; Dunnam, F.E.; Rinsvelt, H.A. van; Warren, M.W.; Falsetti, A.B.

    2001-01-01

    PIXE measurements were performed on various calcareous materials, including identified bone residues, human cremains, and samples of disputed origin. In a forensic application, the elemental analysis suggested that a sample of suspect classification as human cremains could tentatively be identified as a mixture of sandy soil and dolomitic limestone.

  8. The application of metal cutting technologies in tasks performed in radioactive environments

    International Nuclear Information System (INIS)

    Fogle, R.F.; Younkins, R.M.

    1997-01-01

    The design and use of equipment to perform work in radioactive environments is uniquely challenging. Some tasks require that the equipment be operated by a person wearing a plastic suit or full-face respirator and donning several pairs of rubber gloves. Other applications may require that the equipment be remotely controlled. Other important design considerations include material compatibility, mixed waste issues, tolerance to ionizing radiation, size constraints and weight capacities. As always, there is the "we need it ASAP" design criterion. This paper describes four applications where different types of metal cutting technologies were used to successfully perform tasks in radioactive environments. The technologies include a plasma cutting torch, a grinder with an abrasive disk, a hydraulic shear, and a high pressure abrasive water jet cutter.

  9. Application of artificial neural network to identify nuclear materials

    International Nuclear Information System (INIS)

    Xu Peng; Wang Zhe; Li Tiantuo

    2005-01-01

    Applying artificial neural networks, this article studies the technology of identifying gamma spectra of the nuclear material in nuclear components. The theory of spectrum identification by the network is described, and the results of gamma-spectrum identification are given. (authors)
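
    A minimal sketch of the approach, assuming the usual setup of a small classifier trained on spectral channels; the synthetic two-peak spectra below stand in for real detector data and are not from the article:

```python
# Toy gamma-spectrum classification with a small neural network.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
channels = np.arange(128)

def spectrum(peak):
    # Gaussian photopeak on an exponential background, with counting noise.
    s = 50 * np.exp(-((channels - peak) ** 2) / 18) + 200 * np.exp(-channels / 40)
    return rng.poisson(s).astype(float)

# Two hypothetical material classes with photopeaks at different channels.
X = np.array([spectrum(p) for p in [30] * 200 + [75] * 200])
y = np.array([0] * 200 + [1] * 200)

clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
clf.fit(X[::2], y[::2])                          # train on half the spectra
print('accuracy:', clf.score(X[1::2], y[1::2]))  # test on the other half
```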

  10. Parallel Application Performance on Two Generations of Intel Xeon HPC Platforms

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Christopher H.; Long, Hai; Sides, Scott; Vaidhynathan, Deepthi; Jones, Wesley

    2015-10-15

    Two next-generation node configurations hosting the Haswell microarchitecture were tested with a suite of microbenchmarks and application examples, and compared with a current Ivy Bridge production node on NREL's Peregrine high-performance computing cluster. A primary conclusion from this study is that the additional cores are of little value to individual task performance--limitations to application parallelism, or resource contention among concurrently running but independent tasks, limits effective utilization of these added cores. Hyperthreading generally impacts throughput negatively, but can improve performance in the absence of detailed attention to runtime workflow configuration. The observations offer some guidance to procurement of future HPC systems at NREL. First, raw core count must be balanced with available resources, particularly memory bandwidth. Balance-of-system will determine value more than processor capability alone. Second, hyperthreading continues to be largely irrelevant to the workloads that are commonly seen, and were tested here, at NREL. Finally, perhaps the most impactful enhancement to productivity might occur through enabling multiple concurrent jobs per node. Given the right type and size of workload, more may be achieved by doing many slow things at once, than fast things in order.

  11. Research on dynamic performance design of mobile phone application based on context awareness

    Science.gov (United States)

    Bo, Zhang

    2018-05-01

    This study explores the dynamic (motion) performance of different mobile phone applications and users' cognitive differences, with the aim of reducing cognitive burden and enhancing the sense of experience. It analyzes dynamic design performance in four different interactive contexts and constructs a framework for the information service process under interactive context perception, together with two perception principles for the cognitive consensus between designer and user. Analysis of the context helps users sense dynamic effects more intuitively, so that the details of interaction are performed more vividly and smoothly, enhancing the user's experience of the interactive process. A shared perceptual experience enables designers and users to produce emotional resonance in different interactive contexts and helps them rapidly understand the interactive content and perceive its logic, hierarchy and structure, thereby improving the effectiveness of mobile applications.

  12. Durango: Scalable Synthetic Workload Generation for Extreme-Scale Application Performance Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Carothers, Christopher D. [Rensselaer Polytechnic Institute (RPI); Meredith, Jeremy S. [ORNL; Blanco, Marc [Rensselaer Polytechnic Institute (RPI); Vetter, Jeffrey S. [ORNL; Mubarak, Misbah [Argonne National Laboratory; LaPre, Justin [Rensselaer Polytechnic Institute (RPI); Moore, Shirley V. [ORNL

    2017-05-01

    Performance modeling of extreme-scale applications on accurate representations of potential architectures is critical for designing next generation supercomputing systems because it is impractical to construct prototype systems at scale with new network hardware in order to explore designs and policies. However, these simulations often rely on static application traces that can be difficult to work with because of their size and lack of flexibility to extend or scale up without rerunning the original application. To address this problem, we have created a new technique for generating scalable, flexible workloads from real applications, and we have implemented a prototype, called Durango, that combines a proven analytical performance modeling language, Aspen, with the massively parallel HPC network modeling capabilities of the CODES framework. Our models are compact, parameterized and representative of real applications with computation events. They are not resource intensive to create and are portable across simulator environments. We demonstrate the utility of Durango by simulating the LULESH application in the CODES simulation environment on several topologies and show that Durango is practical to use for simulation without loss of fidelity, as quantified by simulation metrics. During our validation of Durango's generated communication model of LULESH, we found that the original LULESH miniapp code had a latent bug where the MPI_Waitall operation was used incorrectly. This finding underscores the potential need for a tool such as Durango, beyond its benefits for flexible workload generation and modeling. Additionally, we demonstrate the efficacy of Durango's direct integration approach, which links Aspen into CODES as part of the running network simulation model. Here, Aspen generates the application-level computation timing events, which in turn drive the start of a network communication phase. Results show that Durango's performance scales well when
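
    Durango's direct-integration idea, an analytical model supplying computation times that drive communication phases in the network simulation, can be caricatured in a few lines. The cost model here is a stand-in for Aspen and the fixed delay a stand-in for CODES; both are gross simplifications for illustration only.

```python
# Toy alternation of modeled compute phases and simulated network phases.
def compute_time(n_elements, flops_per_element, flops_per_sec=1e9):
    # Aspen-like analytical kernel model: total work / machine rate.
    return n_elements * flops_per_element / flops_per_sec

clock = 0.0
for step in range(3):
    # Computation timing event from the analytical model...
    clock += compute_time(n_elements=1_000_000, flops_per_element=50)
    # ...triggers a communication phase (here a fixed placeholder delay
    # instead of a real packet-level network simulation).
    clock += 0.002
    print(f'step {step} done at t = {clock * 1e3:.1f} ms')
```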

  13. Performance Evaluation of New Generation CdZnTe Detectors for Safeguards Applications

    International Nuclear Information System (INIS)

    Ivanovs, V.; Mintcheva, J.; Berlizov, A.; Lebrun, A.

    2015-01-01

    Cadmium zinc telluride (CdZnTe) detectors have found wide application in nondestructive assay measurements in the IAEA's verification practice. Because of their form factor, usability, sensitivity and good spectral characteristics, they are extensively used for fresh and spent fuel attribute test measurements. Until now, the series of CdZnTe detectors utilized by the IAEA has covered the range of 5 mm³, 20 mm³, 60 mm³ and 500 mm³ sensitive volume. Recently, new CdZnTe detectors with improved spectroscopic characteristics and significantly bigger active volume have become available, owing to advances in crystal and detector manufacturing and signal processing technologies. The distinctive feature of this new technological development is the application of low-intensity monochromatic optical stimulation with infrared (IR) light. The use of IR illumination with a properly chosen wavelength, close to the absorption edge of CdZnTe, can significantly improve the performance of the detectors. Recognizing the potential benefits of these detectors in safeguards applications, the IAEA has performed an evaluation of their performance characteristics. Under evaluation were several new detectors with sensitive volumes of 500 mm³, 1500 mm³ and 4000 mm³, as well as all-in-one 60 mm³, 500 mm³ and 1500 mm³ integrated micro-spectrometers available from RITEC, Latvia. In addition to the standard performance characteristics, such as energy resolution, peak shape, efficiency, linearity, throughput and temperature stability, the potential use of the detectors for safeguards-specific measurements, such as uranium enrichment with the infinite-thickness method, was of particular interest. The paper will describe the advances in CdZnTe detector technology and present the results of their performance evaluation. (author)

  14. Identifying colon cancer risk modules with better classification performance based on human signaling network.

    Science.gov (United States)

    Qu, Xiaoli; Xie, Ruiqiang; Chen, Lina; Feng, Chenchen; Zhou, Yanyan; Li, Wan; Huang, Hao; Jia, Xu; Lv, Junjie; He, Yuehan; Du, Youwen; Li, Weiguo; Shi, Yuchen; He, Weiming

    2014-10-01

    Identifying differences between normal and tumor samples from a modular perspective may help to improve our understanding of the mechanisms responsible for colon cancer. Many cancer studies have shown that signaling transduction and biological pathways are disturbed in disease states, and that expression profiles can distinguish variations in diseases. In this study, we integrated a weighted human signaling network and gene expression profiles to select risk modules associated with tumor conditions. Risk modules used as classification features by our method gave better classification performance than those of other methods, and one risk module for colon cancer performed well at distinguishing between normal/tumor samples and between tumor stages. All genes in the module were annotated to the biological process of positive regulation of cell proliferation and were highly associated with colon cancer. These results suggest that these genes might be potential risk genes for colon cancer. Copyright © 2013. Published by Elsevier Inc.
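
    Using modules as classification features typically means summarizing each module's member-gene expression into one feature per sample and feeding those features to a standard classifier. A minimal sketch on synthetic data (the paper's actual pipeline, including module discovery on the signaling network, is more involved):

```python
# Module-level expression features for normal/tumor classification.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_samples, n_genes = 80, 500
expr = rng.normal(size=(n_samples, n_genes))
labels = np.array([0] * 40 + [1] * 40)   # normal vs tumor
expr[labels == 1, :25] += 1.0            # genes 0..24 act as a 'risk module'

# One feature per module: mean expression of its member genes.
modules = {'risk_module': range(25), 'random_module': range(100, 125)}
features = np.column_stack([expr[:, list(g)].mean(axis=1)
                            for g in modules.values()])

scores = cross_val_score(LogisticRegression(), features, labels, cv=5)
print('mean CV accuracy:', scores.mean().round(2))
```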

  15. High-performance heat pipes for heat recovery applications

    Science.gov (United States)

    Saaski, E. W.; Hartl, J. H.

    1980-01-01

    Methods to improve the performance of reflux heat pipes for heat recovery applications were examined both analytically and experimentally. Various models for the estimation of reflux heat pipe transport capacity were surveyed in the literature and compared with experimental data. A high transport capacity reflux heat pipe was developed that provides up to a factor of 10 capacity improvement over conventional open tube designs; analytical models were developed for this device and incorporated into a computer program HPIPE. Good agreement of the model predictions with data for R-11 and benzene reflux heat pipes was obtained.

  16. Human performance improvement in organizations: Potential application for the nuclear industry

    International Nuclear Information System (INIS)

    2005-11-01

    This publication is primarily intended for managers and specialists in nuclear facility operating organizations working in the area of human performance improvement. It is intended to provide them with practical information they can use to improve human performance in their organizations. While some of the information provided in this publication is based upon the experience of nuclear facility operating organizations, most of it comes from human performance improvement initiatives in non-nuclear organizations and industries. The nuclear industry has a long tradition of sharing good management practices in order to foster continuous improvement. However, it is not always realized that many of the practices that are now well established initially came from non-nuclear industries and were subsequently adapted for application to nuclear power plant operating organizations. There is, therefore, good reason to periodically review non-nuclear industry practices for ideas that might have direct or indirect application to the nuclear industry in order to potentially gain benefits such as the following: new approaches to certain problem areas, insights into new or impending challenges, improvements in existing practices, benchmarking of opportunities, development of learning organizations and avoidance of collective blind spots. The preparation of this report was an activity of the project on Effective Training to Achieve Excellence in the Performance of NPP Personnel. The objective of this project is to enhance the capability of Member States to utilize proven practices developed and transferred by the IAEA for improving personnel performance. The expected outcome from this project is the increased use by organizations in Members States of proven engineering and management practices and methodologies developed and transferred by the IAEA to improve personnel performance

  17. Inclusive vision for high performance computing at the CSIR

    CSIR Research Space (South Africa)

    Gazendam, A

    2006-02-01

    Full Text Available and computationally intensive applications. A number of different technologies and standards were identified as core to the open and distributed high-performance infrastructure envisaged...

  18. Modified performance test of vented lead acid batteries for stationary applications

    International Nuclear Information System (INIS)

    Uhlir, K.W.; Fletcher, R.J.

    1995-01-01

    The concept of a modified performance test for vented lead acid batteries in stationary applications has been developed by the IEEE Battery Working Group. The modified performance test is defined as a test, in the "as found" condition of the battery, of its capacity and of its ability to provide a high-rate, short-duration load (usually the highest rate of the duty cycle); it confirms the battery's ability to meet the critical period of the load duty cycle in addition to determining its percentage of rated capacity. This paper begins by reviewing performance and service test requirements and the concerns associated with both types of tests. It then discusses the rationale for developing a modified performance test, along with the benefits that can be derived from performing a modified performance test in lieu of a capacity test and/or a service test. The paper concludes with an example of how to apply a modified performance test and its acceptance criteria.
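
    The capacity arithmetic behind such tests is simple: percent of rated capacity is the ratio of actual to rated discharge time at the test rate, the usual time-adjusted formula for lead acid capacity testing. A minimal sketch with illustrative numbers, not values from the paper:

```python
# Percent of rated capacity from discharge times at the same test rate.
def percent_rated_capacity(actual_minutes, rated_minutes):
    return 100.0 * actual_minutes / rated_minutes

# e.g. a battery rated for 180 min at the test rate lasted 171 min
print(percent_rated_capacity(171, 180))  # 95.0 -> above a typical
                                         # 80% replacement criterion
```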

  19. Performance profiling for brachytherapy applications

    Science.gov (United States)

    Choi, Wonqook; Cho, Kihyeon; Yeo, Insung

    2018-05-01

    In many physics applications, a significant amount of software (e.g. R, ROOT and Geant4) is developed on novel computing architectures, and much effort is expended to ensure the software is efficient in terms of central processing unit (CPU) time and memory usage. Profiling tools are used to evaluate this efficiency; however, few such tools are able to accommodate the low-energy physics regime. To address this limitation, we developed a low-energy physics profiling system for Geant4 that profiles the CPU time and memory usage of software in brachytherapy applications. This paper describes and evaluates specific physics models applied to brachytherapy applications in Geant4, such as QGSP_BIC_LIV, QGSP_BIC_EMZ, and QGSP_BIC_EMY. The physics range covered by this tool allows it to generate low-energy profiles for brachytherapy applications. This was a limitation of previous studies, which led us to develop a new profiling tool that supports profiling in the MeV range, in contrast to the TeV range supported by existing high-energy profiling tools. In order to easily compare profiling results between low-energy and high-energy modes, we employed the same software architecture as the SimpliCarlo tool developed at Fermi National Accelerator Laboratory (FNAL) for the Large Hadron Collider (LHC). The results show that the newly developed profiling system for low-energy physics (less than MeV) complements the current profiling system used for high-energy physics (greater than TeV) applications.

  20. Evaluation of the feasibility and performance of early warning scores to identify patients at risk of adverse outcomes in a low-middle income country setting

    Science.gov (United States)

    Beane, Abi; De Silva, Ambepitiyawaduge Pubudu; De Silva, Nirodha; Sujeewa, Jayasingha A; Rathnayake, R M Dhanapala; Sigera, P Chathurani; Athapattu, Priyantha Lakmini; Mahipala, Palitha G; Rashan, Aasiyah; Munasinghe, Sithum Bandara; Jayasinghe, Kosala Saroj Amarasiri; Dondorp, Arjen M; Haniffa, Rashan

    2018-01-01

    Objective This study describes the availability of core parameters for Early Warning Scores (EWS), evaluates the ability of selected EWS to identify patients at risk of death or other adverse outcome and describes the burden of triggering that front-line staff would experience if implemented. Design Longitudinal observational cohort study. Setting District General Hospital Monaragala. Participants All adult (age >17 years) admitted patients. Main outcome measures Existing physiological parameters, adverse outcomes and survival status at hospital discharge were extracted daily from existing paper records for all patients over an 8-month period. Statistical analysis Discrimination for selected aggregate weighted track and trigger systems (AWTTS) was assessed by the area under the receiver operating characteristic (AUROC) curve. Performance of the EWS is further evaluated at time points during admission and across diagnostic groups. The burden of triggering required to correctly identify patients who died was evaluated using positive predictive value (PPV). Results Of the 16 386 patients included, 502 (3.06%) had one or more adverse outcomes (cardiac arrests, unplanned intensive care unit admissions and transfers). Availability of physiological parameters on admission ranged from 90.97% (95% CI 90.52% to 91.40%) for heart rate to 23.94% (95% CI 23.29% to 24.60%) for oxygen saturation. Ability to discriminate death on admission was less than 0.81 (AUROC) for all selected EWS. Performance of the best performing of the EWS varied depending on admission diagnosis, and was diminished at 24 hours prior to the event. PPV was low (10.44%). Conclusion There is limited observation reporting in this setting. Indiscriminate application of EWS to all patients admitted to wards in this setting may result in an unnecessary burden of monitoring and may detract from clinician care of sicker patients. Physiological parameters in combination with diagnosis may have a place when applied on admission to
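
    The two headline evaluations, discrimination by AUROC and triggering burden by PPV at a threshold, are easy to sketch. The scores and outcomes below are synthetic and the trigger threshold of 6 is hypothetical; none of it reproduces the study's data.

```python
# AUROC and triggering burden (PPV) for a toy early warning score.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)
died = rng.random(5000) < 0.03                             # ~3% event rate
ews = rng.poisson(2, 5000) + died * rng.poisson(3, 5000)   # sicker -> higher

print('AUROC:', round(roc_auc_score(died, ews), 2))

trigger = ews >= 6                     # hypothetical trigger threshold
ppv = (trigger & died).sum() / trigger.sum()
print('alerts per 1000 admissions:', round(1000 * trigger.mean()),
      '| PPV:', round(ppv, 2))
```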

  1. Accelerating Scientific Applications using High Performance Dense and Sparse Linear Algebra Kernels on GPUs

    KAUST Repository

    Abdelfattah, Ahmad

    2015-01-15

    High performance computing (HPC) platforms are evolving to more heterogeneous configurations to support the workloads of various applications. The current hardware landscape is composed of traditional multicore CPUs equipped with hardware accelerators that can handle high levels of parallelism. Graphical Processing Units (GPUs) are popular high performance hardware accelerators in modern supercomputers. GPU programming has a different model than that for CPUs, which means that many numerical kernels have to be redesigned and optimized specifically for this architecture. GPUs usually outperform multicore CPUs in some compute intensive and massively parallel applications that have regular processing patterns. However, most scientific applications rely on crucial memory-bound kernels and may witness bottlenecks due to the overhead of the memory bus latency. They can still take advantage of the GPU compute power capabilities, provided that an efficient architecture-aware design is achieved. This dissertation presents a uniform design strategy for optimizing critical memory-bound kernels on GPUs. Based on hierarchical register blocking, double buffering and latency hiding techniques, this strategy leverages the performance of a wide range of standard numerical kernels found in dense and sparse linear algebra libraries. The work presented here focuses on matrix-vector multiplication kernels (MVM) as representative and most important memory-bound operations in this context. Each kernel inherits the benefits of the proposed strategies. By exposing a proper set of tuning parameters, the strategy is flexible enough to suit different types of matrices, ranging from large dense matrices, to sparse matrices with dense block structures, while high performance is maintained. Furthermore, the tuning parameters are used to maintain the relative performance across different GPU architectures. Multi-GPU acceleration is proposed to scale the performance on several devices.

  2. Development and Performance Analysis of a Photonics-Assisted RF Converter for 5G Applications

    Science.gov (United States)

    Borges, Ramon Maia; Muniz, André Luiz Marques; Sodré Junior, Arismar Cerqueira

    2017-03-01

    This article presents a simple, ultra-wideband and tunable radiofrequency (RF) converter for 5G cellular networks. The proposed optoelectronic device performs broadband photonics-assisted upconversion and downconversion using a single optical modulator. Experimental results demonstrate RF conversion from DC to millimeter waves, including 28 and 38 GHz that are potential frequency bands for 5G applications. Narrow linewidth and low phase noise characteristics are observed in all generated RF carriers. An experimental digital performance analysis using different modulation schemes illustrates the applicability of the proposed photonics-based device in reconfigurable optical wireless communications.

  3. Study of a methodology of identifying important research problems by the PIRT process

    International Nuclear Information System (INIS)

    Aoki, Takayuki; Takagi, Toshiyuki; Urayama, Ryoichi; Komura, Ichiro; Furukawa, Takashi; Yusa, Noritaka

    2013-01-01

    In this paper, we propose a new methodology of identifying important research problems to be solved to improve the performance of some specific scientific technologies by the phenomena identification and ranking table (PIRT) process, which has been used as a methodology for demonstrating the validity of the best estimate simulation codes in USNRC licensing of nuclear power plants. It keeps the fundamental concepts of the original PIRT process but makes it possible to identify important factors affecting the performance of the technologies from the viewpoint of the figure of merit and problems associated with them, which need to be solved to improve the performance. Also in this paper, we demonstrate the effectiveness of the developed method by showing a specific example of the application to physical events or phenomena in objects having fatigue or SCC crack(s) under ultrasonic testing and eddy current testing. (author)

  4. Project management and performance management: potential transdisciplinary contributions

    Directory of Open Access Journals (Sweden)

    Gerrit van der Waldt

    2012-12-01

    Full Text Available As project management and performance management gain momentum as management applications in public sector settings, the question often arises as to whether, how, and when these applications should complement each other in various policy implementation and service delivery initiatives. Answers to this question should be sought from various vantage points or perspectives. These vantage points may range from macro, meso and micro perspectives to theoretical-methodological ones. The purpose of this paper is to unlock the potential for transdisciplinary contributions between project management and performance management by focusing on the methodologies, functional areas, and practical applications of both management disciplines. It is argued that the respective methodologies and their processes should be unpacked to identify the timing or moment when each discipline could, and should, make a contribution to the success of the other. This will add value to the theoretical underpinnings and practical applications of both study domains in the public sector. The respective contributions are illustrated by means of application realities of both management practices in the South African Public Service. Keywords: project management, performance management, public sector applications, transdisciplinarity Disciplines: project management, performance management

  5. The development and performance testing of a biodegradable scale inhibitor

    Energy Technology Data Exchange (ETDEWEB)

    Hardy, Julie; Fidoe, Steve; Jones, Chris

    2006-03-15

    The oil industry is currently facing severe restrictions concerning the discharge of oilfield chemicals into the environment. Many materials commonly used in both topside and downhole applications have been flagged for substitution in the North Sea, and more will be identified. The development of biodegradable, low-toxicity chemicals that afford equal or improved efficacy compared to conventional technology, available at a competitive price, is a current industry challenge. A range of biodegradable materials is increasingly available; however, their limited performance can restrict the range of applications. This paper discusses the development and commercialization of a readily biodegradable scale inhibitor ideal for use in topside applications. This material offers a broad spectrum of activity, notably efficiency against barium sulphate, calcium sulphate and calcium carbonate scales, in a range of water chemistries. Performance, compatibility and stability test data, together with an OCNS dataset, are presented. Comparisons with commonly used chemicals have been made to demonstrate the superior performance of this phosphate ester. This paper discusses a scale inhibitor suitable for use in a variety of conditions which offers enhanced performance combined with a favourable biodegradation profile. This material is of great benefit to the industry, particularly in North Sea applications. (author) (tk)

  6. Identifying genetic relatives without compromising privacy.

    Science.gov (United States)

    He, Dan; Furlotte, Nicholas A; Hormozdiari, Farhad; Joo, Jong Wha J; Wadia, Akshay; Ostrovsky, Rafail; Sahai, Amit; Eskin, Eleazar

    2014-04-01

    The development of high-throughput genomic technologies has impacted many areas of genetic research. While many applications of these technologies focus on the discovery of genes involved in disease from population samples, applications of genomic technologies to an individual's genome or personal genomics have recently gained much interest. One such application is the identification of relatives from genetic data. In this application, genetic information from a set of individuals is collected in a database, and each pair of individuals is compared in order to identify genetic relatives. An inherent issue that arises in the identification of relatives is privacy. In this article, we propose a method for identifying genetic relatives without compromising privacy by taking advantage of novel cryptographic techniques customized for secure and private comparison of genetic information. We demonstrate the utility of these techniques by allowing a pair of individuals to discover whether or not they are related without compromising their genetic information or revealing it to a third party. The idea is that individuals only share enough special-purpose cryptographically protected information with each other to identify whether or not they are relatives, but not enough to expose any information about their genomes. We show in HapMap and 1000 Genomes data that our method can recover first- and second-order genetic relationships and, through simulations, show that our method can identify relationships as distant as third cousins while preserving privacy.

  7. Identified best environmental management practices to improve the energy performance of the retail trade sector in Europe

    International Nuclear Information System (INIS)

    Galvez-Martos, Jose-Luis; Styles, David; Schoenberger, Harald

    2013-01-01

    The retail trade sector has been identified as a target sector for the development of sectoral reference documents on best environmental management practices under the Eco-Management and Audit Scheme. This paper focuses on the important energy-related needs in retailers' stores, such as food refrigeration and lighting, as well as heating, ventilation and air conditioning of the building. For the definition of best environmental management practices in the European framework, frontrunner retailers have been identified as those integrating energy minimization and saving measures as standard practice systematically across stores. These best performers also integrate a comprehensive monitoring system into the energy management of every store or building belonging to the company, enabling the rapid identification of energy saving opportunities. An integrative approach is needed to define how best practices should be implemented in combination to optimize energy management within stores: building aspects, such as insulation of the building envelope or the heating, ventilation and air conditioning system, should be optimized in combination with the best options for refrigeration in food retailers. Refrigeration systems are responsible for half of the final energy use in stores and of their carbon footprint. Natural refrigerants, heat recovery from the condensation stage and covering of display cases are measures with high environmental benefits that reduce the impact of refrigeration. Finally, practices for lighting, such as optimal lighting strategies, and the integration of renewable energy sources in overall zero-energy building concepts can save considerable amounts of fossil energy, reduce the carbon footprint and produce significant cost savings in the long term. - Highlights: • There is a high energy performance improvement potential in the retail trade sector. • We propose techniques with a high performance level that are applied by frontrunners. • We identified

  8. Can surveillance systems identify and avert adverse drug events? A prospective evaluation of a commercial application.

    Science.gov (United States)

    Jha, Ashish K; Laguette, Julia; Seger, Andrew; Bates, David W

    2008-01-01

    Computerized monitors can effectively detect and potentially prevent adverse drug events (ADEs). Most monitors have been developed in large academic hospitals and are not readily usable in other settings. We assessed the ability of a commercial program to identify and prevent ADEs in a community hospital. We prospectively evaluated the commercial application in a community-based hospital, examining the frequency and types of alerts produced, how often they were associated with ADEs and potential ADEs, and the potential financial impact of monitoring for ADEs. Among 2,407 patients screened, the application generated 516 high priority alerts. We were able to review 266 alerts at the time they were generated, and among these, 30 (11.3%) were considered sufficiently important to warrant contacting the physician caring for the patient. These 30 alerts were associated with 4 ADEs and 11 potential ADEs. In all 15 cases, the responsible physician was unaware of the event, leading to a change in clinical care in 14 cases. Overall, 23% of high priority alerts were associated with an ADE (95% confidence interval [CI] 12% to 34%) and another 15% were associated with a potential ADE (95% CI 6% to 24%). Active surveillance used approximately 1.5 hours of pharmacist time daily. A commercially available, computer-based ADE detection tool was effective at identifying ADEs. When used as part of an active surveillance program, it can have an impact on preventing or ameliorating ADEs.
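
    The reported proportions with confidence intervals follow from standard binomial arithmetic. A minimal sketch using the normal approximation, with illustrative counts chosen so the output lands near the quoted 23% (95% CI 12% to 34%); these are not the study's raw numbers.

```python
# Proportion with a normal-approximation 95% confidence interval.
import math

def proportion_ci(events, total, z=1.96):
    p = events / total
    half = z * math.sqrt(p * (1 - p) / total)
    return p, max(0.0, p - half), min(1.0, p + half)

# Illustrative counts only: 13 ADE-associated alerts out of 56 reviewed.
p, lo, hi = proportion_ci(13, 56)
print(f'{p:.0%} (95% CI {lo:.0%} to {hi:.0%})')
```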

  9. High Performance Multi-GPU SpMV for Multi-component PDE-Based Applications

    KAUST Repository

    Abdelfattah, Ahmad; Ltaief, Hatem; Keyes, David E.

    2015-01-01

    -block structure. While these optimizations are important for high performance dense kernel executions, they are even more critical when dealing with sparse linear algebra operations. The most time-consuming phase of many multicomponent applications, such as models

  10. LHCb: Statistical Comparison of CPU performance for LHCb applications on the Grid

    CERN Multimedia

    Graciani, R

    2009-01-01

    The usage of CPU resources by LHCb on the Grid is dominated by two different applications: Gauss and Brunel. Gauss is the application performing the Monte Carlo simulation of proton-proton collisions. Brunel is the application responsible for the reconstruction of the signals recorded by the detector, converting them into objects that can be used for later physics analysis of the data (tracks, clusters, …). Both applications are based on the Gaudi and LHCb software frameworks. Gauss uses Pythia and Geant as underlying libraries for the simulation of the collision and the subsequent passage of the generated particles through the LHCb detector, while Brunel makes use of LHCb-specific code to process the data from each sub-detector. Both applications are CPU bound. Large Monte Carlo productions or data reconstructions running on the Grid are an ideal benchmark to compare the performance of different CPU models for each case. Since the processed events are only statistically comparable, only statistical comparison of the...

  11. Performance Analysis, Modeling and Scaling of HPC Applications and Tools

    Energy Technology Data Exchange (ETDEWEB)

    Bhatele, Abhinav [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-01-13

    Efficient use of supercomputers at DOE centers is vital for maximizing system throughput, minimizing energy costs and enabling science breakthroughs faster. This requires complementary efforts along several directions to optimize the performance of scientific simulation codes and the underlying runtimes and software stacks. This in turn requires providing scalable performance analysis tools and modeling techniques that can provide feedback to physicists and computer scientists developing the simulation codes and runtimes respectively. The PAMS project is using time allocations on supercomputers at ALCF, NERSC and OLCF to further the goals described above by performing research along the following fronts: 1. Scaling Study of HPC applications; 2. Evaluation of Programming Models; 3. Hardening of Performance Tools; 4. Performance Modeling of Irregular Codes; and 5. Statistical Analysis of Historical Performance Data. We are a team of computer and computational scientists funded by both DOE/NNSA and DOE/ASCR programs such as ECRP, XStack (Traleika Glacier, PIPER), ExaOSR (ARGO), SDMAV II (MONA) and PSAAP II (XPACC). This allocation will enable us to study big data issues when analyzing performance on leadership computing class systems and to assist the HPC community in making the most effective use of these resources.

  12. High performance protection circuit for power electronics applications

    Energy Technology Data Exchange (ETDEWEB)

    Tudoran, Cristian D., E-mail: cristian.tudoran@itim-cj.ro; Dădârlat, Dorin N.; Toşa, Nicoleta; Mişan, Ioan [National Institute for Research and Development of Isotopic and Molecular Technologies, 67-103 Donat, PO 5 Box 700, 400293 Cluj-Napoca (Romania)

    2015-12-23

    In this paper we present a high performance protection circuit designed for power electronics applications in which the load current can increase rapidly and exceed the maximum allowed values, as in high frequency induction heating inverters or high frequency plasma generators. The protection circuit is based on a microcontroller and can be adapted for use on single-phase or three-phase power systems. Its versatility comes from the fact that the circuit can communicate with the protected system, acting as a “sensor”, or it can interrupt the power supply for protection, in this case functioning as an external, independent protection circuit.
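
    The supervisory logic of such a protection circuit typically combines an instantaneous trip with a time-delayed one. A minimal, runnable sketch of that logic follows; read_current_a() and open_supply() are hypothetical hardware hooks, the thresholds are placeholders, and none of this is the authors' firmware.

```python
# Overcurrent supervision: hard (instantaneous) and soft (sustained) trips.
import time

HARD_LIMIT_A = 50.0   # instantaneous trip threshold
SOFT_LIMIT_A = 35.0   # sustained-overcurrent threshold
HOLD_S = 0.010        # how long a soft overcurrent may persist

def monitor(read_current_a, open_supply):
    over_since = None
    while True:
        i = read_current_a()
        if i >= HARD_LIMIT_A:
            open_supply()                    # interrupt the power supply
            return 'hard trip at %.1f A' % i
        if i >= SOFT_LIMIT_A:
            over_since = over_since if over_since is not None else time.monotonic()
            if time.monotonic() - over_since >= HOLD_S:
                open_supply()
                return 'soft trip at %.1f A' % i
        else:
            over_since = None                # back in range: reset the timer

# Demo with a fake sensor whose current ramps up over time.
t0 = time.monotonic()
print(monitor(lambda: 30.0 + 400.0 * (time.monotonic() - t0),
              lambda: None))
```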

  13. Effects of bioethanol ultrasonic generated aerosols application on diesel engine performances

    Directory of Open Access Journals (Sweden)

    Mariasiu Florin

    2015-01-01

    Full Text Available In this paper, the effects of bioethanol fumigation, applied with an experimental ultrasound device, on the performance and emissions of a single-cylinder diesel engine have been experimentally investigated. Engine performance and pollutant emission variations were considered for three different types of fuels (biodiesel, biodiesel-bioethanol blend, and biodiesel with fumigated bioethanol). Reductions in brake specific fuel consumption and NOx pollutant emissions are correlated with the use of ultrasonic fumigation of bioethanol fuel, compared to the use of a biodiesel-bioethanol blend. Considering fuel consumption as the diesel engine’s main performance parameter, the proposed bioethanol fumigation method offers the possibility of using renewable biofuels (bioethanol) more efficiently, with immediate benefits for environmental protection.

  14. A Hierarchy of Network Performance Characteristics for Grid Applications and Services

    Energy Technology Data Exchange (ETDEWEB)

    Lowekamp, B

    2004-07-06

    This document describes a standard set of network characteristics that are useful for Grid applications and services as well as a classification hierarchy for these characteristics. The goal of this work is to identify the various types of network measurements according to the network characteristic they measure and the network entity on which they are taken. This document defines standard terminology to describe those measurements, but it does not attempt to define new standard measurement methodologies or attempt to define the best measurement methodologies to use for grid applications. However, it does attempt to point out the advantages and disadvantages of different measurement methodologies. This document was motivated by the need for the interchange of measurements taken by various systems in the Grid and to develop a common dictionary to facilitate discussions about and specifications for measurement systems. The application of this naming system will facilitate the creation of common schemata for describing network monitoring data in Grid Monitoring and Discovery Services, and thus help to address portability issues between the wide variety of network measurements used between sites of a Grid.
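
    As a rough illustration of the naming idea, each measurement can be tagged with the characteristic it measures, the network entity it is taken on, and the methodology used; the field names and example values below are illustrative assumptions, not the document's normative schema:

        from dataclasses import dataclass

        @dataclass
        class Measurement:
            characteristic: str   # e.g. "bandwidth.capacity", "delay.oneway"
            entity: str           # e.g. "path:siteA->siteB", "hop:router7"
            methodology: str      # how it was measured (affects interpretation)
            value: float
            unit: str

        m = Measurement("delay.oneway", "path:siteA->siteB",
                        "timestamped probe packets", 11.3, "ms")
        print(f"{m.characteristic} on {m.entity}: {m.value} {m.unit}")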

  15. A Performance/Cost Evaluation for a GPU-Based Drug Discovery Application on Volunteer Computing

    Science.gov (United States)

    Guerrero, Ginés D.; Imbernón, Baldomero; García, José M.

    2014-01-01

    Bioinformatics is an interdisciplinary research field that develops tools for the analysis of large biological databases, and, thus, the use of high performance computing (HPC) platforms is mandatory for the generation of useful biological knowledge. The latest generation of graphics processing units (GPUs) has democratized the use of HPC as they push desktop computers to cluster-level performance. Many applications within this field have been developed to leverage these powerful and low-cost architectures. However, these applications still need to scale to larger GPU-based systems to enable remarkable advances in the fields of healthcare, drug discovery, genome research, etc. The inclusion of GPUs in HPC systems exacerbates power and temperature issues, increasing the total cost of ownership (TCO). This paper explores the benefits of volunteer computing to scale bioinformatics applications as an alternative to owning large GPU-based local infrastructures. We use as a benchmark a GPU-based drug discovery application called BINDSURF, whose computational requirements go beyond those of a single desktop machine. Volunteer computing is presented as a cheap and valid HPC system for those bioinformatics applications that need to process huge amounts of data and where the response time is not a critical factor. PMID:25025055

  16. Identifying Copy Number Variants under Selection in Geographically Structured Populations Based on FST-statistics

    Directory of Open Access Journals (Sweden)

    Hae-Hiang Song

    2012-06-01

    Full Text Available Large-scale copy number variants (CNVs) in the human genome provide the raw material for delineating population differences, as natural selection may have affected at least some of the CNVs thus far discovered. Although the examination of relatively large numbers of specific ethnic groups has recently begun with regard to inter-ethnic group differences in CNVs, particular instances of natural selection have not yet been identified and understood. The traditional FST measure, obtained from differences in allele frequencies between populations, has been used to identify CNV loci subject to geographically varying selection. Here, we review advances in, and the application of, multinomial-Dirichlet likelihood methods of inference for identifying genome regions that have been subject to natural selection with the FST estimates. The content presented is not new; however, this review clarifies how these methods can be applied to CNV data, which remain largely unexplored. A hierarchical Bayesian method, implemented via Markov Chain Monte Carlo, estimates locus-specific FST and can identify outlying CNV loci with large values of FST. By applying this Bayesian method to publicly available CNV data, we identified CNV loci that show signals of natural selection, which may elucidate the genetic basis of human disease and diversity.
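
    As a toy illustration of the traditional FST screen described above (not the hierarchical Bayesian MCMC method itself), locus-specific FST can be estimated from subpopulation allele frequencies and unusually large values flagged; all data and the outlier rule below are invented:

        import numpy as np

        # Invented CNV allele frequencies: rows = loci, columns = populations.
        p = np.array([[0.10, 0.12, 0.11],
                      [0.50, 0.48, 0.52],
                      [0.05, 0.60, 0.35]])    # last locus: strong differentiation

        p_bar = p.mean(axis=1)                         # mean frequency per locus
        fst = p.var(axis=1) / (p_bar * (1.0 - p_bar))  # F_ST = Var(p) / p(1 - p)

        cutoff = 5.0 * np.median(fst)                  # crude empirical outlier rule
        for i, f in enumerate(fst):
            flag = "  <- candidate for selection" if f > cutoff else ""
            print(f"locus {i}: F_ST = {f:.3f}{flag}")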

  17. Total System Performance Assessment-License Application Methods and Approach

    Energy Technology Data Exchange (ETDEWEB)

    J. McNeish

    2002-09-13

    "Total System Performance Assessment-License Application (TSPA-LA) Methods and Approach" provides the top-level method and approach for conducting the TSPA-LA model development and analyses. The method and approach are responsive to the criteria set forth in the Total System Performance Assessment Integration (TSPAI) Key Technical Issue (KTI) agreements, the "Yucca Mountain Review Plan" (CNWRA 2002 [158449]), and 10 CFR Part 63. This introductory section provides an overview of the TSPA-LA, the projected TSPA-LA documentation structure, and the goals of the document. It also provides a brief discussion of the regulatory framework, the approach to risk management in the development and analysis of the model, and the overall organization of the document. The section closes with some important conventions that are utilized in this document.

  18. Application of transient analysis methodology to heat exchanger performance monitoring

    International Nuclear Information System (INIS)

    Rampall, I.; Soler, A.I.; Singh, K.P.; Scott, B.H.

    1994-01-01

    A transient testing technique is developed to evaluate the thermal performance of industrial scale heat exchangers. A Galerkin-based numerical method with a choice of spectral basis elements to account for spatial temperature variations in heat exchangers is developed to solve the transient heat exchanger model equations. Testing a heat exchanger in the transient state may be the only viable alternative where conventional steady state testing procedures are impossible or infeasible. For example, this methodology is particularly suited to the determination of fouling levels in component cooling water system heat exchangers in nuclear power plants. The heat load on these so-called component coolers under steady state conditions is too small to permit meaningful testing. An adequate heat load develops immediately after a reactor shutdown, when the exchanger inlet temperatures are highly time-dependent. The application of the analysis methodology is illustrated herein with reference to in-situ transient testing carried out at a nuclear power plant. The method, however, is applicable to any transient testing application.
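
    A minimal sketch of the transient-testing idea, using a lumped single-node stand-in for the Galerkin model described above (all physical constants and the synthetic "measurements" are assumptions): the overall conductance UA is fitted so the modelled transient outlet temperature matches data recorded while the inlet temperature varies after shutdown; a fouled exchanger would show a lower fitted UA than a clean one.

        import numpy as np
        from scipy.integrate import odeint
        from scipy.optimize import curve_fit

        mC = 5.0e5       # thermal capacitance of tube-side inventory, J/K (assumed)
        wcp = 2.0e4      # tube-side flow heat-capacity rate, W/K (assumed)
        T_shell = 30.0   # shell-side temperature, degC (assumed constant)

        def T_in(t):     # inlet temperature transient after a reactor shutdown
            return 90.0 - 40.0 * (1 - np.exp(-t / 600.0))

        def model(t, UA):
            def dTdt(T, t):
                # lumped energy balance: advection in/out plus duty to shell side
                return (wcp * (T_in(t) - T) - UA * (T - T_shell)) / mC
            return odeint(dTdt, 90.0, t).ravel()

        t = np.linspace(0.0, 3600.0, 60)
        T_meas = model(t, 1.5e4) + np.random.normal(0.0, 0.2, t.size)  # synthetic
        UA_fit, _ = curve_fit(model, t, T_meas, p0=[1.0e4])
        print(f"fitted UA = {UA_fit[0]:.3g} W/K")  # fouling shows up as a lower UA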

  19. Performance Evaluation Of Furrow Lengths And Field Application Techniques

    Directory of Open Access Journals (Sweden)

    Issaka

    2015-08-01

    Full Text Available The study evaluated the performance of furrow lengths and field application techniques. The experiment was conducted on a 2000 m2 field at the Bontanga irrigation scheme. A Randomized Complete Block Design (RCBD) was used with three replicates, Blocks A, B and C, of furrow lengths 100 m, 75 m and 50 m respectively. Each replicate had surge, cut-off, cut-back and bunds treatments. Water was introduced into the furrows, and the advance distances and times were measured. Results of the study showed that at Block A the surge technique recorded the highest advance rate of 1.26 m/min and an opportunity time of 11 min, whilst bunds recorded the lowest advance rate of 0.92 m/min. A significant difference (3.32; p < 0.05) occurred between treatment means of the field application techniques at Block A (100 m). A significant difference (2.71; p < 0.05) was also recorded between treatment means at Block B (75 m), but no significant difference (0.14; p > 0.05) was observed among the surge, cut-back and bunds techniques. There was a significant difference (2.60; p < 0.05) between treatment means, but no significant difference between the cut-back and bunds techniques, in Block C (50 m). Their performance was ranked in the order surge > cut-back > cut-off > bunds for furrow lengths 100 m, 75 m and 50 m respectively.

  20. The Relationship between Cost Leadership Strategy, Total Quality Management Applications and Financial Performance

    OpenAIRE

    Ali KURT; Cemal ZEHİR

    2016-01-01

    Firms need to implement competition strategies and total quality management applications to overcome the fierce competition they face. The purpose of this study is to show the relationship between cost leadership strategy, total quality management applications and firms’ financial performance through a literature review and empirical analysis. 449 questionnaires were administered to managers of 142 large firms. The data gathered were analysed with AMOS. As a result, the relationship between ...

  1. Performance of the Emotiv Epoc headset for P300-based applications.

    Science.gov (United States)

    Duvinage, Matthieu; Castermans, Thierry; Petieau, Mathieu; Hoellinger, Thomas; Cheron, Guy; Dutoit, Thierry

    2013-06-25

    For two decades, EEG-based Brain-Computer Interface (BCI) systems have been widely studied in research labs. Now, researchers want to consider out-of-the-lab applications and make this technology available to everybody. However, medical-grade EEG recording devices are still much too expensive for end-users, especially disabled people. Therefore, several low-cost alternatives have appeared on the market. The Emotiv Epoc headset is one of them. Although some previous work showed this device could suit the customer's needs in terms of performance, no quantitative classification-based assessments compared to a medical system are available. This paper aims at statistically comparing a medical-grade system, the ANT device, and the Emotiv Epoc headset by determining their respective performances in a P300 BCI using the same electrodes. On top of that, a review of previous Emotiv studies and a discussion on practical considerations regarding both systems are proposed. Nine healthy subjects participated in this experiment, during which the ANT and the Emotiv systems were used in two different conditions: sitting on a chair and walking on a treadmill at constant speed. The Emotiv headset performs significantly worse than the medical device; observed effect sizes vary from medium to large. The Emotiv headset has higher relative operational and maintenance costs than its medical-grade competitor. Although this low-cost headset is able to record EEG data in a satisfying manner, it should only be chosen for non-critical applications such as games, communication systems, etc. For rehabilitation or prosthesis control, this lack of reliability may lead to serious consequences. For research purposes, the medical system should be chosen except when many trials are available or when the signal-to-noise ratio is high. This also suggests that the design of a specific low-cost EEG recording system for critical applications and research is still required.

  2. Input/Output of ab-initio nuclear structure calculations for improved performance and portability

    International Nuclear Information System (INIS)

    Laghave, Nikhil

    2010-01-01

    Many modern scientific applications rely on highly computation-intensive calculations. However, most applications do not concentrate as much on the role that input/output operations can play in improved performance and portability. Parallelizing input/output operations on large files can significantly improve the performance of parallel applications where sequential I/O is a bottleneck. A proper choice of I/O library also offers scope for making input/output operations portable across different architectures. Thus, the use of parallel I/O libraries for organizing the I/O of large data files offers great scope for improving the performance and portability of applications. In particular, sequential I/O has been identified as a bottleneck for the highly scalable MFDn (Many Fermion Dynamics for nuclear structure) code performing ab-initio nuclear structure calculations. We develop interfaces and parallel I/O procedures to use a well-known parallel I/O library in MFDn. As a result, we gain efficient I/O of large datasets along with their portability and ease of use in the downstream processing. Even in situations where the amount of data to be written is not huge, proper use of input/output operations can boost the performance of scientific applications. Application checkpointing offers enormous performance improvement and flexibility by doing a negligible amount of I/O to disk. Checkpointing saves and resumes application state in such a manner that in most cases the application is unaware that there has been an interruption to its execution. This saves a large amount of previously completed work and allows application execution to continue. This small amount of I/O provides substantial time savings by offering restart/resume capability to applications. The need for checkpointing in the optimization code NEWUOA has been identified, and checkpoint/restart capability has been implemented in NEWUOA using simple file I/O.
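
    The checkpoint/restart pattern described here can be sketched with plain file I/O; this is a generic illustration, not the NEWUOA or MFDn implementation:

        import os, pickle

        CKPT = "state.ckpt"

        def long_computation(n_steps=1_000_000, every=10_000):
            # Resume from the last checkpoint if one exists.
            if os.path.exists(CKPT):
                with open(CKPT, "rb") as f:
                    step, acc = pickle.load(f)
            else:
                step, acc = 0, 0.0
            while step < n_steps:
                acc += step * 1e-6               # stand-in for the real work
                step += 1
                if step % every == 0:            # small, infrequent I/O to disk
                    with open(CKPT + ".tmp", "wb") as f:
                        pickle.dump((step, acc), f)
                    os.replace(CKPT + ".tmp", CKPT)  # atomic: no torn checkpoints
            return acc

        print(long_computation())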

  3. Performance analysis of InSb based QWFET for ultra high speed applications

    International Nuclear Information System (INIS)

    Subash, T. D.; Gnanasekaran, T.; Divya, C.

    2015-01-01

    An indium antimonide based QWFET (quantum well field effect transistor) with a gate length down to 50 nm has been designed and investigated for the first time for L-band radar applications at 230 GHz. The QWFETs are designed to meet the drive-current requirements of the high-performance node of the International Technology Roadmap for Semiconductors (ITRS) (Semiconductor Industry Association 2010). The performance of the device is investigated using the Synopsys TCAD software. InSb based QWFETs could be a promising device technology for very low power and ultra-high speed performance, with 5–10 times lower DC power dissipation. (semiconductor devices)

  4. Artificial intelligence tool development and applications to nuclear power

    International Nuclear Information System (INIS)

    Naser, J.A.

    1987-01-01

    Two parallel efforts are being performed at the Electric Power Research Institute (EPRI) to help the electric utility industry take advantage of expert system technology. The first effort is the development of expert system building tools, which are tailored to electric utility industry applications. The second effort is the development of expert system applications. These two efforts complement each other. The application development tests the tools and identifies additional tool capabilities that are required. The tool development helps define the applications that can be successfully developed. Artificial intelligence, as demonstrated by the developments described, is being established as a credible technological tool for the electric utility industry. The challenge in transferring artificial intelligence technology, and an understanding of its potential, to the electric utility industry is to gain an understanding of the problems that reduce power plant performance and to identify which of them can be successfully addressed using artificial intelligence.

  5. Performance Evaluation of an Enhanced Uplink 3.5G System for Mobile Healthcare Applications

    Directory of Open Access Journals (Sweden)

    Dimitris Komnakos

    2008-01-01

    Full Text Available The present paper studies the prospects and performance of a forthcoming high-speed third generation (3.5G) networking technology, called enhanced uplink, for delivering mobile health (m-health) applications. The performance of 3.5G networks is a critical factor for the successful development of m-health services perceived by end users. In this paper, we propose a methodology for performance assessment based on the joint uplink transmission of voice, real-time video, biological data (such as electrocardiogram, vital signals, and heart sounds), and healthcare records file transfer. Various scenarios were considered, covering real-time, nonreal-time, and emergency applications in random locations where no system other than 3.5G is available. The accomplishment of quality of service (QoS) was explored through a step-by-step improvement of the enhanced uplink system's parameters, configuring the network system for the best performance in the context of the desired m-health services.

  6. Performance Evaluation of an Enhanced Uplink 3.5G System for Mobile Healthcare Applications.

    Science.gov (United States)

    Komnakos, Dimitris; Vouyioukas, Demosthenes; Maglogiannis, Ilias; Constantinou, Philip

    2008-01-01

    The present paper studies the prospects and performance of a forthcoming high-speed third generation (3.5G) networking technology, called enhanced uplink, for delivering mobile health (m-health) applications. The performance of 3.5G networks is a critical factor for the successful development of m-health services perceived by end users. In this paper, we propose a methodology for performance assessment based on the joint uplink transmission of voice, real-time video, biological data (such as electrocardiogram, vital signals, and heart sounds), and healthcare records file transfer. Various scenarios were considered, covering real-time, nonreal-time, and emergency applications in random locations where no system other than 3.5G is available. The accomplishment of quality of service (QoS) was explored through a step-by-step improvement of the enhanced uplink system's parameters, configuring the network system for the best performance in the context of the desired m-health services.

  7. Application of reliability methods in Ontario Hydro

    International Nuclear Information System (INIS)

    Jeppesen, R.; Ravishankar, T.J.

    1985-01-01

    Ontario Hydro has established a reliability program in support of its substantial nuclear program. Application of the reliability program to achieve both production and safety goals is described. The value of such a reliability program is evident in the record of Ontario Hydro's operating nuclear stations. The factors which have contributed to the success of the reliability program are identified as line management's commitment to reliability; selective and judicious application of reliability methods; establishing performance goals and monitoring in-service performance; and the collection, distribution, review and utilization of performance information to facilitate cost-effective achievement of goals and improvements. (orig.)

  8. Topic 14+16: High-performance and scientific applications and extreme-scale computing (Introduction)

    KAUST Repository

    Downes, Turlough P.

    2013-01-01

    As our understanding of the world around us increases it becomes more challenging to make use of what we already know, and to increase our understanding still further. Computational modeling and simulation have become critical tools in addressing this challenge. The requirements of high-resolution, accurate modeling have outstripped the ability of desktop computers and even small clusters to provide the necessary compute power. Many applications in the scientific and engineering domains now need very large amounts of compute time, while other applications, particularly in the life sciences, frequently have large data I/O requirements. There is thus a growing need for a range of high performance applications which can utilize parallel compute systems effectively, which have efficient data handling strategies and which have the capacity to utilise current and future systems. The High Performance and Scientific Applications topic aims to highlight recent progress in the use of advanced computing and algorithms to address the varied, complex and increasing challenges of modern research throughout both the "hard" and "soft" sciences. This necessitates being able to use large numbers of compute nodes, many of which are equipped with accelerators, and to deal with difficult I/O requirements. © 2013 Springer-Verlag.

  9. Repository performance confirmation

    International Nuclear Information System (INIS)

    Hansen, Francis D.

    2011-01-01

    The Yucca Mountain license application identified a broad suite of monitoring activities. A revision of the plan was expected to winnow the number of activities down to a manageable size. As a result, an objective process for the next stage of performance confirmation planning was developed as an integral part of an overarching long-term testing and monitoring strategy. The Waste Isolation Pilot Plant compliance monitoring program at once reflects its importance to stakeholders while demonstrating adequate understanding of the relevant monitoring parameters. The compliance criteria were stated by regulation and are currently monitored as part of the regulatory rule for disposal. At the outset, the screening practice and parameter selection were not predicated on a direct or indirect correlation to system performance metrics, as was the case for Yucca Mountain. Later on, correlation to performance was established, and the Waste Isolation Pilot Plant continues to monitor the ten parameters originally identified in the compliance certification documentation. The monitoring program has proven effective both for its technical purposes and for public assurance. The experience with performance confirmation in the license application process for Yucca Mountain helped identify an objective, quantitative methodology for this purpose. Revision of the existing plan would be based on the findings of the total system performance assessment. Identification and prioritization of confirmation activities would then derive from performance metrics associated with the performance assessment. Given the understanding of repository performance confirmation, as reviewed in this paper, it is evident that the performance confirmation program for the Yucca Mountain project could be readily re-engaged if licensing activities resumed.

  10. High Performance Computing - Power Application Programming Interface Specification.

    Energy Technology Data Exchange (ETDEWEB)

    Laros, James H.,; Kelly, Suzanne M.; Pedretti, Kevin; Grant, Ryan; Olivier, Stephen Lecler; Levenhagen, Michael J.; DeBonis, David

    2014-08-01

    Measuring and controlling the power and energy consumption of high performance computing systems by various components in the software stack is an active research area [13, 3, 5, 10, 4, 21, 19, 16, 7, 17, 20, 18, 11, 1, 6, 14, 12]. Implementations in lower level software layers are beginning to emerge in some production systems, which is very welcome. To be most effective, a portable interface to measurement and control features would significantly facilitate participation by all levels of the software stack. We present a proposal for a standard power Application Programming Interface (API) that endeavors to cover the entire software space, from generic hardware interfaces to the input from the computer facility manager.

  11. MOGO: Model-Oriented Global Optimization of Petascale Applications

    Energy Technology Data Exchange (ETDEWEB)

    Malony, Allen D.; Shende, Sameer S.

    2012-09-14

    The MOGO project was initiated in 2008 under the DOE Program Announcement for Software Development Tools for Improved Ease-of-Use on Petascale Systems (LAB 08-19). The MOGO team consisted of Oak Ridge National Lab, Argonne National Lab, and the University of Oregon. The overall goal of MOGO was to attack petascale performance analysis by developing a general framework in which empirical performance data could be efficiently and accurately compared with performance expectations at various levels of abstraction. This information could then be used to automatically identify and remediate performance problems. MOGO was based on performance models derived from application knowledge, performance experiments, and symbolic analysis. MOGO was able to make a reasonable impact on existing DOE applications and systems. New tools and techniques were developed, which, in turn, were used on important DOE applications on DOE LCF systems to show significant performance improvements.

  12. A Performance/Cost Evaluation for a GPU-Based Drug Discovery Application on Volunteer Computing

    Directory of Open Access Journals (Sweden)

    Ginés D. Guerrero

    2014-01-01

    Full Text Available Bioinformatics is an interdisciplinary research field that develops tools for the analysis of large biological databases, and, thus, the use of high performance computing (HPC) platforms is mandatory for the generation of useful biological knowledge. The latest generation of graphics processing units (GPUs) has democratized the use of HPC as they push desktop computers to cluster-level performance. Many applications within this field have been developed to leverage these powerful and low-cost architectures. However, these applications still need to scale to larger GPU-based systems to enable remarkable advances in the fields of healthcare, drug discovery, genome research, etc. The inclusion of GPUs in HPC systems exacerbates power and temperature issues, increasing the total cost of ownership (TCO). This paper explores the benefits of volunteer computing to scale bioinformatics applications as an alternative to owning large GPU-based local infrastructures. We use as a benchmark a GPU-based drug discovery application called BINDSURF, whose computational requirements go beyond those of a single desktop machine. Volunteer computing is presented as a cheap and valid HPC system for those bioinformatics applications that need to process huge amounts of data and where the response time is not a critical factor.

  13. Structural identifiability analyses of candidate models for in vitro Pitavastatin hepatic uptake.

    Science.gov (United States)

    Grandjean, Thomas R B; Chappell, Michael J; Yates, James W T; Evans, Neil D

    2014-05-01

    In this paper a review of the application of four different techniques (a version of the similarity transformation approach for autonomous uncontrolled systems, a non-differential input/output observable normal form approach, the characteristic set differential algebra and a recent algebraic input/output relationship approach) to determine the structural identifiability of certain in vitro nonlinear pharmacokinetic models is provided. The Organic Anion Transporting Polypeptide (OATP) substrate, Pitavastatin, is used as a probe on freshly isolated animal and human hepatocytes. Candidate pharmacokinetic non-linear compartmental models have been derived to characterise the uptake process of Pitavastatin. As a prerequisite to parameter estimation, structural identifiability analyses are performed to establish that all unknown parameters can be identified from the experimental observations available. Copyright © 2013. Published by Elsevier Ireland Ltd.
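
    The flavour of a Taylor-series structural identifiability check can be shown on a toy two-compartment uptake model (a hypothetical stand-in, not the paper's Pitavastatin models): successive derivatives of the observed output at t = 0 are symbolic functions of the parameters, and the parameters are structurally identifiable if those equations have a unique solution.

        import sympy as sp

        t = sp.symbols("t")
        k_up, k_ef, x10 = sp.symbols("k_up k_ef x10")
        c1, c2 = sp.symbols("c1 c2")      # measured output derivatives at t = 0
        x1, x2 = sp.Function("x1")(t), sp.Function("x2")(t)

        # Toy uptake model: medium <-> cell, with the medium concentration observed.
        rhs = {x1: -k_up * x1 + k_ef * x2, x2: k_up * x1 - k_ef * x2}

        def ddt(expr):
            # Time derivative with the model equations substituted in.
            e = sp.Derivative(expr, t).doit()
            return sp.expand(e.subs({sp.Derivative(x1, t): rhs[x1],
                                     sp.Derivative(x2, t): rhs[x2]}))

        y1, y2 = ddt(x1), ddt(ddt(x1))    # y'(t), y''(t) for the output y = x1
        ic = {x1: x10, x2: 0}             # known initial conditions
        sol = sp.solve([sp.Eq(c1, y1.subs(ic)), sp.Eq(c2, y2.subs(ic))],
                       [k_up, k_ef], dict=True)
        print(sol)  # a single solution branch => (k_up, k_ef) identifiable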

  14. Development of GEM detector for plasma diagnostics application: simulations addressing optimization of its performance

    Science.gov (United States)

    Chernyshova, M.; Malinowski, K.; Kowalska-Strzęciwilk, E.; Czarski, T.; Linczuk, P.; Wojeński, A.; Krawczyk, R. D.

    2017-12-01

    The advanced soft X-ray (SXR) diagnostics setup devoted to studies of SXR plasma emissivity is at the moment highly relevant and important for ITER/DEMO applications, especially in the energy range of tungsten emission lines, as plasma contamination by W and its transport in the plasma must be understood and monitored for W plasma-facing materials. The SXR radiation detection system under development by our group, based on a Gas Electron Multiplier with a spatially and energy-resolved photon-detecting chamber, may become such a diagnostic setup once many physical, technical and technological aspects are considered and solved. This work presents the results of simulations aimed at optimizing the design of the detector's internal chamber and its performance. The study of the effect of electrode alignment allowed choosing the gap distances that maximize electron transmission and the optimal magnitudes of the applied electric fields. Finally, the optimal readout structure design was identified, suitable for collecting the total formed charge effectively, based on the range of the simulated electron cloud at the readout plane, which was on the order of ~2 mm.

  15. Football refereeing: Identifying innovative methods

    Directory of Open Access Journals (Sweden)

    Reza MohammadKazemi

    2014-08-01

    Full Text Available The aim of the present study is to identify potential innovations in the football industry. Data were collected from 10 national and international referees, assistant referees and referees’ supervisors in Iran. In this study, technological innovations are identified that support better refereeing performance. The analysis revealed a significant relationship between the use of new technologies and referees’ performance. The results indicate that elite referees, assistant referees and supervisors agreed to use new technological innovations during the game. According to their comments, this kind of technology supports the development of referees’ performance.

  16. Identify and Quantify the Mechanistic Sources of Sensor Performance Variation Between Individual Sensors SN1 and SN2

    Energy Technology Data Exchange (ETDEWEB)

    Diaz, Aaron A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Baldwin, David L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Cinson, Anthony D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Jones, Anthony M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Larche, Michael R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Mathews, Royce [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Mullen, Crystal A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Pardini, Allan F. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Posakony, Gerald J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Prowant, Matthew S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hartman, Trenton S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Edwards, Matthew K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2014-08-06

    This Technical Letter Report satisfies the M3AR-14PN2301022 milestone, and is focused on identifying and quantifying the mechanistic sources of sensor performance variation between individual 22-element, linear phased-array sensor prototypes, SN1 and SN2. This effort constitutes an iterative evolution that supports the longer term goal of producing and demonstrating a pre-manufacturing prototype ultrasonic probe that possesses the fundamental performance characteristics necessary to enable the development of a high-temperature sodium-cooled fast reactor inspection system. The scope of the work for this portion of the PNNL effort conducted in FY14 includes performing a comparative evaluation and assessment of the performance characteristics of the SN1 and SN2 22-element PA-UT probes manufactured at PNNL. Key transducer performance parameters, such as sound field dimensions, resolution capabilities, frequency response, and bandwidth, are used as metrics for the comparative evaluation and assessment of the SN1 and SN2 engineering test units.

  17. Development of comprehensive material performance database for nuclear applications

    International Nuclear Information System (INIS)

    Tsuji, Hirokazu; Yokoyama, Norio; Tsukada, Takashi; Nakajima, Hajime

    1993-01-01

    This paper introduces the present status of the comprehensive material performance database for nuclear applications, named the JAERI Material Performance Database (JMPD), and gives examples of its utilization. The JMPD has been developed since 1986 in JAERI with a view to utilizing various kinds of characteristic data of nuclear materials efficiently. The relational database management system PLANNER was employed, and supporting systems for data retrieval and output were expanded. In order to improve the user-friendliness of the retrieval system, menu-selection-type procedures have been developed in which knowledge of the system or the data structures is not required of end-users. As to utilization of the JMPD, two types of data analyses are described as follows: (1) A series of statistical analyses was performed in order to estimate the design values of both the yield strength (Sy) and the tensile strength (Su) for aluminum alloys which are widely used as structural materials for research reactors. (2) Statistical analyses were accomplished using the cyclic crack growth rate data for nuclear pressure vessel steels, and comparisons were made of the variability and/or reproducibility of the data obtained by ΔK-increasing and ΔK-constant type tests. (author)

  18. Control-based method to identify underlying delays of a nonlinear dynamical system.

    Science.gov (United States)

    Yu, Dongchuan; Frasca, Mattia; Liu, Fang

    2008-10-01

    We suggest several stationary-state control-based delay identification methods which do not require any structural information about the controlled systems and are applicable to systems described by delayed ordinary differential equations. The proposed technique includes three steps: (i) driving the system to a steady state; (ii) perturbing the control signal to shift the steady state; and (iii) identifying all delays by detecting the time at which the system is abruptly drawn out of stationarity. Some aspects especially important for applications are discussed as well, including interaction delay identification, stationary-state convergence speed, performance comparison, and the influence of noise on delay identification. Several examples are presented to illustrate the reliability and robustness of all the delay identification methods suggested.
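
    A minimal numerical illustration of steps (ii) and (iii) for a single-delay toy system (the model and tolerances are assumptions): starting from a steady state, the control is stepped, and the delay is read off as the time that elapses before the state leaves stationarity.

        import numpy as np

        tau_true, dt = 0.7, 0.001          # unknown delay to recover; time step
        T, t_step = 12.0, 8.0              # total time; instant the control is stepped
        n, lag, k0 = int(T/dt), int(tau_true/dt), int(t_step/dt)

        u = np.ones(n)                     # control that holds the steady state x = 1
        u[k0:] += 0.5                      # step (ii): perturb the control signal

        x = np.ones(n)                     # system starts at its steady state
        for k in range(1, n):
            u_del = u[k - lag] if k >= lag else 1.0
            x[k] = x[k-1] + dt * (-x[k-1] + u_del)   # dx/dt = -x(t) + u(t - tau)

        # step (iii): delay = time until the state is drawn out of stationarity
        departed = k0 + np.argmax(np.abs(x[k0:] - 1.0) > 1e-6)
        print(f"estimated delay: {departed*dt - t_step:.3f} s (true {tau_true})")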

  19. Pay for Performance Proposals in Race to the Top Round II Applications. Briefing Memo

    Science.gov (United States)

    Rose, Stephanie

    2010-01-01

    The Education Commission of the States reviewed all 36 Race to the Top (RttT) round II applications. Each of the 36 states that applied for round II funding referenced pay for performance under the heading of "Improving teacher and principal effectiveness based on performance." The majority of states outlined pay for performance…

  20. Application Of Quality Function Deployment (QFD) To Measure Performance

    International Nuclear Information System (INIS)

    Fazila Said; Mohd Amirul Shafiq Shafiee; Nurul Hasanah Mohd Abd Basir

    2014-01-01

    This study aims to measure service quality performance and identify critical service quality characteristics as perceived by customers. Integrated survey results from seven service centres in Nuclear Malaysia certified with the Quality Management System (QMS) are analysed. This is followed by constructing a House of Quality (HoQ) and identifying other parameters for the Quality Function Deployment (QFD) matrix. The HoQ is a simple and attractive service innovation tool which directly presents comprehensive information comprising the voice of the customer (VOC), the technical response, the technical correlations and the relationship matrix. This study reveals that the information from the HoQ, together with further discussion of the planning part, can assist management in understanding the overall achievement of a service centre and in identifying solutions for unsatisfied customers through prioritised improvement activities that enhance future customer satisfaction. (author)

  1. A New Unified Intrusion Anomaly Detection in Identifying Unseen Web Attacks

    Directory of Open Access Journals (Sweden)

    Muhammad Hilmi Kamarudin

    2017-01-01

    Full Text Available The global usage of more sophisticated web-based application systems is growing very rapidly. Major usage includes the storing and transporting of sensitive data over the Internet. This growth has consequently opened up a serious need for more secure network and application protection devices. Security experts normally equip their databases with a large number of signatures to help in the detection of known web-based threats. In reality, it is almost impossible to keep updating the database with newly identified web vulnerabilities; as such, new attacks are invisible. This research presents a novel approach of Intrusion Detection System (IDS) in detecting unknown attacks on web servers using the Unified Intrusion Anomaly Detection (UIAD) approach. The unified approach consists of three components (preprocessing, statistical analysis, and classification). Initially, the process starts with the removal of irrelevant and redundant features using a novel hybrid feature selection method. Thereafter, the process continues with the application of a statistical approach to identifying traffic abnormality. We performed Relative Percentage Ratio (RPR) coupled with Euclidean Distance Analysis (EDA) and the Chebyshev Inequality Theorem (CIT) to calculate the normality score and generate an optimal threshold. Finally, Logitboost (LB) is employed alongside Random Forest (RF) as a weak classifier, with the aim of minimising the final false alarm rate. The experiment demonstrated that our approach successfully identified unknown attacks with a detection rate greater than 95% and a false alarm rate below 1% for both the DARPA 1999 and ISCX 2012 datasets.
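
    The statistical step can be illustrated with a Chebyshev-style threshold (a generic sketch, not the exact UIAD scoring): Chebyshev's inequality bounds P(|X − μ| ≥ kσ) ≤ 1/k² for any distribution, so a k·σ band gives a distribution-free false-alarm bound, e.g. at most 1% for k = 10.

        import numpy as np

        rng = np.random.default_rng(0)
        train = rng.normal(120.0, 10.0, 5000)  # synthetic "normal" traffic feature

        mu, sigma = train.mean(), train.std()
        k = 10.0                               # Chebyshev: P(|X-mu| >= k*sigma) <= 1/k**2
        lo, hi = mu - k * sigma, mu + k * sigma

        def is_anomalous(x):
            # Outside the distribution-free band => flagged as abnormal traffic.
            return x < lo or x > hi

        for x in (118.0, 260.0):
            print(x, "->", "anomalous" if is_anomalous(x) else "normal")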

  2. Performance and application of a fourfold monopole mass spectrometer

    International Nuclear Information System (INIS)

    Richards, J.A.; Huey, R.M.

    1978-01-01

    Some preliminary tests with an experimental fourfold monopole mass spectrometer are described, illustrating that the device performs acceptably (at the low resolutions used) despite the fact that the field-forming surfaces of the driven electrodes are only one quadrant of a cylinder. Coupling between adjacent channels is shown not to be a problem, so that applications requiring simultaneous measurements using two or more of the monopole channels can be entertained. Owing to its parallel structure, the instrument is suggested as being suited particularly to isotope ratio measurements with precisions which could be significantly better than would be possible with a quadrupole device. (Auth.)

  3. Performance measures in the earth observations commercialization applications program

    Science.gov (United States)

    Macauley, Molly K.

    1996-03-01

    Performance measures in the Earth Observations Commercialization Application Program (EOCAP) are key to its success and include net profitability; enhancements to industry productivity through generic innovations in industry practices, standards, and protocols; and documented contributions to public policy governing the newly developing remote sensing industry. Because EOCAP requires company co-funding, both parties to the agreement (the government and the corporate partner) have incentives to pursue these goals. Further strengthening progress towards these goals are requirements for business plans in the company's EOCAP proposal, detailed scrutiny given these plans during proposal selection, and regularly documented progress reports during project implementation.

  4. The minireactor Mirene for neutron-radiography: performances and applications

    International Nuclear Information System (INIS)

    Houelle, M.; Gerberon, J.M.

    1981-05-01

    The MIRENE neutron radiography mini-reactor is described. The core contains only one kilogram of enriched uranium in solution form. It works by pulsed operation. The neutron bursts produced are collimated into two beams which pass through the concrete protection around the reactor block. The performance of the reactor and the results achieved since it went into service in 1977 are described. These concern various fields. In the nuclear field: examination of fast neutron reactor fissile pins, monitoring of neutron-absorbing screens employed to guarantee the safety-criticality of the transport and storage of the nuclear fuel cycle, observation of irradiated oxide fuel pellets in order to determine the fuel state equation of the fast neutron system, and examination of UO2 and water mixtures for criticality experiments. In the industrial field, MIRENE has a vast field of application. Two examples are given: monitoring of electric insulation sealing, and visualization of the bonding of two high density metal parts. Finally, an original application in agronomy has given very good results: the on-site follow-up of the root growth of maize plants [fr

  5. Performance of a novel micro force vector sensor and outlook into its biomedical applications

    Science.gov (United States)

    Meiss, Thorsten; Rossner, Tim; Minamisava Faria, Carlos; Völlmeke, Stefan; Opitz, Thomas; Werthschützky, Roland

    2011-05-01

    For the HapCath system, which provides haptic feedback of the forces acting on a guide wire's tip during vascular catheterization, very small piezoresistive force sensors of 200 × 200 × 640 μm3 have been developed. This paper focuses on the characterization of the measurement performance and on possible new applications. Besides the determination of the dynamic measurement performance, special focus is put on the results of the 3-component force vector calibration. This article presents the sensor's particularly advantageous characteristics, but also addresses the limits of its applicability. Building on these characteristics, the second part of the article demonstrates new applications opened up by the novel force sensor, such as automatic navigation of medical or biological instruments without impacting surrounding tissue, surface roughness evaluation in biomedical systems, needle insertion with tactile or higher level feedback, or even building tactile hairs for artificial organisms.

  6. Performance of large-scale scientific applications on the IBM ASCI Blue-Pacific system

    International Nuclear Information System (INIS)

    Mirin, A.

    1998-01-01

    The IBM ASCI Blue-Pacific System is a scalable, distributed/shared memory architecture designed to reach multi-teraflop performance. The IBM SP pieces together a large number of nodes, each having a modest number of processors. The system is designed to accommodate a mixed programming model as well as a pure message-passing paradigm. We examine a number of applications on this architecture and evaluate their performance and scalability

  7. Performance Evaluation of RIPng, EIGRPv6 and OSPFv3 for Real Time Applications

    Directory of Open Access Journals (Sweden)

    Sama Salam Samaan

    2018-01-01

    Full Text Available In this modern Internet era and the transition to IPv6, routing protocols must adjust to assist this transformation. RIPng, EIGRPv6 and OSPFv3 are the dominant IPv6 IGRP (Interior Gateway Routing Protocols). Selecting the best routing protocol among those available is a critical task, which depends upon the network requirements and the performance parameters of different real-time applications. The primary motivation of this paper is to estimate the performance of these protocols in real-time applications. The evaluation is based on a number of criteria, including: network convergence duration, HTTP page response time, DB query response time, IPv6 traffic dropped, video packet delay variation and video packet end-to-end delay. After examining the simulation results, a conclusion is drawn to reveal which protocol performs best when implemented within an IPv6 WAN. The OPNET Modeler simulator is used to evaluate the performance of these protocols. To obtain the results, three scenarios are designed, one for each protocol.

  8. Assessment of applicability of portable HPGe detector with in situ object counting system based on performance evaluation of thyroid radiobioassays

    Energy Technology Data Exchange (ETDEWEB)

    Park, Min Seok; Kwon, Tae Eun; Pak, Min Jung; Park, Se Young; Ha, Wi Ho; Jin, Young Woo [National Radiation Emergency Medical Center, Korea Institute of Radiological and Medical Sciences, Seoul (Korea, Republic of)

    2017-06-15

    Different cases exist in the measurement of thyroid radiobioassays owing to the individual characteristics of the subjects, especially the potential variation in the counting efficiency. An In Situ Object Counting System (ISOCS) was developed to perform efficiency calibration based on Monte Carlo calculation, as an alternative to conventional calibration methods. The purpose of this study is to evaluate the applicability of ISOCS to thyroid radiobioassays by comparison with a conventional thyroid monitoring system. The efficiency calibration of a portable high-purity germanium (HPGe) detector was performed using the ISOCS software. In contrast, the conventional efficiency calibration, which requires a radioactive source, was applied to a scintillator-based thyroid monitor. Four radioiodine samples that contained 125I and 131I in both aqueous solution and gel forms were measured to evaluate radioactivity in the thyroid. The ANSI/HPS N13.30 performance criteria, which include the relative bias, relative precision, and root-mean-squared error, were applied to evaluate the performance of the measurement system. The portable HPGe detector could measure both radioiodines with ISOCS, but the thyroid monitor could not measure 125I because of the limited energy resolution of the NaI(Tl) scintillator. The 131I results from both detectors agreed to within 5% with the certified results. Moreover, the 125I results from the portable HPGe detector agreed to within 10% with the certified results. All measurement results complied with the ANSI/HPS N13.30 performance criteria. The results of the intercomparison program indicated the feasibility of applying the ISOCS software to direct thyroid radiobioassays. The portable HPGe detector with the ISOCS software can provide the convenience of efficiency calibration and higher energy resolution for identifying photopeaks, compared with a conventional thyroid monitor with a NaI(Tl) scintillator. The application of ISOCS software in a radiation

  9. 40 CFR 59.626 - What emission testing must I perform for my application for a certificate of conformity?

    Science.gov (United States)

    2010-07-01

    40 CFR Part 59, Protection of Environment, Section 59.626. This section describes the emission testing you must perform to...

  10. Performance of magneto-optical glass in optical current transducer application

    International Nuclear Information System (INIS)

    Shen, Yan; Lu, Yunhe; Liu, Zhao; Yu, Xueliang; Zhang, Guoqing; Yu, Wenbin

    2015-01-01

    First, a theoretical analysis was performed on the effect of temperature on the performance of the sensing element, a paramagnetic rare earth-doped magneto-optical glass that can be used in optical current transducer applications. The effect comprises two aspects: the linear birefringence and the Verdet constant. On this basis, the temperature characteristics of rare earth-doped glass were studied, and the experimental results indicated that the linear birefringence of rare earth-doped glass increased with increasing temperature, while its magneto-optical sensitivity decreased. Comparative experiments performed for various concentrations of rare earth dopant in the glass revealed that changes in the dopant concentration had no significant effect on the performance of the magneto-optical glass. Finally, a comparison between rare earth-doped magneto-optical glass and diamagnetic dense flint glass showed that the sensitivity of the former was six times that of the latter, although the temperature stability of the former was poorer. - Highlights: • Theoretical analysis on the effects of temperature on RE glass. • Rare earth doping leads to higher magneto-optical sensitivity. • The sensitivity of the RE glass is six times that of the dense flint glass
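
    For a paramagnetic rare-earth-doped glass, the Verdet constant is commonly modelled as inversely proportional to the absolute temperature (Curie-law behaviour), which is consistent with the sensitivity decrease reported above; a toy sketch with assumed constants:

        # Curie-law sketch: paramagnetic Verdet constant V(T) ~ C/T, rotation = V*B*L.
        C = 1.2e4          # assumed material constant, rad*K/(T*m) (illustrative only)
        B, L = 0.05, 0.02  # assumed flux density (T) and optical path length (m)

        for T in (253.0, 293.0, 333.0):      # -20 degC, +20 degC, +60 degC
            V = C / T                        # Verdet constant falls as T rises
            print(f"T = {T:.0f} K: V = {V:.1f} rad/(T*m), "
                  f"rotation = {V * B * L * 1e3:.2f} mrad")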

  11. The electronic residency application service application can predict accreditation council for graduate medical education competency-based surgical resident performance.

    Science.gov (United States)

    Tolan, Amy M; Kaji, Amy H; Quach, Chi; Hines, O Joe; de Virgilio, Christian

    2010-01-01

    Program directors often struggle to determine which factors in the Electronic Residency Application Service (ERAS) application are important in the residency selection process. With the establishment of the Accreditation Council for Graduate Medical Education (ACGME) competencies, it would be important to know whether information available in the ERAS application can predict subsequent competency-based performance of general surgery residents. This study is a retrospective correlation of data points found in the ERAS application with core competency-based clinical rotation evaluations. ACGME competency-based evaluations as well as technical skills assessments from all rotations during residency were collected. The overall competency score was defined as an average of all 6 competencies and technical skills. A total of 77 residents from two (one university and one community-based university-affiliated) general surgery residency programs were included in the analysis. Receiving honors for many of the third-year clerkships and AOA membership were associated with a number of the individual competencies. USMLE scores were predictive only of Medical Knowledge (p = 0.004). Factors associated with higher overall competency were female gender (p = 0.02), AOA (p = 0.06), overall number of honors received (p = 0.04), and honors in Ob/Gyn (p = 0.03) and Pediatrics (p = 0.05). Multivariable analysis showed honors in Ob/Gyn, female gender, older age, and total number of honors to be predictive of a number of individual core competencies. USMLE scores were only predictive of Medical Knowledge. The ERAS application is useful for predicting subsequent competency-based performance in surgical residents. Receiving honors in the surgery clerkship, which has traditionally carried weight when evaluating a potential surgery resident, may not be as strong a predictor of future success. Copyright © 2010 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights

  12. Impact of power limitations on the performance of WLANs for home networking applications

    OpenAIRE

    Armour, SMD; Lee, BS; Doufexi, A; Nix, AR; Bull, DR

    2001-01-01

    This paper considers the application of 5 GHz wireless LAN technology to home networking applications. An assessment of physical layer performance is presented in the form of the achievable data rate as a function of received signal-to-noise ratio. The transmit power limitations imposed by the relevant regulatory bodies are also summarised. Based on this information, a state-of-the-art propagation modelling tool is used to evaluate the coverage achieved by a WLAN system in an example resident...

  13. The application of advanced rotor (performance) methods for design calculations

    Energy Technology Data Exchange (ETDEWEB)

    Bussel, G.J.W. van [Delft Univ. of Technology, Inst. for Wind Energy, Delft (Netherlands)

    1997-08-01

    The calculation of loads and performance of wind turbine rotors has been a topic of research over the last century. The principles for the calculation of loads on rotor blades with a given specific geometry, as well as the development of optimally shaped rotor blades, were published in the decades in which significant aircraft development took place. Nowadays advanced computer codes are used for specific problems regarding modern aircraft, and application to wind turbine rotors has also been performed occasionally. The engineers designing rotor blades for wind turbines still use methods based upon global principles developed at the beginning of the century. The question of what type of methods to expect in a design environment in the near future is addressed here. (EG) 14 refs.

  14. FDTD simulations to assess the performance of CFMA-434 applicators for superficial hyperthermia.

    Science.gov (United States)

    Kok, H Petra; De Greef, Martijn; Correia, Davi; Vörding, Paul J Zum Vörde Sive; Van Stam, Gerard; Gelvich, Edward A; Bel, Arjan; Crezee, Johannes

    2009-01-01

    Contact flexible microstrip applicators (CFMA), operating at 434 MHz, are applied at the Academic Medical Center (AMC) for superficial hyperthermia (e.g. chest wall recurrences and melanoma). This paper investigates the performance of CFMA, evaluating the stability of the specific absorption rate (SAR) distribution, effective heating depth (EHD) and effective field size (EFS) under different conditions. Simulations were performed using finite differences and compared to existing measurement data, obtained with a rectangular phantom with a superficial fat-equivalent layer of 1 cm, filled with saline solution. The electrode plates of the applicators measure approximately 7 × 20, 29 × 21 and 20 × 29 cm2. Bolus thickness varied between 1 and 2 cm. The impact of possible air layers between the rubber frame and the electrodes on the SAR distribution was investigated. The EHD was approximately 1.4 cm and the EFS ranged between approximately 60 and approximately 300 cm2, depending on the applicator type. Both measurements and simulations showed a split-up of the SAR focus with a 2 cm water bolus. The extent and location of air layers have a strong influence on the shape and size of the iso-SAR contours above the 50% level, but the impact on EFS and EHD is limited. Simulations, confirmed by measurements, showed that the presence of air between the rubber and the electrodes changes the iso-SAR contours, but the impact on the EFS and EHD is limited.

  15. Performance assessment of advanced engineering workstations for fuel management applications

    International Nuclear Information System (INIS)

    Turinsky, P.J.

    1989-07-01

    The purpose of this project was to assess the performance of an advanced engineering workstation [AEW] with regard to applications to incore fuel management for LWRs. The attributes of most interest to us that define an AEW are parallel computational hardware and graphics capabilities. The AEWs employed were super microcomputers manufactured by MASSCOMP, Inc. These computers utilize a 32-bit architecture, graphics co-processor, multi-CPUs [up to six] attached to common memory and multi-vector accelerators. 7 refs., 33 figs., 4 tabs

  16. Comparing performance on the MNREAD iPad application with the MNREAD acuity chart.

    Science.gov (United States)

    Calabrèse, Aurélie; To, Long; He, Yingchen; Berkholtz, Elizabeth; Rafian, Paymon; Legge, Gordon E

    2018-01-01

    Our purpose was to compare reading performance measured with the MNREAD Acuity Chart and an iPad application (app) version of the same test for both normally sighted and low-vision participants. Our methods included 165 participants with normal vision and 43 participants with low vision tested on the standard printed MNREAD and on the iPad app version of the test. Maximum Reading Speed, Critical Print Size, Reading Acuity, and Reading Accessibility Index were compared using linear mixed-effects models to identify any potential differences in test performance between the printed chart and the iPad app. Our results showed the following: For normal vision, chart and iPad yield similar estimates of Critical Print Size and Reading Acuity. The iPad provides significantly slower estimates of Maximum Reading Speed than the chart, with a greater difference for faster readers. The difference was on average 3% at 100 words per minute (wpm), 6% at 150 wpm, 9% at 200 wpm, and 12% at 250 wpm. For low vision, Maximum Reading Speed, Reading Accessibility Index, and Critical Print Size are equivalent on the iPad and chart. Only the Reading Acuity is significantly smaller (i.e., better) when measured on the digital version of the test, but by only 0.03 logMAR (p = 0.013). Our conclusions were that, overall, MNREAD parameters measured with the printed chart and the iPad app are very similar. The difference found in Maximum Reading Speed for the normally sighted participants can be explained by differences in the method for timing the reading trials.

  17. Robust global identifiability theory using potentials--Application to compartmental models.

    Science.gov (United States)

    Wongvanich, N; Hann, C E; Sirisena, H R

    2015-04-01

    This paper presents a global practical identifiability theory for analyzing and identifying linear and nonlinear compartmental models. The compartmental system is prolonged onto the potential jet space to formulate a set of input-output equations that are integrals in terms of the measured data, which allows for robust identification of parameters without requiring any simulation of the model differential equations. Two classes of linear and nonlinear compartmental models are considered. The theory is first applied to analyze the linear nitrous oxide (N2O) uptake model. The fitting accuracy of the identified models from the differential jet space and potential jet space identifiability theories is compared with a realistic noise level of 3%, which is derived from sensor noise data in the literature. The potential jet space approach gave a match that was well within the coefficient of variation. The differential jet space formulation was unstable and not suitable for parameter identification. The proposed theory is then applied to a nonlinear immunological model for mastitis in cows. In addition, the model formulation is extended to include an iterative method which allows initial conditions to be accurately identified. With up to 10% noise, the potential jet space theory predicts the normalized population concentration infected with pathogens to within 9% of the true curve. Copyright © 2015 Elsevier Inc. All rights reserved.

  18. A Data Filter for Identifying Steady-State Operating Points in Engine Flight Data for Condition Monitoring Applications

    Science.gov (United States)

    Simon, Donald L.; Litt, Jonathan S.

    2010-01-01

    This paper presents an algorithm that automatically identifies and extracts steady-state engine operating points from engine flight data. It calculates the mean and standard deviation of select parameters contained in the incoming flight data stream. If the standard deviation of the data falls below defined constraints, the engine is assumed to be at a steady-state operating point, and the mean measurement data at that point are archived for subsequent condition monitoring purposes. The fundamental design of the steady-state data filter is completely generic and applicable for any dynamic system. Additional domain-specific logic constraints are applied to reduce data outliers and variance within the collected steady-state data. The filter is designed for on-line real-time processing of streaming data as opposed to post-processing of the data in batch mode. Results of applying the steady-state data filter to recorded helicopter engine flight data are shown, demonstrating its utility for engine condition monitoring applications.
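
    A minimal sketch of the filter's core test, assuming the streaming engine parameters have been collected into a pandas DataFrame; the window length, parameter names and standard-deviation limits below are illustrative placeholders, not the values used in the paper:

```python
# Minimal sketch of the steady-state detection logic described above.
# Thresholds, window length, and parameter names are assumptions.
import pandas as pd

WINDOW = 50                                        # samples per analysis window
STD_LIMITS = {"N1": 0.25, "N2": 0.25, "EGT": 2.0}  # per-parameter std-dev limits

def extract_steady_state_points(df: pd.DataFrame) -> pd.DataFrame:
    """Return the mean of each window whose parameters all satisfy the
    standard-deviation constraints, mimicking the filter's core test."""
    points = []
    for start in range(0, len(df) - WINDOW + 1, WINDOW):
        window = df.iloc[start:start + WINDOW]
        # Engine assumed steady when every monitored parameter is quiet.
        if all(window[p].std() < lim for p, lim in STD_LIMITS.items()):
            points.append(window.mean())  # archive mean data at this point
    return pd.DataFrame(points)
```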

  19. Staff Performance Analysis: A Method for Identifying Brigade Staff Tasks

    National Research Council Canada - National Science Library

    Ford, Laura

    1997-01-01

    ... members of conventional mounted brigade staff. Initial analysis of performance requirements in existing documentation revealed that the performance specifications were not sufficiently detailed for brigade battle staffs...

  20. Development of a software application to evaluate the performance and energy losses of grid-connected photovoltaic systems

    International Nuclear Information System (INIS)

    Trillo-Montero, D.; Santiago, I.; Luna-Rodriguez, J.J.; Real-Calvo, R.

    2014-01-01

    Highlights: • Software application to perform an automated analysis of grid-connected PV systems. • It integrates data from all devices registering data on typical PV installations. • Flexible enough to analyze installations with different configurations and components. • An analysis of two grid-connected PV systems located in Andalusia was performed. • Temperature losses in summer months vary between 15% and 25% of energy production. - Abstract: The aim of this paper was to design and develop a software application that enables users to perform an automated analysis of data from the monitoring of grid-connected photovoltaic (PV) systems. This application integrates data from all devices already in operation, such as environmental sensors, inverters and meters, which record information on typical PV installations. This required the development of a Relational Database Management System (RDBMS), consisting of a series of linked databases that enable all PV system information to be stored, and a software package, called S·lar, which automatically migrates all monitoring information to the database and determines standard magnitudes related to the performance and losses of PV installation components at different time scales. A visualization tool, both graphical and numerical, makes access to all of the information a simple task. Moreover, the application enables relationships between parameters and/or magnitudes to be easily established. Furthermore, it can perform a preliminary analysis of the influence of PV installations on the distribution grids into which the produced electricity is injected. The operation of the software application was demonstrated by analyzing two grid-connected PV installations located in Andalusia, Spain, using the data monitored therein. The monitoring took place from January 2011 to May 2012.
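
    The paper does not list the exact magnitudes S·lar computes beyond performances and losses; as a hedged example of one standard magnitude such tools typically derive, the sketch below computes the performance ratio of a PV plant from monitored energy and irradiation data:

```python
# Hedged sketch: the performance ratio (PR) of IEC 61724, a standard
# PV performance magnitude. Not the actual S·lar implementation.
def performance_ratio(e_ac_kwh, h_poa_kwh_m2, p_stc_kw, g_stc_kw_m2=1.0):
    """PR = final yield / reference yield.

    e_ac_kwh      -- AC energy delivered over the period (kWh)
    h_poa_kwh_m2  -- in-plane irradiation over the period (kWh/m^2)
    p_stc_kw      -- array nameplate power at STC (kW)
    g_stc_kw_m2   -- reference irradiance at STC (kW/m^2)
    """
    final_yield = e_ac_kwh / p_stc_kw              # kWh per kW installed
    reference_yield = h_poa_kwh_m2 / g_stc_kw_m2   # equivalent sun-hours
    return final_yield / reference_yield

# Example: 1,350 kWh from a 10 kW array receiving 160 kWh/m^2 -> PR ~ 0.84
print(performance_ratio(1350, 160, 10))
```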

  1. Identifying attributes of food literacy: a scoping review.

    Science.gov (United States)

    Azevedo Perry, Elsie; Thomas, Heather; Samra, H Ruby; Edmonstone, Shannon; Davidson, Lyndsay; Faulkner, Amy; Petermann, Lisa; Manafò, Elizabeth; Kirkpatrick, Sharon I

    2017-09-01

    An absence of food literacy measurement tools makes it challenging for nutrition practitioners to assess the impact of food literacy on healthy diets and to evaluate the outcomes of food literacy interventions. The objective of the present scoping review was to identify the attributes of food literacy. A scoping review of peer-reviewed and grey literature was conducted and attributes of food literacy identified. Subjects included in the search were high-risk groups. Eligible articles were limited to research from Canada, USA, the UK, Australia and New Zealand. The search identified nineteen peer-reviewed and thirty grey literature sources. Fifteen identified food literacy attributes were organized into five categories. Food and Nutrition Knowledge informs decisions about intake and distinguishing between 'healthy' and 'unhealthy' foods. Food Skills focuses on techniques of food purchasing, preparation, handling and storage. Self-Efficacy and Confidence represent one's capacity to perform successfully in specific situations. Ecologic refers to beyond self and the interaction of macro- and microsystems with food decisions and behaviours. Food Decisions reflects the application of knowledge, information and skills to make food choices. These interdependent attributes are depicted in a proposed conceptual model. The lack of evaluated tools inhibits the ability to assess and monitor food literacy; tailor, target and evaluate programmes; identify gaps in programming; engage in advocacy; and allocate resources. The present scoping review provides the foundation for the development of a food literacy measurement tool to address these gaps.

  2. Performance Analysis of a Bunch and Track Identifier Prototype (BTI) for the CMS Barrel Muon Drift Chambers

    International Nuclear Information System (INIS)

    Puerta Pelayo, J.

    2001-01-01

    This note contains a short description of the first step in the first-level trigger applied to the barrel muon drift chambers of CMS: the Bunch and Track Identifier (BTI). The test beam results obtained with a BTI prototype have also been analysed. BTI performance for different incidence angles and in the presence of an external magnetic field has been tested, as well as the BTI capability as a trigger device and track reconstructor. (Author) 30 refs

  3. Time distortion associated with smartphone addiction: Identifying smartphone addiction via a mobile application (App).

    Science.gov (United States)

    Lin, Yu-Hsuan; Lin, Yu-Cheng; Lee, Yang-Han; Lin, Po-Hsien; Lin, Sheng-Hsuan; Chang, Li-Ren; Tseng, Hsien-Wei; Yen, Liang-Yu; Yang, Cheryl C H; Kuo, Terry B J

    2015-06-01

    Global smartphone penetration has brought about unprecedented addictive behaviors. We report proposed diagnostic criteria and the design of a mobile application (App) to identify smartphone addiction. We used a novel empirical mode decomposition (EMD) to delineate the trend in smartphone use over one month. The daily use count and the trend of this frequency are associated with smartphone addiction. We quantify excessive use by daily use duration and frequency, as well as the relationship between the tolerance symptoms and the trend for the median duration of a use epoch. The psychiatrist-assisted self-reported use time was significantly lower than the total smartphone use time recorded via the App, and the degree of underestimation was positively correlated with actual smartphone use. Our study suggests the identification of smartphone addiction by diagnostic interview and via the App-generated parameters with EMD analysis. Copyright © 2015 Elsevier Ltd. All rights reserved.
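
    As a hedged sketch of the EMD step, the snippet below decomposes a month of daily use counts with the PyEMD package and takes the lowest-frequency component as the trend; the data are synthetic stand-ins, not the study's recordings:

```python
# Hedged sketch: extracting the monthly trend in daily smartphone-use
# counts via empirical mode decomposition, in the spirit of the study.
# Uses the PyEMD package ("EMD-signal"); the signal is synthetic.
import numpy as np
from PyEMD import EMD

daily_use_count = np.random.poisson(lam=60, size=30).astype(float)  # 30 days

emd = EMD()
imfs = emd(daily_use_count)  # intrinsic mode functions, fastest first
trend = imfs[-1]             # lowest-frequency component approximates the trend
print("Estimated monthly trend:", trend)
```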

  4. Developing a methodology for identifying correlations between LERF and early fatality

    International Nuclear Information System (INIS)

    Kang, Kyung Min; Jae, Moo Sung; Ahn, Kwang Il

    2009-01-01

    The correlations between Large Early Release Frequency (LERF) and Early Fatality need to be investigated for risk-informed application and regulation. RG-1.174 provides decision-making criteria using the measures of CDF and LERF, but there are no specific criteria on LERF. Since off-site consequence calculations involve both large uncertainties and high costs, a LERF assessment methodology needs to be developed and its correlation factor needs to be identified for risk-informed decision-making. In this regard, a robust method for estimating off-site consequences has been developed for assessing the health effects caused by radioisotopes released from severe accidents of nuclear power plants. The MACCS2 code was used to quantitatively validate the source term with respect to health effects, depending on the release characteristics of radioisotopes during severe accidents. This study developed a method for identifying correlations between LERF and Early Fatality and validates the results of the model using the MACCS2 code. The results of this study may contribute to defining LERF and finding a measure for risk-informed regulations and risk-informed decision making

  5. Experimental study on the performance of a liquid cooling garment with the application of MEPCMS

    International Nuclear Information System (INIS)

    Wang, Tao; Wang, Liang; Bai, Lizhan; Lin, Guiping; Bu, Xueqin; Liu, Xiangyang; Xie, Guanghui

    2015-01-01

    Highlights: • MEPCMS was applied in a liquid cooling garment for space applications. • Extensive experimental study on the performance of the LCG was conducted. • LCG was assessed by heat dissipation, temperature control and thermal comfort. • Proper match of relevant parameters was crucial in enhancing LCG performance. • 26% enhancement in heat dissipation was achieved by MEPCMS compared to water. - Abstract: As a novel working fluid, microencapsulated phase change material suspension (MEPCMS) exhibits obvious superiority in both heat transfer and temperature control compared with traditional ones. In this paper, extensive experimental study on the performance of a liquid cooling garment (LCG) with the application of this novel working fluid was conducted for future space applications. The main task for a LCG is to efficiently collect, transport and dissipate the metabolic heat produced from the human body. In the experiment, a thermal manikin was employed to simulate the human body, and the performance of the LCG with MEPCMS as the working fluid was evaluated by a variety of aspects such as heat dissipation, temperature control, pump power consumption and thermal comfort under both steady state and transient conditions. Experimental results show that the inlet temperature, mass flowrate and volume concentration of the MEPCMS are three key parameters affecting the performance of the LCG, which can be enhanced significantly by a proper combination of these parameters. Otherwise, the performance of the LCG will deteriorate or even be worse than that using water as the working fluid. When the inlet temperature, mass flowrate and volume concentration of the MEPCMS were selected as 11 °C, 200 g/min and 20% respectively, the heat dissipation of the LCG was enhanced by up to 26% with no obvious increase of the pump power compared with that using water as the working fluid, the temperature distribution in the human body became more uniform, and the capability

  6. The performance and application of laser-induced photoacoustic spectrometer

    International Nuclear Information System (INIS)

    Wang Bo; Chen Xi; Yao Jun

    2012-01-01

    The laser-induced photoacoustic spectrometer (LIPAS) is a key instrument that can be used in the investigation of radionuclide migration behavior, owing to its high sensitivity for the detection and identification of radionuclide speciation in aqueous solutions. The speciation of radionuclides, such as oxidation states and complexation, may be determined directly by using this non-contact and nondestructive analytical technique, and the sensitivity of LIPAS surpasses that of conventional absorption spectroscopy by one to two orders of magnitude. In the present work, a LIPAS system was established at the China Institute of Atomic Energy (CIAE), and the principle, performance and preliminary application of LIPAS are also presented. (authors)

  7. 2003 Wastewater Land Application Site Performance Reports for the Idaho National Engineering and Environmental Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Teresa R. Meachum

    2004-02-01

    The 2003 Wastewater Land Application Site Performance Reports for the Idaho National Engineering and Environmental Laboratory describe the conditions for the facilities with State of Idaho Wastewater Land Application Permits. Permit-required monitoring data are summarized, and permit exceedances or environmental impacts relating to the operations of the facilities during the 2003 permit year are discussed.

  8. IEA Wind Task 32: Wind lidar identifying and mitigating barriers to the adoption of wind lidar

    DEFF Research Database (Denmark)

    Clifton, Andrew; Clive, Peter; Gottschall, Julia

    2018-01-01

    IEA Wind Task 32 exists to identify and mitigate barriers to the adoption of lidar for wind energy applications. It leverages ongoing international research and development activities in academia and industry to investigate site assessment, power performance testing, controls and loads, and complex flows. Since its initiation in 2011, Task 32 has been responsible for several recommended practices and expert reports that have contributed to the adoption of ground-based, nacelle-based, and floating lidar by the wind industry. Future challenges include the development of lidar uncertainty models, best practices for data management, and developing community-based tools for data analysis, planning of lidar measurements and lidar configuration. This paper describes the barriers that Task 32 identified to the deployment of wind lidar in each of these application areas, and the steps that have been...

  9. Artificial Intelligence: An Analysis of Potential Applications to Training, Performance Measurement, and Job Performance Aiding. Interim Report for Period September 1982-July 1983.

    Science.gov (United States)

    Richardson, J. Jeffrey

    This paper is part of an Air Force planning effort to develop a research, development, and applications program for the use of artificial intelligence (AI) technology in three target areas: training, performance measurement, and job performance aiding. The paper is organized in five sections that (1) introduce the reader to AI and those subfields…

  10. A Secure Web Application Providing Public Access to High-Performance Data Intensive Scientific Resources - ScalaBLAST Web Application

    International Nuclear Information System (INIS)

    Curtis, Darren S.; Peterson, Elena S.; Oehmen, Chris S.

    2008-01-01

    This work presents the ScalaBLAST Web Application (SWA), a web-based application implemented using the PHP scripting language, MySQL DBMS, and Apache web server under a GNU/Linux platform. SWA is an application built as part of the Data Intensive Computer for Complex Biological Systems (DICCBS) project at the Pacific Northwest National Laboratory (PNNL). SWA delivers accelerated throughput of bioinformatics analysis via high-performance computing through a convenient, easy-to-use web interface. This approach greatly enhances emerging fields of study in biology, such as ontology-based homology and multiple whole-genome comparisons, which, in the absence of a tool like SWA, require a heroic effort to overcome the computational bottleneck associated with genome analysis. The current version of SWA includes a user account management system, a web-based user interface, and a backend process that generates the files necessary for the Internet scientific community to submit a ScalaBLAST parallel processing job on a dedicated cluster

  11. Core-Shell Columns in High-Performance Liquid Chromatography: Food Analysis Applications

    OpenAIRE

    Preti, Raffaella

    2016-01-01

    The increased separation efficiency provided by the new technology of columns packed with core-shell particles in high-performance liquid chromatography (HPLC) has resulted in their widespread diffusion in several analytical fields: pharmaceutical, biological, environmental, and toxicological. The present paper presents their most recent applications in food analysis. Their use has proved to be particularly advantageous for the determination of compounds at trace levels or when a large amount of samples must be analyzed fast using reliable and solvent-saving apparatus.

  12. Human performance modeling for system of systems analytics: combat performance-shaping factors.

    Energy Technology Data Exchange (ETDEWEB)

    Lawton, Craig R.; Miller, Dwight Peter

    2006-01-01

    The US military has identified Human Performance Modeling (HPM) as a significant requirement and challenge of future systems modeling and analysis initiatives. To support this goal, Sandia National Laboratories (SNL) has undertaken a program of HPM as an integral augmentation to its system-of-systems (SoS) analytics capabilities. The previous effort, reported in SAND2005-6569, evaluated the effects of soldier cognitive fatigue on SoS performance. The current effort began with a very broad survey of any performance-shaping factors (PSFs) that might also affect soldier performance in combat situations. The work included consideration of three different approaches to cognition modeling and how appropriate they would be for application to SoS analytics. The bulk of this report categorizes 47 PSFs into three groups (internal, external, and task-related) and provides brief descriptions of how each affects combat performance, according to the literature. The PSFs were then assembled into a matrix with 22 representative military tasks and assigned one of four levels of estimated negative impact on task performance, based on the literature. Blank versions of the matrix were then sent to two ex-military subject-matter experts to be filled out based on their personal experiences. Data analysis was performed to identify the most influential PSFs by consensus. Results indicate that combat-related injury, cognitive fatigue, inadequate training, physical fatigue, thirst, stress, poor perceptual processing, and presence of chemical agents are among the PSFs with the most negative impact on combat performance.

  13. Performance Engineering for a Medical Imaging Application on the Intel Xeon Phi Accelerator

    OpenAIRE

    Hofmann, Johannes; Treibig, Jan; Hager, Georg; Wellein, Gerhard

    2013-01-01

    We examine the Xeon Phi, which is based on Intel's Many Integrated Cores architecture, for its suitability to run the FDK algorithm--the most commonly used algorithm to perform the 3D image reconstruction in cone-beam computed tomography. We study the challenges of efficiently parallelizing the application and means to enable sensible data sharing between threads despite the lack of a shared last level cache. Apart from parallelization, SIMD vectorization is critical for good performance on t...

  14. The Java EE architect's handbook how to be a successful application architect for Java EE applications

    CERN Document Server

    Ashmore, Derek C.

    2014-01-01

    This handbook is a concise guide to assuming the role of application architect for Java EE applications. It guides the application architect through the entire Java EE project, including identifying business requirements, performing use-case analysis, object and data modeling, and guiding a development team during construction. It also provides tips and techniques for communicating with project managers and management, along with strategies for making your application easier and less costly to support. Whether you are about to architect your first Java EE application or are looking for ways to keep your projects on-time and on-budget, you will refer to this handbook again and again.

  15. The performance of blood pressure-to-height ratio as a screening measure for identifying children and adolescents with hypertension: a meta-analysis.

    Science.gov (United States)

    Ma, Chunming; Liu, Yue; Lu, Qiang; Lu, Na; Liu, Xiaoli; Tian, Yiming; Wang, Rui; Yin, Fuzai

    2016-02-01

    The blood pressure-to-height ratio (BPHR) has been shown to be an accurate index for screening hypertension in children and adolescents. The aim of the present study was to perform a meta-analysis to assess the performance of BPHR for the assessment of hypertension. Electronic and manual searches were performed to identify studies of the BPHR. After methodological quality assessment and data extraction, pooled estimates of the sensitivity, specificity, positive likelihood ratio, negative likelihood ratio, diagnostic odds ratio, area under the receiver operating characteristic curve and summary receiver operating characteristics were assessed systematically. The extent of heterogeneity was also assessed. Six studies were identified for analysis. The pooled sensitivity, specificity, positive likelihood ratio, negative likelihood ratio and diagnostic odds ratio values of BPHR, for assessment of hypertension, were 96% [95% confidence interval (CI)=0.95-0.97], 90% (95% CI=0.90-0.91), 10.68 (95% CI=8.03-14.21), 0.04 (95% CI=0.03-0.07) and 247.82 (95% CI=114.50-536.34), respectively. The area under the receiver operating characteristic curve was 0.9472. The BPHR had high diagnostic accuracy for identifying hypertension in children and adolescents.
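
    The reported indices relate to sensitivity and specificity as in the sketch below; note that the meta-analysis pooled each index separately, so its published values need not match these point computations exactly:

```python
# How the likelihood ratios and diagnostic odds ratio follow from
# sensitivity and specificity. The pooled meta-analytic estimates above
# were computed separately, so they differ slightly from these values.
sens, spec = 0.96, 0.90

lr_pos = sens / (1 - spec)   # positive likelihood ratio ~ 9.6
lr_neg = (1 - sens) / spec   # negative likelihood ratio ~ 0.044
dor = lr_pos / lr_neg        # diagnostic odds ratio ~ 216

print(f"LR+ = {lr_pos:.2f}, LR- = {lr_neg:.3f}, DOR = {dor:.0f}")
```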

  16. A bio-inspired methodology of identifying influential nodes in complex networks.

    Directory of Open Access Journals (Sweden)

    Cai Gao

    Full Text Available How to identify influential nodes is a key issue in complex networks. Degree centrality is simple, but incapable of reflecting the global characteristics of networks. Betweenness centrality and closeness centrality do not consider the location of nodes in the networks, and the semi-local centrality, LeaderRank and PageRank approaches can only be applied to unweighted networks. In this paper, a bio-inspired centrality measure model is proposed, which combines the Physarum centrality with the K-shell index obtained by K-shell decomposition analysis, to identify influential nodes in weighted networks. Then, we use the Susceptible-Infected (SI) model to evaluate the performance. Examples and applications are given to demonstrate the adaptivity and efficiency of the proposed method. In addition, the results are compared with existing methods.
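
    Two of the named ingredients are easy to sketch: the snippet below computes the K-shell index with networkx and scores a seed node with a single Susceptible-Infected run. The Physarum centrality component of the paper's model is omitted, and the graph and parameters are illustrative:

```python
# Sketch of the K-shell index and an SI-model influence score on a
# networkx graph. The Physarum component of the paper is not implemented.
import random
import networkx as nx

def k_shell_indices(G):
    """K-shell index per node via k-core decomposition."""
    return nx.core_number(G)

def si_spread(G, seed, beta=0.1, steps=20):
    """Single stochastic SI run: count infected nodes after `steps`."""
    infected = {seed}
    for _ in range(steps):
        newly = {v for u in infected for v in G.neighbors(u)
                 if v not in infected and random.random() < beta}
        infected |= newly
    return len(infected)

G = nx.karate_club_graph()
shells = k_shell_indices(G)
best = max(G.nodes, key=lambda n: si_spread(G, n))
print("K-shell of node 0:", shells[0], "| best seed in this SI run:", best)
```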

  17. Technology, Performance, and Market Report of Wind-Diesel Applications for Remote and Island Communities: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Baring-Gould, I.; Dabo, M.

    2009-02-01

    This paper describes the current status of wind-diesel technology and its applications, the current research activities, and the remaining system technical and commercial challenges. System architectures, dispatch strategies, and operating experience from a variety of wind-diesel systems will be discussed, as well as how recent development to explore distributed energy generation solutions for wind generation can benefit from the performance experience of operating systems. The paper also includes a detailed discussion of the performance of wind-diesel applications in Alaska, where 10 wind-diesel stations are operating and additional systems are currently being implemented. Additionally, because this application represents an international opportunity, a community of interest committed to sharing technical and operating developments is being formed. The authors hope to encourage this expansion while allowing communities and nations to investigate the wind-diesel option for reducing their dependence on diesel-driven energy sources.

  18. Technology, Performance, and Market Report of Wind-Diesel Applications for Remote and Island Communities: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Baring-Gould, I.; Dabo, M.

    2009-05-01

    This paper describes the current status of wind-diesel technology and its applications, the current research activities, and the remaining system technical and commercial challenges. System architectures, dispatch strategies, and operating experience from a variety of wind-diesel systems will be discussed, as well as how recent development to explore distributed energy generation solutions for wind generation can benefit from the performance experience of operating systems. The paper also includes a detailed discussion of the performance of wind-diesel applications in Alaska, where 10 wind-diesel stations are operating and additional systems are currently being implemented. Additionally, because this application represents an international opportunity, a community of interest committed to sharing technical and operating developments is being formed. The authors hope to encourage this expansion while allowing communities and nations to investigate the wind-diesel option for reducing their dependence on diesel-driven energy sources.

  19. Lifetime laser damage performance of β-Ga2O3 for high power applications

    Directory of Open Access Journals (Sweden)

    Jae-Hyuck Yoo

    2018-03-01

    Full Text Available Gallium oxide (Ga2O3) is an emerging wide bandgap semiconductor with potential applications in power electronics and high power optical systems where gallium nitride and silicon carbide have already demonstrated unique advantages compared to gallium arsenide and silicon-based devices. Establishing the stability and breakdown conditions of these next-generation materials is critical to assessing their potential performance in devices subjected to large electric fields. Here, using systematic laser damage performance tests, we establish that β-Ga2O3 has the highest lifetime optical damage performance of any conductive material measured to date, above 10 J/cm2 (1.4 GW/cm2). This has direct implications for its use as an active component in high power laser systems and may give insight into its utility for high-power switching applications. Both heteroepitaxial and bulk β-Ga2O3 samples were benchmarked against a heteroepitaxial gallium nitride sample, revealing an order of magnitude higher optical lifetime damage threshold for β-Ga2O3. Photoluminescence and Raman spectroscopy results suggest that the exceptional damage performance of β-Ga2O3 is due to lower absorptive defect concentrations and reduced epitaxial stress.

  20. Lifetime laser damage performance of β-Ga2O3 for high power applications

    Science.gov (United States)

    Yoo, Jae-Hyuck; Rafique, Subrina; Lange, Andrew; Zhao, Hongping; Elhadj, Selim

    2018-03-01

    Gallium oxide (Ga2O3) is an emerging wide bandgap semiconductor with potential applications in power electronics and high power optical systems where gallium nitride and silicon carbide have already demonstrated unique advantages compared to gallium arsenide and silicon-based devices. Establishing the stability and breakdown conditions of these next-generation materials is critical to assessing their potential performance in devices subjected to large electric fields. Here, using systematic laser damage performance tests, we establish that β-Ga2O3 has the highest lifetime optical damage performance of any conductive material measured to date, above 10 J/cm2 (1.4 GW/cm2). This has direct implications for its use as an active component in high power laser systems and may give insight into its utility for high-power switching applications. Both heteroepitaxial and bulk β-Ga2O3 samples were benchmarked against a heteroepitaxial gallium nitride sample, revealing an order of magnitude higher optical lifetime damage threshold for β-Ga2O3. Photoluminescence and Raman spectroscopy results suggest that the exceptional damage performance of β-Ga2O3 is due to lower absorptive defect concentrations and reduced epitaxial stress.

  1. Robust superhydrophobic bridged silsesquioxane aerogels with tunable performances and their applications.

    Science.gov (United States)

    Wang, Zhen; Wang, Dong; Qian, Zhenchao; Guo, Jing; Dong, Haixia; Zhao, Ning; Xu, Jian

    2015-01-28

    Aerogels are a family of highly porous materials whose applications are commonly restricted by poor mechanical properties. Herein, thiol-ene chemistry is employed to synthesize a series of novel bridged silsesquioxane (BSQ) precursors with various alkoxy groups. On the basis of the different hydrolyzing rates of the methoxy and ethoxy groups, robust superhydrophobic BSQ aerogels with tailorable morphology and mechanical performances have been prepared. The flexible thioether bridge contributes to the robustness of the as-formed aerogels, and the property can be tuned on the basis of the distinct combinations of alkoxy groups with the density of the aerogels almost unchanged. To the best of our knowledge, the lowest density among the ambient pressure dried aerogels is obtained. Further, potential application of the aerogels for oil/water separation and acoustic materials has also been presented.

  2. Economic performances optimization of the transcritical Rankine cycle systems in geothermal application

    International Nuclear Information System (INIS)

    Yang, Min-Hsiung; Yeh, Rong-Hua

    2015-01-01

    Highlights: • The optimal economic performances of the TRC system are investigated. • In economic evaluations, R125 performs the most satisfactorily, followed by R41 and CO2. • The TRC system with CO2 has the largest averaged temperature difference. • Economically optimized pressures are always lower than thermodynamically optimized operating pressures. - Abstract: The aim of this study is to investigate the economic optimization of a transcritical Rankine cycle (TRC) system for the application of geothermal energy. An economic parameter, the net power output index, which is the ratio of net power output to the total cost, is applied to optimize the TRC system using CO2, R41 and R125 as working fluids. The maximum net power output index and the corresponding optimal operating pressures are obtained and evaluated for the TRC system. Furthermore, analyses of the effects of the corresponding averaged temperature differences in the heat exchangers on the optimal economic performance of the TRC system are carried out. The effects of geothermal temperatures on the thermodynamic and economic optimizations are also revealed. In both optimal economic and thermodynamic evaluations, R125 performs the most satisfactorily, followed by R41 and CO2 in the TRC system. In addition, the TRC system operated with CO2 has the largest averaged temperature difference in the heat exchangers and thus has potential in future applications for lower-temperature heat resources. The highest working pressures obtained from economic optimization are always lower than those from thermodynamic optimization for CO2, R41, and R125 in the TRC system.

  3. Ultrasonically identified cap seal for LWR fuel bundles

    International Nuclear Information System (INIS)

    Buergers, W.; Dal Cero, J.; Crutzen, S.

    1981-01-01

    This paper aims to provide a general review of techniques available for surveillance and for sealing, marking or otherwise identifying material in such a way that its recognition and guarantee of integrity are unequivocally assured. The problem of obtaining such assurance has been the subject of work at Ispra and elsewhere. Some discussion of the problems, the work performed and possible solutions is given. In addition, techniques which, although not yet in routine use, would be suitable for such applications are described. Using industrial ultrasonic apparatus, the signal obtained by scanning the seals was very satisfactory, as was shown by the evaluation studies. The general method is based on reflection due to the great difference in acoustical impedance existing between the matrix and the inclusions

  4. Performance evaluation of near-real-time accounting systems

    International Nuclear Information System (INIS)

    Anon.

    1981-01-01

    Examples are given illustrating the application of near-real-time accounting concepts and principles to actual nuclear facilities. Experience with prototypical systems at the AGNS reprocessing plant and the Los Alamos plutonium facility is described using examples of actual data to illustrate the performance and effectiveness of near-real-time systems. The purpose of the session is to enable participants to: (1) identify the major components of near-real-time accounting systems; (2) describe qualitatively the advantages, limitations, and performance of such systems in real nuclear facilities; (3) identify process and facility design characteristics that affect the performance of near-real-time systems; and (4) describe qualitatively the steps necessary to implement a near-real-time accounting and control system in a nuclear facility

  5. Performance limitations of piezoelectric and force feedback electrostatic transducers in different applications

    International Nuclear Information System (INIS)

    Hadjiloucas, S; Walker, G C; Bowen, J W; Karatzas, L S

    2009-01-01

    Current limitations in piezoelectric and electrostatic transducers are discussed. A force-feedback electrostatic transducer capable of operating at bandwidths up to 20 kHz is described. Advantages of the proposed design are a linearised operation which simplifies the feedback control aspects and robustness of the performance characteristics to environmental perturbations. Applications in nanotechnology, optical sciences and acoustics are discussed.

  6. Performance limitations of piezoelectric and force feedback electrostatic transducers in different applications

    Energy Technology Data Exchange (ETDEWEB)

    Hadjiloucas, S; Walker, G C; Bowen, J W [Cybernetics, School of Systems Engineering, University of Reading, RG6 6AY (United Kingdom); Karatzas, L S, E-mail: s.hadjiloucas@reading.ac.u [Temasek Polytechnic, School of Engineering, 21 Tampines Avenue 1, Singapore, 529757 (Singapore)

    2009-07-01

    Current limitations in piezoelectric and electrostatic transducers are discussed. A force-feedback electrostatic transducer capable of operating at bandwidths up to 20 kHz is described. Advantages of the proposed design are a linearised operation which simplifies the feedback control aspects and robustness of the performance characteristics to environmental perturbations. Applications in nanotechnology, optical sciences and acoustics are discussed.

  7. Application of individually performed titanium mesh in infraorbital wall fracture reconstruction

    Directory of Open Access Journals (Sweden)

    Kai-Jian Sun

    2016-04-01

    Full Text Available AIM: To discuss the application effect of individually performed titanium mesh in infraorbital wall fracture reconstruction. METHODS: Sixty-seven patients (67 eyes) diagnosed with infraorbital fracture from January 2011 to February 2014 underwent reconstruction with individually performed titanium mesh. The recovery of the incision, visual acuity, eyeball mobility, diplopia and proptosis were monitored by postoperative follow-up lasting 1 year. RESULTS: No infection, titanium mesh transposition, prolapse, deformity, exclusion or ectropion occurred during the follow-up period. The difference in proptosis between the two eyes was less than 2 mm. Diplopia, present in 5 eyes, disappeared in 4 and improved in 1. The eyeball descent in 2 cases resolved. Visual acuity was unchanged compared with pre-operation. The rate of resolved diplopia at the primary position was 93%, with significant improvement in the other 3 patients. The rate of resolved diplopia in the peripheral visual field was 86%, with significant improvement in the other 2 patients. CONCLUSION: The reconstruction effect of individually performed titanium mesh in infraorbital wall fracture was satisfactory and safe.

  8. Application of Ionic Liquids in High Performance Reversed-Phase Chromatography

    Directory of Open Access Journals (Sweden)

    Wentao Bi

    2009-06-01

    Full Text Available Ionic liquids, considered “green” chemicals, are widely used in many areas of analytical chemistry due to their unique properties. Recently, ionic liquids have been used as a kind of novel additive in separations and have been combined with silica to synthesize new stationary phases as separation media. This review focuses on the properties and mechanisms of ionic liquids and their potential applications as mobile phase modifiers and surface-bonded stationary phases in reversed-phase high performance liquid chromatography (RP-HPLC). Ionic liquids demonstrate advantages and potential in the chromatographic field.

  9. Further applications for mosaic pixel FPA technology

    Science.gov (United States)

    Liddiard, Kevin C.

    2011-06-01

    In previous papers to this SPIE forum the development of novel technology for next generation PIR security sensors has been described. This technology combines the mosaic pixel FPA concept with low cost optics and purpose-designed readout electronics to provide a higher performance and affordable alternative to current PIR sensor technology, including an imaging capability. Progressive development has resulted in increased performance and transition from conventional microbolometer fabrication to manufacture on 8 or 12 inch CMOS/MEMS fabrication lines. A number of spin-off applications have been identified. In this paper two specific applications are highlighted: high performance imaging IRFPA design and forest fire detection. The former involves optional design for small pixel high performance imaging. The latter involves cheap expendable sensors which can detect approaching fire fronts and send alarms with positional data via mobile phone or satellite link. We also introduce to this SPIE forum the application of microbolometer IR sensor technology to IoT, the Internet of Things.

  10. The Epidemiology of Injuries Identified at the National Football League Scouting Combine and their Impact on Professional Sport Performance: 2203 athletes, 2009-2015

    Science.gov (United States)

    Price, Mark D.; Rossy, William H.; Sanchez, George; McHale, Kevin Jude; Logan, Catherine; Provencher, Matthew T.

    2017-01-01

    Objectives: At the annual National Football League (NFL) Scouting Combine, the medical staff of each NFL franchise performs a comprehensive medical evaluation of all athletes potentially entering the NFL. Currently, little is known regarding the overall epidemiology of injuries identified at the Combine and their impact on NFL performance. The purpose of this study is to determine the epidemiology of injuries identified at the Combine and their impact on future NFL performance. Methods: All previous musculoskeletal injuries identified at the NFL Combine (2009-2015) were retrospectively reviewed. Medical records and imaging reports were examined. Game statistics for the first two seasons of NFL play were obtained for all players from 2009 to 2013. Analysis of injury prevalence and overall impact on draft status and position-specific performance metrics of each injury was performed and compared versus a position-matched control group with no history of injury and surgery. Results: A total of 2,203 athletes over seven years were evaluated, including 1,490 (67.6%) drafted athletes and 1,040 (47.2%) who ultimately played at least two years in the NFL. The most common sites of injury were the ankle (1160, 52.7%), shoulder (1143, 51.9%), knee (1128, 51.2%), spine (785, 35.6%), and hand (739, 33.5%). Odds ratios (OR) demonstrated quarterbacks were most at risk of shoulder injury (OR 2.78, p=0.001) while running backs most commonly sustained ankle (OR 1.49, p=0.038) and shoulder injuries (OR 1.55, p=0.022). Ultimately, defensive players demonstrated a more negative impact than offensive players following injury, with multiple performance metrics impacted for each defensive position analyzed, whereas skilled offensive players (i.e. quarterbacks, running backs) demonstrated only one metric affected at each position. Conclusion: The most common sites of injury identified at the Combine were: (1) ankle, (2) shoulder, (3) knee, (4) spine, and (5) hand. Overall, performance

  11. Data mining in bone marrow transplant records to identify patients with high odds of survival.

    Science.gov (United States)

    Taati, Babak; Snoek, Jasper; Aleman, Dionne; Ghavamzadeh, Ardeshir

    2014-01-01

    Patients undergoing a bone marrow stem cell transplant (BMT) face various risk factors. Analyzing data from past transplants could enhance the understanding of the factors influencing success. Records of up to 120 measurements per transplant procedure from 1,751 patients undergoing BMT were collected (Shariati Hospital). Collaborative filtering techniques allowed the processing of these highly sparse records, with 22.3% missing values. Ten-fold cross-validation was used to evaluate the performance of various classification algorithms trained on predicting survival status. Modest accuracy levels were obtained in predicting survival status (AUC = 0.69). More importantly, however, the operations that had the highest chances of success were shown to be identifiable with high accuracy, e.g., 92% or 97% when identifying 74 or 31 recipients, respectively. Identifying the patients with the highest chances of survival has direct application in the prioritization of resources and in donor matching. For patients where high-confidence prediction is not achieved, assigning a probability to their survival odds has potential applications in probabilistic decision support systems and in combination with other sources of information.
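
    The "high-odds subset" idea can be sketched as follows: train a classifier, rank patients by predicted survival probability, and evaluate accuracy among the top-ranked only. In the sketch below the features, labels and mean imputation are placeholders; the study itself used collaborative filtering to handle the missing values:

```python
# Hedged sketch of ranking patients by predicted survival probability and
# scoring only the highest-confidence subset. Data are synthetic placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.impute import SimpleImputer
from sklearn.model_selection import cross_val_predict

X = np.random.rand(500, 120)                   # 120 measurements per patient
X[np.random.rand(*X.shape) < 0.223] = np.nan   # ~22.3% missing, as in the records
y = np.random.randint(0, 2, 500)               # placeholder survival labels

# Simple mean imputation stands in for the study's collaborative filtering.
X_imp = SimpleImputer(strategy="mean").fit_transform(X)
proba = cross_val_predict(
    RandomForestClassifier(n_estimators=200, random_state=0),
    X_imp, y, cv=10, method="predict_proba")[:, 1]

top = np.argsort(proba)[-31:]  # the 31 highest-confidence predictions
print("Accuracy on top-31 subset:", (y[top] == (proba[top] > 0.5)).mean())
```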

  12. GPU Performance and Power Consumption Analysis: A DCT based denoising application

    OpenAIRE

    Pi Puig, Martín; De Giusti, Laura Cristina; Naiouf, Marcelo; De Giusti, Armando Eduardo

    2017-01-01

    It is known that energy and power consumption are becoming serious metrics in the design of high performance workstations because of heat dissipation problems. In recent years, GPU accelerators have been integrated into many of these expensive systems, even though they embed more and more transistors on their chips, producing a rapid increase in power consumption requirements. This paper analyzes an image processing application, in particular a Discrete Cosine Transform denoising algorithm, i...

  13. Application of DNA forensic techniques for identifying poached guanacos (Lama guanicoe) in Chilean Patagonia*.

    Science.gov (United States)

    Marín, Juan C; Saucedo, Cristian E; Corti, Paulo; González, Benito A

    2009-09-01

    Guanaco (Lama guanicoe) is a protected and widely distributed ungulate in South America. A poacher, after killing guanacos in Valle Chacabuco, Chilean Patagonia, transported and stored the meat. Samples were retrieved by local police, but the suspect argued that the meat was from a horse. The mitochondrial cytochrome b gene (774 bp), 15 microsatellite loci, and the SRY gene were used to identify the species, the number of animals and their population of origin, and the sex of the animals, respectively. Analysis revealed that the samples came from a female (absence of the SRY gene) Patagonian guanaco (assignment probability between 0.0075 and 0.0282), clearly distinguishing it from sympatric ungulates (E-value = 0). Based on the evidence obtained in the field in addition to the forensic data, the suspect was convicted of poaching and illegally carrying firearms. This is the first report of molecular tools being used in forensic investigations of Chilean wildlife, indicating their promising future application in guanaco management and conservation.

  14. Integrated Approach Towards the Application of Horizontal Wells to Improve Waterflooding Performance

    Energy Technology Data Exchange (ETDEWEB)

    Kelkar, Mohan; Liner, Chris; Kerr, Dennis

    1999-10-15

    This final report describes the progress during the six years of the project on "Integrated Approach Towards the Application of Horizontal Wells to Improve Waterflooding Performance." The project is funded under the Department of Energy's (DOE's) Class I program, which is targeted towards improving the reservoir performance of mature oil fields located in fluvially-dominated deltaic deposits. The project involves using an integrated approach to characterize the reservoir, followed by drilling of horizontal injection wells to improve production performance. The project was divided into two budget periods. In the first budget period, many modern technologies were used to develop a detailed reservoir management plan; whereas, in the second budget period, conventional data were used to develop a reservoir management plan. The idea was to determine the cost effectiveness of various technologies in improving the performance of mature oil fields.

  15. Identifying Enterprise Leverage Points in Defense Acquisition Program Performance

    Science.gov (United States)

    2009-09-01

    ... differentiated. [108] Table 1: Table of Validation and Approval Authority. Beyond the major categories used for programs as noted above, there is also a ... impossible to identify which "uber-portfolio" a system should belong to, as many "portfolios" claim a system as an integral part of the larger portfolio ... to differentiate between programs. DOD 5002, Enclosure E states "A technology project or acquisition program shall be categorized based on its

  16. Exploring performance and energy tradeoffs for irregular applications: A case study on the Tilera many-core architecture

    Energy Technology Data Exchange (ETDEWEB)

    Panyala, Ajay; Chavarría-Miranda, Daniel; Manzano, Joseph B.; Tumeo, Antonino; Halappanavar, Mahantesh

    2017-06-01

    High performance, parallel applications with irregular data accesses are becoming a critical workload class for modern systems. In particular, the execution of such workloads on emerging many-core systems is expected to be a significant component of applications in data mining, machine learning, scientific computing and graph analytics. However, power and energy constraints limit the capabilities of individual cores, memory hierarchy and on-chip interconnect of such systems, thus leading to architectural and software trade-offs that must be understood in the context of the intended application’s behavior. Irregular applications are notoriously hard to optimize given their data-dependent access patterns, lack of structured locality and complex data structures and code patterns. We have ported two irregular applications, graph community detection using the Louvain method (Grappolo) and high-performance conjugate gradient (HPCCG), to the Tilera many-core system and have conducted a detailed study of platform-independent and platform-specific optimizations that improve their performance as well as reduce their overall energy consumption. To conduct this study, we employ an auto-tuning based approach that explores the optimization design space along three dimensions: memory layout schemes, GCC compiler flag choices and OpenMP loop scheduling options. We leverage MIT’s OpenTuner auto-tuning framework to explore and recommend energy optimal choices for different combinations of parameters. We then conduct an in-depth architectural characterization to understand the memory behavior of the selected workloads. Finally, we perform a correlation study to demonstrate the interplay between the hardware behavior and application characteristics. Using auto-tuning, we demonstrate whole-node energy savings and performance improvements of up to 49.6% and 60% relative to a baseline instantiation, and up to 31% and 45.4% relative to manually optimized variants.
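
    The three-dimensional search the authors drive with OpenTuner can be illustrated with a deliberately simplified exhaustive sweep; the flag values, layout macros, schedule names, and build/run commands below are illustrative assumptions, not the paper's actual harness:

```python
# Deliberately simplified stand-in for the OpenTuner-driven search:
# an exhaustive sweep over the three tuning dimensions named above.
# Benchmark source, macros, and commands are illustrative assumptions.
import itertools
import subprocess
import time

layouts = ["aos", "soa"]                       # memory layout schemes
cflags = ["-O2", "-O3", "-O3 -funroll-loops"]  # GCC flag choices
schedules = ["static", "dynamic", "guided"]    # OpenMP loop scheduling

best = None
for layout, flags, sched in itertools.product(layouts, cflags, schedules):
    subprocess.run(f"gcc {flags} -fopenmp -DLAYOUT_{layout.upper()} "
                   "-o bench bench.c", shell=True, check=True)
    start = time.perf_counter()
    # OMP_SCHEDULE takes effect for loops declared with schedule(runtime).
    subprocess.run("./bench", shell=True, check=True,
                   env={"OMP_SCHEDULE": sched, "PATH": "/usr/bin:/bin"})
    elapsed = time.perf_counter() - start
    if best is None or elapsed < best[0]:
        best = (elapsed, layout, flags, sched)

print("Best configuration:", best)
```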

  17. In search of novel, high performance and intelligent materials for applications in severe and unconditioned environments

    International Nuclear Information System (INIS)

    Gyeabour Ayensu, A. I.; Normeshie, C. M. K.

    2007-01-01

    For extreme operating conditions in aerospace, nuclear power plants and medical applications, novel materials have become more competitive than traditional materials because of their unique characteristics. Extensive research programmes are being undertaken to develop high performance and knowledge-intensive new materials, since existing materials cannot meet the stringent technological requirements of advanced materials for emerging industries. The technologies of intermetallic compounds, nanostructural materials, advanced composites, and photonics materials are presented. In addition, medical biomaterial implants of high functional performance, based on biocompatibility and resistance against corrosion and degradation, for applications in the hostile environment of the human body are discussed. The opportunities for African researchers to collaborate in international research programmes to develop local raw materials into high performance materials are also highlighted. (au)

  18. The Protein Identifier Cross-Referencing (PICR service: reconciling protein identifiers across multiple source databases

    Directory of Open Access Journals (Sweden)

    Leinonen Rasko

    2007-10-01

    Full Text Available Abstract Background Each major protein database uses its own conventions when assigning protein identifiers. Resolving the various, potentially unstable, identifiers that refer to identical proteins is a major challenge. This is a common problem when attempting to unify datasets that have been annotated with proteins from multiple data sources or querying data providers with one flavour of protein identifiers when the source database uses another. Partial solutions for protein identifier mapping exist but they are limited to specific species or techniques and to a very small number of databases. As a result, we have not found a solution that is generic enough and broad enough in mapping scope to suit our needs. Results We have created the Protein Identifier Cross-Reference (PICR) service, a web application that provides interactive and programmatic (SOAP and REST) access to a mapping algorithm that uses the UniProt Archive (UniParc) as a data warehouse to offer protein cross-references based on 100% sequence identity to proteins from over 70 distinct source databases loaded into UniParc. Mappings can be limited by source database, taxonomic ID and activity status in the source database. Users can copy/paste or upload files containing protein identifiers or sequences in FASTA format to obtain mappings using the interactive interface. Search results can be viewed in simple or detailed HTML tables or downloaded as comma-separated values (CSV) or Microsoft Excel (XLS) files suitable for use in a local database or a spreadsheet. Alternatively, a SOAP interface is available to integrate PICR functionality in other applications, as is a lightweight REST interface. Conclusion We offer a publicly available service that can interactively map protein identifiers and protein sequences to the majority of commonly used protein databases. Programmatic access is available through a standards-compliant SOAP interface or a lightweight REST interface. The PICR
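
    As a hedged illustration of the kind of programmatic REST access described, the sketch below issues a mapping query with Python's requests library; the endpoint URL and parameter names are placeholders, not the service's documented interface:

```python
# Hypothetical sketch of calling a REST mapping service like PICR.
# The URL and parameter names below are illustrative placeholders.
import requests

BASE = "https://example.org/picr/rest/getUPIForAccession"  # hypothetical URL

resp = requests.get(BASE, params={
    "accession": "P29375",                 # identifier to map (example)
    "database": ["SWISSPROT", "ENSEMBL"],  # restrict target source databases
    "taxid": "9606",                       # optional taxonomic filter
})
resp.raise_for_status()
print(resp.text)  # cross-references based on 100% sequence identity
```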

  19. State E-Government Strategies: Identifying Best Practices and Applications

    Science.gov (United States)

    2007-07-23

    Internet; • Developing meaningful online applications for local government, businesses, educators, and other sectors; • Establishing local “eCommunity...state, national, and international levels. However, frequently there is little meaningful coordination or communication between various e-government...weekly with the governor, 13% reported meeting monthly, and 21% reported “other,” meaning that these states have a different meeting schedule

  20. Mobile Application Identification based on Hidden Markov Model

    Directory of Open Access Journals (Sweden)

    Yang Xinyan

    2018-01-01

    Full Text Available With the increasing number of mobile applications, there are more challenging network management tasks to resolve. Users also face security issues of mobile Internet applications while enjoying mobile network resources. Identifying the applications that correspond to network traffic can help network operators perform network management effectively. Existing mobile application recognition techniques face two problems: they cannot recognize applications that use encryption protocols, and their scalability is poor. In this paper, a mobile application identification method based on the Hidden Markov Model (HMM) is proposed, which extracts defined statistical characteristics from the different network flows generated when each application starts. The time information of the different network flows is used to obtain the corresponding time series, and a separate HMM is then established for each application to be identified. We then use 10 common applications to test the proposed method. The test results show that the proposed mobile application recognition method has high accuracy and good generalization ability.
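
    A sketch of the per-application scheme, assuming the hmmlearn package: one Gaussian HMM is trained per application on statistics of its start-up flows, and an unknown trace is assigned to the model with the highest log-likelihood. The features below are synthetic placeholders, not the paper's flow statistics:

```python
# Hedged sketch of per-application HMMs with hmmlearn: train one model per
# app, classify a trace by maximum log-likelihood. Data are synthetic.
import numpy as np
from hmmlearn.hmm import GaussianHMM

def train_models(traces_by_app, n_states=4):
    """traces_by_app: {app_name: list of (T_i, n_features) arrays}."""
    models = {}
    for app, traces in traces_by_app.items():
        X = np.vstack(traces)              # concatenated observations
        lengths = [len(t) for t in traces]  # per-trace lengths for hmmlearn
        m = GaussianHMM(n_components=n_states, covariance_type="diag", n_iter=50)
        m.fit(X, lengths)
        models[app] = m
    return models

def identify(models, trace):
    """Return the application whose HMM scores the observed trace highest."""
    return max(models, key=lambda app: models[app].score(trace))

# Usage with synthetic flow statistics for two hypothetical apps:
rng = np.random.default_rng(0)
data = {"appA": [rng.normal(0, 1, (30, 3)) for _ in range(5)],
        "appB": [rng.normal(2, 1, (30, 3)) for _ in range(5)]}
models = train_models(data)
print(identify(models, rng.normal(2, 1, (25, 3))))  # expected: appB
```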

  1. Identifying reverse 3PL performance critical success factors

    OpenAIRE

    Sharif, A M

    2009-01-01

    The reverse and third party logistics operational process is now well known and established to be a vital component of modern day supply chain and product / service-based organizations (Marasco, 2007). Apart from being a vital component of such enterprises, many researchers and practitioners have also been noting the importance of this approach and its impact on customer service, satisfaction, profitability and other key performance indicators (Autry et al., 2001). However, studies relating t...

  2. Rapid Prototyping of High Performance Signal Processing Applications

    Science.gov (United States)

    Sane, Nimish

    Advances in embedded systems for digital signal processing (DSP) are enabling many scientific projects and commercial applications. At the same time, these applications are key to driving advances in many important kinds of computing platforms. In this region of high performance DSP, rapid prototyping is critical for faster time-to-market (e.g., in the wireless communications industry) or time-to-science (e.g., in radio astronomy). DSP system architectures have evolved from being based on application specific integrated circuits (ASICs) to incorporate reconfigurable off-the-shelf field programmable gate arrays (FPGAs), the latest multiprocessors such as graphics processing units (GPUs), or heterogeneous combinations of such devices. We, thus, have a vast design space to explore based on performance trade-offs, and expanded by the multitude of possibilities for target platforms. In order to allow systematic design space exploration, and develop scalable and portable prototypes, model based design tools are increasingly used in design and implementation of embedded systems. These tools allow scalable high-level representations, model based semantics for analysis and optimization, and portable implementations that can be verified at higher levels of abstractions and targeted toward multiple platforms for implementation. The designer can experiment using such tools at an early stage in the design cycle, and employ the latest hardware at later stages. In this thesis, we have focused on dataflow-based approaches for rapid DSP system prototyping. This thesis contributes to various aspects of dataflow-based design flows and tools as follows: 1. We have introduced the concept of topological patterns, which exploits commonly found repetitive patterns in DSP algorithms to allow scalable, concise, and parameterizable representations of large scale dataflow graphs in high-level languages. We have shown how an underlying design tool can systematically exploit a high

  3. Human Performance Optimization Metrics: Consensus Findings, Gaps, and Recommendations for Future Research.

    Science.gov (United States)

    Nindl, Bradley C; Jaffin, Dianna P; Dretsch, Michael N; Cheuvront, Samuel N; Wesensten, Nancy J; Kent, Michael L; Grunberg, Neil E; Pierce, Joseph R; Barry, Erin S; Scott, Jonathan M; Young, Andrew J; OʼConnor, Francis G; Deuster, Patricia A

    2015-11-01

    Human performance optimization (HPO) is defined as "the process of applying knowledge, skills and emerging technologies to improve and preserve the capabilities of military members, and organizations to execute essential tasks." The lack of consensus for operationally relevant and standardized metrics that meet joint military requirements has been identified as the single most important gap for research and application of HPO. In 2013, the Consortium for Health and Military Performance hosted a meeting to develop a toolkit of standardized HPO metrics for use in military and civilian research, and potentially for field applications by commanders, units, and organizations. Performance was considered from a holistic perspective as being influenced by various behaviors and barriers. To accomplish the goal of developing a standardized toolkit, key metrics were identified and evaluated across a spectrum of domains that contribute to HPO: physical performance, nutritional status, psychological status, cognitive performance, environmental challenges, sleep, and pain. These domains were chosen based on relevant data with regard to performance enhancers and degraders. The specific objectives at this meeting were to (a) identify and evaluate current metrics for assessing human performance within selected domains; (b) prioritize metrics within each domain to establish a human performance assessment toolkit; and (c) identify scientific gaps and the needed research to more effectively assess human performance across domains. This article provides a summary of 150 total HPO metrics across multiple domains that can be used as a starting point - the beginning of an HPO toolkit: physical fitness (29 metrics), nutrition (24 metrics), psychological status (36 metrics), cognitive performance (35 metrics), environment (12 metrics), sleep (9 metrics), and pain (5 metrics). These metrics can be particularly valuable as the military emphasizes a renewed interest in Human Dimension efforts

  4. Core-Shell Columns in High-Performance Liquid Chromatography: Food Analysis Applications

    Science.gov (United States)

    Preti, Raffaella

    2016-01-01

    The increased separation efficiency provided by the new technology of columns packed with core-shell particles in high-performance liquid chromatography (HPLC) has resulted in their widespread diffusion in several analytical fields, from pharmaceutical and biological to environmental and toxicological analysis. The present paper reviews their most recent applications in food analysis. Their use has proved to be particularly advantageous for the determination of compounds at trace levels or when a large number of samples must be analyzed quickly using reliable and solvent-saving apparatus. The literature described here shows that the outstanding performance provided by core-shell particle columns on traditional HPLC instruments is comparable to that obtained with costly UHPLC instrumentation, making these novel columns a promising key tool in food analysis. PMID:27143972

  5. Multiple Genes Related to Muscle Identified through a Joint Analysis of a Two-stage Genome-wide Association Study for Racing Performance of 1,156 Thoroughbreds

    Directory of Open Access Journals (Sweden)

    Dong-Hyun Shin

    2015-06-01

    Full Text Available Thoroughbred, a relatively recent horse breed, is best known for its use in horse racing. Although myostatin (MSTN) variants have been reported to be highly associated with horse racing performance, the trait is more likely to be polygenic in nature. The purpose of this study was to identify genetic variants strongly associated with racing performance by using estimated breeding value (EBV) for race time as a phenotype. We conducted a two-stage genome-wide association study to search for genetic variants associated with the EBV. In the first stage of the genome-wide association study, a relatively large number of markers (~54,000 single-nucleotide polymorphisms, SNPs) was evaluated in a small number of samples (240 horses). In the second stage, a relatively small number of markers identified to have large effects (170 SNPs) was evaluated in a much larger number of samples (1,156 horses). We also validated the SNPs related to MSTN known to have large effects on racing performance and found significant associations in the stage two analysis, but not in stage one. We identified 28 significant SNPs related to 17 genes. Among these, six genes have a function related to myogenesis and five genes are involved in muscle maintenance. To our knowledge, these genes are newly reported for the genetic association with racing performance of Thoroughbreds. This complements a recent horse genome-wide association study of racing performance that identified other SNPs and genes as the most significant variants. These results will help to expand our knowledge of the polygenic nature of racing performance in Thoroughbreds.
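
    A toy sketch of the two-stage screen-then-confirm design described above, assuming genotype dosages and EBV phenotypes as NumPy arrays (a real analysis would add covariates, relatedness corrections and proper multiple-testing control):

        import numpy as np
        from scipy.stats import linregress

        def scan(genotypes, ebv):
            # Per-SNP p-value from a simple regression of EBV on dosage.
            return np.array([linregress(genotypes[:, j], ebv).pvalue
                             for j in range(genotypes.shape[1])])

        def two_stage(g1, ebv1, g2, ebv2, n_select=170, alpha=0.05):
            p1 = scan(g1, ebv1)              # stage 1: ~54,000 SNPs, 240 horses
            top = np.argsort(p1)[:n_select]  # carry the strongest signals forward
            p2 = scan(g2[:, top], ebv2)      # stage 2: 170 SNPs, 1,156 horses
            return top[p2 < alpha / n_select]  # Bonferroni on the reduced set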

  6. Breaking barriers to interoperability: assigning spatially and temporally unique identifiers to spaces and buildings.

    Science.gov (United States)

    Pyke, Christopher R; Madan, Isaac

    2013-08-01

    The real estate industry routinely uses specialized information systems for functions including design, construction, facilities management, brokerage, tax assessment, and utilities. These systems are mature and effective within vertically integrated market segments. However, new questions are reaching across these traditional information silos. For example, buyers may be interested in evaluating the design, energy efficiency characteristics, and operational performance of a commercial building. This requires the integration of information across multiple databases held by different institutions. Today, this type of data integration is difficult to automate and prone to errors due, in part, to the lack of generally accepted building and space identifiers. Moving forward, the real estate industry needs a new mechanism to assign identifiers to whole buildings and interior spaces for the purpose of interoperability, data exchange, and integration. This paper describes a systematic process to identify activities occurring at a building or within its interior spaces to provide a foundation for exchange and interoperability. We demonstrate the application of the approach with a prototype Web application. This concept and demonstration illustrate the elements of a practical interoperability framework that can increase productivity, create new business opportunities, and reduce errors, waste, and redundancy. © 2013 New York Academy of Sciences.

  7. Identifying the oil price-macroeconomy relationship. An empirical mode decomposition analysis of US data

    International Nuclear Information System (INIS)

    Oladosu, Gbadebo

    2009-01-01

    This paper employs the empirical mode decomposition (EMD) method to filter cyclical components of US quarterly gross domestic product (GDP) and quarterly average oil price (West Texas Intermediate - WTI). The method is adaptive and applicable to non-linear and non-stationary data. A correlation analysis of the resulting components is performed and examined for insights into the relationship between oil and the economy. Several components of this relationship are identified. However, the principal one is that the medium-run component of the oil price has a negative relationship with the main cyclical component of the GDP. In addition, weak correlations suggesting a lagging, demand-driven component and a long-run component of the relationship were also identified. Comparisons of these findings with significant oil supply disruption and recession dates were supportive. The study identifies a number of lessons applicable to recent oil market events, including the eventuality of persistent oil price and economic decline following a long oil price run-up. In addition, it was found that oil market related exogenous events are associated with short- to medium-run price implications regardless of whether they lead to actual supply losses. (author)
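
    A compact sketch of this kind of analysis, assuming the PyEMD package (EMD-signal on PyPI); oil and gdp stand in for the quarterly WTI and GDP series used in the paper:

        import numpy as np
        from PyEMD import EMD

        def component_correlations(oil, gdp):
            # Decompose each series into intrinsic mode functions (IMFs),
            # from the fastest cycle to the slowest residual trend.
            oil_imfs = EMD().emd(np.asarray(oil, dtype=float))
            gdp_imfs = EMD().emd(np.asarray(gdp, dtype=float))
            # Correlate every oil component with every GDP component.
            return np.array([[np.corrcoef(o, g)[0, 1] for g in gdp_imfs]
                             for o in oil_imfs])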

  8. Performance Analysis of Faulty Gallager-B Decoding of QC-LDPC Codes with Applications

    Directory of Open Access Journals (Sweden)

    O. Al Rasheed

    2014-06-01

    Full Text Available In this paper we evaluate the performance of the Gallager-B algorithm, used for decoding low-density parity-check (LDPC) codes, under unreliable message computation. Our analysis is restricted to LDPC codes constructed from circular matrices (QC-LDPC codes). Using Monte Carlo simulation we investigate the effects of different code parameters on coding system performance, under a binary symmetric communication channel and an independent transient-fault model. One possible application of the presented analysis in designing memory architecture with unreliable components is considered.
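
    A simplified bit-flipping relative of the Gallager-B idea, with transient faults injected into the syndrome computation (a sketch of the fault model only, not the paper's exact decoder); H is any binary parity-check matrix and y the hard-decision channel output, both integer NumPy arrays:

        import numpy as np

        def faulty_bit_flip_decode(H, y, max_iter=20, p_fault=0.0, rng=None):
            rng = rng or np.random.default_rng()
            x = y.copy()
            for _ in range(max_iter):
                syndrome = (H @ x) % 2             # failed parity checks
                if p_fault > 0:                    # independent transient faults
                    syndrome ^= (rng.random(len(syndrome)) < p_fault).astype(int)
                if not syndrome.any():
                    break                          # a valid codeword was reached
                votes = H.T @ syndrome             # failed checks touching each bit
                flip = votes > H.sum(axis=0) / 2   # flip on a majority of failures
                x[flip] ^= 1
            return x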

  9. Identifying key performance indicators in food technology contract R&D

    NARCIS (Netherlands)

    Flipse, S.M.; Sanden, van der M.C.A.; Velden, van der T.; Fortuin, F.T.J.M.; Omta, S.W.F.; Osseweijer, P.

    2013-01-01

    Innovating companies increasingly rely on outsourcing to Contract Research Organisations (CROs) for their Research and Development (R&D), which are largely understudied. This paper presents the outcome of a case study in the field of food technology contract research, identifying context

  10. Photoperiodic envelope: application of generative design based on the performance of architectural envelopes, exploring shape and performance optimization

    International Nuclear Information System (INIS)

    Viquez Alas, Ernesto Alonso

    2013-01-01

    An alternative design method for creating an architectural envelope is demonstrated, through the application of tools and techniques such as algorithms, optimization, parametrization and simulation. The aesthetic criteria of the form are enriched while achieving a decrease in solar radiation rates. The methods and techniques of optimization, simulation, analysis and synthesis are framed by the study of the contemporary paradigms of generative design and performance-based design. Some potential benefits of this alternative design method, and the conditions to be met to facilitate its application in envelope design, are identified. An application and testing study explores the envelope topology, and the optimization results for reducing solar incidence are examined in a simulated environment [es

  11. Market projections of cellulose nanomaterial-enabled products- Part 1: Applications

    Science.gov (United States)

    Jo Anne Shatkin; Theodore H. Wegner; E.M. (Ted) Bilek; John Cowie

    2014-01-01

    Nanocellulose provides a new materials platform for the sustainable production of high-performance nano-enabled products in an array of applications. In this paper, potential applications for cellulose nanomaterials are identified as the first step toward estimating market volume. The overall study, presented in two parts, estimates market volume on the basis of...

  12. Performance characterization of gallium nitride HEMT cascode switch for power conditioning applications

    International Nuclear Information System (INIS)

    Chou, Po-Chien; Cheng, Stone

    2015-01-01

    Highlights: • We develop a TO-257 cascoded GaN switch configuration for power conversion applications. • The normally-off cascode circuit provides 14.6 A/600 V characteristics. • Analysis of resistive and inductive switching performance shown in loaded circuits. • A 48-to-96 V boost converter is used to evaluate the benefit of GaN cascode switches. - Abstract: A hybrid cascoded GaN switch configuration is demonstrated in power conversion applications. A novel metal package is proposed for the packaging of a D-mode GaN MIS-HEMT cascoded with an integrated power MOSFET and an SBD. The normally-off cascode circuit provides a maximum drain current of 14.6 A and a blocking capability of 600 V. The 200 V/1 A power conversion characteristics are discussed and show excellent switching performance in load circuits. Switching characteristics of the integral SiC SBD are also demonstrated. Finally, a 48-to-96 V boost converter is used to evaluate the benefit of GaN cascode switches. These results show that high-voltage GaN-HEMTs can be switching devices for an ultralow-loss converter circuit

  13. Performance characterization of gallium nitride HEMT cascode switch for power conditioning applications

    Energy Technology Data Exchange (ETDEWEB)

    Chou, Po-Chien; Cheng, Stone, E-mail: stonecheng@mail.nctu.edu.tw

    2015-08-15

    Highlights: • We develop a TO-257 cascoded GaN switch configuration for power conversion applications. • The normally-off cascode circuit provides 14.6 A/600 V characteristics. • Analysis of resistive and inductive switching performance shown in loaded circuits. • A 48-to-96 V boost converter is used to evaluate the benefit of GaN cascode switches. - Abstract: A hybrid cascoded GaN switch configuration is demonstrated in power conversion applications. A novel metal package is proposed for the packaging of a D-mode GaN MIS-HEMT cascoded with an integrated power MOSFET and an SBD. The normally-off cascode circuit provides a maximum drain current of 14.6 A and a blocking capability of 600 V. The 200 V/1 A power conversion characteristics are discussed and show excellent switching performance in load circuits. Switching characteristics of the integral SiC SBD are also demonstrated. Finally, a 48-to-96 V boost converter is used to evaluate the benefit of GaN cascode switches. These results show that high-voltage GaN-HEMTs can be switching devices for an ultralow-loss converter circuit.

  14. Determination of performance characteristics of scientific applications on IBM Blue Gene/Q

    Energy Technology Data Exchange (ETDEWEB)

    Evangelinos, C. [IBM Research Division, Cambridge, MA (United States); Walkup, R. E. [IBM, Yorktown Heights, NY (United States). Thomas J. Watson Research Center; Sachdeva, V. [IBM Research Division, Cambridge, MA (United States); Jordan, K. E. [IBM Research Division, Cambridge, MA (United States); Gahvari, H. [Univ. of Illinois, Urbana-Champaign, IL (United States). Computer Science Dept.; Chung, I. -H. [IBM, Yorktown Heights, NY (United States). Thomas J. Watson Research Center; Perrone, M. P. [IBM, Yorktown Heights, NY (United States). Thomas J. Watson Research Center; Lu, L. [IBM, Yorktown Heights, NY (United States). Thomas J. Watson Research Center; Liu, L. -K. [IBM, Yorktown Heights, NY (United States). Thomas J. Watson Research Center; Magerlein, K. [IBM, Yorktown Heights, NY (United States). Thomas J. Watson Research Center

    2013-02-13

    The IBM Blue Gene®/Q platform presents scientists and engineers with a rich set of hardware features such as 16 cores per chip sharing a Level 2 cache, a wide SIMD (single-instruction, multiple-data) unit, a five-dimensional torus network, and hardware support for collective operations. Especially important is the feature related to cores that have four “hardware threads,” which makes it possible to hide latencies and obtain a high fraction of the peak issue rate from each core. All of these hardware resources present unique performance-tuning opportunities on Blue Gene/Q. We provide an overview of several important applications and solvers and study them on Blue Gene/Q using performance counters and Message Passing Interface profiles. We also discuss how Blue Gene/Q tools help us understand the interaction of the application with the hardware and software layers and provide guidance for optimization. Furthermore, on the basis of our analysis, we discuss code improvement strategies targeting Blue Gene/Q. Information about how these algorithms map to the Blue Gene® architecture is expected to have an impact on future system design as we move to the exascale era.

  15. Reusable Software Usability Specifications for mHealth Applications.

    Science.gov (United States)

    Cruz Zapata, Belén; Fernández-Alemán, José Luis; Toval, Ambrosio; Idri, Ali

    2018-01-25

    One of the key factors for the adoption of mobile technologies, and in particular of mobile health applications, is usability. A usable application will be easier to use and understand, and will improve users' interaction with it. This paper proposes a software requirements catalog for usable mobile health applications, which can be used for the development of new applications or the evaluation of existing ones. The catalog is based on the main sources identified in the literature on usability and mobile health applications. Our catalog was organized according to the ISO/IEC/IEEE 29148:2011 standard and follows the SIREN methodology to create reusable catalogs. The applicability of the catalog was verified by the creation of an audit method, which was used to evaluate a real app, S Health, created by Samsung Electronics Co. The usability requirements catalog, along with the audit method, identified several usability flaws in the evaluated app, which scored 83%. Some flaws related to the navigation pattern were detected in the app, and further issues related to the startup experience, empty screens, and writing style were also found. The way a user navigates through an application improves or deteriorates the user's experience with it. We proposed a reusable usability catalog and an audit method, and used this proposal to evaluate a mobile health application. An audit report was created with the usability issues identified in the evaluated application.

  16. Exome sequencing identifies ZNF644 mutations in high myopia.

    Directory of Open Access Journals (Sweden)

    Yi Shi

    2011-06-01

    Full Text Available Myopia is the most common ocular disorder worldwide, and high myopia in particular is one of the leading causes of blindness. Genetic factors play a critical role in the development of myopia, especially high myopia. Recently, the exome sequencing approach has been successfully used for the disease gene identification of Mendelian disorders. Here we show a successful application of exome sequencing to identify a gene for an autosomal dominant disorder, and we have identified a gene potentially responsible for high myopia in a monogenic form. We captured exomes of two affected individuals from a Han Chinese family with high myopia and performed sequencing analysis by a second-generation sequencer with a mean coverage of 30× and sufficient depth to call variants at ∼97% of each targeted exome. The shared genetic variants of these two affected individuals in the family being studied were filtered against the 1000 Genomes Project and the dbSNP131 database. A mutation A672G in zinc finger protein 644 isoform 1 (ZNF644) was identified as being related to the phenotype of this family. After we performed sequencing analysis of the exons in the ZNF644 gene in 300 sporadic cases of high myopia, we identified an additional five mutations (I587V, R680G, C699Y, 3'UTR+12 C>G, and 3'UTR+592 G>A) in 11 different patients. All these mutations were absent in 600 normal controls. The ZNF644 gene was expressed in human retina and retinal pigment epithelium (RPE). Given that ZNF644 is predicted to be a transcription factor that may regulate genes involved in eye development, these mutations may cause the axial elongation of the eyeball found in high myopia patients. Our results suggest that ZNF644 might be a causal gene for high myopia in a monogenic form.
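
    The core filtering step lends itself to a small illustration; variant identifiers here are toy placeholders (real pipelines operate on annotated VCF records):

        def candidate_variants(affected_1, affected_2, dbsnp, kg):
            shared = affected_1 & affected_2     # variants seen in both affecteds
            return shared - dbsnp - kg           # drop known polymorphisms

        a1 = {"ZNF644:A672G", "GENE1:X1Y"}
        a2 = {"ZNF644:A672G", "GENE2:Q5R"}
        known = {"GENE1:X1Y"}
        print(candidate_variants(a1, a2, known, set()))  # {'ZNF644:A672G'}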

  17. Applicability of the Linear Sorption Isotherm Model to Represent Contaminant Transport Processes in Site Wide Performance Assessments

    International Nuclear Information System (INIS)

    FOGWELL, T.W.; LAST, G.V.

    2003-01-01

    The estimation of flux of contaminants through the vadose zone to the groundwater under varying geologic, hydrologic, and chemical conditions is key to making technically credible and sound decisions regarding soil site characterization and remediation, single-shell tank retrieval, and waste site closures (DOE 2000). One of the principal needs identified in the science and technology roadmap (DOE 2000) is the need to improve the conceptual and numerical models that describe the location of contaminants today, and to provide the basis for forecasting future movement of contaminants on both site-specific and site-wide scales. The State of Knowledge (DOE 1999) and Preliminary Concepts documents describe the importance of geochemical processes on the transport of contaminants through the Vadose Zone. These processes have been identified in the international list of Features, Events, and Processes (FEPs) (NEA 2000) and included in the list of FEPs currently being developed for Hanford Site assessments (Soler et al. 2001). The current vision for Hanford site-wide cumulative risk assessments as performed using the System Assessment Capability (SAC) is to represent contaminant adsorption using the linear isotherm (empirical distribution coefficient, Kd) sorption model. Integration Project Expert Panel (PEP) comments indicate that work is required to adequately justify the applicability of the linear sorption model, and to identify and defend the range of Kd values that are adopted for assessments. The work plans developed for the Science and Technology (S and T) efforts, SAC, and the Core Projects must answer directly the question of "Is there a scientific basis for the application of the linear sorption isotherm model to the complex wastes of the Hanford Site?" This paper is intended to address these issues. The reason that well documented justification is required for using the linear sorption (Kd) model is that this approach is strictly empirical and is often
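
    For reference, the linear sorption isotherm behind the Kd approach can be written in standard textbook notation (not taken from the cited documents) as:

        S = K_d \, C, \qquad
        R = 1 + \frac{\rho_b}{\theta} K_d, \qquad
        v_{\mathrm{contaminant}} = \frac{v_{\mathrm{water}}}{R}

    where S is the sorbed concentration, C the aqueous concentration, \rho_b the bulk density, \theta the volumetric water content, and R the retardation factor; this empirical, single-parameter form of the model is exactly what the justification discussed above must defend.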

  18. High-performance floating-point image computing workstation for medical applications

    Science.gov (United States)

    Mills, Karl S.; Wong, Gilman K.; Kim, Yongmin

    1990-07-01

    The medical imaging field relies increasingly on imaging and graphics techniques in diverse applications with needs similar to (or more stringent than) those of the military, industrial and scientific communities. However, most image processing and graphics systems available for use in medical imaging today are either expensive, specialized, or in most cases both. High-performance imaging and graphics workstations which can provide real-time results for a number of applications, while maintaining affordability and flexibility, can facilitate the application of digital image computing techniques in many different areas. This paper describes the hardware and software architecture of a medium-cost floating-point image processing and display subsystem for the NeXT computer, and its applications as a medical imaging workstation. Medical imaging applications of the workstation include use in a Picture Archiving and Communications System (PACS), in multimodal image processing and 3-D graphics workstation for a broad range of imaging modalities, and as an electronic alternator utilizing its multiple monitor display capability and large and fast frame buffer. The subsystem provides a 2048 x 2048 x 32-bit frame buffer (16 Mbytes of image storage) and supports both 8-bit gray scale and 32-bit true color images. When used to display 8-bit gray scale images, up to four different 256-color palettes may be used for each of four 2K x 2K x 8-bit image frames. Three of these image frames can be used simultaneously to provide pixel selectable region of interest display. A 1280 x 1024 pixel screen with 1:1 aspect ratio can be windowed into the frame buffer for display of any portion of the processed image or images. In addition, the system provides hardware support for integer zoom and an 82-color cursor. This subsystem is implemented on an add-in board occupying a single slot in the NeXT computer. Up to three boards may be added to the NeXT for multiple display capability (e

  19. Differentiation and Exploration of Model MACP for HE VER 1.0 on Prototype Performance Measurement Application for Higher Education

    Science.gov (United States)

    El Akbar, R. Reza; Anshary, Muhammad Adi Khairul; Hariadi, Dennis

    2018-02-01

    Model MACP for HE ver.1 describes how to measure and monitor performance for higher education. A review of the research related to the model identified several components to develop further, so this research has four main objectives. The first is to differentiate the CSF (critical success factor) components in the previous model; the second is to explore the KPIs (key performance indicators) in the previous model; the third, building on the first two, is to design a new and more detailed model; and the fourth is to design a prototype application for performance measurement in higher education based on the new model. The method used is exploratory research, with the application designed using the prototyping method. The results of this study are, first, a more detailed new model for measuring and monitoring performance in higher education through differentiation and exploration of Model MACP for HE ver.1; second, a dictionary of college performance measurement compiled by re-evaluating the existing indicators; and third, the design of a prototype application for performance measurement in higher education.

  20. New pulsed YAG laser performances in cutting thick metallic materials for nuclear applications

    International Nuclear Information System (INIS)

    Alfille, J.P.; Prunele, D. de; Pilot, G.

    1996-01-01

    The purpose of this study was to evaluate the capabilities of the pulsed YAG laser for thick cutting of metallic materials and to compare them with the capabilities of the CO2 laser. Stainless steel (304L) cutting tests were made in air and underwater using CO2 and YAG lasers. A performance assessment was made for each laser; the wastes produced in the cutting operation were measured and the gases and aerosols analyzed. The results show that the pulsed YAG laser is a high-performance tool for thick cutting and particularly attractive for nuclear applications

  1. Identifying a Superfluid Reynolds Number via Dynamical Similarity.

    Science.gov (United States)

    Reeves, M T; Billam, T P; Anderson, B P; Bradley, A S

    2015-04-17

    The Reynolds number provides a characterization of the transition to turbulent flow, with wide application in classical fluid dynamics. Identifying such a parameter in superfluid systems is challenging due to their fundamentally inviscid nature. Performing a systematic study of superfluid cylinder wakes in two dimensions, we observe dynamical similarity of the frequency of vortex shedding by a cylindrical obstacle. The universality of the turbulent wake dynamics is revealed by expressing shedding frequencies in terms of an appropriately defined superfluid Reynolds number, Re(s), that accounts for the breakdown of superfluid flow through quantum vortex shedding. For large obstacles, the dimensionless shedding frequency exhibits a universal form that is well-fitted by a classical empirical relation. In this regime the transition to turbulence occurs at Re(s)≈0.7, irrespective of obstacle width.
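
    In LaTeX form, the commonly quoted definition from this line of work (conventions may differ slightly from the paper) is:

        \mathrm{Re}_s = \frac{(v - v_c)\, d}{\kappa}, \qquad \kappa = \frac{h}{m}

    where v is the flow speed, v_c the critical speed for vortex shedding, d the obstacle width, and \kappa = h/m the quantum of circulation, which takes the place of the kinematic viscosity in the classical definition.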

  2. MetReS, an Efficient Database for Genomic Applications.

    Science.gov (United States)

    Vilaplana, Jordi; Alves, Rui; Solsona, Francesc; Mateo, Jordi; Teixidó, Ivan; Pifarré, Marc

    2018-02-01

    MetReS (Metabolic Reconstruction Server) is a genomic database that is shared between two software applications that address important biological problems. Biblio-MetReS is a data-mining tool that enables the reconstruction of molecular networks based on automated text-mining analysis of published scientific literature. Homol-MetReS allows functional (re)annotation of proteomes, to properly identify both the individual proteins involved in the processes of interest and their function. The main goal of this work was to identify the areas where the performance of the MetReS database could be improved and to test whether this improvement would scale to larger datasets and more complex types of analysis. The study was started with a relational database, MySQL, which is the current database server used by the applications. We also tested the performance of an alternative data-handling framework, Apache Hadoop. Hadoop is currently used for large-scale data processing. We found that this data-handling framework is likely to greatly improve the efficiency of the MetReS applications as the dataset and the processing needs increase by several orders of magnitude, as is expected to happen in the near future.

  3. Practical application of the KMS: 1) total system performance assessment - 16349

    International Nuclear Information System (INIS)

    Makino, Hitoshi; Hioki, Kazumasa; Umeki, Hiroyuki; Yang, Hongzhi; Takase, Hiroyasu; McKinley, Ian

    2009-01-01

    Comprehensive total system performance assessment (PA) is a key component of the safety case. Within this PA there are a number of tasks that reuse specific models and datasets, together with the associated knowledge base for the disposal system considered. These are tasks where recent developments in the Knowledge Management System by the Japan Atomic Energy Agency (JAEA KMS) can lead to optimisation of procedures. This paper will outline the reformulation of PA as a Knowledge Management (KM) task, discuss the application of KM technologies to PA tasks, and illustrate how these can be handled electronically in a 'Performance assessment All-In-one Report System (PAIRS)' utilising hyper-links and embedded tools to minimise duplication of material, ease Quality Assurance (QA) and facilitate the regular updating required in the Japanese programme. (authors)

  4. Performance Confirmation Data Acquisition System

    International Nuclear Information System (INIS)

    D.W. Markman

    2000-01-01

    The purpose of this analysis is to identify and analyze concepts for the acquisition of data in support of the Performance Confirmation (PC) program at the potential subsurface nuclear waste repository at Yucca Mountain. The scope and primary objectives of this analysis are to: (1) Review the criteria for design as presented in the Performance Confirmation Data Acquisition/Monitoring System Description Document, by way of the Input Transmittal, Performance Confirmation Input Criteria (CRWMS M and O 1999c). (2) Identify and describe existing and potential new trends in data acquisition system software and hardware that would support the PC plan. The data acquisition software and hardware will support the field instruments and equipment that will be installed for the observation and perimeter drift borehole monitoring, and in-situ monitoring within the emplacement drifts. The exhaust air monitoring requirements will be supported by a data communication network interface with the ventilation monitoring system database. (3) Identify the concepts and features that a data acquisition system should have in order to support the PC process and its activities. (4) Based on PC monitoring needs and available technologies, further develop concepts of a potential data acquisition system network in support of the PC program and the Site Recommendation and License Application

  5. Development of a Mobile Application for Building Energy Prediction Using Performance Prediction Model

    Directory of Open Access Journals (Sweden)

    Yu-Ri Kim

    2016-03-01

    Full Text Available Recently, the Korean government has enforced disclosure of building energy performance, so that such information can help owners and prospective buyers to make suitable investment plans. This building energy performance policy makes it mandatory for building owners to obtain engineering audits and thereby evaluate the energy performance levels of their buildings. However, to calculate energy performance levels (i.e., the asset rating methodology), a qualified expert needs access to at least the full project documentation and/or to conduct an on-site inspection of the buildings. Energy performance certification costs a lot of time and money, and the database of certified buildings is still actually quite small. The need is therefore increasing for a simplified and user-friendly energy performance prediction tool for non-specialists, as well as for a database which allows building owners and users to compare best practices. In this regard, the current study developed a simplified performance prediction model through experimental design, energy simulations and ANOVA (analysis of variance). Furthermore, using the new prediction model, a related mobile application was also developed.
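
    A minimal illustration of such a simplified prediction model, assuming scikit-learn and entirely invented design factors and simulation outputs (the study's actual factors and coefficients are not reproduced):

        import numpy as np
        from sklearn.linear_model import LinearRegression

        # Columns: floor area (m^2), window-to-wall ratio, envelope U-value,
        # occupancy density; rows would come from the simulation runs.
        X = np.array([[1200, 0.35, 1.8, 0.08],
                      [ 800, 0.50, 2.4, 0.05],
                      [2500, 0.20, 1.2, 0.10]])
        y = np.array([95.0, 128.0, 74.0])        # simulated kWh/m^2 per year

        model = LinearRegression().fit(X, y)     # simplified surrogate model
        print(model.predict([[1000, 0.40, 2.0, 0.07]]))  # estimate for a new case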

  6. Performance Tuning of x86 OpenMP Codes with MAQAO

    Science.gov (United States)

    Barthou, Denis; Charif Rubial, Andres; Jalby, William; Koliai, Souad; Valensi, Cédric

    Failing to find the best optimization sequence for a given application code can lead to compiler-generated code with poor performance or inappropriate code. It is necessary to analyze performance from the generated assembly code to improve on the compilation process. This paper presents a tool for the performance analysis of multithreaded codes (OpenMP programs are supported at the moment). MAQAO relies on static performance evaluation to identify compiler optimizations and assess performance of loops. It exploits static binary rewriting for reading and instrumenting object files or executables. Static binary instrumentation allows the insertion of probes at instruction level. Memory accesses can be captured to help tune the code, but such traces must be compressed. MAQAO can analyze the results and provide hints for tuning the code. We show on some examples how this can help users improve their OpenMP applications.

  7. Mobile Applications Privacy, Towards a methodology to identify over-privileged applications

    OpenAIRE

    NAI FOVINO Igor; NEISSE RICARDO; GENEIATAKIS DIMITRIOS; KOUNELIS IOANNIS

    2013-01-01

    Smart-phones are today used to perform a huge number of online activities. They are used as interfaces to access the cloud, as storage resources, as social network tools, agendas, digital wallets, digital identity repositories, etc. In other words, smart-phones are today the citizen's digital companion and, as such, the explicit or implicit repository of a huge amount of personal information. The criticality of these devices is generally due to the following considerations: ...

  8. NDE performance demonstration in the US nuclear power industry - applications, costs, lessons learned, and connection to NDE reliability

    International Nuclear Information System (INIS)

    Ammirato, F.

    1997-01-01

    Periodic inservice inspection (ISI) of nuclear power plant components is performed in the United States to satisfy legal commitments and to provide plant owners with reliable information for managing degradation. Performance demonstration provides credible evidence that ISI will fulfill its objectives. This paper examines the technical requirements for inspection and discusses how these technical needs are used to develop effective performance demonstration applications. NDE reliability is discussed with particular reference to its role in structural integrity assessments and its connection with performance demonstration. It is shown that the role of NDE reliability can range from very small to critical depending on the particular application and must be considered carefully in design of inspection techniques and performance demonstration programs used to qualify the inspection. Finally, the costs, benefits, and problems associated with performance demonstration are reviewed along with lessons learned from more than 15 years of performance demonstration experience in the US. (orig.)

  9. Multidimensional evaluation of performance with experimental application of balanced scorecard: a two year experience

    Science.gov (United States)

    2011-01-01

    Background In today's dynamic health-care system, organizations such as hospitals are required to improve their performance for multiple stakeholders and deliver integrated care, which means working effectively, being innovative and organizing efficiently. Achieved goals and levels of quality can be successfully measured by a multidimensional approach like the Balanced Scorecard (BSC). The aim of the study was to verify the opportunity of introducing the BSC framework to measure performance in St. Anna University Hospital of Ferrara, applying it to the Clinical Laboratory Operative Unit in order to compare performance results and achievement of assigned targets over time. Methods In the first experience with BSC we distinguished four perspectives, according to Kaplan and Norton, identified Key Performance Areas and Key Performance Indicators, set standards and weights for each objective, collected data for all indicators, and recognized cause-and-effect relationships in a strategic map. One year later we proceeded with the next data collection and analysed the preservation of the framework's aptitude to measure Operative Unit performance. In addition, we verified its ability to underline links between strategic actions belonging to different perspectives in producing changes in outcomes. Results The BSC was found to be effective for underlining existing problems and identifying opportunities for improvement. The BSC also revealed the specific perspective contribution to overall performance enhancement. Comparison of results over time was possible depending on the selection of feasible and appropriate key performance indicators, which was occasionally limited by data collection problems. Conclusions The first use of BSC to compare performance at Operative Unit level over time suggested this framework can be successfully adopted for measuring results and revealing effective health factors, allowing health-care quality improvements. PMID:21586111

  10. Multidimensional evaluation of performance with experimental application of balanced scorecard: a two year experience.

    Science.gov (United States)

    Lupi, Silvia; Verzola, Adriano; Carandina, Gianni; Salani, Manuela; Antonioli, Paola; Gregorio, Pasquale

    2011-05-17

    In today's dynamic health-care system, organizations such as hospitals are required to improve their performance for multiple stakeholders and deliver integrated care, which means working effectively, being innovative and organizing efficiently. Achieved goals and levels of quality can be successfully measured by a multidimensional approach like the Balanced Scorecard (BSC). The aim of the study was to verify the opportunity of introducing the BSC framework to measure performance in St. Anna University Hospital of Ferrara, applying it to the Clinical Laboratory Operative Unit in order to compare performance results and achievement of assigned targets over time. In the first experience with BSC we distinguished four perspectives, according to Kaplan and Norton, identified Key Performance Areas and Key Performance Indicators, set standards and weights for each objective, collected data for all indicators, and recognized cause-and-effect relationships in a strategic map. One year later we proceeded with the next data collection and analysed the preservation of the framework's aptitude to measure Operative Unit performance. In addition, we verified its ability to underline links between strategic actions belonging to different perspectives in producing changes in outcomes. The BSC was found to be effective for underlining existing problems and identifying opportunities for improvement. The BSC also revealed the specific perspective contribution to overall performance enhancement. Comparison of results over time was possible depending on the selection of feasible and appropriate key performance indicators, which was occasionally limited by data collection problems. The first use of BSC to compare performance at Operative Unit level over time suggested this framework can be successfully adopted for measuring results and revealing effective health factors, allowing health-care quality improvements.
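
    A toy illustration of the weighted roll-up implied by "standards and weights for each objective" (perspective names follow Kaplan and Norton; all numbers are invented):

        def bsc_score(indicators):
            # indicators: {perspective: [(achieved/target ratio, weight), ...]}
            total = wsum = 0.0
            for entries in indicators.values():
                for ratio, weight in entries:
                    total += min(ratio, 1.0) * weight  # cap over-achievement
                    wsum += weight
            return total / wsum

        example = {"financial": [(0.92, 2)], "customer": [(1.10, 3)],
                   "internal processes": [(0.75, 3)], "learning & growth": [(0.60, 2)]}
        print(round(bsc_score(example), 3))  # 0.829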

  11. Performance Evaluation and Community Application of Low-Cost Sensors for Ozone and Nitrogen Dioxide

    Science.gov (United States)

    This study reports on the performance of electrochemical-based low-cost sensors and their use in a community application. CairClip sensors were collocated with federal reference and equivalent methods and operated in a network of sites by citizen scientists (community members) in...

  12. Identifying patterns of motor performance, executive functioning, and verbal ability in preschool children: A latent profile analysis.

    Science.gov (United States)

    Houwen, Suzanne; Kamphorst, Erica; van der Veer, Gerda; Cantell, Marja

    2018-04-30

    A relationship between motor performance and cognitive functioning is increasingly being recognized. Yet, little is known about the precise nature of the relationship between both domains, especially in early childhood. To identify distinct constellations of motor performance, executive functioning (EF), and verbal ability in preschool aged children; and to explore how individual and contextual variables are related to profile membership. The sample consisted of 119 3- to 4-year old children (62 boys; 52%). The home based assessments consisted of a standardized motor test (Movement Assessment Battery for Children - 2), five performance-based EF tasks measuring inhibition and working memory, and the Receptive Vocabulary subtest from the Wechsler Preschool and Primary Scale of Intelligence Third Edition. Parents filled out the Behavior Rating Inventory of Executive Function - Preschool version. Latent profile analysis (LPA) was used to delineate profiles of motor performance, EF, and verbal ability. Chi-square statistics and multinomial logistic regression analysis were used to examine whether profile membership was predicted by age, gender, risk of motor coordination difficulties, ADHD symptomatology, language problems, and socioeconomic status (SES). LPA yielded three profiles with qualitatively distinct response patterns of motor performance, EF, and verbal ability. Quantitatively, the profiles showed most pronounced differences with regard to parent ratings and performance-based tests of EF, as well as verbal ability. Risk of motor coordination difficulties and ADHD symptomatology were associated with profile membership, whereas age, gender, language problems, and SES were not. Our results indicate that there are distinct subpopulations of children who show differential relations with regard to motor performance, EF, and verbal ability. The fact that we found both quantitative as well as qualitative differences between the three patterns of profiles underscores
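
    A rough stand-in for the latent profile analysis using Gaussian mixtures, assuming scikit-learn (LPA is usually run in dedicated software; this only sketches the information-criterion choice of profile count):

        from sklearn.mixture import GaussianMixture

        def fit_profiles(scores, max_profiles=6, seed=0):
            # scores: (n_children, n_measures) matrix of motor, EF and
            # verbal measures; pick the profile count with the lowest BIC.
            best = None
            for k in range(1, max_profiles + 1):
                gm = GaussianMixture(n_components=k, covariance_type="diag",
                                     random_state=seed).fit(scores)
                if best is None or gm.bic(scores) < best.bic(scores):
                    best = gm
            return best, best.predict(scores)  # model and profile memberships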

  13. Performance optimization of Sparse Matrix-Vector Multiplication for multi-component PDE-based applications using GPUs

    KAUST Repository

    Abdelfattah, Ahmad

    2016-05-23

    Simulations of many multi-component PDE-based applications, such as petroleum reservoirs or reacting flows, are dominated by the solution, on each time step and within each Newton step, of large sparse linear systems. The standard solver is a preconditioned Krylov method. Along with application of the preconditioner, memory-bound Sparse Matrix-Vector Multiplication (SpMV) is the most time-consuming operation in such solvers. Multi-species models produce Jacobians with a dense block structure, where the block size can be as large as a few dozen. Failing to exploit this dense block structure vastly underutilizes hardware capable of delivering high performance on dense BLAS operations. This paper presents a GPU-accelerated SpMV kernel for block-sparse matrices. Dense matrix-vector multiplications within the sparse-block structure leverage optimization techniques from the KBLAS library, a high performance library for dense BLAS kernels. The design ideas of KBLAS can be applied to block-sparse matrices. Furthermore, a technique is proposed to balance the workload among thread blocks when there are large variations in the lengths of nonzero rows. Multi-GPU performance is highlighted. The proposed SpMV kernel outperforms existing state-of-the-art implementations using matrices with real structures from different applications. Copyright © 2016 John Wiley & Sons, Ltd.
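
    The block-structure idea can be illustrated on the CPU with SciPy's BSR format, which likewise stores small dense blocks contiguously (the block size and matrix here are arbitrary stand-ins for a multi-species Jacobian):

        import numpy as np
        from scipy.sparse import bsr_matrix
        from scipy.sparse import random as sprandom

        b = 8                                    # dense block size, e.g. species count
        A = sprandom(512, 512, density=0.05, format="csr", random_state=0)
        A_bsr = bsr_matrix(A, blocksize=(b, b))  # regroup nonzeros into 8x8 blocks
        x = np.ones(512)
        y = A_bsr @ x                            # block-structured SpMV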

  14. Performance optimization of Sparse Matrix-Vector Multiplication for multi-component PDE-based applications using GPUs

    KAUST Repository

    Abdelfattah, Ahmad; Ltaief, Hatem; Keyes, David E.; Dongarra, Jack

    2016-01-01

    Simulations of many multi-component PDE-based applications, such as petroleum reservoirs or reacting flows, are dominated by the solution, on each time step and within each Newton step, of large sparse linear systems. The standard solver is a preconditioned Krylov method. Along with application of the preconditioner, memory-bound Sparse Matrix-Vector Multiplication (SpMV) is the most time-consuming operation in such solvers. Multi-species models produce Jacobians with a dense block structure, where the block size can be as large as a few dozen. Failing to exploit this dense block structure vastly underutilizes hardware capable of delivering high performance on dense BLAS operations. This paper presents a GPU-accelerated SpMV kernel for block-sparse matrices. Dense matrix-vector multiplications within the sparse-block structure leverage optimization techniques from the KBLAS library, a high performance library for dense BLAS kernels. The design ideas of KBLAS can be applied to block-sparse matrices. Furthermore, a technique is proposed to balance the workload among thread blocks when there are large variations in the lengths of nonzero rows. Multi-GPU performance is highlighted. The proposed SpMV kernel outperforms existing state-of-the-art implementations using matrices with real structures from different applications. Copyright © 2016 John Wiley & Sons, Ltd.

  15. Analysis of Application Power and Schedule Composition in a High Performance Computing Environment

    Energy Technology Data Exchange (ETDEWEB)

    Elmore, Ryan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Gruchalla, Kenny [National Renewable Energy Lab. (NREL), Golden, CO (United States); Phillips, Caleb [National Renewable Energy Lab. (NREL), Golden, CO (United States); Purkayastha, Avi [National Renewable Energy Lab. (NREL), Golden, CO (United States); Wunder, Nick [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2016-01-05

    As the capacity of high performance computing (HPC) systems continues to grow, small changes in energy management have the potential to produce significant energy savings. In this paper, we employ an extensive informatics system for aggregating and analyzing real-time performance and power use data to evaluate energy footprints of jobs running in an HPC data center. We look at the effects of algorithmic choices for a given job on the resulting energy footprints, and analyze application-specific power consumption, and summarize average power use in the aggregate. All of these views reveal meaningful power variance between classes of applications as well as chosen methods for a given job. Using these data, we discuss energy-aware cost-saving strategies based on reordering the HPC job schedule. Using historical job and power data, we present a hypothetical job schedule reordering that: (1) reduces the facility's peak power draw and (2) manages power in conjunction with a large-scale photovoltaic array. Lastly, we leverage this data to understand the practical limits on predicting key power use metrics at the time of submission.
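
    A toy version of the reordering idea: align the most power-hungry jobs with the sunniest hours of a photovoltaic array (job names, powers and the PV profile are all invented):

        jobs = [("lammps", 210.0), ("vasp", 180.0),
                ("post", 40.0), ("io", 25.0)]               # average job power, kW
        pv_kw = {9: 80.0, 11: 260.0, 13: 300.0, 15: 240.0}  # PV output by hour

        order = sorted(jobs, key=lambda j: -j[1])           # power-hungry first
        hours = sorted(pv_kw, key=lambda h: -pv_kw[h])      # sunniest hours first
        for h, (name, kw) in sorted(zip(hours, order)):
            print(f"{h:02d}:00  {name:6s}  {kw:5.1f} kW job, {pv_kw[h]:5.1f} kW PV")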

  16. Energy technologies for distributed utility applications: Cost and performance trends, and implications for photovoltaics

    International Nuclear Information System (INIS)

    Eyer, J.M.

    1994-01-01

    Utilities are evaluating several electric generation and storage (G ampersand S) technologies for distributed utility (DU) applications. Attributes of leading DU technologies and implications for photovoltaics (PV) are described. Included is a survey of present and projected cost and performance for: (1) small, advanced combustion turbines (CTs); (2) advanced, natural gas-fired, diesel engines (diesel engines); and (3) advanced lead-acid battery systems (batteries). Technology drivers and relative qualitative benefits are described. A levelized energy cost-based cost target for PV for DU applications is provided. The analysis addresses only relative cost, for PV and for three selected alternative DU technologies. Comparable size, utility, and benefits are assumed, although relative value is application-specific and often technology- and site-specific

  17. Power system technical performance issues related to the application of long HVAC cables

    DEFF Research Database (Denmark)

    Wiechowski, on behalf of Cigre WG C4.502, W.; Sluis, L. V. der; Ohno, Teruo

    2011-01-01

    This paper reports the progress of work of Cigre Working Group C4.502 “Power system technical performance issues related to the application of long HVAC cables”. The primary goal of WG C4.502 is to write a technical brochure that will serve as a practical guide for performing the studies necessary for assessing the technical performance of HV/EHV systems with a large share of AC cable lines. This paper, besides providing a background for the formulation of WG C4.502 and its overall aim, describes the tasks that were accomplished before the interim report was submitted to Study Committee C4 System Technical Performance in August 2010. The work in the WG is ongoing and the final report will be ready according to the time schedule in 2012. The focus of this paper is in particular to show all issues related to system technical performance with assigned weights in terms of their importance and/or uniqueness for cable...

  18. Identifying Social Impacts in Product Supply Chains:Overview and Application of the Social Hotspot Database

    Directory of Open Access Journals (Sweden)

    Gregory Norris

    2012-08-01

    Full Text Available One emerging tool to measure the social-related impacts in supply chains is Social Life Cycle Assessment (S-LCA), a derivative of the well-established environmental LCA technique. LCA has recently started to gain popularity among large corporations and initiatives, such as The Sustainability Consortium or the Sustainable Apparel Coalition. Both have made the technique a cornerstone of their applied-research program. The Social Hotspots Database (SHDB) is an overarching, global database that eases the data collection burden in S-LCA studies. Proposed “hotspots” are production activities or unit processes (also defined as country-specific sectors) in the supply chain that may be at risk for social issues to be present. The SHDB enables efficient application of S-LCA by allowing users to prioritize production activities for which site-specific data collection is most desirable. Data for three criteria are used to inform prioritization: (1) labor intensity in worker hours per unit process; (2) risk for, or opportunity to affect, relevant social themes or sub-categories related to Human Rights, Labor Rights and Decent Work, Governance and Access to Community Services; and (3) gravity of a social issue. The Worker Hours Model was developed using a global input/output economic model and wage rate data. Nearly 200 reputable sources of statistical data have been used to develop 20 Social Theme Tables by country and sector. This paper presents an overview of the SHDB development and features, as well as results from a pilot study conducted on strawberry yogurt. This study, one of seven Social Scoping Assessments mandated by The Sustainability Consortium, identifies the potential social hotspots existing in the supply chain of strawberry yogurt. With this knowledge, companies that manufacture or sell yogurt can refine their data collection efforts in order to put their social responsibility performance in perspective and effectively set up programs and
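
    The worker-hours intensity at the heart of the model reduces to a simple ratio, illustrated here with invented numbers (the SHDB derives both inputs from global input/output tables and wage statistics):

        labor_share = 0.32    # dollars of labor payments per dollar of sector output
        wage_rate = 2.10      # dollars per worker hour in the country-specific sector
        hours_per_dollar = labor_share / wage_rate
        print(f"{hours_per_dollar:.3f} worker hours per dollar of output")  # 0.152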

  19. Application of adjoint sensitivity theory to performance assessment of hydrogeologic concerns

    International Nuclear Information System (INIS)

    Metcalfe, D.E.; Harper, W.V.

    1986-01-01

    Sensitivity and uncertainty analyses are important components of performance assessment activities for potential high-level radioactive waste repositories. The application of the adjoint sensitivity technique is demonstrated for the Leadville Limestone in the Paradox Basin, Utah. The adjoint technique is used sequentially to first assist in the calibration of the regional conceptual ground-water flow model to measured potentiometric data. Second, it is used to evaluate the sensitivities of the calculated pressures used to define local scale boundary conditions to regional parameters and boundary conditions
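
    In generic form (standard adjoint theory, not the report's specific equations), for a flow model F(h; p) = 0 with heads h, parameters p and a performance measure J(h, p), the adjoint state \lambda satisfies

        \Big(\frac{\partial F}{\partial h}\Big)^{T} \lambda
            = \Big(\frac{\partial J}{\partial h}\Big)^{T},
        \qquad
        \frac{dJ}{dp} = \frac{\partial J}{\partial p}
            - \lambda^{T} \frac{\partial F}{\partial p}

    so a single adjoint solve yields the sensitivity of J to every parameter at once, which is what makes the technique attractive both for model calibration and for ranking boundary-condition influences.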

  20. High performance polypyrrole coating for corrosion protection and biocidal applications

    Science.gov (United States)

    Nautiyal, Amit; Qiao, Mingyu; Cook, Jonathan Edwin; Zhang, Xinyu; Huang, Tung-Shi

    2018-01-01

    Polypyrrole (PPy) coatings were electrochemically synthesized on carbon steel using sulfonic acids as dopants: p-toluene sulfonic acid (p-TSA), sulfuric acid (SA), (±) camphor sulfonic acid (CSA), sodium dodecyl sulfate (SDS), and sodium dodecylbenzene sulfonate (SDBS). The effect of the acidic dopants (p-TSA, SA, CSA) on passivation of carbon steel was investigated by linear potentiodynamic polarization and compared with the morphology and corrosion protection performance of the coatings produced. The type of dopant used significantly affected the protection efficiency of the coating against chloride ion attack on the metal surface; corrosion performance depends on the size and alignment of the dopant in the polymer backbone. Both p-TSA and SDBS have an extra benzene ring, and these rings stack together to form a lamellar, sheet-like barrier to chloride ions, making them appropriate dopants for PPy coatings for suppressing corrosion at a significant level. Further, adhesion performance was enhanced by adding a long-chain carboxylic acid (decanoic acid) directly to the monomer solution. In addition, the PPy coating doped with SDBS displayed excellent biocidal ability against Staphylococcus aureus. Polypyrrole coatings on carbon steels with the dual function of anti-corrosion and excellent biocidal properties show great potential for industrial anti-corrosion/antimicrobial applications.

  1. Audit Techniques for Service Oriented Architecture Applications

    Directory of Open Access Journals (Sweden)

    Liviu Adrian COTFAS

    2010-01-01

    Full Text Available The Service Oriented Architecture (SOA) approach enables the development of flexible distributed applications. Auditing such applications implies several specific challenges related to interoperability, performance and security. The service oriented architecture model is described and the advantages of this approach are analyzed. We also highlight several quality attributes and potential risks in SOA applications that an architect should be aware of when designing a distributed system. Key risk factors are identified and a model for risk evaluation is introduced. The top reasons for auditing SOA applications are presented, as well as the most important standards. The steps for a successful audit process are given and discussed.

  2. High performance graphics processors for medical imaging applications

    International Nuclear Information System (INIS)

    Goldwasser, S.M.; Reynolds, R.A.; Talton, D.A.; Walsh, E.S.

    1989-01-01

    This paper describes a family of high-performance graphics processors with special hardware for interactive visualization of 3D human anatomy. The basic architecture expands to multiple parallel processors, each processor using pipelined arithmetic and logical units for high-speed rendering of Computed Tomography (CT), Magnetic Resonance (MR) and Positron Emission Tomography (PET) data. User-selectable display alternatives include multiple 2D axial slices, reformatted images in sagittal or coronal planes and shaded 3D views. Special facilities support applications requiring color-coded display of multiple datasets (such as radiation therapy planning), or dynamic replay of time-varying volumetric data (such as cine-CT or gated MR studies of the beating heart). The current implementation is a single processor system which generates reformatted images in true real time (30 frames per second), and shaded 3D views in a few seconds per frame. It accepts full scale medical datasets in their native formats, so that minimal preprocessing delay exists between data acquisition and display

  3. Applications of high power microwaves

    International Nuclear Information System (INIS)

    Benford, J.; Swegle, J.

    1993-01-01

    The authors address a number of applications for HPM technology. There is a strong symbiotic relationship between a developing technology and its emerging applications. New technologies can generate new applications. Conversely, applications can demand development of new technological capability. High-power microwave generating systems come with size and weight penalties and problems associated with the x-radiation and collection of the electron beam. Acceptance of these difficulties requires the identification of a set of applications for which high-power operation is either demanded or results in significant improvements in performance. The authors identify the following applications, and discuss their requirements and operational issues: (1) high-energy RF acceleration; (2) atmospheric modification (both to produce artificial ionospheric mirrors for radio waves and to save the ozone layer); (3) radar; (4) electronic warfare; and (5) laser pumping. In addition, they discuss applications requiring high average power that border on HPM: power beaming and plasma heating.

  4. KrF laser cost/performance model for ICF commercial applications

    International Nuclear Information System (INIS)

    Harris, D.B.; Pendergrass, J.H.

    1985-01-01

    Simple expressions suitable for use in commercial-applications plant parameter studies for the direct capital cost plus indirect field costs and for the efficiency as a function of repetition rate were developed for pure-optical-compression KrF laser fusion drivers. These simple expressions summarize estimates obtained from detailed cost-performance studies incorporating recent results of ongoing physics, design, and cost studies. Contributions of KrF laser capital charges and D and M costs to total levelized constant-dollar (1984) unit ICF power generation cost are estimated as a function of plant size and driver pulse energy, using a published gain for short-wavelength lasers and representative values of plant parameters.

  5. Performance analysis of a waste heat recovery thermoelectric generation system for automotive application

    International Nuclear Information System (INIS)

    Liu, X.; Deng, Y.D.; Li, Z.; Su, C.Q.

    2015-01-01

    Graphical abstract: A new automotive exhaust-based thermoelectric generator and its "four-TEGs" system are constructed, and the performance characteristics of the system are discussed through a road test and a revolving drum test. - Highlights: • The automotive thermoelectric generator system was constructed and studied. • A road test and a revolving drum test were used to measure the output power. • A performance of 201.7 V (open circuit voltage)/944 W was obtained. - Abstract: Thermoelectric power generators are one of the promising green energy sources. In this case study, an energy-harvesting system which extracts heat from an automotive exhaust pipe and turns the heat into electricity using thermoelectric power generators (TEGs) has been constructed. A test bench was developed to analyze the performance characteristics of the TEG system and to assess the feasibility of automotive applications. Based on the test bench, a new system called the "four-TEGs" system was designed and assembled into a prototype vehicle called "Warrior". Through the road test and the revolving drum test, characteristics of the system such as hot-side temperature, cold-side temperature, open circuit voltage and power output were studied, and a maximum power of 944 W was obtained, which fully meets the requirements of the automotive application. The present study shows the promising potential of using this kind of thermoelectric generator for low-temperature waste heat recovery in vehicles.

  6. A study on the influence diagrams for the application to containment performance analysis

    International Nuclear Information System (INIS)

    Park, Joon Won

    1995-02-01

    Influence diagrams have been applied to the containment performance analysis of Young-Gwang 3 and 4 in an effort to explicitly display the dependencies between events and to treat operator intervention more generally. This study was initiated to remove the three major drawbacks of the current event tree methodology: 1) the event tree cannot express dependencies between events explicitly; 2) the Accident Progression Event Tree (APET) cannot represent the entire containment system; 3) it is difficult to consider operator intervention with an event tree. To resolve these problems, a new approach, i.e., influence diagrams, is proposed. In the present work, the applicability of influence diagrams has been demonstrated for the YGN 3 and 4 containment performance analysis and an assessment of accident management strategies. To show that the results of the application of influence diagrams are reasonable, they are compared with those of the YGN 3 and 4 IPE, and both sets of results are in good agreement. In addition, influence diagrams are used to assess two accident management strategies: 1) RCS depressurization, and 2) cavity flooding. Cavity flooding has a favorable effect on late containment failure and basemat melt-through, and depressurization of the RCS is beneficial for steam generator tube rupture; however, the early containment failure probability is worse in both cases. As a result of the present study, it is shown that influence diagrams can be applied to containment performance analysis.

  7. The effect of the use of android-based application in learning together to improve students' academic performance

    Science.gov (United States)

    Ulfa, Andi Maria; Sugiyarto, Kristian H.; Ikhsan, Jaslin

    2017-05-01

    Poor student achievement in Chemistry may result from unfavourable learning processes; therefore, innovation in the learning process must be created. Given the fast development of mobile technology, the learning process cannot ignore the crucial role of that technology. This research and development (R&D) study was done to develop an Android-based application and to study the effect of its integration into Learning Together (LT) on the improvement of students' learning creativity and cognitive achievement. The development of the application was carried out by adapting the Borg & Gall and Dick & Carey models. The developed product was reviewed by chemists, learning media practitioners, peer reviewers, and educators. After revision based on the reviews, the application was used in the LT model on the topic of Stoichiometry in a senior high school. The instruments were a questionnaire to collect comments and suggestions from the reviewers about the application, another questionnaire to collect data on learning creativity, and a set of tests by which data on students' achievement were collected. The results showed that the use of the mobile-based application in Learning Together can bring about significant improvement in students' performance, including creativity and cognitive achievement.

  8. APPLICATION OF GIS AND GROUNDWATER MODELLING TECHNIQUES TO IDENTIFY THE PERCHED AQUIFERS TO DEMARCATE WATER LOGGING CONDITIONS IN PARTS OF MEHSANA

    Directory of Open Access Journals (Sweden)

    D. Rawal

    2016-06-01

    The study highlights the application of GIS in establishing the basic parameters of soil, land use and the distribution of water logging over a period of time; the groundwater modelling identifies the groundwater regime of the area, estimates the total recharge to the area due to surface water irrigation and rainfall, and suggests a suitable method to control water logging in the area.

  9. Robust design principles for reducing variation in functional performance

    DEFF Research Database (Denmark)

    Christensen, Martin Ebro; Howard, Thomas J.

    2016-01-01

    This paper identifies, describes and classifies a comprehensive collection of variation reduction principles (VRP) that can be used to increase the robustness of a product and reduce its variation in functional performance. Performance variation has a negative effect on the reliability and perceived quality of a product and efforts should be made to minimise it. The design principles are identified by a systematic decomposition of the Taguchi Transfer Function in combination with the use of existing literature and the authors' experience. The paper presents 15 principles and describes their advantages and disadvantages along with example cases. Subsequently, the principles are classified based on their applicability in the various development and production stages. The VRP are to be added to existing robust design methodologies, helping the designer to think beyond robust design tools and methods.
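
    The mechanism such principles exploit can be stated compactly: to first order, Var(y) = f'(x)^2 * Var(x), so moving the nominal design point to a flatter region of the transfer function reduces the variation transmitted to functional performance. A minimal sketch in Python, with an invented transfer function standing in for a real Taguchi Transfer Function:

        import numpy as np

        f = lambda x: np.sin(x)             # toy transfer function y = f(x)
        dfdx = lambda x: np.cos(x)          # its sensitivity to the noise factor

        var_x = 0.01                        # variance of the input (tolerance)
        for x0 in (0.0, 1.0, np.pi / 2):    # candidate nominal design points
            var_y = dfdx(x0) ** 2 * var_x   # first-order transmitted variation
            print(f"x0 = {x0:.2f}: Var(y) = {var_y:.4f}")

    At x0 = pi/2 the sensitivity vanishes and the transmitted variation approaches zero, which is the effect several of the variation reduction principles aim for.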

  10. USING DISTANCE SENSORS TO PERFORM COLLISION AVOIDANCE MANEUVRES ON UAV APPLICATIONS

    Directory of Open Access Journals (Sweden)

    A. Raimundo

    2017-08-01

    Full Text Available Unmanned Aerial Vehicles (UAVs) and their applications are growing for both civilian and military purposes. The operability of a UAV has proved that some tasks and operations can be done easily and at a good cost-efficiency ratio. Nowadays, a UAV can perform autonomous missions, which is very useful in certain UAV applications, such as meteorology, vigilance systems, agriculture, environment mapping and search and rescue operations. One of the biggest problems that a UAV faces is the possibility of collision with other objects in the flight area. To prevent this, a "Sense and Avoid" algorithm was developed and implemented as a system for UAVs to avoid objects on a collision course. This algorithm uses a Light Detection and Ranging (LiDAR) sensor to detect objects facing the UAV in mid-flight. The light sensor is connected to on-board hardware, Pixhawk's flight controller, which interfaces its communications with another piece of hardware: a Raspberry Pi. Communications between the Ground Control Station and the UAV are made via Wi-Fi or third or fourth generation cellular networks (3G/4G). Tests were made in order to evaluate the "Sense and Avoid" algorithm's overall performance, in two different environments: a 3D simulated environment and a real outdoor environment. Both modes worked successfully in the simulated 3D environment, and the "Brake" mode worked in the real outdoor environment, proving the concept.
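
    As an illustration of the braking logic such a system might use, the sketch below (Python) checks a forward LiDAR range against a safety threshold; the function name and the 5 m threshold are assumptions for illustration, not the authors' implementation.

        BRAKE_DISTANCE_M = 5.0  # assumed safety threshold, not from the paper

        def sense_and_avoid_step(distance_m: float) -> str:
            """Return the flight command for one LiDAR reading."""
            if distance_m <= BRAKE_DISTANCE_M:
                return "BRAKE"     # hold position until the path is clear
            return "CONTINUE"      # keep following the mission plan

        # Example stream of forward ranges (metres) approaching an obstacle
        for reading in [22.4, 11.8, 4.9, 3.7, 14.2]:
            print(reading, sense_and_avoid_step(reading))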

  11. Using Distance Sensors to Perform Collision Avoidance Maneuvres on Uav Applications

    Science.gov (United States)

    Raimundo, A.; Peres, D.; Santos, N.; Sebastião, P.; Souto, N.

    2017-08-01

    Unmanned Aerial Vehicles (UAVs) and their applications are growing for both civilian and military purposes. The operability of a UAV has proved that some tasks and operations can be done easily and at a good cost-efficiency ratio. Nowadays, a UAV can perform autonomous missions, which is very useful in certain UAV applications, such as meteorology, vigilance systems, agriculture, environment mapping and search and rescue operations. One of the biggest problems that a UAV faces is the possibility of collision with other objects in the flight area. To prevent this, a "Sense and Avoid" algorithm was developed and implemented as a system for UAVs to avoid objects on a collision course. This algorithm uses a Light Detection and Ranging (LiDAR) sensor to detect objects facing the UAV in mid-flight. The light sensor is connected to on-board hardware, Pixhawk's flight controller, which interfaces its communications with another piece of hardware: a Raspberry Pi. Communications between the Ground Control Station and the UAV are made via Wi-Fi or third or fourth generation cellular networks (3G/4G). Tests were made in order to evaluate the "Sense and Avoid" algorithm's overall performance, in two different environments: a 3D simulated environment and a real outdoor environment. Both modes worked successfully in the simulated 3D environment, and the "Brake" mode worked in the real outdoor environment, proving the concept.

  12. Network analysis of patient flow in two UK acute care hospitals identifies key sub-networks for A&E performance.

    Science.gov (United States)

    Bean, Daniel M; Stringer, Clive; Beeknoo, Neeraj; Teo, James; Dobson, Richard J B

    2017-01-01

    The topology of the patient flow network in a hospital is complex, comprising hundreds of overlapping patient journeys, and is a determinant of operational efficiency. To understand the network architecture of patient flow, we performed a data-driven network analysis of patient flow through two acute hospital sites of King's College Hospital NHS Foundation Trust. Administrative databases were queried for all intra-hospital patient transfers in an 18-month period and modelled as a dynamic weighted directed graph. A 'core' subnetwork containing only 13-17% of all edges channelled 83-90% of the patient flow, while an 'ephemeral' network constituted the remainder. Unsupervised cluster analysis and differential network analysis identified sub-networks where traffic is most associated with A&E performance. Increased flow to clinical decision units was associated with the best A&E performance at both sites. The component analysis also detected a weekend effect on patient transfers which was not associated with performance. We have performed the first data-driven, hypothesis-free analysis of patient flow, which can enhance understanding of whole healthcare systems. Such analysis can drive transformation in healthcare as it has in industries such as manufacturing.
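
    A minimal sketch of the core/ephemeral decomposition described above, using the networkx library on invented transfer counts; the greedy 85% flow cutoff stands in for the paper's exact core definition and is an assumption.

        import networkx as nx

        # (from_ward, to_ward, number_of_transfers) - invented example data
        transfers = [("A&E", "CDU", 900), ("A&E", "Ward1", 350),
                     ("CDU", "Ward1", 500), ("Ward1", "Ward2", 40),
                     ("Ward2", "Discharge", 30), ("CDU", "Discharge", 700)]

        G = nx.DiGraph()
        for src, dst, w in transfers:
            G.add_edge(src, dst, weight=w)

        # Greedily take the heaviest edges until ~85% of flow is covered
        total = sum(w for _, _, w in transfers)
        core, covered = [], 0
        for u, v, d in sorted(G.edges(data=True), key=lambda e: -e[2]["weight"]):
            core.append((u, v))
            covered += d["weight"]
            if covered / total >= 0.85:
                break

        print(f"{len(core)}/{G.number_of_edges()} edges carry "
              f"{covered / total:.0%} of patient flow")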

  13. Application of balanced score card in the development of performance indicator system in nuclear power plant

    International Nuclear Information System (INIS)

    Shen Shuguang; Huang Fang; Fang Zhaoxia

    2013-01-01

    Performance indicators, one of the ten performance monitoring tools recommended by the WANO performance improvement model, have become an effective tool for performance improvement at nuclear power plants. At present, performance indicator systems have been built in nuclear power plants; however, how to establish a performance indicator system that is reasonable and applicable for a plant is still a question to be discussed. Performance indicators are closely tied to the strategic direction of a corporation by a balanced score card, and the performance indicator system is established from the point of view of performance management and strategic development. The performance indicator system of a nuclear power plant is developed by introducing the balanced score card, and can serve as a reference for other domestic nuclear power plants. (authors)

  14. Multidimensional evaluation of performance with experimental application of balanced scorecard: a two year experience

    Directory of Open Access Journals (Sweden)

    Antonioli Paola

    2011-05-01

    Full Text Available Background: In today's dynamic health-care system, organizations such as hospitals are required to improve their performance for multiple stakeholders and to deliver integrated care, which means working effectively, being innovative and organizing efficiently. Achieved goals and levels of quality can be successfully measured by a multidimensional approach like the Balanced Scorecard (BSC). The aim of the study was to verify the opportunity to introduce the BSC framework to measure performance in St. Anna University Hospital of Ferrara, applying it to the Clinical Laboratory Operative Unit in order to compare performance results and achievement of assigned targets over time. Methods: In the first experience with the BSC we distinguished four perspectives, according to Kaplan and Norton, identified Key Performance Areas and Key Performance Indicators, set standards and weights for each objective, collected data for all indicators, and recognized cause-and-effect relationships in a strategic map. One year later we proceeded with the next data collection and analysed whether the framework preserved its aptitude to measure Operative Unit performance. In addition, we verified its ability to underline links between strategic actions belonging to different perspectives in producing changes in outcomes. Results: The BSC was found to be effective for underlining existing problems and identifying opportunities for improvement. The BSC also revealed each perspective's specific contribution to overall performance enhancement. Comparison of results over time was possible depending on the selection of feasible and appropriate key performance indicators, which was occasionally limited by data collection problems. Conclusions: This first use of the BSC to compare performance at the Operative Unit level over time suggests that the framework can be successfully adopted for measuring results and revealing effective health factors, allowing health-care quality improvements.
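
    A minimal sketch of the scoring step implied above: each Key Performance Indicator carries a target (standard) and a weight, and attainments roll up into an overall index. Indicator names and numbers are invented for illustration; the hospital's actual KPIs and weighting scheme are not reproduced here.

        indicators = [
            # (perspective, KPI, actual, target, weight) - invented values
            ("Customer",  "reports delivered on time (%)",  92.0,  95.0, 0.30),
            ("Internal",  "samples without rework (%)",     97.0,  98.0, 0.25),
            ("Financial", "budget adherence (%)",           99.0, 100.0, 0.25),
            ("Learning",  "staff completing training (%)",  88.0,  90.0, 0.20),
        ]

        overall = 0.0
        for perspective, kpi, actual, target, weight in indicators:
            attainment = min(actual / target, 1.0)  # cap over-achievement
            overall += weight * attainment
            print(f"{perspective:10s} {kpi}: {attainment:.1%} of target")

        print(f"Overall BSC index: {overall:.1%}")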

  15. Application of ANN-SCE model on the evaluation of automatic generation control performance

    Energy Technology Data Exchange (ETDEWEB)

    Chang-Chien, L.R.; Lo, C.S.; Lee, K.S. [National Cheng Kung Univ., Tainan, Taiwan (China)

    2005-07-01

    An accurate evaluation of load frequency control (LFC) performance is needed to balance minute-to-minute electricity generation and demand. In this study, an artificial neural network-based system control error (ANN-SCE) model was used to assess the performance of automatic generation control (AGC). The model was used to identify system dynamics for control references in supplementing AGC logic. The ANN control error model was used to track a single area's LFC dynamics in Taiwan and to gauge the impacts of regulation control. Results of the training, evaluation, and projection processes showed that the ANN-SCE model could be algebraically decomposed into components corresponding to different impact factors. The SCE information obtained from testing various AGC gains provided data for the creation of a new control approach. The ANN-SCE model was used in conjunction with load forecasting and scheduled generation data to create an ANN-SCE identifier. The model successfully simulated SCE dynamics. 13 refs., 10 figs.

  16. Whole genome association study identifies regions of the bovine genome and biological pathways involved in carcass trait performance in Holstein-Friesian cattle.

    Science.gov (United States)

    Doran, Anthony G; Berry, Donagh P; Creevey, Christopher J

    2014-10-01

    Four traits related to carcass performance have been identified as economically important in beef production: carcass weight, carcass fat, carcass conformation of progeny and cull cow carcass weight. Although Holstein-Friesian cattle are primarily utilized for milk production, they are also an important source of meat for beef production and export. Because of this, there is great interest in understanding the underlying genomic structure influencing these traits. Several genome-wide association studies have identified regions of the bovine genome associated with growth or carcass traits; however, little is known about the mechanisms or underlying biological pathways involved. This study aims to detect regions of the bovine genome associated with carcass performance traits (employing a panel of 54,001 SNPs) using measures of genetic merit (as predicted transmitting abilities) for 5,705 Irish Holstein-Friesian animals. Candidate genes and biological pathways were then identified for each trait under investigation. Following adjustment for false discovery (q-value), SNPs associated with carcass traits were identified using a single SNP regression approach. Using a Bayesian approach, 46 QTL were associated (posterior probability > 0.5) with at least one of the four traits. In total, 557 unique bovine genes, which mapped to 426 human orthologs, were within 500 kb of QTL found associated with a trait using the Bayesian approach. Using this information, 24 significantly over-represented pathways were identified across all traits. The most significantly over-represented biological pathway was the peroxisome proliferator-activated receptor (PPAR) signaling pathway. A large number of genomic regions putatively associated with bovine carcass traits were detected using two different statistical approaches. Notably, several significant associations were detected in close proximity to genes with a known role in animal growth, such as glucagon and leptin. Several biological pathways, including PPAR signaling, were also identified.
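
    The false-discovery adjustment step can be illustrated with Benjamini-Hochberg q-values over per-SNP p-values; the p-values below are invented, and the paper's exact q-value procedure may differ (Storey-type q-values are also common in GWAS).

        import numpy as np

        def bh_qvalues(pvals):
            """Benjamini-Hochberg step-up q-values for a vector of p-values."""
            p = np.asarray(pvals, dtype=float)
            n = len(p)
            order = np.argsort(p)
            scaled = p[order] * n / np.arange(1, n + 1)   # p * n / rank
            # enforce monotone q-values from the largest p-value downwards
            q = np.minimum.accumulate(scaled[::-1])[::-1]
            out = np.empty(n)
            out[order] = np.clip(q, 0.0, 1.0)
            return out

        pvals = [0.0002, 0.0090, 0.0300, 0.2000, 0.6500]  # invented p-values
        print(bh_qvalues(pvals))          # report SNPs whose q-value is small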

  17. Application of advanced validation concepts to oxide fuel performance codes: LIFE-4 fast-reactor and FRAPCON thermal-reactor fuel performance codes

    Energy Technology Data Exchange (ETDEWEB)

    Unal, C., E-mail: cu@lanl.gov [Los Alamos National Laboratory, P.O. Box 1663, Los Alamos, NM 87545 (United States); Williams, B.J. [Los Alamos National Laboratory, P.O. Box 1663, Los Alamos, NM 87545 (United States); Yacout, A. [Argonne National Laboratory, 9700 S. Cass Avenue, Lemont, IL 60439 (United States); Higdon, D.M. [Los Alamos National Laboratory, P.O. Box 1663, Los Alamos, NM 87545 (United States)

    2013-10-15

    Highlights: ► The application of advanced validation techniques (sensitivity, calibration and prediction) to nuclear performance codes FRAPCON and LIFE-4 is the focus of the paper. ► A sensitivity ranking methodology narrows down the number of selected modeling parameters from 61 to 24 for FRAPCON and from 69 to 35 for LIFE-4. ► Fuel creep, fuel thermal conductivity, fission gas transport/release, crack/boundary, and fuel gap conductivity models of LIFE-4 are identified for improvements. ► FRAPCON sensitivity results indicated the importance of the fuel thermal conduction and the fission gas release models. -- Abstract: Evolving nuclear energy programs expect to use enhanced modeling and simulation (M and S) capabilities, using multiscale, multiphysics modeling approaches, to reduce both cost and time from the design through the licensing phases. Interest in the development of the multiscale, multiphysics approach has increased in the last decade because of the need for predictive tools for complex interacting processes as a means of eliminating the limited use of empirically based model development. Complex interacting processes cannot be predicted by analyzing each individual component in isolation. In most cases, the mathematical models of complex processes and their boundary conditions are nonlinear. As a result, the solutions of these mathematical models often require high-performance computing capabilities and resources. The use of multiscale, multiphysics (MS/MP) models in conjunction with high-performance computational software and hardware introduces challenges in validating these predictive tools—traditional methodologies will have to be modified to address these challenges. The advanced MS/MP codes for nuclear fuels and reactors are being developed within the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program of the US Department of Energy (DOE) – Nuclear Energy (NE). This paper does not directly address challenges in calibration.

  18. Mobile phone application for mathematics learning

    Science.gov (United States)

    Supandi; Ariyanto, L.; Kusumaningsih, W.; Aini, A. N.

    2018-03-01

    This research aimed to determine the role of a Mobile Phone Application (MPA) in mathematics learning. A pre- and post-test quasi-experiment method was applied. The pre-test was performed to understand initial capability, while the post-test was used to identify changes in student ability after the students were introduced to the mobile technology application. Student responses to the use of this application were evaluated by a questionnaire. Based on the questionnaire, high scores were achieved, indicating the students' interest in this application. Also, learning results showed significant improvement in learning achievement and student learning behaviour. It was concluded that education supported by the MPA had a positive impact on learning outcomes as well as the learning atmosphere both in class and outside the classroom.

  19. Application of controllable unit approach (CUA) to performance-criterion-based nuclear material control and accounting

    International Nuclear Information System (INIS)

    Foster, K.W.; Rogers, D.R.

    1979-01-01

    The Nuclear Regulatory Commission is considering the use of maximum-loss performance criteria as a means of controlling SNM in nuclear plants. The Controllable Unit Approach to material control and accounting (CUA) was developed by Mound to determine the feasibility of controlling a plant to a performance criterion. The concept was tested with the proposed Anderson, SC, mixed-oxide plant, and it was shown that CUA is indeed a feasible method for controlling a complex process to a performance criterion. The application of CUA to an actual low-enrichment plant to assist the NRC in establishing performance criteria for uranium processes is discussed. 5 refs

  20. Design and performance evaluation of a Hall effect magnetic compass for oceanographic and meteorological applications

    Digital Repository Service at National Institute of Oceanography (India)

    Joseph, A.; Desai, R.G.P.; Agarvadekar, Y.; Tengali, T.; Mishra, M.; Fadate, C.; Gomes, L.

    A Hall effect magnetic compass, suitable for oceanographic and meteorological applications, has been designed and its performance characteristics have been evaluated. The slope of the least-squares-fitted linear graph was found to be close to the ideal...

  1. Development of high performance Schottky barrier diode and its application to plasma diagnostics

    International Nuclear Information System (INIS)

    Fujita, Junji; Kawahata, Kazuo; Okajima, Shigeki

    1993-10-01

    At the conclusion of the Supporting Collaboration Research on 'Development of High Performance Detectors in the Far Infrared Range', carried out from FY1990 to FY1992, the results of developing the Schottky barrier diode and its application to plasma diagnostics are summarized. Some remarks as well as technical know-how for the correct use of the diodes are also described. (author)

  2. Multitasking TORT Under UNICOS: Parallel Performance Models and Measurements

    International Nuclear Information System (INIS)

    Azmy, Y.Y.; Barnett, D.A.

    1999-01-01

    The existing parallel algorithms in the TORT discrete ordinates code were updated to function in a UNICOS environment. A performance model for the parallel overhead was derived for the existing algorithms. The largest contributors to the parallel overhead were identified and a new algorithm was developed. A parallel overhead model was also derived for the new algorithm. The predictions of the parallel performance models were compared to applications of the code to two TORT standard test problems and a large production problem. The parallel performance models agree well with the measured parallel overhead.

  3. Multitasking TORT under UNICOS: Parallel performance models and measurements

    International Nuclear Information System (INIS)

    Barnett, A.; Azmy, Y.Y.

    1999-01-01

    The existing parallel algorithms in the TORT discrete ordinates code were updated to function in a UNICOS environment. A performance model for the parallel overhead was derived for the existing algorithms. The largest contributors to the parallel overhead were identified and a new algorithm was developed. A parallel overhead model was also derived for the new algorithm. The predictions of the parallel performance models were compared to applications of the code to two TORT standard test problems and a large production problem. The parallel performance models agree well with the measured parallel overhead.

  4. Performance analysis of multidimensional wavefront algorithms with application to deterministic particle transport

    International Nuclear Information System (INIS)

    Hoisie, A.; Lubeck, O.; Wasserman, H.

    1998-01-01

    The authors develop a model for the parallel performance of algorithms that consist of concurrent, two-dimensional wavefronts implemented in a message passing environment. The model, based on a LogGP machine parameterization, combines the separate contributions of computation and communication wavefronts. They validate the model on three important supercomputer systems, on up to 500 processors, using data from a deterministic particle transport application taken from the ASCI workload, although the model is general to any wavefront algorithm implemented on a 2-D processor domain. They also use the validated model to make estimates of the performance and scalability of wavefront algorithms on 100-TFLOPS computer systems expected to be in existence within the next decade as part of the ASCI program and elsewhere. In this context, the authors analyze two problem sizes. Their model shows that on the largest such problem (1 billion cells), inter-processor communication performance is not the bottleneck; single-node efficiency is the dominant factor.
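
    The shape of such a model can be sketched in a few lines: a 2-D wavefront fills a diagonal pipeline of (Px - 1) + (Py - 1) stages before every processor is busy. The constant per-stage costs below collapse the LogGP communication terms into a single t_comm and are placeholders, not measured values.

        def wavefront_time(px, py, n_steps, t_comp, t_comm):
            """Pipelined 2-D sweep: fill the pipeline, then stream steps."""
            pipeline_fill = (px - 1) + (py - 1)  # stages before full overlap
            total_stages = pipeline_fill + n_steps
            return total_stages * (t_comp + t_comm)

        # 500 processors arranged as a 25 x 20 grid, 1000 wavefront steps
        print(wavefront_time(25, 20, 1000, t_comp=2.0e-4, t_comm=5.0e-5))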

  5. Robust approximation-free prescribed performance control for nonlinear systems and its application

    Science.gov (United States)

    Sun, Ruisheng; Na, Jing; Zhu, Bin

    2018-02-01

    This paper presents a robust prescribed performance control approach and its application to nonlinear tail-controlled missile systems with unknown dynamics and uncertainties. The idea of prescribed performance function (PPF) is incorporated into the control design, such that both the steady-state and transient control performance can be strictly guaranteed. Unlike conventional PPF-based control methods, we further tailor a recently proposed systematic control design procedure (i.e. approximation-free control) using the transformed tracking error dynamics, which provides a proportional-like control action. Hence, the function approximators (e.g. neural networks, fuzzy systems) that are widely used to address the unknown nonlinearities in the nonlinear control designs are not needed. The proposed control design leads to a robust yet simplified function approximation-free control for nonlinear systems. The closed-loop system stability and the control error convergence are all rigorously proved. Finally, comparative simulations are conducted based on nonlinear missile systems to validate the improved response and the robustness of the proposed control method.
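
    The core transformation can be sketched briefly: the tracking error e(t) is confined inside a decaying envelope rho(t), and the controller acts proportionally on the transformed (unconstrained) error. The envelope form and all parameter values below are illustrative assumptions, not the paper's tuned design.

        import math

        def ppf(t, rho0=1.0, rho_inf=0.05, l=1.0):
            """Exponentially decaying performance envelope rho(t)."""
            return (rho0 - rho_inf) * math.exp(-l * t) + rho_inf

        def transformed_error(e, t):
            """Map e(t) in (-rho, rho) onto an unconstrained variable."""
            z = e / ppf(t)                             # normalised error, |z| < 1
            return 0.5 * math.log((1 + z) / (1 - z))   # inverse tanh

        k = 2.0  # proportional-like, approximation-free control gain
        for t, e in [(0.0, 0.6), (1.0, 0.2), (3.0, 0.03)]:
            eps = transformed_error(e, t)
            print(f"t={t}: envelope={ppf(t):.3f}, u={-k * eps:.3f}")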

  6. Virginia Power's human performance evaluation system (HPES)

    International Nuclear Information System (INIS)

    Patterson, W.E.

    1991-01-01

    This paper reports on the Human Performance Evaluation System (HPES), which was initially developed by the Institute of Nuclear Power Operations (INPO) using the Aviation Safety Reporting System (ASRS) as a guide. After a pilot program involving three utilities ended in 1983, the present-day program was instituted. A methodology was developed, for specific application to nuclear power plant employees, to aid trained coordinators/evaluators in determining those factors that exert a negative influence on human behavior in the nuclear power plant environment. HPES is for everyone on site, from contractors to plant staff to plant management; no one is excluded from participation. Virginia Power's HPES program goal is to identify and correct the root causes of human performance problems. Evaluations are performed on reported real or perceived conditions that may have an adverse influence on members of the nuclear team. A report is provided to management identifying the root cause and contributing factors along with recommended corrective actions.

  7. The impact of the degree of application of e-commerce on operational performance among Taiwan's high-tech manufacturers

    Directory of Open Access Journals (Sweden)

    Yi-Chan Chung

    2013-11-01

    Full Text Available This study probes the correlation between types of operational strategy, degrees of organisational learning, types of organisational culture, the degree of application of e-commerce, and operational performance among high-tech firms in Taiwan. The data were collected by questionnaires distributed via mail to senior supervisors at high-tech firms in six industries at three Taiwanese science parks. The results showed that a higher degree of e-commerce application has a significant and positive effect on operational performance. This study suggests that, in order to upgrade operational performance, firms should enhance their organisational learning and e-commerce, along with their rational, hierarchical, consensual, and developmental cultures, and the execution of prospector and defender strategies.

  8. A Framework for Treating Uncertainty to Facilitate Waste Disposal Decision Making - Application of the Approach to GCD Performance Assessment

    International Nuclear Information System (INIS)

    Brown, T.J.; Cochran, J.R.; Gallegos, D.P.

    1999-01-01

    This paper presents an approach for treating uncertainty in the performance assessment process to efficiently address regulatory performance objectives for radioactive waste disposal and discusses the application of the approach at the Greater Confinement Disposal site. In this approach, the performance assessment methodology uses probabilistic risk assessment concepts to guide effective decisions about site characterization activities and provides a path toward reasonable assurance regarding regulatory compliance decisions. Although the approach is particularly amenable to requirements that are probabilistic in nature, the approach is also applicable to deterministic standards such as the dose-based and concentration-based requirements

  9. Experimental Performance Evaluation of a Supersonic Turbine for Rocket Engine Applications

    Science.gov (United States)

    Snellgrove, Lauren M.; Griffin, Lisa W.; Sieja, James P.; Huber, Frank W.

    2003-01-01

    In order to mitigate the risk of rocket propulsion development, efficient, accurate, detailed fluid dynamics analysis and testing of the turbomachinery is necessary. To support this requirement, a task was developed at NASA Marshall Space Flight Center (MSFC) to improve turbine aerodynamic performance through the application of advanced design and analysis tools. These tools were applied to optimize a supersonic turbine design suitable for a reusable launch vehicle (RLV). The hot gas path and blading were redesigned to obtain increased efficiency. The goal of the demonstration was to increase the total-to-static efficiency of the turbine by eight points over the baseline design. A sub-scale, cold-flow test article modeling the final optimized turbine was designed, manufactured, and tested in air at MSFC's Turbine Airflow Facility. Extensive on- and off-design point performance data, steady-state data, and unsteady blade loading data were collected during testing.

  10. Identifying Memory Allocation Patterns in HEP Software

    Science.gov (United States)

    Kama, S.; Rauschmayr, N.

    2017-10-01

    HEP applications perform an excessive amount of allocations/deallocations within short time intervals, which results in memory churn, poor locality and performance degradation. These issues have been known for a decade, but due to the complexity of software frameworks and billions of allocations for a single job, until recently no efficient mechanism has been available to correlate these issues with source code lines. However, with the advent of the Big Data era, many tools and platforms are now available to do large-scale memory profiling. This paper presents a prototype program developed to track and identify each single (de-)allocation. The CERN IT Hadoop cluster is used to compute memory key metrics, like locality, variation, lifetime and density of allocations. The prototype further provides a web-based visualization back-end that allows the user to explore the results generated on the Hadoop cluster. Plotting these metrics for every single allocation over time gives new insight into an application's memory handling. For instance, it shows which algorithms cause which kind of memory allocation patterns, which function flow causes how many short-lived objects, what the most commonly allocated sizes are, etc. The paper gives an insight into the prototype and shows profiling examples for the LHC reconstruction, digitization and simulation jobs.
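
    A small-scale analogue of this idea is available in Python's built-in tracemalloc module; the HEP prototype instruments C++ (de-)allocations at far larger scale, so the sketch below only illustrates grouping allocations by source line to expose an allocation pattern.

        import tracemalloc

        def churn():
            # many short-lived objects: the pattern the paper highlights
            return [[i] * 8 for i in range(10000)]

        tracemalloc.start()
        data = churn()   # keep the allocations alive for the snapshot
        snapshot = tracemalloc.take_snapshot()

        for stat in snapshot.statistics("lineno")[:3]:
            print(stat)  # size, count and the source line responsible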

  11. Performance of a transmutation advanced device for sustainable energy application

    International Nuclear Information System (INIS)

    Garcia, C.; Rosales, J.; Garcia, L.; Perez-Navarro, A.; Escriva, A.; Abanades, A.

    2009-01-01

    Preliminary studies have been performed to design a device for nuclear waste transmutation and hydrogen generation based on a gas-cooled pebble bed accelerator driven system, TADSEA (transmutation advanced device for sustainable energy application). In previous studies we have addressed the viability of an ADS transmutation device that uses as fuel wastes from existing LWR power plants, encapsulated in graphite in the form of pebble beds and cooled by helium, which enables high temperatures, on the order of 1200 K, to facilitate hydrogen generation from water either by high-temperature electrolysis or by thermochemical cycles. To design this device several configurations were studied, including several reactor thicknesses, to achieve the desired parameters: the transmutation of nuclear waste and the production of 100 MW of thermal power. In this paper we present new studies performed on a deep-burn in-core fuel management strategy for LWR waste. We analyze the fuel cycle of the TADSEA device based on the driver and transmutation fuel that were proposed for the General Atomics design of a gas turbine-modular helium reactor. We compare the transmutation results of three fuel management strategies, using driver, transmutation, and standard LWR spent fuel, and present several parameters that describe the neutron performance of the TADSEA nuclear core, such as the fuel and moderator temperature reactivity coefficients and the transmutation chain. (author)

  12. Performance of a transmutation advanced device for sustainable energy application

    Energy Technology Data Exchange (ETDEWEB)

    Garcia, C.; Rosales, J.; Garcia, L. [Instituto Superior de Tecnologias y Ciencias Aplicadas (INSTEC), La Habana (Cuba); Perez-Navarro, A.; Escriva, A. [Universidad Politecnica de Valencia, Valencia (Spain). Inst. de Ingenieria Energetica; Abanades, A. [Universidad Politecnica de Madrid (Spain). Grupo de Modelizacion de Sistemas Termoenergeticos

    2009-07-01

    Preliminary studies have been performed to design a device for nuclear waste transmutation and hydrogen generation based on a gas-cooled pebble bed accelerator driven system, TADSEA (transmutation advanced device for sustainable energy application). In previous studies we have addressed the viability of an ADS transmutation device that uses as fuel wastes from existing LWR power plants, encapsulated in graphite in the form of pebble beds and cooled by helium, which enables high temperatures, on the order of 1200 K, to facilitate hydrogen generation from water either by high-temperature electrolysis or by thermochemical cycles. To design this device several configurations were studied, including several reactor thicknesses, to achieve the desired parameters: the transmutation of nuclear waste and the production of 100 MW of thermal power. In this paper we present new studies performed on a deep-burn in-core fuel management strategy for LWR waste. We analyze the fuel cycle of the TADSEA device based on the driver and transmutation fuel that were proposed for the General Atomics design of a gas turbine-modular helium reactor. We compare the transmutation results of three fuel management strategies, using driver, transmutation, and standard LWR spent fuel, and present several parameters that describe the neutron performance of the TADSEA nuclear core, such as the fuel and moderator temperature reactivity coefficients and the transmutation chain. (author)

  13. Improving coal mining production performance through the application of total production management

    Energy Technology Data Exchange (ETDEWEB)

    Emery, J.C. [Devman Consulting Pty Ltd. (Australia)

    1998-12-31

    This paper describes the application of the Total Productive Management (TPM) technique as a performance improvement initiative for a coal mining operation. It discusses the objectives of TPM, with the driver for improved production performance being the Overall Equipment Effectiveness (OEE) of the equipment or process, and with the development of ownership as the behavioral approach to equipment management and continuous improvement through cross-functional and area-based teams. It illustrates the concept of equipment management as defects management. The scope for application of TPM to the coal mining industry is immense. The harshness of the operating environment can be a major generator of equipment defects, and a current paradigm in the industry accepts these defects as an unavoidable outcome defining maintenance costs in this environment. However, recent benchmarking studies have highlighted that maintenance costs per operating hour in some mining operations are more than double the vendor's estimate of best practice. The paper refers to these studies, which also compare maintenance costs of fixed and mobile plant and equipment to best-practice outcomes in comparable process industries. The ultimate goal of any operating strategy must be to translate results to the bottom line through adding revenue from increased volume and quality of operations output, better safety performance, and reducing costs of production through lower operating and maintenance costs. These lower costs result from removal of defect generators, improved maintenance planning, and identification and reduction of hidden operating costs resulting from poor equipment maintenance. Finally the paper outlines the minesite procedures required for successful implementation of TPM to sustain these desired results for all stakeholders. 3 refs., 6 figs.

  14. Performance analyses of Z-source and quasi Z-source inverter for photovoltaic applications

    Science.gov (United States)

    Himabind, S.; Priya, T. Hari; Manjeera, Ch.

    2018-04-01

    This paper presents a comparative analysis of the Z-source and quasi Z-source converters for renewable energy applications. Due to the dependency of renewable energy sources on external weather conditions, the output voltage and current change accordingly, which affects the performance of traditional voltage source and current source inverters connected across them. To overcome the drawbacks of the VSI and CSI, the Z-source inverter and quasi Z-source inverter (QZSI) are used, which can perform multiple tasks like ac-to-dc, dc-to-ac, ac-to-ac, and dc-to-dc conversion. They can be used for both buck and boost operations by utilizing the shoot-through zero state. The QZSI is derived from the ZSI topology, with a slight change in the impedance network, and it overcomes the drawbacks of the ZSI; notably, the QZSI draws a constant current from the source when compared to the ZSI. A comparative analysis is performed between the Z-source and quasi Z-source inverters; the simulation is performed in the MATLAB/Simulink environment.
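
    The buck/boost capability mentioned above rests on the commonly quoted shoot-through boost factor B = 1/(1 - 2*D0), where D0 is the shoot-through duty ratio. The sketch below evaluates it for an assumed 150 V photovoltaic input; the numbers are illustrative only.

        def dc_link_peak(v_in, d0):
            """Peak DC-link voltage for shoot-through duty ratio d0 (< 0.5)."""
            assert 0.0 <= d0 < 0.5, "shoot-through duty ratio must stay below 0.5"
            boost = 1.0 / (1.0 - 2.0 * d0)
            return boost * v_in

        # A sagging 150 V PV input boosted by raising the shoot-through ratio
        for d0 in (0.0, 0.1, 0.2, 0.3):
            print(f"D0 = {d0}: peak DC-link voltage = {dc_link_peak(150.0, d0):.0f} V")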

  15. Enabling Structured Exploration of Workflow Performance Variability in Extreme-Scale Environments

    Energy Technology Data Exchange (ETDEWEB)

    Kleese van Dam, Kerstin; Stephan, Eric G.; Raju, Bibi; Altintas, Ilkay; Elsethagen, Todd O.; Krishnamoorthy, Sriram

    2015-11-15

    Workflows are taking an increasingly important role in orchestrating complex scientific processes in extreme-scale and highly heterogeneous environments. However, to date we cannot reliably predict, understand, and optimize workflow performance. Sources of performance variability, and in particular the interdependencies of workflow design, execution environment and system architecture, are not well understood. While there is a rich portfolio of tools for performance analysis, modeling and prediction for single applications in homogeneous computing environments, these are not applicable to workflows, due to the number and heterogeneity of the involved workflow and system components and their strong interdependencies. In this paper, we investigate workflow performance goals and identify factors that could have a relevant impact. Based on our analysis, we propose a new workflow performance provenance ontology, the Open Provenance Model-based WorkFlow Performance Provenance, or OPM-WFPP, that will enable the empirical study of workflow performance characteristics and variability, including complex source attribution.

  16. Structural identifiability of systems biology models: a critical comparison of methods.

    Directory of Open Access Journals (Sweden)

    Oana-Teodora Chis

    Full Text Available Analysing the properties of a biological system through in silico experimentation requires a satisfactory mathematical representation of the system including accurate values of the model parameters. Fortunately, modern experimental techniques allow obtaining time-series data of appropriate quality which may then be used to estimate unknown parameters. However, in many cases, a subset of those parameters may not be uniquely estimated, independently of the experimental data available or the numerical techniques used for estimation. This lack of identifiability is related to the structure of the model, i.e. the system dynamics plus the observation function. Despite the interest in knowing a priori whether there is any chance of uniquely estimating all model unknown parameters, the structural identifiability analysis for general non-linear dynamic models is still an open question. There is no method amenable to every model, thus at some point we have to face the selection of one of the possibilities. This work presents a critical comparison of the currently available techniques. To this end, we perform the structural identifiability analysis of a collection of biological models. The results reveal that the generating series approach, in combination with identifiability tableaus, offers the most advantageous compromise among range of applicability, computational complexity and information provided.

  17. Performance specification methodology: introduction and application to displays

    Science.gov (United States)

    Hopper, Darrel G.

    1998-09-01

    Acquisition reform is based on the notion that DoD must rely on the commercial marketplace insofar as possible rather than solely looking inward to a military marketplace to meet its needs. This reform forces a fundamental change in the way DoD conducts business, including a heavy reliance on private sector models of change. The key to more reliance on the commercial marketplace is the performance specification (PS). This paper introduces some PS concepts and a PS classification principle to help bring some structure to the analysis of risk (cost, schedule, capability) in weapons system development and the management of opportunities for affordable ownership (maintain/increase capability via technology insertion, reduce cost) in this new paradigm. The DoD shift toward commercial components is nowhere better exemplified than in displays. Displays are the quintessential dual-use technology and are used herein to exemplify these PS concepts and this principle. The advent of flat panel displays as a successful technology is setting off an epochal shift in cockpits and other military applications. Displays are installed in every DoD weapon system, and are, thus, representative of a range of technologies where issues and concerns throughout industry and government have been raised regarding the increased DoD reliance on the commercial marketplace. Performance specifications require metrics: the overall metrics of 'information-thrust' with units of Mb/s and 'specific info-thrust' with units of Mb/s/kg are introduced to analyze the value of a display to the warfighter and affordability to the taxpayer.

  18. The application of cloud computing to scientific workflows: a study of cost and performance.

    Science.gov (United States)

    Berriman, G Bruce; Deelman, Ewa; Juve, Gideon; Rynge, Mats; Vöckler, Jens-S

    2013-01-28

    The current model of transferring data from data centres to desktops for analysis will soon be rendered impractical by the accelerating growth in the volume of science datasets. Processing will instead often take place on high-performance servers co-located with data. Evaluations of how new technologies such as cloud computing would support such a new distributed computing model are urgently needed. Cloud computing is a new way of purchasing computing and storage resources on demand through virtualization technologies. We report here the results of investigations of the applicability of commercial cloud computing to scientific computing, with an emphasis on astronomy, including investigations of what types of applications can be run cheaply and efficiently on the cloud, and an example of an application well suited to the cloud: processing a large dataset to create a new science product.

  19. An Ambulatory Method of Identifying Anterior Cruciate Ligament Reconstructed Gait Patterns

    Directory of Open Access Journals (Sweden)

    Matthew R. Patterson

    2014-01-01

    Full Text Available The use of inertial sensors to characterize pathological gait has traditionally been based on the calculation of temporal and spatial gait variables from inertial sensor data. This approach has proved successful in the identification of gait deviations in populations where substantial differences from normal gait patterns exist, such as in Parkinsonian gait. However, it is not currently clear whether this approach could identify more subtle gait deviations, such as those associated with musculoskeletal injury. This study investigates whether additional analysis of inertial sensor data, based on quantification of gyroscope features of interest, would provide further discriminant capability in this regard. The tested cohort consisted of a group of anterior cruciate ligament reconstructed (ACL-R) females and a group of non-injured female controls, each of whom performed ten walking trials. Gait performance was measured simultaneously using inertial sensors and an optoelectronic marker-based system. The ACL-R group displayed kinematic and kinetic deviations from the control group, but no temporal or spatial deviations. This study demonstrates that quantification of gyroscope features can successfully identify changes associated with ACL-R gait, which was not possible using spatial or temporal variables. This finding may also have a role in other clinical applications where small gait deviations exist.
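
    One gyroscope feature of the kind quantified above is the mid-swing angular-velocity peak per stride; the sketch below extracts such peaks from a synthetic signal, since real shank gyroscope data (and the study's exact feature definitions) are not reproduced here.

        import numpy as np

        fs = 100.0                                  # assumed sample rate (Hz)
        t = np.arange(0, 10, 1 / fs)
        gyro_z = 180 * np.sin(2 * np.pi * 1.0 * t)  # synthetic sagittal signal

        def midswing_peaks(signal, threshold=100.0):
            """Indices of local maxima above a threshold (deg/s)."""
            s = np.asarray(signal)
            is_peak = (s[1:-1] > s[:-2]) & (s[1:-1] > s[2:]) & (s[1:-1] > threshold)
            return np.where(is_peak)[0] + 1

        peaks = midswing_peaks(gyro_z)
        print(len(peaks), "strides; mean peak:",
              round(float(gyro_z[peaks].mean()), 1), "deg/s")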

  20. Performance of a PET detector module utilizing an array of silicon photodiodes to identify the crystal of interaction

    International Nuclear Information System (INIS)

    Moses, W.W.; Derenzo, S.E.; Nutt, R.; Digby, W.M.; Williams, C.W.; Andreaco, M.

    1993-01-01

    The authors present initial performance results for a new multi-layer PET detector module consisting of an array of 3 mm square by 30 mm deep BGO crystals coupled on one end to a single photomultiplier tube and on the opposite end to an array of 3 mm square silicon photodiodes. The photomultiplier tube provides an accurate timing pulse and energy discrimination for all the crystals in the module, while the silicon photodiodes identify the crystal of interaction. When a single BGO crystal at +25 °C is excited with 511 keV photons, the authors measure a photodiode signal centered at 700 electrons (e-) with noise of 375 e- FWHM. When a four crystal/photodiode module is excited with a collimated line source of 511 keV photons, the crystal of interaction is correctly identified 82% of the time. The misidentification rate can be greatly reduced, and an 8 x 8 crystal/photodiode module constructed, by using thicker-depletion-layer photodiodes or by cooling to 0 °C.

  1. Performance Shaping Factors Assessments and Application to PHWR Outages

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Seung Woo

    2007-02-15

    Human reliability analysis is closely related to the quality of PSA because human errors have been identified as major contributors to PSA. According to the NRC's Office for Analysis and Evaluation of Operational Data (AEOD), 82% of the reactor trips and accidents during outages are caused by events related to human errors. There is, however, no single HRA method that is universally accepted. Furthermore, HRA during PHWR outages has not yet been performed anywhere in the world. HRA during PHWR outages is especially important since more manual management by operators is required during these outages. In this study, accident scenarios which HYU developed are used to perform a quantification of human error probability. The overall procedures of the standard HRA methodology are introduced, followed by the quantification of 10 selected possible human actions during PHWR outages based on the standard HRA methodology. For verification, the quantified values were compared with the values from the 'Generic CANDU Probabilistic Safety Assessment' and the values estimated by ASEP. The core damage frequency was estimated to be 3.35 x 10^-4 higher than the CDF estimated from AECL data. It was considered that the differences between the HEPs for OPAFW and OPECC3 make the CDF higher. Therefore, a complementary study re-estimating the HEPs for OPAFW and OPECC3 in detail is required to increase the quality of the HRA and PSA. Moreover, one of the difficulties in performing human reliability analysis is evaluating the performance shaping factors which represent the characteristics and circumstances. For assessing a specific human action more exactly, it is necessary to consider all of the PSFs that have an effect on the human action at the same time. It also requires a comparison of the effects among PSFs to minimize the uncertainties which are usually caused by the subjective judgements of HRA analysts. To see the sensitivity, the performance shaping factors of each decision rule were changed, which resulted in changes in the core damage frequency.

  2. Performance Shaping Factors Assessments and Application to PHWR Outages

    International Nuclear Information System (INIS)

    Lee, Seung Woo

    2007-02-01

    Human reliability analysis is closely related to the quality of PSA because human errors have been identified as major contributors to PSA. According to the NRC's Office for Analysis and Evaluation of Operational Data (AEOD), 82% of the reactor trips and accidents during outages are caused by events related to human errors. There is, however, no single HRA method that is universally accepted. Furthermore, HRA during PHWR outages has not yet been performed anywhere in the world. HRA during PHWR outages is especially important since more manual management by operators is required during these outages. In this study, accident scenarios which HYU developed are used to perform a quantification of human error probability. The overall procedures of the standard HRA methodology are introduced, followed by the quantification of 10 selected possible human actions during PHWR outages based on the standard HRA methodology. For verification, the quantified values were compared with the values from the 'Generic CANDU Probabilistic Safety Assessment' and the values estimated by ASEP. The core damage frequency was estimated to be 3.35 x 10^-4 higher than the CDF estimated from AECL data. It was considered that the differences between the HEPs for OPAFW and OPECC3 make the CDF higher. Therefore, a complementary study re-estimating the HEPs for OPAFW and OPECC3 in detail is required to increase the quality of the HRA and PSA. Moreover, one of the difficulties in performing human reliability analysis is evaluating the performance shaping factors which represent the characteristics and circumstances. For assessing a specific human action more exactly, it is necessary to consider all of the PSFs that have an effect on the human action at the same time. It also requires a comparison of the effects among PSFs to minimize the uncertainties which are usually caused by the subjective judgements of HRA analysts. To see the sensitivity, the performance shaping factors of each decision rule were changed, which resulted in changes in the core damage frequency.

  3. The application of cat swarm optimisation algorithm in classifying small loan performance

    Science.gov (United States)

    Kencana, Eka N.; Kiswanti, Nyoman; Sari, Kartika

    2017-10-01

    It is common for banking systems to analyse the feasibility of a credit application before its approval. Although this process is done carefully, there is no guarantee that all credits will be repaid smoothly. This study aimed to assess the accuracy of the Cat Swarm Optimisation (CSO) algorithm in classifying the performance of small loans approved by Bank Rakyat Indonesia (BRI), one of several public banks in Indonesia. Data collected from 200 lenders were used in this work. The data matrix consists of 9 independent variables that represent the profile of the credit, and one categorical dependent variable that reflects the credit's performance. Prior to the analyses, the data were divided into two subsets of equal size. An ordinal logistic regression (OLR) procedure applied to the first subset showed that 3 of the 9 independent variables, i.e. the amount of credit, the credit period, and the monthly income of the lender, significantly affect credit performance. Using the significant parameter estimates from the OLR procedure as initial values for the observations in the second subset, the CSO procedure was run. This procedure achieved 76 percent classification accuracy for credit performance, slightly better than the 64 percent obtained from the OLR procedure.
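
    As a rough illustration of how a swarm optimiser can tune a linear classifier of loan performance, the sketch below runs a heavily simplified cat swarm loop (seeking mode as local random perturbation, tracing mode as velocity toward the global best) on synthetic data. It is not the authors' implementation; the data, fitness function and all parameters are invented.

    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic stand-in for the credit data: 100 loans, 3 predictors
    # (credit amount, credit period, monthly income), binary performance label.
    X = rng.normal(size=(100, 3))
    y = (X @ np.array([1.0, -0.5, 0.8]) + 0.3 * rng.normal(size=100)) > 0

    def accuracy(w):
        """Fitness of a weight vector: fraction of loans classified correctly."""
        return np.mean(((X @ w) > 0) == y)

    n_cats, n_iter, mixture_ratio = 20, 200, 0.3
    cats = rng.normal(size=(n_cats, 3))   # candidate weight vectors
    vel = np.zeros_like(cats)

    for _ in range(n_iter):
        fitness = np.array([accuracy(w) for w in cats])
        best = cats[fitness.argmax()].copy()
        tracing = rng.random(n_cats) < mixture_ratio
        for i in range(n_cats):
            if tracing[i]:
                # tracing mode: accelerate toward the global best cat
                vel[i] += rng.random() * (best - cats[i])
                cats[i] += vel[i]
            else:
                # seeking mode: keep the best of a few local perturbations
                copies = cats[i] + 0.1 * rng.normal(size=(5, 3))
                cats[i] = max(copies, key=accuracy)

    print(f"best classification accuracy: {max(accuracy(w) for w in cats):.2f}")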

  4. Improvement of energy performances of existing buildings by application of solar thermal systems

    Directory of Open Access Journals (Sweden)

    Krstić-Furundžić Aleksandra

    2009-01-01

    Full Text Available The improvement of the energy performance of existing buildings in the suburban settlement of Konjarnik in Belgrade through the application of solar thermal systems is the topic presented in this paper. Hypothetical models of building improvements are created to allow the benefits of applying solar thermal collectors to residential buildings in Belgrade's climate conditions to be estimated. This case study presents different design variants of solar thermal collectors integrated into a multifamily building envelope. The following aspects of solar thermal system integration are analyzed in the paper: energy, architectural, ecological and economic. The results show that in Belgrade's climatic conditions, significant energy savings and reductions of CO2 emissions can be obtained with the application of solar thermal collectors.

  5. Modeling and identification of induction micromachines in microelectromechanical systems applications

    Energy Technology Data Exchange (ETDEWEB)

    Lyshevski, S.E. [Purdue University at Indianapolis (United States). Dept. of Electrical and Computer Engineering

    2002-11-01

    Microelectromechanical systems (MEMS), which integrate motion microstructures, radiating-energy microdevices, and controlling and signal-processing integrated circuits (ICs), are widely used. Rotational and translational electromagnetic micromachines are used in MEMS as actuators and sensors. Brushless high-performance micromachines are the preferable choice in different MEMS applications, and therefore synchronous and induction micromachines are the best candidates. Affordability, good performance characteristics (efficiency, controllability, robustness, reliability, power and torque densities, etc.) and expanded operating envelopes result in a strong interest in the application of induction micromachines. In addition, induction micromachines can be easily fabricated using surface micromachining and high-aspect-ratio fabrication technologies. Thus, it is anticipated that induction micromachines, controlled using different control algorithms implemented in ICs, will be widely used in MEMS. Controllers can be implemented using specifically designed ICs to attain superior performance, maximize efficiency and controllability, minimize losses and electromagnetic interference, reduce noise and vibration, etc. In order to design controllers, the induction micromachine must be modeled, and its mathematical model parameters must be identified. Using microelectromechanics, nonlinear mathematical models are derived. This paper illustrates the application of nonlinear identification methods to identifying the unknown parameters of three-phase induction micromachines. Two identification methods are studied: a nonlinear error-mapping technique and least-squares identification. Analytical and numerical results, as well as practical capabilities and effectiveness, are illustrated by identifying the unknown parameters of a three-phase brushless induction micromotor. Experimental results fully support the identification methods. (author)
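
    The least-squares identification step described can be sketched compactly. The example below fits the unknown parameters of a toy first-order discrete-time model from noisy input-output data; the model is invented and far simpler than a three-phase induction micromachine, so it only illustrates the shape of the procedure.

    import numpy as np

    rng = np.random.default_rng(1)

    # Toy model x[k+1] = a*x[k] + b*u[k]; identify (a, b) from noisy data.
    a_true, b_true, n = 0.9, 0.5, 200
    u = rng.normal(size=n)
    x = np.zeros(n + 1)
    for k in range(n):
        x[k + 1] = a_true * x[k] + b_true * u[k] + 0.01 * rng.normal()

    # Stack the regressors and solve the linear least-squares problem.
    A = np.column_stack([x[:-1], u])
    theta, *_ = np.linalg.lstsq(A, x[1:], rcond=None)
    print(f"identified a = {theta[0]:.3f}, b = {theta[1]:.3f}")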

  6. Mobile Applications' Impact on Student Performance and Satisfaction

    Science.gov (United States)

    Alqahtani, Maha; Mohammad, Heba

    2015-01-01

    Mobile applications are rapidly growing in importance and can be used for various purposes. They have been used widely in education. One of the educational purposes for which mobile applications can be used is learning the right way to read and pronounce the verses of the Holy Quran. There are many applications that translate the Quran into several…

  7. High Performance Computing - Power Application Programming Interface Specification Version 2.0.

    Energy Technology Data Exchange (ETDEWEB)

    Laros, James H. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Grant, Ryan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Levenhagen, Michael J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Olivier, Stephen Lecler [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Pedretti, Kevin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ward, H. Lee [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Younge, Andrew J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-03-01

    Measuring and controlling the power and energy consumption of high performance computing systems by various components in the software stack is an active research area. Implementations in lower level software layers are beginning to emerge in some production systems, which is very welcome. To be most effective, a portable interface to measurement and control features would significantly facilitate participation by all levels of the software stack. We present a proposal for a standard power Application Programming Interface (API) that endeavors to cover the entire software space, from generic hardware interfaces to the input from the computer facility manager.

  8. Performance Monitoring Enterprise Applications with the BlackBird System

    Science.gov (United States)

    Germano, João P.; da Silva, Alberto Rodrigues; Silva, Fernando M.

    This work describes the BlackBird system, an analysis and monitoring service for data-intensive enterprise applications that places no restrictions on the targeted architecture or employed technologies. A case study is presented for the monitoring of Billing applications from Vodafone Portugal. Monitoring systems are an essential tool for the effective management of Enterprise Applications and the attainment of the demanding service level agreements imposed on these applications. However, due to the increasing complexity and diversity of these applications, adequate monitoring systems are rarely available. The BlackBird monitoring system is able to interact with these applications through the different technologies employed by the Monitored Application, and is able to produce Metrics regarding the application's service level goals. The BlackBird system can be specified using a set of pre-defined Configuration Objects, allowing it to be extensible and adaptable to applications with different architectures.

  9. Application of High-performance Visual Analysis Methods to Laser Wakefield Particle Acceleration Data

    International Nuclear Information System (INIS)

    Rubel, Oliver; Prabhat, Mr.; Wu, Kesheng; Childs, Hank; Meredith, Jeremy; Geddes, Cameron G.R.; Cormier-Michel, Estelle; Ahern, Sean; Weber, Gunther H.; Messmer, Peter; Hagen, Hans; Hamann, Bernd; Bethel, E. Wes

    2008-01-01

    Our work combines and extends techniques from high-performance scientific data management and visualization to enable scientific researchers to gain insight from extremely large, complex, time-varying laser wakefield particle accelerator simulation data. We extend histogram-based parallel coordinates for use in visual information display as well as an interface for guiding and performing data mining operations, which are based upon multi-dimensional and temporal thresholding and data subsetting operations. To achieve very high performance on parallel computing platforms, we leverage FastBit, a state-of-the-art index/query technology, to accelerate data mining and multi-dimensional histogram computation. We show how these techniques are used in practice by scientific researchers to identify, visualize and analyze a particle beam in a large, time-varying dataset.
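
    The multi-dimensional thresholding and histogram computation described can be mimicked in a few lines of numpy. This sketch stands in for the FastBit-accelerated index queries; the particle array and cut values are invented.

    import numpy as np

    rng = np.random.default_rng(2)

    # Invented stand-in for one timestep of particle data: columns are
    # longitudinal momentum px, transverse momentum py, and position x.
    particles = rng.normal(size=(1_000_000, 3)) * [1e10, 1e8, 1.0]
    px, py, x = particles.T

    # Multi-dimensional threshold query: beam candidates have high
    # longitudinal momentum and small transverse momentum.
    beam = (px > 2e10) & (np.abs(py) < 5e7)
    subset = particles[beam]

    # 2D histogram of the selection, the raw material of histogram-based
    # parallel coordinates displays.
    hist, xedges, pedges = np.histogram2d(subset[:, 2], subset[:, 0], bins=64)
    print(f"{beam.sum()} of {len(particles)} particles selected")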

  10. Application of computational fluid dynamics in building performance simulation for the outdoor environment: an overview

    NARCIS (Netherlands)

    Blocken, B.J.E.; Stathopoulos, T.; Carmeliet, J.; Hensen, J.L.M.

    2011-01-01

    This paper provides an overview of the application of CFD in building performance simulation for the outdoor environment, focused on four topics: (1) pedestrian wind environment around buildings, (2) wind-driven rain on building facades, (3) convective heat transfer coefficients at exterior building

  11. Low Cost High Performance Generator Technology Program. Volume 4. Mission application study

    International Nuclear Information System (INIS)

    1975-07-01

    Results of initial efforts to investigate application of selenide thermoelectric RTG's to specific missions as well as an indication of development requirements to enable satisfaction of emerging RTG performance criteria are presented. Potential mission applications in DoD such as SURVSATCOM, Advance Defense Support Program, Laser Communication Satellite, Satellite Data System, Global Positioning Satellite, Deep Space Surveillance Satellite, and Unmanned Free Swimming Submersible illustrate power requirements in the range of 500 to 1000 W. In contrast, the NASA applications require lower power, ranging from 50 W for outer planetary atmospheric probes to about 200 W for spacecraft flights to Jupiter and other outer planets. The launch dates for most of these prospective missions are circa 1980, a requirement roughly compatible with selenide thermoelectric and heat source technology development. A discussion of safety criteria is included to give emphasis to the requirements for heat source design. In addition, the observation is made that the potential accident environments of all launch vehicles are similar, so that a reasonable composite set of design specifications may be derived to satisfy almost all applications. Details of the LCHPG application potential are afforded by three designs: an 80 W RTG using improved selenide thermoelectric material, a 55 to 65 W LCHPG using current and improved selenide materials, and the final 500 W LCHPG as reported in Volume 2. The final results of the LCHPG design study have shown that, in general, all missions can expect an LCHPG design which yields 10 percent efficiency at 3 W/lb with the current standard selenide thermoelectric materials, with growth potential to 14 percent at greater than 4 W/lb in the mid-1980s time frame.

  12. Improvement on the Performance of Canal Network and Method of ...

    African Journals Online (AJOL)

    This paper presents the required improvement on the performance of canal network and method of on-farm water application systems at Tunga-Kawo irrigation scheme, Wushishi, Niger state. The problems of poor delivery of water to the farmland were identified to include erosion of canal embarkment, lack of water ...

  13. Corrosion Performance of New Generation Aluminum-Lithium Alloys for Aerospace Applications

    Science.gov (United States)

    Moran, James P.; Bovard, Francine S.; Chrzan, James D.; Vandenburgh, Peter

    Over the past several years, a new generation of aluminum-lithium alloys has been developed. These alloys are characterized by excellent strength, low density, and high modulus of elasticity and are therefore of interest for lightweight structural materials applications particularly for construction of current and future aircraft. These new alloys have also demonstrated significant improvements in corrosion resistance when compared with the legacy and incumbent alloys. This paper documents the superior corrosion resistance of the current commercial tempers of these materials and also discusses the corrosion performance as a function of the degree of artificial aging. Results from laboratory corrosion tests are compared with results from exposures in a seacoast atmosphere to assess the predictive capability of the laboratory tests. The correlations that have been developed between the laboratory tests and the seacoast exposures provide confidence that a set of available methods can provide an accurate assessment of the corrosion performance of this new generation of alloys.

  14. DEVELOPMENT OF NEW VALVE STEELS FOR APPLICATION IN HIGH PERFORMANCE ENGINES

    Directory of Open Access Journals (Sweden)

    Alexandre Bellegard Farina

    2013-12-01

    Full Text Available UNS N07751 and UNS N07080 alloys are commonly used to produce automotive valves for high-performance internal combustion engines. These alloys present high hot strength, resistance to oxidation, corrosion and creep, and good microstructural stability. However, they present low wear resistance and high cost due to their high nickel contents. This work presents the development of two new Ni-based alloys for application in high-performance automotive valves as alternatives to the UNS N07751 and UNS N07080 alloys. The newly developed alloys are based on a high nickel-chromium austenitic matrix with a dispersion of γ' and γ'' phases and contain different NbC contents. Due to their reduced nickel content in comparison with the alloys currently in use, the new alloys present an economic advantage as substitutes for the UNS N07751 and UNS N07080 alloys.

  15. Automated transit planning, operation, and applications

    CERN Document Server

    Liu, Rongfang

    2016-01-01

    This book analyzes the successful implementations of automated transit in various international locations, such as Paris, Toronto, London, and Kuala Lumpur, and investigates the apparent lack of automated transit applications in the urban environment in the United States. The book begins with a brief definition of automated transit and its historical development. After a thorough description of the technical specifications, the author highlights a few applications from each sub-group of the automated transit spectrum. International case studies display various technologies and their applications, and identify vital factors that affect each system and performance evaluations of existing applications. The book then discusses the planning and operation of automated transit applications at both macro and micro levels. Finally, the book covers a number of less successful concepts, as well as the lessons learned, allowing readers to gain a comprehensive understanding of the topic.

  16. Performance Health Monitoring of Large-Scale Systems

    Energy Technology Data Exchange (ETDEWEB)

    Rajamony, Ram [IBM Research, Austin, TX (United States)

    2014-11-20

    This report details the progress made on the ASCR-funded project Performance Health Monitoring for Large Scale Systems. A large-scale application may not achieve its full performance potential due to degraded performance of even a single subsystem. Detecting performance faults, isolating them, and taking remedial action is critical for the scale of systems on the horizon. PHM aims to develop techniques and tools that can be used to identify and mitigate such performance problems. We accomplish this through two main aspects. The PHM framework encompasses diagnostics, system monitoring, fault isolation, and performance evaluation capabilities that indicate when a performance fault has been detected, either due to an anomaly present in the system itself or due to contention for shared resources between concurrently executing jobs. Software components called the PHM Control system then build upon the capabilities provided by the PHM framework to mitigate degradation caused by performance problems.
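
    A minimal sketch of the kind of detection such a framework performs: flag a performance fault when a job's iteration times drift beyond a threshold from a trailing baseline. The detector, window and threshold below are invented, not the PHM implementation.

    import numpy as np

    rng = np.random.default_rng(3)

    # Invented per-iteration wall-clock times: steady near 1.0 s, then
    # degraded by resource contention from iteration 600 onward.
    times = np.concatenate([rng.normal(1.0, 0.02, 600),
                            rng.normal(1.3, 0.05, 200)])

    def detect_fault(samples, window=100, threshold=5.0):
        """Return the first index whose z-score against a trailing window
        exceeds the threshold, or None if the series looks healthy."""
        for i in range(window, len(samples)):
            base = samples[i - window:i]
            z = (samples[i] - base.mean()) / base.std()
            if z > threshold:
                return i
        return None

    print(f"performance fault detected at iteration {detect_fault(times)}")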

  17. Performance evaluation of photovoltaic-thermosyphon system for subtropical climate application

    Energy Technology Data Exchange (ETDEWEB)

    Chow, T.T.; He, W.; Chan, A.L.S. [Division of Building Science and Technology, City University of Hong Kong, Tat Chee Avenue, Kowloon, Hong Kong SAR (China); Ji, J. [Department of Thermal Science and Energy Engineering, University of Science and Technology of China, Anhui (China)

    2007-01-15

    The rapid development and sales volume of photovoltaic (PV) modules has created a promising business environment for the foreseeable future. However, the current electricity cost from PV is still several times higher than that from conventional power generation. One way to shorten the payback period is to bring in hybrid photovoltaic-thermal (PVT) technology, which multiplies the energy outputs from the same collector surface area. In this paper, the performance evaluation of a new water-type PVT collector system is presented. The thermal collection, making use of the thermosyphon principle, eliminates the expense of pumping power. Experimental rigs were successfully built. A dynamic simulation model of the PVT collector system was developed and validated against the experimental measurements, together with two other similar models developed for a PV module and a solar hot-water collector. These were then used to predict the energy outputs and the payback periods for their applications in the subtropical climate, with Hong Kong as an example. The numerical results show that a payback period of 12 years for the PVT collector system is comparable to that of the side-by-side system, and is much shorter than that of the plain PV application. This is a great encouragement in marketing the PVT technology. (author)

  18. Performance evaluation of photovoltaic-thermosyphon system for subtropical climate application

    International Nuclear Information System (INIS)

    Chow, T.T.; He, W.; Chan, A.L.S.; Ji, J.

    2007-01-01

    The rapid development and sales volume of photovoltaic (PV) modules has created a promising business environment for the foreseeable future. However, the current electricity cost from PV is still several times higher than that from conventional power generation. One way to shorten the payback period is to bring in hybrid photovoltaic-thermal (PVT) technology, which multiplies the energy outputs from the same collector surface area. In this paper, the performance evaluation of a new water-type PVT collector system is presented. The thermal collection, making use of the thermosyphon principle, eliminates the expense of pumping power. Experimental rigs were successfully built. A dynamic simulation model of the PVT collector system was developed and validated against the experimental measurements, together with two other similar models developed for a PV module and a solar hot-water collector. These were then used to predict the energy outputs and the payback periods for their applications in the subtropical climate, with Hong Kong as an example. The numerical results show that a payback period of 12 years for the PVT collector system is comparable to that of the side-by-side system, and is much shorter than that of the plain PV application. This is a great encouragement in marketing the PVT technology. (author)
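
    Payback comparisons of this kind reduce to simple cash-flow arithmetic. The sketch below computes a simple (undiscounted) payback period from invented capital costs and annual savings; none of the figures come from the paper.

    def simple_payback_years(capital_cost, annual_saving):
        """Years until cumulative savings cover the up-front cost."""
        return capital_cost / annual_saving

    # Invented figures for three hypothetical systems (HK$ and HK$/year).
    systems = {
        "plain PV":           (40_000, 1_900),
        "PVT (thermosyphon)": (46_000, 3_800),
        "side-by-side PV+T":  (50_000, 4_100),
    }

    for name, (cost, saving) in systems.items():
        print(f"{name:20s} payback = {simple_payback_years(cost, saving):5.1f} years")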

  19. Preliminary Analysis of Remote Monitoring and Robotic Concepts for Performance Confirmation

    International Nuclear Information System (INIS)

    McAffee, D.A.

    1997-01-01

    As defined in 10 CFR Part 60.2, Performance Confirmation is the ''program of tests, experiments and analyses which is conducted to evaluate the accuracy and adequacy of the information used to determine with reasonable assurance that the performance objectives for the period after permanent closure will be met''. The overall Performance Confirmation program begins during site characterization and continues up to repository closure. The main purpose of this document is to develop, explore and analyze initial concepts for using remotely operated and robotic systems in gathering repository performance information during Performance Confirmation. This analysis focuses primarily on possible Performance Confirmation related applications within the emplacement drifts after waste packages have been emplaced (post-emplacement) and before permanent closure of the repository (preclosure). This will be a period of time lasting approximately 100 years and basically coincides with the Caretaker phase of the project. This analysis also examines, to a lesser extent, some applications related to Caretaker operations. A previous report examined remote handling and robotic technologies that could be employed during the waste package emplacement phase of the project (Reference 5.1). This analysis is being prepared to provide an early investigation of possible design concepts and technical challenges associated with developing remote systems for monitoring and inspecting activities during Performance Confirmation. The writing of this analysis preceded formal development of Performance Confirmation functional requirements and program plans and therefore examines, in part, the fundamental Performance Confirmation monitoring needs and operating conditions. The scope and primary objectives of this analysis are to: (1) Describe the operating environment and conditions expected in the emplacement drifts during the preclosure period. (Presented in Section 7.2). (2) Identify and discuss the

  20. Using simulation to evaluate the performance of resilience strategies and process failures

    Energy Technology Data Exchange (ETDEWEB)

    Levy, Scott N.; Topp, Bryan Embry; Arnold, Dorian C; Ferreira, Kurt Brian; Widener, Patrick; Hoefler, Torsten

    2014-01-01

    Fault tolerance has been identified as a major challenge for future extreme-scale systems. Current predictions suggest that, as systems grow in size, failures will occur more frequently. Because increases in failure frequency reduce the performance and scalability of these systems, significant effort has been devoted to developing and refining resilience mechanisms to mitigate the impact of failures. However, effective evaluation of these mechanisms has been challenging. Current systems are smaller and have significantly different architectural features (e.g., interconnect, persistent storage) than we expect to see in next-generation systems. To overcome these challenges, we propose the use of simulation. Simulation has been shown to be an effective tool for investigating the performance characteristics of applications on future systems. In this work, we: identify the set of system characteristics that are necessary for accurate performance prediction of resilience mechanisms for HPC systems and applications; demonstrate how these system characteristics can be incorporated into an existing large-scale simulator; and evaluate the predictive performance of our modified simulator. We also describe how we were able to optimize the simulator for large temporal and spatial scales, allowing the simulator to run 4x faster and use over 100x less memory.
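
    One staple of such resilience evaluations is the checkpoint/restart cost model. As an illustration only (the paper uses a full system simulator, not a closed-form model), the sketch below applies the Young/Daly approximation t_opt = sqrt(2 * delta * MTBF) for the optimal checkpoint interval, with an invented checkpoint cost and per-node MTBF.

    import math

    def optimal_checkpoint_interval(checkpoint_cost_s, mtbf_s):
        """Young/Daly first-order approximation: t_opt = sqrt(2*delta*MTBF)."""
        return math.sqrt(2.0 * checkpoint_cost_s * mtbf_s)

    delta = 300.0  # invented checkpoint cost: 5 minutes
    for nodes in (1_000, 10_000, 100_000):
        # Assume independent node failures with a 25,000-hour node MTBF.
        system_mtbf_s = 25_000 * 3600.0 / nodes
        t_opt = optimal_checkpoint_interval(delta, system_mtbf_s)
        print(f"{nodes:7d} nodes: system MTBF {system_mtbf_s / 3600:6.1f} h, "
              f"checkpoint every {t_opt / 60:5.1f} min")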

  1. Leading product-related environmental performance indicators: a selection guide and database

    DEFF Research Database (Denmark)

    Issa, Isabela I.; Pigosso, Daniela Cristina Antelmi; McAloone, Tim C.

    2015-01-01

    Ecodesign is a proactive environmental management and improvement approach employed in the product development process, which aims to minimize the environmental impacts caused during a product's life cycle and thus improve its environmental performance. The establishment of measurable environmental...... in the selection and application of environmental performance indicators - a more structured approach is still lacking. This paper presents the efforts made to identify and systematize existing leading product-related environmental performance indicators, based on a systematic literature review, and to develop...

  2. An Allometric Modelling Approach to Identify the Optimal Body Shape Associated with, and Differences between Brazilian and Peruvian Youth Motor Performance.

    Directory of Open Access Journals (Sweden)

    Simonete Silva

    Full Text Available Children from developed and developing countries differ in their body size and shape due to marked differences across their life history caused by social, economic and cultural differences which are also linked to their motor performance (MP). We used allometric models to identify size/shape characteristics associated with MP tests between Brazilian and Peruvian schoolchildren. A total of 4,560 subjects, 2,385 girls and 2,175 boys aged 9-15 years were studied. Height and weight were measured; biological maturation was estimated with the maturity offset technique; MP measures included the 12 minute run (12MR), handgrip strength (HG), standing long jump (SLJ) and the shuttle run speed (SR) tests; physical activity (PA) was assessed using the Baecke questionnaire. A multiplicative allometric model was adopted to adjust for body size differences across countries. Reciprocal ponderal index (RPI) was found to be the most suitable body shape indicator associated with the 12MR, SLJ, HG and SR performance. A positive maturation offset parameter was also associated with a better performance in SLJ, HG and SR tests. Sex differences were found in all motor tests. Brazilian youth showed better scores in MP than their Peruvian peers, even when controlling for their body size differences. The current study identified the key body size associated with four body mass-dependent MP tests. Biological maturation and PA were associated with strength and motor performance. Sex differences were found in all motor tests, as well as across countries, favoring Brazilian children even when accounting for their body size/shape differences.

  3. An Allometric Modelling Approach to Identify the Optimal Body Shape Associated with, and Differences between Brazilian and Peruvian Youth Motor Performance.

    Science.gov (United States)

    Silva, Simonete; Bustamante, Alcibíades; Nevill, Alan; Katzmarzyk, Peter T; Freitas, Duarte; Prista, António; Maia, José

    2016-01-01

    Children from developed and developing countries differ in their body size and shape due to marked differences across their life history caused by social, economic and cultural differences which are also linked to their motor performance (MP). We used allometric models to identify size/shape characteristics associated with MP tests between Brazilian and Peruvian schoolchildren. A total of 4,560 subjects, 2,385 girls and 2,175 boys aged 9-15 years were studied. Height and weight were measured; biological maturation was estimated with the maturity offset technique; MP measures included the 12 minute run (12MR), handgrip strength (HG), standing long jump (SLJ) and the shuttle run speed (SR) tests; physical activity (PA) was assessed using the Baecke questionnaire. A multiplicative allometric model was adopted to adjust for body size differences across countries. Reciprocal ponderal index (RPI) was found to be the most suitable body shape indicator associated with the 12MR, SLJ, HG and SR performance. A positive maturation offset parameter was also associated with a better performance in SLJ, HG and SR tests. Sex differences were found in all motor tests. Brazilian youth showed better scores in MP than their Peruvian peers, even when controlling for their body size differences. The current study identified the key body size associated with four body mass-dependent MP tests. Biological maturation and PA were associated with strength and motor performance. Sex differences were found in all motor tests, as well as across countries, favoring Brazilian children even when accounting for their body size/shape differences.

  4. An Allometric Modelling Approach to Identify the Optimal Body Shape Associated with, and Differences between Brazilian and Peruvian Youth Motor Performance

    Science.gov (United States)

    Silva, Simonete; Bustamante, Alcibíades; Nevill, Alan; Katzmarzyk, Peter T.; Freitas, Duarte; Prista, António; Maia, José

    2016-01-01

    Children from developed and developing countries differ in their body size and shape due to marked differences across their life history caused by social, economic and cultural differences which are also linked to their motor performance (MP). We used allometric models to identify size/shape characteristics associated with MP tests between Brazilian and Peruvian schoolchildren. A total of 4,560 subjects, 2,385 girls and 2,175 boys aged 9–15 years were studied. Height and weight were measured; biological maturation was estimated with the maturity offset technique; MP measures included the 12 minute run (12MR), handgrip strength (HG), standing long jump (SLJ) and the shuttle run speed (SR) tests; physical activity (PA) was assessed using the Baecke questionnaire. A multiplicative allometric model was adopted to adjust for body size differences across countries. Reciprocal ponderal index (RPI) was found to be the most suitable body shape indicator associated with the 12MR, SLJ, HG and SR performance. A positive maturation offset parameter was also associated with a better performance in SLJ, HG and SR tests. Sex differences were found in all motor tests. Brazilian youth showed better scores in MP than their Peruvian peers, even when controlling for their body size differences. The current study identified the key body size associated with four body mass-dependent MP tests. Biological maturation and PA were associated with strength and motor performance. Sex differences were found in all motor tests, as well as across countries, favoring Brazilian children even when accounting for their body size/shape differences. PMID:26939118
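
    A multiplicative allometric model of this kind is typically fitted log-linearly. The sketch below fits MP = a * RPI^k * exp(c * maturity offset) by ordinary least squares on the log scale using invented data; it is a schematic of the approach, not the authors' model, and the fitted coefficients carry no empirical meaning.

    import numpy as np

    rng = np.random.default_rng(4)

    # Invented sample: reciprocal ponderal index (height / weight^(1/3)),
    # maturity offset in years, and a mass-dependent MP score (e.g. SLJ).
    n = 500
    rpi = rng.normal(42.0, 2.0, n)
    offset = rng.normal(0.0, 1.5, n)
    mp = 3.0 * rpi**1.2 * np.exp(0.05 * offset) * rng.lognormal(0.0, 0.05, n)

    # MP = a * RPI^k * exp(c*offset) * error becomes linear after taking
    # logs: log MP = log a + k*log RPI + c*offset.
    X = np.column_stack([np.ones(n), np.log(rpi), offset])
    coef, *_ = np.linalg.lstsq(X, np.log(mp), rcond=None)
    print(f"a = {np.exp(coef[0]):.2f}, k = {coef[1]:.2f}, c = {coef[2]:.3f}")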

  5. Performance evaluation of microturbine generation system for microgrid applications

    Energy Technology Data Exchange (ETDEWEB)

    Salam, A.A.; Mohamed, A.; Hannan, M.A.; Shareef, H.; Wanik, M.Z.C. [Kebangsaan Malaysia Univ., Selangor (Malaysia). Dept. of Electrical, Electronic and Systems Engineering, Faculty of Engineering and Built Environment

    2009-03-11

    A control system for microturbine generation system (MGS) units in microgrid applications was presented. A dynamic model of the microturbine and power electronics interface systems was used to determine converter control strategies for distributed generation operation. Back-to-back converters were used to interface the microturbine-based distributed generation system to the grid. The controllers were used to regulate the output voltage value at the reference bus voltage and the frequency of the whole grid. Reference values were predetermined in the control scheme in order to obtain the desired value of voltage amplitude and frequency. An investigation of system dynamics was conducted using simulations in both grid-connected and islanded modes. Results of the simulations demonstrated the ability of the MGS to improve electricity grid reliability. The model can be used to accurately simulate MGS dynamic performance for both grid- and islanded modes of operation. 10 refs., 17 figs.

  6. Performance of GeantV EM Physics Models

    Energy Technology Data Exchange (ETDEWEB)

    Amadio, G.; et al.

    2016-10-14

    The recent progress in parallel hardware architectures with deeper vector pipelines or many-cores technologies brings opportunities for HEP experiments to take advantage of SIMD and SIMT computing models. Launched in 2013, the GeantV project studies performance gains in propagating multiple particles in parallel, improving instruction throughput and data locality in HEP event simulation on modern parallel hardware architecture. Due to the complexity of geometry description and physics algorithms of a typical HEP application, performance analysis is indispensable in identifying factors limiting parallel execution. In this report, we will present design considerations and preliminary computing performance of GeantV physics models on coprocessors (Intel Xeon Phi and NVidia GPUs) as well as on mainstream CPUs.

  7. Performance of GeantV EM Physics Models

    Science.gov (United States)

    Amadio, G.; Ananya, A.; Apostolakis, J.; Aurora, A.; Bandieramonte, M.; Bhattacharyya, A.; Bianchini, C.; Brun, R.; Canal, P.; Carminati, F.; Cosmo, G.; Duhem, L.; Elvira, D.; Folger, G.; Gheata, A.; Gheata, M.; Goulas, I.; Iope, R.; Jun, S. Y.; Lima, G.; Mohanty, A.; Nikitina, T.; Novak, M.; Pokorski, W.; Ribon, A.; Seghal, R.; Shadura, O.; Vallecorsa, S.; Wenzel, S.; Zhang, Y.

    2017-10-01

    The recent progress in parallel hardware architectures with deeper vector pipelines or many-cores technologies brings opportunities for HEP experiments to take advantage of SIMD and SIMT computing models. Launched in 2013, the GeantV project studies performance gains in propagating multiple particles in parallel, improving instruction throughput and data locality in HEP event simulation on modern parallel hardware architecture. Due to the complexity of geometry description and physics algorithms of a typical HEP application, performance analysis is indispensable in identifying factors limiting parallel execution. In this report, we will present design considerations and preliminary computing performance of GeantV physics models on coprocessors (Intel Xeon Phi and NVidia GPUs) as well as on mainstream CPUs.

  8. Performance of GeantV EM Physics Models

    CERN Document Server

    Amadio, G; Apostolakis, J; Aurora, A; Bandieramonte, M; Bhattacharyya, A; Bianchini, C; Brun, R; Canal P; Carminati, F; Cosmo, G; Duhem, L; Elvira, D; Folger, G; Gheata, A; Gheata, M; Goulas, I; Iope, R; Jun, S Y; Lima, G; Mohanty, A; Nikitina, T; Novak, M; Pokorski, W; Ribon, A; Seghal, R; Shadura, O; Vallecorsa, S; Wenzel, S; Zhang, Y

    2017-01-01

    The recent progress in parallel hardware architectures with deeper vector pipelines or many-cores technologies brings opportunities for HEP experiments to take advantage of SIMD and SIMT computing models. Launched in 2013, the GeantV project studies performance gains in propagating multiple particles in parallel, improving instruction throughput and data locality in HEP event simulation on modern parallel hardware architecture. Due to the complexity of geometry description and physics algorithms of a typical HEP application, performance analysis is indispensable in identifying factors limiting parallel execution. In this report, we will present design considerations and preliminary computing performance of GeantV physics models on coprocessors (Intel Xeon Phi and NVidia GPUs) as well as on mainstream CPUs.

  9. Performance characteristics and typical industrial applications of Selfshield electron accelerators (<300kV)

    International Nuclear Information System (INIS)

    Aaronson, J.N.; Nablo, S.V.

    1985-01-01

    Selfshielded electron accelerators have been successfully used in industry for more than ten years. One of the important advantages of these machines is their compactness for easy adaptation to conventional coating and product finishing machinery. It is equally important that these machines qualify for use under 'unrestricted' conditions as specified by OSHA. The shielding and product handling configurations which make this unrestricted designation possible for operating voltages under 300 kV are discussed. Thin film dosimetry techniques used for the determination of the machine performance parameters are discussed along with the rotary scanner techniques employed for the dose rate studies which are important in the application of the processors. Paper and wood coatings, which are important industrial applications involving electron initiated polymerization, are reviewed. The sterilization and disinfestation applications are also discussed. The increasing concern of these industries for the more effective use of energy and for compliance with more stringent pollution regulations, coupled with the novel processes this energy source makes possible, assure a bright future for this developing technology. (orig.)

  10. Performance limitations of imaging microscopes for soft x-ray applications

    International Nuclear Information System (INIS)

    Lewotsky, K.L.; Kotha, A.; Harvey, J.E.

    1993-01-01

    Recent advances in the fabrication of nanometer-scale multilayer structures have yielded high-reflectance mirrors operating at near-normal incidence for soft X-ray wavelengths. These developments have stimulated renewed interest in high-resolution soft X-ray microscopy. The design of a Schwarzschild imaging microscope for soft X-ray applications has been reported by Hoover and Shealy. Based upon a geometrical ray-trace analysis of the residual design errors, diffraction-limited performance at a wavelength of 100 angstrom was predicted over an object size (diameter) of 0.4 mm. In this paper the authors expand upon the previous analysis of the Schwarzschild X-ray microscope design by determining the total image degradation due to diffraction, geometrical aberrations, alignment errors, and realistic assumptions concerning optical fabrication errors. NASA's Optical Surface Analysis Code (OSAC) is used to model the image degradation effects of residual surface irregularities over the entire range of relevant spatial frequencies. This includes small angle scattering effects due to mid spatial frequency surface errors falling between the traditional figure and finish specifications. Performance predictions are presented parametrically to provide some insight into the optical fabrication and alignment tolerances necessary to meet a particular image quality requirement

  11. Trial application of PRATOOL: Performance of sensitivity studies on service water systems

    International Nuclear Information System (INIS)

    Gregg, R.E.; Wood, S.T.

    1991-03-01

    PRATOOL is a computer program developed to supplement the Integrated Reliability and Risk Analysis System (IRRAS) probabilistic risk assessment (PRA) computer program. It is intended to be a tool for performing PRA sensitivity analyses. This allows a PRA's results to be easily reevaluated to show their sensitivity to changing parameter (system, component type, basic event, etc.) values (i.e., failure rates). This report documents a trial application of PRATOOL. It evaluates the sensitivity of core damage frequency to service water system availability at several nuclear power plants. For the purpose of this trial application of PRATOOL, service water system sensitivity studies were performed using the results of several NUREG-1150 PRAs. These plants were chosen because their PRAs were already loaded into the IRRAS database and were therefore readily available for this type of study. PRATOOL contains a small subset of the functions provided by IRRAS. The user can select sets of parameters that will act as the bases for sensitivity analyses. Depending on the PRA used, the parameters selected can include systems, component types, trains, locations, failure modes, and basic events. It will also perform an AND-ing operation on selected parameters, which allows the user to define any number of possible combinations of parameters. The results of the four sensitivity studies indicate that SWS availability is important for BWRs but unimportant for PWRs. Both BWRs showed significant increases in CDF if their SWS failure rates are allowed to increase. From the results of the four sensitivity studies, it is interesting to note that no improvement made to any of the SWSs' failure rates would result in a significant decrease in CDF. 10 figs

  12. High Performance Multi-GPU SpMV for Multi-component PDE-Based Applications

    KAUST Repository

    Abdelfattah, Ahmad

    2015-07-25

    Leveraging optimization techniques (e.g., register blocking and double buffering) introduced in the context of KBLAS, a Level 2 BLAS high performance library on GPUs, the authors implement dense matrix-vector multiplications within a sparse-block structure. While these optimizations are important for high performance dense kernel executions, they are even more critical when dealing with sparse linear algebra operations. The most time-consuming phase of many multicomponent applications, such as models of reacting flows or petroleum reservoirs, is the solution at each implicit time step of large, sparse spatially structured or unstructured linear systems. The standard method is a preconditioned Krylov solver. The Sparse Matrix-Vector multiplication (SpMV) is, in turn, one of the most time-consuming operations in such solvers. Because there is no data reuse of the elements of the matrix within a single SpMV, kernel performance is limited by the speed at which data can be transferred from memory to registers, making the bus bandwidth the major bottleneck. On the other hand, in case of a multi-species model, the resulting Jacobian has a dense block structure. For contemporary petroleum reservoir simulations, the block size typically ranges from three to a few dozen among different models, and still larger blocks are relevant within adaptively model-refined regions of the domain, though generally the size of the blocks, related to the number of conserved species, is constant over large regions within a given model. This structure can be exploited beyond the convenience of a block compressed row data format, because it offers opportunities to hide the data motion with useful computations. The new SpMV kernel outperforms existing state-of-the-art implementations on single and multi-GPUs using matrices with dense block structure representative of porous media applications with both structured and unstructured multi-component grids.
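
    Block-sparse SpMV of the kind described operates on a block compressed sparse row layout. The sketch below is a CPU-side schematic only (none of the paper's GPU register-blocking or double-buffering), building a small random block matrix with scipy and checking its SpMV against a dense product; the block size and dimensions are arbitrary.

    import numpy as np
    from scipy.sparse import bsr_matrix
    from scipy.sparse import random as sparse_random

    rng = np.random.default_rng(5)

    # Jacobian-like block structure: 200 x 200 blocks of size 8 x 8, with 8
    # standing in for the number of conserved species per grid cell.
    b, nb = 8, 200
    mask = sparse_random(nb, nb, density=0.02, random_state=5, format="coo")
    dense = np.zeros((nb * b, nb * b))
    for i, j in zip(mask.row, mask.col):
        dense[i*b:(i+1)*b, j*b:(j+1)*b] = rng.normal(size=(b, b))

    A = bsr_matrix(dense, blocksize=(b, b))  # block CSR storage
    x = rng.normal(size=nb * b)

    y = A @ x                                # block-sparse SpMV
    assert np.allclose(y, dense @ x)
    print(f"{A.nnz} stored entries vs {dense.size} dense entries")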

  13. Numerical Investigation of the Thermal Management Performance of MEPCM Modules for PV Applications

    Directory of Open Access Journals (Sweden)

    Chao-Yang Huang

    2013-08-01

    Full Text Available The efficiency of photovoltaic modules decreases as the cell temperature increases. It is necessary to have an adequate thermal management mechanism for a photovoltaic module, especially when combined with a building construction system. This study aims to investigate via computational fluid dynamics simulations the heat transfer characteristics and thermal management performance of microencapsulated phase change material modules for photovoltaic applications under temporal variations of daily solar irradiation. The results show that the aspect ratio of the microencapsulated phase change material layer has significant effects on the heat transfer characteristics and the overall thermal performance of the two cases examined with different melting points (26 °C and 34 °C are approximately the same.

  14. Quantitative performance targets by using balanced scorecard system: application to waste management and public administration.

    Science.gov (United States)

    Mendes, Paula; Nunes, Luis Miguel; Teixeira, Margarida Ribau

    2014-09-01

    This article demonstrates how decision-makers can be guided in the process of defining performance target values in the balanced scorecard system. We apply a method based on sensitivity analysis with Monte Carlo simulation to the municipal solid waste management system in Loulé Municipality (Portugal). The method includes two steps: sensitivity analysis of performance indicators to identify those performance indicators with the highest impact on the balanced scorecard model outcomes; and sensitivity analysis of the target values for the previously identified performance indicators. Sensitivity analysis shows that four strategic objectives (IPP1: Comply with the national waste strategy; IPP4: Reduce nonrenewable resources and greenhouse gases; IPP5: Optimize the life-cycle of waste; and FP1: Meet and optimize the budget) alone contribute 99.7% of the variability in overall balanced scorecard value. Thus, these strategic objectives had a much stronger impact on the estimated balanced scorecard outcome than did others, with the IPP1 and the IPP4 accounting for over 55% and 22% of the variance in overall balanced scorecard value, respectively. The remaining performance indicators contribute only marginally. In addition, a change in the value of a single indicator's target value made the overall balanced scorecard value change by as much as 18%. This may lead to involuntarily biased decisions by organizations regarding performance target-setting, if not prevented with the help of methods such as that proposed and applied in this study. © The Author(s) 2014.
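
    The two-step sensitivity analysis can be mocked up compactly: sample indicator scores, aggregate them into an overall scorecard value with fixed weights, and attribute the output variance to each indicator. Everything below (the weights and distributions; the objective names are reused from the abstract) is invented for illustration and is not the article's actual scorecard model.

    import numpy as np

    rng = np.random.default_rng(6)

    # Invented strategic-objective scores (uniform on [0, 1]) and weights.
    names = ["IPP1", "IPP4", "IPP5", "FP1", "other"]
    weights = np.array([0.40, 0.25, 0.20, 0.10, 0.05])
    samples = rng.uniform(0.0, 1.0, size=(100_000, len(names)))

    bsc = samples @ weights  # overall balanced scorecard value per draw

    # With independent inputs and a linear model, first-order variance
    # contributions are w_i^2 * Var(x_i).
    contrib = weights**2 * samples.var(axis=0)
    for name, share in zip(names, contrib / contrib.sum()):
        print(f"{name:6s} {share:6.1%} of scorecard variance")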

  15. Challenges in Optimizing a Prostate Carcinoma Binding Peptide, Identified through the Phage Display Technology

    Directory of Open Access Journals (Sweden)

    Jürgen Debus

    2011-02-01

    Full Text Available The transfer of peptides identified through the phage display technology to clinical applications is difficult. Major drawbacks are metabolic degradation and label instability. The aim of our work is the optimization of DUP-1, a peptide which was identified by phage display to specifically target human prostate carcinoma. To investigate the influence of chelate conjugation, DOTA was coupled to DUP-1 and labeling was performed with 111In. To improve serum stability, cyclization of DUP-1 and targeted D-amino acid substitution were carried out. Alanine scanning was performed to identify the binding site and, based on the results, peptide fragments were chemically synthesized. The properties of the modified ligands were investigated in in vitro binding and competition assays. In vivo biodistribution studies were carried out in mice carrying human prostate tumors subcutaneously. DOTA conjugation resulted in different cellular binding kinetics, rapid in vivo renal clearance and increased tumor-to-organ ratios. Cyclization and D-amino acid substitution increased the metabolic stability but led to a decrease in binding affinity. Fragment investigation indicated that the sequence NRAQDY might be significant for target binding. Our results demonstrate the challenges in optimizing peptides identified through phage display libraries, and show that careful investigation of modified derivatives is necessary in order to improve their characteristics.

  16. Multimedia applications in differential services

    International Nuclear Information System (INIS)

    Mahfooz, S.; Merabti, M.; Pereira, R.

    2003-01-01

    In this paper we present a mechanism to provide Quality of Service (QoS) guarantees to different multimedia applications that share link bandwidth in an IP-based differential services domain. In this mechanism, weights are assigned to each application and to individual users according to their priorities. In order to evaluate the performance of our scheme we conducted simulations. The test data used portray different multimedia applications, i.e. MPEG-2 and IP telephony. The simulation results obtained show the effectiveness of our scheme for multimedia applications: it allocates a link share to each multimedia application and minimises end-to-end transmission delay by bringing it in line with the recommended standard acceptable transmission delay for multimedia applications. This paper also presents an extension to our proposed Relative Bandwidth Sharing (RBS) scheme for differential services. We have identified and highlighted the role of border routers and core routers in the differential services domain, and we explore features of Internet Protocol IPv6 in our architecture. (author)
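
    Weighted link sharing of this sort can be shown in a few lines: each application class receives bandwidth in proportion to its weight. This is a simplification for illustration, not the RBS scheme itself, and the weights and link capacity are invented.

    # Invented weights for application classes sharing a 100 Mbit/s link.
    LINK_MBPS = 100.0
    weights = {"MPEG-2 video": 5, "IP telephony": 3, "best effort": 2}

    def allocate(link_capacity, class_weights):
        """Proportional (weighted) share of the link for each traffic class."""
        total = sum(class_weights.values())
        return {name: link_capacity * w / total
                for name, w in class_weights.items()}

    for name, mbps in allocate(LINK_MBPS, weights).items():
        print(f"{name:14s} {mbps:5.1f} Mbit/s")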

  17. Identifying critical nitrogen application rate for maize yield and nitrate leaching in a Haplic Luvisol soil using the DNDC model.

    Science.gov (United States)

    Zhang, Yitao; Wang, Hongyuan; Liu, Shen; Lei, Qiuliang; Liu, Jian; He, Jianqiang; Zhai, Limei; Ren, Tianzhi; Liu, Hongbin

    2015-05-01

    Identification of the critical nitrogen (N) application rate can provide management support for ensuring grain yield and reducing the amount of nitrate leached to ground water. A five-year (2008-2012) field lysimeter (1 m × 2 m × 1.2 m) experiment with three N treatments (0, 180 and 240 kg N ha⁻¹) was conducted to quantify maize yields and the amount of nitrate leaching from a Haplic Luvisol soil in the North China Plain. The experimental data were used to calibrate and validate the process-based Denitrification-Decomposition (DNDC) model. After this, the model was used to simulate maize yield production and the amount of nitrate leaching under a series of N application rates and to identify the critical N application rate based on acceptable yield and amount of nitrate leaching for this cropping system. The results of model calibration and validation indicated that the model could correctly simulate maize yield and the amount of nitrate leaching, with satisfactory values of the RMSE-observation standard deviation ratio, model efficiency and determination coefficient. The model simulations confirmed the measurements that N application increased maize yield compared with the control, but that the high N rate (240 kg N ha⁻¹) did not produce more yield than the low one (120 kg N ha⁻¹), and that the amount of nitrate leaching increased with increasing N application rate. The simulation results suggested that the optimal N application rate lies in a range between 150 and 240 kg ha⁻¹, which would keep the amount of nitrate leaching below 18.4 kg NO₃⁻-N ha⁻¹ and meanwhile maintain an acceptable maize yield above 9410 kg ha⁻¹. Furthermore, 180 kg N ha⁻¹ produced the highest yield (9837 kg ha⁻¹) and a comparatively lower amount of nitrate leaching (10.0 kg NO₃⁻-N ha⁻¹). This study provides a valuable reference for determining the optimal N application rate (or range) in other crop systems and regions in China. Copyright © 2015 Elsevier B.V. All rights reserved.
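
    Identifying a critical rate from simulated response curves is a simple scan once yield and leaching are tabulated against N rate. The sketch below searches invented response curves for the window that keeps yield above a floor and leaching below a cap; the two thresholds echo the study's figures, but the curves themselves are made up.

    import numpy as np

    # Invented response curves over N application rates (kg N/ha).
    rates = np.arange(0, 301, 10)
    yield_kg = 9800.0 * (1.0 - np.exp(-rates / 80.0)) + 400.0  # saturating
    leach_kg = 2.0 + 0.0002 * rates**2                         # convex

    YIELD_FLOOR = 9410.0  # kg/ha, acceptable yield from the study
    LEACH_CAP = 18.4      # kg NO3-N/ha, leaching limit from the study

    ok = (yield_kg >= YIELD_FLOOR) & (leach_kg <= LEACH_CAP)
    window = rates[ok]
    print(f"acceptable N rates: {window.min()}-{window.max()} kg N/ha")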

  18. High-Performance MIM Capacitors for a Secondary Power Supply Application

    Directory of Open Access Journals (Sweden)

    Jiliang Mu

    2018-02-01

    Full Text Available Microstructure is important to the development of energy devices with high performance. In this work, a three-dimensional Si-based metal-insulator-metal (MIM) capacitor has been reported, which is fabricated by microelectromechanical systems (MEMS) technology. Area enlargement is achieved by forming deep trenches in a silicon substrate using the deep reactive ion etching method. The results indicate that an area of 2.45 × 10³ mm² can be realized in the deep trench structure with a high aspect ratio of 30:1. Subsequently, a dielectric Al2O3 layer and electrode W/TiN layers are deposited by atomic layer deposition. The obtained capacitor has superior performance, such as a high breakdown voltage (34.1 V), a moderate energy density (≥1.23 mJ/cm² per unit planar area), a high breakdown electric field (6.1 ± 0.1 MV/cm), a low leakage current (10⁻⁷ A/cm² at 22.5 V), and a low quadratic voltage coefficient of capacitance (VCC) (≤63.1 ppm/V²). In addition, the device's performance has been theoretically examined. The results show that the high energy supply and small leakage current can be attributed to the Poole–Frenkel emission in the high-field region and the trap-assisted tunneling in the low-field region. The reported capacitor has potential application as a secondary power supply.

  19. The comparison of the performance of two screening strategies identifying newly-diagnosed HIV during pregnancy.

    Science.gov (United States)

    Boer, Kees; Smit, Colette; van der Flier, Michiel; de Wolf, Frank

    2011-10-01

    In the Netherlands, a non-selective opt-out instead of a selective opt-in antenatal HIV screening strategy was implemented in 2004. In case of infection, screening was followed by prevention of mother-to-child-transmission (PMTCT). We compared the performance of the two strategies in terms of detection of new cases of HIV and vertical transmission. HIV-infected pregnant women were identified retrospectively from the Dutch HIV cohort ATHENA January 2000 to January 2008. Apart from demographic, virological and immunological data, the date of HIV infection in relation to the index pregnancy was established. Separately, all infants diagnosed with HIV born following implementation of the screening program were identified by a questionnaire via the paediatric HIV centres. 162/481 (33.7%) HIV-positive pregnant women were diagnosed with HIV before 2004 and 172/214 (80.3%) after January 2004. Multivariate analysis showed an 8-fold (95% confidence interval 5.47-11.87) increase in the odds of HIV detection during pregnancy after the national introduction of the opt-out strategy. Still, three children born during a 5-year period after July 2004 were infected due to de novo infection in pregnancy. Implementation of a nation-wide screening strategy based upon non-selective opt-out screening followed by effective PMTCT appeared to detect more HIV-infected women for the first time in pregnancy and to reduce vertical transmission of HIV substantially. Nonetheless, still few children are infected because of maternal infection after the first trimester. We propose the introduction of partner screening on HIV as part of the antenatal screening strategy.

  20. High Performance Fortran for Aerospace Applications

    National Research Council Canada - National Science Library

    Mehrotra, Piyush

    2000-01-01

    .... HPF is a set of Fortran extensions designed to provide users with a high-level interface for programming data parallel scientific applications while delegating to the compiler/runtime system the task...

  1. Combining UML2 Application and SystemC Platform Modelling for Performance Evaluation of Real-Time Embedded Systems

    Directory of Open Access Journals (Sweden)

    Qu Yang

    2008-01-01

    Full Text Available Future mobile devices will be based on heterogeneous multiprocessing platforms accommodating several stand-alone applications. The network-on-chip communication and device networking combine the design challenges of conventional distributed systems and resource-constrained real-time embedded systems. Interoperable design space exploration for both the application and platform development is required. The application designer needs abstract platform models to rapidly check the feasibility of a new feature or application. The platform designer needs abstract application models for defining platform computation and communication capacities. We propose a layered UML application/workload and SystemC platform modelling approach that allows the application and platform to be modelled at several levels of abstraction, which enables early performance evaluation of the resulting system. The overall approach has been experimented with in a mobile video player case study, while the different load extraction methods have been validated by applying them to earlier MPEG-4 encoder, Quake2 3D game, and MP3 decoder case studies.

  2. Improving Transactional Memory Performance for Irregular Applications

    OpenAIRE

    Pedrero, Manuel; Gutiérrez, Eladio; Romero, Sergio; Plata, Óscar

    2015-01-01

    Transactional memory (TM) offers optimistic concurrency support in modern multicore architectures, helping programmers to extract parallelism in irregular applications when data dependence information is not available before runtime. In fact, recent research focuses on exploiting thread-level parallelism using TM approaches. However, the proposed techniques are of general use, valid for any type of application. This work presents ReduxSTM, a software TM system specially d...

  3. Application of the Delphi technique in healthcare maintenance.

    Science.gov (United States)

    Njuangang, Stanley; Liyanage, Champika; Akintoye, Akintola

    2017-10-09

    Purpose The purpose of this paper is to examine the research design, issues and considerations in the application of the Delphi technique to identify, refine and rate the critical success factors and performance measures in maintenance-associated infections. Design/methodology/approach An in-depth literature review, through the application of open and axial coding, was used to formulate the interview and research questions. These were used to conduct an exploratory case study of two healthcare maintenance managers, randomly selected from two National Health Service Foundation Trusts in England. The results of the exploratory case study provided the rationale for the application of the Delphi technique in this research. The different processes in the application of the Delphi technique in healthcare research are examined thoroughly. Findings This research demonstrates the need to apply and integrate different research methods to enhance the validity of the Delphi technique. The rationale for the application of the Delphi technique in this research is that some healthcare maintenance managers lack knowledge about basic infection control (IC) principles to make hospitals safe for patient care. The result of the first round of the Delphi exercise is a useful contribution in its own right. It identified a number of salient issues and differences in the opinions of the Delphi participants, most noticeably between healthcare maintenance managers and members of the infection control team. It also resulted in useful suggestions and comments to improve the quality and presentation of the second- and third-round Delphi instruments. Practical implications This research provides a research methodology that can be adopted by researchers investigating new and emerging issues in the healthcare sector. As this research demonstrates, the Delphi technique is relevant in soliciting expert knowledge and opinion to identify performance measures to control maintenance-associated infections in

  4. Performing surgery: commonalities with performers outside medicine

    Directory of Open Access Journals (Sweden)

    Roger Lister Kneebone

    2016-08-01

    Full Text Available This paper argues for the inclusion of surgery within the canon of performance science. The world of medicine presents rich, complex but relatively under-researched sites of performance. Performative aspects of clinical practice are overshadowed by a focus on the processes and outcomes of medical care, such as diagnostic accuracy and the results of treatment. The primacy of this ‘clinical’ viewpoint - framed by clinical professionals as the application of medical knowledge - hides resonances with performance in other domains. Yet the language of performance is embedded in the culture of surgery - surgeons ‘perform’ operations, work in an operating ‘theatre’ and use ‘instruments’. This paper asks what might come into view if we take this performative language at face value and interrogate surgery from the perspective of performance science. It addresses the following questions: (1) To what extent and in what ways can surgical practice (both consultation and operation) be considered as performance? (2) How does comparison with two domains of non-surgical performance (close-up magic and puppetry) illuminate understanding of surgical practice as performance? (3) In what ways might including surgery within the canon of performance studies enrich the field of performance science? Two detailed case studies over 5 years with magicians (71.5 hours contact time) and puppeteers (50.5 hours contact time) identified performative aspects of surgical practice from the perspectives of professionals (as individuals or in groups) and audiences. Physical simulation provided a means for non-clinicians to access and experience elements of the surgical world, acting as a prompt for discussion. Thematic analysis was used to establish themes and sub-themes. Key themes were: (1) clinical consultation can be viewed as ‘close-up live performance with a very small audience’ and (2) operative surgery can be viewed as ‘reading bodies within a dextrous team

  5. Development of Magneto-Resistive Angular Position Sensors for Space Applications

    Science.gov (United States)

    Hahn, Robert; Langendorf, Sven; Seifart, Klaus; Slatter, Rolf; Olberts, Bastian; Romera, Fernando

    2015-09-01

    Magnetic microsystems in the form of magneto-resistive (MR) sensors are firmly established in automobiles and industrial applications. They measure path, angle, electrical current, or magnetic fields. MR technology opens up new sensor possibilities in space applications and can be an enabling technology for optimal performance, high robustness and long lifetime at reasonable cost. In a recent assessment study performed by HTS GmbH and Sensitec GmbH under ESA contract, a market survey confirmed that the space industry has a very high interest in novel, contactless position sensors based on MR technology. A detailed development stage is now being pursued to advance the sensor design to Engineering Qualification Model (EQM) level and to perform qualification testing for a representative pilot space application. The paper briefly reviews the basics of magneto-resistive effects and possible sensor applications and describes the key benefits of MR angular sensors with reference to currently operational industrial and space applications. The results of the assessment study are presented and potential applications and uses of contactless magneto-resistive angular sensors for spacecraft are identified. The baseline mechanical and electrical sensor design is discussed. An outlook on the EQM development and qualification tests is provided.

  6. Imaging-Based Screen Identifies Laminin 411 as a Physiologically Relevant Niche Factor with Importance for i-Hep Applications

    Directory of Open Access Journals (Sweden)

    John Ong

    2018-03-01

    Full Text Available Summary: Use of hepatocytes derived from induced pluripotent stem cells (i-Heps) is limited by their functional differences in comparison with primary cells. Extracellular niche factors likely play a critical role in bridging this gap. Using image-based characterization (high content analysis; HCA) of freshly isolated hepatocytes from 17 human donors, we devised and validated an algorithm (Hepatocyte Likeness Index; HLI) for comparing the hepatic properties of cells against a physiological gold standard. The HLI was then applied in a targeted screen of extracellular niche factors to identify substrates driving i-Heps closer to the standard. Laminin 411, the top hit, was validated in two additional induced pluripotent stem cell (iPSC) lines, primary tissue, and an in vitro model of α1-antitrypsin deficiency. Cumulatively, these data provide a reference method to control and screen for i-Hep differentiation, identify Laminin 411 as a key niche protein, and underscore the importance of combining substrates, soluble factors, and HCA when developing iPSC applications. Rashid and colleagues demonstrate the utility of a high-throughput imaging platform for identification of physiologically relevant extracellular niche factors to advance i-Heps closer to their primary tissue counterparts. The extracellular matrix (ECM) protein screen identified Laminin 411 as an important niche factor facilitating i-Hep-based disease modeling in vitro. Keywords: iPS hepatocytes, extracellular niche, image-based screening, disease modeling, laminin

  7. Construction Project Performance Improvement through Radio Frequency Identification Technology Application on a Project Supply Chain

    Science.gov (United States)

    Wang, Heng

    2017-01-01

    Construction project productivity typically lags that of other industries, and it has been the focus of numerous studies aimed at improving project performance. This research investigated the application of Radio Frequency Identification (RFID) technology on construction projects' supply chains and determined that RFID technology can improve the…

  8. Applications Performance Under MPL and MPI on NAS IBM SP2

    Science.gov (United States)

    Saini, Subhash; Simon, Horst D.; Lasinski, T. A. (Technical Monitor)

    1994-01-01

    On July 5, 1994, an IBM Scalable POWER parallel System (IBM SP2) with 64 nodes was installed at the Numerical Aerodynamic Simulation (NAS) Facility. Each node of the NAS IBM SP2 is a "wide node" consisting of a RISC 6000/590 workstation module with a 66.5 MHz clock which can perform four floating point operations per clock, for a peak performance of 266 Mflop/s. By the end of 1994, the 64 nodes of the IBM SP2 will be upgraded to 160 nodes with a peak performance of 42.5 Gflop/s. An overview of the IBM SP2 hardware is presented. A basic understanding of the architectural details of the RS 6000/590 will help application scientists in porting, optimizing, and tuning codes from other machines such as the CRAY C90 and the Paragon to the NAS SP2. Optimization techniques such as quad-word loading, effective utilization of the two floating point units, and data cache optimization on the RS 6000/590 are illustrated, with examples giving performance gains at each optimization step. The conversion of codes using Intel's message passing library NX to codes using the native Message Passing Library (MPL) and the Message Passing Interface (MPI) library available on the IBM SP2 is illustrated. In particular, we present the performance of the Fast Fourier Transform (FFT) kernel from the NAS Parallel Benchmarks (NPB) under MPL and MPI. We have also optimized some of the Fortran BLAS 2 and BLAS 3 routines; e.g., the optimized Fortran DAXPY runs at 175 Mflop/s and the optimized Fortran DGEMM at 230 Mflop/s per node. The performance of the NPB (Class B) on the IBM SP2 is compared with the CRAY C90, Intel Paragon, TMC CM-5E, and the CRAY T3D.
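
    The per-node peak rate quoted above is simple arithmetic (clock rate times floating point operations per cycle); a small Python check, using only numbers from the abstract:

        clock_mhz = 66.5        # RS 6000/590 clock
        flops_per_clock = 4     # four floating point operations per clock
        peak = clock_mhz * flops_per_clock   # 266 Mflop/s per node
        for name, rate in [("DAXPY", 175.0), ("DGEMM", 230.0)]:
            print(f"{name}: {rate:.0f} Mflop/s = {100 * rate / peak:.0f}% of peak")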

  9. 9975 Shipping Package Performance Of Alternate Materials For Long-Term Storage Application

    International Nuclear Information System (INIS)

    Skidmore, E.; Hoffman, E.; Daugherty, W.

    2010-01-01

    The Model 9975 shipping package specifies the materials of construction for its various components. With the loss of availability of material for two components (cane fiberboard overpack and Viton® GLT O-rings), alternate materials of construction were identified and approved for use for transport (softwood fiberboard and Viton® GLT-S O-rings). As these shipping packages are part of a long-term storage configuration at the Savannah River Site, additional testing is in progress to verify satisfactory long-term performance of the alternate materials under storage conditions. The test results to date can be compared with corresponding results for the original materials of construction to draw preliminary conclusions on the performance of the replacement materials.

  10. Profiling an application for power consumption during execution on a compute node

    Science.gov (United States)

    Archer, Charles J; Blocksome, Michael A; Peters, Amanda E; Ratterman, Joseph D; Smith, Brian E

    2013-09-17

    Methods, apparatus, and products are disclosed for profiling an application for power consumption during execution on a compute node that include: receiving an application for execution on a compute node; identifying a hardware power consumption profile for the compute node, the hardware power consumption profile specifying power consumption for compute node hardware during performance of various processing operations; determining a power consumption profile for the application in dependence upon the application and the hardware power consumption profile for the compute node; and reporting the power consumption profile for the application.
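
    A minimal sketch of the idea, assuming a hardware profile expressed as watts per operation class and an application characterised by its time share in each class (all names and numbers here are illustrative, not from the patent):

        hardware_profile = {"compute": 95.0, "memory": 60.0, "network": 45.0, "idle": 25.0}

        def application_power_profile(time_share, hw=hardware_profile):
            # time_share: fraction of runtime the application spends in each class
            profile = {op: hw[op] * frac for op, frac in time_share.items()}
            profile["average_watts"] = sum(profile.values())
            return profile

        print(application_power_profile({"compute": 0.60, "memory": 0.25, "network": 0.10, "idle": 0.05}))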

  11. 3D printed high performance strain sensors for high temperature applications

    Science.gov (United States)

    Rahman, Md Taibur; Moser, Russell; Zbib, Hussein M.; Ramana, C. V.; Panat, Rahul

    2018-01-01

    Realization of high temperature physical measurement sensors, which are needed in many of the current and emerging technologies, is challenging due to the degradation of their electrical stability by drift currents, material oxidation, thermal strain, and creep. In this paper, for the first time, we demonstrate that 3D printed sensors show a metamaterial-like behavior, resulting in superior performance such as high sensitivity, low thermal strain, and enhanced thermal stability. The sensors were fabricated using silver (Ag) nanoparticles (NPs), using an advanced Aerosol Jet based additive printing method followed by thermal sintering. The sensors were tested under cyclic strain up to a temperature of 500 °C and showed a gauge factor of 3.15 ± 0.086, about 57% higher than that of commercially available gages. The sensor thermal strain was also an order of magnitude lower than that of commercial gages for operation up to a temperature of 500 °C. An analytical model was developed to account for the enhanced performance of such printed sensors based on enhanced lateral contraction of the NP films due to the porosity, a behavior akin to cellular metamaterials. The results demonstrate the potential of 3D printing technology as a pathway to realize highly stable and high-performance sensors for high temperature applications.
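
    The reported gauge factor follows the usual definition GF = (ΔR/R0)/ε; a quick Python check with illustrative resistance values:

        def gauge_factor(r0_ohm, r_ohm, strain):
            # GF = relative resistance change divided by applied strain
            return (r_ohm - r0_ohm) / r0_ohm / strain

        # Illustrative: 1000 microstrain producing a 0.315% resistance change
        print(gauge_factor(100.0, 100.315, 1.0e-3))  # ~3.15, the value reported above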

  12. Performance Investigation of an Exhaust Thermoelectric Generator for Military SUV Application

    Directory of Open Access Journals (Sweden)

    Rui Quan

    2018-01-01

    Full Text Available To analyze thermoelectric power generation for sports utility vehicle (SUV) application, a novel thermoelectric generator (TEG) based on low-temperature Bi2Te3 thermoelectric modules (TEMs) and a chaos-shaped brass heat exchanger is constructed. The temperature distribution of the TEG is analyzed on an experimental setup, and a temperature uniformity optimization method of 'chipping the peak to fill the valley' is applied to validate the improved output power. An automobile exhaust thermoelectric generator (AETEG) using four TEGs connected thermally in parallel and electrically in series is assembled into a prototype military SUV; its temperature distribution, output voltage, output power, system efficiency, inner resistance, and backpressure are analyzed, and several important factors influencing its output performance, such as vehicle speed, clamping pressure, engine coolant flow rate, and ambient temperature, are tested. Experimental results demonstrate that higher vehicle speed, larger clamping pressure, faster engine coolant flow rate and lower ambient temperature all enhance the overall output performance, though the effects of ambient temperature and coolant flow rate are less significant. The maximum output power of the AETEG is 646.26 W, the corresponding conversion efficiency is 1.03%, and the induced backpressure rises from 1681 Pa to 1807 Pa at the highest vehicle speed of 125 km/h.
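
    The quoted conversion efficiency is defined as electrical output divided by exhaust heat input, so the heat flow through the exchanger can be back-calculated from the two figures given above:

        p_out_w = 646.26          # maximum electrical output from the abstract
        efficiency = 0.0103       # corresponding conversion efficiency
        q_in_w = p_out_w / efficiency
        print(f"implied exhaust heat input ~ {q_in_w / 1000:.1f} kW")  # ~62.7 kW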

  13. Almost-sure identifiability of multidimensional harmonic retrieval

    NARCIS (Netherlands)

    Jiang, T; Sidiropoulos, ND; ten Berge, JMF

    Two-dimensional (2-D) and, more generally, multidimensional harmonic retrieval is of interest in a variety of applications, including transmitter localization and joint time and frequency offset estimation in wireless communications. The associated identifiability problem is key in understanding the

  14. Performance of the modified DREAMS ion source for ³⁶Cl applications

    Energy Technology Data Exchange (ETDEWEB)

    Pavetich, Stefan; Akhmadaliev, Shavkat; Merchel, Silke; Rugel, Georg; Ziegenruecker, Rene [Helmholtz-Zentrum Dresden-Rossendorf (Germany)

    2014-07-01

    First analyses of real ³⁶Cl AMS samples were performed with the newly developed low-memory-effect ion source at the DREsden Accelerator Mass Spectrometry (DREAMS) facility. Considerable improvements have been reached with respect to the overall ion source performance. In particular, parameters like current output, ion source fractionation effects, normalization factors, blank values and sulphur suppression factors have been investigated to enhance the accuracy of ³⁶Cl data. Applications cover a wide spectrum, which implies highly variable ³⁶Cl/³⁵⁺³⁷Cl ratios, ranging from nearly background level of ~10⁻¹⁵ up to 10⁻¹⁰. Samples from aquifers in arid regions were analysed for groundwater dating and modelling. Meteorite samples were measured to investigate the constancy of the galactic cosmic radiation, production rates from sulphur, and to reconstruct the exposure histories of individual meteorites.

  15. Application of Entropy-Based Metrics to Identify Emotional Distress from Electroencephalographic Recordings

    Directory of Open Access Journals (Sweden)

    Beatriz García-Martínez

    2016-06-01

    Full Text Available Recognition of emotions is still an unresolved challenge, which could be helpful to improve current human-machine interfaces. Recently, nonlinear analysis of some physiological signals has been shown to play a more relevant role in this context than traditional linear exploration. Thus, the present work introduces for the first time the application of three recent entropy-based metrics: sample entropy (SE), quadratic SE (QSE) and distribution entropy (DE), to discern between emotional states of calm and negative stress (also called distress). In the last few years, distress has received growing attention because it is a common negative factor in the modern lifestyle of people in developed countries and, moreover, it may lead to serious mental and physical health problems. Specifically, 279 segments of 32-channel electroencephalographic (EEG) recordings from 32 subjects elicited to be calm or negatively stressed have been analyzed. Results show that QSE is the first single metric presented to date with the ability to identify negative stress. Indeed, this metric reported a discriminant ability of around 70%, which is only slightly lower than that obtained in some previous works. Nonetheless, those discriminant models relied on dozens or even hundreds of features and advanced classifiers to yield diagnostic accuracies of about 80%. Moreover, in agreement with previous neuroanatomy findings, QSE also revealed notable differences across all brain regions in the neural activation triggered by the two considered emotions. Consequently, given these results, as well as the easy interpretation of QSE, this work opens a new standpoint in the detection of emotional distress, which may yield new insights into the brain's behavior under this negative emotion.
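
    Of the three metrics, sample entropy is the base quantity (quadratic SE is commonly defined as SampEn plus a ln(2r) correction). A simplified Python sketch of SampEn for a single EEG channel, not the authors' code:

        import numpy as np

        def sample_entropy(x, m=2, r=0.2):
            # SampEn(m, r): tolerance r is scaled by the signal's standard
            # deviation; self-matches are excluded. Simplified for brevity.
            x = np.asarray(x, dtype=float)
            tol = r * x.std()
            def matches(length):
                t = np.array([x[i:i + length] for i in range(len(x) - length + 1)])
                return sum(int(np.sum(np.max(np.abs(t - ti), axis=1) <= tol) - 1) for ti in t)
            b, a = matches(m), matches(m + 1)
            return -np.log(a / b) if a > 0 and b > 0 else np.inf

        rng = np.random.default_rng(0)
        print(sample_entropy(rng.standard_normal(500)))  # irregular signal -> high SampEn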

  16. Performance Validity Testing in Neuropsychology: Scientific Basis and Clinical Application-A Brief Review.

    Science.gov (United States)

    Greher, Michael R; Wodushek, Thomas R

    2017-03-01

    Performance validity testing refers to neuropsychologists' methodology for determining whether neuropsychological test performances completed in the course of an evaluation are valid (ie, the results of true neurocognitive function) or invalid (ie, overly impacted by the patient's effort/engagement in testing). This determination relies upon the use of either standalone tests designed for this sole purpose, or specific scores/indicators embedded within traditional neuropsychological measures that have demonstrated this utility. In response to a greater appreciation for the critical role that performance validity issues play in neuropsychological testing and the need to measure this variable to the best of our ability, the scientific base for performance validity testing has expanded greatly over the last 20 to 30 years. As such, the majority of current day neuropsychologists in the United States use a variety of measures for the purpose of performance validity testing as part of everyday forensic and clinical practice and address this issue directly in their evaluations. The following is the first article of a 2-part series that will address the evolution of performance validity testing in the field of neuropsychology, both in terms of the science as well as the clinical application of this measurement technique. The second article of this series will review performance validity tests in terms of methods for development of these measures, and maximizing of diagnostic accuracy.

  17. Automated Cache Performance Analysis And Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Mohror, Kathryn [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2013-12-23

    While there is no lack of performance counter tools for coarse-grained measurement of cache activity, there is a critical lack of tools for relating data layout to cache behavior to application performance. Generally, any nontrivial optimizations are either not done at all, or are done "by hand", requiring significant time and expertise. To the best of our knowledge no tool available to users measures the latency of memory reference instructions for particular addresses and makes this information available to users in an easy-to-use and intuitive way. In this project, we worked to enable the Open|SpeedShop performance analysis tool to gather memory reference latency information for specific instructions and memory addresses, and to gather and display this information in an easy-to-use and intuitive way to aid performance analysts in identifying problematic data structures in their codes. This tool was primarily designed for use in the supercomputer domain as well as grid, cluster, cloud-based parallel e-commerce, and engineering systems and middleware. Ultimately, we envision a tool to automate optimization of application cache layout and utilization in the Open|SpeedShop performance analysis tool. To commercialize this software, we worked to develop core capabilities for gathering enhanced memory usage performance data from applications and to create and apply novel methods for automatic data structure layout optimizations, tailoring the overall approach to support existing supercomputer and cluster programming models and constraints. In this Phase I project, we focused on the infrastructure necessary to gather performance data and present it in an intuitive way to users. With the advent of enhanced Precise Event-Based Sampling (PEBS) counters on recent Intel processor architectures and equivalent technology on AMD processors, we are now in a position to access memory reference information for particular addresses. Prior to the introduction of PEBS counters

  18. Possible applications for municipal solid waste fly ash.

    Science.gov (United States)

    Ferreira, C; Ribeiro, A; Ottosen, L

    2003-01-31

    The present study focuses on existing practices related to the reuse of Municipal Solid Waste (MSW) fly ash and identifies new potential uses. Nine possible applications were identified and grouped into four main categories: construction materials (cement, concrete, ceramics, glass and glass-ceramics); geotechnical applications (road pavement, embankments); "agriculture" (soil amendment); and, miscellaneous (sorbent, sludge conditioning). Each application is analysed in detail, including final-product technical characteristics, with a special emphasis on environmental impacts. A comparative analysis of the different options is performed, stressing the advantages but also the weaknesses of each option. This information is systemized in order to provide a framework for the selection of best technology and final products. The results presented here show new possibilities for this waste reuse in a short-term, in a wide range of fields, resulting in great advantages in waste minimization as well as resources conservation.

  19. Instrument validation system of general application

    International Nuclear Information System (INIS)

    Filshtein, E.L.

    1990-01-01

    This paper describes the Instrument Validation System (IVS), a software system which has the capability of evaluating the performance of a set of functionally related instrument channels to identify failed instruments and to quantify instrument drift. Under funding from Combustion Engineering (C-E), the IVS has been developed to the extent that a computer program exists whose use has been demonstrated. The initial development work shows promise for success and for wide application, not only to power plants, but also to industrial manufacturing and process control. Applications in the aerospace and military sector are also likely.
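
    A minimal sketch of the two tasks named above (a generic consistency check, not the C-E algorithm): flag a channel that disagrees with its redundant partners, and quantify drift as a trend over time. Thresholds and readings are illustrative.

        import numpy as np

        def suspect_channels(readings, max_dev=2.0):
            # flag channels deviating from the median of the redundant set
            med = np.median(readings)
            return [i for i, v in enumerate(readings) if abs(v - med) > max_dev]

        def drift_rate(times_h, values):
            # least-squares slope: engineering units per hour
            return np.polyfit(times_h, values, 1)[0]

        print(suspect_channels([500.2, 499.8, 500.1, 493.6]))              # -> [3]
        print(f"{drift_rate([0, 24, 48, 72], [500.0, 500.3, 500.7, 501.0]):.4f} units/h")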

  20. Salt site performance assessment activities

    International Nuclear Information System (INIS)

    Kircher, J.F.; Gupta, S.K.

    1983-01-01

    During this year the tools (codes) for performance assessments of potential salt sites were tentatively selected and documented; the emphasis has shifted from code development to applications. During this period prior to detailed characterization of a salt site, the focus is on bounding calculations and sensitivity analyses with the data available. The development and application of improved methods for sensitivity and uncertainty analysis is a focus of the coming year's activities and the subject of a following paper in these proceedings. Although the assessments to date are preliminary and based on admittedly scant data, the results indicate that suitable salt sites can be identified and repository subsystems designed which will meet the established criteria for protecting the health and safety of the public. 36 references, 5 figures, 2 tables

  1. Progress of the new CSIRO-GEMOC nuclear microprobe: first results, performance and recent applications

    International Nuclear Information System (INIS)

    Ryan, C.G.; Cripps, G.; Sie, S.H.; Suter, G.F.; Jamieson, D.N.; Griffin, W.L.; Commonwealth Scientific and Industrial Research Organisation

    1999-01-01

    The new CSIRO-GEMOC Nuclear Microprobe (NMP) features a number of technical advances for high resolution, high sensitivity microanalysis. It was designed at the CSIRO and developed as a collaboration between the CSIRO, the GEMOC key-centre at Macquarie University and the MARC group of the University of Melbourne. For imaging applications, it also features a software system using a powerful algorithm called Dynamic Analysis, developed at the CSIRO for unmixing elemental signatures in proton induced X-ray emission (PIXE) data, to provide a tool for rapid quantitative imaging of trace and major element spatial distribution in minerals. This paper reports on the performance of the NMP and gives examples of its application over the six months since completion.

  2. Socio-metrics: Identifying Invisible Deviant Adversaries

    Science.gov (United States)

    2015-12-07

    like MySQL, since it has been found that graph databases work well on highly connected data (which is the case with OSNs). Moreover, the performance...web application. We also used the Java Vaadin framework, which is built upon the Google Web Toolkit (GWT), since it supports rapid application...development, provides the ability to build professional UIs and also scales well. Compared to the Java Web framework, it has been observed that Vaadin has shown

  3. Identifying Opportunities for Exploiting Cross-Layer Interactions in Adaptive Wireless Systems

    Directory of Open Access Journals (Sweden)

    Troy Weingart

    2007-01-01

    Full Text Available The flexibility of cognitive and software-defined radio heralds an opportunity for researchers to reexamine how network protocol layers operate with respect to providing quality-of-service-aware transmission among wireless nodes. This opportunity is enhanced by the continued development of spectrally responsive devices - ones that can detect and respond to changes in the radio frequency environment. Present wireless network protocols define reliability and other performance-related tasks narrowly within layers. For example, the frame size employed on 802.11 can substantially influence the throughput, delay, and jitter experienced by an application, but there is no simple way to adapt this parameter. Furthermore, while the data link layer of 802.11 provides error detection capabilities across a link, it does not specify additional features, such as forward error correction schemes, nor does it provide a means for throttling retransmissions at the transport layer (currently, the data link and transport layers can function counterproductively with respect to reliability). This paper presents an analysis of the interaction of physical, data link, and network layer parameters with respect to throughput, bit error rate, delay, and jitter. The goal of this analysis is to identify opportunities where system designers might exploit cross-layer interactions to improve the performance of Voice over IP (VoIP), instant messaging (IM), and file transfer applications.
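
    The frame-size example lends itself to a toy model: with independent bit errors, a frame of b bits survives with probability (1 - BER)^b, so payload efficiency trades off against corruption risk. The parameters below are illustrative, not measurements from the paper.

        def goodput_efficiency(payload_bytes, header_bytes=34, ber=1e-5):
            bits = 8 * (payload_bytes + header_bytes)
            p_ok = (1.0 - ber) ** bits                       # frame survives intact
            return payload_bytes / (payload_bytes + header_bytes) * p_ok

        for size in (256, 512, 1024, 1500, 2304):
            print(f"{size:5d} B payload -> efficiency {goodput_efficiency(size):.3f}")

    Scanning the output shows an interior optimum that shifts with the bit error rate, which is exactly the kind of cross-layer adaptation opportunity the paper identifies.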

  4. What predicts performance during clinical psychology training?

    OpenAIRE

    Scior, Katrina; Bradley, Caroline E; Potts, Henry W W; Woolf, Katherine; de C Williams, Amanda C

    2013-01-01

    Objectives While the question of who is likely to be selected for clinical psychology training has been studied, evidence on performance during training is scant. This study explored data from seven consecutive intakes of the UK's largest clinical psychology training course, aiming to identify what factors predict better or poorer outcomes. Design Longitudinal cross-sectional study using prospective and retrospective data. Method Characteristics at application were analysed in relation to a r...

  5. Thermodynamic analysis and performance assessment of an integrated heat pump system for district heating applications

    International Nuclear Information System (INIS)

    Soltani, Reza; Dincer, Ibrahim; Rosen, Marc A.

    2015-01-01

    A Rankine cycle-driven heat pump system is modeled for district heating applications with superheated steam and hot water as products. Energy and exergy analyses are performed, followed by parametric studies to determine the effects of varying operating conditions and environmental parameters on the system performance. The district heating section is observed to be the most inefficient part of the system, exhibiting a relative irreversibility of almost 65%, followed by the steam evaporator and the condenser, with relative irreversibilities of about 18% and 9%, respectively. The ambient temperature is observed to have a significant influence on the overall system exergy destruction: as the ambient temperature decreases, the system exergy efficiency increases. The electricity generated can increase the system exergy efficiency at the expense of a high refrigerant mass flow rate, mainly because the available heat source is low-quality waste heat. For instance, adding 2 MW of excess electricity on top of the targeted 6 MW of product heat increases the refrigerant mass flow rate from 12 kg/s (heat only) to 78 kg/s (heat and electricity), while the production of 8 MW of product heat (the same total output, but entirely as heat) requires a refrigerant mass flow rate of only 16 kg/s. - Highlights: • A new integrated heat pump system is developed for district heating applications. • An analysis and assessment study is undertaken through exergy analysis methodology. • A comparative efficiency evaluation is performed for practical applications. • A parametric study is conducted to investigate how varying operating conditions and state properties affect energy and exergy efficiencies.

  6. Improved performance of high average power semiconductor arrays for applications in diode pumped solid state lasers

    International Nuclear Information System (INIS)

    Beach, R.; Emanuel, M.; Benett, W.; Freitas, B.; Ciarlo, D.; Carlson, N.; Sutton, S.; Skidmore, J.; Solarz, R.

    1994-01-01

    The average power performance capability of semiconductor diode laser arrays has improved dramatically over the past several years. These performance improvements, combined with cost reductions pursued by LLNL and others in the fabrication and packaging of diode lasers, have continued to reduce the price per average watt of laser diode radiation. Presently, we are at the point where the manufacturers of commercial high average power solid state laser systems used in material processing applications can now seriously consider the replacement of their flashlamp pumps with laser diode pump sources. Additionally, a low cost technique developed and demonstrated at LLNL for optically conditioning the output radiation of diode laser arrays has enabled a new and scalable average power diode-end-pumping architecture that can be simply implemented in diode pumped solid state laser systems (DPSSL's). This development allows the high average power DPSSL designer to look beyond the Nd ion for the first time. Along with high average power DPSSL's which are appropriate for material processing applications, low and intermediate average power DPSSL's are now realizable at low enough costs to be attractive for use in many medical, electronic, and lithographic applications

  7. Performance Portability for Unstructured Mesh Physics

    Energy Technology Data Exchange (ETDEWEB)

    Keasler, J A

    2012-03-23

    ASC legacy software must be ported to emerging hardware architectures. This paper notes that many programming models used by DOE applications are similar, and suggests that constructing a common terminology across these models could reveal a performance portable programming model. The paper then highlights how the LULESH mini-app is used to explore new programming models with outside solution providers. Finally, we suggest better tools to identify parallelism in software, and give suggestions for enhancing the co-design process with vendors.

  8. Considerations in the development of subsurface containment barrier performance standards

    International Nuclear Information System (INIS)

    Dunstan, S.; Zdinak, A.P.; Lodman, D.

    1997-01-01

    The U.S. Department of Energy (DOE) is supporting subsurface barriers as an alternative remedial option for management of contamination problems at their facilities. Past cleanup initiatives have sometimes proven ineffective or extremely expensive. Economic considerations coupled with changing public and regulatory philosophies regarding remediation techniques make subsurface barriers a promising technology for future cleanup efforts. As part of the initiative to develop subsurface containment barriers as an alternative remedial option, DOE funded MSE Technology Applications, Inc. (MSE) to conduct a comprehensive review to identify performance considerations for the acceptability of subsurface barrier technologies as a containment method. Findings from this evaluation were intended to provide a basis for selection and application of containment technologies to address waste problems at DOE sites. Based on this study, the development of performance standards should consider: (1) sustainable low hydraulic conductivity; (2) capability to meet applicable regulations; (3) compatibility with subsurface environmental conditions; (4) durability and long-term stability; (5) repairability; and (6) verification and monitoring. This paper describes the approach for determining considerations for performance standards

  9. Performances of an atmospheric tritium sampler and its application

    International Nuclear Information System (INIS)

    Inoue, Yoshikazu; Kahn, B.; Carter, M.W.

    1983-01-01

    A sampling system for atmospheric tritium in the form of water vapor, hydrogen and hydrocarbons was designed and built. The air was passed first through a molecular sieve, which adsorbed water vapor, then over a palladium catalyst, which oxidized hydrogen and adsorbed the resulting water in situ, and finally over a hot Hopcalite catalyst, which oxidized hydrocarbons; the resulting water was adsorbed on a following molecular sieve column. Three water samples were extracted from the adsorbers and their tritium contents were measured by liquid scintillation counting. The performance of this sampler was examined for retrieval of tritiated water from the molecular sieve, oxidation of hydrogen on the palladium catalyst and oxidation of methane on Hopcalite. The portable sampler was applied to analyze tritium in the duct air of a heavy water moderated research reactor. More than 99% of the total tritium was in vapor form. Trace amounts of tritiated hydrogen and hydrocarbons were also detected. This tritium sampler is applicable to detecting all forms of atmospheric tritium at levels around ten times ambient. (author)

  10. Protein Correlation Profiles Identify Lipid Droplet Proteins with High Confidence*

    Science.gov (United States)

    Krahmer, Natalie; Hilger, Maximiliane; Kory, Nora; Wilfling, Florian; Stoehr, Gabriele; Mann, Matthias; Farese, Robert V.; Walther, Tobias C.

    2013-01-01

    Lipid droplets (LDs) are important organelles in energy metabolism and lipid storage. Their cores are composed of neutral lipids that form a hydrophobic phase and are surrounded by a phospholipid monolayer that harbors specific proteins. Most well-established LD proteins perform important functions, particularly in cellular lipid metabolism. Morphological studies show LDs in close proximity to and interacting with membrane-bound cellular organelles, including the endoplasmic reticulum, mitochondria, peroxisomes, and endosomes. Because of these close associations, it is difficult to purify LDs to homogeneity. Consequently, the confident identification of bona fide LD proteins via proteomics has been challenging. Here, we report a methodology for LD protein identification based on mass spectrometry and protein correlation profiles. Using LD purification and quantitative, high-resolution mass spectrometry, we identified LD proteins by correlating their purification profiles to those of known LD proteins. Application of the protein correlation profile strategy to LDs isolated from Drosophila S2 cells led to the identification of 111 LD proteins in a cellular LD fraction in which 1481 proteins were detected. LD localization was confirmed in a subset of identified proteins via microscopy of the expressed proteins, thereby validating the approach. Among the identified LD proteins were both well-characterized LD proteins and proteins not previously known to be localized to LDs. Our method provides a high-confidence LD proteome of Drosophila cells and a novel approach that can be applied to identify LD proteins of other cell types and tissues. PMID:23319140
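
    The core of the protein correlation profile strategy can be sketched in a few lines: score each detected protein by the correlation of its abundance across purification fractions with the consensus profile of known LD proteins. The profiles below are invented for illustration.

        import numpy as np

        known_ld = np.array([[0.05, 0.10, 0.25, 0.60],   # abundance across 4 fractions
                             [0.08, 0.12, 0.30, 0.50]])
        reference = known_ld.mean(axis=0)                # consensus LD profile

        def ld_score(profile):
            return np.corrcoef(profile, reference)[0, 1]

        print(ld_score([0.06, 0.11, 0.28, 0.55]))   # near 1: candidate LD protein
        print(ld_score([0.70, 0.20, 0.07, 0.03]))   # negative: co-purifying contaminant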

  11. Application of the Pareto principle to identify and address drug-therapy safety issues.

    Science.gov (United States)

    Müller, Fabian; Dormann, Harald; Pfistermeister, Barbara; Sonst, Anja; Patapovas, Andrius; Vogler, Renate; Hartmann, Nina; Plank-Kiegele, Bettina; Kirchner, Melanie; Bürkle, Thomas; Maas, Renke

    2014-06-01

    Adverse drug events (ADE) and medication errors (ME) are common causes of morbidity in patients presenting at emergency departments (ED). Recognition of ADE as being drug related and prevention of ME are key to enhancing pharmacotherapy safety in the ED. We assessed the applicability of the Pareto principle (~80% of effects result from 20% of causes) to address locally relevant problems of drug therapy. In 752 cases consecutively admitted to the nontraumatic ED of a major regional hospital, ADE, ME, contributing drugs, preventability, and detection rates of ADE by ED staff were investigated. Symptoms, errors, and drugs were sorted by frequency in order to apply the Pareto principle. In total, 242 ADE were observed, and 148 (61.2%) were assessed as preventable. ADE contributed to 110 inpatient hospitalizations. The ten most frequent symptoms were causally involved in 88 (80.0%) inpatient hospitalizations. Only 45 (18.6%) ADE were recognized as drug-related problems before discharge from the ED. A limited set of 33 drugs accounted for 184 (76.0%) ADE; ME contributed to 57 ADE. Frequency-based listing of ADE, ME, and the drugs involved allowed identification of the most relevant problems and development of easy-to-implement safety measures, such as wall and pocket charts. The Pareto principle provides a method for identifying the locally most relevant ADE, ME, and involved drugs. This permits subsequent development of interventions to increase patient safety in the ED admission process that best suit local needs.
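
    A sketch of the frequency-based listing, assuming per-drug ADE counts (the counts below are invented; the study's point is that a short head of the distribution covers most events):

        from collections import Counter

        ade_by_drug = Counter({"warfarin": 60, "digoxin": 35, "insulin": 25,
                               "furosemide": 10, "ramipril": 8, "metformin": 6,
                               "amiodarone": 4, "aspirin": 2})

        def pareto_head(counts, coverage=0.80):
            # smallest set of causes covering the requested share of events
            total, running, head = sum(counts.values()), 0, []
            for drug, n in counts.most_common():
                head.append(drug)
                running += n
                if running / total >= coverage:
                    break
            return head, running / total

        print(pareto_head(ade_by_drug))  # -> (['warfarin', 'digoxin', 'insulin'], 0.8)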

  12. The Importance of identifiers: IWGSC Meeting 20170720

    OpenAIRE

    Haak, Laurel

    2017-01-01

    Presentation by Laurel Haak at the 20 July 2017 meeting of the IWGSC, about the use of identifiers in connecting researchers, funding, facilities, and publications. Description of the approach and initial results of the User Facilities and Publications Working Group, and applications for Scientific Collections.

  13. Image processing system performance prediction and product quality evaluation

    Science.gov (United States)

    Stein, E. K.; Hammill, H. B. (Principal Investigator)

    1976-01-01

    The author has identified the following significant results. A new technique for image processing system performance prediction and product quality evaluation was developed. It was entirely objective, quantitative, and general, and should prove useful in system design and quality control. The technique and its application to determination of quality control procedures for the Earth Resources Technology Satellite NASA Data Processing Facility are described.

  14. Load Disaggregation Technologies: Real World and Laboratory Performance

    Energy Technology Data Exchange (ETDEWEB)

    Mayhorn, Ebony T.; Sullivan, Greg P.; Petersen, Joseph M.; Butner, Ryan S.; Johnson, Erica M.

    2016-09-28

    Low cost interval metering and communication technology improvements over the past ten years have enabled the maturity of load disaggregation (or non-intrusive load monitoring) technologies to better estimate and report the energy consumption of individual end-use loads. With the appropriate performance characteristics, these technologies have the potential to enable many utility- and customer-facing applications such as billing transparency, itemized demand and energy consumption, appliance diagnostics, commissioning, energy efficiency savings verification, load shape research, and demand response measurement. However, there has been much skepticism concerning the ability of load disaggregation products to accurately identify and estimate the energy consumption of end uses, which has hindered widespread market adoption. A contributing factor is that common test methods and metrics are not available to evaluate performance without having to perform large-scale field demonstrations and pilots, which can be costly when developing such products. Without common and cost-effective methods of evaluation, more developed disaggregation technologies will continue to be slow to market and potential users will remain uncertain about their capabilities. This paper reviews recent field studies and laboratory tests of disaggregation technologies. Several factors are identified that are important to consider in test protocols, so that the results reflect real-world performance. Potential metrics are examined to highlight their effectiveness in quantifying disaggregation performance. This analysis is then used to suggest performance metrics that are meaningful and of value to potential users and that will enable researchers/developers to identify beneficial ways to improve their technologies.
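
    One candidate metric of the kind the paper discusses is the total-energy-correctly-assigned measure used in the disaggregation literature (in the style of Kolter and Johnson); the Python sketch below applies it to invented interval data.

        import numpy as np

        def estimation_accuracy(y_true, y_pred):
            # y_true, y_pred: intervals x appliances energy matrices (e.g. Wh)
            y_true = np.asarray(y_true, dtype=float)
            y_pred = np.asarray(y_pred, dtype=float)
            return 1.0 - np.abs(y_pred - y_true).sum() / (2.0 * y_true.sum())

        truth = [[100, 50, 10], [120, 0, 15]]
        estimate = [[90, 55, 12], [125, 10, 8]]
        print(f"estimation accuracy = {estimation_accuracy(truth, estimate):.3f}")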

  15. What is the Value Proposition of Persistent Identifiers?

    Science.gov (United States)

    Klump, Jens; Huber, Robert

    2017-04-01

    Persistent identifiers (PID) are widely used today in scientific communication and documentation. Globally unique identification plus persistent resolution of links to referenced digital research objects have been strong selling points for PID systems as enabling technical infrastructures. Novel applications of PID systems in research now go beyond the identification of file-based objects such as literature or data sets and include the identification of dynamically changing datasets accessed through web services, physical objects, persons and organisations. But not only do we see more use cases; we also see a proliferation of identifier systems. An analysis of the PID systems used by 1381 repositories listed in the Registry of Research Data Repositories (re3data.org, status of 14 Dec 2015) showed that many disciplinary data repositories make use of PIDs that are not among the systems promoted by libraries and publishers (DOI, PURL, ARK). This indicates that a number of communities have developed their own PID systems. This raises the question: do we need more identifier systems? What makes their value propositions more appealing than those of already existing systems? On the other hand, some of these new use cases deal with entities outside the digital domain, the original scope of application for PIDs. It is therefore necessary to critically appraise the value propositions of available PID systems and compare these against the requirements of new use cases for PIDs. Undoubtedly, the DOI is the most used persistent identifier in scholarly communication. It was originally designed "to link customers with publishers, facilitate electronic commerce, and enable copyright management systems." Today, the DOI system is described as providing "a technical and social infrastructure for the registration and use of persistent interoperable identifiers for use on digital networks". This example shows how value propositions can change over time. Additional value can be gained by cross

  16. Marshall Application Realignment System (MARS) Architecture

    Science.gov (United States)

    Belshe, Andrea; Sutton, Mandy

    2010-01-01

    interested in Phase 3 because this is where the data analysis, scoring, and recommendation capability is realized. Stakeholders want to see the benefits derived from reducing the steady-state application base and identify opportunities for portfolio performance improvement and application realignment.

  17. System reliability analysis using dominant failure modes identified by selective searching technique

    International Nuclear Information System (INIS)

    Kim, Dong-Seok; Ok, Seung-Yong; Song, Junho; Koh, Hyun-Moo

    2013-01-01

    The failure of a redundant structural system is often described by innumerable system failure modes such as combinations or sequences of local failures. An efficient approach is proposed to identify dominant failure modes in the space of random variables, and then perform system reliability analysis to compute the system failure probability. To identify dominant failure modes in the decreasing order of their contributions to the system failure probability, a new simulation-based selective searching technique is developed using a genetic algorithm. The system failure probability is computed by a multi-scale matrix-based system reliability (MSR) method. Lower-scale MSR analyses evaluate the probabilities of the identified failure modes and their statistical dependence. A higher-scale MSR analysis evaluates the system failure probability based on the results of the lower-scale analyses. Three illustrative examples demonstrate the efficiency and accuracy of the approach through comparison with existing methods and Monte Carlo simulations. The results show that the proposed method skillfully identifies the dominant failure modes, including those neglected by existing approaches. The multi-scale MSR method accurately evaluates the system failure probability with statistical dependence fully considered. The decoupling between the failure mode identification and the system reliability evaluation allows for effective applications to larger structural systems
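
    Once dominant failure modes have been identified, the system failure probability they imply can be estimated straightforwardly; the Monte Carlo sketch below treats the modes as minimal cut sets of independent components, a simplification of the paper's MSR treatment (which also accounts for statistical dependence). All numbers are illustrative.

        import numpy as np

        p_fail = np.array([0.02, 0.05, 0.01, 0.03])   # component failure probabilities
        cut_sets = [(0, 1), (2,), (1, 3)]             # identified dominant failure modes

        def system_pf(n=200_000, seed=1):
            rng = np.random.default_rng(seed)
            fails = rng.random((n, p_fail.size)) < p_fail     # independent components
            system = np.zeros(n, dtype=bool)
            for cs in cut_sets:
                system |= fails[:, list(cs)].all(axis=1)      # a whole cut set failed
            return system.mean()

        print(f"estimated system failure probability ~ {system_pf():.4f}")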

  18. Performance of a Supercritical CO2 Bottoming Cycle for Aero Applications

    Directory of Open Access Journals (Sweden)

    Florian Jacob

    2017-03-01

    Full Text Available By 2050, the evolutionary approach to aero engine research may no longer provide meaningful returns on investment, whereas more radical approaches to improving thermal efficiency and reducing emissions might still prove cost effective. One such radical concept is the addition of a secondary power cycle that utilizes the otherwise largely wasted residual heat in the core engine's exhaust gases. This could provide additional shaft power. Supercritical carbon dioxide closed-circuit power cycles are currently being investigated primarily for stationary power applications, but their high power density and efficiency, even for modest peak cycle temperatures, make them credible bottoming cycle options for aero engine applications. Through individual geometric design and performance studies for each of the bottoming cycle's major components, it was determined that a simple combined cycle aero engine could offer a 1.9% mission fuel burn benefit over a state-of-the-art geared turbofan for the year 2050. However, the even greater potential of more complex systems demands further investigation. For example, adding inter-turbine reheat (ITR) to the combined cycle is predicted to significantly improve the fuel burn benefit.

  19. Identification of New Tools to Predict Surgical Performance of Novices using a Plastic Surgery Simulator.

    Science.gov (United States)

    Kazan, Roy; Viezel-Mathieu, Alex; Cyr, Shantale; Hemmerling, Thomas M; Lin, Samuel J; Gilardino, Mirko S

    2018-04-09

    To identify new tools capable of predicting the surgical performance of novices on an augmentation mammoplasty simulator. The pace of technical skills acquisition varies between residents and may necessitate more time than that allotted by residency training before reaching competence. Identifying applicants with superior innate technical abilities might shorten learning curves and the time to reach competence. The objective of this study is to identify new tools that could predict surgical performance of novices on a mammoplasty simulator. We recruited 14 medical students and recorded their performance in 2 skill games, Mikado and Perplexus Epic, and in 2 video games, Star Wars Racer (Sony PlayStation 3) and Super Monkey Ball 2 (Nintendo Wii). Then, each participant performed an augmentation mammoplasty procedure on a Mammoplasty Part-task Trainer, which allows the simulation of the essential steps of the procedure. The average age of participants was 25.4 years. Correlation studies showed significant associations of the Perplexus Epic (rs = 0.8491), Star Wars Racer (p = 0.005), and Super Monkey Ball (rs = 0.7309, p < 0.003) scores with the modified OSATS score, but not the Mikado score (rs = -0.0255, p = 0.9). Linear regressions were strongest for the Perplexus Epic and Super Monkey Ball scores, with coefficients of determination of 0.59 and 0.55, respectively. A combined score (Perplexus/Super-Monkey-Ball) was computed and showed a significant correlation with the modified OSATS score (rs = 0.8107, p < 0.001; R² = 0.75). This study identified a combination of skill games that correlated with better performance of novices on a surgical simulator. With refinement, such tools could serve to help screen plastic surgery applicants and identify those with higher surgical performance predictors. Copyright © 2018 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
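
    The reported statistics are Spearman rank correlations; a minimal reproduction of the analysis style (with invented paired scores for 14 participants) looks like this:

        from scipy.stats import spearmanr

        game_scores = [12, 30, 25, 41, 18, 35, 22, 28, 15, 38, 20, 33, 27, 40]
        osats_scores = [8, 15, 14, 20, 10, 18, 12, 16, 9, 19, 11, 17, 13, 21]
        rs, p = spearmanr(game_scores, osats_scores)   # rank-based, robust to scale
        print(f"rs = {rs:.4f}, p = {p:.3g}")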

  20. Performance and scalability analysis of teraflop-scale parallel architectures using multidimensional wavefront applications

    International Nuclear Information System (INIS)

    Hoisie, A.; Lubeck, O.; Wasserman, H.

    1998-01-01

    The authors develop a model for the parallel performance of algorithms that consist of concurrent, two-dimensional wavefronts implemented in a message passing environment. The model, based on a LogGP machine parameterization, combines the separate contributions of computation and communication wavefronts. They validate the model on three important supercomputer systems, on up to 500 processors. They use data from a deterministic particle transport application taken from the ASCI workload, although the model applies generally to any wavefront algorithm implemented on a 2-D processor domain. They also use the validated model to make estimates of the performance and scalability of wavefront algorithms on 100-TFLOPS computer systems expected to be in existence within the next decade as part of the ASCI program and elsewhere. In this context, they analyze two problem sizes. The model shows that on the largest such problem (1 billion cells), inter-processor communication performance is not the bottleneck. Single-node efficiency is the dominant factor
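
    The structure of such a model is easy to convey: a sweep across a Px x Py processor grid only reaches full concurrency after its pipeline of diagonals fills, so per-sweep time grows with Px + Py. The sketch below is a simplified stand-in with illustrative timings, not the paper's calibrated LogGP model.

        def sweep_time(px, py, blocks_per_proc, t_comp, t_comm):
            # pipeline fill/drain adds (px - 1) + (py - 1) stages per sweep
            stages = blocks_per_proc + (px - 1) + (py - 1)
            return stages * (t_comp + t_comm)

        # Illustrative: 500 processors arranged as a 25 x 20 grid
        t = sweep_time(px=25, py=20, blocks_per_proc=100, t_comp=50e-6, t_comm=20e-6)
        print(f"one sweep ~ {t * 1e3:.2f} ms")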