WorldWideScience

Sample records for identify application performance

  1. Identifying performance gaps in hydrogen safety sensor technology for automotive and stationary applications

    International Nuclear Information System (INIS)

    Boon-Brett, L.; Bousek, J.; Black, G.; Moretto, P.; Castello, P.; Huebert, T.; Banach, U.

    2010-01-01

    A market survey has been performed of commercially available hydrogen safety sensors, resulting in a total sample size of 53 sensors from 21 manufacturers. The technical specifications, as provided by the manufacturer, have been collated and are displayed herein as a function of sensor working principle. These specifications comprise measuring range, response and recovery times, ambient temperature, pressure and relative humidity, power consumption and lifetime. These are then compared against known performance targets for both automotive and stationary applications in order to establish to what extent current technology satisfies the requirements of sensor end users. Gaps in the performance of hydrogen sensing technologies are thus identified and areas recommended for future research and development. (author)

  2. Identifying High Performance ERP Projects

    OpenAIRE

    Stensrud, Erik; Myrtveit, Ingunn

    2002-01-01

    Learning from high performance projects is crucial for software process improvement. Therefore, we need to identify outstanding projects that may serve as role models. It is common to measure productivity as an indicator of performance. It is vital that productivity measurements deal correctly with variable returns to scale and multivariate data. Software projects generally exhibit variable returns to scale, and the output from ERP projects is multivariate. We propose to use Data Envelopment ...
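
    The record is truncated, but as a rough illustration of the approach it describes, the sketch below computes input-oriented DEA efficiency scores under variable returns to scale (the BCC envelopment model) for a handful of hypothetical ERP projects. The effort and output figures, and the use of scipy's linear-programming solver, are assumptions for illustration only, not the authors' data or implementation.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical ERP project data: one input (effort in person-months) and two
# outputs (e.g. function points delivered, number of configured modules).
X = np.array([[120.0], [300.0], [80.0], [500.0]])                              # inputs, shape (n, m)
Y = np.array([[900.0, 40.0], [2100.0, 60.0], [700.0, 35.0], [2600.0, 90.0]])   # outputs, shape (n, s)

def bcc_input_efficiency(X, Y, o):
    """Input-oriented BCC (variable returns to scale) efficiency of project o."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.zeros(n + 1)
    c[0] = 1.0                                   # minimise theta; variables are [theta, lambda_1..lambda_n]
    A_ub, b_ub = [], []
    for i in range(m):                           # sum_j lambda_j * x_ij <= theta * x_io
        A_ub.append(np.concatenate(([-X[o, i]], X[:, i])))
        b_ub.append(0.0)
    for r in range(s):                           # sum_j lambda_j * y_rj >= y_ro
        A_ub.append(np.concatenate(([0.0], -Y[:, r])))
        b_ub.append(-Y[o, r])
    A_eq = [np.concatenate(([0.0], np.ones(n)))] # VRS convexity constraint: sum lambda_j = 1
    b_eq = [1.0]
    bounds = [(0, None)] * (n + 1)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
    return res.x[0]

for j in range(X.shape[0]):
    print(f"project {j}: DEA efficiency = {bcc_input_efficiency(X, Y, j):.3f}")
```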

  3. Development and application of the Safe Performance Index as a risk-based methodology for identifying major hazard-related safety issues in underground coal mines

    Science.gov (United States)

    Kinilakodi, Harisha

    The underground coal mining industry has been under constant watch due to the high risk involved in its activities, and scrutiny increased because of the disasters that occurred in 2006-07. In the aftermath of the incidents, the U.S. Congress passed the Mine Improvement and New Emergency Response Act of 2006 (MINER Act), which strengthened the existing regulations and mandated new laws to address the various issues related to a safe working environment in the mines. Risk analysis in any form should be done on a regular basis to tackle the possibility of unwanted major hazard-related events such as explosions, outbursts, airbursts, inundations, spontaneous combustion, and roof fall instabilities. One of the responses by the Mine Safety and Health Administration (MSHA) in 2007 involved a new pattern of violations (POV) process to target mines with a poor safety performance, specifically to improve their safety. However, the 2010 disaster (worst in 40 years) gave an impression that the collective effort of the industry, federal/state agencies, and researchers to achieve the goal of zero fatalities and serious injuries has gone awry. The Safe Performance Index (SPI) methodology developed in this research is a straightforward, effective, transparent, and reproducible approach that can help in identifying and addressing some of the existing issues while targeting mines with poor safety performance which need help. It combines three injury and three citation measures that are scaled to have an equal mean (5.0) in a balanced way with proportionate weighting factors (0.05, 0.15, 0.30) and overall normalizing factor (15) into a mine safety performance evaluation tool. It can be used to assess the relative safety-related risk of mines, including by mine-size category. Using 2008 and 2009 data, comparisons were made of SPI-associated, normalized safety performance measures across mine-size categories, with emphasis on small-mine safety performance as compared to large- and
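
    The abstract names the ingredients of the SPI (six measures scaled to a common mean of 5.0, weights of 0.05, 0.15 and 0.30, and a normalizing factor of 15) but not the exact formula, so the following is only a minimal sketch of how such a composite index might be assembled. The mine data, the pairing of weights with measures, and the reading of the normalizing factor are all assumptions, not the dissertation's actual SPI.

```python
import numpy as np

# Hypothetical per-mine data: three injury measures and three citation measures
# (e.g. rates per 200,000 employee-hours); rows are mines, columns are measures.
measures = np.array([
    # inj1, inj2, inj3, cit1, cit2, cit3
    [2.1, 0.8, 4.0, 10.0, 3.2, 1.1],
    [0.5, 0.2, 1.0,  4.0, 1.0, 0.3],
    [3.4, 1.5, 6.2, 15.0, 5.0, 2.0],
])

# Scale every measure so it has the same mean (5.0) across mines, as described
# above, so no measure dominates simply because of its units.
scaled = 5.0 * measures / measures.mean(axis=0)

# Assumed assignment of the proportionate weights to the three measures within
# each group (injury and citation); the actual SPI pairing is not stated here.
weights = np.array([0.05, 0.15, 0.30, 0.05, 0.15, 0.30])

weighted_risk = scaled @ weights   # higher = more injuries/citations relative to peers

# One plausible (assumed) use of the overall normalizing factor: subtract the
# weighted score from 15 so that a higher SPI indicates safer relative performance.
spi = 15.0 - weighted_risk

for i, value in enumerate(spi):
    print(f"mine {i}: SPI = {value:.2f}")
```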

  4. Risk and Performance Technologies: Identifying the Keys to Successful Implementation

    International Nuclear Information System (INIS)

    McClain, Lynn; Smith, Art; O'Regan, Patrick

    2002-01-01

    The nuclear power industry has been utilizing risk and performance based technologies for over thirty years. Applications of these technologies have included risk assessment (e.g. Individual Plant Examinations), burden reduction (e.g. Risk-Informed Inservice Inspection, RI-ISI) and risk management (Maintenance Rule, 10CFR50.65). Over the last five to ten years the number of risk-informed (RI) burden reduction initiatives has increased. Unfortunately, the efficiencies of some of these applications have been questionable. This paper investigates those attributes necessary to support successful, cost-effective RI-applications. The premise of this paper is that by understanding the key attributes that support one successful application, insights can be gleaned that will streamline/coordinate future RI-applications. This paper is an extension of a paper presented at the Pressure Vessel and Piping (PVP-2001) Conference. In that paper, a number of issues and opportunities were identified that needed to be assessed in order to support future (and efficient) RI-applications. It was noted in the paper that a proper understanding and resolution of these issues will facilitate implementation of risk and performance technology in the operation, maintenance and design disciplines. In addition, it will provide the foundation necessary to support regulatory review and approval. (authors)

  5. Identifying trace evidence in data wiping application software

    Directory of Open Access Journals (Sweden)

    Gregory H. Carlton

    2012-06-01

    Full Text Available One area of particular concern for computer forensics examiners involves situations in which someone utilized software applications to destroy evidence. There are products available in the marketplace that are relatively inexpensive and advertised as being able to destroy targeted portions of data stored within a computer system. This study was undertaken to identify these tools and analyze them to determine the extent to which each of the evaluated data wiping applications perform their tasks and to identify trace evidence, if any, left behind on disk media after executing these applications. We evaluated five Windows 7 compatible software products whose advertised features include the ability for users to wipe targeted files, folders, or evidence of selected activities. We conducted a series of experiments that involved executing each application on systems with identical data, and we then analyzed the results and compared the before and after images for each application. We identified information for each application that is beneficial to forensics examiners when faced with similar situations. This paper describes our application selection process, our application evaluation methodology, and our findings. Following this, we describe limitations of this study and suggest areas of additional research that will benefit the study of digital forensics.
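
    As a simplified illustration of the before-and-after comparison described above, the sketch below hashes every file in two directory trees (hypothetical mount points for the "before" and "after" images) and reports what disappeared, changed, or appeared. A real examination would of course work on full forensic images, unallocated space and file-system metadata rather than mounted files, which this simplification ignores.

```python
import hashlib
from pathlib import Path

def hash_tree(root: Path) -> dict:
    """Map each file's path (relative to root) to its SHA-256 digest."""
    digests = {}
    for p in root.rglob("*"):
        if p.is_file():
            digests[p.relative_to(root).as_posix()] = hashlib.sha256(p.read_bytes()).hexdigest()
    return digests

# Hypothetical mount points of the 'before' and 'after' images.
before = hash_tree(Path("/mnt/image_before"))
after = hash_tree(Path("/mnt/image_after"))

wiped   = sorted(set(before) - set(after))                                   # present before, gone after
changed = sorted(f for f in before.keys() & after.keys() if before[f] != after[f])
created = sorted(set(after) - set(before))                                   # artefacts left by the wiping tool

print(f"{len(wiped)} files removed, {len(changed)} changed, {len(created)} new")
```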

  6. Identifiability of PBPK Models with Applications to ...

    Science.gov (United States)

    Any statistical model should be identifiable in order for estimates and tests using it to be meaningful. We consider statistical analysis of physiologically-based pharmacokinetic (PBPK) models in which parameters cannot be estimated precisely from available data, and discuss different types of identifiability that occur in PBPK models and give reasons why they occur. We particularly focus on how the mathematical structure of a PBPK model and lack of appropriate data can lead to statistical models in which it is impossible to estimate at least some parameters precisely. Methods are reviewed which can determine whether a purely linear PBPK model is globally identifiable. We propose a theorem which determines when identifiability at a set of finite and specific values of the mathematical PBPK model (global discrete identifiability) implies identifiability of the statistical model. However, we are unable to establish conditions that imply global discrete identifiability, and conclude that the only safe approach to analysis of PBPK models involves Bayesian analysis with truncated priors. Finally, computational issues regarding posterior simulations of PBPK models are discussed. The methodology is very general and can be applied to numerous PBPK models which can be expressed as linear time-invariant systems. A real data set of a PBPK model for exposure to dimethyl arsinic acid (DMA(V)) is presented to illustrate the proposed methodology.
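
    As a small numerical illustration of the kind of non-identifiability discussed above (using a deliberately simple textbook model, not one of the paper's PBPK models), the one-compartment model with first-order absorption is not globally identifiable from concentration data alone: swapping the absorption and elimination rate constants, while rescaling the apparent dose-over-volume term, reproduces exactly the same curve.

```python
import numpy as np

def conc(t, ka, ke, dose_over_v):
    """One-compartment model with first-order absorption; dose_over_v stands for F*D/V."""
    return dose_over_v * ka / (ka - ke) * (np.exp(-ke * t) - np.exp(-ka * t))

t = np.linspace(0.1, 24.0, 50)
c1 = conc(t, ka=1.2, ke=0.3, dose_over_v=1.0)
# Swap ka and ke and rescale dose_over_v by ka/ke: the predicted concentrations are
# identical, so the two distinct parameter sets cannot be told apart ("flip-flop" kinetics).
c2 = conc(t, ka=0.3, ke=1.2, dose_over_v=1.0 * (1.2 / 0.3))

print(np.allclose(c1, c2))   # True: two parameter sets, one observable curve
```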

  7. Identifying Importance-Performance Matrix Analysis (IPMA) of ...

    African Journals Online (AJOL)

    Identifying Importance-Performance Matrix Analysis (IPMA) of intellectual capital and Islamic work ethics in Malaysian SMES. ... capital and Islamic work ethics significantly influenced business performance.

  8. Improving applicant selection: identifying qualities of the unsuccessful otolaryngology resident.

    Science.gov (United States)

    Badran, Karam W; Kelley, Kanwar; Conderman, Christian; Mahboubi, Hossein; Armstrong, William B; Bhandarkar, Naveen D

    2015-04-01

    To identify the prevalence and management of problematic residents. Additionally, we hope to identify the factors associated with successful remediation of unsuccessful otolaryngology residents. Self-reported Internet and paper-based survey. An anonymous survey was distributed to 152 current and former program directors (PDs) in 2012. The factors associated with unsuccessful otolaryngology residents and those associated with the successful remediation of problematic residents were investigated. An unsuccessful resident is defined as one who quit or was removed from the program for any reason, or one whose actions resulted in criminal action or citation against their medical license after graduation from residency. Remediation is defined as an individualized program implemented to correct documented weaknesses. The overall response rate was 26% (40 PDs). Seventy-three unsuccessful or problematic residents were identified. Sixty-six problematic or unsuccessful residents were identified during residency, with 58 of 66 (88%) undergoing remediation. Thirty-one (47%) residents did not graduate. The most commonly identified factors of an unsuccessful resident were: change in specialty (21.5%), interpersonal and communication skills with health professionals (13.9%), and clinical judgment (10.1%). Characteristics of those residents who underwent successful remediation include: poor performance on in-training examination (17%, P otolaryngology PDs in this sample identified at least one unsuccessful resident. Improved methods of applicant screening may assist in optimizing otolaryngology resident selection. © 2014 The American Laryngological, Rhinological and Otological Society, Inc.

  9. Identifying and Ranking the Determinants of Tourism Performance

    DEFF Research Database (Denmark)

    Assaf, A.George; Josiassen, Alexander

    2012-01-01

    …, their tourism industries, and tourism businesses seek to improve the performance of the tourism industry and its constituents by vigorously promoting themselves to international tourists, cutting costs, and identifying synergies in their tourism endeavors. In seeking to improve the tourism industry …, the determinants that affect tourism performance are of key interest to the stakeholders. A key obstacle toward improving performance is the multitude of determinants that can affect tourism performance. The literature has yet to provide concrete insights into the determinants of tourism performance … and their relative importance. The present study addresses this important gap. We identify and rank the determinants of tourism performance. We also provide performance measures of international tourism destinations. The results are derived using the Data Envelopment Analysis (DEA) and bootstrap truncated regression …

  10. Mobile Application to Identify Indonesian Flowers on Android Platform

    Directory of Open Access Journals (Sweden)

    Tita Karlita

    2013-12-01

    Full Text Available Although many people love flowers, they do not know their names. In particular, many people do not recognize local flowers. To find a flower image, we can use a search engine such as Google, but it does not give much help in finding the name of a local flower. Sometimes, Google cannot show the correct name of local flowers. This study proposes an application to identify Indonesian flowers that runs on the Android platform for easy use anywhere. Flower recognition is based on the color features using the Hue-Index, shape feature using Centroid Contour Distance (CCD), and the similarity measurement using Entropy calculations. The outputs of this application are information about the inputted flower image including Latin name, local name, description, distribution and ecology. Based on tests performed on 44 types of flowers with 181 images in the database, the best similarity percentage is 97.72%. With this application, people will be expected to know more about Indonesian flowers. Keywords: Indonesian flowers, android, hue-index, CCD, entropy
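
    For illustration, a minimal sketch of the Centroid Contour Distance (CCD) shape descriptor mentioned above might look like the following; the contour points would in practice come from a segmented flower image, and the angular sampling, normalisation and toy circle below are assumptions rather than the paper's exact implementation.

```python
import numpy as np

def centroid_contour_distance(contour, n_samples=36):
    """Centroid Contour Distance (CCD) shape descriptor.

    contour: (N, 2) array of (x, y) boundary points of a segmented shape.
    Returns n_samples distances from the centroid to the contour, taken at
    equal angular steps and normalised by the maximum distance so the
    descriptor is scale invariant.
    """
    contour = np.asarray(contour, dtype=float)
    centroid = contour.mean(axis=0)
    rel = contour - centroid
    angles = np.arctan2(rel[:, 1], rel[:, 0])      # angle of each boundary point
    dists = np.hypot(rel[:, 0], rel[:, 1])         # distance of each point from the centroid
    bins = np.linspace(-np.pi, np.pi, n_samples + 1)
    ccd = np.zeros(n_samples)
    for k in range(n_samples):
        mask = (angles >= bins[k]) & (angles < bins[k + 1])
        ccd[k] = dists[mask].max() if mask.any() else 0.0
    return ccd / ccd.max()

# Toy example: a circular contour gives a nearly flat CCD signature.
theta = np.linspace(0, 2 * np.pi, 200, endpoint=False)
circle = np.c_[10 + 5 * np.cos(theta), 20 + 5 * np.sin(theta)]
print(np.round(centroid_contour_distance(circle, 12), 2))
```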

  11. Family and academic performance: identifying high school student profiles

    Directory of Open Access Journals (Sweden)

    Alicia Aleli Chaparro Caso López

    2016-01-01

    Full Text Available The objective of this study was to identify profiles of high school students, based on variables related to academic performance, socioeconomic status, cultural capital and family organization. A total of 21,724 high school students, from the five municipalities of the state of Baja California, took part. A K-means cluster analysis was performed to identify the profiles. The analyses identified two clearly-defined clusters: Cluster 1 grouped together students with high academic performance and who achieved higher scores for socioeconomic status, cultural capital and family involvement, whereas Cluster 2 brought together students with low academic achievement, and who also obtained lower scores for socioeconomic status and cultural capital, and had less family involvement. It is concluded that the family variables analyzed form student profiles that can be related to academic achievement.
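
    As a small sketch of the clustering step described above, the code below standardises four student-level variables and runs a K-means analysis with two clusters; the synthetic data and variable scales are assumptions for illustration, not the study's survey data.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical student-level variables mirroring those in the study:
# academic performance, socioeconomic status, cultural capital, family involvement.
rng = np.random.default_rng(0)
high = rng.normal([80, 0.7, 0.8, 0.75], 0.1 * np.array([80, 0.7, 0.8, 0.75]), size=(500, 4))
low  = rng.normal([55, 0.4, 0.3, 0.40], 0.1 * np.array([55, 0.4, 0.3, 0.40]), size=(500, 4))
X = np.vstack([high, low])

# Standardise so no variable dominates, then look for two profiles (k = 2),
# as in the study's cluster analysis.
Xz = StandardScaler().fit_transform(X)
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(Xz)

for c in range(2):
    members = X[km.labels_ == c]
    print(f"cluster {c}: n={len(members)}, "
          f"mean (perf, SES, culture, family) = {members.mean(axis=0).round(2)}")
```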

  12. Measuring individual work performance: identifying and selecting indicators.

    Science.gov (United States)

    Koopmans, Linda; Bernaards, Claire M; Hildebrandt, Vincent H; de Vet, Henrica C W; van der Beek, Allard J

    2014-01-01

    Theoretically, individual work performance (IWP) can be divided into four dimensions: task performance, contextual performance, adaptive performance, and counterproductive work behavior. However, there is no consensus on the indicators used to measure these dimensions. This study was designed to (1) identify indicators for each dimension, (2) select the most relevant indicators, and (3) determine the relative weight of each dimension in ratings of work performance. IWP indicators were identified from multiple research disciplines, via literature, existing questionnaires, and expert interviews. Subsequently, experts selected the most relevant indicators per dimension and scored the relative weight of each dimension in ratings of IWP. In total, 128 unique indicators were identified. Twenty-three of these indicators were selected by experts as most relevant for measuring IWP. Task performance determined 36% of the work performance rating, while the other three dimensions respectively determined 22%, 20% and 21% of the rating. Notable consensus was found on relevant indicators of IWP, reducing the number from 128 to 23 relevant indicators. This provides an important step towards the development of a standardized, generic and short measurement instrument for assessing IWP.
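
    The reported dimension weights (36%, 22%, 20% and 21%) could, purely for illustration, be combined into a single rating as in the sketch below; the common 0-5 scale, the reverse-coding of counterproductive behaviour, and the idea of collapsing the dimensions into one number are assumptions, not the authors' instrument.

```python
# Relative weights of the four IWP dimensions reported by the experts above.
WEIGHTS = {"task": 0.36, "contextual": 0.22, "adaptive": 0.20, "counterproductive": 0.21}

def overall_iwp(scores: dict) -> float:
    """Weighted overall work-performance rating.

    scores: dimension -> score on a common scale (assumed 0-5 here), where the
    counterproductive-behaviour score is assumed to be reverse-coded so that
    higher always means better performance.
    """
    return sum(WEIGHTS[d] * scores[d] for d in WEIGHTS)

print(overall_iwp({"task": 4.0, "contextual": 3.5, "adaptive": 3.0, "counterproductive": 4.5}))
```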

  13. Identifying the connective strength between model parameters and performance criteria

    Directory of Open Access Journals (Sweden)

    B. Guse

    2017-11-01

    Full Text Available In hydrological models, parameters are used to represent the time-invariant characteristics of catchments and to capture different aspects of hydrological response. Hence, model parameters need to be identified based on their role in controlling the hydrological behaviour. For the identification of meaningful parameter values, multiple and complementary performance criteria are used that compare modelled and measured discharge time series. The reliability of the identification of hydrologically meaningful model parameter values depends on how distinctly a model parameter can be assigned to one of the performance criteria. To investigate this, we introduce the new concept of connective strength between model parameters and performance criteria. The connective strength assesses the intensity in the interrelationship between model parameters and performance criteria in a bijective way. In our analysis of connective strength, model simulations are carried out based on a Latin hypercube sampling. Ten performance criteria including Nash–Sutcliffe efficiency (NSE), Kling–Gupta efficiency (KGE) and its three components (alpha, beta and r) as well as RSR (the ratio of the root mean square error to the standard deviation) for different segments of the flow duration curve (FDC) are calculated. With a joint analysis of two regression tree (RT) approaches, we derive how a model parameter is connected to different performance criteria. At first, RTs are constructed using each performance criterion as the target variable to detect the most relevant model parameters for each performance criterion. Secondly, RTs are constructed using each parameter as the target variable to detect which performance criteria are impacted by changes in the values of one distinct model parameter. Based on this, appropriate performance criteria are identified for each model parameter. In this study, a high bijective connective strength between model parameters and performance criteria
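
    For illustration, the sketch below computes two of the named performance criteria (NSE, and KGE with its r, alpha and beta components) and runs the first of the two regression-tree directions, i.e. a tree predicting a criterion from sampled parameter values. The parameter sample, the synthetic response surface and the tree settings are assumptions, not the study's model runs.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def nse(sim, obs):
    """Nash-Sutcliffe efficiency."""
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

def kge(sim, obs):
    """Kling-Gupta efficiency and its components (r, alpha, beta)."""
    r = np.corrcoef(sim, obs)[0, 1]
    alpha = sim.std() / obs.std()
    beta = sim.mean() / obs.mean()
    return 1.0 - np.sqrt((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2), r, alpha, beta

# Hypothetical Latin-hypercube-style parameter sample (runs x parameters) and the
# NSE obtained for each run; a regression tree then indicates which parameters the
# criterion responds to most strongly (first direction of the joint RT analysis).
rng = np.random.default_rng(1)
params = rng.uniform(size=(2000, 4))                      # 4 model parameters scaled to [0, 1]
nse_runs = 0.9 - 1.5 * (params[:, 0] - 0.4) ** 2 - 0.2 * params[:, 2] + rng.normal(0, 0.02, 2000)

tree = DecisionTreeRegressor(max_depth=3).fit(params, nse_runs)
print("parameter importances for NSE:", tree.feature_importances_.round(2))
```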

  14. Identifying Architectural Technical Debt in Android Applications through Compliance Checking

    NARCIS (Netherlands)

    Verdecchia, R.

    By considering the fast pace at which mobile applications need to evolve, Architectural Technical Debt turns out to be a crucial yet implicit success factor. In this research we present an approach to automatically identify Architectural Technical Debt in Android applications. The approach takes

  15. An Application Of Receptor Modeling To Identify Airborne Particulate ...

    African Journals Online (AJOL)

    An Application Of Receptor Modeling To Identify Airborne Particulate Sources In Lagos, Nigeria. FS Olise, OK Owoade, HB Olaniyi. Abstract. There have been no clear demarcations between industrial and residential areas of Lagos with focus on industry as the major source. There is need to identify potential source types in ...

  16. Performance Analysis: Work Control Events Identified January - August 2010

    Energy Technology Data Exchange (ETDEWEB)

    De Grange, C E; Freeman, J W; Kerr, C E; Holman, G; Marsh, K; Beach, R

    2011-01-14

    elements of an institutional work planning and control system. By the end of that year this system was documented and implementation had begun. In 2009, training of the workforce began and as of the time of this report more than 50% of authorized Integration Work Sheets (IWS) use the activity-based planning process. In 2010, LSO independently reviewed the work planning and control process and confirmed to the Laboratory that the Integrated Safety Management (ISM) System was implemented. LLNL conducted a cross-directorate management self-assessment of work planning and control and is developing actions to respond to the issues identified. Ongoing efforts to strengthen the work planning and control process and to improve the quality of LLNL work packages are in progress: completion of remaining actions in response to the 2009 DOE Office of Health, Safety, and Security (HSS) evaluation of LLNL's ISM System; scheduling more than 14 work planning and control self-assessments in FY11; continuing to align subcontractor work control with the Institutional work planning and control system; and continuing to maintain the electronic IWS application. The 24 events included in this analysis were caused by errors in the first four of the five ISMS functions. The most frequent cause was errors in analyzing the hazards (Function 2). The second most frequent cause was errors occurring when defining the work (Function 1), followed by errors during the performance of work (Function 4). Interestingly, very few errors in developing controls (Function 3) resulted in events. This leads one to conclude that if improvements are made to defining the scope of work and analyzing the potential hazards, LLNL may reduce the frequency or severity of events. Analysis of the 24 events resulted in the identification of ten common causes. Some events had multiple causes, resulting in the mention of 39 causes being identified for the 24 events. The most frequent cause was workers, supervisors, or experts

  17. Identifying influential factors of business process performance using dependency analysis

    Science.gov (United States)

    Wetzstein, Branimir; Leitner, Philipp; Rosenberg, Florian; Dustdar, Schahram; Leymann, Frank

    2011-02-01

    We present a comprehensive framework for identifying influential factors of business process performance. In particular, our approach combines monitoring of process events and Quality of Service (QoS) measurements with dependency analysis to effectively identify influential factors. The framework uses data mining techniques to construct tree structures to represent dependencies of a key performance indicator (KPI) on process and QoS metrics. These dependency trees allow business analysts to determine how process KPIs depend on lower-level process metrics and QoS characteristics of the IT infrastructure. The structure of the dependencies enables a drill-down analysis of single factors of influence to gain a deeper knowledge why certain KPI targets are not met.
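
    A minimal sketch of the dependency-tree idea follows: a regression tree is fitted to per-instance process and QoS metrics with the KPI as the target, and its printed structure shows which lower-level factors the KPI depends on most strongly. The metrics, the simulated relationship and the tree parameters are hypothetical; the framework itself uses its own monitoring and data-mining pipeline.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor, export_text

# Hypothetical monitoring data: per-process-instance QoS and process metrics
# and the resulting KPI (end-to-end order fulfilment time in hours).
rng = np.random.default_rng(42)
n = 5000
credit_check_ms = rng.gamma(2.0, 150, n)      # response time of a called service (QoS metric)
stock_available = rng.integers(0, 2, n)       # process-level metric (0/1)
approval_loops  = rng.poisson(0.7, n)         # number of rework iterations
kpi_hours = (4 + 0.01 * credit_check_ms + 12 * (1 - stock_available)
             + 6 * approval_loops + rng.normal(0, 1, n))

X = np.c_[credit_check_ms, stock_available, approval_loops]
tree = DecisionTreeRegressor(max_depth=3, min_samples_leaf=100).fit(X, kpi_hours)

# The printed tree lets an analyst drill down from the KPI to the lower-level
# factors of influence, in the spirit of the dependency trees described above.
print(export_text(tree, feature_names=["credit_check_ms", "stock_available", "approval_loops"]))
```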

  18. Plasma proteomics to identify biomarkers - Application to cardiovascular diseases

    DEFF Research Database (Denmark)

    Beck, Hans Christian; Overgaard, Martin; Melholt Rasmussen, Lars

    2015-01-01

    There is an unmet need for new cardiovascular biomarkers. Despite this, only a few biomarkers for the diagnosis or screening of cardiovascular diseases have been implemented in the clinic. Thousands of proteins can be analysed in plasma by mass spectrometry-based proteomics technologies. Therefore, this technology may identify new biomarkers that previously have not been associated with cardiovascular diseases. In this review, we summarize the key challenges and considerations, including strategies, recent discoveries and clinical applications in cardiovascular proteomics that may lead

  19. Identifying poor performance among doctors in NHS organizations.

    Science.gov (United States)

    Locke, Rachel; Scallan, Samantha; Leach, Camilla; Rickenbach, Mark

    2013-10-01

    To account for the means by which poor performance among career doctors is identified by National Health Service organizations, whether the tools are considered effective and how these processes may be strengthened in the light of revalidation and the requirement for doctors to demonstrate their fitness to practice. This study sought to look beyond the 'doctor as individual'; as well as considering the typical approaches to managing the practice of an individual, the systems within which the doctor is working were reviewed, as these are also relevant to standards of performance. A qualitative review was undertaken consisting of a literature review of current practice, a policy review of current documentation from 15 trusts in one deanery locality, and 14 semi-structured interviews with respondents with an overview of processes in use. The framework for the analysis of the data considered tools at three levels: individual, team and organizational. Tools are, in the main, reactive, with an individual focus. They rely on colleagues and others to speak out, so their effectiveness is hindered by a reluctance to do so. Tools can lack an evidence base for their use, and there is limited linking of data across contexts and tools. There is more work to be done in evaluating current tools and developing stronger processes. Linkage between data sources needs to be improved and proactive tools at the organizational level need further development to help with the early identification of performance issues. This would also assist in balancing a wider systems approach with a current overemphasis on individual doctors. © 2012 John Wiley & Sons Ltd.

  20. Measuring individual work performance: Identifying and selecting indicators

    NARCIS (Netherlands)

    Koopmans, L.; Bernaards, C.M.; Hildebrandt, V.H.; de Vet, H.C.W.; van der Beek, A.J.

    2014-01-01

    BACKGROUND: Theoretically, individual work performance (IWP) can be divided into four dimensions: task performance, contextual performance, adaptive performance, and counterproductive work behavior. However, there is no consensus on the indicators used to measure these dimensions.

  1. Measuring individual work performance: identifying and selecting indicators

    NARCIS (Netherlands)

    Koopmans, L.; Bernaards, C.M.; Hildebrandt, V.H.; Vet, H.C de; Beek, A.J. van der

    2014-01-01

    BACKGROUND: Theoretically, individual work performance (IWP) can be divided into four dimensions: task performance, contextual performance, adaptive performance, and counterproductive work behavior. However, there is no consensus on the indicators used to measure these dimensions. OBJECTIVE: This

  2. LFK, FORTRAN Application Performance Test

    International Nuclear Information System (INIS)

    McMahon, F.H.

    1991-01-01

    1 - Description of program or function: LFK, the Livermore FORTRAN Kernels, is a computer performance test that measures a realistic floating-point performance range for FORTRAN applications. Informally known as the Livermore Loops test, the LFK test may be used as a computer performance test, as a test of compiler accuracy (via checksums) and efficiency, or as a hardware endurance test. The LFK test, which focuses on FORTRAN as used in computational physics, measures the joint performance of the computer CPU, the compiler, and the computational structures in units of Mega-flops/sec or Mflops. A C language version of subroutine KERNEL is also included which executes 24 samples of C numerical computation. The 24 kernels are a hydrodynamics code fragment, a fragment from an incomplete Cholesky conjugate gradient code, the standard inner product function of linear algebra, a fragment from a banded linear equations routine, a segment of a tridiagonal elimination routine, an example of a general linear recurrence equation, an equation of state fragment, part of an alternating direction implicit integration code, an integrate predictor code, a difference predictor code, a first sum, a first difference, a fragment from a two-dimensional particle-in-cell code, a part of a one-dimensional particle-in-cell code, an example of how casually FORTRAN can be written, a Monte Carlo search loop, an example of an implicit conditional computation, a fragment of a two-dimensional explicit hydrodynamics code, a general linear recurrence equation, part of a discrete ordinates transport program, a simple matrix calculation, a segment of a Planck distribution procedure, a two-dimensional implicit hydrodynamics fragment, and determination of the location of the first minimum in an array. 2 - Method of solution: CPU performance rates depend strongly on the maturity of FORTRAN compiler machine code optimization. The LFK test-bed executes the set of 24 kernels three times, resetting the DO

  3. A Sensitivity Analysis Approach to Identify Key Environmental Performance Factors

    Directory of Open Access Journals (Sweden)

    Xi Yu

    2014-01-01

    Full Text Available Life cycle assessment (LCA) has been widely used during the last two decades in the design phase to reduce a product's environmental impacts through the whole product life cycle (PLC). The traditional LCA is restricted to assessing the environmental impacts of a product and the results cannot reflect the effects of changes within the life cycle. In order to improve the quality of ecodesign, there is a growing need to develop an approach which can reflect the changes between the design parameters and the product's environmental impacts. A sensitivity analysis approach based on LCA and ecodesign is proposed in this paper. The key environmental performance factors which have significant influence on the product's environmental impacts can be identified by analyzing the relationship between environmental impacts and the design parameters. Users without much environmental knowledge can use this approach to determine which design parameter should be first considered when (re)designing a product. A printed circuit board (PCB) case study is conducted; eight design parameters are chosen to be analyzed by our approach. The result shows that the carbon dioxide emission during the PCB manufacture is highly sensitive to the area of the PCB panel.
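
    A minimal one-at-a-time sensitivity sketch in the spirit of the approach above is shown below: each design parameter is perturbed in turn and the parameters are ranked by the relative change in the impact result. The impact function, parameter names and coefficients are hypothetical stand-ins for an actual LCA model.

```python
def co2_impact(params):
    """Hypothetical stand-in for an LCA model: CO2 emissions of PCB manufacture
    as a function of a few design parameters (units are illustrative only)."""
    return (12.0 * params["panel_area_dm2"]
            + 0.8 * params["copper_layers"]
            + 0.002 * params["drill_holes"])

baseline = {"panel_area_dm2": 4.0, "copper_layers": 4, "drill_holes": 1500}
base_impact = co2_impact(baseline)

# Perturb each design parameter by +10% in turn and rank by relative response.
sensitivities = {}
for name, value in baseline.items():
    perturbed = dict(baseline, **{name: value * 1.10})
    sensitivities[name] = (co2_impact(perturbed) - base_impact) / base_impact

for name, s in sorted(sensitivities.items(), key=lambda kv: -abs(kv[1])):
    print(f"{name:>16}: {s:+.1%} impact change per +10% parameter change")
```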

  4. Staff Performance Analysis: A Method for Identifying Brigade Staff Tasks

    National Research Council Canada - National Science Library

    Ford, Laura

    1997-01-01

    ... members of conventional mounted brigade staff. Initial analysis of performance requirements in existing documentation revealed that the performance specifications were not sufficiently detailed for brigade battle staffs...

  5. Identifying The Most Applicable Renewable Energy Systems Of Iran

    Directory of Open Access Journals (Sweden)

    Nasibeh Mousavi

    2017-03-01

    Full Text Available In recent years, because of the energy crisis, every country has been trying to find new ways to reduce energy consumption and obtain maximum use of renewable energy. Iran is no exception to this trend. Renewable energy is energy that is provided by renewable sources such as the sun or wind. In general, renewable energies are not adaptable to every single community. Because of Iran's location and particular climate conditions, the most applicable renewable energy systems in Iran are solar and wind energy. The main purpose of this paper is to review and identify the most applicable renewable energy systems of Iran and also to review the traditional and current methods used to obtain maximum use of these renewable energies.

  6. Application of identifying transmission spheres for spherical surface testing

    Science.gov (United States)

    Han, Christopher B.; Ye, Xin; Li, Xueyuan; Wang, Quanzhao; Tang, Shouhong; Han, Sen

    2017-06-01

    We developed a new application on Microsoft Foundation Classes (MFC) to identify correct transmission spheres (TS) for Spherical Surface Testing (SST). Spherical surfaces are important optical surfaces, and the wide application and high production rate of spherical surfaces necessitates an accurate and highly reliable measuring device. A Fizeau Interferometer is an appropriate tool for SST due to its subnanometer accuracy. It measures the contour of a spherical surface using a common path, which is insensitive to the surrounding circumstances. The Fizeau Interferometer transmits a wide laser beam, creating interference fringes from re-converging light from the transmission sphere and the test surface. To make a successful measurement, the application calculates and determines the appropriate transmission sphere for the test surface. There are 3 main inputs from the test surfaces that are utilized to determine the optimal sizes and F-numbers of the transmission spheres: (1) the curvatures (concave or convex), (2) the Radii of Curvature (ROC), and (3) the aperture sizes. The application will firstly calculate the F-numbers (i.e. ROC divided by aperture) of the test surface, secondly determine the correct aperture size of a convex surface, thirdly verify that the ROC of the test surface must be shorter than the reference surface's ROC of the transmission sphere, and lastly calculate the percentage of area that the test surface will be measured. However, the number of interferometers and transmission spheres should be optimized when measuring large spherical surfaces, to avoid requiring a large number of them for each test surface. Current measuring practices involve tedious and potentially inaccurate calculations. This smart application eliminates human calculation errors, optimizes the selection of transmission spheres (including the least number required) and interferometer sizes, and increases efficiency.
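
    A minimal sketch of the selection logic described above might look like the following; the transmission-sphere catalogue, the coverage estimate based on the ratio of f-numbers, and the tie-breaking rule are assumptions for illustration, not the MFC application's actual algorithm or any vendor's catalogue.

```python
# Hypothetical transmission-sphere (TS) catalogue for a 100 mm aperture
# interferometer: (f-number, reference-surface ROC in mm). Illustrative values only.
CATALOGUE = [(0.75, 72.0), (1.5, 144.0), (3.3, 318.0), (7.1, 684.0), (10.7, 1030.0)]

def choose_transmission_sphere(roc_mm, aperture_mm, convex):
    """Select a TS following the steps in the abstract: compute the test f-number
    (ROC / aperture), reject TSs whose reference ROC is too short for a convex
    part, and estimate the fraction of the test aperture each remaining TS covers."""
    f_test = abs(roc_mm) / aperture_mm
    best = None
    for f_ts, ref_roc in CATALOGUE:
        if convex and abs(roc_mm) >= ref_roc:
            continue          # convex test ROC must be shorter than the TS reference ROC
        # A TS at least as fast as the test surface (f_ts <= f_test) fills the whole
        # aperture; a slower TS only measures a central sub-aperture.
        coverage = min(1.0, f_test / f_ts) ** 2
        if best is None or coverage > best[2] or (coverage == best[2] and f_ts > best[0]):
            best = (f_ts, ref_roc, coverage)
    return f_test, best

f_test, (f_ts, ref_roc, coverage) = choose_transmission_sphere(roc_mm=250.0, aperture_mm=60.0, convex=True)
print(f"test surface f/{f_test:.2f}; use the f/{f_ts} TS, measuring {coverage:.0%} of the aperture")
```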

  7. Performability Modelling Tools, Evaluation Techniques and Applications

    NARCIS (Netherlands)

    Haverkort, Boudewijn R.H.M.

    1990-01-01

    This thesis deals with three aspects of quantitative evaluation of fault-tolerant and distributed computer and communication systems: performability evaluation techniques, performability modelling tools, and performability modelling applications. Performability modelling is a relatively new

  8. Identifying customer-focused performance measures : final report 655.

    Science.gov (United States)

    2010-10-01

    The Arizona Department of Transportation (ADOT) completed a comprehensive customer satisfaction : assessment in July 2009. ADOT commissioned the assessment to acquire statistically valid data from residents : and community leaders to help it identify...

  9. Performance testing to identify climate-ready trees

    Science.gov (United States)

    E.Gregory McPherson; Alison M. Berry; Natalie S. van Doorn

    2018-01-01

    Urban forests produce ecosystem services that can benefit city dwellers, but are especially vulnerable to climate change stressors such as heat, drought, extreme winds and pests. Tree selection is an important decision point for managers wanting to transition to a more stable and resilient urban forest structure. This study describes a five-step process to identify and...

  10. Generating Performance Models for Irregular Applications

    Energy Technology Data Exchange (ETDEWEB)

    Friese, Ryan D.; Tallent, Nathan R.; Vishnu, Abhinav; Kerbyson, Darren J.; Hoisie, Adolfy

    2017-05-30

    Many applications have irregular behavior (non-uniform input data, input-dependent solvers, irregular memory accesses, unbiased branches) that cannot be captured using today's automated performance modeling techniques. We describe new hierarchical critical path analyses for the Palm model generation tool. To create a model's structure, we capture tasks along representative MPI critical paths. We create a histogram of critical tasks with parameterized task arguments and instance counts. To model each task, we identify hot instruction-level sub-paths and model each sub-path based on data flow, instruction scheduling, and data locality. We describe application models that generate accurate predictions for strong scaling when varying CPU speed, cache speed, memory speed, and architecture. We present results for the Sweep3D neutron transport benchmark; Page Rank on multiple graphs; Support Vector Machine with pruning; and PFLOTRAN's reactive flow/transport solver with domain-induced load imbalance.

  11. Identifying and weighting of key performance indicators of knowledge management2.0 in organizations

    Directory of Open Access Journals (Sweden)

    Saeed Khalilazar

    2016-03-01

    Full Text Available The main purpose of this research is identifying and weighting the key performance indicators of knowledge management 2.0 (KM2.0) in organizations. Given the widespread permeation of technology, especially social media, into different organizational dimensions and the functional view of this phenomenon in knowledge management, measuring the performance of this kind of media against organizational goals seems necessary. The KM2.0 key performance indicators in this article have been identified and weighted through a Delphi methodology, via a questionnaire administered in three rounds. The KM2.0 KPIs identified and weighted in this article are applicable in organizations that are eager to implement a KM2.0 initiative and want to measure the performance of KM2.0 activities; the research therefore supports a goal-oriented approach. According to the results, the KM2.0 participation process consists of 3 stages and 8 steps: the first stage, presence, consists of 3 steps (registration, visit and download); the second stage, feedback, consists of 3 steps (conversation, applause and amplification); and the third stage, creation, consists of 2 steps (codification and personalization). The ultimate contribution of this research is identifying and weighting the KPIs of KM2.0 within a KM2.0 conceptual framework. By developing a conceptual framework and participation process for KM2.0 and listing the related KPIs as an applicable way to measure and improve the performance of organizational social media, this research offers an innovation not found in related articles.

  12. Identifying Enterprise Leverage Points in Defense Acquisition Program Performance

    Science.gov (United States)

    2009-09-01

    … differentiated. [Table 1: Validation and Approval Authority] Beyond the major categories used for programs as noted above, there is also a … impossible to identify which "uber-portfolio" a system should belong to as many "portfolios" claim a system as an integral part of the larger portfolio … to differentiate between programs. DOD 5002, Enclosure E states "A technology project or acquisition program shall be categorized based on its

  13. Benchmarks for enhanced network performance: hands-on testing of operating system solutions to identify the optimal application server platform for the Graduate School of Business and Public Policy

    OpenAIRE

    Burman, Rex; Coca, Anthony R.

    2010-01-01

    MBA Professional Report With the release of next generation operating systems, network managers face the prospect of upgrading their systems based on the assumption that "newer is better". The Graduate School of Business and Public Policy is in the process of upgrading their network application server and one of the most important decisions to be made is which Server Operating System to use. Based on hands-on benchmark tests and analysis we aim to assist the GSBPP by providing benchma...

  14. Identifying the neural substrates of intrinsic motivation during task performance.

    Science.gov (United States)

    Lee, Woogul; Reeve, Johnmarshall

    2017-10-01

    Intrinsic motivation is the inherent tendency to seek out novelty and challenge, to explore and investigate, and to stretch and extend one's capacities. When people imagine performing intrinsically motivating tasks, they show heightened anterior insular cortex (AIC) activity. To fully explain the neural system of intrinsic motivation, however, requires assessing neural activity while people actually perform intrinsically motivating tasks (i.e., while answering curiosity-inducing questions or solving competence-enabling anagrams). Using event-related functional magnetic resonance imaging, we found that the neural system of intrinsic motivation involves not only AIC activity, but also striatum activity and, further, AIC-striatum functional interactions. These findings suggest that subjective feelings of intrinsic satisfaction (associated with AIC activations), reward processing (associated with striatum activations), and their interactions underlie the actual experience of intrinsic motivation. These neural findings are consistent with the conceptualization of intrinsic motivation as the pursuit and satisfaction of subjective feelings (interest and enjoyment) as intrinsic rewards.

  15. Identifying reverse 3PL performance critical success factors

    OpenAIRE

    Sharif, A M

    2009-01-01

    The reverse and third party logistics operational process is now well known and established to be a vital component of modern day supply chain and product / service-based organizations (Marasco, 2007). Apart from being a vital component of such enterprises, many researchers and practitioners have also been noting the importance of this approach and its impact on customer service, satisfaction, profitability and other key performance indicators (Autry et al., 2001). However, studies relating t...

  16. Performance profiling for brachytherapy applications

    Science.gov (United States)

    Choi, Wonqook; Cho, Kihyeon; Yeo, Insung

    2018-05-01

    In many physics applications, a significant amount of software (e.g. R, ROOT and Geant4) is developed on novel computing architectures, and much effort is expended to ensure the software is efficient in terms of central processing unit (CPU) time and memory usage. Profiling tools are used during the evaluation process to evaluate the efficiency; however, few such tools are able to accommodate low-energy physics regions. To address this limitation, we developed a low-energy physics profiling system in Geant4 to profile the CPU time and memory of software applications in brachytherapy applications. This paper describes and evaluates specific models that are applied to brachytherapy applications in Geant4, such as QGSP_BIC_LIV, QGSP_BIC_EMZ, and QGSP_BIC_EMY. The physics range in this tool allows it to be used to generate low energy profiles in brachytherapy applications. This was a limitation in previous studies, which caused us to develop a new profiling tool that supports profiling in the MeV range, in contrast to the TeV range that is supported by existing high-energy profiling tools. In order to easily compare the profiling results between low-energy and high-energy modes, we employed the same software architecture as that in the SimpliCarlo tool developed at the Fermilab National Accelerator Laboratory (FNAL) for the Large Hadron Collider (LHC). The results show that the newly developed profiling system for low-energy physics (less than MeV) complements the current profiling system used for high-energy physics (greater than TeV) applications.

  17. On the importance of identifying, characterizing, and predicting fundamental phenomena towards microbial electrochemistry applications.

    Science.gov (United States)

    Torres, César Iván

    2014-06-01

    The development of microbial electrochemistry research toward technological applications has increased significantly in the past years, leading to many process configurations. This short review focuses on the need to identify and characterize the fundamental phenomena that control the performance of microbial electrochemical cells (MXCs). Specifically, it discusses the importance of recent efforts to discover and characterize novel microorganisms for MXC applications, as well as recent developments to understand transport limitations in MXCs. As we increase our understanding of how MXCs operate, it is imperative to continue modeling efforts in order to effectively predict their performance, design efficient MXC technologies, and implement them commercially. Thus, the success of MXC technologies largely depends on the path of identifying, understanding, and predicting fundamental phenomena that determine MXC performance. Copyright © 2013 Elsevier Ltd. All rights reserved.

  18. Exercise Intensity Thresholds: Identifying the Boundaries of Sustainable Performance.

    Science.gov (United States)

    Keir, Daniel A; Fontana, Federico Y; Robertson, Taylor C; Murias, Juan M; Paterson, Donald H; Kowalchuk, John M; Pogliaghi, Silvia

    2015-09-01

    Critical power (CP), respiratory compensation point (RCP), maximal lactate steady state (MLSS), and deoxyhemoglobin breakpoint ([HHb]BP) are alternative functional indices that are thought to demarcate the highest exercise intensity that can be tolerated for long durations. We tested the hypothesis that CP, RCP, MLSS, and [HHb]BP occur at the same metabolic intensity by examining the pulmonary oxygen uptake (V˙O2p) and power output (PO) associated with each "threshold." Twelve healthy men (mean ± SD age, 27 ± 3 yr) performed the following tests on a cycle ergometer: i) four to five exhaustive tests for determination of CP, ii) two to three 30-min constant-power trials for MLSS determination, and iii) a ramp incremental exercise test from which the V˙O2p and PO at RCP and [HHb]BP were determined. During each trial, breath-by-breath V˙O2p and ventilatory variables were measured with a metabolic cart and flowmeter turbine; near-infrared spectroscopy-derived [HHb] was monitored using a frequency domain multidistance system, and arterialized capillary blood lactate was sampled at regular intervals. There were no differences (P > 0.05) among the V˙O2p values associated with CP, RCP, MLSS, and [HHb]BP (CP, 3.29 ± 0.48; RCP, 3.34 ± 0.45; MLSS, 3.27 ± 0.44; [HHb]BP, 3.41 ± 0.46 L·min(-1)); however, the PO associated with RCP (262 ± 48 W) and [HHb]BP (273 ± 41 W) were greater (P < 0.05). Although the standard methods for determination of CP, RCP, MLSS, and [HHb]BP are different, these indices occur at the same V˙O2p, suggesting that i) they may manifest as a result of similar physiological phenomena and ii) each provides a valid delineation between tolerable and intolerable constant-power exercise.

  19. Methods for identifying 30 chronic conditions: application to administrative data.

    Science.gov (United States)

    Tonelli, Marcello; Wiebe, Natasha; Fortin, Martin; Guthrie, Bruce; Hemmelgarn, Brenda R; James, Matthew T; Klarenbach, Scott W; Lewanczuk, Richard; Manns, Braden J; Ronksley, Paul; Sargious, Peter; Straus, Sharon; Quan, Hude

    2015-04-17

    Multimorbidity is common and associated with poor clinical outcomes and high health care costs. Administrative data are a promising tool for studying the epidemiology of multimorbidity. Our goal was to derive and apply a new scheme for using administrative data to identify the presence of chronic conditions and multimorbidity. We identified validated algorithms that use ICD-9 CM/ICD-10 data to ascertain the presence or absence of 40 morbidities. Algorithms with both positive predictive value and sensitivity ≥70% were graded as "high validity"; those with positive predictive value ≥70% and sensitivity <70% were graded as "moderate validity". To show proof of concept, we applied identified algorithms with high to moderate validity to inpatient and outpatient claims and utilization data from 574,409 people residing in Edmonton, Canada during the 2008/2009 fiscal year. Of the 40 morbidities, we identified 30 that could be identified with high to moderate validity. Approximately one quarter of participants had identified multimorbidity (2 or more conditions), one quarter had a single identified morbidity and the remaining participants were not identified as having any of the 30 morbidities. We identified a panel of 30 chronic conditions that can be identified from administrative data using validated algorithms, facilitating the study and surveillance of multimorbidity. We encourage other groups to use this scheme, to facilitate comparisons between settings and jurisdictions.
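
    For illustration, the validity-grading rule and multimorbidity count described above can be sketched as follows; the algorithm performance figures and the single person's condition flags are hypothetical, not the study's data.

```python
# Hypothetical case-finding algorithms with (positive predictive value %, sensitivity %).
algorithms = {
    "diabetes":     (93.0, 88.0),
    "hypertension": (81.0, 75.0),
    "depression":   (78.0, 52.0),
    "osteoporosis": (60.0, 85.0),
}

def grade(ppv, sens):
    """Grading rule from the abstract: both >=70% is high validity; PPV >=70%
    with sensitivity <70% is moderate validity; anything else is excluded."""
    if ppv >= 70 and sens >= 70:
        return "high validity"
    if ppv >= 70:
        return "moderate validity"
    return "excluded"

usable = {c for c, (ppv, sens) in algorithms.items() if grade(ppv, sens) != "excluded"}
print({c: grade(*v) for c, v in algorithms.items()})

# Applying the usable algorithms to one (hypothetical) person's claims history:
flags = {"diabetes": True, "hypertension": True, "depression": False}
n_conditions = sum(flags.get(c, False) for c in usable)
print("multimorbidity" if n_conditions >= 2 else "single morbidity or none")
```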

  20. Application of artificial neural network to identify nuclear materials

    International Nuclear Information System (INIS)

    Xu Peng; Wang Zhe; Li Tiantuo

    2005-01-01

    Applying the neural network, the article studied the technology of identifying the gamma spectra of the nuclear material in the nuclear components. In the article, the theory of the network identifying the spectra is described, and the results of identification of gamma spectra are given. (authors)

  1. APPLICATION, PERFORMANCE, AND COSTS OF ...

    Science.gov (United States)

    A critical review of biological treatment processes for remediation of contaminated soils is presented. The focus of the review is on documented cost and performance of biological treatment technologies demonstrated at full- or field-scale. Some of the data were generated by the U.S. Environmental Protection Agency's (EPA's) Bioremediation in the Field Program, jointly supported by EPA's Office of Research and Development, EPA's Office of Solid Waste and Emergency Response, and the EPA Regions through the Superfund Innovative Technology Evaluation (SITE) Program. Military sites proved to be another fertile data source. Technologies reviewed in this report include both ex-situ processes (land treatment, biopile/biocell treatment, composting, and bioslurry reactor treatment) and in-situ alternatives (conventional bioventing, enhanced or cometabolic bioventing, anaerobic bioventing, bioslurping, phytoremediation, and natural attenuation). Targeted soil contaminants at the documented sites were primarily organic chemicals, including BTEX, petroleum hydrocarbons, polycyclic aromatic hydrocarbons (PAHs), chlorinated aliphatic hydrocarbons (CAHs), organic solvents, polychlorinated biphenyls (PCBs), pesticides, dioxin, and energetics. The advantages, limitations, and major cost drivers for each technology are discussed. Box and whisker plots are used to summarize before and after concentrations of important contaminant groups for those technologies consider

  2. Non-identifier based adaptive control in mechatronics theory and application

    CERN Document Server

    Hackl, Christoph M

    2017-01-01

    This book introduces non-identifier-based adaptive control (with and without internal model) and its application to the current, speed and position control of mechatronic systems such as electrical synchronous machines, wind turbine systems, industrial servo systems, and rigid-link, revolute-joint robots. In mechatronics, there is often only rough knowledge of the system. Due to parameter uncertainties, nonlinearities and unknown disturbances, model-based control strategies can reach their performance or stability limits without iterative controller design and performance evaluation, or system identification and parameter estimation. The non-identifier-based adaptive control presented is an alternative that neither identifies the system nor estimates its parameters but ensures stability. The adaptive controllers are easy to implement, compensate for disturbances and are inherently robust to parameter uncertainties and nonlinearities. For controller implementation only structural system knowledge (like relativ...

  3. State E-Government Strategies: Identifying Best Practices and Applications

    Science.gov (United States)

    2007-07-23

    Internet; • Developing meaningful online applications for local government, businesses, educators, and other sectors; • Establishing local "eCommunity" … state, national, and international levels. However, frequently there is little meaningful coordination or communication between various e-government … weekly with the governor, 13% reported meeting monthly, and 21% reported "other," meaning that these states have a different meeting schedule

  4. Performance Evaluation Model for Application Layer Firewalls.

    Science.gov (United States)

    Xuan, Shichang; Yang, Wu; Dong, Hui; Zhang, Jiangchuan

    2016-01-01

    Application layer firewalls protect the trusted area network against information security risks. However, firewall performance may affect user experience. Therefore, performance analysis plays a significant role in the evaluation of application layer firewalls. This paper presents an analytic model of the application layer firewall, based on a system analysis to evaluate the capability of the firewall. In order to enable users to improve the performance of the application layer firewall with limited resources, resource allocation was evaluated to obtain the optimal resource allocation scheme in terms of throughput, delay, and packet loss rate. The proposed model employs the Erlangian queuing model to analyze the performance parameters of the system with regard to the three layers (network, transport, and application layers). Then, the analysis results of all the layers are combined to obtain the overall system performance indicators. A discrete event simulation method was used to evaluate the proposed model. Finally, limited service desk resources were allocated to obtain the values of the performance indicators under different resource allocation scenarios in order to determine the optimal allocation scheme. Under limited resource allocation, this scheme enables users to maximize the performance of the application layer firewall.
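
    As a generic illustration of the queueing idea above (not the paper's exact layered model), the sketch below uses the Erlang C formula for an M/M/c queue to estimate delay under different allocations of a fixed pool of service-desk resources; the arrival and service rates and the two-layer split are hypothetical.

```python
import math

def erlang_c(c, lam, mu):
    """M/M/c queue: probability of waiting (Erlang C), mean queueing delay, total delay.

    c   -- number of service-desk resources allocated to a layer
    lam -- request arrival rate (per second)
    mu  -- service rate per resource (per second)
    """
    a = lam / mu                                        # offered load in Erlangs
    if a >= c:
        raise ValueError("unstable: offered load >= number of servers")
    sum_terms = sum(a ** k / math.factorial(k) for k in range(c))
    last = a ** c / math.factorial(c) * c / (c - a)
    p_wait = last / (sum_terms + last)                  # probability a request has to queue
    wq = p_wait / (c * mu - lam)                        # mean time spent waiting in the queue
    return p_wait, wq, wq + 1 / mu                      # wait prob., queueing delay, total delay

# Compare two ways of splitting 6 resources between two layers (hypothetical rates).
for servers_app, servers_net in [(2, 4), (3, 3)]:
    _, _, d_app = erlang_c(servers_app, lam=40, mu=25)
    _, _, d_net = erlang_c(servers_net, lam=90, mu=35)
    print(f"app={servers_app}, net={servers_net}: total mean delay = {1000 * (d_app + d_net):.1f} ms")
```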

  5. Performance Evaluation Model for Application Layer Firewalls.

    Directory of Open Access Journals (Sweden)

    Shichang Xuan

    Full Text Available Application layer firewalls protect the trusted area network against information security risks. However, firewall performance may affect user experience. Therefore, performance analysis plays a significant role in the evaluation of application layer firewalls. This paper presents an analytic model of the application layer firewall, based on a system analysis to evaluate the capability of the firewall. In order to enable users to improve the performance of the application layer firewall with limited resources, resource allocation was evaluated to obtain the optimal resource allocation scheme in terms of throughput, delay, and packet loss rate. The proposed model employs the Erlangian queuing model to analyze the performance parameters of the system with regard to the three layers (network, transport, and application layers. Then, the analysis results of all the layers are combined to obtain the overall system performance indicators. A discrete event simulation method was used to evaluate the proposed model. Finally, limited service desk resources were allocated to obtain the values of the performance indicators under different resource allocation scenarios in order to determine the optimal allocation scheme. Under limited resource allocation, this scheme enables users to maximize the performance of the application layer firewall.

  6. Modelling the regional application of stakeholder identified land management strategies.

    Science.gov (United States)

    Irvine, B. J.; Fleskens, L.; Kirkby, M. J.

    2012-04-01

    The DESIRE project has trialled a series of sustainable land management (SLM) technologies. These technologies have been identified as being beneficial in mitigating land degradation by local stakeholders from a range of semi-arid study sites. The field results and the qualitative WOCAT technology assessment from across the study sites have been used to develop the adapted PESERA SLM model. This paper considers the development of the adapted PESERA SLM model and the potential for applying locally successful SLM technologies across a wider range of climatic and environmental conditions with respect to degradation risk, biomass production and the investment cost interface (PESERA/DESMICE). The integrated PESERA/DESMICE model contributes to the policy debate by providing a biophysical and socio-economic assessment of technology and policy scenarios.

  7. Identifying fly puparia by clearing technique: application to forensic entomology.

    Science.gov (United States)

    Sukontason, Kabkaew L; Ngern-Klun, Radchadawan; Sripakdee, Duanghatai; Sukontason, Kom

    2007-10-01

    In forensic investigations, immature stages of the fly (egg, larva, or puparia) can be used as entomological evidence at death scenes, not only to estimate the postmortem interval (PMI), analyze toxic substances, and to determine the manner of death but also to indicate the movement of a corpse in homicide cases. Of these immature stages, puparia represent the longest developmental time, which makes them useful. However, in order for forensic entomologists to use puparia effectively, it is crucial that they are able to accurately identify the species of fly found in a corpse. Typically, these puparia are similar in general appearance, being coarctate and light brown to dark brown in color, which makes identification difficult. In this study, we report on the clearing technique used to pale the integument of fly puparia, thereby allowing observation of the anterior end (second to fourth segments) and the profile of the posterior spiracle, which are important clues for identification. We used puparia of the blowfly, Chrysomya megacephala (F.), as the model species in this experiment. With placement in a 20% potassium hydroxide solution daily and mounting on a clearing medium (Permount(R), New Jersey), the profile of the posterior spiracle could be clearly examined under a light microscope beginning on the fifth day after pupation, and the number of papillae in the anterior spiracle could be counted easily starting from the ninth day. Comparison of morphological features of C. megacephala puparia with those of other blowflies (Chrysomya nigripes [Aubertin], Chrysomya rufifacies [Macquart], Chrysomya villeneuvi [Patton], Lucilia cuprina [Wiedemann], and Hemipyrellia ligurriens [Wiedemann]) and a housefly (Musca domestica L.) revealed that the anterior ends and the profiles of the posterior spiracles had markedly distinguishing characteristics. Morphometric analysis of the length and width of puparia, along with the length of the gaps between the posterior spiracles

  8. Computer performance optimization systems, applications, processes

    CERN Document Server

    Osterhage, Wolfgang W

    2013-01-01

    Computing power performance was important at times when hardware was still expensive, because hardware had to be put to the best use. Later on this criterion was no longer critical, since hardware had become inexpensive. Meanwhile, however, people have realized that performance again plays a significant role, because of the major drain on system resources involved in developing complex applications. This book distinguishes between three levels of performance optimization: the system level, application level and business processes level. On each, optimizations can be achieved and cost-cutting p

  9. How to identify, assess and utilise mobile medical applications in clinical practice.

    Science.gov (United States)

    Aungst, T D; Clauson, K A; Misra, S; Lewis, T L; Husain, I

    2014-02-01

    There are thousands of medical applications for mobile devices targeting use by healthcare professionals. However, several factors related to the structure of the existing market for medical applications create significant barriers preventing practitioners from effectively identifying mobile medical applications for individual professional use. To define existing market factors relevant to the selection of medical applications and describe a framework to empower clinicians to identify, assess and utilise mobile medical applications in their own practice. Data sources included resources available on the Internet regarding mobile medical applications, guidelines and published research on mobile medical applications. Mobile application stores (e.g. iTunes, Google Play) are not an effective means of identifying mobile medical applications. Users of mobile devices who desire to implement mobile medical applications into practice need to carefully assess individual applications prior to utilisation. Searching for and identifying mobile medical applications requires clinicians to utilise multiple references to determine which application is best for their individual practice methods. This can be done with a cursory exploration of mobile application stores followed by consultation of other available resources published in the literature or through Internet resources (e.g. blogs, medical websites, social media). Clinicians must also take steps to ensure that an identified mobile application can be integrated into practice after carefully reviewing it themselves. Clinicians seeking to identify mobile medical applications for use in their individual practice should use a combination of app stores, published literature, web-based resources, and personal review to ensure safe and appropriate use. © 2014 John Wiley & Sons Ltd.

  10. Identifying context factors explaining physician's low performance in communication assessment: an explorative study in general practice.

    NARCIS (Netherlands)

    Essers, G.; Dulmen, S. van; Weel, C. van; Vleuten, C. van der; Kramer, A.

    2011-01-01

    BACKGROUND: Communication is a key competence for health care professionals. Analysis of registrar and GP communication performance in daily practice, however, suggests a suboptimal application of communication skills. The influence of context factors could reveal why communication performance

  11. Application of electron paramagnetic resonance to identify irradiated soybean

    International Nuclear Information System (INIS)

    Bhaskar, S.; Behere, Arun; Sharma, Arun

    2006-01-01

    Electron paramagnetic resonance spectroscopy was applied to study free radicals in soybean seed after gamma irradiation and to establish the potential of these radiation-induced free radicals as an indicator of the radiation treatment. The radiation doses administered to the samples were 1 to 30 kGy. A stable doublet signal was detected at g = 2.0279 with a hyperfine coupling constant of 2.8 mT, produced only by radiolysis. This signal can be used to identify irradiated soybean seed samples. With increasing radiation dose, the central line intensity and the intensities of the satellite lines rose almost linearly, with linear correlation factors of 0.99724 and 0.99996, respectively. Thermal treatment at 373 K in air was studied. No line specific to thermolysis was observed. The spectrometer was operated with a power of 0.253 mW, microwave frequency of 9.74 GHz, modulation frequency of 100 kHz and scan range of 10 mT. To study the stability of the signal, EPR spectra were obtained from the irradiated skin part of soybean seed samples following 1 and 90 days of storage after radiation treatment. The two satellite lines at g(left) = 2.0279 and g(right) = 1.99529 were detected in all samples. This suggests that the signal is associated with a stable radical and, therefore, the detection of this particular free radical as a marker of irradiation is proposed.

  12. Identifying trends in climate: an application to the cenozoic

    Science.gov (United States)

    Richards, Gordon R.

    1998-05-01

    The recent literature on trending in climate has raised several issues: whether trends should be modeled as deterministic or stochastic, whether trends are nonlinear, and the relative merits of statistical models versus models based on physics. This article models trending since the late Cretaceous. This 68 million-year interval is selected because the reliability of tests for trending is critically dependent on the length of time spanned by the data. Two main hypotheses are tested: that the trend has been caused primarily by CO2 forcing, and that it reflects a variety of forcing factors which can be approximated by statistical methods. The CO2 data are obtained from model simulations. Several widely used statistical models are found to be inadequate. ARIMA methods parameterize too much of the short-term variation and do not identify low-frequency movements. Further, the unit root in the ARIMA process does not predict the long-term path of temperature. Spectral methods also have little ability to predict temperature at long horizons. Instead, the statistical trend is estimated using a nonlinear smoothing filter. Both paradigms make it possible to model climate as a cointegrated process, in which temperature can wander quite far from the trend path in the intermediate term but converges back over longer horizons. Comparing the forecasting properties of the two trend models demonstrates that the optimal forecasting model includes CO2 forcing and a parametric representation of the nonlinear variability in climate.
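
    As a rough illustration of the trend-plus-cointegration idea, the following sketch fits a nonlinear smoothing filter (LOWESS) to a synthetic temperature series and then checks, with an augmented Dickey-Fuller test, whether departures from the fitted trend are stationary, i.e. whether temperature converges back to the trend over long horizons. The series, the smoothing span and the particular test are placeholders, not the article's data or exact method.

    import numpy as np
    from statsmodels.nonparametric.smoothers_lowess import lowess
    from statsmodels.tsa.stattools import adfuller

    rng = np.random.default_rng(0)
    t = np.linspace(0, 68, 681)                      # Myr before present (illustrative)
    trend = 12.0 - 0.1 * t + 2.0 * np.exp(-t / 30)   # hypothetical long-run path
    temp = trend + np.cumsum(rng.normal(0, 0.15, t.size))  # wandering departures

    # Nonlinear smoothing filter: LOWESS estimate of the low-frequency trend
    fitted = lowess(temp, t, frac=0.2, return_sorted=False)

    # Cointegration-style check: departures from the trend should be stationary,
    # i.e. temperature converges back to the trend over long horizons
    adf_stat, p_value, *_ = adfuller(temp - fitted)
    print(f"ADF statistic = {adf_stat:.2f}, p-value = {p_value:.3f}")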

  13. DURIP: High Performance Computing in Biomathematics Applications

    Science.gov (United States)

    2017-05-10

    The goal of this award was to enhance the capabilities of the Department of Applied Mathematics and Statistics (AMS) at the University of California, Santa Cruz (UCSC) to conduct research and research-related education in areas of high performance computing in biomathematics applications.

  14. APM Best Practices Realizing Application Performance Management

    CERN Document Server

    Sydor, Michael J

    2011-01-01

    The objective of APM Best Practices: Realizing Application Performance Management is to establish reliable application performance management (APM) practices - to demonstrate value, to do it quickly, and to adapt to the client circumstances. It's important to balance long-term goals with short-term deliverables, but without compromising usefulness or correctness. The successful strategy is to establish a few reasonable goals, achieve them quickly, and then iterate over the same topics two more times, with each successive iteration expanding the skills and capabilities of the APM team. This str

  15. Optical design applications for enhanced illumination performance

    Science.gov (United States)

    Gilray, Carl; Lewin, Ian

    1995-08-01

    Nonimaging optical design techniques have been applied in the illumination industry for many years. Recently however, powerful software has been developed which allows accurate simulation and optimization of illumination devices. Wide experience has been obtained in using such design techniques for practical situations. These include automotive lighting where safety is of greatest importance, commercial lighting systems designed for energy efficiency, and numerous specialized applications. This presentation will discuss the performance requirements of a variety of illumination devices. It will further cover design methodology and present a variety of examples of practical applications for enhanced system performance.

  16. Identifying context factors explaining physician's low performance in communication assessment: an explorative study in general practice.

    Science.gov (United States)

    Essers, Geurt; van Dulmen, Sandra; van Weel, Chris; van der Vleuten, Cees; Kramer, Anneke

    2011-12-13

    Communication is a key competence for health care professionals. Analysis of registrar and GP communication performance in daily practice, however, suggests a suboptimal application of communication skills. The influence of context factors could reveal why communication performance levels, on average, do not appear adequate. The context of daily practice may require different skills or specific ways of handling these skills, whereas communication skills are mostly treated as generic. So far no empirical analysis of the context has been made. Our aim was to identify context factors that could be related to GP communication. A purposive sample of real-life videotaped GP consultations was analyzed (N = 17). As a frame of reference we chose the MAAS-Global, a widely used assessment instrument for medical communication. By inductive reasoning, we analyzed the GP behaviour in the consultation leading to poor item scores on the MAAS-Global. In these cases we looked for the presence of an intervening context factor, and how this might explain the actual GP communication behaviour. We reached saturation after having viewed 17 consultations. We identified 19 context factors that could potentially explain the deviation from generic recommendations on communication skills. These context factors can be categorized into doctor-related, patient-related, and consultation-related factors. Several context factors seem to influence doctor-patient communication, requiring the GP to apply communication skills differently from recommendations on communication. From this study we conclude that there is a need to explicitly account for context factors in the assessment of GP (and GP registrar) communication performance. The next step is to validate our findings.

  17. Diagnostic performance of BMI percentiles to identify adolescents with metabolic syndrome.

    Science.gov (United States)

    Laurson, Kelly R; Welk, Gregory J; Eisenmann, Joey C

    2014-02-01

    To compare the diagnostic performance of the Centers for Disease Control and Prevention (CDC) and FITNESSGRAM (FGram) BMI standards for quantifying metabolic risk in youth. Adolescents in the NHANES (n = 3385) were measured for anthropometric variables and metabolic risk factors. BMI percentiles were calculated, and youth were categorized by weight status (using CDC and FGram thresholds). Participants were also categorized by presence or absence of metabolic syndrome. The CDC and FGram standards were compared by prevalence of metabolic abnormalities, various diagnostic criteria, and odds of metabolic syndrome. Receiver operating characteristic curves were also created to identify optimal BMI percentiles to detect metabolic syndrome. The prevalence of metabolic syndrome in obese youth was 19% to 35%, compared with <2% in the normal-weight groups. The odds of metabolic syndrome for obese boys and girls were 46 to 67 and 19 to 22 times greater, respectively, than for normal-weight youth. The receiver operating characteristic analyses identified optimal thresholds similar to the CDC standards for boys and the FGram standards for girls. Overall, BMI thresholds were more strongly associated with metabolic syndrome in boys than in girls. Both the CDC and FGram standards are predictive of metabolic syndrome. The diagnostic utility of the CDC thresholds outperformed the FGram values for boys, whereas FGram standards were slightly better thresholds for girls. The use of a common set of thresholds for school and clinical applications would provide advantages for public health and clinical research and practice.
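
    The receiver operating characteristic analysis described above can be sketched as follows. The BMI-percentile and metabolic-syndrome data are simulated here purely for illustration, and Youden's J is used as one common criterion for selecting the optimal threshold; the study itself compares fixed CDC and FITNESSGRAM cut-points as well.

    import numpy as np
    from sklearn.metrics import roc_curve, roc_auc_score

    rng = np.random.default_rng(1)
    # Hypothetical data: BMI percentile and metabolic syndrome status (1 = present)
    n = 2000
    mets = rng.binomial(1, 0.08, n)
    bmi_pct = np.clip(rng.normal(60 + 30 * mets, 20), 1, 99)

    fpr, tpr, thresholds = roc_curve(mets, bmi_pct)
    j = tpr - fpr                          # Youden's J statistic at each cut-off
    best = thresholds[np.argmax(j)]
    print(f"AUC = {roc_auc_score(mets, bmi_pct):.2f}, "
          f"optimal BMI percentile threshold ~ {best:.1f}")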

  18. Identifying context factors explaining physician's low performance in communication assessment: an explorative study in general practice.

    NARCIS (Netherlands)

    Essers, G.T.J.M.; Dulmen, A.M. van; Weel, C. van; Vleuten, C.P.M. van der; Kramer, A.W.

    2011-01-01

    ABSTRACT: BACKGROUND: Communication is a key competence for health care professionals. Analysis of registrar and GP communication performance in daily practice, however, suggests a suboptimal application of communication skills. The influence of context factors could reveal why communication

  19. Scientific Applications Performance Evaluation on Burst Buffer

    KAUST Repository

    Markomanolis, George S.

    2017-10-19

    Parallel I/O is an integral component of modern high performance computing, especially in storing and processing very large datasets, as in the case of seismic imaging, CFD, combustion and weather modeling. The storage hierarchy nowadays includes additional layers, the latest being the usage of SSD-based storage as a Burst Buffer for I/O acceleration. We present an in-depth analysis of how to use the Burst Buffer for specific cases and how the internal MPI I/O aggregators operate according to the options that the user provides during job submission. We analyze the performance of a range of I/O-intensive scientific applications, at various scales, on a large installation of the Lustre parallel file system compared to an SSD-based Burst Buffer. Our results show a performance improvement over Lustre when using the Burst Buffer. Moreover, we show results from a data hierarchy library which indicate that the standard I/O approaches are not enough to get the expected performance from this technology. The performance gain in the total execution time of the studied applications is between 1.16 and 3 times compared to Lustre. One of the test cases achieved an impressive I/O throughput of 900 GB/s on the Burst Buffer.

  20. Application of data mining in performance measures

    Science.gov (United States)

    Chan, Michael F. S.; Chung, Walter W.; Wong, Tai Sun

    2001-10-01

    This paper proposes a structured framework for exploiting data mining applications for performance measures. The context is set in an airline company to illustrate the use of such a framework. The framework takes into consideration how a knowledge worker interacts with performance information at the enterprise level to support informed decisions in managing the effectiveness of operations. A case study of applying data mining technology to performance data in an airline company is presented. The use of performance measures is specifically applied to assist in the aircraft delay management process. The increasingly dispersed and complex nature of airline operations puts much strain on knowledge workers in searching for, acquiring and analyzing information to manage performance. One major problem faced by knowledge workers is the identification of root causes of performance deficiency. The large number of factors involved makes analyzing root causes time-consuming, and the objective of applying data mining technology is to reduce the time and resources needed for this process. Increasing market competition for better performance management in various industries gives rise to the need for intelligent use of data. Because of this, the framework proposed here is readily generalizable to industries such as manufacturing. It could assist knowledge workers who are constantly looking for ways to improve operational effectiveness through new initiatives, an effort that must be completed quickly to gain competitive advantage in the marketplace.

  1. Automatic Energy Schemes for High Performance Applications

    Energy Technology Data Exchange (ETDEWEB)

    Sundriyal, Vaibhav [Iowa State Univ., Ames, IA (United States)

    2013-01-01

    Although high-performance computing traditionally focuses on the efficient execution of large-scale applications, both energy and power have become critical concerns when approaching exascale. Drastic increases in the power consumption of supercomputers significantly affect their operating costs and failure rates. In modern microprocessor architectures, equipped with dynamic voltage and frequency scaling (DVFS) and CPU clock modulation (throttling), the power consumption may be controlled in software. Additionally, the network interconnect, such as Infiniband, may be exploited to maximize energy savings, while the application performance loss and frequency switching overheads must be carefully balanced. This work first studies two important collective communication operations, all-to-all and allgather, and proposes energy-saving strategies on a per-call basis. Next, it targets point-to-point communications, grouping them into phases and applying frequency scaling to them to save energy by exploiting the architectural and communication stalls. Finally, it proposes an automatic runtime system which combines both collective and point-to-point communications into phases, and applies throttling to them in addition to DVFS to maximize energy savings. The experimental results are presented for NAS parallel benchmark problems as well as for realistic parallel electronic structure calculations performed by the widely used quantum chemistry package GAMESS. Close to the maximum energy savings were obtained with a substantially low performance loss on the given platform.

  2. Performance of the lot quality assurance sampling method compared to surveillance for identifying inadequately-performing areas in Matlab, Bangladesh.

    Science.gov (United States)

    Bhuiya, Abbas; Hanifi, S M A; Roy, Nikhil; Streatfield, P Kim

    2007-03-01

    This paper compared the performance of the lot quality assurance sampling (LQAS) method in identifying inadequately-performing health work-areas with that of using health and demographic surveillance system (HDSS) data and examined the feasibility of applying the method by field-level programme supervisors. The study was carried out in Matlab, the field site of ICDDR,B, where a HDSS has been in place for over 30 years. The LQAS method was applied in 57 work-areas of community health workers in ICDDR,B-served areas in Matlab during July-September 2002. The performance of the LQAS method in identifying work-areas with adequate and inadequate coverage of various health services was compared with those of the HDSS. The health service-coverage indicators included coverage of DPT, measles, BCG vaccination, and contraceptive use. It was observed that the difference in the proportion of work-areas identified to be inadequately performing using the LQAS method with less than 30 respondents, and the HDSS was not statistically significant. The consistency between the LQAS method and the HDSS in identifying work-areas was greater for adequately-performing areas than inadequately-performing areas. It was also observed that the field managers could be trained to apply the LQAS method in monitoring their performance in reaching the target population.
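
    For readers unfamiliar with LQAS, the sketch below shows the core decision rule: a work-area is judged adequate if at least d of n sampled respondents are covered, and the binomial distribution gives the misclassification risks for a candidate (n, d) pair. The coverage thresholds and sample sizes are illustrative assumptions rather than the values used in Matlab.

    from scipy.stats import binom

    def lqas_risks(n, d, p_high=0.80, p_low=0.50):
        """Classification errors for an LQAS decision rule.

        A work-area is judged adequate if at least d of n sampled
        respondents are covered. alpha: risk of failing a truly good
        area (coverage p_high); beta: risk of passing a truly poor
        area (coverage p_low). Thresholds here are illustrative.
        """
        alpha = binom.cdf(d - 1, n, p_high)        # P(fewer than d successes | good area)
        beta = 1.0 - binom.cdf(d - 1, n, p_low)    # P(at least d successes | poor area)
        return alpha, beta

    for n, d in [(19, 13), (25, 17), (30, 20)]:
        a, b = lqas_risks(n, d)
        print(f"n={n:2d}, d={d:2d}: alpha={a:.3f}, beta={b:.3f}")

    Field supervisors can then pick the smallest (n, d) pair whose two error risks are acceptable, which is what makes the method practical with fewer than 30 respondents per work-area.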

  3. Performance of Student Software Development Teams: The Influence of Personality and Identifying as Team Members

    Science.gov (United States)

    Monaghan, Conal; Bizumic, Boris; Reynolds, Katherine; Smithson, Michael; Johns-Boast, Lynette; van Rooy, Dirk

    2015-01-01

    One prominent approach in the exploration of the variations in project team performance has been to study two components of the aggregate personalities of the team members: conscientiousness and agreeableness. A second line of research, known as self-categorisation theory, argues that identifying as team members and the team's performance norms…

  4. Performance-Oriented packaging: A guide to identifying, procuring, and using

    International Nuclear Information System (INIS)

    O'Brien, J.H.

    1992-09-01

    This document guides users through the process of correctly identifying, obtaining, and using performance-oriented packaging. Almost all hazardous material shipments can be made in commercially available performance-oriented packaging. To cover the remaining shipments requiring specially designed packaging, a design guide is being developed. The design guide is scheduled to be issued 1 year after this procurement guide

  5. A systematic literature search to identify performance measure outcomes used in clinical studies of racehorses.

    Science.gov (United States)

    Wylie, C E; Newton, J R

    2018-05-01

    Racing performance is often used as a measurable outcome variable in research studies investigating clinical diagnoses or interventions. However, the use of many different performance measures largely precludes conduct of meaningful comparative studies and, to date, those being used have not been collated. To systematically review the veterinary scientific literature for the use of racing performance as a measurable outcome variable in clinical studies of racehorses, collate and identify those most popular, and identify their advantages and disadvantages. Systematic literature search. The search criteria "((racing AND performance) AND (horses OR equidae))" were adapted for both MEDLINE and CAB Abstracts databases. Data were collected in standardised recording forms for binary, categorical and quantitative measures, and the use of performance indices. In total, 217 studies that described racing performance were identified, contributing 117 different performance measures. No one performance measure was used in all studies, despite 90.3% using more than one variable. Data regarding race starts and earnings were used most commonly, with 88.0% and 54.4% of studies including at least one measure of starts and earnings, respectively. Seventeen variables were used 10 times or more, with the top five comprising: 'return to racing', 'number of starts', 'days to first start', 'earnings per period of time' and 'earnings per start'. The search strategies may not have identified all relevant papers, introducing bias to the review. Performance indices have been developed to improve assessment of interventions; however, they are not widely adopted in the scientific literature. Use of the two most commonly identified measures, whether the horse returned to racing and number of starts over a defined period of time, would best facilitate future systematic reviews and meta-analyses in advance of the development of a gold-standard measure of race performance outcome. © 2017 EVJ Ltd.

  6. Identifying the most significant indicators of the total road safety performance index.

    Science.gov (United States)

    Tešić, Milan; Hermans, Elke; Lipovac, Krsto; Pešić, Dalibor

    2018-04-01

    The review of the national and international literature dealing with the assessment of the road safety level has shown great efforts by authors who have tried to define a methodology for calculating a composite road safety index for a territory (region, state, etc.). The procedure for obtaining a road safety composite index of an area has been largely harmonized. The question that has not yet been fully resolved concerns the selection of indicators. There is a wide range of road safety indicators used to describe the road safety situation in a territory. A road safety performance index (RSPI) obtained on the basis of a larger number of safety performance indicators (SPIs) enables decision makers to define earlier, goal-oriented actions more precisely. Moreover, recording a broader, comprehensive set of SPIs helps identify the strengths and weaknesses of a country's road safety system. Providing high-quality national and international databases that include comparable SPIs is difficult, since only a small number of identical indicators are available for use across a larger number of countries. Therefore, there is a need to calculate a road safety performance index with a limited number of indicators (RSPIln n) which will provide a comparison of sufficient quality for as many countries as possible. The application of the Data Envelopment Analysis (DEA) method and correlation analysis has helped to check whether the RSPIln n is likely to be of sufficient quality. A strong correlation between the RSPIln n and the RSPI has been identified using the proposed methodology. Based on this, the most contributing indicators and methodologies for gradual monitoring of SPIs have been defined for each country analyzed. The indicator monitoring phases in the analyzed countries have been defined in the following way: Phase 1 - the indicators relating to alcohol, speed and protective systems; Phase 2 - the indicators relating to roads; and Phase 3 - the indicators relating to
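
    A simplified way to examine whether a limited-indicator index tracks the full index is sketched below; it uses equal weights and a Pearson correlation on simulated SPI data, whereas the paper derives weights with DEA, so this is only a schematic of the comparison step, under invented data.

    import numpy as np
    from scipy.stats import pearsonr

    rng = np.random.default_rng(4)
    # Hypothetical normalised SPIs for 20 countries; columns could stand for
    # alcohol, speed, protective systems, roads, trauma care, ... (larger = safer)
    full = rng.uniform(0.3, 1.0, size=(20, 7))
    reduced = full[:, :3]          # limited indicator set available in most countries

    # Simplified equal-weight composites (the paper derives weights with DEA instead)
    rspi_full = full.mean(axis=1)
    rspi_ltd = reduced.mean(axis=1)

    r, p = pearsonr(rspi_ltd, rspi_full)
    print(f"correlation between limited and full index: r = {r:.2f} (p = {p:.3f})")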

  7. Heat pumps for geothermal applications: availability and performance. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Reistad, G.M.; Means, P.

    1980-05-01

    A study of the performance and availability of water-source heat pumps was carried out. The primary purposes were to obtain the necessary basic information required for proper evaluation of the role of water-source heat pumps in geothermal energy utilization and/or to identify the research needed to provide this information. The Search of Relevant Literature considers the historical background, applications, achieved and projected performance evaluations and performance improvement techniques. The commercial water-source heat pump industry is considered in regard to both the present and projected availability and performance of units. Performance evaluations are made for units that use standard components but are redesigned for use in geothermal heating.

  8. Aplicação da análise fatorial para identificação dos principais indicadores de desempenho econômico-financeiro em instituições financeiras bancárias = Application of factor analysis to identify the main indicators of economic and financial performance in banking financial institutions

    Directory of Open Access Journals (Sweden)

    Júlia Alves e Souza

    2017-04-01

    2014 are encompassed. A quantitative and descriptive study was carried out. For data analysis, the statistical technique of factor analysis was used. In applying this technique, the overall appropriateness of the model and of each variable was verified in order to identify the key indicators to be used in the analysis of banks. The study was developed from an initial set of 17 indicators used to analyze the economic and financial performance of such institutions. Following the criteria of the factor analysis technique, the indicators that explain the maximum variance with the smallest possible number of variables were selected. The results show that the most relevant indicators for evaluating the performance of such institutions are: Return on Total Investment, Net Margin, Return on Equity, Ratio of Capital to Deposits, Loans/Deposits Ratio, Immediate Liquidity, Voluntary Fit and Interest Rate Sensitivity. These 8 indicators can also be replaced by 3 factors, which explain about 89.23% of the overall variance in the data. The factors "Profitability and Profitability", "Capital and Liquidity" and "Fitting and Interest Sensitivity" allow us to classify and compare the performance of banking financial institutions.
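
    A minimal sketch of the factor-analysis step is given below, using simulated indicator data and scikit-learn's FactorAnalysis with varimax rotation; the indicator matrix, the choice of three factors and the resulting variance figures are placeholders, not the study's data or exact factor solution.

    import numpy as np
    from sklearn.preprocessing import StandardScaler
    from sklearn.decomposition import FactorAnalysis

    rng = np.random.default_rng(2)
    # Hypothetical matrix: 100 bank-year observations x 17 financial indicators
    X = StandardScaler().fit_transform(rng.normal(size=(100, 17)))

    # rotation='varimax' requires scikit-learn >= 0.24
    fa = FactorAnalysis(n_components=3, rotation="varimax").fit(X)
    loadings = fa.components_.T                      # 17 indicators x 3 factors

    # Share of total variance reproduced by each factor (sum of squared loadings)
    explained = (loadings ** 2).sum(axis=0) / X.shape[1]
    print("variance explained per factor:", np.round(explained, 3))

    # Indicators loading most heavily on each factor are retained for the analysis
    for f in range(3):
        print(f"factor {f + 1}: top indicators", np.argsort(-np.abs(loadings[:, f]))[:3])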

  9. Identifying Key Performance Indicators for Holistic Hospital Management with a Modified DEMATEL Approach.

    Science.gov (United States)

    Si, Sheng-Li; You, Xiao-Yue; Liu, Hu-Chen; Huang, Jia

    2017-08-19

    Performance analysis is an important way for hospitals to achieve higher efficiency and effectiveness in providing services to their customers. The performance of the healthcare system can be measured by many indicators, but it is difficult to improve them simultaneously due to limited resources. A feasible way is to identify the central and influential indicators so as to improve healthcare performance in a stepwise manner. In this paper, we propose a hybrid multiple criteria decision making (MCDM) approach to identify key performance indicators (KPIs) for holistic hospital management. First, through integrating the evidential reasoning approach and interval 2-tuple linguistic variables, various assessments of performance indicators provided by healthcare experts are modeled. Then, the decision making trial and evaluation laboratory (DEMATEL) technique is adopted to build an interactive network and visualize the causal relationships between the performance indicators. Finally, an empirical case study is provided to demonstrate the proposed approach for improving the efficiency of healthcare management. The results show that "accidents/adverse events", "nosocomial infection", "incidents/errors" and "number of operations/procedures" are significant influential indicators. Also, the indicators of "length of stay", "bed occupancy" and "financial measures" play important roles in the performance evaluation of the healthcare organization. The proposed decision making approach could be considered as a reference for healthcare administrators to enhance the performance of their healthcare institutions.
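
    The DEMATEL computation itself is compact: normalise the direct-relation matrix of expert scores, compute the total-relation matrix T = N(I - N)^-1, and read influence off the row and column sums. The sketch below uses a small hypothetical matrix and placeholder indicator names, not the expert data from the case study, and omits the evidential reasoning and 2-tuple aggregation steps.

    import numpy as np

    # Hypothetical averaged expert scores (0-4) for how strongly indicator i
    # influences indicator j; labels are placeholders, not the paper's data.
    labels = ["adverse events", "infection", "errors", "length of stay", "bed occupancy"]
    D = np.array([[0, 3, 3, 2, 1],
                  [2, 0, 2, 3, 2],
                  [3, 2, 0, 2, 1],
                  [1, 1, 1, 0, 3],
                  [1, 1, 1, 2, 0]], dtype=float)

    # DEMATEL: normalise the direct-relation matrix, then total relation T = N(I - N)^-1
    s = max(D.sum(axis=1).max(), D.sum(axis=0).max())
    N = D / s
    T = N @ np.linalg.inv(np.eye(len(D)) - N)

    R, C = T.sum(axis=1), T.sum(axis=0)
    for name, prom, rel in zip(labels, R + C, R - C):
        kind = "cause (influential)" if rel > 0 else "effect"
        print(f"{name:15s} prominence={prom:.2f} relation={rel:+.2f} {kind}")

    Indicators with high prominence (R + C) and positive relation (R - C) are the natural KPI candidates, since improving them is expected to propagate to the rest of the network.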

  10. Identifying Knowledge Gaps in Clinicians Who Evaluate and Treat Vocal Performing Artists in College Health Settings.

    Science.gov (United States)

    McKinnon-Howe, Leah; Dowdall, Jayme

    2018-05-01

    The goal of this study was to identify knowledge gaps in clinicians who evaluate and treat performing artists for illnesses and injuries that affect vocal function in college health settings. This pilot study utilized a web-based cross-sectional survey design incorporating common clinical scenarios to test knowledge of evaluation and management strategies in the vocal performing artist. A web-based survey was administered to a purposive sample of 28 clinicians to identify the approach utilized to evaluate and treat vocal performing artists in college health settings, and factors that might affect knowledge gaps and influence referral patterns to voice specialists. Twenty-eight clinicians were surveyed, with 36% of respondents incorrectly identifying appropriate vocal hygiene measures, 56% of respondents failing to identify symptoms of vocal fold hemorrhage, 84% failing to identify other indications for referral to a voice specialist, 96% of respondents acknowledging unfamiliarity with the Voice Handicap Index and the Singers Voice Handicap Index, and 68% acknowledging unfamiliarity with the Reflux Symptom Index. The data elucidated specific knowledge gaps in college health providers who are responsible for evaluating and treating common illnesses that affect vocal function, and triaging and referring students experiencing symptoms of potential vocal emergencies. Future work is needed to improve the standard of care for this population. Copyright © 2018 The Voice Foundation. Published by Elsevier Inc. All rights reserved.

  11. Students' Performance When Aurally Identifying Musical Harmonic Intervals: Experimentation of a Teaching Innovation Proposal

    Science.gov (United States)

    Ponsatí, Imma; Miranda, Joaquim; Amador, Miquel; Godall, Pere

    2016-01-01

    The aim of the study was to measure the performance reached by students (N = 138) when aurally identifying musical harmonic intervals (from m2 to P8) after having experienced a teaching innovation proposal for the Music Conservatories of Catalonia (Spain) based on observational methodology. Its design took into account several issues, which had…

  12. Key Issues in Empirically Identifying Chronically Low-Performing and Turnaround Schools

    Science.gov (United States)

    Hansen, Michael

    2012-01-01

    One of the US Department of Education's key priorities is turning around the nation's persistently low-achieving schools, yet exactly how to identify low-performing schools is a task left to state policy makers, and a myriad of definitions have been utilized. In addition, exactly how to recognize when a school begins to turn around is not well…

  13. School Correlates of Academic Behaviors and Performance among McKinney-Vento Identified Youth

    Science.gov (United States)

    Stone, Susan; Uretsky, Mathew

    2016-01-01

    We utilized a pooled sample of elementary, middle, and high school-aged children identified as homeless via definitions set forth by McKinney-Vento legislation in a large urban district in California to estimate the extent to which school factors contributed to student attendance, suspensions, test-taking behaviors, and performance on state…

  14. The Use of a Performance Assessment for Identifying Gifted Lebanese Students: Is DISCOVER Effective?

    Science.gov (United States)

    Sarouphim, Ketty M.

    2009-01-01

    The purpose of this study was to investigate the effectiveness of DISCOVER, a performance-based assessment, in identifying gifted Lebanese students. The sample consisted of 248 students (121 boys, 127 girls) from Grades 3-5 at two private schools in Beirut, Lebanon. Students were administered DISCOVER and the Raven Standard Progressive Matrices…

  15. Predicting General Academic Performance and Identifying the Differential Contribution of Participating Variables Using Artificial Neural Networks

    Science.gov (United States)

    Musso, Mariel F.; Kyndt, Eva; Cascallar, Eduardo C.; Dochy, Filip

    2013-01-01

    Many studies have explored the contribution of different factors from diverse theoretical perspectives to the explanation of academic performance. These factors have been identified as having important implications not only for the study of learning processes, but also as tools for improving curriculum designs, tutorial systems, and students'…

  16. Common genetic variants associated with cognitive performance identified using the proxy-phenotype method

    NARCIS (Netherlands)

    C.A. Rietveld (Niels); T. Esko (Tõnu); G. Davies (Gail); T.H. Pers (Tune); P. Turley (Patrick); B. Benyamin (Beben); C.F. Chabris (Christopher F.); V. Emilsson (Valur); A.D. Johnson (Andrew); J.J. Lee (James J.); C. de Leeuw (Christiaan); R.E. Marioni (Riccardo); S.E. Medland (Sarah Elizabeth); M. Miller (Mike); O. Rostapshova (Olga); S.J. van der Lee (Sven); A.A.E. Vinkhuyzen (Anna A.); N. Amin (Najaf); D. Conley (Dalton); J. Derringer; C.M. van Duijn (Cornelia); R.S.N. Fehrmann (Rudolf); L. Franke (Lude); E.L. Glaeser (Edward L.); N.K. Hansell (Narelle); C. Hayward (Caroline); W.G. Iacono (William); C.A. Ibrahim-Verbaas (Carla); V.W.V. Jaddoe (Vincent); J. Karjalainen (Juha); D. Laibson (David); P. Lichtenstein (Paul); D.C. Liewald (David C.); P.K. Magnusson (Patrik); N.G. Martin (Nicholas); M. McGue (Matt); G. Mcmahon (George); N.L. Pedersen (Nancy); S. Pinker (Steven); D.J. Porteous (David J.); D. Posthuma (Danielle); F. Rivadeneira Ramirez (Fernando); B.H. Smith (Blair H.); J.M. Starr (John); H.W. Tiemeier (Henning); N.J. Timpson (Nicholas J.); M. Trzaskowski (Maciej); A.G. Uitterlinden (André); F.C. Verhulst (Frank); M.E. Ward (Mary); M.J. Wright (Margaret); G.D. Smith; I.J. Deary (Ian J.); M. Johannesson (Magnus); R. Plomin (Robert); P.M. Visscher (Peter); D.J. Benjamin (Daniel J.); D. Cesarini (David); P.D. Koellinger (Philipp)

    2014-01-01

    We identify common genetic variants associated with cognitive performance using a two-stage approach, which we call the proxy-phenotype method. First, we conduct a genome-wide association study of educational attainment in a large sample (n = 106,736), which produces a set of 69

  17. Video performance for high security applications

    International Nuclear Information System (INIS)

    Connell, Jack C.; Norman, Bradley C.

    2010-01-01

    The complexity of physical protection systems has increased to address modern threats to national security and emerging commercial technologies. A key element of modern physical protection systems is the data presented to the human operator, used for rapid determination of the cause of an alarm, whether false (e.g., caused by an animal, debris, etc.) or real (e.g., a human adversary). Alarm assessment, the human validation of a sensor alarm, primarily relies on imaging technologies and video systems. Developing measures of effectiveness (MOEs) that drive the design or evaluation of a video system or technology becomes a challenge, given the subjectivity of the application (e.g., alarm assessment). Sandia National Laboratories has conducted empirical analysis using field test data and mathematical models such as the binomial distribution and Johnson target transfer functions to develop MOEs for video system technologies. Depending on the technology, the task of the security operator and the distance to the target, the probability of assessment (PA) can be determined as a function of a variety of conditions or assumptions. PA used as an MOE allows the systems engineer to conduct trade studies, make informed design decisions, or evaluate new higher-risk technologies. This paper outlines general video system design trade-offs, discusses ways video can be used to increase system performance and lists MOEs for video systems used in subjective applications such as alarm assessment.

  18. Experiential knowledge of expert coaches can help identify informational constraints on performance of dynamic interceptive actions.

    Science.gov (United States)

    Greenwood, Daniel; Davids, Keith; Renshaw, Ian

    2014-01-01

    Coordination of dynamic interceptive movements is predicated on cyclical relations between an individual's actions and information sources from the performance environment. To identify dynamic informational constraints, which are interwoven with individual and task constraints, coaches' experiential knowledge provides a complementary source to support empirical understanding of performance in sport. In this study, 15 expert coaches from 3 sports (track and field, gymnastics and cricket) participated in a semi-structured interview process to identify potential informational constraints which they perceived to regulate action during run-up performance. Expert coaches' experiential knowledge revealed multiple information sources which may constrain performance adaptations in such locomotor pointing tasks. In addition to the locomotor pointing target, coaches' knowledge highlighted two other key informational constraints: vertical reference points located near the locomotor pointing target and a check mark located prior to the locomotor pointing target. This study highlights opportunities for broadening the understanding of perception and action coupling processes, and the identified information sources warrant further empirical investigation as potential constraints on athletic performance. Integration of experiential knowledge of expert coaches with theoretically driven empirical knowledge represents a promising avenue to drive future applied science research and pedagogical practice.

  19. A prediction model to identify hospitalised, older adults with reduced physical performance

    DEFF Research Database (Denmark)

    Bruun, Inge H; Maribo, Thomas; Nørgaard, Birgitte

    2017-01-01

    BACKGROUND: Identifying older adults with reduced physical performance at the time of hospital admission can significantly affect patient management and trajectory. For example, such patients could receive targeted hospital interventions such as routine mobilisation. Furthermore, at the time of discharge, health systems could offer these patients additional therapy to maintain or improve health and prevent institutionalisation or readmission. The principal aim of this study was to identify predictors for persisting, reduced physical performance in older adults following acute hospitalisation... admission, falls, physical activity level, self-rated health, use of a walking aid before admission, number of prescribed medications, 30s-CST, and the De Morton Mobility Index. RESULTS: A total of 78 (67%) patients improved in physical performance in the interval between admission and follow-up assessment...

  20. Performance of student software development teams: the influence of personality and identifying as team members

    Science.gov (United States)

    Monaghan, Conal; Bizumic, Boris; Reynolds, Katherine; Smithson, Michael; Johns-Boast, Lynette; van Rooy, Dirk

    2015-01-01

    One prominent approach in the exploration of the variations in project team performance has been to study two components of the aggregate personalities of the team members: conscientiousness and agreeableness. A second line of research, known as self-categorisation theory, argues that identifying as team members and the team's performance norms should substantially influence the team's performance. This paper explores the influence of both these perspectives in university software engineering project teams. Eighty students worked to complete a piece of software in small project teams during 2007 or 2008. To reduce limitations in statistical analysis, Monte Carlo simulation techniques were employed to extrapolate from the results of the original sample to a larger simulated sample (2043 cases, within 319 teams). The results emphasise the importance of taking into account personality (particularly conscientiousness), and both team identification and the team's norm of performance, in order to cultivate higher levels of performance in student software engineering project teams.
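
    One plausible reading of the Monte Carlo extrapolation step is resampling the observed teams with replacement and perturbing them with noise scaled to the observed variability, as sketched below; the team data, the noise scale and the resampling scheme are assumptions for illustration, not the authors' exact procedure.

    import numpy as np

    rng = np.random.default_rng(5)

    # Hypothetical observed teams: (mean conscientiousness, team identification, performance)
    observed = np.array([
        [3.8, 4.1, 72.0],
        [3.2, 3.5, 61.0],
        [4.1, 4.4, 80.0],
        [3.5, 3.0, 58.0],
        [3.9, 3.8, 70.0],
    ])

    # Monte Carlo extrapolation (one plausible reading): resample teams with
    # replacement and add noise drawn from the observed variability, building a
    # larger simulated sample (the paper reports 319 simulated teams) on which
    # correlations are more stable.
    n_sim = 319
    idx = rng.integers(0, len(observed), size=n_sim)
    noise = rng.normal(0.0, observed.std(axis=0) * 0.1, size=(n_sim, 3))
    simulated = observed[idx] + noise

    corr = np.corrcoef(simulated, rowvar=False)
    print("conscientiousness vs performance r =", round(corr[0, 2], 2))
    print("team identification vs performance r =", round(corr[1, 2], 2))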

  1. Performance of the Lot Quality Assurance Sampling Method Compared to Surveillance for Identifying Inadequately-performing Areas in Matlab, Bangladesh

    OpenAIRE

    Bhuiya, Abbas; Hanifi, S.M.A.; Roy, Nikhil; Streatfield, P. Kim

    2007-01-01

    This paper compared the performance of the lot quality assurance sampling (LQAS) method in identifying inadequately-performing health work-areas with that of using health and demographic surveillance system (HDSS) data and examined the feasibility of applying the method by field-level programme supervisors. The study was carried out in Matlab, the field site of ICDDR,B, where a HDSS has been in place for over 30 years. The LQAS method was applied in 57 work-areas of community health workers i...

  2. Assessing the Performance of a Machine Learning Algorithm in Identifying Bubbles in Dust Emission

    Science.gov (United States)

    Xu, Duo; Offner, Stella S. R.

    2017-12-01

    Stellar feedback created by radiation and winds from massive stars plays a significant role in both physical and chemical evolution of molecular clouds. This energy and momentum leaves an identifiable signature (“bubbles”) that affects the dynamics and structure of the cloud. Most bubble searches are performed “by eye,” which is usually time-consuming, subjective, and difficult to calibrate. Automatic classifications based on machine learning make it possible to perform systematic, quantifiable, and repeatable searches for bubbles. We employ a previously developed machine learning algorithm, Brut, and quantitatively evaluate its performance in identifying bubbles using synthetic dust observations. We adopt magnetohydrodynamics simulations, which model stellar winds launching within turbulent molecular clouds, as an input to generate synthetic images. We use a publicly available three-dimensional dust continuum Monte Carlo radiative transfer code, HYPERION, to generate synthetic images of bubbles in three Spitzer bands (4.5, 8, and 24 μm). We designate half of our synthetic bubbles as a training set, which we use to train Brut along with citizen-science data from the Milky Way Project (MWP). We then assess Brut’s accuracy using the remaining synthetic observations. We find that Brut’s performance after retraining increases significantly, and it is able to identify yellow bubbles, which are likely associated with B-type stars. Brut continues to perform well on previously identified high-score bubbles, and over 10% of the MWP bubbles are reclassified as high-confidence bubbles, which were previously marginal or ambiguous detections in the MWP data. We also investigate the influence of the size of the training set, dust model, evolutionary stage, and background noise on bubble identification.

  3. Can 3D ultrasound identify trochlea dysplasia in newborns? Evaluation and applicability of a technique

    Energy Technology Data Exchange (ETDEWEB)

    Kohlhof, Hendrik, E-mail: Hendrik.Kohlhof@ukb.uni-bonn.de [Clinic for Orthopedics and Trauma Surgery, University Hospital Bonn, Sigmund-Freud-Str. 25, 53127 Bonn (Germany)]; Heidt, Christoph, E-mail: Christoph.heidt@kispi.uzh.ch [Department of Orthopedic Surgery, University Children's Hospital Zurich, Steinwiesstrasse 74, 8032 Switzerland (Switzerland)]; Bähler, Alexandrine, E-mail: Alexandrine.baehler@insel.ch [Department of Pediatric Radiology, University Children's Hospital Berne, Freiburgstrasse 18, 3010 Berne (Switzerland)]; Kohl, Sandro, E-mail: sandro.kohl@insel.ch [Department of Orthopedic Surgery, University Hospital Berne, Freiburgstrasse 18, 3010 Berne (Switzerland)]; Gravius, Sascha, E-mail: sascha.gravius@ukb.uni-bonn.de [Clinic for Orthopedics and Trauma Surgery, University Hospital Bonn, Sigmund-Freud-Str. 25, 53127 Bonn (Germany)]; Friedrich, Max J., E-mail: Max.Friedrich@ukb.uni-bonn.de [Clinic for Orthopedics and Trauma Surgery, University Hospital Bonn, Sigmund-Freud-Str. 25, 53127 Bonn (Germany)]; Ziebarth, Kai, E-mail: kai.ziebarth@insel.ch [Department of Orthopedic Surgery, University Hospital Berne, Freiburgstrasse 18, 3010 Berne (Switzerland)]; Stranzinger, Enno, E-mail: Enno.Stranzinger@insel.ch [Department of Pediatric Radiology, University Children's Hospital Berne, Freiburgstrasse 18, 3010 Berne (Switzerland)]

    2015-06-15

    Highlights: • We evaluated a possible screening method for trochlea dysplasia. • 3D ultrasound was used to perform the measurements on standardized axial planes. • The evaluation of the technique showed comparable results to other studies. • This technique may be used as a screening technique as it is quick and easy to perform. - Abstract: Femoro-patellar dysplasia is considered a significant risk factor for patellar instability. Different studies suggest that the shape of the trochlea is already developed in early childhood. Therefore, early identification of a dysplastic configuration might be relevant information for the treating physician. An easily applicable routine screening of the trochlea is not yet available. The purpose of this study was to establish and evaluate a screening method for femoro-patellar dysplasia using 3D ultrasound. From 2012 to 2013 we prospectively imaged 160 consecutive femoro-patellar joints in 80 newborns from the 36th to 61st gestational week that underwent a routine hip sonography (Graf). All ultrasounds were performed by a pediatric radiologist with only minimal additional time beyond the routine hip ultrasound. In 30° flexion of the knee, axial, coronal, and sagittal reformats were used to standardize a reconstructed axial plane through the femoral condyle and the mid-patella. The sulcus angle, the lateral-to-medial facet ratio of the trochlea and the shape of the patella (Wiberg classification) were evaluated. In all examinations reconstruction of the standardized axial plane was achieved; the mean trochlea angle was 149.1° (SD 4.9°), the lateral-to-medial facet ratio of the trochlea was 1.3 (SD 0.22), and a Wiberg type I patella was found in 95% of the newborns. No statistical difference was detected between boys and girls. Using standardized reconstructions of the axial plane allows measurements to be made with lower operator dependency and higher accuracy in a short time. Therefore 3D ultrasound is an easy

  4. Can 3D ultrasound identify trochlea dysplasia in newborns? Evaluation and applicability of a technique

    International Nuclear Information System (INIS)

    Kohlhof, Hendrik; Heidt, Christoph; Bähler, Alexandrine; Kohl, Sandro; Gravius, Sascha; Friedrich, Max J.; Ziebarth, Kai; Stranzinger, Enno

    2015-01-01

    Highlights: • We evaluated a possible screening method for trochlea dysplasia. • 3D ultrasound was used to perform the measurements on standardized axial planes. • The evaluation of the technique showed comparable results to other studies. • This technique may be used as a screening technique as it is quick and easy to perform. - Abstract: Femoro-patellar dysplasia is considered a significant risk factor for patellar instability. Different studies suggest that the shape of the trochlea is already developed in early childhood. Therefore, early identification of a dysplastic configuration might be relevant information for the treating physician. An easily applicable routine screening of the trochlea is not yet available. The purpose of this study was to establish and evaluate a screening method for femoro-patellar dysplasia using 3D ultrasound. From 2012 to 2013 we prospectively imaged 160 consecutive femoro-patellar joints in 80 newborns from the 36th to 61st gestational week that underwent a routine hip sonography (Graf). All ultrasounds were performed by a pediatric radiologist with only minimal additional time beyond the routine hip ultrasound. In 30° flexion of the knee, axial, coronal, and sagittal reformats were used to standardize a reconstructed axial plane through the femoral condyle and the mid-patella. The sulcus angle, the lateral-to-medial facet ratio of the trochlea and the shape of the patella (Wiberg classification) were evaluated. In all examinations reconstruction of the standardized axial plane was achieved; the mean trochlea angle was 149.1° (SD 4.9°), the lateral-to-medial facet ratio of the trochlea was 1.3 (SD 0.22), and a Wiberg type I patella was found in 95% of the newborns. No statistical difference was detected between boys and girls. Using standardized reconstructions of the axial plane allows measurements to be made with lower operator dependency and higher accuracy in a short time. Therefore 3D ultrasound is an easy

  5. Males Perform Better in Identifying Voices During Menstruation Than Females: A Pilot Study.

    Science.gov (United States)

    Wang, Xue; Xu, Xin; Liu, Yangyang

    2016-10-01

    The objective of the present study was to investigate gender differences in the ability to identify females' voices during menstruation. In Study 1, 55 male participants (M age = 19.6 years, SD = 1.0) were asked to listen to vocal samples from women recorded during both ovulation and menstruation and to identify which recordings featured menstruating women. The results showed that the accuracy of men's responses (M = 56.73%, SD = 0.21) was significantly higher than 50%. In Study 2, 118 female students (M age = 19.4 years, SD = 1.6) completed the same task. The results indicated that the accuracy of women's performance was nearly 50%. These preliminary findings suggest that men are better able than women to identify women's voices during menstruation. Future work could consider several significant variables for the purpose of validating the results. © The Author(s) 2016.

  6. Identifying blood biomarkers and physiological processes that distinguish humans with superior performance under psychological stress.

    Directory of Open Access Journals (Sweden)

    Amanda M Cooksey

    2009-12-01

    Attrition of students from aviation training is a serious financial and operational concern for the U.S. Navy. Each late-stage Navy aviator training failure costs the taxpayer over $1,000,000 and ultimately results in decreased operational readiness of the fleet. Currently, potential aviators are selected based on the Aviation Selection Test Battery (ASTB), which is a series of multiple-choice tests that evaluate basic and aviation-related knowledge and ability. However, the ASTB does not evaluate a person's response to stress. This is important because operating sophisticated aircraft demands exceptional performance and causes high psychological stress. Some people are more resistant to this type of stress, and consequently better able to cope with the demands of naval aviation, than others. Although many psychological studies have examined psychological stress resistance, none have taken advantage of the human genome sequence. Here we use high-throughput omics methods and a novel statistical data normalization method to identify plasma proteins associated with human performance under psychological stress. We identified proteins involved in four basic physiological processes: innate immunity, cardiac function, coagulation and plasma lipid physiology. The proteins identified here further elucidate the physiological response to psychological stress and suggest a hypothesis that stress-susceptible pilots may be more prone to shock. This work also provides potential biomarkers for screening humans for capability of superior performance under stress.

  7. Benchmarking road safety performance: Identifying a meaningful reference (best-in-class).

    Science.gov (United States)

    Chen, Faan; Wu, Jiaorong; Chen, Xiaohong; Wang, Jianjun; Wang, Di

    2016-01-01

    For road safety improvement, comparing and benchmarking performance are widely advocated as the emerging and preferred approaches. However, there is currently no universally agreed-upon approach for the process of road safety benchmarking, and performing the practice successfully is by no means easy. This is especially true for the two core activities: (1) developing a set of road safety performance indicators (SPIs) and combining them into a composite index; and (2) identifying a meaningful reference (best-in-class), one which has already achieved outstanding road safety practice. To this end, a scientific technique that can combine the multi-dimensional safety performance indicators (SPIs) into an overall index, and subsequently identify the 'best-in-class', is urgently required. In this paper, the Entropy-embedded RSR (Rank-sum ratio), an innovative, scientific and systematic methodology, is investigated with the aim of conducting the above two core tasks in an integrative and concise procedure, more specifically in a 'one-stop' way. Using a combination of results from other methods (e.g. the SUNflower approach) and other measures (e.g. the Human Development Index) as a relevant reference, a given set of European countries is robustly ranked and grouped into several classes based on the composite Road Safety Index. Within each class the 'best-in-class' is then identified. By benchmarking road safety performance, the results serve to promote best practice, encourage the adoption of successful road safety strategies and measures and, more importantly, inspire the kind of political leadership needed to create a road transport system that maximizes safety. Copyright © 2015 Elsevier Ltd. All rights reserved.
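
    A compact sketch of an entropy-weighted rank-sum ratio (RSR) calculation is shown below; the country-by-indicator matrix is invented for illustration, and the entropy weighting and rank scaling follow the generic formulation rather than the exact variant used in the paper.

    import numpy as np

    # Hypothetical SPI matrix: 6 countries x 4 safety performance indicators,
    # already oriented so that larger values mean better performance.
    countries = ["A", "B", "C", "D", "E", "F"]
    X = np.array([[0.82, 0.70, 0.91, 0.60],
                  [0.75, 0.80, 0.85, 0.72],
                  [0.60, 0.55, 0.70, 0.50],
                  [0.90, 0.85, 0.95, 0.80],
                  [0.70, 0.65, 0.75, 0.68],
                  [0.55, 0.60, 0.65, 0.45]])
    n, m = X.shape

    # Entropy weights: indicators with more dispersion across countries get more weight
    P = X / X.sum(axis=0)
    e = -(P * np.log(P)).sum(axis=0) / np.log(n)
    w = (1 - e) / (1 - e).sum()

    # Rank-sum ratio: weighted average of within-indicator ranks, scaled to (0, 1]
    ranks = X.argsort(axis=0).argsort(axis=0) + 1     # 1 = worst, n = best
    rsr = (ranks * w).sum(axis=1) / n

    for c, score in sorted(zip(countries, rsr), key=lambda t: -t[1]):
        print(f"{c}: RSR = {score:.3f}")   # the top-ranked country is the 'best-in-class'

    Countries can then be grouped into classes by their RSR scores, with the highest-scoring member of each class serving as its benchmark.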

  8. Predictive Performance Tuning of OpenACC Accelerated Applications

    KAUST Repository

    Siddiqui, Shahzeb

    2014-05-04

    Graphics Processing Units (GPUs) are gradually becoming mainstream in supercomputing as their capabilities to significantly accelerate a large spectrum of scientific applications have been clearly identified and proven. Moreover, with the introduction of high level programming models such as OpenACC [1] and OpenMP 4.0 [2], these devices are becoming more accessible and practical to use by a larger scientific community. However, performance optimization of OpenACC accelerated applications usually requires an in-depth knowledge of the hardware and software specifications. We suggest a prediction-based performance tuning mechanism [3] to quickly tune OpenACC parameters for a given application to dynamically adapt to the execution environment on a given system. This approach is applied to a finite difference kernel to tune the OpenACC gang and vector clauses for mapping the compute kernels into the underlying accelerator architecture. Our experiments show a significant performance improvement against the default compiler parameters and a faster tuning by an order of magnitude compared to the brute force search tuning.
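
    The general idea of prediction-based tuning (measure a few (gang, vector) configurations, fit a cheap surrogate model, and predict the best mapping instead of brute-force searching) can be sketched as follows; the runtime function, the search space and the regressor here are placeholder assumptions, not the mechanism described in the abstract.

    import itertools
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    # Hypothetical search space of OpenACC mapping parameters for a stencil kernel
    gangs = [64, 128, 256, 512, 1024]
    vectors = [32, 64, 128, 256]
    space = np.array(list(itertools.product(gangs, vectors)), dtype=float)

    def measure_runtime(gang, vector):
        """Placeholder for actually compiling/running the kernel with these clauses."""
        rng = np.random.default_rng(int(gang * vector))
        return (1e5 / (gang * vector)) + 0.002 * vector + rng.normal(0, 0.01)

    # Measure only a small sample of configurations, then predict the rest
    rng = np.random.default_rng(3)
    sampled = space[rng.choice(len(space), size=6, replace=False)]
    times = [measure_runtime(g, v) for g, v in sampled]

    model = RandomForestRegressor(n_estimators=200, random_state=0).fit(sampled, times)
    best = space[np.argmin(model.predict(space))]
    print("predicted best (gang, vector):", best)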

  9. Mobile Applications Privacy, Towards a methodology to identify over-privileged applications

    OpenAIRE

    NAI FOVINO Igor; NEISSE RICARDO; GENEIATAKIS DIMITRIOS; KOUNELIS IOANNIS

    2013-01-01

    Smart-phones are today used to perform a huge number of online activities. They are used as interfaces to access the cloud, as storage resources, as social network tools, agendas, digital wallets, digital identity repositories, etc. In other words, smart-phones are today the citizen's digital companion and, as such, they are the explicit or implicit repository of a huge amount of personal information. The criticality of these devices is generally due to the following considerations: ...

  10. Fragrance contact allergens in 5588 cosmetic products identified through a novel smartphone application

    DEFF Research Database (Denmark)

    Bennike, N H; Oturai, N B; Müller, S

    2018-01-01

    -on and 100 ppm or above in wash-off cosmetics. OBJECTIVE: To examine exposure, based on ingredient labelling, to the 26 fragrances in a sample of 5588 fragranced cosmetic products. METHODS: The investigated products were identified through a novel, non-profit smartphone application (app), designed to provide...

  11. Application of multi-locus analytical methods to identify interacting loci in case-control studies.

    NARCIS (Netherlands)

    Vermeulen, S.; Heijer, M. den; Sham, P.; Knight, J.

    2007-01-01

    To identify interacting loci in genetic epidemiological studies the application of multi-locus methods of analysis is warranted. Several more advanced classification methods have been developed in the past years, including multiple logistic regression, sum statistics, logic regression, and the

  12. Identifying key performance indicators for nursing and midwifery care using a consensus approach.

    Science.gov (United States)

    McCance, Tanya; Telford, Lorna; Wilson, Julie; Macleod, Olive; Dowd, Audrey

    2012-04-01

    The aim of this study was to gain consensus on key performance indicators that are appropriate and relevant for nursing and midwifery practice in the current policy context. There is continuing demand to demonstrate effectiveness and efficiency in health and social care and to communicate this at boardroom level. Whilst there is substantial literature on the use of clinical indicators and nursing metrics, there is less evidence relating to indicators that reflect the patient experience. A consensus approach was used to identify relevant key performance indicators. A nominal group technique was used comprising two stages: a workshop involving all grades of nursing and midwifery staff in two HSC trusts in Northern Ireland (n = 50); followed by a regional Consensus Conference (n = 80). During the workshop, potential key performance indicators were identified. This was used as the basis for the Consensus Conference, which involved two rounds of consensus. Analysis was based on aggregated scores that were then ranked. Stage one identified 38 potential indicators and stage two prioritised the eight top-ranked indicators as a core set for nursing and midwifery. The relevance and appropriateness of these indicators were confirmed with nurses and midwives working in a range of settings and from the perspective of service users. The eight indicators identified do not conform to the majority of other nursing metrics generally reported in the literature. Furthermore, they are strategically aligned to work on the patient experience and are reflective of the fundamentals of nursing and midwifery practice, with the focus on person-centred care. Nurses and midwives have a significant contribution to make in determining the extent to which these indicators are achieved in practice. Furthermore, measurement of such indicators provides an opportunity to evidence the unique impact of nursing/midwifery care on the patient experience. © 2011 Blackwell Publishing Ltd.

  13. Nuclide identifier and grat data reader application for ORIGEN output file

    International Nuclear Information System (INIS)

    Arif Isnaeni

    2011-01-01

    ORIGEN is a one-group depletion and radioactive decay computer code developed at the Oak Ridge National Laboratory (ORNL). ORIGEN performs one-group neutronics calculations providing various nuclear material characteristics (the buildup, decay and processing of radioactive materials). ORIGEN output is a text-based file that contains only numbers in the form of grouped nuclide data, nuclide identifiers and grat values. This application was created to facilitate the collection of nuclide identifier and grat data; it also has a function to acquire mass number data and calculate the mass (gram) for each nuclide. Output from the application can be used as input data for computer codes for neutronic calculations such as MCNP. (author)
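
    As a rough illustration of this kind of post-processing, the sketch below parses ORIGEN-style output lines, extracts nuclide identifiers and gram values, and derives mass numbers from a ZZAAAM-style identifier layout. The column layout, regular expression and sample lines are assumptions; real ORIGEN output varies by version and options.

```python
import re

def mass_number(nuclide_id):
    """ORIGEN-style identifiers follow a ZZAAAM layout (e.g. 922350 for U-235),
    so the mass number A is the middle three digits."""
    return (nuclide_id // 10) % 1000

def read_gram_table(lines):
    """Extract (identifier, grams) pairs from ORIGEN-like output lines.
    The exact column layout is an assumption; adjust the pattern to your file."""
    pair = re.compile(r"\b(\d{5,7})\s+([0-9.]+[eE][+-]?\d+)")
    records = []
    for line in lines:
        for ident, grams in pair.findall(line):
            records.append((int(ident), float(grams)))
    return records

# Hypothetical excerpt of an ORIGEN output block.
sample = ["  922350  1.234e+02   922380  9.876e+03",
          "  942390  5.670e+01"]

for ident, grams in read_gram_table(sample):
    print(f"nuclide {ident}: A = {mass_number(ident)}, mass = {grams:.3e} g")
```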

  14. Solving Enterprise Applications Performance Puzzles Queuing Models to the Rescue

    CERN Document Server

    Grinshpan, Leonid

    2012-01-01

    A groundbreaking scientific approach to solving enterprise applications performance problems Enterprise applications are the information backbone of today's corporations, supporting vital business functions such as operational management, supply chain maintenance, customer relationship administration, business intelligence, accounting, procurement logistics, and more. Acceptable performance of enterprise applications is critical for a company's day-to-day operations as well as for its profitability. Unfortunately, troubleshooting poorly performing enterprise applications has traditionally

  15. The application of particle swarm optimization to identify gamma spectrum with neural network

    International Nuclear Information System (INIS)

    Shi Dongsheng; Di Yuming; Zhou Chunlin

    2006-01-01

    Aiming at the shortcomings that the BP algorithm is usually trapped in a local optimum and has a low speed of convergence when neural networks are applied to identify gamma spectra, and drawing on the global optimum searching ability of particle swarm optimization, this paper puts forward a new algorithm for neural network training that combines the BP algorithm and particle swarm optimization: the mixed PSO-BP algorithm. In the application to identify gamma spectra, the new algorithm overcomes the shortcoming that the BP algorithm is usually trapped in a local optimum, and the neural network trained by it has a high ability of generalization, with an identification result of one hundred percent correct. A practical example shows that the mixed PSO-BP algorithm can effectively and reliably be used to identify gamma spectra. (authors)
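
    The paper's exact network architecture and PSO settings are not given in the abstract, so the following is only a minimal sketch of the hybrid idea: particle swarm optimization first searches the weight space globally, then plain gradient descent (BP) refines the best particle. The toy data, network size and hyperparameters are hypothetical stand-ins for gamma-spectrum features and isotope labels.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data standing in for gamma-spectrum features and one-hot isotope labels.
X = rng.normal(size=(200, 8))
y = np.eye(3)[rng.integers(0, 3, size=200)]

n_in, n_hid, n_out = 8, 10, 3
n_w = n_in * n_hid + n_hid + n_hid * n_out + n_out   # total weight count

def unpack(w):
    i = 0
    W1 = w[i:i + n_in * n_hid].reshape(n_in, n_hid); i += n_in * n_hid
    b1 = w[i:i + n_hid]; i += n_hid
    W2 = w[i:i + n_hid * n_out].reshape(n_hid, n_out); i += n_hid * n_out
    return W1, b1, W2, w[i:]

def forward(w):
    W1, b1, W2, b2 = unpack(w)
    h = np.tanh(X @ W1 + b1)
    return h, 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))

def loss(w):
    _, out = forward(w)
    return np.mean(np.sum((out - y) ** 2, axis=1))

# Stage 1: PSO explores the weight space globally.
n_particles, iters = 30, 100
pos = rng.normal(scale=0.5, size=(n_particles, n_w))
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), np.array([loss(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()
for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, 1))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos += vel
    vals = np.array([loss(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

# Stage 2: BP (gradient descent) refines the PSO solution.
w, lr = gbest.copy(), 0.1
for _ in range(500):
    W1, b1, W2, b2 = unpack(w)
    h, out = forward(w)
    d_out = 2.0 / len(X) * (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * (1 - h ** 2)
    grad = np.concatenate([(X.T @ d_h).ravel(), d_h.sum(0),
                           (h.T @ d_out).ravel(), d_out.sum(0)])
    w -= lr * grad

print("loss after PSO:", loss(gbest), " after PSO+BP:", loss(w))
```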

  16. Developing and testing an instrument for identifying performance incentives in the Greek health care sector

    Directory of Open Access Journals (Sweden)

    Paleologou Victoria

    2006-09-01

    Full Text Available Abstract Background In the era of cost containment, managers are constantly pursuing increased organizational performance and productivity by aiming at the obvious target, i.e. the workforce. The health care sector, in which production processes are more complicated compared to other industries, is not an exception. In light of recent legislation in Greece in which efficiency improvement and achievement of specific performance targets are identified as undisputable health system goals, the purpose of this study was to develop a reliable and valid instrument for investigating the attitudes of Greek physicians, nurses and administrative personnel towards job-related aspects, and the extent to which these motivate them to improve performance and increase productivity. Methods A methodological exploratory design was employed in three phases: a) content development and assessment, which resulted in a 28-item instrument, b) pilot testing (N = 74) and c) field testing (N = 353). Internal consistency reliability was tested via Cronbach's alpha coefficient and factor analysis was used to identify the underlying constructs. Tests of scaling assumptions, according to the Multitrait-Multimethod Matrix, were used to confirm the hypothesized component structure. Results Four components, referring to intrinsic individual needs and external job-related aspects, were revealed and explain 59.61% of the variability. They were subsequently labeled: job attributes, remuneration, co-workers and achievement. Nine items not meeting item-scale criteria were removed, resulting in a 19-item instrument. Scale reliability ranged from 0.782 to 0.901 and internal item consistency and discriminant validity criteria were satisfied. Conclusion Overall, the instrument appears to be a promising tool for hospital administrations in their attempt to identify job-related factors, which motivate their employees. The psychometric properties were good and warrant administration to a larger sample of employees in the Greek healthcare system.

  17. Developing and testing an instrument for identifying performance incentives in the Greek health care sector.

    Science.gov (United States)

    Paleologou, Victoria; Kontodimopoulos, Nick; Stamouli, Aggeliki; Aletras, Vassilis; Niakas, Dimitris

    2006-09-13

    In the era of cost containment, managers are constantly pursuing increased organizational performance and productivity by aiming at the obvious target, i.e. the workforce. The health care sector, in which production processes are more complicated compared to other industries, is not an exception. In light of recent legislation in Greece in which efficiency improvement and achievement of specific performance targets are identified as undisputable health system goals, the purpose of this study was to develop a reliable and valid instrument for investigating the attitudes of Greek physicians, nurses and administrative personnel towards job-related aspects, and the extent to which these motivate them to improve performance and increase productivity. A methodological exploratory design was employed in three phases: a) content development and assessment, which resulted in a 28-item instrument, b) pilot testing (N = 74) and c) field testing (N = 353). Internal consistency reliability was tested via Cronbach's alpha coefficient and factor analysis was used to identify the underlying constructs. Tests of scaling assumptions, according to the Multitrait-Multimethod Matrix, were used to confirm the hypothesized component structure. Four components, referring to intrinsic individual needs and external job-related aspects, were revealed and explain 59.61% of the variability. They were subsequently labeled: job attributes, remuneration, co-workers and achievement. Nine items not meeting item-scale criteria were removed, resulting in a 19-item instrument. Scale reliability ranged from 0.782 to 0.901 and internal item consistency and discriminant validity criteria were satisfied. Overall, the instrument appears to be a promising tool for hospital administrations in their attempt to identify job-related factors, which motivate their employees. The psychometric properties were good and warrant administration to a larger sample of employees in the Greek healthcare system.

  18. Performance test of a dual-purpose disc agrochemical applicator ...

    African Journals Online (AJOL)

    The performance test of a dual-purpose disc agrochemical applicator for field crops was conducted with a view to assessing the distribution patterns/droplet sizes and the uniformity of spreading and/or spraying for agrochemical application. The equipment's performance for both granular and liquid chemical application was ...

  19. Total System Performance Assessment - License Application Methods and Approach

    International Nuclear Information System (INIS)

    McNeish, J.

    2003-01-01

    ''Total System Performance Assessment-License Application (TSPA-LA) Methods and Approach'' provides the top-level method and approach for conducting the TSPA-LA model development and analyses. The method and approach is responsive to the criteria set forth in Total System Performance Assessment Integration (TSPAI) Key Technical Issues (KTIs) identified in agreements with the U.S. Nuclear Regulatory Commission, the ''Yucca Mountain Review Plan'' (YMRP), ''Final Report'' (NRC 2003 [163274]), and the NRC final rule 10 CFR Part 63 (NRC 2002 [156605]). This introductory section provides an overview of the TSPA-LA, the projected TSPA-LA documentation structure, and the goals of the document. It also provides a brief discussion of the regulatory framework, the approach to risk management of the development and analysis of the model, and the overall organization of the document. The section closes with some important conventions that are used in this document

  20. Genome-wide association study identifies three novel genetic markers associated with elite endurance performance

    DEFF Research Database (Denmark)

    Ahmetov, Ii; Kulemin, Na; Popov, Dv

    2015-01-01

    To investigate the association between multiple single-nucleotide polymorphisms (SNPs), aerobic performance and elite endurance athlete status in Russians. Using a GWAS approach, we examined the association between 1,140,419 SNPs and relative maximal oxygen consumption rate (V̇O2max) in 80 international-level Russian endurance athletes (46 males and 34 females). To validate the obtained results, we further performed case-control studies by comparing the frequencies of the most significant SNPs between endurance athletes and opposite cohorts (192... Russian controls, 1367 European controls, and 230 Russian power athletes). Initially, six 'endurance alleles' were identified showing discrete associations with V̇O2max both in males and females. Next, case-control studies resulted in three remaining SNPs (NFIA-AS2 rs1572312, TSHR rs...

  1. Identifying colon cancer risk modules with better classification performance based on human signaling network.

    Science.gov (United States)

    Qu, Xiaoli; Xie, Ruiqiang; Chen, Lina; Feng, Chenchen; Zhou, Yanyan; Li, Wan; Huang, Hao; Jia, Xu; Lv, Junjie; He, Yuehan; Du, Youwen; Li, Weiguo; Shi, Yuchen; He, Weiming

    2014-10-01

    Identifying differences between normal and tumor samples from a modular perspective may help to improve our understanding of the mechanisms responsible for colon cancer. Many cancer studies have shown that signaling transduction and biological pathways are disturbed in disease states, and expression profiles can distinguish variations in diseases. In this study, we integrated a weighted human signaling network and gene expression profiles to select risk modules associated with tumor conditions. Risk modules used as classification features by our method gave better classification performance than other methods, and one risk module for colon cancer had a good classification performance for distinguishing between normal/tumor samples and between tumor stages. All genes in the module were annotated to the biological process of positive regulation of cell proliferation, and were highly associated with colon cancer. These results suggested that these genes might be potential risk genes for colon cancer. Copyright © 2013. Published by Elsevier Inc.
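
    As a hedged illustration of using a module as a classification feature, the sketch below averages the expression of a hypothetical risk-module gene set into a single module score and compares its cross-validated classification accuracy against a single-gene feature. The data are synthetic and scikit-learn is assumed to be available; this reflects the general approach rather than the authors' pipeline.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)

# Synthetic expression matrix: 60 samples x 500 genes; a hypothetical 12-gene
# "risk module" (the first 12 columns) is mildly up-regulated in tumour samples.
X = rng.normal(size=(60, 500))
y = np.repeat([0, 1], 30)          # 0 = normal, 1 = tumour
X[y == 1, :12] += 1.0

# Collapse the module into one feature (mean expression of its genes) and compare
# its cross-validated accuracy with that of a single gene used alone.
module_score = X[:, :12].mean(axis=1, keepdims=True)
clf = LogisticRegression(max_iter=1000)
acc_module = cross_val_score(clf, module_score, y, cv=5).mean()
acc_single = cross_val_score(clf, X[:, :1], y, cv=5).mean()
print(f"module-score accuracy: {acc_module:.2f}  single-gene accuracy: {acc_single:.2f}")
```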

  2. High Performance Fortran for Aerospace Applications

    National Research Council Canada - National Science Library

    Mehrotra, Piyush

    2000-01-01

    .... HPF is a set of Fortran extensions designed to provide users with a high-level interface for programming data parallel scientific applications while delegating to the compiler/runtime system the task...

  3. Identifying Social Impacts in Product Supply Chains:Overview and Application of the Social Hotspot Database

    Directory of Open Access Journals (Sweden)

    Gregory Norris

    2012-08-01

    Full Text Available One emerging tool to measure the social-related impacts in supply chains is Social Life Cycle Assessment (S-LCA), a derivative of the well-established environmental LCA technique. LCA has recently started to gain popularity among large corporations and initiatives, such as The Sustainability Consortium or the Sustainable Apparel Coalition. Both have made the technique a cornerstone of their applied-research program. The Social Hotspots Database (SHDB) is an overarching, global database that eases the data collection burden in S-LCA studies. Proposed “hotspots” are production activities or unit processes (also defined as country-specific sectors) in the supply chain that may be at risk for social issues to be present. The SHDB enables efficient application of S-LCA by allowing users to prioritize production activities for which site-specific data collection is most desirable. Data for three criteria are used to inform prioritization: (1) labor intensity in worker hours per unit process, (2) risk for, or opportunity to affect, relevant social themes or sub-categories related to Human Rights, Labor Rights and Decent Work, Governance and Access to Community Services, and (3) gravity of a social issue. The Worker Hours Model was developed using a global input/output economic model and wage rate data. Nearly 200 reputable sources of statistical data have been used to develop 20 Social Theme Tables by country and sector. This paper presents an overview of the SHDB development and features, as well as results from a pilot study conducted on strawberry yogurt. This study, one of seven Social Scoping Assessments mandated by The Sustainability Consortium, identifies the potential social hotspots existing in the supply chain of strawberry yogurt. With this knowledge, companies that manufacture or sell yogurt can refine their data collection efforts in order to put their social responsibility performance in perspective and effectively set up programs and
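
    The SHDB itself is built on a global input/output model, which is far beyond a snippet; the sketch below only illustrates the worker-hours idea with hypothetical numbers: labour intensity is approximated as labour payments per dollar of output divided by the average wage, and unit processes are then ranked by an hours-weighted risk score to prioritise site-specific data collection.

```python
# Minimal sketch of the Worker Hours Model idea, under the assumption that labour
# intensity can be approximated from wage payments and average wage rates per
# country-specific sector (all values below are hypothetical).
sectors = {
    # name: (labour payments per $ of output, average wage $/hour, social risk weight 0-1)
    "strawberries, Country A": (0.35, 2.0, 0.8),
    "milk, Country B":         (0.20, 12.0, 0.3),
    "packaging, Country C":    (0.15, 25.0, 0.1),
}

def worker_hours_per_dollar(labour_share, wage):
    return labour_share / wage

# Rank unit processes by hours-weighted risk to prioritise data collection.
hotspots = sorted(
    ((name, worker_hours_per_dollar(ls, w) * risk)
     for name, (ls, w, risk) in sectors.items()),
    key=lambda kv: kv[1], reverse=True)
for name, score in hotspots:
    print(f"{name}: hotspot score {score:.3f}")
```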

  4. Improving Transactional Memory Performance for Irregular Applications

    OpenAIRE

    Pedrero, Manuel; Gutiérrez, Eladio; Romero, Sergio; Plata, Óscar

    2015-01-01

    Transactional memory (TM) offers optimistic concurrency support in modern multicore architectures, helping the programmers to extract parallelism in irregular applications when data dependence information is not available before runtime. In fact, recent research focuses on exploiting thread-level parallelism using TM approaches. However, the proposed techniques are of general use, valid for any type of application. This work presents ReduxSTM, a software TM system specially d...

  5. How to Identify Possible Applications of Product Configuration Systems in Engineer-to-Order Companies

    DEFF Research Database (Denmark)

    Kristjansdottir, Katrin; Shafiee, Sara; Hvam, Lars

    2017-01-01

    Product configuration systems (PCS) play an essential role when providing customised and engineered products efficiently. Literature in the field describes numerous strategies to develop PCS but neglects to identify different application areas. This topic is particularly important for engineer-to-order (ETO) companies that support gradual implementation of PCS due to large product variety and, several times, higher complexity of products and processes. The overall PCS process can thereby be broken down, and the risk minimised. This paper provides a three-step framework to identify different...

  6. Scientific Applications Performance Evaluation on Burst Buffer

    KAUST Repository

    Markomanolis, George S.; Hadri, Bilel; Khurram, Rooh Ul Amin; Feki, Saber

    2017-01-01

    Parallel I/O is an integral component of modern high performance computing, especially in storing and processing very large datasets, such as the case of seismic imaging, CFD, combustion and weather modeling. The storage hierarchy includes nowadays

  7. Simulation-based Assessment to Reliably Identify Key Resident Performance Attributes.

    Science.gov (United States)

    Blum, Richard H; Muret-Wagstaff, Sharon L; Boulet, John R; Cooper, Jeffrey B; Petrusa, Emil R; Baker, Keith H; Davidyuk, Galina; Dearden, Jennifer L; Feinstein, David M; Jones, Stephanie B; Kimball, William R; Mitchell, John D; Nadelberg, Robert L; Wiser, Sarah H; Albrecht, Meredith A; Anastasi, Amanda K; Bose, Ruma R; Chang, Laura Y; Culley, Deborah J; Fisher, Lauren J; Grover, Meera; Klainer, Suzanne B; Kveraga, Rikante; Martel, Jeffrey P; McKenna, Shannon S; Minehart, Rebecca D; Mitchell, John D; Mountjoy, Jeremi R; Pawlowski, John B; Pilon, Robert N; Shook, Douglas C; Silver, David A; Warfield, Carol A; Zaleski, Katherine L

    2018-04-01

    Obtaining reliable and valid information on resident performance is critical to patient safety and training program improvement. The goals were to characterize important anesthesia resident performance gaps that are not typically evaluated, and to further validate scores from a multiscenario simulation-based assessment. Seven high-fidelity scenarios reflecting core anesthesiology skills were administered to 51 first-year residents (CA-1s) and 16 third-year residents (CA-3s) from three residency programs. Twenty trained attending anesthesiologists rated resident performances using a seven-point behaviorally anchored rating scale for five domains: (1) formulate a clear plan, (2) modify the plan under changing conditions, (3) communicate effectively, (4) identify performance improvement opportunities, and (5) recognize limits. A second rater assessed 10% of encounters. Scores and variances for each domain, each scenario, and the total were compared. Low domain ratings (1, 2) were examined in detail. Interrater agreement was 0.76; reliability of the seven-scenario assessment was r = 0.70. CA-3s had a significantly higher average total score (4.9 ± 1.1 vs. 4.6 ± 1.1, P = 0.01, effect size = 0.33). CA-3s significantly outscored CA-1s for five of seven scenarios and domains 1, 2, and 3. CA-1s had a significantly higher proportion of worrisome ratings than CA-3s (chi-square = 24.1, P < 0.01, effect size = 1.50). Ninety-eight percent of residents rated the simulations more educational than an average day in the operating room. Sensitivity of the assessment to CA-1 versus CA-3 performance differences for most scenarios and domains supports validity. No differences, by experience level, were detected for two domains associated with reflective practice. Smaller score variances for CA-3s likely reflect a training effect; however, worrisome performance scores for both CA-1s and CA-3s suggest room for improvement.

  8. Real-Time Application Performance Steering and Adaptive Control

    National Research Council Canada - National Science Library

    Reed, Daniel

    2002-01-01

    .... The objective of the Real-time Application Performance Steering and Adaptive Control project is to replace ad hoc, post-mortem performance optimization with an extensible, portable, and distributed...

  9. Predicting performance at medical school: can we identify at-risk students?

    Directory of Open Access Journals (Sweden)

    Shaban S

    2011-05-01

    Full Text Available Sami Shaban, Michelle McLean, Department of Medical Education, Faculty of Medicine and Health Sciences, United Arab Emirates University, Al Ain, United Arab Emirates. Background: The purpose of this study was to examine the predictive potential of multiple indicators (eg, preadmission scores, unit, module and clerkship grades, course and examination scores) on academic performance at medical school, with a view to identifying students at risk. Methods: An analysis was undertaken of medical student grades in a 6-year medical school program at the Faculty of Medicine and Health Sciences, United Arab Emirates University, Al Ain, United Arab Emirates, over the past 14 years. Results: While high school scores were significantly (P < 0.001) correlated with the final integrated examination, predictability was only 6.8%. Scores for the United Arab Emirates university placement assessment (Common Educational Proficiency Assessment) were only slightly more promising as predictors with 14.9% predictability for the final integrated examination. Each unit or module in the first four years was highly correlated with the next unit or module, with 25%–60% predictability. Course examination scores (end of years 2, 4, and 6) were significantly correlated (P < 0.001) with the average scores in that 2-year period (59.3%, 64.8%, and 55.8% predictability, respectively). Final integrated examination scores were significantly correlated (P < 0.001) with National Board of Medical Examiners scores (35% predictability). Multivariate linear regression identified key grades with the greatest predictability of the final integrated examination score at three stages in the program. Conclusion: This study has demonstrated that it may be possible to identify "at-risk" students relatively early in their studies through continuous data archiving and regular analysis. The data analysis techniques used in this study are not unique to this institution. Keywords: at-risk students, grade
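
    The 'predictability' percentages quoted above read like coefficients of determination, so the sketch below shows how such figures can be computed with an ordinary least-squares fit. The grade variables are synthetic stand-ins, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-ins for student grades: high-school score, placement test,
# two course examinations, and a final integrated examination.
n = 300
high_school = rng.normal(70, 10, n)
placement = 0.3 * high_school + rng.normal(40, 8, n)
course_y2 = 0.5 * placement + rng.normal(30, 6, n)
course_y4 = 0.6 * course_y2 + rng.normal(20, 5, n)
final_exam = 0.5 * course_y4 + 0.2 * course_y2 + rng.normal(15, 5, n)

def r_squared(predictors, target):
    """Share of variance in `target` explained by a linear fit on `predictors`."""
    A = np.column_stack([np.ones(len(target))] + list(predictors))
    coef, *_ = np.linalg.lstsq(A, target, rcond=None)
    resid = target - A @ coef
    return 1 - resid.var() / target.var()

print("predictability of final exam from high-school score: "
      f"{100 * r_squared([high_school], final_exam):.1f}%")
print("predictability from year-2 and year-4 course exams:  "
      f"{100 * r_squared([course_y2, course_y4], final_exam):.1f}%")
```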

  10. Effects of Sound Painting Applications on Performance

    Science.gov (United States)

    Coskuner, Sonat

    2016-01-01

    Today, some of the important dilemmas of music education are that performers are too dependent on the notes in a written musical score and they are not being so able to improvise. Stage phobia, lack of motivation and problematic of perception regarding today's modern music are additional problems facing musicians. This research aims at revealing…

  11. Evaluating cryostat performance for naval applications

    Science.gov (United States)

    Knoll, David; Willen, Dag; Fesmire, James; Johnson, Wesley; Smith, Jonathan; Meneghelli, Barry; Demko, Jonathan; George, Daniel; Fowler, Brian; Huber, Patti

    2012-06-01

    The Navy intends to use High Temperature Superconducting Degaussing (HTSDG) coil systems on future Navy platforms. The Navy Metalworking Center (NMC) is leading a team that is addressing cryostat configuration and manufacturing issues associated with fabricating long lengths of flexible, vacuum-jacketed cryostats that meet Navy shipboard performance requirements. The project includes provisions to evaluate the reliability performance, as well as proofing of fabrication techniques. Navy cryostat performance specifications include less than 1 Wm-1 heat loss, 2 MPa working pressure, and a 25-year vacuum life. Cryostat multilayer insulation (MLI) systems developed on the project have been validated using a standardized cryogenic test facility and implemented on 5-meter-long test samples. Performance data from these test samples, which were characterized using both LN2 boiloff and flow-through measurement techniques, will be presented. NMC is working with an Integrated Project Team consisting of Naval Sea Systems Command, Naval Surface Warfare Center-Carderock Division, Southwire Company, nkt cables, Oak Ridge National Laboratory (ORNL), ASRC Aerospace, and NASA Kennedy Space Center (NASA-KSC) to complete these efforts. Approved for public release; distribution is unlimited. This material is submitted with the understanding that right of reproduction for governmental purposes is reserved for the Office of Naval Research, Arlington, Virginia 22203-1995.

  12. Optimizing Hydronic System Performance in Residential Applications

    Energy Technology Data Exchange (ETDEWEB)

    None

    2013-10-01

    Even though new homes constructed with hydronic heat comprise only 3% of the market (US Census Bureau 2009), of the 115 million existing homes in the United States, almost 14 million of those homes (11%) are heated with steam or hot water systems according to 2009 US Census data. Therefore, improvements in hydronic system performance could result in significant energy savings in the US.

  13. Application of Performance Ratios in Portfolio Optimization

    Directory of Open Access Journals (Sweden)

    Aleš Kresta

    2015-01-01

    Full Text Available The cornerstone of modern portfolio theory was established by the pioneering work of Harry Markowitz. Based on his mean-variance framework, Sharpe formulated his well-known Sharpe ratio aiming to measure the performance of mutual funds. The contemporary development in computers' computational power has allowed more complex performance ratios to be applied, which also take into account higher moments of the return probability distribution. Although these ratios were proposed to help investors improve the results of portfolio optimization, we empirically demonstrate in our paper that this may not necessarily be true. On the historical dataset of DJIA components we empirically showed that both the Sharpe ratio and the MAD ratio outperformed the Rachev ratio. However, for the Rachev ratio we assumed only one level of parameter values. Different set-ups of parameters may provide different results and thus further analysis is certainly required.
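
    The exact Rachev ratio parameterisation used in the paper is not given in the abstract, so the sketch below uses one common formulation (expected tail gain over expected tail loss at 5%/5%) alongside the Sharpe ratio and a MAD-based ratio. The return series is simulated, not DJIA data.

```python
import numpy as np

rng = np.random.default_rng(7)
excess_returns = rng.standard_t(df=5, size=2500) * 0.01   # hypothetical daily excess returns

def sharpe_ratio(r):
    return r.mean() / r.std(ddof=1)

def mad_ratio(r):
    # Mean return over mean absolute deviation; one common "MAD ratio" form.
    return r.mean() / np.abs(r - r.mean()).mean()

def rachev_ratio(r, alpha=0.05, beta=0.05):
    """One common formulation: mean of the best alpha-share of returns divided by
    the mean magnitude of the worst beta-share (expected tail gain / tail loss)."""
    sorted_r = np.sort(r)
    n = len(r)
    tail_gain = sorted_r[-int(np.ceil(alpha * n)):].mean()
    tail_loss = -sorted_r[:int(np.ceil(beta * n))].mean()
    return tail_gain / tail_loss

print(f"Sharpe       : {sharpe_ratio(excess_returns):.3f}")
print(f"MAD ratio    : {mad_ratio(excess_returns):.3f}")
print(f"Rachev(5%,5%): {rachev_ratio(excess_returns):.3f}")
```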

  14. High performance cloud auditing and applications

    CERN Document Server

    Choi, Baek-Young; Song, Sejun

    2014-01-01

    This book mainly focuses on cloud security and high performance computing for cloud auditing. The book discusses emerging challenges and techniques developed for high performance semantic cloud auditing, and presents the state of the art in cloud auditing, computing and security techniques with focus on technical aspects and feasibility of auditing issues in federated cloud computing environments.   In summer 2011, the United States Air Force Research Laboratory (AFRL) CyberBAT Cloud Security and Auditing Team initiated the exploration of the cloud security challenges and future cloud auditing research directions that are covered in this book. This work was supported by the United States government funds from the Air Force Office of Scientific Research (AFOSR), the AFOSR Summer Faculty Fellowship Program (SFFP), the Air Force Research Laboratory (AFRL) Visiting Faculty Research Program (VFRP), the National Science Foundation (NSF) and the National Institute of Health (NIH). All chapters were partially suppor...

  15. Phenotypic Screening Identifies Synergistically Acting Natural Product Enhancing the Performance of Biomaterial Based Wound Healing

    Directory of Open Access Journals (Sweden)

    Srinivasan Sivasubramanian

    2017-07-01

    Full Text Available The potential of a multifunctional wound-healing biomaterial relies on the optimal content of therapeutic constituents as well as the desirable physical, chemical, and biological properties to accelerate the healing process. Formulating biomaterials such as amnion or collagen based scaffolds with natural products offers an affordable strategy to develop dressing material with high efficiency in healing wounds. Using image based phenotyping and quantification, we screened natural product derived bioactive compounds for modulators of types I and III collagen production from human foreskin derived fibroblast cells. The identified hit was then formulated with amnion to develop a biomaterial, and its biophysical properties, in vitro and in vivo effects were characterized. In addition, we performed functional profiling analyses by PCR array to understand the effect of individual components of these materials on various genes such as inflammatory mediators including chemokines and cytokines, growth factors, fibroblast stimulating markers for collagen secretion, matrix metalloproteinases, etc., associated with wound healing. FACS based cell cycle analyses were carried out to evaluate the potential of biomaterials for induction of proliferation of fibroblasts. Western blot analyses were done to examine the effect of biomaterial on collagen synthesis by cells and compared to cells grown in the presence of growth factors. This work demonstrated an uncomplicated way of identifying components that synergistically promote healing. Besides, we demonstrated that modulating the local wound environment using biomaterials with bioactive compounds could enhance healing. This study finds that the developed biomaterials offer immense scope for healing wounds by means of their skin regenerative features such as anti-inflammatory, fibroblast stimulation for collagen secretion as well as inhibition of enzymes and markers impeding the healing, hydrodynamic properties complemented

  16. The comparison of the performance of two screening strategies identifying newly-diagnosed HIV during pregnancy.

    Science.gov (United States)

    Boer, Kees; Smit, Colette; van der Flier, Michiel; de Wolf, Frank

    2011-10-01

    In the Netherlands, a non-selective opt-out instead of a selective opt-in antenatal HIV screening strategy was implemented in 2004. In case of infection, screening was followed by prevention of mother-to-child-transmission (PMTCT). We compared the performance of the two strategies in terms of detection of new cases of HIV and vertical transmission. HIV-infected pregnant women were identified retrospectively from the Dutch HIV cohort ATHENA January 2000 to January 2008. Apart from demographic, virological and immunological data, the date of HIV infection in relation to the index pregnancy was established. Separately, all infants diagnosed with HIV born following implementation of the screening program were identified by a questionnaire via the paediatric HIV centres. 162/481 (33.7%) HIV-positive pregnant women were diagnosed with HIV before 2004 and 172/214 (80.3%) after January 2004. Multivariate analysis showed an 8-fold (95% confidence interval 5.47-11.87) increase in the odds of HIV detection during pregnancy after the national introduction of the opt-out strategy. Still, three children born during a 5-year period after July 2004 were infected due to de novo infection in pregnancy. Implementation of a nation-wide screening strategy based upon non-selective opt-out screening followed by effective PMTCT appeared to detect more HIV-infected women for the first time in pregnancy and to reduce vertical transmission of HIV substantially. Nonetheless, still few children are infected because of maternal infection after the first trimester. We propose the introduction of partner screening on HIV as part of the antenatal screening strategy.

  17. EEG applications for sport and performance.

    Science.gov (United States)

    Thompson, Trevor; Steffert, Tony; Ros, Tomas; Leach, Joseph; Gruzelier, John

    2008-08-01

    One approach to understanding processes that underlie skilled performing has been to study electrical brain activity using electroencephalography (EEG). A notorious problem with EEG is that genuine cerebral data is often contaminated by artifacts of non-cerebral origin. Unfortunately, such artifacts tend to be exacerbated when the subject is in motion, meaning that obtaining reliable data during exercise is inherently problematic. These problems may explain the limited number of studies using EEG as a methodological tool in the sports sciences. This paper discusses how empirical studies have generally tackled the problem of movement artifact by adopting alternative paradigms which avoid recording during actual physical exertion. Moreover, the specific challenges that motion presents to obtaining reliable EEG data are discussed along with practical and computational techniques to confront these challenges. Finally, as EEG recording in sports is often underpinned by a desire to optimise performance, a brief review of EEG-biofeedback and peak performance studies is also presented. A knowledge of practical aspects of EEG recording along with the advent of new technology and increasingly sophisticated processing models offer a promising approach to minimising, if perhaps not entirely circumventing, the problem of obtaining reliable EEG data during motion.

  18. High-Performance Energy Applications and Systems

    Energy Technology Data Exchange (ETDEWEB)

    Miller, Barton [Univ. of Wisconsin, Madison, WI (United States)

    2014-01-01

    The Paradyn project has a history of developing algorithms, techniques, and software that push the cutting edge of tool technology for high-end computing systems. Under this funding, we are working on a three-year agenda to make substantial new advances in support of new and emerging Petascale systems. The overall goal for this work is to address the steady increase in complexity of these petascale systems. Our work covers two key areas: (1) The analysis, instrumentation and control of binary programs. Work in this area falls under the general framework of the Dyninst API tool kits. (2) Infrastructure for building tools and applications at extreme scale. Work in this area falls under the general framework of the MRNet scalability framework. Note that work done under this funding is closely related to work done under a contemporaneous grant, “Foundational Tools for Petascale Computing”, SC0003922/FG02-10ER25940, UW PRJ27NU.

  19. High-performance computing for airborne applications

    International Nuclear Information System (INIS)

    Quinn, Heather M.; Manuzatto, Andrea; Fairbanks, Tom; Dallmann, Nicholas; Desgeorges, Rose

    2010-01-01

    Recently, there have been attempts to move common satellite tasks to unmanned aerial vehicles (UAVs). UAVs are significantly cheaper to buy than satellites and easier to deploy on an as-needed basis. The more benign radiation environment also allows for an aggressive adoption of state-of-the-art commercial computational devices, which increases the amount of data that can be collected. There are a number of commercial computing devices currently available that are well-suited to high-performance computing. These devices range from specialized computational devices, such as field-programmable gate arrays (FPGAs) and digital signal processors (DSPs), to traditional computing platforms, such as microprocessors. Even though the radiation environment is relatively benign, these devices could be susceptible to single-event effects. In this paper, we will present radiation data for high-performance computing devices in an accelerated neutron environment. These devices include a multi-core digital signal processor, two field-programmable gate arrays, and a microprocessor. From these results, we found that all of these devices are suitable for many airplane environments without reliability problems.

  20. Optimizing Hydronic System Performance in Residential Applications

    Energy Technology Data Exchange (ETDEWEB)

    Arena, L.; Faakye, O.

    2013-10-01

    Even though new homes constructed with hydronic heat comprise only 3% of the market (US Census Bureau 2009), of the 115 million existing homes in the United States, almost 14 million of those homes (11%) are heated with steam or hot water systems according to 2009 US Census data. Therefore, improvements in hydronic system performance could result in significant energy savings in the US. When operating properly, the combination of a gas-fired condensing boiler with baseboard convectors and an indirect water heater is a viable option for high-efficiency residential space heating in cold climates. Based on previous research efforts, however, it is apparent that these types of systems are typically not designed and installed to achieve maximum efficiency. Furthermore, guidance on proper design and commissioning for heating contractors and energy consultants is hard to find and is not comprehensive. Through modeling and monitoring, CARB sought to determine the optimal combination(s) of components - pumps, high efficiency heat sources, plumbing configurations and controls - that result in the highest overall efficiency for a hydronic system when baseboard convectors are used as the heat emitter. The impact of variable-speed pumps on energy use and system performance was also investigated along with the effects of various control strategies and the introduction of thermal mass.

  1. 40 CFR 141.723 - Requirements to respond to significant deficiencies identified in sanitary surveys performed by EPA.

    Science.gov (United States)

    2010-07-01

    ... deficiencies identified in sanitary surveys performed by EPA. 141.723 Section 141.723 Protection of Environment... performed by EPA, systems must respond in writing to significant deficiencies identified in sanitary survey... will address significant deficiencies noted in the survey. (d) Systems must correct significant...

  2. Can surveillance systems identify and avert adverse drug events? A prospective evaluation of a commercial application.

    Science.gov (United States)

    Jha, Ashish K; Laguette, Julia; Seger, Andrew; Bates, David W

    2008-01-01

    Computerized monitors can effectively detect and potentially prevent adverse drug events (ADEs). Most monitors have been developed in large academic hospitals and are not readily usable in other settings. We assessed the ability of a commercial program to identify and prevent ADEs in a community hospital. We prospectively evaluated the commercial application in a community-based hospital. We examined the frequency and types of alerts produced, how often they were associated with ADEs and potential ADEs, and the potential financial impact of monitoring for ADEs. Among 2,407 patients screened, the application generated 516 high priority alerts. We were able to review 266 alerts at the time they were generated and among these, 30 (11.3%) were considered important enough to warrant contacting the physician caring for the patient. These 30 alerts were associated with 4 ADEs and 11 potential ADEs. In all 15 cases, the responsible physician was unaware of the event, leading to a change in clinical care in 14 cases. Overall, 23% of high priority alerts were associated with an ADE (95% confidence interval [CI] 12% to 34%) and another 15% were associated with a potential ADE (95% CI 6% to 24%). Active surveillance used approximately 1.5 hours of pharmacist time daily. A commercially available, computer-based ADE detection tool was effective at identifying ADEs. When used as part of an active surveillance program, it can have an impact on preventing or ameliorating ADEs.

  3. and application to autopilot performance analysis

    Directory of Open Access Journals (Sweden)

    Daniel E. Davison

    2000-01-01

    Full Text Available This paper deals with the notion of disturbance model uncertainty. The disturbance is modeled as the output of a first-order filter which is driven by white noise and whose bandwidth and gain are uncertain. An analytical expression for the steady-state output variance as a function of the uncertain bandwidth and gain is derived, and several properties of this variance function are analyzed. Two notions, those of disturbance bandwidth margin and disturbance gain margin are also introduced. These tools are then applied to the analysis of a simple altitude-hold autopilot system in the presence of turbulence where the turbulence scale is treated as an uncertain parameter. It is shown that the autopilot, which is satisfactory for nominal turbulence scale, may be inadequate when the uncertainty is taken into account. Moreover, it is proven that, in order to obtain a design that provides robust performance in the face of turbulence scale uncertainty, it is necessary to substantially increase the controller bandwidth, even if one is willing to sacrifice the autopilot's holding ability and stability robustness.

  4. Diagnostic performance of body mass index to identify excess body fat in children with cerebral palsy.

    Science.gov (United States)

    Duran, Ibrahim; Schulze, Josefa; Martakis, KyriakoS; Stark, Christina; Schoenau, Eckhard

    2018-03-07

    To assess the diagnostic performance of body mass index (BMI) cut-off values according to recommendations of the World Health Organization (WHO), the World Obesity Federation (WOF), and the German Society for Adiposity (DAG) to identify excess body fat in children with cerebral palsy (CP). The present study was a monocentric retrospective analysis of prospectively collected data among children and adolescents with CP participating in a rehabilitation programme. Excess body fat was defined as a body fat percentage above the 85th centile assessed by dual-energy X-ray absorptiometry. In total, 329 children (181 males, 148 females) with CP were eligible for analysis. The mean age was 12 years 4 months (standard deviation 2y 9mo). The BMI cut-off values for 'overweight' according to the WHO, WOF, and DAG showed the following sensitivities and specificities for the prediction of excess body fat in our population: WHO: sensitivity 0.768 (95% confidence interval [CI] 0.636-0.870), specificity 0.894 (95% CI 0.851-0.928); WOF: sensitivity 0.696 (95% CI 0.559-0.812), specificity 0.934 (95% CI 0.898-0.960); DAG: sensitivity 0.411 (95% CI 0.281-0.550), specificity 0.993 (95% CI 0.974-0.999). Body mass index showed high specificity, but low sensitivity in children with CP. Thus, 'normal-weight obese' children with CP were overlooked, when assessing excess body fat only using BMI. Excess body fat in children with cerebral palsy (CP) is less common than previously reported. Body mass index (BMI) had high specificity but low sensitivity in detecting excess body fat in children with CP. BMI evaluation criteria of the German Society for Adiposity could be improved in children with CP. © 2018 Mac Keith Press.
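
    As a small illustration of how such diagnostic figures are derived, the sketch below computes sensitivity and specificity with normal-approximation confidence intervals from a 2x2 table. The counts are hypothetical, not the study's data, and the study itself may have used exact (Clopper-Pearson) intervals.

```python
import math

def sens_spec(tp, fn, tn, fp, z=1.96):
    """Sensitivity and specificity with normal-approximation 95% CIs."""
    def prop_ci(k, n):
        p = k / n
        half = z * math.sqrt(p * (1 - p) / n)
        return p, max(0.0, p - half), min(1.0, p + half)
    return {"sensitivity": prop_ci(tp, tp + fn), "specificity": prop_ci(tn, tn + fp)}

# Hypothetical 2x2 table: BMI cut-off result vs. DXA-defined excess body fat.
result = sens_spec(tp=40, fn=10, tn=180, fp=20)
for name, (p, lo, hi) in result.items():
    print(f"{name}: {p:.3f} (95% CI {lo:.3f}-{hi:.3f})")
```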

  5. Which radiological investigations should be performed to identify fractures in suspected child abuse?

    International Nuclear Information System (INIS)

    Kemp, A.M.; Butler, A.; Morris, S.; Mann, M.; Kemp, K.W.; Rolfe, K.; Sibert, J.R.; Maguire, S.

    2006-01-01

    Aims: To determine which radiological investigations should be performed and which children should be investigated. Materials and methods: An all-language literature search of original articles from 1950 to October 2005. Two reviewers independently reviewed each article; a third review was carried out on disagreement. Each study was assessed using standardised data extraction, critical appraisal and evidence forms. Results: Thirty-four studies were included. Fifteen addressed the question: which investigation has a higher yield, skeletal surveys (SS) or bone scintigraphy (BS)? Studies gave conflicting results. Overall neither investigation is as good as the two combined. BS predominately missed skull, metaphyseal and epiphyseal fractures, whereas SS commonly missed rib fractures. Two studies showed that a repeat SS 2 weeks after the initial study provided significant additional information about tentative findings, the number and age of fractures. A comparative study evaluated additional oblique views of ribs in 73 children and showed improved diagnostic sensitivity, specificity and accuracy. Four studies addressed the diagnostic yield for occult fractures with respect to age. This was significant for children under 2-years old. Conclusions: In children under 2-years old, where physical abuse is suspected, diagnostic imaging of the skeleton should be mandatory. SS or BS alone is inadequate to identify all fractures. It is recommended that all SS should include oblique views of the ribs. This review suggests that the following options would optimize the diagnostic yield. However, each needs to be evaluated prospectively: SS that includes oblique views, SS and BS, a SS with repeat SS or selected images 2 weeks later or a BS plus skull radiography and coned views of metaphyses and epiphyses

  6. Which radiological investigations should be performed to identify fractures in suspected child abuse?

    Energy Technology Data Exchange (ETDEWEB)

    Kemp, A.M.; Butler, A.; Morris, S.; Mann, M.; Kemp, K.W.; Rolfe, K.; Sibert, J.R.; Maguire, S

    2006-09-15

    Aims: To determine which radiological investigations should be performed and which children should be investigated. Materials and methods: An all-language literature search of original articles from 1950 to October 2005. Two reviewers independently reviewed each article; a third review was carried out on disagreement. Each study was assessed using standardised data extraction, critical appraisal and evidence forms. Results: Thirty-four studies were included. Fifteen addressed the question: which investigation has a higher yield, skeletal surveys (SS) or bone scintigraphy (BS)? Studies gave conflicting results. Overall neither investigation is as good as the two combined. BS predominately missed skull, metaphyseal and epiphyseal fractures, whereas SS commonly missed rib fractures. Two studies showed that a repeat SS 2 weeks after the initial study provided significant additional information about tentative findings, the number and age of fractures. A comparative study evaluated additional oblique views of ribs in 73 children and showed improved diagnostic sensitivity, specificity and accuracy. Four studies addressed the diagnostic yield for occult fractures with respect to age. This was significant for children under 2-years old. Conclusions: In children under 2-years old, where physical abuse is suspected, diagnostic imaging of the skeleton should be mandatory. SS or BS alone is inadequate to identify all fractures. It is recommended that all SS should include oblique views of the ribs. This review suggests that the following options would optimize the diagnostic yield. However, each needs to be evaluated prospectively: SS that includes oblique views, SS and BS, a SS with repeat SS or selected images 2 weeks later or a BS plus skull radiography and coned views of metaphyses and epiphyses.

  7. Total System Performance Assessment - License Application Methods and Approach

    Energy Technology Data Exchange (ETDEWEB)

    J. McNeish

    2003-12-08

    ''Total System Performance Assessment-License Application (TSPA-LA) Methods and Approach'' provides the top-level method and approach for conducting the TSPA-LA model development and analyses. The method and approach is responsive to the criteria set forth in Total System Performance Assessment Integration (TSPAI) Key Technical Issues (KTIs) identified in agreements with the U.S. Nuclear Regulatory Commission, the ''Yucca Mountain Review Plan'' (YMRP), ''Final Report'' (NRC 2003 [163274]), and the NRC final rule 10 CFR Part 63 (NRC 2002 [156605]). This introductory section provides an overview of the TSPA-LA, the projected TSPA-LA documentation structure, and the goals of the document. It also provides a brief discussion of the regulatory framework, the approach to risk management of the development and analysis of the model, and the overall organization of the document. The section closes with some important conventions that are used in this document.

  8. Expert system applications to nuclear power for enhancement of productivity and performance

    International Nuclear Information System (INIS)

    Naser, J.A.; Cain, D.G.; Sun, B.K.H.; Colley, R.W.; Hirota, N.S.; Gelhaus, F.E.

    1989-01-01

    Expert system technology has matured enough to offer a great deal of promise for a number of application areas in the electric utility industry. These applications can enhance productivity and aid in decision-making. Two parallel efforts are being performed at the Electric Power Research Institute (EPRI) to help the electric utility industry take advantage of expert system technology. The first effort is the development of expert system building tools which are tailored to electric utility industry applications. The second effort is the development of expert system applications. These two efforts complement each other. The application development tests the tools and identifies additional tool capabilities which are required. The tool development helps define the applications which can be successfully developed. The purpose of this paper is to describe some of the tool and application development work which is being performed at EPRI for the electric utility industry. (orig.)

  9. Development and application of a methodology for identifying and characterising scenarios

    International Nuclear Information System (INIS)

    Billington, D.; Bailey, L.

    1998-01-01

    interval along each timeline. This report presents illustrative examples of the application of the above methodology to achieve this aim. The results of risk calculations and assigned weights are plotted on a 'weight-risk diagram', which is used to judge the relative significance of the different variant scenarios in relation to the base scenario and the regulatory risk target. The application of this methodology is consistent with a staged approach to performance assessment, in which effort is focused initially on scoping calculations of conditional risk. Only those variant scenarios giving a higher conditional risk than the base scenario are subject to more detailed evaluation, including the assignment of an appropriate weight. From the limited trialling that has been undertaken, the indications are that a tractable approach, consistent with the objectives of comprehensiveness, traceability and clarity, has been achieved. (author)

  10. Application of DNA forensic techniques for identifying poached guanacos (Lama guanicoe) in Chilean Patagonia*.

    Science.gov (United States)

    Marín, Juan C; Saucedo, Cristian E; Corti, Paulo; González, Benito A

    2009-09-01

    Guanaco (Lama guanicoe) is a protected and widely distributed ungulate in South America. A poacher, after killing guanacos in Valle Chacabuco, Chilean Patagonia, transported and stored the meat. Samples were retrieved by local police but the suspect argued that the meat was from a horse. The mitochondrial cytochrome b gene (774 bp), 15 microsatellite loci, and the SRY gene were used to identify the species, the number of animals and their population origin, and the sex of the animals, respectively. Analysis revealed that the samples came from a female (absence of the SRY gene) Patagonian guanaco (assignment probability between 0.0075 and 0.0282), clearly distinguishing it from sympatric ungulates (E-value = 0). Based on the evidence obtained in the field in addition to forensic data, the suspect was convicted of poaching and illegally carrying firearms. This is the first report of molecular tools being used in forensic investigations of Chilean wildlife, indicating their promising future application in guanaco management and conservation.

  11. Time distortion associated with smartphone addiction: Identifying smartphone addiction via a mobile application (App).

    Science.gov (United States)

    Lin, Yu-Hsuan; Lin, Yu-Cheng; Lee, Yang-Han; Lin, Po-Hsien; Lin, Sheng-Hsuan; Chang, Li-Ren; Tseng, Hsien-Wei; Yen, Liang-Yu; Yang, Cheryl C H; Kuo, Terry B J

    2015-06-01

    Global smartphone penetration has brought about unprecedented addictive behaviors. We report proposed diagnostic criteria and the design of a mobile application (App) to identify smartphone addiction. We used a novel empirical mode decomposition (EMD) to delineate the trend in smartphone use over one month. The daily use count and the trend of this frequency are associated with smartphone addiction. We quantify excessive use by daily use duration and frequency, as well as the relationship between the tolerance symptoms and the trend for the median duration of a use epoch. The psychiatrist-assisted self-reported use time was significantly lower than the recorded total smartphone use time via the App, and the degree of underestimation was positively correlated with actual smartphone use. Our study suggests the identification of smartphone addiction by diagnostic interview and via the App-generated parameters with EMD analysis. Copyright © 2015 Elsevier Ltd. All rights reserved.
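
    The App's exact processing pipeline is not described in the abstract, so the sketch below only shows the general idea of extracting a monthly trend from daily use counts with empirical mode decomposition. It assumes the third-party PyEMD package (distributed as "EMD-signal") and a simulated series rather than real App data.

```python
import numpy as np
from PyEMD import EMD   # third-party "EMD-signal" package; an assumption, not from the paper

rng = np.random.default_rng(5)

# Hypothetical one-month series of daily smartphone use counts with an upward drift.
days = np.arange(30.0)
use_count = 40 + 0.8 * days + 8 * np.sin(days / 3) + rng.normal(0, 4, size=30)

emd = EMD()
emd.emd(use_count)
imfs, residue = emd.get_imfs_and_residue()   # fast oscillations vs. slow residue/trend

# A rising residue over the month is the kind of escalating-use pattern such an
# App could flag; the IMFs capture day-to-day variability around that trend.
print("number of IMFs:", len(imfs))
print("trend (residue) first -> last day:", round(float(residue[0]), 1),
      "->", round(float(residue[-1]), 1))
```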

  12. Business intelligence and performance management theory, systems and industrial applications

    CERN Document Server

    2013-01-01

    This book covers all the basic concepts of business intelligence and performance management including strategic support, business applications, methodologies and technologies from the field, and thoroughly explores the benefits, issues and challenges of each.

  13. Performance-oriented packaging: A guide to identifying and designing. Identifying and designing hazardous materials packaging for compliance with post HM-181 DOT Regulations

    International Nuclear Information System (INIS)

    1994-08-01

    With the initial publication of Docket HM-181 (hereafter referred to as HM-181), the U.S. Department of Energy (DOE), Headquarters, Transportation Management Division decided to produce guidance to help the DOE community transition to performance-oriented packagings (POP). As only a few individuals were familiar with the new requirements, elementary guidance was desirable. The decision was to prepare the guidance at a level easily understood by a novice to regulatory requirements. This document identifies design development strategies for use in obtaining performance-oriented packagings that are not readily available commercially. These design development strategies will be part of the methodologies for compliance with post HM-181 U.S. Department of Transportation (DOT) packaging regulations. This information was prepared for use by the DOE and its contractors. The document provides guidance for making decisions associated with designing performance-oriented packaging, and not for identifying specific material or fabrication design details. It does provide some specific design considerations. Having a copy of the regulations handy when reading this document is recommended to permit a fuller understanding of the requirements impacting the design effort. While this document is not written for the packaging specialist, it does contain guidance important to those not familiar with the new POP requirements

  14. Cache Performance Optimization for SoC Video Applications

    OpenAIRE

    Lei Li; Wei Zhang; HuiYao An; Xing Zhang; HuaiQi Zhu

    2014-01-01

    Chip Multiprocessors (CMPs) are adopted by industry to deal with the speed limit of the single processor. But memory access has become the bottleneck of performance, especially in multimedia applications. In this paper, a set of management policies is proposed to improve the cache performance for a SoC platform of video application. By analyzing the behavior of the Video Engine, memory-friendly writeback and efficient prefetch policies are adopted. The experiment platform is simulated by ...

  15. Optical Thermal Characterization Enables High-Performance Electronics Applications

    Energy Technology Data Exchange (ETDEWEB)

    2016-02-01

    NREL developed a modeling and experimental strategy to characterize thermal performance of materials. The technique provides critical data on thermal properties with relevance for electronics packaging applications. Thermal contact resistance and bulk thermal conductivity were characterized for new high-performance materials such as thermoplastics, boron-nitride nanosheets, copper nanowires, and atomically bonded layers. The technique is an important tool for developing designs and materials that enable power electronics packaging with small footprint, high power density, and low cost for numerous applications.

  16. Novel Application of Statistical Methods to Identify New Urinary Incontinence Risk Factors

    Directory of Open Access Journals (Sweden)

    Theophilus O. Ogunyemi

    2012-01-01

    Full Text Available Longitudinal data for studying urinary incontinence (UI) risk factors are rare. Data from one study, the hallmark Medical, Epidemiological, and Social Aspects of Aging (MESA), have been analyzed in the past; however, repeated measures analyses that are crucial for analyzing longitudinal data have not been applied. We tested a novel application of statistical methods to identify UI risk factors in older women. MESA data were collected at baseline and yearly from a sample of 1955 men and women in the community. Only women responding to the 762 baseline and 559 follow-up questions at one year in each respective survey were examined. To test their utility in mining large data sets, and as a preliminary step to creating a predictive index for developing UI, logistic regression, generalized estimating equations (GEEs), and proportional hazard regression (PHREG) methods were used on the existing MESA data. The GEE and PHREG combination identified 15 significant risk factors associated with developing UI, of which six, namely, urinary frequency, urgency, any urine loss, urine loss after emptying, subject’s anticipation, and doctor’s proactivity, were found highly significant by both methods. These six factors are potential candidates for constructing a future UI predictive index.
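
    The pairing of a repeated-measures logistic model with a time-to-event model can be sketched in a few lines. The example below is a minimal, hypothetical illustration using statsmodels' GEE and the lifelines Cox proportional-hazards fitter on synthetic data; the variable names do not correspond to MESA survey items.

```python
# Minimal sketch of screening longitudinal risk factors with GEE + Cox PH.
# The data are synthetic stand-ins; column names do not correspond to MESA items.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n_subjects, n_visits = 300, 4
df = pd.DataFrame({
    "subject_id": np.repeat(np.arange(n_subjects), n_visits),
    "age": np.repeat(rng.integers(65, 90, n_subjects), n_visits),
    "urinary_frequency": rng.integers(0, 10, n_subjects * n_visits),
})
logit = -4 + 0.03 * df["age"] + 0.3 * df["urinary_frequency"]
df["ui"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

# Repeated-measures logistic model: GEE with an exchangeable working correlation
gee = sm.GEE.from_formula(
    "ui ~ age + urinary_frequency",
    groups="subject_id",
    data=df,
    family=sm.families.Binomial(),
    cov_struct=sm.cov_struct.Exchangeable(),
)
print(gee.fit().summary())

# Time-to-event model: Cox proportional hazards on one row per subject
subjects = df.groupby("subject_id").agg(
    age=("age", "first"),
    urinary_frequency=("urinary_frequency", "mean"),
    developed_ui=("ui", "max"),
).reset_index()
subjects["years_followed"] = rng.uniform(1, 10, n_subjects)

cph = CoxPHFitter()
cph.fit(
    subjects[["years_followed", "developed_ui", "age", "urinary_frequency"]],
    duration_col="years_followed",
    event_col="developed_ui",
)
cph.print_summary()
# Risk factors significant (e.g., p < 0.05) in both models are kept as candidates.
```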

  17. The application of digital image plane holography technology to identify Chinese herbal medicine

    Science.gov (United States)

    Wang, Huaying; Guo, Zhongjia; Liao, Wei; Zhang, Zhihui

    2012-03-01

    In this paper, the imaging technology of digital image plane holography for identifying Chinese herbal medicine is studied. An optical experimental system for digital image plane holography, which is a special case of pre-magnification digital holography, was built. In the recording system, the object is illuminated with a plane wave, and the hologram is recorded using a spherical wave as the reference light. A microscope objective is placed behind the object. The second phase factor, which is introduced by the microscope objective, can be eliminated by choosing the proper position of the reference point source when digital image plane holography is recorded with spherical light. In this experiment, Lygodium cells and Onion cells were used as the objects. The experimental results with Lygodium cells and Onion cells show that digital image plane holography avoids the process of finding the recording distance by an auto-focusing approach, and the phase information of the object can be reconstructed more accurately. Digital image plane holography is therefore applied more effectively to the microscopic imaging of cells and is well suited to the identification of Chinese herbal medicine, promoting the application of digital holography in practice.

  18. Wireless ad hoc and sensor networks management, performance, and applications

    CERN Document Server

    He, Jing

    2013-01-01

    Although wireless sensor networks (WSNs) have been employed across a wide range of applications, there are very few books that emphasize the algorithm description, performance analysis, and applications of network management techniques in WSNs. Filling this need, Wireless Ad Hoc and Sensor Networks: Management, Performance, and Applications summarizes not only traditional and classical network management techniques, but also state-of-the-art techniques in this area. The articles presented are expository, but scholarly in nature, including the appropriate history background, a review of current

  19. Applications of Earth Remote Sensing for Identifying Tornado and Severe Weather Damage

    Science.gov (United States)

    Schultz, Lori; Molthan, Andrew; Burks, Jason E.; Bell, Jordan; McGrath, Kevin; Cole, Tony

    2016-01-01

    NASA SPoRT (Short-term Prediction Research and Transition Center) provided MODIS (Moderate Resolution Imaging Spectrometer) and ASTER (Advanced Spaceborne Thermal Emission and Reflection Radiometer) imagery to WFOs (Weather Forecast Offices) in Alabama to support April 27th, 2011 damage assessments across the state. SPoRT was awarded a NASA Applied Science: Disasters Feasibility award to investigate the applicability of including remote sensing imagery and derived products in the NOAA/NWS (National Oceanic and Atmospheric Administration/National Weather Service) Damage Assessment Toolkit (DAT). The proposal team was awarded the 3-year project to implement a web mapping service and associated data feeds from the USGS (U.S. Geological Survey) to provide satellite imagery and derived products directly to the NWS through the DAT. In the United States, NOAA/NWS is charged with performing damage assessments when storm or tornado damage is suspected after a severe weather event. This has led to the development of the Damage Assessment Toolkit (DAT), an application for smartphones, tablets and web browsers that allows for the collection, geo-location, and aggregation of various damage indicators collected during storm surveys.

  20. High Performance Computing Software Applications for Space Situational Awareness

    Science.gov (United States)

    Giuliano, C.; Schumacher, P.; Matson, C.; Chun, F.; Duncan, B.; Borelli, K.; Desonia, R.; Gusciora, G.; Roe, K.

    The High Performance Computing Software Applications Institute for Space Situational Awareness (HSAI-SSA) has completed its first full year of applications development. The emphasis of our work in this first year was on improving space surveillance sensor models and image enhancement software. These applications are the Space Surveillance Network Analysis Model (SSNAM), the Air Force Space Fence simulation (SimFence), and the physically constrained iterative de-convolution (PCID) image enhancement software tool. Specifically, we have demonstrated order-of-magnitude speed-ups in those codes running on the latest Cray XD-1 Linux supercomputer (Hoku) at the Maui High Performance Computing Center. The software application improvements that HSAI-SSA has made have had a significant impact on the warfighter and have fundamentally changed the role of high performance computing in SSA.

  1. Application of FEPs analysis to identify research priorities relevant to the safety case for an Australian radioactive waste facility

    International Nuclear Information System (INIS)

    Payne, T.E.; McGlinn, P.J.

    2007-01-01

    The Australian Nuclear Science and Technology Organisation (ANSTO) has established a project to undertake research relevant to the safety case for the proposed Australian radioactive waste facility. This facility will comprise a store for intermediate level radioactive waste, and either a store or a near-surface repository for low-level waste. In order to identify the research priorities for this project, a structured analysis of the features, events and processes (FEPs) relevant to the performance of the facility was undertaken. This analysis was based on the list of 137 FEPs developed by the IAEA project on 'Safety Assessment Methodologies for Near Surface Disposal Facilities' (ISAM). A number of key research issues were identified, and some factors which differ in significance for the store, compared to the repository concept, were highlighted. For example, FEPs related to long-term groundwater transport of radionuclides are considered to be of less significance for a store than a repository. On the other hand, structural damage from severe weather, accident or human interference is more likely for a store. The FEPs analysis has enabled the scientific research skills required for the inter-disciplinary project team to be specified. The outcomes of the research will eventually be utilised in developing the design, and assessing the performance, of the future facility. It is anticipated that a more detailed application of the FEPs methodology will be undertaken to develop the safety case for the proposed radioactive waste management facility. (authors)

  2. High-performance silicon photonics technology for telecommunications applications.

    Science.gov (United States)

    Yamada, Koji; Tsuchizawa, Tai; Nishi, Hidetaka; Kou, Rai; Hiraki, Tatsurou; Takeda, Kotaro; Fukuda, Hiroshi; Ishikawa, Yasuhiko; Wada, Kazumi; Yamamoto, Tsuyoshi

    2014-04-01

    By way of a brief review of Si photonics technology, we show that significant improvements in device performance are necessary for practical telecommunications applications. In order to improve device performance in Si photonics, we have developed a Si-Ge-silica monolithic integration platform, on which compact Si-Ge-based modulators/detectors and silica-based high-performance wavelength filters are monolithically integrated. The platform features low-temperature silica film deposition, which cannot damage Si-Ge-based active devices. Using this platform, we have developed various integrated photonic devices for broadband telecommunications applications.

  3. High-performance silicon photonics technology for telecommunications applications

    International Nuclear Information System (INIS)

    Yamada, Koji; Tsuchizawa, Tai; Nishi, Hidetaka; Kou, Rai; Hiraki, Tatsurou; Takeda, Kotaro; Fukuda, Hiroshi; Yamamoto, Tsuyoshi; Ishikawa, Yasuhiko; Wada, Kazumi

    2014-01-01

    By way of a brief review of Si photonics technology, we show that significant improvements in device performance are necessary for practical telecommunications applications. In order to improve device performance in Si photonics, we have developed a Si-Ge-silica monolithic integration platform, on which compact Si-Ge–based modulators/detectors and silica-based high-performance wavelength filters are monolithically integrated. The platform features low-temperature silica film deposition, which cannot damage Si-Ge–based active devices. Using this platform, we have developed various integrated photonic devices for broadband telecommunications applications. (review)

  5. How Resource Challenges Can Improve Firm Innovation Performance: Identifying Coping Strategies

    NARCIS (Netherlands)

    Grinstein, A.; Rosenzweig, S.

    2016-01-01

    Researchers recently suggested that challenges in the form of adversities and constraints can actually promote individuals, teams and firms. However, it remains unclear how such challenges elicit positive innovation performance. Moreover, we still cannot distinguish between the conditions under

  6. Identifying and quantifying heterogeneity in high content analysis: application of heterogeneity indices to drug discovery.

    Directory of Open Access Journals (Sweden)

    Albert H Gough

    Full Text Available One of the greatest challenges in biomedical research, drug discovery and diagnostics is understanding how seemingly identical cells can respond differently to perturbagens including drugs for disease treatment. Although heterogeneity has become an accepted characteristic of a population of cells, in drug discovery it is not routinely evaluated or reported. The standard practice for cell-based, high content assays has been to assume a normal distribution and to report a well-to-well average value with a standard deviation. To address this important issue we sought to define a method that could be readily implemented to identify, quantify and characterize heterogeneity in cellular and small organism assays to guide decisions during drug discovery and experimental cell/tissue profiling. Our study revealed that heterogeneity can be effectively identified and quantified with three indices that indicate diversity, non-normality and percent outliers. The indices were evaluated using the induction and inhibition of STAT3 activation in five cell lines where the systems response including sample preparation and instrument performance were well characterized and controlled. These heterogeneity indices provide a standardized method that can easily be integrated into small and large scale screening or profiling projects to guide interpretation of the biology, as well as the development of therapeutics and diagnostics. Understanding the heterogeneity in the response to perturbagens will become a critical factor in designing strategies for the development of therapeutics including targeted polypharmacology.
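
    As a rough illustration of what such indices can look like in practice, the sketch below computes three per-well summaries from single-cell measurements: a diversity score, a non-normality score and a percent-outliers score. The specific operationalizations used here (normalized Shannon entropy, a Kolmogorov-Smirnov statistic and 1.5 * IQR fences) are assumptions for the example, not the exact formulas used in the study.

```python
# Illustrative per-well heterogeneity indices: diversity, non-normality, % outliers.
import numpy as np
from scipy import stats

def heterogeneity_indices(cell_values, n_bins=20):
    x = np.asarray(cell_values, dtype=float)

    # Diversity: normalized Shannon entropy of the binned per-cell distribution
    counts, _ = np.histogram(x, bins=n_bins)
    p = counts / counts.sum()
    p = p[p > 0]
    diversity = -np.sum(p * np.log(p)) / np.log(n_bins)

    # Non-normality: Kolmogorov-Smirnov statistic against a fitted normal
    z = (x - x.mean()) / x.std(ddof=1)
    non_normality = stats.kstest(z, "norm").statistic

    # Percent outliers: fraction of cells outside 1.5 * IQR fences
    q1, q3 = np.percentile(x, [25, 75])
    iqr = q3 - q1
    outliers = (x < q1 - 1.5 * iqr) | (x > q3 + 1.5 * iqr)
    pct_outliers = 100.0 * outliers.mean()

    return diversity, non_normality, pct_outliers

# Example: a bimodal well scores high on diversity and non-normality
well = np.concatenate([np.random.normal(1, 0.2, 800), np.random.normal(3, 0.2, 200)])
print(heterogeneity_indices(well))
```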

  7. Identifying Gaps in the Performance of Pediatric Trainees Who Receive Marginal/Unsatisfactory Ratings.

    Science.gov (United States)

    Li, Su-Ting T; Tancredi, Daniel J; Schwartz, Alan; Guillot, Ann; Burke, Ann; Trimm, R Franklin; Guralnick, Susan; Mahan, John D; Gifford, Kimberly A

    2018-01-01

    To perform a derivation study to determine in which subcompetencies marginal/unsatisfactory pediatric residents had the greatest deficits compared with their satisfactorily performing peers and which subcompetencies best discriminated between marginal/unsatisfactory and satisfactorily performing residents. Multi-institutional cohort study of all 21 milestones (rated on four or five levels) reported to the Accreditation Council for Graduate Medical Education, and global marginal/unsatisfactory versus satisfactory performance reported to the American Board of Pediatrics. Data were gathered in 2013-2014. For each level of training (postgraduate year [PGY] 1, 2, and 3), mean differences between milestone levels of residents with marginal/unsatisfactory and satisfactory performance adjusted for clustering by program and C-statistics (area under receiver operating characteristic curve) were calculated. A Bonferroni-corrected significance threshold of .0007963 was used to account for multiple comparisons. Milestone and overall performance evaluations for 1,704 pediatric residents in 41 programs were obtained. For PGY1s, two subcompetencies had almost a one-point difference in milestone levels between marginal/unsatisfactory and satisfactory trainees and outstanding discrimination (≥ 0.90): organize/prioritize (0.93; C-statistic: 0.91) and transfer of care (0.97; C-statistic: 0.90). The largest difference between marginal/unsatisfactory and satisfactory PGY2s was trustworthiness (0.78). The largest differences between marginal/unsatisfactory and satisfactory PGY3s were ethical behavior (1.17), incorporating feedback (1.03), and professionalization (0.96). For PGY2s and PGY3s, no subcompetencies had outstanding discrimination. Marginal/unsatisfactory pediatric residents had different subcompetency gaps at different training levels. While PGY1s may have global deficits, senior residents may have different performance deficiencies requiring individualized counseling and

  8. Identifying the performance characteristics of a winning outcome in elite mixed martial arts competition.

    Science.gov (United States)

    James, Lachlan P; Robertson, Sam; Haff, G Gregory; Beckman, Emma M; Kelly, Vincent G

    2017-03-01

    To determine those performance indicators that have the greatest influence on classifying outcome at the elite level of mixed martial arts (MMA). A secondary objective was to establish the efficacy of decision tree analysis in explaining the characteristics of victory when compared to alternate statistical methods. Cross-sectional observational. Eleven raw performance indicators from male Ultimate Fighting Championship bouts (n=234) from July 2014 to December 2014 were screened for analysis. Each raw performance indicator was also converted to a rate-dependent measure to be scaled to fight duration. Further, three additional performance indicators were calculated from the dataset and included in the analysis. Cohen's d effect sizes were employed to determine the magnitude of the differences between Wins and Losses, while decision tree (chi-square automatic interaction detector (CHAID)) and discriminant function analyses (DFA) were used to classify outcome (Win and Loss). Effect size comparisons revealed differences between Wins and Losses across a number of performance indicators. Decision tree (raw: 71.8%; rate-scaled: 76.3%) and DFA (raw: 71.4%; rate-scaled 71.2%) achieved similar classification accuracies. Grappling and accuracy performance indicators were the most influential in explaining outcome. The decision tree models also revealed multiple combinations of performance indicators leading to victory. The decision tree analyses suggest that grappling activity and technique accuracy are of particular importance in achieving victory in elite-level MMA competition. The DFA results supported the importance of these performance indicators. Decision tree induction represents an intuitive and slightly more accurate approach to explaining bout outcome in this sport when compared to DFA. Copyright © 2016 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.
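
    A compact way to reproduce this kind of comparison is to cross-validate a decision tree and a linear discriminant model on the same bout-level indicators. In the sketch below, scikit-learn's CART tree stands in for CHAID and LinearDiscriminantAnalysis for the DFA; the data and feature names are synthetic placeholders, so the accuracies will not match those reported.

```python
# Illustrative comparison of a decision tree and discriminant analysis for
# classifying bout outcome (Win/Loss) from performance indicators.
import numpy as np
import pandas as pd
from sklearn.tree import DecisionTreeClassifier
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_bouts = 234
bouts = pd.DataFrame({
    "takedowns_per_min": rng.exponential(0.3, n_bouts),
    "strike_accuracy": rng.uniform(0.2, 0.7, n_bouts),
    "submission_attempts_per_min": rng.exponential(0.1, n_bouts),
})
# Synthetic outcome loosely driven by grappling and accuracy, as in the abstract
score = 2.0 * bouts["takedowns_per_min"] + 3.0 * bouts["strike_accuracy"]
outcome = np.where(score + rng.normal(0, 0.5, n_bouts) > np.median(score), "Win", "Loss")

tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=20, random_state=0)
dfa = LinearDiscriminantAnalysis()

for name, model in [("decision tree", tree), ("DFA (LDA)", dfa)]:
    acc = cross_val_score(model, bouts, outcome, cv=5, scoring="accuracy").mean()
    print(f"{name}: {acc:.1%} cross-validated accuracy")

# Fitting the tree on all bouts then exposes the indicator combinations leading to a Win
tree.fit(bouts, outcome)
```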

  9. Competence Description for Personal Recommendations: The Importance of Identifying the Complexity of Learning and Performance Situations

    Science.gov (United States)

    Prins, Frans J.; Nadolski, Rob J.; Berlanga, Adriana J.; Drachsler, Hendrik; Hummel, Hans G. K.; Koper, Rob

    2008-01-01

    For competences development of learners and professionals, target competences and corresponding competence development opportunities have to be identified. Personal Recommender Systems (PRS) provide personal recommendations for learners aimed at finding and selecting learning activities that best match their needs. This article argues that a…

  10. Choking under Pressure: When an Additional Positive Stereotype Affects Performance for Domain Identified Male Mathematics Students

    Science.gov (United States)

    Rosenthal, Harriet E. S.; Crisp, Richard J.

    2007-01-01

    This research aimed to establish if the presentation of two positive stereotypes would result in choking under pressure for identified male mathematics students. Seventy-five 16 year old men, who had just commenced their AS-level study, were either made aware of their gender group membership (single positive stereotype), their school group…

  11. Identifying key performance indicators in food technology contract R&D

    NARCIS (Netherlands)

    Flipse, S.M.; Sanden, van der M.C.A.; Velden, van der T.; Fortuin, F.T.J.M.; Omta, S.W.F.; Osseweijer, P.

    2013-01-01

    Innovating companies increasingly rely on outsourcing to Contract Research Organisations (CROs) for their Research and Development (R&D), which are largely understudied. This paper presents the outcome of a case study in the field of food technology contract research, identifying context

  12. Critical Thinking Skills among Elementary School Students: Comparing Identified Gifted and General Education Student Performance

    Science.gov (United States)

    Kettler, Todd

    2014-01-01

    Education reform efforts, including the current adoption of Common Core State Standards, have increased attention to teaching critical thinking skills to all students. This study investigated the critical thinking skills of fourth-grade students from a school district in Texas, including 45 identified gifted students and 163 general education…

  13. Identifying competitive strategies to improve the performance of hospitals in a competitive environment.

    Science.gov (United States)

    Chang, Chuan-Hui; Chiao, Yu-Ching; Tsai, Yafang

    2017-11-21

    This study is based on competitive dynamics theory, and discusses competitive actions (including their implementation requirements, strategic orientation, and action complexity) that influence hospitals' performance, while also meeting the requirements of Taiwan's "global budget" insurance payment policy. In order to investigate the possible actions of hospitals, the study was conducted in two stages. The first stage investigated the actions of hospitals from March 1 to May 31, 2009. Semi-structured questionnaires were used, which included in-depth interviews with senior supervisors of 10 medium- and large-scale hospitals in central Taiwan. This stage collected data related to the types of actions adopted by the hospitals in previous years. The second stage was based on the data collected from the first stage and on developed questionnaires, which were distributed from June 29 to November 1, 2009. The questionnaires were given to 20 superintendents, deputy superintendents, and supervisors responsible for the management of a hospital, and focused on medical centers and regional hospitals in central Taiwan in order to determine the types and number of competitive actions. First, the strategic orientation of an action has a significantly positive influence on subjective performance. Second, action complexity has a significantly positive influence on the subjective and the objective performance of a hospital. Third, the implementation requirements of actions do not have a significantly positive impact on the subjective or the objective performance of a hospital. Managers facing a competitive healthcare environment should adopt competitive strategies to improve the performance of the hospital.

  14. Application of the Pareto principle to identify and address drug-therapy safety issues.

    Science.gov (United States)

    Müller, Fabian; Dormann, Harald; Pfistermeister, Barbara; Sonst, Anja; Patapovas, Andrius; Vogler, Renate; Hartmann, Nina; Plank-Kiegele, Bettina; Kirchner, Melanie; Bürkle, Thomas; Maas, Renke

    2014-06-01

    Adverse drug events (ADE) and medication errors (ME) are common causes of morbidity in patients presenting at emergency departments (ED). Recognition of ADE as being drug related and prevention of ME are key to enhancing pharmacotherapy safety in ED. We assessed the applicability of the Pareto principle (~80 % of effects result from 20 % of causes) to address locally relevant problems of drug therapy. In 752 cases consecutively admitted to the nontraumatic ED of a major regional hospital, ADE, ME, contributing drugs, preventability, and detection rates of ADE by ED staff were investigated. Symptoms, errors, and drugs were sorted by frequency in order to apply the Pareto principle. In total, 242 ADE were observed, and 148 (61.2 %) were assessed as preventable. ADE contributed to 110 inpatient hospitalizations. The ten most frequent symptoms were causally involved in 88 (80.0 %) inpatient hospitalizations. Only 45 (18.6 %) ADE were recognized as drug-related problems before discharge from the ED. A limited set of 33 drugs accounted for 184 (76.0 %) ADE; ME contributed to 57 ADE. Frequency-based listing of ADE, ME, and drugs involved allowed identification of the most relevant problems and development of easy-to-implement safety measures, such as wall and pocket charts. The Pareto principle provides a method for identifying the locally most relevant ADE, ME, and involved drugs. This permits subsequent development of interventions to increase patient safety in the ED admission process that best suit local needs.
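
    The screening step itself is simple to express: rank the contributing drugs by the number of events and keep the smallest leading set that covers roughly 80% of them. The sketch below uses made-up data purely to show the mechanics.

```python
# A small Pareto screen: which drugs account for ~80% of adverse drug events?
import pandas as pd

ade = pd.DataFrame({
    "drug": ["warfarin", "insulin", "digoxin", "furosemide", "aspirin",
             "warfarin", "insulin", "warfarin", "digoxin", "insulin"],
})

counts = ade["drug"].value_counts()                 # events per drug, descending
cumulative_share = counts.cumsum() / counts.sum()

# Smallest leading set of drugs that accounts for at least 80% of ADE
pareto_set = cumulative_share[cumulative_share < 0.80].index.tolist()
if len(pareto_set) < len(counts):
    pareto_set.append(cumulative_share.index[len(pareto_set)])  # drug that crosses 80%

print("Drugs covering ~80% of ADE:", pareto_set)
```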

  15. Identifying seasonal mobility profiles from anonymized and aggregated mobile phone data. Application in food security.

    Science.gov (United States)

    Zufiria, Pedro J; Pastor-Escuredo, David; Úbeda-Medina, Luis; Hernandez-Medina, Miguel A; Barriales-Valbuena, Iker; Morales, Alfredo J; Jacques, Damien C; Nkwambi, Wilfred; Diop, M Bamba; Quinn, John; Hidalgo-Sanchís, Paula; Luengo-Oroz, Miguel

    2018-01-01

    We propose a framework for the systematic analysis of mobile phone data to identify relevant mobility profiles in a population. The proposed framework allows finding distinct human mobility profiles based on the digital trace of mobile phone users characterized by a Matrix of Individual Trajectories (IT-Matrix). This matrix gathers a consistent and regularized description of individual trajectories that enables multi-scale representations along time and space, which can be used to extract aggregated indicators such as a dynamic multi-scale population count. Unsupervised clustering of individual trajectories generates mobility profiles (clusters of similar individual trajectories) which characterize relevant group behaviors preserving optimal aggregation levels for detailed and privacy-secured mobility characterization. The application of the proposed framework is illustrated by analyzing fully anonymized data on human mobility from mobile phones in Senegal at the arrondissement level over a calendar year. The analysis of monthly mobility patterns at the livelihood zone resolution resulted in the discovery and characterization of seasonal mobility profiles related with economic activities, agricultural calendars and rainfalls. The use of these mobility profiles could support the timely identification of mobility changes in vulnerable populations in response to external shocks (such as natural disasters, civil conflicts or sudden increases of food prices) to monitor food security.
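
    The profiling step reduces to an unsupervised clustering of per-user trajectory descriptors. The sketch below is a simplified stand-in: it assumes each row of a synthetic IT-Matrix already summarizes one user's presence across zones and months, standardizes it, and picks the number of k-means profiles by silhouette score. The real pipeline's trajectory construction, spatial scales and clustering choices are not reproduced here.

```python
# Illustrative clustering of trajectory descriptors into mobility profiles.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(0)
it_matrix = rng.random((2000, 12 * 8))   # 2000 users x (12 months x 8 zones), synthetic

X = StandardScaler().fit_transform(it_matrix)

# Choose the number of mobility profiles by silhouette score
best_k, best_score = 2, -1.0
for k in range(2, 8):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    score = silhouette_score(X, labels)
    if score > best_score:
        best_k, best_score = k, score

profiles = KMeans(n_clusters=best_k, n_init=10, random_state=0).fit(X)
print(f"{best_k} mobility profiles; cluster sizes:", np.bincount(profiles.labels_))
```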

  16. Identifying seasonal mobility profiles from anonymized and aggregated mobile phone data. Application in food security.

    Directory of Open Access Journals (Sweden)

    Pedro J Zufiria

    Full Text Available We propose a framework for the systematic analysis of mobile phone data to identify relevant mobility profiles in a population. The proposed framework allows finding distinct human mobility profiles based on the digital trace of mobile phone users characterized by a Matrix of Individual Trajectories (IT-Matrix). This matrix gathers a consistent and regularized description of individual trajectories that enables multi-scale representations along time and space, which can be used to extract aggregated indicators such as a dynamic multi-scale population count. Unsupervised clustering of individual trajectories generates mobility profiles (clusters of similar individual trajectories) which characterize relevant group behaviors preserving optimal aggregation levels for detailed and privacy-secured mobility characterization. The application of the proposed framework is illustrated by analyzing fully anonymized data on human mobility from mobile phones in Senegal at the arrondissement level over a calendar year. The analysis of monthly mobility patterns at the livelihood zone resolution resulted in the discovery and characterization of seasonal mobility profiles related with economic activities, agricultural calendars and rainfalls. The use of these mobility profiles could support the timely identification of mobility changes in vulnerable populations in response to external shocks (such as natural disasters, civil conflicts or sudden increases of food prices) to monitor food security.

  17. Identifying Key Features of Student Performance in Educational Video Games and Simulations through Cluster Analysis

    Science.gov (United States)

    Kerr, Deirdre; Chung, Gregory K. W. K.

    2012-01-01

    The assessment cycle of "evidence-centered design" (ECD) provides a framework for treating an educational video game or simulation as an assessment. One of the main steps in the assessment cycle of ECD is the identification of the key features of student performance. While this process is relatively simple for multiple choice tests, when…

  18. Identifying Critical Success Factors for TQM and Employee Performance in Malaysian Automotive Industry: A Literature Review

    Science.gov (United States)

    Nadia Dedy, Aimie; Zakuan, Norhayati; Zaidi Bahari, Ahamad; Ariff, Mohd Shoki Md; Chin, Thoo Ai; Zameri Mat Saman, Muhamad

    2016-05-01

    TQM is a management philosophy that embraces all activities through which the needs and expectations of the customer and the community, and the goals of the company, are satisfied in the most efficient and cost-effective way, by maximizing the potential of all workers in a continuing drive for total quality improvement. TQM is very important to companies, especially in the automotive industry, if they are to survive in the competitive global market. The main objective of this study is to review the relationship between TQM and employee performance. The authors review the updated literature on TQM with two main targets: (a) the evolution of TQM considered as a set of practices, and (b) its impact on employee performance. Accordingly, two research questions are proposed in order to review TQM constructs and employee performance measures: (a) Is the set of critical success factors associated with TQM valid as a whole? (b) Which critical success factors should be considered to measure employee performance in the automotive industry?

  19. Identifying and Validating Selection Tools for Predicting Officer Performance and Retention

    Science.gov (United States)

    2017-05-01

    Edited by Teresa L. Russell and Cheryl J. Paullin (Human Resources Research Organization) and by Peter J. Legree and Robert N. Kilcullen. Prepared for the Department of the Army by the Human Resources Research Organization; technical review by Rebekkah Beeco, U.S. Army Research Institute. Criterion measures include retention (i.e., Career Intentions) and four job performance dimensions: (a) Technical Task Proficiency (TTP); (b) Management, Administration, and

  20. A New Tool for Identifying Research Standards and Evaluating Research Performance

    Science.gov (United States)

    Bacon, Donald R.; Paul, Pallab; Stewart, Kim A.; Mukhopadhyay, Kausiki

    2012-01-01

    Much has been written about the evaluation of faculty research productivity in promotion and tenure decisions, including many articles that seek to determine the rank of various marketing journals. Yet how faculty evaluators combine journal quality, quantity, and author contribution to form judgments of a scholar's performance is unclear. A…

  1. The comparison of the performance of two screening strategies identifying newly-diagnosed HIV during pregnancy

    NARCIS (Netherlands)

    Boer, K.; Smit, C.; Flier, M. van der; Wolf, F. de; Koopmans †, P.P.; Crevel, R. van; Eggink, A.J.; Groot, R. de; Keuter, M.; Post, F.; Ven, A.J.A.M. van der; Warris, A.; et al.,

    2011-01-01

    BACKGROUND: In the Netherlands, a non-selective opt-out instead of a selective opt-in antenatal HIV screening strategy was implemented in 2004. In case of infection, screening was followed by prevention of mother-to-child-transmission (PMTCT). We compared the performance of the two strategies in

  2. The comparison of the performance of two screening strategies identifying newly-diagnosed HIV during pregnancy

    NARCIS (Netherlands)

    Boer, Kees; Smit, Colette; van der Flier, Michiel; de Wolf, Frank; Bezemer, D. O.; Gras, L. A. J.; Kesselring, A. M.; van Sighem, A. I.; Smit, C.; Zhang, S.; Zaheri, S.; Bronsveld, W.; Hillebrand-Haverkort, M. E.; Prins, J. M.; Branger, J.; Eeftinck Schattenkerk, J. K. M.; Gisolf, J.; Godfried, M. H.; Lange, J. M. A.; Lettinga, K. D.; van der Meer, J. T. M.; Nellen, F. J. B.; van der Poll, T.; Reiss, P.; Ruys, Th A.; Steingrover, R.; van Twillert, G.; Vermeulen, J. N.; Vrouenraets, S. M. E.; van Vugt, M.; Wit, F. W. M. N.; Kuijpers, T. W.; Pajkrt, D.; Scherpbier, H. J.; van Eeden, A.; Brinkman, K.; van den Berk, G. E. L.; Blok, W. L.; Frissen, P. H. J.; Roos, J. C.; Schouten, W. E. M.; Bekendam, D. J.; Weigel, H. M.; Mulder, J. W.; van Gorp, E. C. M.; Wagenaar, J.; Veenstra, J.; Danner, S. A.; van Agtmael, M. A.; Claessen, F. A. P.

    2011-01-01

    In the Netherlands, a non-selective opt-out instead of a selective opt-in antenatal HIV screening strategy was implemented in 2004. In case of infection, screening was followed by prevention of mother-to-child-transmission (PMTCT). We compared the performance of the two strategies in terms of

  3. Performance Issues in High Performance Fortran Implementations of Sensor-Based Applications

    Directory of Open Access Journals (Sweden)

    David R. O'hallaron

    1997-01-01

    Full Text Available Applications that get their inputs from sensors are an important and often overlooked application domain for High Performance Fortran (HPF). Such sensor-based applications typically perform regular operations on dense arrays, and often have latency and throughput requirements that can only be achieved with parallel machines. This article describes a study of sensor-based applications, including the fast Fourier transform, synthetic aperture radar imaging, narrowband tracking radar processing, multibaseline stereo imaging, and medical magnetic resonance imaging. The applications are written in a dialect of HPF developed at Carnegie Mellon, and are compiled by the Fx compiler for the Intel Paragon. The main results of the study are that (1) it is possible to realize good performance for realistic sensor-based applications written in HPF and (2) the performance of the applications is determined by the performance of three core operations: independent loops (i.e., loops with no dependences between iterations), reductions, and index permutations. The article discusses the implications for HPF implementations and introduces some simple tests that implementers and users can use to measure the efficiency of the loops, reductions, and index permutations generated by an HPF compiler.
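
    For readers unfamiliar with the three core operations named above, the following plain NumPy fragment shows what each one looks like in array terms. It is not HPF and says nothing about the Fx compiler; it simply makes the vocabulary concrete.

```python
# The three core operations from the study, expressed as array operations.
import numpy as np

x = np.random.rand(1024, 1024)

# 1. Independent loop: the same element-wise work on every row, with no
#    dependence between iterations (parallelizable across rows).
rows_scaled = x * 2.0 - 1.0

# 2. Reduction: combine many values into one (or one per row/column).
column_sums = x.sum(axis=0)

# 3. Index permutation: data movement such as a transpose or a reordering of
#    rows, the communication-heavy step in distributed FFTs.
transposed = x.T.copy()
permutation = np.random.permutation(x.shape[0])
shuffled_rows = x[permutation, :]

print(rows_scaled.shape, column_sums.shape, transposed.shape, shuffled_rows.shape)
```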

  4. Liver Function Indicators Performed Better to Eliminate Cardioembolic Stroke than to Identify It from Stroke Subtypes.

    Science.gov (United States)

    Tan, Ge; Yuan, Ruozhen; Hao, Zilong; Lei, Chunyan; Xiong, Yao; Xu, Mangmang; Liu, Ming

    2017-01-01

    Identifying the etiology of ischemic stroke is essential to acute management and secondary prevention. The value of liver function indicators in differentiating stroke subtypes remains to be evaluated. A total of 1333 acute ischemic stroke patients were included. Liver function indicators collected within 24 hours from stroke onset, including alanine aminotransferase, aspartate aminotransferase (AST), alkaline phosphatase, gamma-glutamyl transpeptidase (GGT), and bilirubin (BILI), were collapsed into quartiles (Q) and also dichotomized by Q1. Multivariate regression analysis was conducted to identify the independent association between liver function indicators and cardioembolic stroke (SCE). Area under the curve (AUC) of receiver operating characteristic analysis was conducted, and sensitivity (Sen), specificity (Spe), positive predictive value (PPV), and negative predictive value (NPV) were determined to evaluate the predictive value of liver function indicators for SCE. AST, GGT, and BILI were associated with SCE. After adjustment, only AST was related to SCE independently. The incidence of SCE in the Q1 of AST, GGT, and BILI, particularly in the Q1 of AST, was quite low. The ability of AST, GGT, and BILI to identify SCE was poor, with low AUC, Sen, and PPV. The value of AST, GGT, and BILI in eliminating SCE from stroke subtypes was good, with high Spe and moderate NPV, and was enhanced when the liver function indicators were combined. Results of the present study demonstrated that AST, GGT, and BILI, particularly AST, had a potential to eliminate SCE from stroke subtypes, and that this ability was strengthened when the liver function indicators were combined. Copyright © 2017 National Stroke Association. Published by Elsevier Inc. All rights reserved.
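
    The evaluation described (AUC plus Sen/Spe/PPV/NPV for a Q1-dichotomized marker) maps directly onto standard tooling. The sketch below runs that evaluation on synthetic data; the variable names and the synthetic AST distribution are assumptions for illustration only.

```python
# AUC, sensitivity, specificity, PPV and NPV for a dichotomized marker vs. SCE.
import numpy as np
from sklearn.metrics import roc_auc_score, confusion_matrix

rng = np.random.default_rng(1)
is_sce = rng.integers(0, 2, 500)                    # 1 = cardioembolic stroke (synthetic)
ast = rng.gamma(shape=2.0, scale=15 + 5 * is_sce)   # synthetic AST values

auc = roc_auc_score(is_sce, ast)

# Dichotomize at the first quartile (Q1), as in the abstract
above_q1 = (ast > np.percentile(ast, 25)).astype(int)
tn, fp, fn, tp = confusion_matrix(is_sce, above_q1).ravel()

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
ppv = tp / (tp + fp)
npv = tn / (tn + fn)
print(f"AUC={auc:.2f}  Sen={sensitivity:.2f}  Spe={specificity:.2f}  PPV={ppv:.2f}  NPV={npv:.2f}")
```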

  5. Application of gene network analysis techniques identifies AXIN1/PDIA2 and endoglin haplotypes associated with bicuspid aortic valve.

    Directory of Open Access Journals (Sweden)

    Eric C Wooten

    2010-01-01

    Full Text Available Bicuspid Aortic Valve (BAV) is a highly heritable congenital heart defect. The low frequency of BAV (1% of the general population) limits our ability to perform genome-wide association studies. We present the application of four a priori SNP selection techniques, reducing the multiple-testing penalty by restricting analysis to SNPs relevant to BAV in a genome-wide SNP dataset from a cohort of 68 BAV probands and 830 control subjects. Two knowledge-based approaches, CANDID and STRING, were used to systematically identify BAV genes, and their SNPs, from the published literature, microarray expression studies and a genome scan. We additionally tested Functionally Interpolating SNPs (fitSNPs) present on the array; the fourth consisted of SNPs selected by Random Forests, a machine learning approach. These approaches reduced the multiple testing penalty by lowering the fraction of the genome probed to 0.19% of the total, while increasing the likelihood of studying SNPs within relevant BAV genes and pathways. Three loci were identified by CANDID, STRING, and fitSNPs. A haplotype within the AXIN1-PDIA2 locus (p-value of 2.926 x 10^-6) and a haplotype within the Endoglin gene (p-value of 5.881 x 10^-4) were found to be strongly associated with BAV. The Random Forests approach identified a SNP on chromosome 3 in association with BAV (p-value 5.061 x 10^-6). The results presented here support an important role for genetic variants in BAV and provide support for additional studies in well-powered cohorts. Further, these studies demonstrate that leveraging existing expression and genomic data in the context of GWAS studies can identify biologically relevant genes and pathways associated with a congenital heart defect.
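
    The Random Forests arm of such a pre-selection can be sketched quickly: fit a forest to case/control status on additively coded genotypes and keep the highest-importance SNPs for focused association testing. The data below are synthetic, and the settings (500 trees, balanced class weights, top 50 SNPs) are illustrative assumptions rather than the study's configuration.

```python
# Illustrative Random Forest ranking of SNPs before association testing.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(2)
n_cases, n_controls, n_snps = 68, 830, 5000
genotypes = rng.integers(0, 3, size=(n_cases + n_controls, n_snps))   # 0/1/2 minor-allele counts
has_bav = np.r_[np.ones(n_cases, dtype=int), np.zeros(n_controls, dtype=int)]

forest = RandomForestClassifier(
    n_estimators=500,
    class_weight="balanced",   # cases are rare relative to controls
    random_state=0,
    n_jobs=-1,
)
forest.fit(genotypes, has_bav)

# Keep the top-ranked SNPs for focused association testing
top = np.argsort(forest.feature_importances_)[::-1][:50]
print("Candidate SNP indices:", top[:10])
```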

  6. Applications Performance on NAS Intel Paragon XP/S - 15#

    Science.gov (United States)

    Saini, Subhash; Simon, Horst D.; Copper, D. M. (Technical Monitor)

    1994-01-01

    The Numerical Aerodynamic Simulation (NAS) Systems Division received an Intel Touchstone Sigma prototype model Paragon XP/S-15 in February 1993. The i860 XP microprocessor, with an integrated floating point unit and operating in dual-instruction mode, gives a peak performance of 75 million floating point operations (MFLOPS) per second for 64-bit floating point arithmetic. It is used in the Paragon XP/S-15 which has been installed at NAS, NASA Ames Research Center. The NAS Paragon has 208 nodes and its peak performance is 15.6 GFLOPS. Here, we report on early experience using the Paragon XP/S-15. We have tested its performance using both kernels and applications of interest to NAS. We have measured the performance of BLAS 1, 2 and 3, both assembly-coded and Fortran-coded, on the NAS Paragon XP/S-15. Furthermore, we have investigated the performance of a single-node one-dimensional FFT, a distributed two-dimensional FFT and a distributed three-dimensional FFT. Finally, we measured the performance of the NAS Parallel Benchmarks (NPB) on the Paragon and compare it with the performance obtained on other highly parallel machines, such as the CM-5, CRAY T3D, IBM SP1, etc. In particular, we investigated the following issues, which can strongly affect the performance of the Paragon: a. Impact of the operating system: Intel currently uses as a default the OSF/1 AD operating system from the Open Software Foundation. Paging of the Open Software Foundation (OSF) server at 22 MB, done to make more memory available for the application, degrades performance. We found that when the limit of 26 MB per node out of the 32 MB available is reached, the application is paged out of main memory using virtual memory. When the application starts paging, its performance is considerably reduced. We found that dynamic memory allocation can help application performance under certain circumstances. b. Impact of the data cache on the i860/XP: We measured the performance of the BLAS both assembly coded and Fortran

  7. Goodbye or Identify: Detrimental Effects of Downsizing on Identification and Survivor Performance

    Science.gov (United States)

    van Dick, Rolf; Drzensky, Frank; Heinz, Matthias

    2016-01-01

    Research shows that after layoffs, employees often report decreased commitment and performance which has been coined the survivor syndrome. However, the mechanisms underlying this effect remain underexplored. The purpose of the paper is to show that reduced organizational identification can serve as an explanation for the survivor syndrome. We conducted a laboratory experiment, in which participants work as a group of employees for another participant who acts as employer. In the course of the experiment, the employer decides whether one of his or her employees should be laid off or not. Mediation analysis supports a social identity-based explanation for the emergence of the survivor syndrome: downsizing causes lower identification with the employer which in turn relates to lower performance of employees. PMID:27252674

  8. Identifying significant uncertainties in thermally dependent processes for repository performance analysis

    International Nuclear Information System (INIS)

    Gansemer, J.D.; Lamont, A.

    1994-01-01

    In order to study the performance of the potential Yucca Mountain Nuclear Waste Repository, scientific investigations are being conducted to reduce the uncertainty about process models and system parameters. This paper is intended to demonstrate a method for determining a strategy for the cost effective management of these investigations. It is not meant to be a complete study of all processes and interactions, but does outline a method which can be applied to more in-depth investigations

  9. Identifying individual changes in performance with composite quality indicators while accounting for regression to the mean.

    Science.gov (United States)

    Gajewski, Byron J; Dunton, Nancy

    2013-04-01

    Almost a decade ago Morton and Torgerson indicated that perceived medical benefits could be due to "regression to the mean." Despite this caution, the regression to the mean "effects on the identification of changes in institutional performance do not seem to have been considered previously in any depth" (Jones and Spiegelhalter). As a response, Jones and Spiegelhalter provide a methodology to adjust for regression to the mean when modeling recent changes in institutional performance for one-variable quality indicators. Therefore, in our view, Jones and Spiegelhalter provide a breakthrough methodology for performance measures. At the same time, in the interests of parsimony, it is useful to aggregate individual quality indicators into a composite score. Our question is, can we develop and demonstrate a methodology that extends the "regression to the mean" literature to allow for composite quality indicators? Using a latent variable modeling approach, we extend the methodology to the composite indicator case. We demonstrate the approach on 4 indicators collected by the National Database of Nursing Quality Indicators. A simulation study further demonstrates its "proof of concept."
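
    To see why an adjustment matters at all, a toy simulation is enough: units with extreme baseline scores appear to "improve" even when nothing changes, and shrinking the baseline toward the mean removes much of that artefact. The code below is only that toy illustration; it is neither the Jones and Spiegelhalter model nor the latent-variable composite approach proposed in the abstract.

```python
# Toy demonstration of regression to the mean and a simple shrinkage adjustment.
import numpy as np

rng = np.random.default_rng(3)
n_units = 200
true_quality = rng.normal(0, 1, n_units)              # stable latent performance
baseline = true_quality + rng.normal(0, 1, n_units)   # noisy measurement, period 1
follow_up = true_quality + rng.normal(0, 1, n_units)  # noisy measurement, period 2

raw_change = follow_up - baseline

# Shrink the baseline toward the grand mean in proportion to its reliability.
# (The variances are known here only because the data are simulated.)
reliability = true_quality.var() / (true_quality.var() + 1.0)
shrunk_baseline = baseline.mean() + reliability * (baseline - baseline.mean())
adjusted_change = follow_up - shrunk_baseline

worst = np.argsort(baseline)[:20]   # the 20 units that looked worst at baseline
print("apparent improvement of worst units, raw change:     ", raw_change[worst].mean())
print("apparent improvement of worst units, adjusted change:", adjusted_change[worst].mean())
```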

  10. New application of intelligent agents in sporadic amyotrophic lateral sclerosis identifies unexpected specific genetic background

    Directory of Open Access Journals (Sweden)

    Marocchi Alessandro

    2008-05-01

    Full Text Available Background: Few genetic factors predisposing to the sporadic form of amyotrophic lateral sclerosis (ALS) have been identified, but the pathology itself seems to be a true multifactorial disease in which complex interactions between environmental and genetic susceptibility factors take place. The purpose of this study was to approach genetic data with an innovative statistical method such as artificial neural networks to identify a possible genetic background predisposing to the disease. A DNA multiarray panel was applied to genotype more than 60 polymorphisms within 35 genes selected from pathways of lipid and homocysteine metabolism, regulation of blood pressure, coagulation, inflammation, cellular adhesion and matrix integrity, in 54 sporadic ALS patients and 208 controls. Advanced intelligent systems based on novel coupling of artificial neural networks and evolutionary algorithms have been applied. The results obtained have been compared with those derived from the use of standard neural networks and classical statistical analysis. Results: Advanced intelligent systems based on novel coupling of artificial neural networks and evolutionary algorithms have been applied. The results obtained have been compared with those derived from the use of standard neural networks and classical statistical analysis. An unexpected discovery of a strong genetic background in sporadic ALS using a DNA multiarray panel and analytical processing of the data with advanced artificial neural networks was found. The predictive accuracy obtained with Linear Discriminant Analysis and Standard Artificial Neural Networks ranged from 70% to 79% (average 75.31%) and from 69.1% to 86.2% (average 76.6%), respectively. The corresponding value obtained with Advanced Intelligent Systems reached an average of 96.0% (range 94.4% to 97.6%). This latter approach allowed the identification of seven genetic variants essential to differentiate cases from controls: apolipoprotein E arg

  11. New application of intelligent agents in sporadic amyotrophic lateral sclerosis identifies unexpected specific genetic background.

    Science.gov (United States)

    Penco, Silvana; Buscema, Massimo; Patrosso, Maria Cristina; Marocchi, Alessandro; Grossi, Enzo

    2008-05-30

    Few genetic factors predisposing to the sporadic form of amyotrophic lateral sclerosis (ALS) have been identified, but the pathology itself seems to be a true multifactorial disease in which complex interactions between environmental and genetic susceptibility factors take place. The purpose of this study was to approach genetic data with an innovative statistical method such as artificial neural networks to identify a possible genetic background predisposing to the disease. A DNA multiarray panel was applied to genotype more than 60 polymorphisms within 35 genes selected from pathways of lipid and homocysteine metabolism, regulation of blood pressure, coagulation, inflammation, cellular adhesion and matrix integrity, in 54 sporadic ALS patients and 208 controls. Advanced intelligent systems based on novel coupling of artificial neural networks and evolutionary algorithms have been applied. The results obtained have been compared with those derived from the use of standard neural networks and classical statistical analysis. Advanced intelligent systems based on novel coupling of artificial neural networks and evolutionary algorithms have been applied. The results obtained have been compared with those derived from the use of standard neural networks and classical statistical analysis. An unexpected discovery of a strong genetic background in sporadic ALS using a DNA multiarray panel and analytical processing of the data with advanced artificial neural networks was found. The predictive accuracy obtained with Linear Discriminant Analysis and Standard Artificial Neural Networks ranged from 70% to 79% (average 75.31%) and from 69.1 to 86.2% (average 76.6%) respectively. The corresponding value obtained with Advanced Intelligent Systems reached an average of 96.0% (range 94.4 to 97.6%). This latter approach allowed the identification of seven genetic variants essential to differentiate cases from controls: apolipoprotein E arg158cys; hepatic lipase -480 C/T; endothelial
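
    A scaled-down version of the comparison reported above can be run with off-the-shelf tools: a linear discriminant baseline against a small neural network on a genotype panel. scikit-learn's MLPClassifier is used here only as a generic stand-in for the authors' coupled neural-network/evolutionary-algorithm systems, and the genotype data are synthetic, so the accuracies will not match those in the abstract.

```python
# Illustrative LDA vs. neural-network comparison on a synthetic SNP panel.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)
n_patients, n_controls, n_variants = 54, 208, 60
genotypes = rng.integers(0, 3, size=(n_patients + n_controls, n_variants)).astype(float)
is_als = np.r_[np.ones(n_patients, dtype=int), np.zeros(n_controls, dtype=int)]

for name, model in [
    ("LDA", LinearDiscriminantAnalysis()),
    ("MLP", MLPClassifier(hidden_layer_sizes=(30,), max_iter=2000, random_state=0)),
]:
    acc = cross_val_score(model, genotypes, is_als, cv=5, scoring="accuracy").mean()
    print(f"{name}: {acc:.1%} cross-validated accuracy")
```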

  12. Application of small RNA sequencing to identify microRNAs in acute kidney injury and fibrosis

    Energy Technology Data Exchange (ETDEWEB)

    Pellegrini, Kathryn L. [Department of Medicine, Renal Division, Brigham and Women' s Hospital, Harvard Medical School, Boston, MA (United States); Gerlach, Cory V. [Department of Medicine, Renal Division, Brigham and Women' s Hospital, Harvard Medical School, Boston, MA (United States); Department of Environmental Health, Harvard T.H. Chan School of Public Health, Boston, MA (United States); Laboratory of Systems Pharmacology, Harvard Program in Therapeutic Sciences, Harvard Medical School, Boston, MA (United States); Craciun, Florin L.; Ramachandran, Krithika [Department of Medicine, Renal Division, Brigham and Women' s Hospital, Harvard Medical School, Boston, MA (United States); Bijol, Vanesa [Department of Pathology, Brigham and Women' s Hospital, Harvard Medical School, Boston, MA (United States); Kissick, Haydn T. [Department of Surgery, Urology Division, Beth Israel Deaconess Medical Center, Harvard Medical School, Boston, MA (United States); Vaidya, Vishal S., E-mail: vvaidya@bwh.harvard.edu [Department of Medicine, Renal Division, Brigham and Women' s Hospital, Harvard Medical School, Boston, MA (United States); Department of Environmental Health, Harvard T.H. Chan School of Public Health, Boston, MA (United States); Laboratory of Systems Pharmacology, Harvard Program in Therapeutic Sciences, Harvard Medical School, Boston, MA (United States)

    2016-12-01

    Establishing a microRNA (miRNA) expression profile in affected tissues provides an important foundation for the discovery of miRNAs involved in the development or progression of pathologic conditions. We conducted small RNA sequencing to generate a temporal profile of miRNA expression in the kidneys using a mouse model of folic acid-induced (250 mg/kg i.p.) kidney injury and fibrosis. From the 103 miRNAs that were differentially expressed over the time course (> 2-fold, p < 0.05), we chose to further investigate miR-18a-5p, which is expressed during the acute stage of the injury; miR-132-3p, which is upregulated during transition between acute and fibrotic injury; and miR-146b-5p, which is highly expressed at the peak of fibrosis. Using qRT-PCR, we confirmed the increased expression of these candidate miRNAs in the folic acid model as well as in other established mouse models of acute injury (ischemia/reperfusion injury) and fibrosis (unilateral ureteral obstruction). In situ hybridization confirmed high expression of miR-18a-5p, miR-132-3p and miR-146b-5p throughout the kidney cortex in mice and humans with severe kidney injury or fibrosis. When primary human proximal tubular epithelial cells were treated with model nephrotoxicants such as cadmium chloride (CdCl2), arsenic trioxide, aristolochic acid (AA), potassium dichromate (K2Cr2O7) and cisplatin, miR-132-3p was upregulated 4.3-fold after AA treatment and 1.5-fold after K2Cr2O7 and CdCl2 treatment. These results demonstrate the application of temporal small RNA sequencing to identify miR-18a, miR-132 and miR-146b as differentially expressed miRNAs during distinct phases of kidney injury and fibrosis progression. - Highlights: • We used small RNA sequencing to identify differentially expressed miRNAs in kidney. • Distinct patterns were found for acute injury and fibrotic stages in the kidney. • Upregulation of miR-18a, -132 and -146b was confirmed in mice
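
    The first screening step, flagging miRNAs that change more than 2-fold with p < 0.05, is easy to sketch. The example below uses synthetic count data, a simple counts-per-million normalization and a Welch t-test; the study's actual normalization and testing procedure are not reproduced here.

```python
# Illustrative differential-expression screen: >2-fold change with p < 0.05.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
n_mirnas, n_control, n_injured = 1000, 4, 4
counts = rng.negative_binomial(n=5, p=0.01, size=(n_mirnas, n_control + n_injured)) + 1

# Library-size normalization to counts per million, then log2
cpm = counts / counts.sum(axis=0, keepdims=True) * 1e6
log_cpm = np.log2(cpm + 1)

control, injured = log_cpm[:, :n_control], log_cpm[:, n_control:]
log2_fold_change = injured.mean(axis=1) - control.mean(axis=1)
p_values = stats.ttest_ind(injured, control, axis=1, equal_var=False).pvalue

differential = (np.abs(log2_fold_change) > 1) & (p_values < 0.05)   # >2-fold, p < 0.05
print(f"{differential.sum()} candidate miRNAs flagged")
```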

  13. Application of Entropy-Based Metrics to Identify Emotional Distress from Electroencephalographic Recordings

    Directory of Open Access Journals (Sweden)

    Beatriz García-Martínez

    2016-06-01

    Full Text Available Recognition of emotions is still an unresolved challenge; solving it could help improve current human-machine interfaces. Recently, nonlinear analysis of some physiological signals has been shown to play a more relevant role in this context than their traditional linear exploration. Thus, the present work introduces for the first time the application of three recent entropy-based metrics: sample entropy (SE), quadratic SE (QSE) and distribution entropy (DE), to discern between emotional states of calm and negative stress (also called distress). In the last few years, distress has received growing attention because it is a common negative factor in the modern lifestyle of people from developed countries and, moreover, it may lead to serious mental and physical health problems. Specifically, 279 segments of 32-channel electroencephalographic (EEG) recordings from 32 subjects elicited to be calm or negatively stressed have been analyzed. Results show that QSE is the first single metric presented to date with the ability to identify negative stress. Indeed, this metric has reported a discriminant ability of around 70%, which is only slightly lower than the one obtained by some previous works. Nonetheless, discriminant models from dozens or even hundreds of features have been previously obtained by using advanced classifiers to yield diagnostic accuracies of about 80%. Moreover, in agreement with previous neuroanatomy findings, QSE has also revealed notable differences for all the brain regions in the neural activation triggered by the two considered emotions. Consequently, given these results, as well as the easy interpretation of QSE, this work opens a new standpoint in the detection of emotional distress, which may yield new insights about the brain’s behavior under this negative emotion.
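
    Sample entropy, the simplest of the three metrics, can be written in a few lines and makes the idea of signal irregularity concrete. The implementation below uses common defaults (m = 2, tolerance 0.2 times the standard deviation), which are assumptions rather than the study's settings, and it omits quadratic SE and distribution entropy.

```python
# Minimal sample entropy (SampEn) implementation for a 1-D signal.
import numpy as np

def sample_entropy(x, m=2, r_factor=0.2):
    x = np.asarray(x, dtype=float)
    r = r_factor * x.std()

    def count_matches(length):
        # Overlapping template vectors of the given length
        templates = np.array([x[i:i + length] for i in range(len(x) - length + 1)])
        count = 0
        for i in range(len(templates)):
            # Chebyshev distance to all later templates (self-matches excluded)
            dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += np.sum(dist <= r)
        return count

    b = count_matches(m)
    a = count_matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

# A regular signal yields lower SampEn than a noisy one
t = np.linspace(0, 10 * np.pi, 1000)
print("sine :", sample_entropy(np.sin(t)))
print("noise:", sample_entropy(np.random.default_rng(6).normal(size=1000)))
```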

  14. Performing drought indices to identify the relationship between agricultural losses and drought events in Spain.

    Science.gov (United States)

    Peña Gallardo, Marina; Serrano, Sergio Martín Vicente; Portugués Santiago, Beguería; Burguera Miquel, Tomás

    2017-04-01

    Drought leads to crop failures, reducing productivity. For this reason, appropriate tools for recognizing dry periods and evaluating the impact of drought on crop production are important. In this study, we assess the relationship between drought episodes and crop failures in Spain, since one of the direct consequences of drought is reduced crop yields. First, different drought indices [the Standardized Precipitation and Evapotranspiration Index (SPEI); the Standardized Precipitation Index (SPI); the self-calibrated Palmer Moisture Anomaly Index (Z-Index); the self-calibrated Crop Moisture Index (CMI); and the Standardized Palmer Drought Index (SPDI)] were calculated at different time scales in order to identify the dry events that occurred in Spain and to determine the duration and intensity of each event. Second, the drought episodes were correlated with estimated crop production and final crop production data provided by the Spanish Crop Insurance System, available from 1995 to 2014 at the municipal scale, with the purpose of determining whether the characteristics of the drought episodes are reflected in agricultural losses. The analysis was carried out in particular for two types of crop, wheat and barley. The results indicate agreement between the most important drought events in Spain and the response of crop production and the proportion of insured hectares. Nevertheless, this agreement varies depending on the drought index applied. The authors found that the drought indices calculated at different time scales (SPEI, SPI and SPDI) performed better in identifying the beginning and end of drought events and their correspondence with crop failures.
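
    Of the indices listed, the SPI family is the simplest to illustrate: accumulate precipitation over the chosen time scale, fit a distribution, and convert cumulative probabilities to standard-normal quantiles. The sketch below follows that outline with a gamma fit on synthetic monthly data; zero-precipitation handling, calibration periods and the other indices (SPEI, Z-Index, CMI, SPDI) are not covered.

```python
# Rough SPI-style drought indicator from monthly precipitation totals.
import numpy as np
from scipy import stats

def spi(monthly_precip, scale=3):
    p = np.asarray(monthly_precip, dtype=float)
    # Rolling accumulation over `scale` months
    accum = np.convolve(p, np.ones(scale), mode="valid")
    # Fit a gamma distribution to the accumulated totals (location fixed at 0)
    shape, _, scale_param = stats.gamma.fit(accum, floc=0)
    cdf = stats.gamma.cdf(accum, shape, loc=0, scale=scale_param)
    # Map probabilities to z-scores; values below about -1.5 flag severe drought
    return stats.norm.ppf(np.clip(cdf, 1e-6, 1 - 1e-6))

rng = np.random.default_rng(7)
precip = rng.gamma(shape=2.0, scale=30.0, size=240)   # 20 synthetic years of monthly totals
spi3 = spi(precip, scale=3)
print("months in severe drought (SPI-3 < -1.5):", int((spi3 < -1.5).sum()))
```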

  15. Identifying Contextual Factors of Employee Satisfaction of Performance Management at a Thai State Enterprise

    Directory of Open Access Journals (Sweden)

    Molraudee Saratun

    2013-11-01

    Full Text Available Although there has been an increase in Performance Management (PM) literature over the years arguing that PM perceptions are likely to be a function of PM process components and contextual factors, the actual relationship between the contextual factors and employee satisfaction with PM remains little explored. Extending previous research, this study examines relationships between contextual factors and employees’ PM satisfaction. Derived from the literature, these contextual factors are motivation and empowerment of employees, role conflict, role ambiguity, perceived organisational support, procedural justice and distributive justice. Seven directional hypotheses are tested accordingly through a series of regression analyses. This article finds that these contextual factors, with the exception of role conflict, are directly predictive of enhanced employees’ PM satisfaction at the Thai state enterprise. Keywords: Performance management, contextual factors, performance management satisfaction, public organisations, Thailand.

  16. Identifying students’ learning performance as a way to determine the admission process in physical education field

    Science.gov (United States)

    Prihanto, J. B.; Kartiko, D. C.; Wijaya, A.

    2018-01-01

    Interest in the physical education field has been rising in the past ten years, as can be seen from the increasing number of registrants for physical education programs at several universities. This research analyzes the student admission process and its relation to students’ performance in learning activities in the department of physical education at Universitas Negeri Surabaya. The study used quantitative data analysis and was conducted by collecting students’ admission data and their transcripts. The results showed that the most influential admission factor in the physical education program was the student’s field of study in high school. In addition, achievements in sports competitions and family welfare are not likely to be important factors. These results provide a recommendation for the next admission process related to the quality of graduates.

  17. Use of the Cognitive Performance Test for Identifying Deficits in Hospitalized Older Adults

    Directory of Open Access Journals (Sweden)

    Alison Douglas

    2012-01-01

    Full Text Available Objectives. The Cognitive Performance Test (CPT) is a functional assessment for persons with dementia. The study purpose was to evaluate the reliability and the discriminant and concurrent validity of the CPT. Method. The CPT was tested against other measures of cognition: the Standardized Mini Mental Status Exam (SMMSE) and the Assessment of Motor and Process Skills-Process scale (AMPS-Process). Participants were persons 65 years and older admitted to a geriatric rehabilitation unit (n=47). Results. The CPT correlated moderately with measures of cognition (SMMSE r=0.47, AMPS-Process r=0.53, P<0.01) and ADL burden of care (FIM r=0.32, P<0.05). Scores were not affected by age, sex, years of education, motor skills, or comorbidities. The CPT differentiated between impaired and unimpaired individuals differently from other measures. Conclusion. While the CPT appears related to other measures of cognition, test interpretation requires noting the variability between CPT scores and those measures.

  18. Instruction-level performance modeling and characterization of multimedia applications

    Energy Technology Data Exchange (ETDEWEB)

    Luo, Y. [Los Alamos National Lab., NM (United States). Scientific Computing Group; Cameron, K.W. [Louisiana State Univ., Baton Rouge, LA (United States). Dept. of Computer Science

    1999-06-01

    One of the challenges in characterizing and modeling realistic multimedia applications is the lack of access to source codes. On-chip performance counters effectively resolve this problem by monitoring run-time behavior at the instruction level. This paper presents a novel technique for characterizing and modeling workloads at the instruction level for realistic multimedia applications using hardware performance counters. A variety of instruction counts are collected from multimedia applications such as RealPlayer, GSM Vocoder, MPEG encoder/decoder, and a speech synthesizer. These instruction counts can be used to form a set of abstract characteristic parameters directly related to a processor's architectural features. Based on microprocessor architectural constraints and these calculated abstract parameters, the architectural performance bottleneck for a specific application can be estimated. Meanwhile, the bottleneck estimation can provide suggestions about viable architectural/functional improvements for certain workloads. The biggest advantage of this new characterization technique is a better understanding of processor utilization efficiency and the architectural bottleneck for each application. The technique also provides predictive insight into future architectural enhancements and their effect on current codes. In this paper the authors also attempt to model architectural effects on processor utilization without memory influence. They derive formulas for calculating CPI_0, the CPI without memory effects, and they quantify the utilization of architectural parameters. These equations are architecturally diagnostic and predictive in nature. Results show promise in code characterization and empirical/analytical modeling.
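
    As a simple illustration of the kind of counter-derived quantities discussed above (not the authors' actual formulas), cycles per instruction with and without an assumed memory stall component can be computed from hypothetical counter readings as follows.

      # Hypothetical values read from on-chip performance counters.
      cycles        = 1_250_000_000
      instructions  =   900_000_000
      mem_stall_cyc =   300_000_000   # stall cycles attributed to the memory hierarchy

      cpi  = cycles / instructions                     # observed cycles per instruction
      cpi0 = (cycles - mem_stall_cyc) / instructions   # CPI with the memory effect removed
      print(f"CPI = {cpi:.2f}, CPI_0 = {cpi0:.2f}")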

  19. Optical packet switching in HPC : an analysis of applications performance

    NARCIS (Netherlands)

    Meyer, Hugo; Sancho, Jose Carlos; Mrdakovic, Milica; Miao, Wang; Calabretta, Nicola

    2018-01-01

    Optical Packet Switches (OPS) could provide the needed low-latency transmissions in today's large data centers. OPS can deliver lower latency and higher bandwidth than traditional electrical switches. These features are needed for parallel High Performance Computing (HPC) applications. For this

  20. Predictive Performance Tuning of OpenACC Accelerated Applications

    KAUST Repository

    Siddiqui, Shahzeb; Feki, Saber

    2014-01-01

    , with the introduction of high level programming models such as OpenACC [1] and OpenMP 4.0 [2], these devices are becoming more accessible and practical to use by a larger scientific community. However, performance optimization of OpenACC accelerated applications usually

  1. Distributed dynamic simulations of networked control and building performance applications

    NARCIS (Netherlands)

    Yahiaoui, Azzedine

    2018-01-01

    The use of computer-based automation and control systems for smart sustainable buildings, often so-called Automated Buildings (ABs), has become an effective way to automatically control, optimize, and supervise a wide range of building performance applications over a network while achieving the

  2. Identifying black swans in NextGen: predicting human performance in off-nominal conditions.

    Science.gov (United States)

    Wickens, Christopher D; Hooey, Becky L; Gore, Brian F; Sebok, Angelia; Koenicke, Corey S

    2009-10-01

    The objective is to validate a computational model of visual attention against empirical data--derived from a meta-analysis--of pilots' failure to notice safety-critical unexpected events. Many aircraft accidents have resulted, in part, because of failure to notice nonsalient unexpected events outside of foveal vision, illustrating the phenomenon of change blindness. A model of visual noticing, N-SEEV (noticing-salience, expectancy, effort, and value), was developed to predict these failures. First, 25 studies that reported objective data on miss rate for unexpected events in high-fidelity cockpit simulations were identified, and their miss rate data pooled across five variables (phase of flight, event expectancy, event location, presence of a head-up display, and presence of a highway-in-the-sky display). Second, the parameters of the N-SEEV model were tailored to mimic these dichotomies. The N-SEEV model output predicted variance in the obtained miss rate (r = .73). The individual miss rates of all six dichotomous conditions were predicted within 14%, and four of these were predicted within 7%. The N-SEEV model, developed on the basis of an independent data set, was able to successfully predict variance in this safety-critical measure of pilot response to abnormal circumstances, as collected from the literature. As new technology and procedures are envisioned for the future airspace, it is important to predict if these may compromise safety in terms of pilots' failing to notice unexpected events. Computational models such as N-SEEV support cost-effective means of making such predictions.

  3. Survey of computer codes applicable to waste facility performance evaluations

    International Nuclear Information System (INIS)

    Alsharif, M.; Pung, D.L.; Rivera, A.L.; Dole, L.R.

    1988-01-01

    This study is an effort to review existing information that is useful for developing an integrated model for predicting the performance of a radioactive waste facility. A summary description of 162 computer codes is given. The identified computer programs address the performance of waste packages, waste transport and equilibrium geochemistry, hydrological processes in unsaturated and saturated zones, and general waste facility performance assessment. Some programs also deal with thermal analysis, structural analysis, and special purposes. A number of these computer programs are being used by the US Department of Energy, the US Nuclear Regulatory Commission, and their contractors to analyze various aspects of waste package performance. Fifty-five of these codes were identified as being potentially useful in the analysis of low-level radioactive waste facilities located above the water table. The code summaries include authors, identification data, model types, and pertinent references. 14 refs., 5 tabs

  4. Regression Trees Identify Relevant Interactions: Can This Improve the Predictive Performance of Risk Adjustment?

    Science.gov (United States)

    Buchner, Florian; Wasem, Jürgen; Schillo, Sonja

    2017-01-01

    Risk equalization formulas have been refined since their introduction about two decades ago. Because of the complexity and the abundance of possible interactions between the variables used, hardly any interactions are considered. A regression tree is used to systematically search for interactions, a methodologically new approach in risk equalization. Analyses are based on a data set of nearly 2.9 million individuals from a major German social health insurer. A two-step approach is applied: In the first step a regression tree is built on the basis of the learning data set. Terminal nodes characterized by more than one morbidity-group split represent interaction effects of different morbidity groups. In the second step the 'traditional' weighted least squares regression equation is expanded by adding interaction terms for all interactions detected by the tree, and regression coefficients are recalculated. The resulting risk adjustment formula shows an improvement in the adjusted R² from 25.43% to 25.81% on the evaluation data set. Predictive ratios are calculated for subgroups affected by the interactions. The R² improvement detected is only marginal. According to the sample-level performance measures used, omitting a considerable number of morbidity interactions causes no relevant loss in accuracy. Copyright © 2015 John Wiley & Sons, Ltd.
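
    A rough sketch of the two-step idea is given below, using hypothetical data and ordinary least squares in place of the paper's weighted least squares on insurer data: a shallow regression tree proposes candidate morbidity-group interactions, and product terms for the detected pairs are then added to the linear formula.

      import numpy as np
      from itertools import combinations
      from sklearn.tree import DecisionTreeRegressor
      from sklearn.linear_model import LinearRegression

      # Hypothetical data: indicator matrix of morbidity groups and annual cost.
      rng = np.random.default_rng(0)
      X = rng.integers(0, 2, size=(10_000, 20)).astype(float)
      y = X @ rng.uniform(0, 5, 20) + 8 * X[:, 3] * X[:, 7] + rng.normal(0, 1, 10_000)

      # Step 1: a shallow tree; features split on along the same root-to-leaf path
      # are treated as candidate interactions.
      tree = DecisionTreeRegressor(max_depth=3, min_samples_leaf=200).fit(X, y)
      pairs = set()
      def walk(node=0, seen=()):
          f = tree.tree_.feature[node]
          if f < 0:                                   # leaf: record feature pairs seen on the path
              pairs.update(combinations(sorted(set(seen)), 2))
              return
          walk(tree.tree_.children_left[node], seen + (f,))
          walk(tree.tree_.children_right[node], seen + (f,))
      walk()

      # Step 2: add product terms for the detected pairs and refit the linear model.
      X_ext = np.hstack([X] + [(X[:, i] * X[:, j])[:, None] for i, j in sorted(pairs)])
      print(LinearRegression().fit(X_ext, y).score(X_ext, y))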

  5. Identifying Sources of Volatile Organic Compounds and Aldehydes in a High Performance Building

    International Nuclear Information System (INIS)

    Ortiz, Anna C.; Russell, Marion; Lee, Wen-Yee; Apte, Michael; Maddalena, Randy

    2010-01-01

    The developers of the Paharpur Business Center (PBC) and Software Technology Incubator Park in New Delhi, India offer an environmentally sustainable building with a strong emphasis on energy conservation, waste minimization and superior indoor air quality (IAQ). To achieve the IAQ goal, the building utilizes a series of air cleaning technologies for treating the air entering the building. These technologies include an initial water wash followed by ultraviolet light treatment and biofiltration using a greenhouse located on the roof and numerous plants distributed throughout the building. Even with the extensive treatment of makeup air and room air in the PBC, a recent study found that the concentrations of common volatile organic compounds and aldehydes appear to rise incrementally as the air passes through the building from the supply to the exhaust. This finding highlights the need to consider the minimization of chemical sources in buildings in combination with the use of advanced air cleaning technologies when seeking to achieve superior IAQ. The goal of this project was to identify potential source materials for indoor chemicals in the PBC. Samples of building materials, including wood paneling (polished and unpolished), drywall, and plastic from a hydroponic drum that was part of the air cleaning system, were collected from the building for testing. All materials were collected from the PBC building and shipped to the Lawrence Berkeley National Laboratory (LBNL) for testing. The materials were pre-conditioned for two different time periods before measuring material and chemical specific emission factors for a range of VOCs and Aldehydes. Of the six materials tested, we found that the highest emitter of formaldehyde was new plywood paneling. Although polish and paint contribute to some VOC emissions, the main influence of the polish was in altering the capacity of the surface to accumulate formaldehyde. Neither the new nor aged polish contributed significantly

  6. Identifying Sources of Volatile Organic Compounds and Aldehydes in a High Performance Building

    Energy Technology Data Exchange (ETDEWEB)

    Ortiz, Anna C.; Russell, Marion; Lee, Wen-Yee; Apte, Michael; Maddalena, Randy

    2010-09-20

    The developers of the Paharpur Business Center (PBC) and Software Technology Incubator Park in New Delhi, India offer an environmentally sustainable building with a strong emphasis on energy conservation, waste minimization and superior indoor air quality (IAQ). To achieve the IAQ goal, the building utilizes a series of air cleaning technologies for treating the air entering the building. These technologies include an initial water wash followed by ultraviolet light treatment and biofiltration using a greenhouse located on the roof and numerous plants distributed throughout the building. Even with the extensive treatment of makeup air and room air in the PBC, a recent study found that the concentrations of common volatile organic compounds and aldehydes appear to rise incrementally as the air passes through the building from the supply to the exhaust. This finding highlights the need to consider the minimization of chemical sources in buildings in combination with the use of advanced air cleaning technologies when seeking to achieve superior IAQ. The goal of this project was to identify potential source materials for indoor chemicals in the PBC. Samples of building materials, including wood paneling (polished and unpolished), drywall, and plastic from a hydroponic drum that was part of the air cleaning system, were collected from the building for testing. All materials were collected from the PBC building and shipped to the Lawrence Berkeley National Laboratory (LBNL) for testing. The materials were pre-conditioned for two different time periods before measuring material and chemical specific emission factors for a range of VOCs and Aldehydes. Of the six materials tested, we found that the highest emitter of formaldehyde was new plywood paneling. Although polish and paint contribute to some VOC emissions, the main influence of the polish was in altering the capacity of the surface to accumulate formaldehyde. Neither the new nor aged polish contributed

  7. Fragrance contact allergens in 5588 cosmetic products identified through a novel smartphone application.

    Science.gov (United States)

    Bennike, N H; Oturai, N B; Müller, S; Kirkeby, C S; Jørgensen, C; Christensen, A B; Zachariae, C; Johansen, J D

    2018-01-01

    More than 25% of the adult European population suffers from contact allergy, with fragrance substances recognized as one of the main causes. Since 2005, 26 fragrance contact allergens have been mandatory to label in cosmetic products within the EU if present at 10 ppm or above in leave-on and 100 ppm or above in wash-off cosmetics. To examine exposure, based on ingredient labelling, to the 26 fragrances in a sample of 5588 fragranced cosmetic products. The investigated products were identified through a novel, non-profit smartphone application (app), designed to provide information to consumers about chemical substances in cosmetic products. Products registered through the app between December 2015 and October 2016 were label checked according to International Nomenclature of Cosmetic Ingredients (INCI) for the presence of the 26 fragrance substances or the wording 'fragrance/parfum/aroma'. The largest product categories investigated were 'cream, lotion and oil' (n = 1192), 'shampoo and conditioner' (n = 968) and 'deodorants' (n = 632). Among cosmetic products labelled to contain at least one of the 26 fragrances, 85.5% and 73.9% contained at least two and at least three of the 26 fragrances, respectively. Linalool (49.5%) and limonene (48.5%) were labelled most often among all investigated products. Hydroxyisohexyl 3-cyclohexene carboxaldehyde (HICC/Lyral ® ) was found in 13.5% of deodorants. Six of the 26 fragrance substances were labelled on less than one per cent of all products, including the natural extracts Evernia furfuracea (tree moss) and Evernia prunastri (oak moss). A total of 329 (5.9%) products had one or more of the 26 fragrance substances labelled but did not have 'parfum/fragrance/aroma' listed on the label. Consumers are widely exposed to, often multiple, well-established fragrance contact allergens through various cosmetic products intended for daily use. Several fragrance substances that are common causes of contact allergy were rarely

  8. Performance of fertigation technique for phosphorus application in cotton

    Directory of Open Access Journals (Sweden)

    M. Aslam

    2009-05-01

    Full Text Available Low native soil phosphorus availability coupled with poor utilization of added phosphorus is one of the major constraints limiting crop productivity. With a view to addressing this issue, field studies were conducted to compare the relative efficacy of broadcast and fertigation techniques for phosphorus application during 2005-2006 using cotton as a test crop. Two methods of phosphorus application, i.e., broadcast and fertigation, were evaluated using five levels of P2O5 (0, 30, 45, 60 and 75 kg P2O5 ha-1). Fertigation showed an edge over the broadcast method at all levels of phosphorus application. The highest seed cotton yield was recorded with 75 kg P2O5 ha-1. Fertilizer phosphorus applied at the rate of 60 kg ha-1 through fertigation produced 3.4 tons ha-1 of seed cotton yield, which was statistically identical to the 3.3 tons recorded with 75 kg ha-1 of broadcast phosphorus. The agronomic performance of phosphorus was influenced considerably by the method of fertilizer application. The seed cotton yield per kg of fertigation phosphorus was 48% higher than that of the corresponding broadcast application. The results of these studies showed that fertigation was the most efficient method of phosphorus application compared with the conventional broadcast application of fertilizers.

  9. A Two-Step Method to Identify Positive Deviant Physician Organizations of Accountable Care Organizations with Robust Performance Management Systems.

    Science.gov (United States)

    Pimperl, Alexander F; Rodriguez, Hector P; Schmittdiel, Julie A; Shortell, Stephen M

    2018-06-01

    To identify positive deviant (PD) physician organizations of Accountable Care Organizations (ACOs) with robust performance management systems (PMSYS). Third National Survey of Physician Organizations (NSPO3, n = 1,398). Organizational and external factors from NSPO3 were analyzed. Linear regression estimated the association of internal and contextual factors on PMSYS. Two cutpoints (75th/90th percentiles) identified PDs with the largest residuals and highest PMSYS scores. A total of 65 and 41 PDs were identified using 75th and 90th percentiles cutpoints, respectively. The 90th percentile more strongly differentiated PDs from non-PDs. Having a high proportion of vulnerable patients appears to constrain PMSYS development. Our PD identification method increases the likelihood that PD organizations selected for in-depth inquiry are high-performing organizations that exceed expectations. © Health Research and Educational Trust.
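
    The residual-plus-cutpoint logic of the two-step method can be sketched on hypothetical survey data as follows; the simulated variables and the 90th-percentile rule stand in for the NSPO3 measures described above.

      import numpy as np
      from sklearn.linear_model import LinearRegression

      # Hypothetical organizational and contextual factors and observed PMSYS scores.
      rng = np.random.default_rng(1)
      X = rng.normal(size=(1398, 6))
      pmsys = X @ rng.uniform(0.5, 2.0, 6) + rng.normal(0, 3, 1398)

      resid = pmsys - LinearRegression().fit(X, pmsys).predict(X)

      # Positive deviants: above the 90th percentile on both the residual and the raw score.
      pd_mask = (resid >= np.percentile(resid, 90)) & (pmsys >= np.percentile(pmsys, 90))
      print(pd_mask.sum(), "positive-deviant organizations flagged")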

  10. An applicable approach for performance auditing in ERP

    Directory of Open Access Journals (Sweden)

    Wan Jian Guo

    2016-01-01

    Full Text Available This paper addresses the practical problem of performance auditing in an ERP environment. Traditional performance auditing methods and existing approaches for evaluating the performance of ERP implementations do not work well, because they are either difficult to apply or contain subjective elements. This paper proposes an applicable performance auditing approach for SAP ERP based on quantitative analysis. The approach consists of three parts: system utilization, data quality and the effectiveness of system control. For each part, we describe the main process for conducting the audit, in particular how to calculate the online settlement rate of the SAP system. The approach has played an important role in practical auditing work. A practical case is provided at the end of the paper to demonstrate the effectiveness of the approach. Implementation of this approach is also relevant to the performance auditing of other ERP products.

  11. Calibration Modeling Methodology to Optimize Performance for Low Range Applications

    Science.gov (United States)

    McCollum, Raymond A.; Commo, Sean A.; Parker, Peter A.

    2010-01-01

    Calibration is a vital process in characterizing the performance of an instrument in an application environment and seeks to obtain acceptable accuracy over the entire design range. Often, project requirements specify a maximum total measurement uncertainty, expressed as a percent of full-scale. However in some applications, we seek to obtain enhanced performance at the low range, therefore expressing the accuracy as a percent of reading should be considered as a modeling strategy. For example, it is common to desire to use a force balance in multiple facilities or regimes, often well below its designed full-scale capacity. This paper presents a general statistical methodology for optimizing calibration mathematical models based on a percent of reading accuracy requirement, which has broad application in all types of transducer applications where low range performance is required. A case study illustrates the proposed methodology for the Mars Entry Atmospheric Data System that employs seven strain-gage based pressure transducers mounted on the heatshield of the Mars Science Laboratory mission.
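
    The difference between a percent-of-full-scale and a percent-of-reading specification, which motivates the modeling strategy above, can be seen in a small numeric example (the capacity and specification values below are hypothetical).

      full_scale = 1000.0   # transducer capacity, e.g. in newtons (hypothetical)
      spec = 0.005          # 0.5% accuracy requirement

      for reading in (50.0, 200.0, 1000.0):
          u_fs      = spec * full_scale   # percent of full scale: constant absolute bound
          u_reading = spec * reading      # percent of reading: tightens at the low range
          print(f"reading={reading:7.1f}  ±{u_fs:.1f} (FS spec)  ±{u_reading:.2f} (reading spec)")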

  12. Performance Analysis Tool for HPC and Big Data Applications on Scientific Clusters

    Energy Technology Data Exchange (ETDEWEB)

    Yoo, Wucherl [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Koo, Michelle [Univ. of California, Berkeley, CA (United States); Cao, Yu [California Inst. of Technology (CalTech), Pasadena, CA (United States); Sim, Alex [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Nugent, Peter [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Univ. of California, Berkeley, CA (United States); Wu, Kesheng [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2016-09-17

    Big data is prevalent in HPC computing. Many HPC projects rely on complex workflows to analyze terabytes or petabytes of data. These workflows often require running over thousands of CPU cores and performing simultaneous data accesses, data movements, and computation. It is challenging to analyze the performance involving terabytes or petabytes of workflow data or measurement data of the executions, from complex workflows over a large number of nodes and multiple parallel task executions. To help identify performance bottlenecks or debug the performance issues in large-scale scientific applications and scientific clusters, we have developed a performance analysis framework, using state-of-the-art open-source big data processing tools. Our tool can ingest system logs and application performance measurements to extract key performance features, and apply the most sophisticated statistical tools and data mining methods on the performance data. It utilizes an efficient data processing engine to allow users to interactively analyze a large amount of different types of logs and measurements. To illustrate the functionality of the big data analysis framework, we conduct case studies on the workflows from an astronomy project known as the Palomar Transient Factory (PTF) and the job logs from the genome analysis scientific cluster. Our study processed many terabytes of system logs and application performance measurements collected on the HPC systems at NERSC. The implementation of our tool is generic enough to be used for analyzing the performance of other HPC systems and Big Data workflows.

  13. The circular electrical mobility spectrometer; theory, performances and applications

    International Nuclear Information System (INIS)

    Mesbah, Boualem

    1995-04-01

    A new type of electrical mobility spectrometer (S.M.E.C.) has been designed in the Service d'Etudes et de Recherches en Aerocontamination et en Confinement (CEA) laboratories. It differs from classical electrical mobility spectrometers in its planar circular geometry and its radial flow, which gives some advantages and opens the possibility of new applications. The theories that we derive for the different versions of this device are confirmed by experimental results obtained using aerosol particles with known electrical mobility. The S.M.E.C.'s performance is tested for several applications: controlled surface contamination, monodisperse aerosol production, and fine and ultrafine aerosol sizing. (author) [fr

  14. How physicians identify with predetermined personalities and links to perceived performance and wellness outcomes: a cross-sectional study.

    Science.gov (United States)

    Lemaire, Jane B; Wallace, Jean E

    2014-11-29

    Certain personalities are ascribed to physicians. This research aims to measure the extent to which physicians identify with three predetermined personalities (workaholic, Type A and control freak) and to explore links to perceptions of professional performance, and wellness outcomes. This is a cross-sectional study using a mail-out questionnaire sent to all practicing physicians (2957 eligible, 1178 responses, 40% response rate) in a geographical health region within a western Canadian province. Survey items were used to assess the extent to which participants felt they are somewhat of a workaholic, Type A and/or control freak, and if they believed that having these personalities makes one a better doctor. Participants' wellness outcomes were also measured. Zero-order correlations were used to determine the relationships between physicians identifying with a personality and feeling it makes one a better doctor. T-tests were used to compare measures of physician wellness for those who identified with the personality versus those who did not. 53% of participants identified with the workaholic personality, 62% with the Type A, and 36% with the control freak. Identifying with any one of the personalities was correlated with feeling it makes one a better physician. There were statistically significant differences in several wellness outcomes comparing participants who identified with the personalities versus those who did not. These included higher levels of emotional exhaustion (workaholic, Type A and control freak), higher levels of anxiety (Type A and control freak) and higher levels of depression, poorer mental health and lower levels of job satisfaction (control freak). Participants who identified with the workaholic personality versus those who did not reported higher levels of job satisfaction, rewarding patient experiences and career commitment. Most participants identified with at least one of the three personalities. The beliefs of some participants that

  15. Utilização da análise fatorial na identificação dos principais indicadores para avaliação do desempenho financeiro: uma aplicação nas empresas de seguros Use of factor analysis to identify the main financial performance assessment indicators: an application in insurance companies

    Directory of Open Access Journals (Sweden)

    Francisco Antonio Bezerra

    2006-12-01

    Full Text Available Financial indicators have been used for a long time to assess the performance of organizations. Usually, these indicators are used to make comparisons between companies or even between units of the same company. However, the indicators are generally analyzed individually and sequentially: the analyses are based on comparisons of, for example, the liquidity ratios, seeking to determine which companies perform best against a mean liquidity standard, after which a new analysis is carried out for profitability ratios, and so on. This kind of sequential and individualized assessment does not allow the influence of some indicators on the others to be evaluated, besides depending on subjective criteria to decide which indicators are the most relevant. This work proposes a methodology that (1) reduces the degree of subjectivity in choosing the indicators that should compose the assessment of the companies and (2) allows a simultaneous analysis of the behavior of several indicators. A multivariate data analysis technique, Factor Analysis (FA), was used to create the criteria for selecting the financial indicators.
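
    As an illustration of the proposed use of factor analysis for indicator selection (on hypothetical, already standardized ratio data rather than the study's sample), one possible sketch is:

      import numpy as np
      from sklearn.decomposition import FactorAnalysis

      # Hypothetical matrix of financial ratios: rows = companies, columns = indicators
      # such as liquidity, profitability and leverage ratios (assumed standardized).
      rng = np.random.default_rng(6)
      ratios = rng.normal(size=(120, 12))

      fa = FactorAnalysis(n_components=3, random_state=0).fit(ratios)
      loadings = fa.components_.T                      # indicator-by-factor loading matrix

      # Keep, for each factor, the indicator with the largest absolute loading.
      selected = np.argmax(np.abs(loadings), axis=0)
      print("indicators selected as factor representatives:", selected)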

  16. Identifying Domain-General and Domain-Specific Predictors of Low Mathematics Performance: A Classification and Regression Tree Analysis

    Directory of Open Access Journals (Sweden)

    David J. Purpura

    2017-12-01

    Full Text Available Many children struggle to successfully acquire early mathematics skills. Theoretical and empirical evidence has pointed to deficits in domain-specific skills (e.g., non-symbolic mathematics skills) or domain-general skills (e.g., executive functioning and language) as underlying low mathematical performance. In the current study, we assessed a sample of 113 three- to five-year-old preschool children on a battery of domain-specific and domain-general factors in the fall and spring of their preschool year to identify Time 1 (fall) factors associated with low performance in mathematics knowledge at Time 2 (spring). We used the exploratory approach of classification and regression tree analyses, a strategy that uses step-wise partitioning to create subgroups from a larger sample using multiple predictors, to identify the factors that were the strongest classifiers of low performance for younger and older preschool children. Results indicated that the most consistent classifier of low mathematics performance at Time 2 was children’s Time 1 mathematical language skills. Further, other distinct classifiers of low performance emerged for younger and older children. These findings suggest that risk classification for low mathematics performance may differ depending on children’s age.
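
    A minimal sketch of the classification-tree idea on hypothetical Time 1 predictors is shown below; the simulated variables and tree settings are illustrative and are not the study's instruments.

      import numpy as np
      from sklearn.tree import DecisionTreeClassifier, export_text

      # Hypothetical fall (Time 1) predictors and a spring (Time 2) low-performance flag.
      rng = np.random.default_rng(2)
      n = 113
      X = np.column_stack([
          rng.normal(size=n),   # mathematical language
          rng.normal(size=n),   # executive functioning
          rng.normal(size=n),   # non-symbolic (approximate) number skills
      ])
      low_math = (X[:, 0] + 0.3 * rng.normal(size=n) < -0.5).astype(int)

      cart = DecisionTreeClassifier(max_depth=3, min_samples_leaf=10).fit(X, low_math)
      print(export_text(cart, feature_names=["math_language", "exec_function", "non_symbolic"]))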

  17. Integrated multi sensors and camera video sequence application for performance monitoring in archery

    Science.gov (United States)

    Taha, Zahari; Arif Mat-Jizat, Jessnor; Amirul Abdullah, Muhammad; Muazu Musa, Rabiu; Razali Abdullah, Mohamad; Fauzi Ibrahim, Mohamad; Hanafiah Shaharudin, Mohd Ali

    2018-03-01

    This paper describes the development of comprehensive archery performance monitoring software consisting of three camera views and five body sensors. The five body sensors evaluate biomechanics-related variables: flexor and extensor muscle activity, heart rate, postural sway and bow movement during archery performance. The three camera views and the five body sensors are integrated into a single computer application, which enables the user to view all the data in a single user interface. The body sensors' data are displayed in numerical and graphical form in real time. The information transmitted by the body sensors is processed with an embedded algorithm that automatically produces a summary of the athlete's biomechanical performance and displays it in the application interface. This performance is later compared with the pre-computed psycho-fitness performance derived from data previously entered into the application. All the data (camera views, body sensors and performance computations) are recorded for further analysis by a sports scientist. The application serves as a powerful tool for assisting the coach and athletes in observing and identifying any wrong technique employed during training, which gives room for correction and re-evaluation to improve overall performance in the sport of archery.

  18. Distributed dynamic simulations of networked control and building performance applications.

    Science.gov (United States)

    Yahiaoui, Azzedine

    2018-02-01

    The use of computer-based automation and control systems for smart sustainable buildings, often so-called Automated Buildings (ABs), has become an effective way to automatically control, optimize, and supervise a wide range of building performance applications over a network while achieving the minimum energy consumption possible, and in doing so generally refers to Building Automation and Control Systems (BACS) architecture. Instead of costly and time-consuming experiments, this paper focuses on using distributed dynamic simulations to analyze the real-time performance of network-based building control systems in ABs and improve the functions of the BACS technology. The paper also presents the development and design of a distributed dynamic simulation environment with the capability of representing the BACS architecture in simulation by run-time coupling two or more different software tools over a network. The application and capability of this new dynamic simulation environment are demonstrated by an experimental design in this paper.

  19. High-performance insulator structures for accelerator applications

    International Nuclear Information System (INIS)

    Sampayan, S.E.; Caporaso, G.J.; Sanders, D.M.; Stoddard, R.D.; Trimble, D.O.; Elizondo, J.; Krogh, M.L.; Wieskamp, T.F.

    1997-05-01

    A new, high gradient insulator technology has been developed for accelerator systems. The concept involves the use of alternating layers of conductors and insulators with periods of order 1 mm or less. These structures perform many times better (about 1.5 to 4 times higher breakdown electric field) than conventional insulators in long pulse, short pulse, and alternating polarity applications. We describe our ongoing studies investigating the degradation of the breakdown electric field resulting from alternate fabrication techniques, the effect of gas pressure, the effect of the insulator-to-electrode interface gap spacing, and the performance of the insulator structure under bi-polar stress

  20. Thermodynamic performance assessment of wind energy systems: An application

    International Nuclear Information System (INIS)

    Redha, Adel Mohammed; Dincer, Ibrahim; Gadalla, Mohamed

    2011-01-01

    In this paper, the performance of a wind energy system is assessed thermodynamically, from resource and technology perspectives. The thermodynamic characteristics of wind are considered through energy and exergy analyses, and both energetic and exergetic efficiencies are studied. Wind speed is affected by air temperature and pressure and has a subsequent effect on wind turbine performance based on the wind reference temperature and Bernoulli's equation. A VESTAS V52 wind turbine is selected for Sharjah, UAE. Energy and exergy efficiency equations for wind energy systems are further developed for practical applications. The results show that there are noticeable differences between energy and exergy efficiencies and that exergetic efficiency reflects the actual performance. Finally, exergy analysis has proven to be the right tool for the design, simulation, and performance evaluation of renewable energy systems. -- Highlights: → In this research the performance of a wind energy system is assessed thermodynamically, from resource and technology perspectives. → Energy and exergy equations for wind energy systems are further developed for practical applications. → Thermodynamic characteristics of wind turbine systems are evaluated through energetic and exergetic efficiencies from January to March 2010. → Exergy efficiency describes the system irreversibility; the minimum irreversibility exists when the wind speed reaches 11 m/s. → The power production during March was about 17% higher than in February and 66% higher than in January.
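
    For reference, the kinetic power carried by the wind through a rotor of swept area A at speed v, together with generic energy and exergy efficiency definitions of the kind developed in the paper, can be written as follows (a textbook form; the paper's exergy term additionally accounts for temperature and pressure effects on the air):

      \[
        \dot{E}_{\mathrm{wind}} = \tfrac{1}{2}\,\rho A v^{3}, \qquad
        \eta = \frac{\dot{W}_{\mathrm{elec}}}{\tfrac{1}{2}\,\rho A v^{3}}, \qquad
        \psi = \frac{\dot{W}_{\mathrm{elec}}}{\dot{Ex}_{\mathrm{wind}}}
      \]

    where \(\dot{Ex}_{\mathrm{wind}}\) is the flow exergy of the wind stream, so the two efficiencies differ whenever the exergy content departs from the purely kinetic term.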

  1. Evaluation of performance of silicon photomultipliers in lidar applications

    Science.gov (United States)

    Vinogradov, Sergey L.

    2017-05-01

    Silicon Photomultipliers (SiPMs) are a well-recognized new generation of photon-number-resolving avalanche photodetectors. Many advantages (a high gain with ultra-low excess noise of multiplication, a multi-pixel architecture and a relatively low operating voltage) make SiPMs very competitive in a growing number of applications. The challenging demands that LIDAR applications place on a receiver, including high sensitivity down to single photons, superior time-of-flight resolution, robustness including survival of bright light flashes, and solid-state compactness, are expected to be met by SiPMs. Despite some known drawbacks, namely crosstalk, afterpulsing, dark noise and a limited dynamic range, SiPMs are already considered promising substitutes for conventional APDs and PMTs in LIDAR applications. However, these initial considerations are based on a rather simplified representation of the SiPM as a generic LIDAR receiver described by generic expressions. This study focuses on a comprehensive evaluation of SiPM potential, considering essential features of this new technology that could affect the applicability and performance of SiPMs as LIDAR receivers. Namely, the excess noise due to the correlated processes of crosstalk and afterpulsing is taken into account using the well-established framework of analytical probabilistic models. The analysis of SiPM performance in terms of photon number and time resolution clarifies their competitiveness relative to conventional APDs and PMTs and anticipates the development of the next SiPM generations.

  2. Performance test of a bladeless turbine for geothermal applications

    Energy Technology Data Exchange (ETDEWEB)

    Steidel, R.; Weiss, H.

    1976-03-24

    The Possell bladeless turbine was tested at the LLL Geothermal Test Facility to evaluate its potential for application in the total flow process. Test description and performance data are given for 3000, 3500, 4000, and 4500 rpm. The maximum engine efficiency observed was less than 7 percent. It is concluded that the Possell turbine is not a viable candidate machine for the conversion of geothermal fluids by the total flow process. (LBS)

  3. Robust global identifiability theory using potentials--Application to compartmental models.

    Science.gov (United States)

    Wongvanich, N; Hann, C E; Sirisena, H R

    2015-04-01

    This paper presents a global practical identifiability theory for analyzing and identifying linear and nonlinear compartmental models. The compartmental system is prolonged onto the potential jet space to formulate a set of input-output equations that are integrals in terms of the measured data, which allows for robust identification of parameters without requiring any simulation of the model differential equations. Two classes of linear and non-linear compartmental models are considered. The theory is first applied to analyze the linear nitrous oxide (N2O) uptake model. The fitting accuracy of the identified models from differential jet space and potential jet space identifiability theories is compared with a realistic noise level of 3% which is derived from sensor noise data in the literature. The potential jet space approach gave a match that was well within the coefficient of variation. The differential jet space formulation was unstable and not suitable for parameter identification. The proposed theory is then applied to a nonlinear immunological model for mastitis in cows. In addition, the model formulation is extended to include an iterative method which allows initial conditions to be accurately identified. With up to 10% noise, the potential jet space theory predicts the normalized population concentration infected with pathogens, to within 9% of the true curve. Copyright © 2015 Elsevier Inc. All rights reserved.

  4. Credit supply and monetary policy : Identifying the bank balance-sheet channel with loan applications

    NARCIS (Netherlands)

    Jimenez Porras, G.; Ongena, S.; Peydro, J.L.; Saurina, J.

    2012-01-01

    We analyze the impact of monetary policy on the supply of bank credit. Monetary policy affects both loan supply and demand, thus making identification a steep challenge. We therefore analyze a novel, supervisory dataset with loan applications from Spain. Accounting for time-varying firm

  5. Identifying influences on model uncertainty: an application using a forest carbon budget model

    Science.gov (United States)

    James E. Smith; Linda S. Heath

    2001-01-01

    Uncertainty is an important consideration for both developers and users of environmental simulation models. Establishing quantitative estimates of uncertainty for deterministic models can be difficult when the underlying bases for such information are scarce. We demonstrate an application of probabilistic uncertainty analysis that provides for refinements in...

  6. Performance Analysis, Modeling and Scaling of HPC Applications and Tools

    Energy Technology Data Exchange (ETDEWEB)

    Bhatele, Abhinav [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-01-13

    Efficient use of supercomputers at DOE centers is vital for maximizing system throughput, minimizing energy costs and enabling science breakthroughs faster. This requires complementary efforts along several directions to optimize the performance of scientific simulation codes and the underlying runtimes and software stacks. This in turn requires providing scalable performance analysis tools and modeling techniques that can provide feedback to physicists and computer scientists developing the simulation codes and runtimes respectively. The PAMS project is using time allocations on supercomputers at ALCF, NERSC and OLCF to further the goals described above by performing research along the following fronts: 1. Scaling Study of HPC applications; 2. Evaluation of Programming Models; 3. Hardening of Performance Tools; 4. Performance Modeling of Irregular Codes; and 5. Statistical Analysis of Historical Performance Data. We are a team of computer and computational scientists funded by both DOE/NNSA and DOE/ASCR programs such as ECRP, XStack (Traleika Glacier, PIPER), ExaOSR (ARGO), SDMAV II (MONA) and PSAAP II (XPACC). This allocation will enable us to study big data issues when analyzing performance on leadership computing class systems and to assist the HPC community in making the most effective use of these resources.

  7. Development of smartphone application that aids stroke screening and identifying nearby acute stroke care hospitals.

    Science.gov (United States)

    Nam, Hyo Suk; Heo, JoonNyung; Kim, Jinkwon; Kim, Young Dae; Song, Tae Jin; Park, Eunjeong; Heo, Ji Hoe

    2014-01-01

    The benefits of thrombolytic treatment are time-dependent. We developed a smartphone application that aids stroke patient self-screening and hospital selection, and may also decrease hospital arrival time. The application was developed for iPhone and Android smartphones. Map data for the application were adopted from the open map. For hospital registration, a web page (http://stroke119.org) was developed using PHP and MySQL. The Stroke 119 application includes a stroke screening tool and real-time information on nearby hospitals that provide thrombolytic treatment. It also provides information on stroke symptoms, thrombolytic treatment, and prescribed actions when stroke is suspected. The stroke screening tool was adopted from the Cincinnati Prehospital Stroke Scale and is displayed in a cartoon format. If the user taps a cartoon image that represents abnormal findings, a pop-up window shows that the user may be having a stroke, informs the user what to do, and directs the user to call emergency services. Information on nearby hospitals is provided in map and list views, incorporating proximity to the user's location using a Global Positioning System (a built-in function of smartphones). Users can search for a hospital according to specialty and treatment levels. We also developed a web page for hospitals to register in the system. Neurology training hospitals and hospitals that provide acute stroke care in Korea were invited to register. Seventy-seven hospitals had completed registration. This application may be useful for reducing hospital arrival times for thrombolytic candidates.

  8. Performance Analysis of a Bunch and Track Identifier Prototype (BTI) for the CMS Barrel Muon Drift Chambers

    International Nuclear Information System (INIS)

    Puerta Pelayo, J.

    2001-01-01

    This note contains a short description of the first step in the first-level trigger applied to the barrel muon drift chambers of CMS: the Bunch and Track Identifier (BTI). The test beam results obtained with a BTI prototype have also been analysed. BTI performance for different incidence angles and in the presence of an external magnetic field has been tested, as well as its capability as a trigger device and track reconstructor. (Author) 30 refs.

  9. Multiphase pumping: indoor performance test and oilfield application

    Science.gov (United States)

    Kong, Xiangling; Zhu, Hongwu; Zhang, Shousen; Li, Jifeng

    2010-03-01

    Multiphase pumping is essentially a means of adding energy to the unprocessed effluent, which enables the liquid and gas mixture to be transported over long distances without prior separation. Production infrastructure such as separation equipment and offshore platforms can thereby be reduced, consolidated, or eliminated, so that fields can be developed more economically. Multiphase pumping has also successfully lowered the backpressure of wells, revived dead wells and improved the production and efficiency of the oilfield. This paper reviews issues related to an indoor performance test and an oilfield application of the helico-axial multiphase pump designed by China University of Petroleum (Beijing). The pump specification and its hydraulic design are given. Results of performance testing under different conditions, such as operational speed and gas volume fraction (GVF), are presented. Experimental studies combined with theoretical analysis showed that the multiphase pump satisfies the similitude rule, which can be used in the development of new MPP designs and in performance prediction. Test results showed that raising the rotation speed and suction pressure improved performance: the pressure boost increased, the high-efficiency zone expanded, and the flow rate corresponding to the optimum working condition increased. The pump became unstable as the GVF increased beyond a certain level, and slip occurred between the two phases in the pump, creating surging and gas lock at high GVF. A case of application in the Nanyang oilfield is also studied.

  10. A Generally Applicable Translational Strategy Identifies S100A4 as a Candidate Gene in Allergy

    DEFF Research Database (Denmark)

    Bruhn, Sören; Fang, Yu; Barrenäs, Fredrik

    2014-01-01

    The identification of diagnostic markers and therapeutic candidate genes in common diseases is complicated by the involvement of thousands of genes. We hypothesized that genes co-regulated with a key gene in allergy, IL13, would form a module that could help to identify candidate genes. We identi...

  11. Russia and hybrid warfare: identifying critical elements in successful applications of hybrid tactics

    OpenAIRE

    Neville, Seth B.

    2015-01-01

    Approved for public release; distribution is unlimited With the Russian annexation of Crimea in 2014, hybrid war became a buzzword within political and academic circles. This thesis examines hybrid warfare applications using contemporary and historical examples. The analysis seeks to determine why a country was or was not successful in its execution of hybrid war, and it assesses the geo-political context of cost, benefit, and risk for an aggressor state contributing to its decision to eng...

  12. Teaching assistants’ performance at identifying common introductory student difficulties in mechanics revealed by the Force Concept Inventory

    Directory of Open Access Journals (Sweden)

    Alexandru Maries

    2016-05-01

    Full Text Available The Force Concept Inventory (FCI) has been widely used to assess student understanding of introductory mechanics concepts by a variety of educators and physics education researchers. One reason for this extensive use is that many of the items on the FCI have strong distractor choices which correspond to students’ alternate conceptions in mechanics. Instruction is unlikely to be effective if instructors do not know the common alternate conceptions of introductory physics students and explicitly take into account students’ initial knowledge states in their instructional design. Here, we discuss research involving the FCI to evaluate one aspect of the pedagogical content knowledge of teaching assistants (TAs): knowledge of introductory student alternate conceptions in mechanics as revealed by the FCI. For each item on the FCI, the TAs were asked to identify the most common incorrect answer choice of introductory physics students. This exercise was followed by a class discussion with the TAs related to this task, including the importance of knowing student difficulties in teaching and learning. Then, we used FCI pretest and post-test data from a large population (∼900) of introductory physics students to assess the extent to which TAs were able to identify alternate conceptions of introductory students related to force and motion. In addition, we carried out think-aloud interviews with graduate students who had more than two semesters of teaching experience in recitations to examine how they reason about the task. We find that while the TAs, on average, performed better than random guessing at identifying introductory students’ difficulties with FCI content, they did not identify many common difficulties that introductory physics students have after traditional instruction. We discuss specific alternate conceptions, the extent to which TAs are able to identify them, and results from the think-aloud interviews that provided valuable information

  13. Application of artificial neural networks to identify equilibration in computer simulations

    Science.gov (United States)

    Leibowitz, Mitchell H.; Miller, Evan D.; Henry, Michael M.; Jankowski, Eric

    2017-11-01

    Determining which microstates generated by a thermodynamic simulation are representative of the ensemble for which sampling is desired is a ubiquitous, underspecified problem. Artificial neural networks are one type of machine learning algorithm that can provide a reproducible way to apply pattern recognition heuristics to underspecified problems. Here we use the open-source TensorFlow machine learning library and apply it to the problem of identifying which hypothetical observation sequences from a computer simulation are “equilibrated” and which are not. We generate training populations and test populations of observation sequences with embedded linear and exponential correlations. We train a two-neuron artificial network to distinguish the correlated and uncorrelated sequences. We find that this simple network is good enough for > 98% accuracy in identifying exponentially-decaying energy trajectories from molecular simulations.
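
    In the spirit of the two-neuron classifier described above, a minimal TensorFlow sketch on synthetic sequences is given below; the summary features, network details and training setup are assumptions for illustration, not the authors' exact configuration.

      import numpy as np
      import tensorflow as tf

      # Synthetic observation sequences: a decaying drift (not yet equilibrated) versus
      # stationary noise (equilibrated), reduced to two summary features per sequence.
      rng = np.random.default_rng(0)
      def sequence(equilibrated):
          t = np.arange(200, dtype=float)
          drift = np.zeros_like(t) if equilibrated else np.exp(-t / 60.0)
          return drift + 0.1 * rng.normal(size=t.size)

      labels = np.array([k % 2 for k in range(2000)])             # 1 = equilibrated
      seqs = [sequence(bool(lab)) for lab in labels]
      feats = np.array([[np.polyfit(np.arange(len(s)), s, 1)[0],  # overall slope
                         np.corrcoef(s[:-1], s[1:])[0, 1]]        # lag-1 autocorrelation
                        for s in seqs])

      # A single softmax layer with two neurons classifies the two cases.
      model = tf.keras.Sequential([tf.keras.layers.Dense(2, activation="softmax",
                                                         input_shape=(2,))])
      model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
                    metrics=["accuracy"])
      model.fit(feats, labels, epochs=30, batch_size=32, verbose=0)
      print(model.evaluate(feats, labels, verbose=0))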

  14. Using a Counterfactual Process to Identify the Applicability of Emerging Technology

    Science.gov (United States)

    2014-09-01

    The thesis uses a counterfactual process to identify the conditions that must exist for the antecedent to happen and the opportunities for interrupting the sequence of events; for example, if the bombers had been identified on Monday, then the events that unfolded on Tuesday could have been interrupted.

  15. Proteomics strategy for identifying candidate bioactive proteins in complex mixtures: application to the platelet releasate.

    LENUS (Irish Health Repository)

    O'Connor, Roisin

    2010-01-01

    Proteomic approaches have proven powerful at identifying large numbers of proteins, but there are fewer reports of functional characterization of proteins in biological tissues. Here, we describe an experimental approach that fractionates proteins released from human platelets, linking bioassay activity to identity. We used consecutive orthogonal separation platforms to ensure sensitive detection: (a) ion-exchange of intact proteins, (b) SDS-PAGE separation of ion-exchange fractions and (c) HPLC separation of tryptic digests coupled to electrospray tandem mass spectrometry. Migration of THP-1 monocytes in response to complete or fractionated platelet releasate was assessed and located to just one of the forty-nine ion-exchange fractions. Over 300 proteins were identified in the releasate, with a wide range of annotated biophysical and biochemical properties, in particular platelet activation, adhesion, and wound healing. The presence of PEDF and involucrin, two proteins not previously reported in platelet releasate, was confirmed by western blotting. Proteins identified within the fraction with monocyte promigratory activity and not in other inactive fractions included vimentin, PEDF, and TIMP-1. We conclude that this analytical platform is effective for the characterization of complex bioactive samples.

  16. Application of Monte Carlo cross-validation to identify pathway cross-talk in neonatal sepsis.

    Science.gov (United States)

    Zhang, Yuxia; Liu, Cui; Wang, Jingna; Li, Xingxia

    2018-03-01

    To explore genetic pathway cross-talk in neonates with sepsis, an integrated approach was used in this paper. Gene expression profiling and biological signaling pathways were first integrated to explore the potential relationships between pathways and the genes differentially expressed between normal uninfected neonates and neonates with sepsis. For each pathway, a score was obtained from the gene expression data by quantitatively analyzing the pathway cross-talk. Paired pathways with high cross-talk were identified by random forest classification. The purpose of the work was to find the pairs of pathways best able to discriminate sepsis samples from normal samples. The results identified 10 pairs of pathways that were probably able to discriminate neonates with sepsis from normal uninfected neonates. Among them, the best two paired pathways were identified according to an extensive literature analysis. Impact statement: To find the pairs of pathways best able to discriminate sepsis samples from normal samples, a random forest (RF) classifier, discriminating scores (DS) obtained from the differentially expressed genes (DEGs) of significantly associated paired pathways, and Monte Carlo cross-validation were applied in this paper. Ten pairs of pathways were probably able to discriminate neonates with sepsis from normal uninfected neonates. Among them, the best two paired pathways ((7) IL-6 Signaling and Phospholipase C (PLC) Signaling; (8) Glucocorticoid Receptor (GR) Signaling and Dendritic Cell Maturation) were identified according to an extensive literature analysis.
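
    As a rough illustration of the evaluation loop described above, the sketch below scores a random forest classifier with Monte Carlo cross-validation (repeated random train/test splits) and ranks features by importance. The discriminating scores are synthetic stand-ins for the real pathway cross-talk data, and scikit-learn's ShuffleSplit is used as the Monte Carlo resampler; none of this reproduces the study's actual pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import ShuffleSplit, cross_val_score

# Synthetic stand-in: rows = samples, columns = discriminating scores of pathway pairs.
rng = np.random.default_rng(0)
n_sepsis, n_control, n_pairs = 30, 30, 10
X = np.vstack([rng.normal(1.0, 1.0, (n_sepsis, n_pairs)),    # sepsis samples
               rng.normal(0.0, 1.0, (n_control, n_pairs))])  # control samples
y = np.array([1] * n_sepsis + [0] * n_control)

# Monte Carlo cross-validation: repeated random train/test splits.
mccv = ShuffleSplit(n_splits=100, test_size=0.3, random_state=1)
rf = RandomForestClassifier(n_estimators=200, random_state=1)
scores = cross_val_score(rf, X, y, cv=mccv, scoring="accuracy")
print(f"mean MCCV accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")

# Rank pathway pairs by how much they contribute to the discrimination.
rf.fit(X, y)
ranking = np.argsort(rf.feature_importances_)[::-1]
print("pathway pairs ranked by importance:", ranking)
```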

  17. Using EVT for Geological Anomaly Design and Its Application in Identifying Anomalies in Mining Areas

    Directory of Open Access Journals (Sweden)

    Feilong Qin

    2016-01-01

    Full Text Available A geological anomaly is the basis of mineral deposit prediction. Through the study of the knowledge and characteristics of geological anomalies, the category of extreme value theory (EVT) to which a geological anomaly belongs can be determined. Combining the principles of EVT with methods for determining the shape and scale parameters of the generalized Pareto distribution (GPD), methods for selecting the GPD threshold can be studied. This paper designs a new algorithm called the EVT model of the geological anomaly. The study data on Cu and Au originate from 26 exploration lines of the Jiguanzui Cu-Au mining area in Hubei, China. The proposed EVT model of the geological anomaly is applied to identify anomalies in the Jiguanzui Cu-Au mining area. The results show that the model can effectively identify the geological anomaly regions of Cu and Au. The anomaly regions of Cu and Au are consistent with the range of ore bodies from actual engineering exploration. Therefore, the EVT model of the geological anomaly can effectively identify anomalies, and it has a high indicating function with respect to ore prospecting.
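
    A minimal peaks-over-threshold sketch of the GPD fitting step is shown below, using SciPy and synthetic Cu-like concentrations in place of the Jiguanzui data; the empirical threshold quantile and the anomaly cutoff are illustrative assumptions.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(0)
# Synthetic "background + anomaly" concentrations (e.g., Cu in ppm).
background = rng.lognormal(mean=3.0, sigma=0.4, size=2000)
anomalies = rng.lognormal(mean=4.5, sigma=0.3, size=30)
conc = np.concatenate([background, anomalies])

# Peaks-over-threshold: pick a high empirical quantile as the GPD threshold.
u = np.quantile(conc, 0.95)
exceedances = conc[conc > u] - u

# Fit the GPD shape and scale parameters to the exceedances (location fixed at 0).
shape, loc, scale = genpareto.fit(exceedances, floc=0)

# Flag values whose exceedance falls in the extreme tail of the fitted GPD.
cutoff = u + genpareto.ppf(0.99, shape, loc=0, scale=scale)
anomalous = conc[conc > cutoff]
print(f"threshold u={u:.1f}, shape={shape:.3f}, scale={scale:.1f}")
print(f"anomaly cutoff={cutoff:.1f}, flagged {anomalous.size} samples")
```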

  18. Application of 13C-stable isotope probing to identify RDX-degrading microorganisms in groundwater

    International Nuclear Information System (INIS)

    Cho, Kun-Ching; Lee, Do Gyun; Roh, HyungKeun; Fuller, Mark E.; Hatzinger, Paul B.; Chu, Kung-Hui

    2013-01-01

    We employed stable isotope probing (SIP) with 13C-labeled hexahydro-1,3,5-trinitro-1,3,5-triazine (RDX) to identify active microorganisms responsible for RDX biodegradation in groundwater microcosms. Sixteen different 16S rRNA gene sequences were derived from microcosms receiving 13C-labeled RDX, suggesting the presence of microorganisms able to incorporate carbon from RDX or its breakdown products. The clones, residing in Bacteroidia, Clostridia, α-, β- and δ-Proteobacteria, and Spirochaetes, were different from previously described RDX degraders. A parallel set of microcosms was amended with cheese whey and RDX to evaluate the influence of this co-substrate on the RDX-degrading microbial community. Cheese whey stimulated RDX biotransformation, altered the types of RDX-degrading bacteria, and decreased microbial community diversity. Results of this study suggest that RDX-degrading microorganisms in groundwater are more phylogenetically diverse than what has been inferred from studies with RDX-degrading isolates. Highlights: •SIP identified sixteen groundwater bacteria capable of using RDX and/or its metabolites as a carbon source. •The RDX degraders in groundwater are phylogenetically diverse and different from known RDX degraders. •Cheese whey induced community shift and altered diversity of the RDX-degrading microorganisms over time. -- RDX-degrading bacteria in contaminated groundwater, identified by SIP with 13C-labeled RDX, are phylogenetically diverse and different from known RDX degraders

  19. Performance of ceramics in ring/cylinder applications

    International Nuclear Information System (INIS)

    Dufrane, K.F.; Glaeser, W.A.

    1987-01-01

    In support of efforts to apply ceramics to advanced heat engines, a study is being performed on the performance of ceramics at the ring/cylinder interface of advanced (low heat rejection) engines. The objective of the study, managed by the Oak Ridge National Laboratory, is to understand the basic mechanisms controlling the wear of ceramics and thereby identify means for applying ceramics effectively. Attempts to operate three different zirconias, silicon carbide, silicon nitride, and plasma-sprayed ceramic coatings without lubrication have not been successful because of excessive friction and high wear rates. Silicon carbide and silicon nitride perform well at ambient temperatures with fully formulated mineral oil lubrication, but are limited to temperatures of 500°F because of the lack of suitable liquid lubricants for higher temperatures

  20. Performance of modified blood pressure-to-height ratio for identifying hypertension in Chinese and American children.

    Science.gov (United States)

    Zhang, Yuanyuan; Ma, Chuanwei; Yang, Lili; Bovet, Pascal; Xi, Bo

    2018-04-06

    Blood pressure-to-height ratio (BPHR) has been reported to perform well for identifying hypertension (HTN) in adolescents but not in young children. Our study aimed to evaluate the performance of BPHR and modified BPHR (MBPHR) for screening HTN in children. A total of 5268 Chinese children (boys: 53.1%) aged 6-12 years and 5024 American children (boys: 48.1%) aged 8-12 years were included in the present study. BPHR was calculated as BP/height (mmHg/cm). MBPHR7 was calculated as BP/(height + 7*(13-age)). MBPHR3 was calculated as BP/(height + 3*(13-age)). We used receiver-operating characteristic curve analysis to assess the performance of the three ratios for identifying HTN in children, with the 2017 U.S. clinical guideline as the "gold standard". The prevalence of HTN in Chinese and American children was 9.4% and 5.4%, respectively, based on the 2017 U.S. guideline. The area under the curve (AUC) was larger for MBPHR3 than for BPHR and MBPHR7. All three ratios had near-perfect negative predictive value (~100%). The positive predictive value (PPV) was higher for MBPHR3 than for BPHR in both Chinese (43.9% vs. 37.9%) and American (39.1% vs. 26.3%) children. In contrast, the PPV was higher for MBPHR7 than for BPHR in Chinese children (47.4% vs. 37.9%) but not in American children (24.8% vs. 26.3%). In summary, MBPHR3 overall performed better than MBPHR7 and BPHR for identifying HTN in children. However, the three ratios had low PPV (<50%) as compared with the 2017 U.S. guideline, which makes them of limited use for HTN screening in children.
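
    The ratios in the record above are simple arithmetic on blood pressure, height, and age, and their screening performance is summarized by the area under the ROC curve. A small sketch with synthetic child data (the growth and blood-pressure relationships and the stand-in hypertension label are invented for illustration):

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 500
age = rng.integers(6, 13, n)                          # years
height = 110 + 6 * (age - 6) + rng.normal(0, 6, n)    # cm, rough growth curve
sbp = 95 + 1.5 * (age - 6) + rng.normal(0, 9, n)      # systolic BP, mmHg
htn = (sbp > np.percentile(sbp, 90)).astype(int)      # stand-in "gold standard" label

# Ratios as defined in the abstract (mmHg/cm).
bphr = sbp / height
mbphr3 = sbp / (height + 3 * (13 - age))
mbphr7 = sbp / (height + 7 * (13 - age))

for name, ratio in [("BPHR", bphr), ("MBPHR3", mbphr3), ("MBPHR7", mbphr7)]:
    print(f"{name}: AUC = {roc_auc_score(htn, ratio):.3f}")
```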

  1. I/O Performance Characterization of Lustre and NASA Applications on Pleiades

    Science.gov (United States)

    Saini, Subhash; Rappleye, Jason; Chang, Johnny; Barker, David Peter; Biswas, Rupak; Mehrotra, Piyush

    2012-01-01

    In this paper we study the performance of the Lustre file system using five scientific and engineering applications representative of the NASA workload on large-scale supercomputing systems such as NASA's Pleiades. In order to facilitate the collection of Lustre performance metrics, we have developed a software tool that exports a wide variety of client and server-side metrics using SGI's Performance Co-Pilot (PCP), and generates a human-readable report on key metrics at the end of a batch job. These performance metrics are (a) amount of data read and written, (b) number of files opened and closed, and (c) remote procedure call (RPC) size distribution (4 KB to 1024 KB, in powers of 2) for I/O operations. The RPC size distribution measures the efficiency of the Lustre client and can pinpoint problems such as small write sizes, disk fragmentation, etc. These extracted statistics are useful in determining the I/O pattern of the application and can assist in identifying possible improvements for users' applications. Information on the number of file operations enables a scientist to optimize the I/O performance of their applications. The amount of I/O data helps users choose the optimal stripe size and stripe count to enhance I/O performance. In this paper, we demonstrate the usefulness of this tool on Pleiades for five production-quality NASA scientific and engineering applications. We compare the latency of read and write operations under Lustre to that with NFS by tracing system calls and signals. We also investigate the read and write policies and study the effect of page cache size on I/O operations. We examine the performance impact of Lustre stripe size and stripe count along with a performance evaluation of file-per-process and single-shared-file access by all the processes for the NASA workload using the parameterized IOR benchmark.
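
    One of the metrics described above, the RPC size distribution in powers of 2 from 4 KB to 1024 KB, can be summarized from a per-request size trace as in the sketch below; the request sizes here are hypothetical, not Pleiades measurements.

```python
import numpy as np

# Hypothetical per-request I/O sizes in bytes collected from a client-side trace.
rng = np.random.default_rng(0)
io_sizes = rng.choice([4, 8, 16, 64, 256, 1024], size=10_000,
                      p=[0.05, 0.05, 0.10, 0.20, 0.25, 0.35]) * 1024

# RPC size distribution: bucket requests into powers of 2 from 4 KB to 1024 KB.
buckets_kb = [4 * 2**i for i in range(9)]          # 4, 8, ..., 1024 KB
counts = {kb: int(np.sum(io_sizes == kb * 1024)) for kb in buckets_kb}

total = sum(counts.values())
for kb in buckets_kb:
    share = counts[kb] / total
    print(f"{kb:5d} KB RPCs: {share:6.1%}")
# A distribution dominated by small RPCs would point at small writes or fragmentation.
```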

  2. Landmarks for Identifying the Suprascapular Foramen Anteriorly: Application to Anterior Neurotization and Decompressive Procedures.

    Science.gov (United States)

    Manouvakhova, Olga V; Macchi, Veronica; Fries, Fabian N; Loukas, Marios; De Caro, Raffaele; Oskouian, Rod J; Spinner, Robert J; Tubbs, R Shane

    2018-02-01

    Additional landmarks for identifying the suprascapular nerve at its entrance into the suprascapular foramen from an anterior approach would be useful to the surgeon. To identify landmarks for the identification of this hidden site within an anterior approach. In 8 adult cadavers (16 sides), lines were used to connect the superior angle of the scapula, the acromion, and the coracoid process tip, thus creating an anatomic triangle. The suprascapular nerve's entrance into the suprascapular foramen was documented regarding its position within this anatomical triangle. Depths from the skin surface, and specifically from the medial-most point of the clavicular attachment of the trapezius, to the suprascapular nerve's entrance into the suprascapular foramen were measured using calipers and a ruler. The clavicle was then fractured and retracted superiorly to verify the position of the nerve's entrance into the suprascapular foramen. From the trapezius, the nerve's entrance into the foramen was 3 to 4.2 cm deep (mean, 3.5 cm). The mean distance from the tip of the coracoid process to the suprascapular foramen was 3.8 cm. The angle best used to approach the suprascapular foramen from the surface was 15° to 20°. Based on our study, an anterior approach to the suprascapular nerve as it enters the suprascapular foramen can identify the most medial fibers of the trapezius attachment onto the clavicle, with a finger then inserted at an angle of 15° to 20° laterally and advanced to an average depth of 3.5 cm. Copyright © 2017 by the Congress of Neurological Surgeons

  3. Application of particle swarm optimization to identify gamma spectrum with neural network

    International Nuclear Information System (INIS)

    Shi Dongsheng; Di Yuming; Zhou Chunlin

    2007-01-01

    In applying neural networks to the identification of gamma spectra, the back-propagation (BP) algorithm is usually trapped in a local optimum and has a low speed of convergence, whereas particle swarm optimization (PSO) is advantageous in terms of global optimal searching. In this paper, we propose a new algorithm for neural network training, i.e. combined BP and PSO optimization, or the PSO-BP algorithm. A practical example shows that the new algorithm can overcome the shortcomings of the BP algorithm and that the neural network trained by it has a high ability of generalization, with an identification result of 100% correctness. It can be used effectively and reliably to identify gamma spectra. (authors)
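
    The sketch below shows the PSO half of such a hybrid scheme: a plain global-best particle swarm searching the flattened weight vector of a small network on toy "spectra". In the paper's PSO-BP algorithm the swarm result would then be refined by back-propagation, which is omitted here; the data, network size, and swarm constants are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for spectrum identification: 16-channel "spectra" from 3 classes.
def make_spectrum(label, n=200):
    x = rng.normal(0, 0.1, (n, 16))
    x[:, label * 5:(label * 5) + 3] += 1.0   # a characteristic peak per class
    return x

X = np.vstack([make_spectrum(k) for k in range(3)])
y = np.repeat(np.arange(3), 200)

n_in, n_hidden, n_out = 16, 8, 3
n_weights = n_in * n_hidden + n_hidden + n_hidden * n_out + n_out

def unpack(w):
    i = 0
    W1 = w[i:i + n_in * n_hidden].reshape(n_in, n_hidden); i += n_in * n_hidden
    b1 = w[i:i + n_hidden]; i += n_hidden
    W2 = w[i:i + n_hidden * n_out].reshape(n_hidden, n_out); i += n_hidden * n_out
    b2 = w[i:]
    return W1, b1, W2, b2

def loss(w):
    W1, b1, W2, b2 = unpack(w)
    logits = np.tanh(X @ W1 + b1) @ W2 + b2
    logits -= logits.max(axis=1, keepdims=True)
    p = np.exp(logits); p /= p.sum(axis=1, keepdims=True)
    return -np.log(p[np.arange(len(y)), y] + 1e-12).mean()

# Plain global-best PSO over the flattened weight vector.
n_particles, n_iter = 30, 200
pos = rng.normal(0, 0.5, (n_particles, n_weights))
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), np.array([loss(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(n_iter):
    r1, r2 = rng.random((2, n_particles, 1))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos += vel
    vals = np.array([loss(p) for p in pos])
    better = vals < pbest_val
    pbest[better], pbest_val[better] = pos[better], vals[better]
    gbest = pbest[pbest_val.argmin()].copy()

W1, b1, W2, b2 = unpack(gbest)
pred = (np.tanh(X @ W1 + b1) @ W2 + b2).argmax(axis=1)
print(f"PSO-trained network accuracy: {(pred == y).mean():.2%}")
```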

  4. Application of Computer Simulation to Identify Erosion Resistance of Materials of Wet-steam Turbine Blades

    Science.gov (United States)

    Korostelyov, D. A.; Dergachyov, K. V.

    2017-10-01

    A problem of identifying the efficiency of using materials, coatings, linings and solderings of wet-steam turbine rotor blades by means of computer simulation is considered. Numerical experiments to define erosion resistance of materials of wet-steam turbine blades are described. Kinetic curves for erosion area and weight of the worn rotor blade material of turbines K-300-240 LMP and atomic icebreaker “Lenin” have been defined. The conclusion about the effectiveness of using different erosion-resistant materials and protection configuration of rotor blades is also made.

  5. Performance indicator system with application to NPP management

    International Nuclear Information System (INIS)

    Gomez, J.; Roldan, J.

    2001-01-01

    The objective of the paper is to present the work being conducted within a research project between Cofrentes NPP and the Polytechnic University of Valencia aimed at the development and implementation of a performance indicator system to support plant management. In developing this system, attention is being paid to the areas of safety, production and dependability. The first step in the project was the development of the performance indicator system (PIS), in order to help in assessing the effectiveness of the different activities in the plant (i.e. maintenance, inspections, tests, etc.). It is suggested to establish the operational indicator set in three levels. The lowest level concerns indicators monitoring the performance and maintenance characteristics of components. The next one involves a subset of indicators placed at the system level with a similar goal. Finally, the highest level summarizes the impact of the global policy on the whole plant from the safety and performance point of view. The definition of an indicator should comprise, at least, the following items: the indicator's name, performance area, definition and data needed. A strategy should define what, when and how indicators have to be evaluated, analyzed and reported. This article gives an example application of the methodology at the Cofrentes NPP: collective dose as a safety indicator, power production as a production indicator and the number of work orders as a maintenance indicator are considered, and their time evolution is given. (A.C.)

  6. Identify and Quantify the Mechanistic Sources of Sensor Performance Variation Between Individual Sensors SN1 and SN2

    Energy Technology Data Exchange (ETDEWEB)

    Diaz, Aaron A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Baldwin, David L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Cinson, Anthony D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Jones, Anthony M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Larche, Michael R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Mathews, Royce [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Mullen, Crystal A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Pardini, Allan F. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Posakony, Gerald J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Prowant, Matthew S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hartman, Trenton S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Edwards, Matthew K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2014-08-06

    This Technical Letter Report satisfies the M3AR-14PN2301022 milestone, and is focused on identifying and quantifying the mechanistic sources of sensor performance variation between individual 22-element, linear phased-array sensor prototypes, SN1 and SN2. This effort constitutes an iterative evolution that supports the longer term goal of producing and demonstrating a pre-manufacturing prototype ultrasonic probe that possesses the fundamental performance characteristics necessary to enable the development of a high-temperature sodium-cooled fast reactor inspection system. The scope of the work for this portion of the PNNL effort conducted in FY14 includes performing a comparative evaluation and assessment of the performance characteristics of the SN1 and SN2 22 element PA-UT probes manufactured at PNNL. Key transducer performance parameters, such as sound field dimensions, resolution capabilities, frequency response, and bandwidth are used as a metric for the comparative evaluation and assessment of the SN1 and SN2 engineering test units.

  7. Identifying optimum performance trade-offs using a cognitively bounded rational analysis model of discretionary task interleaving.

    Science.gov (United States)

    Janssen, Christian P; Brumby, Duncan P; Dowell, John; Chater, Nick; Howes, Andrew

    2011-01-01

    We report the results of a dual-task study in which participants performed a tracking and typing task under various experimental conditions. An objective payoff function was used to provide explicit feedback on how participants should trade off performance between the tasks. Results show that participants' dual-task interleaving strategy was sensitive to changes in the difficulty of the tracking task and resulted in differences in overall task performance. To test the hypothesis that people select strategies that maximize payoff, a Cognitively Bounded Rational Analysis model was developed. This analysis evaluated a variety of dual-task interleaving strategies to identify the optimal strategy for maximizing payoff in each condition. The model predicts that the region of optimum performance is different between experimental conditions. The correspondence between human data and the prediction of the optimal strategy is found to be remarkably high across a number of performance measures. This suggests that participants were honing their behavior to maximize payoff. Limitations are discussed. Copyright © 2011 Cognitive Science Society, Inc.

  8. Application of Geomorphologic Factors for Identifying Soil Loss in Vulnerable Regions of the Cameron Highlands

    Directory of Open Access Journals (Sweden)

    Kahhoong Kok

    2018-03-01

    Full Text Available The main purpose of this study is to propose a methodology for identifying vulnerable regions in the Cameron Highlands that are susceptible to soil loss, based on runoff aggregation structure and the energy expenditure pattern of the natural river basin, within the framework of power law distribution. To this end, three geomorphologic factors, namely shear stress and stream power, as well as the drainage area of every point in the basin of interest, have been extracted using GIS, and then their complementary cumulative distributions are graphically analyzed by fitting them to power law distribution, with the purpose of identifying the sensitive points within the basin that are susceptible to soil loss with respect to scaling regimes of shear stress and stream power. It is observed that the range of vulnerable regions by the scaling regime of shear stress is much narrower than by the scaling regime of stream power. This result seems to suggest that shear stress is a scale-dependent factor, which does not follow power law distribution and does not adequately reflect the energy expenditure pattern of a river basin. Therefore, stream power is preferred as a more reasonable factor for the evaluation of soil loss. The methodology proposed in this study can be validated by visualizing the path of soil loss, which is generated from the hillslope process (characterized by the local slope to the valley through a fluvial process (characterized by the drainage area as well as the local slope.
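
    The power-law analysis described above amounts to plotting the complementary cumulative distribution of a geomorphologic factor on log-log axes and fitting its tail. A compact sketch with synthetic stream-power values (the tail quantile used for the fit and the vulnerability threshold are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical stream power values for every cell in a basin (stand-in for GIS output).
stream_power = rng.pareto(a=1.5, size=50_000) + 1.0

# Complementary cumulative distribution P(X >= x) on a log-log scale.
x = np.sort(stream_power)
ccdf = 1.0 - np.arange(1, x.size + 1) / x.size

# Fit the scaling exponent over the tail (here: top 10% of values, an arbitrary choice).
tail = x >= np.quantile(x, 0.90)
slope, intercept = np.polyfit(np.log(x[tail]), np.log(ccdf[tail] + 1e-12), 1)
print(f"fitted tail exponent: {slope:.2f}")

# Cells whose stream power falls in the fitted scaling regime are candidate
# soil-loss-vulnerable locations.
threshold = np.quantile(x, 0.90)
vulnerable = stream_power >= threshold
print(f"{vulnerable.mean():.1%} of cells flagged as vulnerable")
```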

  9. Identifying the Applications of Internet of Things in the Smart Home by Using Meta synthesis Method

    Directory of Open Access Journals (Sweden)

    manochehr ansari

    2017-12-01

    Full Text Available Annual energy consumption in Iran is quite high in the household, commercial, and public sectors. In addition, the index of population aging has shown a great increase in recent years. Concurrent with all of this, smart houses equipped with the Internet of Things (IoT) can help maintain sustainable development with functionalities such as improvements in energy consumption and health, to name but a few. Accordingly, in this thesis, we aimed to identify the usages and functions of IoT in smart houses. This research is applied in nature and is classified as qualitative with regard to data collection. In order to identify the usages of IoT in smart houses with the help of the meta-synthesis approach, we examined 371 studies, of which only 85 were selected for the final analysis. 122 factors were extracted from these 85 studies and combined into 7 main usages: "electricity consumption management", "heating, ventilation and air conditioning systems", "water consumption control", "security empowerment for the building and the neighborhood", "health monitoring", "crisis management", and "home appliance automation".

  10. Application of positive matrix factorization to identify potential sources of PAHs in soil of Dalian, China

    International Nuclear Information System (INIS)

    Wang Degao; Tian Fulin; Yang Meng; Liu Chenlin; Li Yifan

    2009-01-01

    Soil derived sources of polycyclic aromatic hydrocarbons (PAHs) in the region of Dalian, China were investigated using positive matrix factorization (PMF). Three factors were separated based on PMF for the statistical investigation of the datasets both in summer and winter. These factors were dominated by the pattern of single sources or groups of similar sources, showing seasonal and regional variations. The main sources of PAHs in Dalian soil in summer were the emissions from coal combustion average (46%), diesel engine (30%), and gasoline engine (24%). In winter, the main sources were the emissions from coal-fired boiler (72%), traffic average (20%), and gasoline engine (8%). These factors with strong seasonality indicated that coal combustion in winter and traffic exhaust in summer dominated the sources of PAHs in soil. These results suggested that PMF model was a proper approach to identify the sources of PAHs in soil. - PMF model is a proper approach to identify potential sources of PAHs in soil based on the PAH profiles measured in the field and those published in the literature.
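
    Positive matrix factorization decomposes a samples-by-species concentration matrix into non-negative source contributions and source profiles. The sketch below uses scikit-learn's NMF as a simplified stand-in for a dedicated PMF code (true PMF additionally weights residuals by measurement uncertainty, which is omitted here), with synthetic PAH data rather than the Dalian measurements.

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)
n_samples, n_pahs, n_sources = 60, 16, 3

# Synthetic source profiles (rows) and per-sample contributions, then mixed data.
profiles = rng.random((n_sources, n_pahs))
contrib = rng.random((n_samples, n_sources))
X = (contrib @ profiles + rng.normal(0, 0.01, (n_samples, n_pahs))).clip(min=0)

model = NMF(n_components=n_sources, init="nndsvda", max_iter=500, random_state=0)
W = model.fit_transform(X)      # estimated source contributions per sample
H = model.components_           # estimated source profiles (PAH fingerprints)

# Average share of each factor, analogous to the seasonal source percentages.
share = W.sum(axis=0) / W.sum()
for k, s in enumerate(share):
    print(f"factor {k}: {s:.0%} of total estimated PAH mass")
```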

  11. Application of classification-tree methods to identify nitrate sources in ground water

    Science.gov (United States)

    Spruill, T.B.; Showers, W.J.; Howe, S.S.

    2002-01-01

    A study was conducted to determine if nitrate sources in ground water (fertilizer on crops, fertilizer on golf courses, irrigation spray from hog (Sus scrofa) wastes, and leachate from poultry litter and septic systems) could be classified with 80% or greater success. Two statistical classification-tree models were devised from 48 water samples containing nitrate from five source categories. Model I was constructed by evaluating 32 variables and selecting four primary predictor variables (δ15N, nitrate to ammonia ratio, sodium to potassium ratio, and zinc) to identify nitrate sources. A δ15N value of nitrate plus potassium 18.2 indicated inorganic or soil organic N. A nitrate to ammonia ratio 575 indicated nitrate from golf courses. A sodium to potassium ratio 3.2 indicated spray or poultry wastes. A value for zinc 2.8 indicated poultry wastes. Model 2 was devised by using all variables except δ15N. This model also included four variables (sodium plus potassium, nitrate to ammonia ratio, calcium to magnesium ratio, and sodium to potassium ratio) to distinguish categories. Both models were able to distinguish all five source categories with better than 80% overall success and with 71 to 100% success in individual categories using the learning samples. Seventeen water samples that were not used in model development were tested using Model 2 for three categories, and all were correctly classified. Classification-tree models show great potential in identifying sources of contamination and variables important in the source-identification process.
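
    A classification tree of the kind described above can be reproduced in outline with scikit-learn: fit a tree on the chemistry variables and print its split thresholds. The table below is a hypothetical stand-in for the 48 training samples, and the resulting thresholds are not the published ones.

```python
import numpy as np
import pandas as pd
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
# Hypothetical training table: chemistry of ground-water samples with known sources.
n = 48
df = pd.DataFrame({
    "d15N":    rng.uniform(0, 25, n),
    "no3_nh4": rng.uniform(1, 1000, n),
    "na_k":    rng.uniform(0.5, 10, n),
    "zn":      rng.uniform(0, 10, n),
    "source":  rng.choice(["crop_fert", "golf_fert", "hog_spray",
                           "poultry", "septic"], n),
})

X = df[["d15N", "no3_nh4", "na_k", "zn"]]
clf = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X, df["source"])

# The printed tree exposes the split thresholds, analogous to the cutoffs in Model I.
print(export_text(clf, feature_names=list(X.columns)))
```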

  12. Performance assessment of advanced engineering workstations for fuel management applications

    International Nuclear Information System (INIS)

    Turinsky, P.J.

    1989-07-01

    The purpose of this project was to assess the performance of an advanced engineering workstation [AEW] with regard to applications to incore fuel management for LWRs. The attributes of most interest to us that define an AEW are parallel computational hardware and graphics capabilities. The AEWs employed were super microcomputers manufactured by MASSCOMP, Inc. These computers utilize a 32-bit architecture, graphics co-processor, multi-CPUs [up to six] attached to common memory and multi-vector accelerators. 7 refs., 33 figs., 4 tabs

  13. Performance and application of a fourfold monopole mass spectrometer

    International Nuclear Information System (INIS)

    Richards, J.A.; Huey, R.M.

    1978-01-01

    Some preliminary tests with an experimental fourfold monopole mass spectrometer are described, illustrating that the device performs acceptably (at the low resolutions used) despite the fact that the field-forming surfaces of the driven electrodes are only one quadrant of a cylinder. Coupling between adjacent channels is shown not to be a problem, so that applications requiring simultaneous measurements using two or more of the monopole channels can be entertained. Owing to its parallel structure, the instrument is suggested as being suited particularly to isotope ratio measurements with precisions which could be significantly better than would be possible with a quadrupole device. (Auth.)

  14. High performance hybrid magnetic structure for biotechnology applications

    Science.gov (United States)

    Humphries, David E [El Cerrito, CA]; Pollard, Martin J [El Cerrito, CA]; Elkin, Christopher J [San Ramon, CA]

    2009-02-03

    The present disclosure provides a high performance hybrid magnetic structure made from a combination of permanent magnets and ferromagnetic pole materials which are assembled in a predetermined array. The hybrid magnetic structure provides means for separation and other biotechnology applications involving holding, manipulation, or separation of magnetic or magnetizable molecular structures and targets. Also disclosed are further improvements to aspects of the hybrid magnetic structure, including additional elements and for adapting the use of the hybrid magnetic structure for use in biotechnology and high throughput processes.

  15. Performance and applications of a μ-TPC

    International Nuclear Information System (INIS)

    Miuchi, Kentaro; Kubo, Hidetoshi; Nagayoshi, Tsutomu; Okada, Yoko; Orito, Reiko; Takada, Atsushi; Takeda, Atsushi; Tanimori, Toru; Ueno, Masaru; Bouianov, Oleg; Bouianov, Marina

    2004-01-01

    A μ-TPC, a time projection chamber (TPC) which can detect three-dimensional fine tracks of charged particles, was developed and its performance was measured. We developed a μ-TPC with a detection volume of 10x10x10cm3 based on a novel two-dimensional imaging gaseous detector, or the μ-PIC. Fine tracks of charged particles with large energy depositions (protons and electrons) were detected with the μ-TPC. With a pressurized gas, tracks of the minimum ionizing particles were detected. We showed the principle of the application for the time-resolved neutron imaging detector

  16. The performance and application of laser-induced photoacoustic spectrometer

    International Nuclear Information System (INIS)

    Wang Bo; Chen Xi; Yao Jun

    2012-01-01

    The laser-induced photoacoustic spectrometer (LIPAS) is a key instrument that can be used in the investigation of radionuclide migration behaviors owing to its high sensitivity for the detection and identification of radionuclide speciation in aqueous solutions. The speciation of radionuclides, such as oxidation states and complexation, may be determined directly by using this non-contact and nondestructive analytical technique, and the sensitivity of LIPAS surpasses that of conventional absorption spectroscopy by one to two orders of magnitude. In the present work, a LIPAS system was established at the China Institute of Atomic Energy (CIAE), and the principle, performance and preliminary applications of LIPAS are also presented. (authors)

  17. High-performance heat pipes for heat recovery applications

    Science.gov (United States)

    Saaski, E. W.; Hartl, J. H.

    1980-01-01

    Methods to improve the performance of reflux heat pipes for heat recovery applications were examined both analytically and experimentally. Various models for the estimation of reflux heat pipe transport capacity were surveyed in the literature and compared with experimental data. A high transport capacity reflux heat pipe was developed that provides up to a factor of 10 capacity improvement over conventional open tube designs; analytical models were developed for this device and incorporated into a computer program HPIPE. Good agreement of the model predictions with data for R-11 and benzene reflux heat pipes was obtained.

  18. High performance protection circuit for power electronics applications

    Energy Technology Data Exchange (ETDEWEB)

    Tudoran, Cristian D., E-mail: cristian.tudoran@itim-cj.ro; Dădârlat, Dorin N.; Toşa, Nicoleta; Mişan, Ioan [National Institute for Research and Development of Isotopic and Molecular Technologies, 67-103 Donat, PO 5 Box 700, 400293 Cluj-Napoca (Romania)

    2015-12-23

    In this paper we present a high performance protection circuit designed for the power electronics applications where the load currents can increase rapidly and exceed the maximum allowed values, like in the case of high frequency induction heating inverters or high frequency plasma generators. The protection circuit is based on a microcontroller and can be adapted for use on single-phase or three-phase power systems. Its versatility comes from the fact that the circuit can communicate with the protected system, having the role of a “sensor” or it can interrupt the power supply for protection, in this case functioning as an external, independent protection circuit.

  19. Performance measures in the earth observations commercialization applications program

    Science.gov (United States)

    Macauley, Molly K.

    1996-03-01

    Performance measures in the Earth Observations Commercialization Application Program (EOCAP) are key to its success and include net profitability; enhancements to industry productivity through generic innovations in industry practices, standards, and protocols; and documented contributions to public policy governing the newly developing remote sensing industry. Because EOCAP requires company co-funding, both parties to the agreement (the government and the corporate partner) have incentives to pursue these goals. Further strengthening progress towards these goals are requirements for business plans in the company's EOCAP proposal, detailed scrutiny given these plans during proposal selection, and regularly documented progress reports during project implementation.

  20. Identifying patterns of motor performance, executive functioning, and verbal ability in preschool children: A latent profile analysis.

    Science.gov (United States)

    Houwen, Suzanne; Kamphorst, Erica; van der Veer, Gerda; Cantell, Marja

    2018-04-30

    A relationship between motor performance and cognitive functioning is increasingly being recognized. Yet, little is known about the precise nature of the relationship between both domains, especially in early childhood. To identify distinct constellations of motor performance, executive functioning (EF), and verbal ability in preschool aged children; and to explore how individual and contextual variables are related to profile membership. The sample consisted of 119 3- to 4-year old children (62 boys; 52%). The home based assessments consisted of a standardized motor test (Movement Assessment Battery for Children - 2), five performance-based EF tasks measuring inhibition and working memory, and the Receptive Vocabulary subtest from the Wechsler Preschool and Primary Scale of Intelligence Third Edition. Parents filled out the Behavior Rating Inventory of Executive Function - Preschool version. Latent profile analysis (LPA) was used to delineate profiles of motor performance, EF, and verbal ability. Chi-square statistics and multinomial logistic regression analysis were used to examine whether profile membership was predicted by age, gender, risk of motor coordination difficulties, ADHD symptomatology, language problems, and socioeconomic status (SES). LPA yielded three profiles with qualitatively distinct response patterns of motor performance, EF, and verbal ability. Quantitatively, the profiles showed most pronounced differences with regard to parent ratings and performance-based tests of EF, as well as verbal ability. Risk of motor coordination difficulties and ADHD symptomatology were associated with profile membership, whereas age, gender, language problems, and SES were not. Our results indicate that there are distinct subpopulations of children who show differential relations with regard to motor performance, EF, and verbal ability. The fact that we found both quantitative as well as qualitative differences between the three patterns of profiles underscores

  1. Identified best environmental management practices to improve the energy performance of the retail trade sector in Europe

    International Nuclear Information System (INIS)

    Galvez-Martos, Jose-Luis; Styles, David; Schoenberger, Harald

    2013-01-01

    The retail trade sector has been identified as a target sector for the development of sectoral reference documents on best environmental management practices under the Eco-Management and Audit Scheme. This paper focuses on the important energy-related needs in retailers' stores such as for food refrigeration and lighting, as well as heating, ventilation and air conditioning of the building. For the definition of best environmental management practices in the European framework, frontrunner retailers have been identified as those retailers integrating energy minimization and saving measures as standard practice systematically across stores. These best performers also integrate a comprehensive monitoring system in the energy management of every store or building belonging to the company, enabling the rapid identification of energy saving opportunities. An integrative approach is needed to define how best practices should be implemented in combination to optimize energy management within stores: building aspects such as insulation of the building envelope or the heating, ventilation and air conditioning system, should be optimized in combination with best options for refrigeration in food retailers. Refrigeration systems are responsible for half of the final energy use in stores and of their carbon footprint. Natural refrigerants, heat recovery from the condensation stage and covering of display cases are measures with high environmental benefits to reduce the impact of refrigeration. Finally, practices for lighting, as optimal lighting strategies, and the integration of renewable energy sources in overall zero energy building concepts can save considerable amounts of fossil energy, reduce the carbon footprint and produce significant cost-savings in the long term. - highlights: • There is a high energy performance improvement potential of the retail trade sector. • We propose techniques with a high performance level and applied by frontrunners. • We identified

  2. Total System Performance Assessment-License Application Methods and Approach

    Energy Technology Data Exchange (ETDEWEB)

    J. McNeish

    2002-09-13

    ''Total System Performance Assessment-License Application (TSPA-LA) Methods and Approach'' provides the top-level method and approach for conducting the TSPA-LA model development and analyses. The method and approach is responsive to the criteria set forth in Total System Performance Assessment Integration (TSPAI) Key Technical Issue (KTI) agreements, the ''Yucca Mountain Review Plan'' (CNWRA 2002 [158449]), and 10 CFR Part 63. This introductory section provides an overview of the TSPA-LA, the projected TSPA-LA documentation structure, and the goals of the document. It also provides a brief discussion of the regulatory framework, the approach to risk management of the development and analysis of the model, and the overall organization of the document. The section closes with some important conventions that are utilized in this document.

  3. The application of advanced rotor (performance) methods for design calculations

    Energy Technology Data Exchange (ETDEWEB)

    Bussel, G.J.W. van [Delft Univ. of Technology, Inst. for Wind Energy, Delft (Netherlands)

    1997-08-01

    The calculation of loads and performance of wind turbine rotors has been a topic for research over the last century. The principles for the calculation of loads on rotor blades with a given specific geometry, as well as the development of optimal shaped rotor blades have been published in the decades that significant aircraft development took place. Nowadays advanced computer codes are used for specific problems regarding modern aircraft, and application to wind turbine rotors has also been performed occasionally. The engineers designing rotor blades for wind turbines still use methods based upon global principles developed in the beginning of the century. The question what to expect in terms of the type of methods to be applied in a design environment for the near future is addressed here. (EG) 14 refs.

  4. Automatic address validation and health record review to identify homeless Social Security disability applicants.

    Science.gov (United States)

    Erickson, Jennifer; Abbott, Kenneth; Susienka, Lucinda

    2018-06-01

    Homeless patients face a variety of obstacles in pursuit of basic social services. Acknowledging this, the Social Security Administration directs employees to prioritize homeless patients and handle their disability claims with special care. However, under existing manual processes for identification of homelessness, many homeless patients never receive the special service to which they are entitled. In this paper, we explore address validation and automatic annotation of electronic health records to improve identification of homeless patients. We developed a sample of claims containing medical records at the moment of arrival in a single office. Using address validation software, we reconciled patient addresses with public directories of homeless shelters, veterans' hospitals and clinics, and correctional facilities. Other tools annotated electronic health records. We trained random forests to identify homeless patients and validated each model with 10-fold cross validation. For our finished model, the area under the receiver operating characteristic curve was 0.942. The random forest improved sensitivity from 0.067 to 0.879 but decreased positive predictive value to 0.382. Presumed false positive classifications bore many characteristics of homelessness. Organizations could use these methods to prompt early collection of information necessary to avoid labor-intensive attempts to reestablish contact with homeless individuals. Annually, such methods could benefit tens of thousands of patients who are homeless, destitute, and in urgent need of assistance. We were able to identify many more homeless patients through a combination of automatic address validation and natural language processing of unstructured electronic health records. Copyright © 2018. Published by Elsevier Inc.
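
    A rough sketch of the modeling step described above: a random forest over address-validation flags and note-derived features, evaluated with 10-fold cross-validation. The features and labels are synthetic placeholders, not Social Security Administration data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import StratifiedKFold, cross_val_score

rng = np.random.default_rng(0)
n = 2000

# Hypothetical claim-level features: address-validation flags and counts of
# homelessness-related terms annotated in the health record text.
X = np.column_stack([
    rng.integers(0, 2, n),          # address matches a shelter directory
    rng.integers(0, 2, n),          # address matches a correctional facility
    rng.integers(0, 2, n),          # address failed postal validation
    rng.poisson(0.3, n),            # "homeless"/"shelter" mentions in notes
])
# Synthetic labels loosely tied to the features, purely for illustration.
y = (rng.random(n) < 0.05 + 0.4 * X[:, 0] + 0.2 * X[:, 3].clip(max=1)).astype(int)

rf = RandomForestClassifier(n_estimators=300, class_weight="balanced", random_state=0)
cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
auc = cross_val_score(rf, X, y, cv=cv, scoring="roc_auc")
print(f"10-fold ROC AUC: {auc.mean():.3f} +/- {auc.std():.3f}")
```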

  5. Application Of Quality Function Deployment (QFD) To Measure Performance

    International Nuclear Information System (INIS)

    Fazila Said; Mohd Amirul Shafiq Shafiee; Nurul Hasanah Mohd Abd Basir

    2014-01-01

    This study aims to measure service quality performance and identify critical service quality characteristics as perceived by the customers. An integrated survey of results conducted by seven service centers certified with the Quality Management System (QMS) in Nuclear Malaysia is analysed. This is followed by constructing the House of Quality (HoQ) and identifying other parameters for the Quality Function Deployment (QFD) matrix. The HoQ is a simple and attractive service innovation tool which can be used to directly show comprehensive information containing the voice of the customer (VOC), the technical response, the technical correlations and the relationship matrix. This study revealed that the information from the HoQ, together with further discussion of the planning part, can be used to assist management in knowing the overall detailed achievement of a service center and in recognizing solutions for unsatisfied customers through priority improvement activities to enhance customer satisfaction in the future. (author)

  6. Summary report on the FHWA LTBP Workshop to identify bridge substructure performance issues : March 4-6, 2010, in Orlando, FL.

    Science.gov (United States)

    2013-01-01

    The Long-Term Bridge Performance (LTBP) program was created to identify, collect, and analyze research-quality data on the most critical aspects of bridge performance. To complete a thorough investigation of bridge performance issues, the Federal ...

  7. Performance of alternative refrigerants for residential air-conditioning applications

    International Nuclear Information System (INIS)

    Park, Ki-Jung; Seo, Taebeom; Jung, Dongsoo

    2007-01-01

    In this study, performances of two pure hydrocarbons and seven mixtures composed of propylene, propane, HFC152a, and dimethylether were measured to substitute for HCFC22 in residential air-conditioners and heat pumps. Thermodynamic cycle analysis was carried out to determine the optimum compositions before testing and actual tests were performed in a breadboard-type laboratory heat pump/air-conditioner at the evaporation and condensation temperatures of 7 and 45 deg. C, respectively. Test results show that the coefficient of performance of these mixtures is up to 5.7% higher than that of HCFC22. While propane showed a 11.5% reduction in capacity, most of the fluids had a similar capacity to that of HCFC22. For these fluids, compressor-discharge temperatures were reduced by 11-17 deg. C. For all fluids tested, the amount of charge was reduced by up to 55% as compared to HCFC22. Overall, these fluids provide good performances with reasonable energy-savings without any environmental problem and thus can be used as long-term alternatives for residential air-conditioning and heat-pumping applications

  8. SMARTbot: A Behavioral Analysis Framework Augmented with Machine Learning to Identify Mobile Botnet Applications.

    Directory of Open Access Journals (Sweden)

    Ahmad Karim

    Full Text Available The botnet phenomenon in smartphones is evolving with the proliferation of mobile phone technologies, after leaving an imperative impact on personal computers. It refers to a network of computers, laptops, mobile devices or tablets which is remotely controlled by cybercriminals to initiate various distributed coordinated attacks, including spam emails, ad-click fraud, Bitcoin mining, Distributed Denial of Service (DDoS), disseminating other malware and much more. Like traditional PC-based botnets, mobile botnets have the same operational impact except that the target audience is particular to smartphone users. Therefore, it is important to uncover this security issue prior to its widespread adoption. We propose SMARTbot, a novel dynamic analysis framework augmented with machine learning techniques to automatically detect botnet binaries from a malicious corpus. SMARTbot is a component-based, off-device behavioral analysis framework which can generate a mobile botnet learning model by inducing the back-propagation method of artificial neural networks. Moreover, this framework can detect mobile botnet binaries with remarkable accuracy even in the case of obfuscated program code. The results conclude that a classifier model based on simple logistic regression outperforms other machine learning classifiers for botnet app detection, i.e., 99.49% accuracy is achieved. Further, from manual inspection of the botnet dataset we have extracted interesting trends in those applications. As an outcome of this research, a mobile botnet dataset is devised which will become a benchmark for future studies.

  9. SMARTbot: A Behavioral Analysis Framework Augmented with Machine Learning to Identify Mobile Botnet Applications.

    Science.gov (United States)

    Karim, Ahmad; Salleh, Rosli; Khan, Muhammad Khurram

    2016-01-01

    The botnet phenomenon in smartphones is evolving with the proliferation of mobile phone technologies, after leaving an imperative impact on personal computers. It refers to a network of computers, laptops, mobile devices or tablets which is remotely controlled by cybercriminals to initiate various distributed coordinated attacks, including spam emails, ad-click fraud, Bitcoin mining, Distributed Denial of Service (DDoS), disseminating other malware and much more. Like traditional PC-based botnets, mobile botnets have the same operational impact except that the target audience is particular to smartphone users. Therefore, it is important to uncover this security issue prior to its widespread adoption. We propose SMARTbot, a novel dynamic analysis framework augmented with machine learning techniques to automatically detect botnet binaries from a malicious corpus. SMARTbot is a component-based, off-device behavioral analysis framework which can generate a mobile botnet learning model by inducing Artificial Neural Networks' back-propagation method. Moreover, this framework can detect mobile botnet binaries with remarkable accuracy even in the case of obfuscated program code. The results conclude that a classifier model based on simple logistic regression outperforms other machine learning classifiers for botnet app detection, i.e., 99.49% accuracy is achieved. Further, from manual inspection of the botnet dataset we have extracted interesting trends in those applications. As an outcome of this research, a mobile botnet dataset is devised which will become a benchmark for future studies.

  10. SMARTbot: A Behavioral Analysis Framework Augmented with Machine Learning to Identify Mobile Botnet Applications

    Science.gov (United States)

    Karim, Ahmad; Salleh, Rosli; Khan, Muhammad Khurram

    2016-01-01

    The botnet phenomenon in smartphones is evolving with the proliferation of mobile phone technologies, after leaving an imperative impact on personal computers. It refers to a network of computers, laptops, mobile devices or tablets which is remotely controlled by cybercriminals to initiate various distributed coordinated attacks, including spam emails, ad-click fraud, Bitcoin mining, Distributed Denial of Service (DDoS), disseminating other malware and much more. Like traditional PC-based botnets, mobile botnets have the same operational impact except that the target audience is particular to smartphone users. Therefore, it is important to uncover this security issue prior to its widespread adoption. We propose SMARTbot, a novel dynamic analysis framework augmented with machine learning techniques to automatically detect botnet binaries from a malicious corpus. SMARTbot is a component-based, off-device behavioral analysis framework which can generate a mobile botnet learning model by inducing Artificial Neural Networks’ back-propagation method. Moreover, this framework can detect mobile botnet binaries with remarkable accuracy even in the case of obfuscated program code. The results conclude that a classifier model based on simple logistic regression outperforms other machine learning classifiers for botnet app detection, i.e., 99.49% accuracy is achieved. Further, from manual inspection of the botnet dataset we have extracted interesting trends in those applications. As an outcome of this research, a mobile botnet dataset is devised which will become a benchmark for future studies. PMID:26978523
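
    The classifier highlighted in the SMARTbot records above is a simple logistic regression over behavioral features extracted by dynamic analysis. The sketch below shows that final classification step only, with invented feature names and synthetic labels standing in for the real botnet corpus.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n = 1500

# Hypothetical behavioral features from off-device dynamic analysis
# (suspicious API calls, C&C-like connections, background SMS sends, ...).
X = np.column_stack([
    rng.poisson(2.0, n),    # outbound connections to rare hosts
    rng.poisson(0.5, n),    # background SMS send attempts
    rng.poisson(1.0, n),    # dynamically loaded code events
    rng.random(n),          # fraction of traffic that is encrypted
])
y = (0.8 * X[:, 0] + 1.5 * X[:, 1] + 1.0 * X[:, 2] + rng.normal(0, 1, n) > 4).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0, stratify=y)
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print(f"hold-out accuracy: {accuracy_score(y_te, clf.predict(X_te)):.2%}")
```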

  11. Application of transient analysis methodology to heat exchanger performance monitoring

    International Nuclear Information System (INIS)

    Rampall, I.; Soler, A.I.; Singh, K.P.; Scott, B.H.

    1994-01-01

    A transient testing technique is developed to evaluate the thermal performance of industrial scale heat exchangers. A Galerkin-based numerical method with a choice of spectral basis elements to account for spatial temperature variations in heat exchangers is developed to solve the transient heat exchanger model equations. Testing a heat exchanger in the transient state may be the only viable alternative where conventional steady state testing procedures are impossible or infeasible. For example, this methodology is particularly suited to the determination of fouling levels in component cooling water system heat exchangers in nuclear power plants. The heat load on these so-called component coolers under steady state conditions is too small to permit meaningful testing. An adequate heat load develops immediately after a reactor shutdown when the exchanger inlet temperatures are highly time-dependent. The application of the analysis methodology is illustrated herein with reference to an in-situ transient testing carried out at a nuclear power plant. The method, however, is applicable to any transient testing application

  12. Wireless sensor network performance metrics for building applications

    Energy Technology Data Exchange (ETDEWEB)

    Jang, W.S. (Department of Civil Engineering Yeungnam University 214-1 Dae-Dong, Gyeongsan-Si Gyeongsangbuk-Do 712-749 South Korea); Healy, W.M. [Building and Fire Research Laboratory, 100 Bureau Drive, Gaithersburg, MD 20899-8632 (United States)

    2010-06-15

    Metrics are investigated to help assess the performance of wireless sensors in buildings. Wireless sensor networks present tremendous opportunities for energy savings and improvement in occupant comfort in buildings by making data about conditions and equipment more readily available. A key barrier to their adoption, however, is the uncertainty among users regarding the reliability of the wireless links through building construction. Tests were carried out that examined three performance metrics as a function of transmitter-receiver separation distance, transmitter power level, and obstruction type. These tests demonstrated, via the packet delivery rate, a clear transition from reliable to unreliable communications at different separation distances. While the packet delivery rate is difficult to measure in actual applications, the received signal strength indication correlated well with the drop in packet delivery rate in the relatively noise-free environment used in these tests. The concept of an equivalent distance was introduced to translate the range of reliability in open field operation to that seen in a typical building, thereby providing wireless system designers a rough estimate of the necessary spacing between sensor nodes in building applications. It is anticipated that the availability of straightforward metrics on the range of wireless sensors in buildings will enable more widespread sensing in buildings for improved control and fault detection. (author)
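
    The two measurements discussed above, packet delivery rate and received signal strength, can be related as in the sketch below; the link model, distances, and RSSI values are synthetic stand-ins for the building tests.

```python
import numpy as np

rng = np.random.default_rng(0)
distances = np.arange(5, 55, 5)            # transmitter-receiver separation, m
packets_sent = 200

# Hypothetical trial: packet delivery drops sharply beyond some separation.
def trial(d):
    p_deliver = 1 / (1 + np.exp((d - 35) / 3))         # synthetic link model
    delivered = rng.binomial(packets_sent, p_deliver)
    rssi = -45 - 0.8 * d + rng.normal(0, 2)             # dBm, synthetic
    return delivered / packets_sent, rssi

pdr, rssi = np.array([trial(d) for d in distances]).T
for d, p, r in zip(distances, pdr, rssi):
    print(f"{d:3d} m  PDR={p:5.1%}  RSSI={r:6.1f} dBm")

# Correlation between RSSI and PDR, and the first distance where the link
# drops below a 90% delivery threshold (the reliable-to-unreliable transition).
print("corr(RSSI, PDR) =", round(float(np.corrcoef(rssi, pdr)[0, 1]), 2))
unreliable = distances[pdr < 0.9]
print("transition distance ~", unreliable[0] if unreliable.size else "none", "m")
```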

  13. Performance Evaluation Of Furrow Lengths And Field Application Techniques

    Directory of Open Access Journals (Sweden)

    Issaka

    2015-08-01

    Full Text Available Abstract: The study evaluated the performance of furrow lengths and field application techniques. The experiment was conducted on a 2000 m2 field at the Bontanga irrigation scheme. A Randomized Complete Block Design (RCBD) was used with three replicates. The replicates comprised Blocks A, B and C with furrow lengths of 100 m, 75 m and 50 m, respectively. Each replicate had surge, cut-off, cut-back and bunds treatments. Water was introduced into the furrows and the advance distances and times were measured. Results of the study showed that at Block A the surge technique recorded the highest advance rate of 1.26 min/m and an opportunity time of 11 min, whilst bunds recorded the lowest advance rate of 0.92 min/m. A significant difference (3.32, p≥0.05) occurred between treatment means of field application techniques at Block A (100 m). A significant difference (2.71, p≥0.05) was also recorded between treatment means. At Block B (75 m) there was a significant difference (2.71, p≥0.05) between treatment means. No significant difference (0.14, p≤0.05) was observed among the surge, cut-back and bunds techniques. There was a significant difference (2.60, p≥0.05) between treatment means but no significant difference between the cut-back and bunds techniques in Block C (50 m). Their performance was ranked in the order Surge > Cut-back > Cut-off > Bunds for furrow lengths 100 m, 75 m and 50 m, respectively.

  14. Application of isotopic and hydro-geochemical methods in identifying sources of mine inrushing water

    Institute of Scientific and Technical Information of China (English)

    Dou Huiping; Ma Zhiyuan; Cao Haidong; Liu Feng; Hu Weiwei; Li Ting

    2011-01-01

    Isotopic and hydro-geochemical surveys were carried out to identify the source of mine inrushing water at the #73003 face in the Laohutai Mine. Based on the analysis of isotopes and hydro-chemical features of surface water, groundwater from different levels and the inrushing water, a special relationship between water at the #73003 face and Cretaceous water has been found. The results show that the isotopic and hydro-chemical features of the inrushing water are completely different from those of the other groundwater bodies, except for the Cretaceous water. The isotopic and hydro-chemical characteristics of the Cretaceous water are similar to those of the inrushing water at the #73003 face, which aided in obtaining evidence for the possible source of the inrushing water at the #73003 face. The isotope calculations show that the inrushing water at the #73003 face is a mixture of Cretaceous water and Quaternary water; water from the Cretaceous conglomerate is the main source, accounting for 67% of the inrushing water, while the Quaternary water accounts for 33%. The conclusion is also supported by a study of inrushing-water channels and an active fault near the inrushing-water plot on the #73003 face.
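
    The 67%/33% source split quoted above follows from a two-endmember mixing balance: the isotopic (or conservative chemical) composition of the inrushing water is a weighted average of the two source compositions. A minimal sketch with hypothetical δ18O values in place of the measured data:

```python
# Two-endmember mixing: delta_mix = f * delta_cretaceous + (1 - f) * delta_quaternary
# Solve for f, the fraction contributed by the Cretaceous aquifer.
delta_cretaceous = -9.5   # per mil, hypothetical endmember value
delta_quaternary = -7.0   # per mil, hypothetical endmember value
delta_inrush = -8.7       # per mil, hypothetical inrushing-water sample

f_cretaceous = (delta_inrush - delta_quaternary) / (delta_cretaceous - delta_quaternary)
f_quaternary = 1.0 - f_cretaceous
print(f"Cretaceous fraction: {f_cretaceous:.0%}, Quaternary fraction: {f_quaternary:.0%}")
```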

  15. Development and Application of Diagnostic Test to Identify Students' Misconceptions of Quantum Physics

    International Nuclear Information System (INIS)

    Halim, A.A.; Meerah, T.S.; Lilia Halim

    2009-01-01

    Studies of students' misconceptions of quantum physics are rarely done, because the target audience is quite small. It is important to understand quantum physics concepts correctly, especially for science students. This study was undertaken to help students identify their misconceptions at an early stage. The aim of this study is to develop a diagnostic test which can assess students' misconceptions, and to use the findings for the benefit of quantum physics courses. A multiple-choice Quantum Physics Diagnostic Test (QPDT), involving concepts of light, the atomic model, particle-wave dualism, the wave function, and potential energy, was administered to 200 university students. The results show that many students use classical concepts to describe quantum phenomena. For example, students describe light only as a wave, an electron only as a particle, and the atomic structure as parallel to the solar system. To overcome these problems, it is suggested that lecturers spend more time explaining the basic definitions and use analogies in quantum physics teaching. (author)

  16. A simple method to identify areas of environmental risk due to manure application.

    Science.gov (United States)

    Flores, Héctor; Arumí, José Luis; Rivera, Diego; Lagos, L Octavio

    2012-06-01

    The management of swine manure is becoming an important environmental issue in Chile. One option for the final disposal of manure is to use it as a biofertilizer, but this practice could impact the surrounding environment. To assess the potential environmental impacts of the use of swine manure as a biofertilizer, we propose a method to identify zones of environmental risk through indices. The method considers two processes: nutrient runoff and solute leaching, and uses available information about soils, crops and management practices (irrigation, fertilization, and rotation). We applied the method to qualitatively assess the environmental risk associated with the use of swine manure as a biofertilizer in an 8,000-pig farm located in Central Chile. Results showed that the farm has a moderate environmental risk, but some specific locations have high environmental risks, especially those associated with impacts on areas surrounding water resources. This information could assist the definition of better farm-level management practices, as well as the preservation of riparian vegetation acting as buffer strips. The main advantage of our approach is that it combines qualitative and quantitative information, including particular situations or field features based on expert knowledge. The method is flexible, simple, and can be easily extended or adapted to other processes.

  17. Developing prediction equations and a mobile phone application to identify infants at risk of obesity.

    Science.gov (United States)

    Santorelli, Gillian; Petherick, Emily S; Wright, John; Wilson, Brad; Samiei, Haider; Cameron, Noël; Johnson, William

    2013-01-01

    Advancements in knowledge of obesity aetiology and in mobile phone technology have created the opportunity to develop an electronic tool to predict an infant's risk of childhood obesity. The study aims were to develop and validate equations for the prediction of childhood obesity and integrate them into a mobile phone application (App). Anthropometry and childhood obesity risk data were obtained for 1868 UK-born White or South Asian infants in the Born in Bradford cohort. Logistic regression was used to develop prediction equations (at 6 ± 1.5, 9 ± 1.5 and 12 ± 1.5 months) for risk of childhood obesity (BMI at 2 years >91st centile and weight gain from 0-2 years >1 centile band) incorporating sex, birth weight, and weight gain as predictors. The discrimination accuracy of the equations was assessed by the area under the curve (AUC); internal validity by comparing AUCs to those obtained in bootstrapped samples; and external validity by applying the equations to an external sample. An App was built to incorporate six final equations (two at each age, one of which included maternal BMI). The equations had good discrimination (AUCs 86-91%), with the addition of maternal BMI marginally improving prediction. The AUCs in the bootstrapped and external validation samples were similar to those obtained in the development sample. The App is user-friendly, requires a minimum amount of information, and provides a risk assessment of low, medium, or high accompanied by advice and website links to government recommendations. Prediction equations for risk of childhood obesity have been developed and incorporated into a novel App, thereby providing proof of concept that childhood obesity prediction research can be integrated with advancements in technology.
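
    A minimal illustrative sketch of the modelling step described above: a logistic regression using sex, birth weight and infant weight gain as predictors, scored by AUC. The variable names, coefficients and synthetic data are assumptions, not the Born in Bradford equations.

```python
# Illustrative sketch only: synthetic data, not the study's prediction equations.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2000
sex = rng.integers(0, 2, n)                      # 0 = female, 1 = male
birth_weight = rng.normal(3.3, 0.5, n)           # kg
weight_gain = rng.normal(4.5, 1.0, n)            # kg gained by 9 months
# Synthetic outcome: obesity at age 2, loosely driven by the predictors
logit = -8.0 + 0.3 * sex + 0.8 * birth_weight + 0.9 * weight_gain
obese = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([sex, birth_weight, weight_gain])
X_train, X_test, y_train, y_test = train_test_split(X, obese, random_state=0)

model = LogisticRegression().fit(X_train, y_train)
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"AUC on held-out data: {auc:.2f}")
```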

  18. Application of spatial methods to identify areas with lime requirement in eastern Croatia

    Science.gov (United States)

    Bogunović, Igor; Kisic, Ivica; Mesic, Milan; Zgorelec, Zeljka; Percin, Aleksandra; Pereira, Paulo

    2016-04-01

    With acid soils making up more than 50% of all agricultural land in Croatia, soil acidity is recognized as a big problem. Low soil pH leads to a series of negative phenomena in plant production, and therefore liming, recommended on the basis of soil analysis, is a compulsory measure for the reclamation of acid soils. The need for liming is often erroneously determined only on the basis of soil pH, because the determination of cation exchange capacity, hydrolytic acidity and base saturation is a major cost to producers. Therefore, in Croatia, as in some other countries, the amount of liming material needed to ameliorate acid soils is calculated from the hydrolytic acidity. For this research, several interpolation methods were tested to identify the best spatial predictor of hydrolytic acidity. The purpose of this study was to: test several interpolation methods to identify the best spatial predictor of hydrolytic acidity; and determine the possibility of using multivariate geostatistics to reduce the number of samples needed to determine the hydrolytic acidity, with the aim that the accuracy of the spatial distribution of the liming requirement is not significantly reduced. Soil pH (in KCl) and hydrolytic acidity (Y1) were determined in 1004 samples (from 0-30 cm) collected at random in agricultural fields near Orahovica in eastern Croatia. This study tested 14 univariate interpolation models (part of the ArcGIS software package) in order to provide the most accurate spatial map of hydrolytic acidity on the basis of: all samples (Y1 100%), and datasets with 15% (Y1 85%), 30% (Y1 70%) and 50% fewer samples (Y1 50%). In parallel with the univariate interpolation methods, the precision of the spatial distribution of Y1 was tested by the co-kriging method with exchangeable acidity (pH in KCl) as a covariate. The soils at the studied area had an average pH (KCl) of 4.81, while the average Y1 was 10.52 cmol(+) kg-1. These data suggest that liming is necessary

  19. Application of cluster analysis to geochemical compositional data for identifying ore-related geochemical anomalies

    Science.gov (United States)

    Zhou, Shuguang; Zhou, Kefa; Wang, Jinlin; Yang, Genfang; Wang, Shanshan

    2017-12-01

    Cluster analysis is a well-known technique that is used to analyze various types of data. In this study, cluster analysis is applied to geochemical data that describe 1444 stream sediment samples collected in northwestern Xinjiang with a sample spacing of approximately 2 km. Three algorithms (the hierarchical, k-means, and fuzzy c-means algorithms) and six data transformation methods (the z-score standardization, ZST; the logarithmic transformation, LT; the additive log-ratio transformation, ALT; the centered log-ratio transformation, CLT; the isometric log-ratio transformation, ILT; and no transformation, NT) are compared in terms of their effects on the cluster analysis of the geochemical compositional data. The study shows that, on the one hand, the ZST does not affect the results of column- or variable-based (R-type) cluster analysis, whereas the other methods, including the LT, the ALT, and the CLT, have substantial effects on the results. On the other hand, the results of the row- or observation-based (Q-type) cluster analysis obtained from the geochemical data after applying NT and the ZST are relatively poor. However, we derive some improved results from the geochemical data after applying the CLT, the ILT, the LT, and the ALT. Moreover, the k-means and fuzzy c-means clustering algorithms are more reliable than the hierarchical algorithm when they are used to cluster the geochemical data. We apply cluster analysis to the geochemical data to explore for Au deposits within the study area, and we obtain a good correlation between the results retrieved by combining the CLT or the ILT with the k-means or fuzzy c-means algorithms and the potential zones of Au mineralization. Therefore, we suggest that the combination of the CLT or the ILT with the k-means or fuzzy c-means algorithms is an effective tool to identify potential zones of mineralization from geochemical data.
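
    A minimal illustrative sketch of the workflow compared in this record: a centred log-ratio (CLR) transformation of compositional data followed by k-means (Q-type) clustering. The element count, sample values and cluster number are assumptions, not the Xinjiang dataset.

```python
# Illustrative sketch only: fake compositional data, not the stream sediment samples.
import numpy as np
from sklearn.cluster import KMeans

def clr(x):
    """Centred log-ratio transform of strictly positive compositional rows."""
    logx = np.log(x)
    return logx - logx.mean(axis=1, keepdims=True)

rng = np.random.default_rng(42)
# Fake concentrations (ppm) for 1444 samples and 5 elements, strictly positive
samples = rng.lognormal(mean=2.0, sigma=0.7, size=(1444, 5))

clustered = KMeans(n_clusters=4, n_init=10, random_state=0).fit(clr(samples))
print(np.bincount(clustered.labels_))  # number of samples per cluster
```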

  20. Applicability of Earth Observation for Identifying Small-Scale Mining Footprints in a Wet Tropical Region

    Directory of Open Access Journals (Sweden)

    Celso M. Isidro

    2017-09-01

    Full Text Available The unpredictable climate in wet tropical regions along with the spatial resolution limitations of some satellite imageries make detecting and mapping artisanal and small-scale mining (ASM challenging. The objective of this study was to test the utility of Pleiades and SPOT imagery with an object-based support vector machine (OB-SVM classifier for the multi-temporal remote sensing of ASM and other land cover including a large-scale mine in the Didipio catchment in the Philippines. Historical spatial data on location and type of ASM mines were collected from the field and were utilized as training data for the OB-SVM classifier. The classification had an overall accuracy between 87% and 89% for the three different images—Pleiades-1A for the 2013 and 2014 images and SPOT-6 for the 2016 image. The main land use features, particularly the Didipio large-scale mine, were well identified by the OB-SVM classifier, however there were greater commission errors for the mapping of small-scale mines. The lack of consistency in their shape and their small area relative to pixel sizes meant they were often not distinguished from other land clearance types (i.e., open land. To accurately estimate the total area of each land cover class, we calculated bias-adjusted surface areas based on misclassification values. The analysis showed an increase in small-scale mining areas from 91,000 m2—or 0.2% of the total catchment area—in March 2013 to 121,000 m2—or 0.3%—in May 2014, and then a decrease to 39,000 m2—or 0.1%—in January 2016.
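
    A minimal illustrative sketch of bias-adjusted area estimation from a confusion matrix, in the spirit of the adjustment mentioned above; the exact estimator used in the study is not stated, so this follows the standard error-matrix (stratified) approach with made-up class names, counts and mapped areas.

```python
# Illustrative sketch only: rows are mapped classes, columns are reference classes;
# the counts and mapped areas below are made up, not the Didipio results.
import numpy as np

classes = ["small-scale mine", "large-scale mine", "open land", "vegetation"]
confusion = np.array([           # reference samples per mapped class
    [30,  0, 12,  3],
    [ 1, 55,  2,  0],
    [ 8,  1, 90, 10],
    [ 0,  0,  5, 120],
], dtype=float)
mapped_area = np.array([0.12, 1.8, 6.5, 25.0])    # km2 mapped per class (illustrative)

row_totals = confusion.sum(axis=1, keepdims=True)
weights = mapped_area / mapped_area.sum()          # mapped-area proportions W_i
# Adjusted proportion of reference class k = sum_i W_i * n_ik / n_i.
adjusted_prop = (weights[:, None] * confusion / row_totals).sum(axis=0)
adjusted_area = adjusted_prop * mapped_area.sum()

for name, area in zip(classes, adjusted_area):
    print(f"{name}: {area:.2f} km2")
```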

  1. Performance of a transmutation advanced device for sustainable energy application

    Energy Technology Data Exchange (ETDEWEB)

    Garcia, C.; Rosales, J.; Garcia, L. [Instituto Superior de Tecnologias y Ciencias Aplicadas (INSTEC), La Habana (Cuba); Perez-Navarro, A.; Escriva, A. [Universidad Politecnica de Valencia, Valencia (Spain). Inst. de Ingenieria Energetica; Abanades, A. [Universidad Politecnica de Madrid (Spain). Grupo de Modelizacion de Sistemas Termoenergeticos

    2009-07-01

    Preliminary studies have been performed to design a device for nuclear waste transmutation and hydrogen generation based on a gas-cooled pebble bed accelerator driven system, TADSEA (transmutation advanced device for sustainable energy application). In previous studies we addressed the viability of an ADS transmutation device that uses as fuel wastes from the existing LWR power plants, encapsulated in graphite in the form of pebble beds and cooled by helium, which enables high temperatures, on the order of 1200 K, to facilitate hydrogen generation from water either by high-temperature electrolysis or by thermochemical cycles. To design this device several configurations were studied, including several reactor thicknesses, to achieve the desired parameters: the transmutation of nuclear waste and the production of 100 MW of thermal power. In this paper we present new studies on a deep-burn in-core fuel management strategy for LWR waste. We analyze the fuel cycle of the TADSEA device based on the driver and transmutation fuel that were proposed for the General Atomic design of a gas turbine-modular helium reactor. We compare the transmutation results of the three fuel management strategies, using driver fuel, transmutation fuel, and standard LWR spent fuel, and present several parameters that describe the neutron performance of the TADSEA core, such as the fuel and moderator temperature reactivity coefficients and the transmutation chain. (author)

  2. Performance of a transmutation advanced device for sustainable energy application

    International Nuclear Information System (INIS)

    Garcia, C.; Rosales, J.; Garcia, L.; Perez-Navarro, A.; Escriva, A.; Abanades, A.

    2009-01-01

    Preliminary studies have been performed to design a device for nuclear waste transmutation and hydrogen generation based on a gas-cooled pebble bed accelerator driven system, TADSEA (transmutation advanced device for sustainable energy application). In previous studies we addressed the viability of an ADS transmutation device that uses as fuel wastes from the existing LWR power plants, encapsulated in graphite in the form of pebble beds and cooled by helium, which enables high temperatures, on the order of 1200 K, to facilitate hydrogen generation from water either by high-temperature electrolysis or by thermochemical cycles. To design this device several configurations were studied, including several reactor thicknesses, to achieve the desired parameters: the transmutation of nuclear waste and the production of 100 MW of thermal power. In this paper we present new studies on a deep-burn in-core fuel management strategy for LWR waste. We analyze the fuel cycle of the TADSEA device based on the driver and transmutation fuel that were proposed for the General Atomic design of a gas turbine-modular helium reactor. We compare the transmutation results of the three fuel management strategies, using driver fuel, transmutation fuel, and standard LWR spent fuel, and present several parameters that describe the neutron performance of the TADSEA core, such as the fuel and moderator temperature reactivity coefficients and the transmutation chain. (author)

  3. High performance polypyrrole coating for corrosion protection and biocidal applications

    Science.gov (United States)

    Nautiyal, Amit; Qiao, Mingyu; Cook, Jonathan Edwin; Zhang, Xinyu; Huang, Tung-Shi

    2018-01-01

    Polypyrrole (PPy) coatings were electrochemically synthesized on carbon steel using sulfonic acids as dopants: p-toluene sulfonic acid (p-TSA), sulfuric acid (SA), (±) camphor sulfonic acid (CSA), sodium dodecyl sulfate (SDS), and sodium dodecylbenzene sulfonate (SDBS). The effect of the acidic dopants (p-TSA, SA, CSA) on the passivation of carbon steel was investigated by linear potentiodynamic measurements and compared with the morphology and corrosion protection performance of the coatings produced. The type of dopant used significantly affected the protection efficiency of the coating against chloride ion attack on the metal surface. The corrosion performance depends on the size and alignment of the dopant in the polymer backbone. Both p-TSA and SDBS have an extra benzene ring; these rings stack together to form a lamellar, sheet-like barrier to chloride ions, making them appropriate dopants for PPy coatings in suppressing corrosion to a significant degree. Further, adhesion performance was enhanced by adding a long-chain carboxylic acid (decanoic acid) directly to the monomer solution. In addition, the PPy coating doped with SDBS displayed excellent biocidal ability against Staphylococcus aureus. Polypyrrole coatings on carbon steel with the dual function of anti-corrosion and excellent biocidal properties show great potential for industrial anti-corrosion/antimicrobial applications.

  4. Development of comprehensive material performance database for nuclear applications

    International Nuclear Information System (INIS)

    Tsuji, Hirokazu; Yokoyama, Norio; Tsukada, Takashi; Nakajima, Hajime

    1993-01-01

    This paper introduces the present status of the comprehensive material performance database for nuclear applications, named the JAERI Material Performance Database (JMPD), and examples of its utilization. The JMPD has been developed since 1986 at JAERI with a view to utilizing various kinds of characteristics data of nuclear materials efficiently. A relational database management system, PLANNER, was employed, and supporting systems for data retrieval and output were expanded. In order to improve the user-friendliness of the retrieval system, menu-selection-type procedures have been developed in which knowledge of the system or the data structures is not required of end-users. As to utilization of the JMPD, two types of data analyses are described: (1) A series of statistical analyses was performed in order to estimate the design values of both the yield strength (Sy) and the tensile strength (Su) for aluminum alloys, which are widely used as structural materials for research reactors. (2) Statistical analyses were performed using the cyclic crack growth rate data for nuclear pressure vessel steels, and comparisons were made of the variability and/or reproducibility of the data obtained by ΔK-increasing and ΔK-constant type tests. (author)

  5. Development of a Web Application: Recording Learners' Mouse Trajectories and Retrieving their Study Logs to Identify the Occurrence of Hesitation in Solving Word-Reordering Problems

    Directory of Open Access Journals (Sweden)

    Mitsumasa Zushi

    2014-04-01

    Full Text Available Most computer marking systems evaluate the answers reached by learners without looking into the process by which those answers are produced. This is insufficient to ascertain learners' level of understanding, because correct answers may well include lucky hunches, that is, accidentally correct but unconfident answers. In order to differentiate these lucky answers from confident correct ones, we have developed a Web application that can record mouse trajectories during the performance of tasks. Mathematical analyses of these trajectories have revealed that some parameters of mouse movement can be useful indicators for identifying the occurrence of hesitation resulting from a lack of knowledge or confidence in solving problems.
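
    A minimal illustrative sketch of the kind of trajectory parameters such a system might compute from recorded (time, x, y) mouse samples; the record does not list the exact metrics, so the two indicators below (total pause time and path tortuosity) and the threshold values are assumptions.

```python
# Illustrative sketch only: the metrics and thresholds are assumptions, not the
# parameters used in the study.
import math

def hesitation_metrics(samples, pause_threshold=0.5):
    """samples: list of (t_seconds, x_px, y_px) in chronological order."""
    pause_time = 0.0
    path_length = 0.0
    for (t0, x0, y0), (t1, x1, y1) in zip(samples, samples[1:]):
        step = math.hypot(x1 - x0, y1 - y0)
        path_length += step
        if step < 2 and (t1 - t0) >= pause_threshold:   # barely moved for a while
            pause_time += t1 - t0
    straight = math.hypot(samples[-1][1] - samples[0][1],
                          samples[-1][2] - samples[0][2])
    tortuosity = path_length / straight if straight > 0 else float("inf")
    return {"pause_time_s": pause_time, "tortuosity": tortuosity}

trace = [(0.0, 10, 10), (0.3, 60, 40), (1.2, 61, 41), (1.5, 200, 180)]
print(hesitation_metrics(trace))
```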

  6. High Performance Computing - Power Application Programming Interface Specification.

    Energy Technology Data Exchange (ETDEWEB)

    Laros, James H.,; Kelly, Suzanne M.; Pedretti, Kevin; Grant, Ryan; Olivier, Stephen Lecler; Levenhagen, Michael J.; DeBonis, David

    2014-08-01

    Measuring and controlling the power and energy consumption of high performance computing systems by various components in the software stack is an active research area [13, 3, 5, 10, 4, 21, 19, 16, 7, 17, 20, 18, 11, 1, 6, 14, 12]. Implementations in lower level software layers are beginning to emerge in some production systems, which is very welcome. To be most effective, a portable interface to measurement and control features would significantly facilitate participation by all levels of the software stack. We present a proposal for a standard power Application Programming Interface (API) that endeavors to cover the entire software space, from generic hardware interfaces to the input from the computer facility manager.

  7. Performance evaluation of microturbine generation system for microgrid applications

    Energy Technology Data Exchange (ETDEWEB)

    Salam, A.A.; Mohamed, A.; Hannan, M.A.; Shareef, H.; Wanik, M.Z.C. [Kebangsaan Malaysia Univ., Selangor (Malaysia). Dept. of Electrical, Electronic and Systems Engineering, Faculty of Engineering and Built Environment

    2009-03-11

    A control system for microturbine generation system (MGS) units in microgrid applications was presented. A dynamic model of the microturbine and power electronics interface systems was used to determine converter control strategies for distributed generation operation. Back-to-back converters were used to interface the microturbine-based distributed generation system to the grid. The controllers were used to regulate the output voltage value at the reference bus voltage and the frequency of the whole grid. Reference values were predetermined in the control scheme in order to obtain the desired value of voltage amplitude and frequency. An investigation of system dynamics was conducted using simulations in both grid-connected and islanded modes. Results of the simulations demonstrated the ability of the MGS to improve electricity grid reliability. The model can be used to accurately simulate MGS dynamic performance for both grid- and islanded modes of operation. 10 refs., 17 figs.

  8. The minireactor Mirene for neutron-radiography: performances and applications

    International Nuclear Information System (INIS)

    Houelle, M.; Gerberon, J.M.

    1981-05-01

    The MIRENE neutron radiography mini-reactor is described. The core contains only one kilogram of enriched uranium in solution form. It works in pulsed operation. The neutron bursts produced are collimated into two beams which pass through the concrete shielding around the reactor block. The performance of the reactor and the results achieved since it went into service in 1977 are described. These concern various fields. In the nuclear field: examination of fast neutron reactor fissile pins, monitoring of neutron-absorbing screens employed to guarantee criticality safety in the transport and storage of nuclear fuel cycle materials, observation of irradiated oxide fuel pellets in order to determine the fuel equation of state for the fast neutron system, and examination of UO2 and water mixtures for criticality experiments. In the industrial field, MIRENE has a vast field of application. Two examples are given: monitoring of electrical insulation sealing, and visualization of the bonding of two high-density metal parts. Finally, an original application in agronomy has given very good results: the on-site monitoring of the root growth of maize plants [fr

  9. Performance Shaping Factors Assessments and Application to PHWR Outages

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Seung Woo

    2007-02-15

    Human reliability analysis (HRA) is closely related to the quality of PSA because human errors have been identified as major contributors to PSA results. According to the NRC's Office for Analysis and Evaluation of Operational Data (AEOD), 82% of the reactor trips and accidents during outages are caused by events related to human errors. There is, however, no single HRA method that is universally accepted. Furthermore, HRA during PHWR outages has not yet been performed anywhere in the world. HRA during PHWR outages is especially important since more manual operator actions are required in PHWRs. In this study, accident scenarios developed by HYU are used to perform a quantification of human error probability. The overall procedures of the standard HRA methodology are introduced, followed by the quantification of 10 selected human actions possible during PHWR outages based on the standard HRA methodology. For verification, the quantified values were compared with the values from the 'Generic CANDU Probabilistic Safety Assessment' and the values estimated by ASEP. The core damage frequency (CDF) was estimated at 3.35 x 10^-4, higher than the CDF estimated from AECL data. It was considered that the differences between the HEPs for OPAFW and OPECC3 make the CDF higher. Therefore, a complementary study re-estimating the HEPs for OPAFW and OPECC3 in detail is required to increase the quality of the HRA and PSA. Moreover, one of the difficulties in performing human reliability analysis is evaluating the performance shaping factors (PSFs), which represent the relevant characteristics and circumstances. To assess a specific human action more exactly, it is necessary to consider at the same time all of the PSFs that affect the human action. It also requires a comparison of the effects among PSFs to minimize the uncertainties which are usually caused by the subjective judgements of HRA analysts. To see the sensitivity, the performance shaping factors of each decision rule are changed, which resulted

  10. Performance Shaping Factors Assessments and Application to PHWR Outages

    International Nuclear Information System (INIS)

    Lee, Seung Woo

    2007-02-01

    Human reliability analysis (HRA) is closely related to the quality of PSA because human errors have been identified as major contributors to PSA results. According to the NRC's Office for Analysis and Evaluation of Operational Data (AEOD), 82% of the reactor trips and accidents during outages are caused by events related to human errors. There is, however, no single HRA method that is universally accepted. Furthermore, HRA during PHWR outages has not yet been performed anywhere in the world. HRA during PHWR outages is especially important since more manual operator actions are required in PHWRs. In this study, accident scenarios developed by HYU are used to perform a quantification of human error probability. The overall procedures of the standard HRA methodology are introduced, followed by the quantification of 10 selected human actions possible during PHWR outages based on the standard HRA methodology. For verification, the quantified values were compared with the values from the 'Generic CANDU Probabilistic Safety Assessment' and the values estimated by ASEP. The core damage frequency (CDF) was estimated at 3.35 x 10^-4, higher than the CDF estimated from AECL data. It was considered that the differences between the HEPs for OPAFW and OPECC3 make the CDF higher. Therefore, a complementary study re-estimating the HEPs for OPAFW and OPECC3 in detail is required to increase the quality of the HRA and PSA. Moreover, one of the difficulties in performing human reliability analysis is evaluating the performance shaping factors (PSFs), which represent the relevant characteristics and circumstances. To assess a specific human action more exactly, it is necessary to consider at the same time all of the PSFs that affect the human action. It also requires a comparison of the effects among PSFs to minimize the uncertainties which are usually caused by the subjective judgements of HRA analysts. To see the sensitivity, the performance shaping factors of each decision rule are changed, which resulted in changes of core damage

  11. The Short Physical Performance Battery is a discriminative tool for identifying patients with COPD at risk of disability

    Directory of Open Access Journals (Sweden)

    Bernabeu-Mora R

    2015-12-01

    Full Text Available Roberto Bernabeu-Mora,1,2 Françesc Medina-Mirapeix,2 Eduardo Llamazares-Herrán,3 Gloria García-Guillamón,2 Luz María Giménez-Giménez,2 Juan Miguel Sánchez-Nieto1,4 1Division of Pneumology, Hospital Morales Meseguer, 2Department of Physical Therapy, University of Murcia, Murcia, 3Department of Physical Therapy, Alcala University, Alcala de Henares, 4Department of Internal Medicine, University of Murcia, Murcia, Spain Background: Limited mobility is a risk factor for developing chronic obstructive pulmonary disease (COPD)-related disabilities. Little is known about the validity of the Short Physical Performance Battery (SPPB) for identifying mobility limitations in patients with COPD. Objective: To determine the clinical validity of the SPPB summary score and its three components (standing balance, 4-meter gait speed, and five-repetition sit-to-stand) for identifying mobility limitations in patients with COPD. Methods: This cross-sectional study included 137 patients with COPD, recruited from a hospital in Spain. Muscle strength tests and the SPPB were measured; then, patients were surveyed for self-reported mobility limitations. The validity of SPPB scores was analyzed by developing receiver operating characteristic (ROC) curves to analyze the sensitivity and specificity for identifying patients with mobility limitations; by examining group differences in SPPB scores across categories of mobility activities; and by correlating SPPB scores with strength tests. Results: Only the SPPB summary score and the five-repetition sit-to-stand component showed good discriminative capability; both showed areas under the ROC curves greater than 0.7. Patients with limitations had significantly lower SPPB scores than patients without limitations in nine different mobility activities. SPPB scores were moderately correlated with the quadriceps test (r > 0.40) and less correlated with the handgrip test (r < 0.30), which reinforced convergent and

  12. Identifying subassemblies by ultrasound to prevent fuel handling error in sodium fast reactors: First test performed in water

    International Nuclear Information System (INIS)

    Paumel, Kevin; Lhuillier, Christian

    2015-01-01

    Identifying subassemblies by ultrasound is a method that is being considered to prevent handling errors in sodium fast reactors. It is based on the reading of a code (aligned notches) engraved on the subassembly head by an emitting/receiving ultrasonic sensor. This reading is carried out in sodium with high temperature transducers. The resulting one-dimensional C-scan can be likened to a binary code expressing the subassembly type and number. The first test performed in water investigated two parameters: width and depth of the notches. The code remained legible for notches as thin as 1.6 mm wide. The impact of the depth seems minor in the range under investigation. (authors)

  13. Performance of a PET detector module utilizing an array of silicon photodiodes to identify the crystal of interaction

    International Nuclear Information System (INIS)

    Moses, W.W.; Derenzo, S.E.; Nutt, R.; Digby, W.M.; Williams, C.W.; Andreaco, M.

    1993-01-01

    The authors present initial performance results for a new multi-layer PET detector module consisting of an array of 3 mm square by 30 mm deep BGO crystals coupled on one end to a single photomultiplier tube and on the opposite end to an array of 3 mm square silicon photodiodes. The photomultiplier tube provides an accurate timing pulse and energy discrimination for all the crystals in the module, while the silicon photodiodes identify the crystal of interaction. When a single BGO crystal at +25 °C is excited with 511 keV photons, the authors measure a photodiode signal centered at 700 electrons (e-) with a noise of 375 e- FWHM. When a four crystal/photodiode module is excited with a collimated line source of 511 keV photons, the crystal of interaction is correctly identified 82% of the time. The misidentification rate can be greatly reduced, and an 8 x 8 crystal/photodiode module constructed, by using thicker depletion layer photodiodes or by cooling to 0 °C

  14. Expert system applications to nuclear plant for enhancement of productivity and performance

    International Nuclear Information System (INIS)

    Sun, B.; Cain, D.; Naser, J.; Colley, R.; Hirota, N.

    1988-01-01

    Expert systems, a major component of artificial intelligence (AI) technology, are computer software and hardware systems designed to capture and emulate the knowledge, reasoning, and judgment of humans and to store their expertise. Two parallel efforts are being performed at the Electric Power Research Institute (EPRI) to help the electric utility industry take advantage of expert system technology. The first effort is the development of expert system building tools which are tailored to electric utility industry applications. The second effort is the development of expert system application prototypes. These two efforts complement each other. The application development tests the tools and identifies additional tool capabilities which are required. The tool development helps define the applications which can be successfully developed. This paper summarizes a number of research projects being performed at EPRI in both areas: expert system building tool development and expert system applications to operations and maintenance. The AI technology demonstrated by these developments is being established as a credible technological tool for the electric utility industry. A challenge in transferring expert systems technology to the utility industry is to gain utility users' acceptance of this modern information technology. To achieve successful technology transfer, the technology developers need to (1) understand the problems which can be addressed successfully using AI technology, (2) involve users throughout the development and testing phases, and (3) demonstrate the benefits of the technology to the users

  15. Use of Persistent Identifiers to link Heterogeneous Data Systems in the Integrated Earth Data Applications (IEDA) Facility

    Science.gov (United States)

    Hsu, L.; Lehnert, K. A.; Carbotte, S. M.; Arko, R. A.; Ferrini, V.; O'hara, S. H.; Walker, J. D.

    2012-12-01

    for a GeoPass account ID, write a proposal to NSF and create a data plan using the IEDA Data Management Plan Tool. Having received the grant, the investigator then collects rock samples on a scientific cruise from dredges and registers the samples with IGSNs. The investigator then performs analytical geochemistry on the samples, and submits the full dataset to the Geochemical Resource Library for a dataset DOI. Finally, the investigator writes an article that is published in Science Direct. Knowing any of the following IDs: Investigator GeoPass ID, NSF Award Number, Cruise ID, Sample IGSNs, dataset DOI, or publication DOI, a user would be able to navigate to all samples, datasets, and publications in IEDA and external systems. Use of persistent identifiers to link heterogeneous data systems in IEDA thus increases access, discovery, and proper citation of hard-earned investigator datasets.

  16. Smart limbed vehicles for naval applications. Part I. Performance analysis

    Energy Technology Data Exchange (ETDEWEB)

    Weisberg, A.; Wood, L.

    1976-09-30

    Research work in smart, unmanned limbed vehicles for naval warfare applications performed during the latter part of FY76 and FY76T by the Special Studies Group of the LLL Physics Department for the Office of Naval Research is reported. Smart water-traversing limbed remotely navigated vehicles are interesting because: they are the only viable small vehicle usable in high sea states; they are small and work on the ocean surface, they are much harder to detect than any other conventional craft; they have no human pilot, are capable of high-g evasion, and will continue to operate after direct hits that would have crippled a human crew; they have the prospect of providing surface platforms possessing unprecedented speed and maneuverability; unlike manned information-gathering craft, they impose almost no penalty for missions in excess of 10 hours (no need to rotate shifts of crewmen, no food/lavatory requirements, etc.) and, in their ''loitering mode'', waterbugs could perhaps perform their missions for days to weeks; they are cheap enough to use for one-way missions; they are mass-producible; they are inherently reliable--almost impossible to sink and, in the event of in-use failure, the vehicle will not be destroyed; they maximally exploit continuing technological asymmetries between the U.S. and its potential opponents; and they are economically highly cost-effective for a wide spectrum of Navy missions. (TFD)

  17. Application of Lidar Data to the Performance Evaluations of ...

    Science.gov (United States)

    The Tropospheric Ozone (O3) Lidar Network (TOLNet) provides time/height O3 measurements from near the surface to the top of the troposphere, describing spatial-temporal distributions in high fidelity, which is uniquely useful for evaluating the temporal evolution of O3 profiles in air quality models. This presentation describes the application of the Lidar data to the performance evaluation of CMAQ-simulated O3 vertical profiles during the summer of 2014. Two-way coupled WRF-CMAQ simulations with 12 km and 4 km domains centered over Boulder, Colorado were performed during this time period. The analysis of the time series of observed and modeled O3 mixing ratios at different vertical layers indicates that the model frequently underestimated the observed values, and the underestimation was amplified in the middle model layers (~1 km above the ground). When the lightning strikes detected by the National Lightning Detection Network (NLDN) were analyzed along with the observed O3 time series, it was found that the daily maximum O3 mixing ratios correlated well with the lightning strikes in the vicinity of the Lidar station. The analysis of temporal vertical profiles of both observed and modeled O3 mixing ratios on episodic days suggests that the model resolutions (12 km and 4 km) do not make any significant difference for this analysis (at this specific location and simulation period), but high O3 levels in the middle layers were linked to lightning activity that occurred in t

  18. Bimanual Psychomotor Performance in Neurosurgical Resident Applicants Assessed Using NeuroTouch, a Virtual Reality Simulator.

    Science.gov (United States)

    Winkler-Schwartz, Alexander; Bajunaid, Khalid; Mullah, Muhammad A S; Marwa, Ibrahim; Alotaibi, Fahad E; Fares, Jawad; Baggiani, Marta; Azarnoush, Hamed; Zharni, Gmaan Al; Christie, Sommer; Sabbagh, Abdulrahman J; Werthner, Penny; Del Maestro, Rolando F

    Current selection methods for neurosurgical residents fail to include objective measurements of bimanual psychomotor performance. Advancements in computer-based simulation provide opportunities to assess cognitive and psychomotor skills in surgically naive populations during complex simulated neurosurgical tasks in risk-free environments. This pilot study was designed to answer 3 questions: (1) What are the differences in bimanual psychomotor performance among neurosurgical residency applicants using NeuroTouch? (2) Are there exceptionally skilled medical students in the applicant cohort? and (3) Is there an influence of previous surgical exposure on surgical performance? Participants were instructed to remove 3 simulated brain tumors with identical visual appearance, stiffness, and random bleeding points. Validated tier 1, tier 2, and advanced tier 2 metrics were used to assess bimanual psychomotor performance. Demographic data included weeks of neurosurgical elective and prior operative exposure. This pilot study was carried out at the McGill Neurosurgical Simulation Research and Training Center immediately following neurosurgical residency interviews at McGill University, Montreal, Canada. All 17 medical students interviewed were asked to participate, of which 16 agreed. Performances were clustered in definable top, middle, and bottom groups with significant differences for all metrics. Increased time spent playing music, increased applicant self-evaluated technical skills, high self-ratings of confidence, and increased skin closures statistically influenced performance on univariate analysis. A trend for both self-rated increased operating room confidence and increased weeks of neurosurgical exposure to increased blood loss was seen in multivariate analysis. Simulation technology identifies neurosurgical residency applicants with differing levels of technical ability. These results provide information for studies being developed for longitudinal studies on the

  19. Performance Monitoring Enterprise Applications with the BlackBird System

    Science.gov (United States)

    Germano, João P.; da Silva, Alberto Rodrigues; Silva, Fernando M.

    This work describes the BlackBird system, which is an analysis and monitoring service for data-intensive enterprise applications, without restrictions on the targeted architecture or employed technologies. A case study is presented for the monitoring of Billing applications from Vodafone Portugal. Monitoring systems are an essential tool for the effective management of Enterprise Applications and the attainment of the demanding service level agreements imposed on these applications. However, due to the increasing complexity and diversity of these applications, adequate monitoring systems are rarely available. The BlackBird monitoring system is able to interact with these applications through the different technologies employed by the Monitored Application, and is able to produce Metrics regarding the application service level goals. The BlackBird system can be specified using a set of pre-defined Configuration Objects, allowing it to be extensible and adaptable for applications with different architectures.

  20. 12 CFR 228.29 - Effect of CRA performance on applications.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 3 2010-01-01 2010-01-01 false Effect of CRA performance on applications. 228... account the record of performance under the CRA of: (1) Each applicant bank for the: (i) Establishment of... approval of application. A bank's record of performance may be the basis for denying or conditioning...

  1. 12 CFR 25.29 - Effect of CRA performance on applications.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 1 2010-01-01 2010-01-01 false Effect of CRA performance on applications. 25... takes into account the record of performance under the CRA of each applicant bank in considering an... application. A bank's record of performance may be the basis for denying or conditioning approval of an...

  2. Performance specification methodology: introduction and application to displays

    Science.gov (United States)

    Hopper, Darrel G.

    1998-09-01

    Acquisition reform is based on the notion that DoD must rely on the commercial marketplace insofar as possible rather than looking solely inward to a military marketplace to meet its needs. This reform forces a fundamental change in the way DoD conducts business, including a heavy reliance on private-sector models of change. The key to greater reliance on the commercial marketplace is the performance specification (PS). This paper introduces some PS concepts and a PS classification principle to help bring structure to the analysis of risk (cost, schedule, capability) in weapons system development and to the management of opportunities for affordable ownership (maintain/increase capability via technology insertion, reduce cost) in this new paradigm. The DoD shift toward commercial components is nowhere better exemplified than in displays. Displays are the quintessential dual-use technology and are used herein to exemplify these PS concepts and this principle. The advent of flat panel displays as a successful technology is setting off an epochal shift in cockpits and other military applications. Displays are installed in every DoD weapon system and are thus representative of a range of technologies where issues and concerns have been raised throughout industry and government regarding the increased DoD reliance on the commercial marketplace. Performance specifications require metrics: the overall metrics of 'information-thrust' with units of Mb/s and 'specific info-thrust' with units of Mb/s/kg are introduced to analyze the value of a display to the warfighter and its affordability to the taxpayer.

  3. Ergonomics as aid tool to identify and to analyze factors that can affect the operational performance of nuclear power plants

    International Nuclear Information System (INIS)

    Luquetti Santos, I.J.A.; Carvalho, P.V.R.

    2005-01-01

    The study of ergonomics has evolved around the world as one of the keys to understanding human behavior in interaction with complex systems such as nuclear power plants, and to achieving the best match between the system and its users in the context of the task to be performed. Increasing research efforts have yielded a considerable body of knowledge concerning the design of workstations, workplaces, control rooms, human-system interfaces, user-interface interaction and organizational design to prevent worker discomfort and illness and also to improve productivity, product quality, ease of use and safety. The work ergonomics analysis consists of gathering a series of observations in order to better understand the work done and to propose changes and improvements in the working conditions. The work ergonomics analysis involves both the correction of existing situations (safety, reliability and production problems) and the development of new work systems. Operator activity analysis provides a useful tool for the ergonomics approach, based on work ergonomics analysis. The operators are systematically observed in their real work environment (control room) or in simulators. The focus is on the description of the distributed regulation mechanisms (in the sense that operators work as a crew), both in nominal and degraded situations, observing how operators collectively regulate their work during an increase in workload or when confronted with situations where incidents or accidents occur. Audio and video recorders and field notes can be used to collect empirical data, conversations and interactions that occur naturally within the work environment. Our research develops an applied ergonomics methodology, based on field studies, that permits identifying and analyzing situations and factors that may affect the operational performance of nuclear power plants. Our contribution is related to the following technical topic: how best to learn from and share operational safety experience and manage changes during

  4. Performance of the modified Richmond Agitation Sedation Scale in identifying delirium  in older ED patients.

    Science.gov (United States)

    Grossmann, Florian F; Hasemann, Wolfgang; Kressig, Reto W; Bingisser, Roland; Nickel, Christian H

    2017-09-01

    Delirium in older emergency department (ED) patients is associated with severe negative patient outcomes and its detection is challenging for ED clinicians. ED clinicians need easy tools for delirium detection. We aimed to test the performance criteria of the modified Richmond Agitation Sedation Scale (mRASS) in identifying delirium in older ED patients. The mRASS was applied to a sample of consecutive ED patients aged 65 or older by specially trained nurses during an 11-day period in November 2015. Reference standard delirium diagnosis was based on Diagnostic and Statistical Manual of Mental Disorders (DSM-IV-TR) criteria, and was established by geriatricians. Performance criteria were computed. Analyses were repeated in the subsamples of patients with and without dementia. Of 285 patients, 20 (7.0%) had delirium and 41 (14.4%) had dementia. The sensitivity of an mRASS other than 0 to detect delirium was 0.70 (95% confidence interval, CI, 0.48; 0.85), specificity 0.93 (95% CI 0.90; 0.96), positive likelihood ratio 10.31 (95% CI 6.06; 17.51), negative likelihood ratio 0.32 (95% CI 0.16; 0.63). In the sub-sample of patients with dementia, sensitivity was 0.55 (95% CI 0.28; 0.79), specificity 0.83 (95% CI 0.66; 0.93), positive likelihood ratio 3.27 (95% CI 1.25; 8.59), negative likelihood ratio 0.55 (95% CI 0.28; 1.06). The sensitivity of the mRASS to detect delirium in older ED patients was low, especially in patients with dementia. Therefore its usefulness as a stand-alone screening tool is limited. Copyright © 2017 Elsevier Inc. All rights reserved.
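
    A minimal illustrative sketch of the performance criteria reported above (sensitivity, specificity and likelihood ratios) computed from a 2x2 screening table; the counts are made up, not the actual mRASS data.

```python
# Illustrative sketch only: the 2x2 counts below are invented for demonstration.
def screening_stats(tp, fn, fp, tn):
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    return {
        "sensitivity": sens,
        "specificity": spec,
        "LR+": sens / (1 - spec),
        "LR-": (1 - sens) / spec,
    }

print(screening_stats(tp=14, fn=6, fp=18, tn=247))
```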

  5. Identifying critical nitrogen application rate for maize yield and nitrate leaching in a Haplic Luvisol soil using the DNDC model.

    Science.gov (United States)

    Zhang, Yitao; Wang, Hongyuan; Liu, Shen; Lei, Qiuliang; Liu, Jian; He, Jianqiang; Zhai, Limei; Ren, Tianzhi; Liu, Hongbin

    2015-05-01

    Identification of the critical nitrogen (N) application rate can provide management support for ensuring grain yield and reducing the amount of nitrate leaching to ground water. A five-year (2008-2012) field lysimeter (1 m × 2 m × 1.2 m) experiment with three N treatments (0, 180 and 240 kg N ha(-1)) was conducted to quantify maize yields and the amount of nitrate leaching from a Haplic Luvisol soil in the North China Plain. The experimental data were used to calibrate and validate the process-based Denitrification-Decomposition (DNDC) model. After this, the model was used to simulate maize yield production and the amount of nitrate leaching under a series of N application rates and to identify the critical N application rate based on acceptable yield and amount of nitrate leaching for this cropping system. The results of model calibration and validation indicated that the model could correctly simulate maize yield and the amount of nitrate leaching, with satisfactory values of the RMSE-observation standard deviation ratio, model efficiency and determination coefficient. The model simulations confirmed the measurements that N application increased maize yield compared with the control, but the high N rate (240 kg N ha(-1)) did not produce more yield than the low one (120 kg N ha(-1)), and the amount of nitrate leaching increased with increasing N application rate. The simulation results suggested that the optimal N application rate was in a range between 150 and 240 kg ha(-1), which would keep the amount of nitrate leaching below 18.4 kg NO₃(-)-N ha(-1) and meanwhile maintain an acceptable maize yield above 9410 kg ha(-1). Furthermore, 180 kg N ha(-1) produced the highest yield (9837 kg ha(-1)) and a comparatively lower amount of nitrate leaching (10.0 kg NO₃(-)-N ha(-1)). This study provides a valuable reference for determining the optimal N application rate (or range) in other crop systems and regions in China. Copyright © 2015 Elsevier B.V. All rights reserved.
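
    A minimal illustrative sketch of how a critical N range can be read off simulated response curves: keep the rates where yield stays above the acceptable floor and leaching below the ceiling reported above. The response curves are made up, not DNDC output; only the two thresholds come from the record.

```python
# Illustrative sketch only: fabricated response curves, not DNDC simulations.
import numpy as np

n_rates = np.arange(0, 301, 10)                       # kg N / ha
yield_kg = 6000 + 4000 * (1 - np.exp(-n_rates / 90))  # diminishing-returns yield
leach_kg = 2 + 0.0002 * n_rates ** 2                  # leaching grows with N rate

ok = (yield_kg >= 9410) & (leach_kg <= 18.4)          # thresholds from the study
critical_range = n_rates[ok]
print(critical_range.min(), "-", critical_range.max(), "kg N/ha")
```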

  6. Clean Technology Application : Kupola Model Burner for Increasing the Performance of Spent Accu Recycle

    International Nuclear Information System (INIS)

    Titiresmi

    2000-01-01

    The recycling of used batteries to recover lead, carried out by small household and small-scale industries, has been identified as a source of air pollution, especially by heavy metals (Pb). This condition has an adverse impact on workers and society. The technological aspect is one of the causes: the process uses an open system, so a lot of energy, as well as dust, is released to the air without prior treatment. To overcome this condition, a closed system utilizing a cupola furnace is offered as an alternative clean technology application and as a way to increase the recovery performance, in order to achieve an effective and efficient result. (author)

  7. Rapid Prototyping of High Performance Signal Processing Applications

    Science.gov (United States)

    Sane, Nimish

    Advances in embedded systems for digital signal processing (DSP) are enabling many scientific projects and commercial applications. At the same time, these applications are key to driving advances in many important kinds of computing platforms. In this region of high performance DSP, rapid prototyping is critical for faster time-to-market (e.g., in the wireless communications industry) or time-to-science (e.g., in radio astronomy). DSP system architectures have evolved from being based on application specific integrated circuits (ASICs) to incorporate reconfigurable off-the-shelf field programmable gate arrays (FPGAs), the latest multiprocessors such as graphics processing units (GPUs), or heterogeneous combinations of such devices. We, thus, have a vast design space to explore based on performance trade-offs, and expanded by the multitude of possibilities for target platforms. In order to allow systematic design space exploration, and develop scalable and portable prototypes, model based design tools are increasingly used in design and implementation of embedded systems. These tools allow scalable high-level representations, model based semantics for analysis and optimization, and portable implementations that can be verified at higher levels of abstractions and targeted toward multiple platforms for implementation. The designer can experiment using such tools at an early stage in the design cycle, and employ the latest hardware at later stages. In this thesis, we have focused on dataflow-based approaches for rapid DSP system prototyping. This thesis contributes to various aspects of dataflow-based design flows and tools as follows: 1. We have introduced the concept of topological patterns, which exploits commonly found repetitive patterns in DSP algorithms to allow scalable, concise, and parameterizable representations of large scale dataflow graphs in high-level languages. We have shown how an underlying design tool can systematically exploit a high

  8. Application of Machine Learning Algorithms for the Query Performance Prediction

    Directory of Open Access Journals (Sweden)

    MILICEVIC, M.

    2015-08-01

    Full Text Available This paper analyzes the relationship between the system load/throughput and the query response time in a real Online transaction processing (OLTP system environment. Although OLTP systems are characterized by short transactions, which normally entail high availability and consistent short response times, the need for operational reporting may jeopardize these objectives. We suggest a new approach to performance prediction for concurrent database workloads, based on the system state vector which consists of 36 attributes. There is no bias to the importance of certain attributes, but the machine learning methods are used to determine which attributes better describe the behavior of the particular database server and how to model that system. During the learning phase, the system's profile is created using multiple reference queries, which are selected to represent frequent business processes. The possibility of the accurate response time prediction may be a foundation for automated decision-making for database (DB query scheduling. Possible applications of the proposed method include adaptive resource allocation, quality of service (QoS management or real-time dynamic query scheduling (e.g. estimation of the optimal moment for a complex query execution.
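
    A minimal illustrative sketch of the prediction task described above: learning a mapping from a 36-attribute system-state vector to query response time. The attributes, model choice (a random forest) and data are assumptions standing in for the paper's method.

```python
# Illustrative sketch only: synthetic state vectors, not the OLTP system data.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(1)
X = rng.random((5000, 36))                       # 36 state attributes per snapshot
# Synthetic response time: driven by a few "load" attributes plus noise
y = 0.2 + 3.0 * X[:, 0] + 2.0 * X[:, 1] * X[:, 2] + rng.normal(0, 0.1, 5000)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("MAE (s):", round(mean_absolute_error(y_te, model.predict(X_te)), 3))
```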

  9. High performance graphics processors for medical imaging applications

    International Nuclear Information System (INIS)

    Goldwasser, S.M.; Reynolds, R.A.; Talton, D.A.; Walsh, E.S.

    1989-01-01

    This paper describes a family of high-performance graphics processors with special hardware for interactive visualization of 3D human anatomy. The basic architecture expands to multiple parallel processors, each processor using pipelined arithmetic and logical units for high-speed rendering of Computed Tomography (CT), Magnetic Resonance (MR) and Positron Emission Tomography (PET) data. User-selectable display alternatives include multiple 2D axial slices, reformatted images in sagittal or coronal planes and shaded 3D views. Special facilities support applications requiring color-coded display of multiple datasets (such as radiation therapy planning), or dynamic replay of time-varying volumetric data (such as cine-CT or gated MR studies of the beating heart). The current implementation is a single processor system which generates reformatted images in true real time (30 frames per second), and shaded 3D views in a few seconds per frame. It accepts full scale medical datasets in their native formats, so that minimal preprocessing delay exists between data acquisition and display

  10. Performances of an atmospheric tritium sampler and its application

    International Nuclear Information System (INIS)

    Inoue, Yoshikazu; Kahn, B.; Carter, M.W.

    1983-01-01

    A sampling system for atmospheric tritium in the form of water vapor, hydrogen and hydrocarbons was designed and built. The air was passed first through a molecular sieve which adsorbed water vapor, then over a palladium catalyst which oxidized hydrogen and adsorbed the resulting water in situ, and finally over a hot Hopcalite catalyst which oxidized hydrocarbons, the resulting water being adsorbed on a following molecular sieve column. Three water samples were extracted from the adsorbers and their tritium contents were measured by liquid scintillation counting. The performance of this sampler was examined with respect to the retrieval of tritiated water from the molecular sieve, the oxidation of hydrogen on the palladium catalyst and the oxidation of methane on Hopcalite. The portable sampler was applied to analyze tritium in the duct air of a heavy-water-moderated research reactor. More than 99% of the total tritium was in vapor form. Trace amounts of tritiated hydrogen and hydrocarbons were also detected. This tritium sampler is applicable for detecting all forms of atmospheric tritium at levels as high as ten times the ambient level. (author)

  11. Mobile Applications' Impact on Student Performance and Satisfaction

    Science.gov (United States)

    Alqahtani, Maha; Mohammad, Heba

    2015-01-01

    Mobile applications are rapidly growing in importance and can be used for various purposes. They have been used widely in education. One of the educational purposes for which mobile applications can be used is learning the right way to read and pronounce the verses of the Holy Quran. There are many applications that translate the Quran into several…

  12. Funnel plot control limits to identify poorly performing healthcare providers when there is uncertainty in the value of the benchmark.

    Science.gov (United States)

    Manktelow, Bradley N; Seaton, Sarah E; Evans, T Alun

    2016-12-01

    There is an increasing use of statistical methods, such as funnel plots, to identify poorly performing healthcare providers. Funnel plots comprise the construction of control limits around a benchmark and providers with outcomes falling outside the limits are investigated as potential outliers. The benchmark is usually estimated from observed data but uncertainty in this estimate is usually ignored when constructing control limits. In this paper, the use of funnel plots in the presence of uncertainty in the value of the benchmark is reviewed for outcomes from a Binomial distribution. Two methods to derive the control limits are shown: (i) prediction intervals; (ii) tolerance intervals. Tolerance intervals formally include the uncertainty in the value of the benchmark while prediction intervals do not. The probability properties of 95% control limits derived using each method were investigated through hypothesised scenarios. Neither prediction intervals nor tolerance intervals produce funnel plot control limits that satisfy the nominal probability characteristics when there is uncertainty in the value of the benchmark. This is not necessarily to say that funnel plots have no role to play in healthcare, but that without the development of intervals satisfying the nominal probability characteristics they must be interpreted with care. © The Author(s) 2014.
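
    For reference, a minimal sketch of conventional funnel plot control limits for binomially distributed outcomes, with the benchmark treated as known; the abstract's point is precisely that this treatment ignores benchmark uncertainty. The benchmark value and provider volumes are illustrative assumptions:

```python
import numpy as np

def funnel_limits(p0, n, z=1.96):
    """95% control limits around benchmark proportion p0 for a provider with n cases."""
    se = np.sqrt(p0 * (1 - p0) / n)
    return max(0.0, p0 - z * se), min(1.0, p0 + z * se)

benchmark = 0.12                 # assumed benchmark event proportion
for n in (50, 200, 1000):        # provider volumes
    lo, hi = funnel_limits(benchmark, n)
    print(f"n={n:5d}: control limits {lo:.3f} - {hi:.3f}")
```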

  13. Imaging-Based Screen Identifies Laminin 411 as a Physiologically Relevant Niche Factor with Importance for i-Hep Applications

    Directory of Open Access Journals (Sweden)

    John Ong

    2018-03-01

    Full Text Available Summary: Use of hepatocytes derived from induced pluripotent stem cells (i-Heps is limited by their functional differences in comparison with primary cells. Extracellular niche factors likely play a critical role in bridging this gap. Using image-based characterization (high content analysis; HCA of freshly isolated hepatocytes from 17 human donors, we devised and validated an algorithm (Hepatocyte Likeness Index; HLI for comparing the hepatic properties of cells against a physiological gold standard. The HLI was then applied in a targeted screen of extracellular niche factors to identify substrates driving i-Heps closer to the standard. Laminin 411, the top hit, was validated in two additional induced pluripotent stem cell (iPSC lines, primary tissue, and an in vitro model of α1-antitrypsin deficiency. Cumulatively, these data provide a reference method to control and screen for i-Hep differentiation, identify Laminin 411 as a key niche protein, and underscore the importance of combining substrates, soluble factors, and HCA when developing iPSC applications. Rashid and colleagues demonstrate the utility of a high-throughput imaging platform for identification of physiologically relevant extracellular niche factors to advance i-Heps closer to their primary tissue counterparts. The extracellular matrix (ECM protein screen identified Laminin 411 as an important niche factor facilitating i-Hep-based disease modeling in vitro. Keywords: iPS hepatocytes, extracellular niche, image-based screening, disease modeling, laminin

  14. Performance of the BioPlex 2200 HIV Ag-Ab assay for identifying acute HIV infection.

    Science.gov (United States)

    Eshleman, Susan H; Piwowar-Manning, Estelle; Sivay, Mariya V; Debevec, Barbara; Veater, Stephanie; McKinstry, Laura; Bekker, Linda-Gail; Mannheimer, Sharon; Grant, Robert M; Chesney, Margaret A; Coates, Thomas J; Koblin, Beryl A; Fogel, Jessica M

    Assays that detect HIV antigen (Ag) and antibody (Ab) can be used to screen for HIV infection. To compare the performance of the BioPlex 2200 HIV Ag-Ab assay and two other Ag/Ab combination assays for detection of acute HIV infection. Samples were obtained from 24 individuals (18 from the US, 6 from South Africa); these individuals were classified as having acute infection based on the following criteria: positive qualitative RNA assay; two negative rapid tests; negative discriminatory test. The samples were tested with the BioPlex assay, the ARCHITECT HIV Ag/Ab Combo test, the Bio-Rad GS HIV Combo Ag-Ab EIA test, and a viral load assay. Twelve (50.0%) of 24 samples had RNA detected only (>40 to 13,476 copies/mL). Ten (43.5%) samples had reactive results with all three Ag/Ab assays, one sample was reactive with the ARCHITECT and Bio-Rad assays, and one sample was reactive with the Bio-Rad and BioPlex assays. The 11 samples that were reactive with the BioPlex assay had viral loads from 83,010 to >750,000 copies/mL; 9/11 samples were classified as Ag positive/Ab negative by the BioPlex assay. Detection of acute HIV infection was similar for the BioPlex assay and two other Ag/Ab assays. All three tests were less sensitive than a qualitative RNA assay and only detected HIV Ag when the viral load was high. The BioPlex assay detected acute infection in about half of the cases, and identified most of those infections as Ag positive/Ab negative. Copyright © 2018 Elsevier B.V. All rights reserved.

  15. Performance differences between male and female marines on standardized physical fitness tests and combat proxy tasks: identifying the gap.

    Science.gov (United States)

    Jameson, Jason; Pappa, Leon; McGuire, Brian; Kelly, Karen R

    2015-01-01

    For decades women have been restricted from direct assignment to certain military occupational specialties such as infantry. These restrictions can limit the advancement of women through the ranks of military leadership. Thus, the purpose of this effort was to identify those physical requirements most likely to serve as barriers for women wanting to enter closed combat arms positions, and to evaluate the quality of existing physical fitness tests as potential measures of assessment of combat readiness. Data were collected from 3 different sites within the US Marine Corps Training and Education Command. All participants (409 male, 379 female) were active-duty Marines who recently completed the Physical Fitness Test (PFT) and Combat Fitness Test (CFT). Participants completed 6 physical tasks: 120-mm tank loading drill, 155-mm artillery round carry, negotiating an obstacle course wall while wearing a fighting load (≈30 lb), pull-ups, deadlift, and clean and press. Overall, there was a high rate of successful completion on the combat proxy tasks (men, ≈80% to 100%; women, ≈70% to 100%), with the notable exceptions being the clean and press (men, 80%; women, 9%) and pull-ups (men, 16±4; women, 4±2). The PFT and CFT component tasks were also related, strongly in some cases, with performance on combat-related proxy tasks (Spearman's ρ typically ranged from 0.60 to 0.80). Estimates of fat-free mass and VO2max were also strongly related to an overall measure of combat readiness (Spearman's ρ=0.77 and ρ=0.56, respectively). The primary physical obstacle for women is upper body strength. However, some women could successfully complete all of the proxy tasks and thus are physically capable of meeting the demands of closed combat occupations. The fact that some female Marines could complete the most challenging upper body strength tasks suggests that these barriers are not inherent but may be due to a lack of training specificity.

  16. A Data Filter for Identifying Steady-State Operating Points in Engine Flight Data for Condition Monitoring Applications

    Science.gov (United States)

    Simon, Donald L.; Litt, Jonathan S.

    2010-01-01

    This paper presents an algorithm that automatically identifies and extracts steady-state engine operating points from engine flight data. It calculates the mean and standard deviation of select parameters contained in the incoming flight data stream. If the standard deviation of the data falls below defined constraints, the engine is assumed to be at a steady-state operating point, and the mean measurement data at that point are archived for subsequent condition monitoring purposes. The fundamental design of the steady-state data filter is completely generic and applicable for any dynamic system. Additional domain-specific logic constraints are applied to reduce data outliers and variance within the collected steady-state data. The filter is designed for on-line real-time processing of streaming data as opposed to post-processing of the data in batch mode. Results of applying the steady-state data filter to recorded helicopter engine flight data are shown, demonstrating its utility for engine condition monitoring applications.
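
    A minimal sketch of the core filtering idea described above: compute the mean and standard deviation of a parameter over a window and archive the mean when the standard deviation falls below a constraint. The window length, threshold, synthetic signal and the omission of the domain-specific outlier logic are all assumptions:

```python
import numpy as np

def steady_state_points(signal, window=50, std_limit=0.5):
    """Return (start index, mean) pairs for windows judged to be at steady state."""
    points = []
    for start in range(0, len(signal) - window + 1, window):
        chunk = signal[start:start + window]
        if np.std(chunk) < std_limit:          # low variance -> steady state
            points.append((start, float(np.mean(chunk))))
    return points

# Synthetic engine parameter: two steady plateaus joined by a short transient.
t = np.arange(600)
speed = np.where(t < 250, 90.0, 100.0) + 3.0 * np.exp(-((t - 250) / 20.0) ** 2)
speed = speed + np.random.default_rng(1).normal(0.0, 0.2, t.size)

print(steady_state_points(speed, window=50, std_limit=0.5))
```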

  17. An accurate analysis for guaranteed performance of multiprocessor streaming applications

    NARCIS (Netherlands)

    Poplavko, P.

    2008-01-01

    Already for more than a decade, consumer electronic devices have been available for entertainment, educational, or telecommunication tasks based on multimedia streaming applications, i.e., applications that process streams of audio and video samples in digital form. Multimedia capabilities are

  18. Content Validity and Psychometric Properties of the Nomination Scale for Identifying Football Talent (NSIFT: Application to Coaches, Parents and Players

    Directory of Open Access Journals (Sweden)

    Alejandro Prieto-Ayuso

    2017-01-01

    Full Text Available The identification of football talent is a critical issue both for clubs and the families of players. However, despite its importance in a sporting, economic and social sense, there appears to be a lack of instruments that can reliably measure talent performance. The aim of this study was to design and validate the Nomination Scale for Identifying Football Talent (NSIFT, with the aim of optimising the processes for identifying said talent. The scale was first validated through expert judgment, and then statistically, by means of an exploratory factor analysis (EFA), confirmatory factor analysis (CFA), internal reliability and convergent validity. The results reveal the presence of three factors in the scale’s factor matrix, with these results being confirmed by the CFA. The scale revealed suitable internal reliability and homogeneity indices. Convergent validity showed that it is teammates who are best able to identify football talent, followed by coaches and parents. It can be concluded that the NSIFT is suitable for use in the football world. Future studies should seek to confirm these results in different contexts by means of further CFAs.

  19. Evaluation of the feasibility and performance of early warning scores to identify patients at risk of adverse outcomes in a low-middle income country setting

    Science.gov (United States)

    Beane, Abi; De Silva, Ambepitiyawaduge Pubudu; De Silva, Nirodha; Sujeewa, Jayasingha A; Rathnayake, R M Dhanapala; Sigera, P Chathurani; Athapattu, Priyantha Lakmini; Mahipala, Palitha G; Rashan, Aasiyah; Munasinghe, Sithum Bandara; Jayasinghe, Kosala Saroj Amarasiri; Dondorp, Arjen M; Haniffa, Rashan

    2018-01-01

    Objective This study describes the availability of core parameters for Early Warning Scores (EWS), evaluates the ability of selected EWS to identify patients at risk of death or other adverse outcome and describes the burden of triggering that front-line staff would experience if implemented. Design Longitudinal observational cohort study. Setting District General Hospital Monaragala. Participants All adult (age >17 years) admitted patients. Main outcome measures Existing physiological parameters, adverse outcomes and survival status at hospital discharge were extracted daily from existing paper records for all patients over an 8-month period. Statistical analysis Discrimination for selected aggregate weighted track and trigger systems (AWTTS) was assessed by the area under the receiver operating characteristic (AUROC) curve. Performance of EWS are further evaluated at time points during admission and across diagnostic groups. The burden of trigger to correctly identify patients who died was evaluated using positive predictive value (PPV). Results Of the 16 386 patients included, 502 (3.06%) had one or more adverse outcomes (cardiac arrests, unplanned intensive care unit admissions and transfers). Availability of physiological parameters on admission ranged from 90.97% (95% CI 90.52% to 91.40%) for heart rate to 23.94% (95% CI 23.29% to 24.60%) for oxygen saturation. Ability to discriminate death on admission was less than 0.81 (AUROC) for all selected EWS. Performance of the best performing of the EWS varied depending on admission diagnosis, and was diminished at 24 hours prior to event. PPV was low (10.44%). Conclusion There is limited observation reporting in this setting. Indiscriminate application of EWS to all patients admitted to wards in this setting may result in an unnecessary burden of monitoring and may detract from clinician care of sicker patients. Physiological parameters in combination with diagnosis may have a place when applied on admission to
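
    A minimal sketch of the two headline statistics used in this evaluation, discrimination (AUROC) and the positive predictive value of a trigger threshold, computed on fabricated admission scores; the score distribution and the threshold of 5 are assumptions, not the study's data:

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(5)
n = 5000
died = rng.random(n) < 0.03                                   # in-hospital deaths
ews = np.clip(np.round(rng.normal(2 + 3 * died, 2)), 0, 20)   # admission EWS values

auroc = roc_auc_score(died, ews)          # discrimination of the score
triggered = ews >= 5                      # assumed trigger threshold
ppv = died[triggered].mean()              # how often a trigger corresponds to a death
print(f"AUROC = {auroc:.2f}, PPV of trigger = {ppv:.1%}")
```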

  20. An application of data mining in district heating substations for improving energy performance

    Science.gov (United States)

    Xue, Puning; Zhou, Zhigang; Chen, Xin; Liu, Jing

    2017-11-01

    The automatic meter reading system is capable of collecting and storing a huge amount of district heating (DH) data. However, the data obtained are rarely fully utilized. Data mining is a promising technology to discover potentially interesting knowledge from vast data. This paper applies data mining methods to analyse the massive data for improving the energy performance of a DH substation. The technical approach contains three steps: data selection, cluster analysis and association rule mining (ARM). Two heating seasons of data from a substation are used for a case study. Cluster analysis identifies six distinct heating patterns based on the primary heat of the substation. ARM reveals that secondary pressure difference and secondary flow rate have a strong correlation. Using the discovered rules, a fault occurring in a remote flow meter installed at the secondary network is detected accurately. The application demonstrates that data mining techniques can effectively extract potentially useful knowledge to better understand substation operation strategies and improve substation energy performance.
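
    A minimal sketch of the cluster-analysis step described above, grouping daily primary-heat profiles of a substation into distinct heating patterns with k-means; the number of clusters and the synthetic data are assumptions, and the association rule mining step is not shown:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)

# Synthetic stand-in: 300 days x 24 hourly primary-heat readings of one substation.
base_patterns = rng.random((6, 24)) * 100.0
days = np.vstack([p + rng.normal(0.0, 5.0, 24)
                  for p in base_patterns for _ in range(50)])

X = StandardScaler().fit_transform(days)
labels = KMeans(n_clusters=6, n_init=10, random_state=0).fit_predict(X)

for k in range(6):
    print(f"heating pattern {k}: {np.sum(labels == k)} days")
```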

  1. Veld condition and animal performance: application of an optimal ...

    African Journals Online (AJOL)

    Keywords: Diet choices; Food ingestion; Foraging models; Grassland conditions; Herbivores; animal performance; animal production; diet choice; digestion rate; grazing; needs; quality; veld condition score; condition; performance; grassland; production; foraging; model; food; ingestion; digestion; food preferences; food ...

  2. Performance scores in general practice: a comparison between the clinical versus medication-based approach to identify target populations.

    Directory of Open Access Journals (Sweden)

    Olivier Saint-Lary

    Full Text Available CONTEXT: From one country to another, the pay-for-performance mechanisms differ on one significant point: the identification of target populations, that is, populations which serve as a basis for calculating the indicators. The aim of this study was to compare clinical versus medication-based identification of populations of patients with diabetes and hypertension over the age of 50 (for men) or 60 (for women), and any consequences this may have on the calculation of P4P indicators. METHODS: A comparative, retrospective, observational study was carried out with clinical and prescription data from a panel of general practitioners (GPs), the Observatory of General Medicine (OMG), for the year 2007. Two indicators regarding the prescription of statins and aspirin in these populations were calculated. RESULTS: We analyzed data from 21,690 patients collected by 61 GPs via electronic medical files. Following the clinical-based approach, 2,278 patients were diabetic, 8,271 had hypertension and 1,539 had both, against respectively 1,730, 8,511 and 1,304 following the medication-based approach (% agreement = 96%, kappa = 0.69). The main reasons for these differences were: forgetting to code the morbidities in the clinical approach, not taking into account the population of patients who were given lifestyle and diet rules only, or taking into account patients for whom morbidities other than hypertension could justify the use of antihypertensive drugs in the medication-based approach. The mean (confidence interval) per doctor was 33.7% (31.5-35.9) for the statin indicator and 38.4% (35.4-41.4) for the aspirin indicator when the target populations were identified on the basis of clinical criteria, whereas they were 37.9% (36.3-39.4) and 43.8% (41.4-46.3) on the basis of treatment criteria. CONCLUSION: The two approaches yield very "similar" scores but these scores cover different realities and offer food for thought on the possible usage of these indicators in the
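
    A minimal sketch of the agreement statistics quoted above (percentage agreement and Cohen's kappa) for two ways of flagging the same patients; the binary vectors are fabricated placeholders chosen only to give numbers of a similar magnitude:

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score

rng = np.random.default_rng(3)
clinical = rng.integers(0, 2, 1000)              # 1 = patient in target population
medication = clinical.copy()
flip = rng.random(1000) < 0.04                   # ~4% of patients classified differently
medication[flip] = 1 - medication[flip]

agreement = np.mean(clinical == medication)
kappa = cohen_kappa_score(clinical, medication)
print(f"agreement = {agreement:.1%}, kappa = {kappa:.2f}")
```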

  3. Handbook of polymer nanocomposites processing, performance and application

    CERN Document Server

    Mohanty, Amar; Misra, Manjusri; Kar, Kamal K; Pandey, Jitendra; Rana, Sravendra; Takagi, Hitoshi; Nakagaito, Antonio; Kim, Hyun-Joong

    Volume A forms one volume of a Handbook about Polymer Nanocomposites. In some 20 chapters the preparation, architecture, characterisation, properties and application of polymer nanocomposites are discussed by experts in their respective fields.

  4. Evaluating Models of Human Performance: Safety-Critical Systems Applications

    Science.gov (United States)

    Feary, Michael S.

    2012-01-01

    This presentation is part of a panel discussion on Evaluating Models of Human Performance. The purpose of this panel is to discuss the increasing use of models in the world today and specifically focus on how to describe and evaluate models of human performance. My presentation will focus on discussions of generating distributions of performance, and the evaluation of different strategies for humans performing tasks with mixed-initiative (human-automation) systems. I will also discuss issues with how to provide human performance modeling data to support decisions on acceptability and tradeoffs in the design of safety-critical systems. I will conclude with challenges for the future.

  5. What's down below? Current and potential future applications of geophysical techniques to identify subsurface permafrost conditions (Invited)

    Science.gov (United States)

    Douglas, T. A.; Bjella, K.; Campbell, S. W.

    2013-12-01

    For infrastructure design, operations, and maintenance requirements in the North, the ability to accurately and efficiently detect the presence (or absence) of ground ice in permafrost terrains is a serious challenge. Ground ice features including ice wedges, thermokarst cave-ice, and segregation ice are present in a variety of spatial scales and patterns. Currently, most engineering applications use borehole logging and sampling to extrapolate conditions at the point scale. However, there is a high risk of over- or underestimating the presence of frozen or unfrozen features when relying on borehole information alone. In addition, boreholes are costly, especially for planning linear structures like roads or runways. Predicted climate warming will provide further challenges for infrastructure development and transportation operations where permafrost degradation occurs. Accurately identifying the subsurface character in permafrost terrains will allow engineers and planners to cost effectively create novel infrastructure designs to withstand the changing environment. There is thus a great need for a low-cost, rapidly deployable, spatially extensive means of 'measuring' subsurface conditions. Geophysical measurements, both terrestrial and airborne, have strong potential to revolutionize our way of mapping subsurface conditions. Many studies in continuous and discontinuous permafrost have used geophysical measurements to identify discrete features and repeatable patterns in the subsurface. The most common measurements include galvanic and capacitive coupled resistivity, ground penetrating radar, and multi frequency electromagnetic induction techniques. Each of these measurements has strengths, weaknesses, and limitations. By combining horizontal geophysical measurements, downhole geophysics, multispectral remote sensing images, LiDAR measurements, and soil and vegetation mapping we can start to assemble a holistic view of how surface conditions and standoff measurements

  6. Application of multiple tracers (SF6 and chloride) to identify the transport characteristics of contaminants at two separate contaminated sites

    Science.gov (United States)

    Lee, K. K.; Lee, S. S.; Kim, H. H.; Koh, E. H.; Kim, M. O.; Lee, K.; Kim, H. J.

    2016-12-01

    Multiple tracers were applied for source and pathway detection at two different sites. CO2 gas injected into the subsurface for a shallow-depth CO2 injection and leak test can be regarded as a potential contaminant source. Therefore, it is necessary to identify the migration pattern of CO2 gas. Also, at a DNAPL contaminated site, it is important to determine the characteristics of plume evolution from the source zone. In this study, multiple tracers (SF6 and chloride) were used to evaluate the applicability of volatile and non-volatile tracers and to identify the characteristics of contaminant transport at each site: the CO2 injection and leak test site and the DNAPL contaminated site. Firstly, at the CO2 test site, multiple tracers were used to perform single well push-drift-pull tracer tests at a total of 3 specific depth zones. As a result of the tests, the volatile and non-volatile tracers showed different mass recovery percentages. Most of the chloride mass was recovered, but less than half of the SF6 mass was recovered due to its volatility. This means that only gaseous SF6 leaked out to the unsaturated zone. However, the breakthrough curves of both tracers indicated similar peak times, effective porosity, and regional groundwater velocity. Also, at both contaminated sites, natural gradient tracer tests were performed with multiple tracers. With the results of the natural gradient tracer tests, it was possible to confirm the applicability of multiple tracers and to understand contaminant transport in highly heterogeneous aquifer systems through long-term monitoring of the tracers. Acknowledgement: financial support was provided by the "R&D Project on Environmental Management of Geologic CO2 Storage" from the KEITI (Project Number: 2014001810003) and the Korea Ministry of Environment as "The GAIA project" (2014000540010).

  7. Innovative new technologies to identify and treat traumatic brain injuries: crossover technologies and approaches between military and civilian applications.

    Science.gov (United States)

    Doarn, Charles R; McVeigh, Francis; Poropatich, Ronald

    2010-04-01

    Traumatic brain injury (TBI) has become the signature injury of Operation Iraqi Freedom and Operation Enduring Freedom. The use of improvised explosive devices has seen an exponential increase in both Iraq and Afghanistan. In conflicts prior to Iraq, survivability of such an injury was far lower. Today, technological improvements in trauma care have increased an injured warfighter's chance of survival. A reduction in severe TBI has been achieved but an increase in mild or moderate TBI has been observed. The consequences of this kind of injury can be both physical and mental and can often be hidden or even misdiagnosed. The U.S. Army is interested in pursuing technological solutions for early detection and treatment of TBI to reduce its lasting impact on the warfighter. Such technological breakthroughs have benefits beyond the military, as TBI is a highly probable event in nonmilitary settings as well. To gauge what technologies or methods are currently available, the U.S. Army's Telemedicine and Advanced Technology Research Center partnered with the American Telemedicine Association to organize and conduct a discipline-specific symposium entitled "Innovative New Technologies to Identify and Treat Traumatic Brain Injuries: Crossover Technologies and Approaches Between Military and Civilian Applications." This symposium was held in Palm Springs, CA, in September 2009. The purpose of the meeting was to provide a unique opportunity for leaders from disparate organizations involved in telemedicine and related activities to meet and explore opportunities to collaborate in new partnership models. The meeting was designed to help the Telemedicine and Advanced Technology Research Center identify opportunities to expand strategic operations and form new alliances. This report summarizes the symposium while raising awareness of the need for collaboration on better ways of adapting and adopting technologies to address this growing health issue.

  8. Assessing Confidence in Performance Assessments Using an Evidence Support Logic Methodology: An Application of Tesla

    International Nuclear Information System (INIS)

    Egan, M.; Paulley, A.; Lehman, L.; Lowe, J.; Rochette, E.; Baker, St.

    2009-01-01

    The assessment of uncertainties and their implications is a key requirement when undertaking performance assessment (PA) of radioactive waste facilities. Decisions based on the outcome of such assessments become translated into judgments about confidence in the information they provide. This confidence, in turn, depends on uncertainties in the underlying evidence. Even if there is a large amount of information supporting an assessment, it may be only partially relevant, incomplete or less than completely reliable. In order to develop a measure of confidence in the outcome, sources of uncertainty need to be identified and adequately addressed in the development of the PA, or in any overarching strategic decision-making processes. This paper describes a trial application of the technique of Evidence Support Logic (ESL), which has been designed for application in support of 'high stakes' decisions, where important aspects of system performance are subject to uncertainty. The aims of ESL are to identify the amount of uncertainty or conflict associated with evidence relating to a particular decision, and to guide understanding of how evidence combines to support confidence in judgments. Elicitation techniques are used to enable participants in the process to develop a logical hypothesis model that best represents the relationships between different sources of evidence to the proposition under examination. The aim is to identify key areas of subjectivity and other sources of potential bias in the use of evidence (whether for or against the proposition) to support judgments of confidence. Propagation algorithms are used to investigate the overall implications of the logic according to the strength of the underlying evidence and associated uncertainties. (authors)

  9. Impact of wireless communication on multimedia application performance

    Science.gov (United States)

    Brown, Kevin A.

    1999-01-01

    Multimedia applications and specifically voice and video conferencing tools are widely used in business communications, and are quickly being discovered by the consumer market as well. At the same time, wireless communication services such as PCS voice and cellular data are becoming very popular, leading to the desire to deploy multimedia applications in the wireless environment. Wireless links, however, exhibit several characteristics which are different from traditional wired networks. These include: dynamically changing bandwidth due to mobile host movement in and out of cells where bandwidth is shared, high rates of packet corruption and subsequent loss, and frequent and lengthy disconnections due to obstacles, fading, and movement between cells. In addition, these effects are short-lived and difficult to reproduce, leading to a lack of adequate testing and analysis for applications used in wireless environments.

  10. Performance Based Building and its application to Healthy Buildings

    NARCIS (Netherlands)

    Loomans, M.G.L.C.; Bluyssen, P.M.

    2005-01-01

    The European funded Project PeBBu, Performance Based Building, is a Thematic network under the Competitive and Sustainable Growth program, which started September 1st, 2001 and will run for 4 years. In one of the domains of PeBBu, the domain Indoor Environment, a state-of-the-art on the Performance

  11. Identifying New Strategies to Assess and Promote Online Health Communication and Social Media Outreach: An Application in Bullying Prevention.

    Science.gov (United States)

    Edgerton, Elizabeth; Reiney, Erin; Mueller, Siobhan; Reicherter, Barry; Curtis, Katherine; Waties, Stephanie; Limber, Susan P

    2016-05-01

    Every day in classrooms, playgrounds and school hallways, through text messages and mobile technology apps, children are bullied by other children. Conversations about this bullying-what it is, who is involved, and how to stop it-are taking place online. To fill a need for relevant, research-based materials on bullying, the U.S. Department of Health and Human Services' Health Resources and Services Administration worked with Widmeyer Communications to investigate the scope of media conversations about bullying and discover new strategies for promoting appropriate public health messages about bullying to intended audiences. Key components of the methodology included: analyzing common search terms and aligning social media content with terms used in searches rather than technical language; identifying influencers in social media spheres, cultivating relationships with them, and sharing their positive, relevant content; examining which digital formats are most popular for sharing and creating content across platforms; tracking and reporting on a wide variety of metrics (such as click-through and engagement rates and reach, resonance, relevance, and Klout scores) to understand conversations around bullying; and looking at online conversations and engaging participants using applicable resources and calls to action. A key finding included a significant gap between search terms and online content and has led to recommendations and comprehensive ideas for improving the reach and resonance of StopBullying.gov content and communications. © 2016 Society for Public Health Education.

  12. Performance Testing of Data Delivery Techniques for AJAX Applications

    NARCIS (Netherlands)

    Bozdag, E.; Mesbah, A.; Van Deursen, A.

    2008-01-01

    Preprint of paper published in: Journal of Web Engineering (Rinton Press), 8 (4), 2009. AJAX applications are designed to have high user interactivity and low user-perceived latency. Real-time dynamic web data such as news headlines, stock tickers, and auction updates need to be propagated to the

  13. Multiuser MIMO: Principle, Performance in Measured Channels and Applicable Service

    DEFF Research Database (Denmark)

    Bauch, Gerhard; Tejera, Pedro; Utschick, Wolfgang

    2007-01-01

    The exploitation of multiuser diversity and the application of multiple antennas at transmitter and receiver are considered to be key technologies for future highly bandwidthefficient wireless systems. We combine both ideas in a downlink multicarrier transmission scheme where multiple users compe...

  14. Application and Performance of 3D Printing in Nanobiomaterials

    Directory of Open Access Journals (Sweden)

    Wenyong Liu

    2013-01-01

    Full Text Available 3D printing (3DP) is becoming a research and development focus in nanobiomaterials as it can quickly and accurately fabricate any desired 3D tissue model, provided its size is appropriate. The different material powders (with different dimensional scales) and the printing strategies are the most direct factors influencing 3DP quality. With the development of nanotechnologies, 3DP is adopted more frequently for its rapidness in fabrication and precision in geometry. Fabrication at the micro/nanoscale may change the performance of biomaterials and devices because it can retain more anisotropy of biomaterials compared with traditional rapid prototyping techniques. Thus, the biosafety issue is of particular concern to many researchers and is investigated in terms of the performance and safety of biomaterials and devices. This paper investigates the performance of 3DP in the fabrication of nanobiomaterials and devices so as to partially explain how 3DP influences the performance and safety of nanobiomaterials.

  15. Development and application of high performance liquid shielding materials

    International Nuclear Information System (INIS)

    Miura, Toshimasa; Omata, Sadao; Otano, Naoteru; Hirao, Yoshihiro; Kanai, Yasuji

    1998-01-01

    Development of a liquid shielding material with good performance for neutrons and γ-rays was investigated. Lead, hydrogen and boron were selected as the elements of the shielding materials, which were made by the ultraviolet curing method. Shielding materials about 1 mm thick with good performance against neutrons and gamma rays were produced by mixing lead, a boron compound and a hydrogen-rich ultraviolet-curing monomer. The shielding performance was the same as that of concrete twice as thick. The activation was very small, about 1/10^6-1/10^8 of that of standard concrete. The weight and the external appearance did not change from room temperature to 100°C. The polyfunctional monomer had good thermal resistance. This shielding material was applied to a double bending cylindrical duct and an annulus ring duct. The results proved that the shielding materials developed had good performance. (S.Y.)

  16. Direct alcohol fuel cells materials, performance, durability and applications

    CERN Document Server

    Corti, Horacio R; Antolini, Ermete

    2014-01-01

    After an introductory overview of this emerging form of clean, portable energy, experts from industry and academia discuss the challenges in materials development, performance, and commercialization standing between DAFCs and widespread public use.

  17. Properties, performance, and applications of biofuel blends: a review

    Directory of Open Access Journals (Sweden)

    Husam Al-Mashhadani

    2017-08-01

    Full Text Available Biofuels such as ethanol and biodiesel derived from living plants or animal matter can be used directly in their neat forms or as blends with their fossil counterparts in internal combustion engines. Although the properties and performance of neat biofuels have been extensively reported, this is not the case for many blends. The purpose of this review is to analyze different forms of biofuel blends that are under research and development comparing their utility and performance in the two primary classes of engines, i.e., spark ignition and compression ignition engines. The fuel properties, performance and emission characteristics, advantages and disadvantages of various fuel blends are compared and discussed. The analysis reveals certain blends possess better overall fuel properties and yield better overall performance than the neat or fossil forms.

  18. Network DEA: an application to analysis of academic performance

    Science.gov (United States)

    Saniee Monfared, Mohammad Ali; Safi, Mahsa

    2013-05-01

    As governmental subsidies to universities have been declining in recent years, sustaining excellence in academic performance and more efficient use of resources have become important issues for university stakeholders. To assess academic performance and the utilization of resources, two important issues need to be addressed: a capable methodology and a set of good performance indicators, as we consider in this paper. We propose a set of performance indicators to enable efficiency analysis of academic activities and apply a novel network DEA structure to account for sub-functional efficiencies, such as teaching quality and research productivity, as well as the overall efficiency. We tested our approach on the efficiency analysis of academic colleges at Alzahra University in Iran.
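
    For context, a minimal sketch of a plain input-oriented CCR DEA efficiency score solved as a linear program; the paper's network DEA structure further decomposes efficiency into sub-functions (e.g. teaching, research), which is not shown here, and the college data below are fabricated:

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(inputs, outputs, o):
    """Input-oriented CCR efficiency of unit o; inputs is (m, n), outputs is (s, n)."""
    m, n = inputs.shape
    s = outputs.shape[0]
    c = np.r_[1.0, np.zeros(n)]                      # minimise theta
    A_in = np.hstack([-inputs[:, [o]], inputs])      # sum_j lam_j x_ij <= theta * x_io
    A_out = np.hstack([np.zeros((s, 1)), -outputs])  # sum_j lam_j y_rj >= y_ro
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(m), -outputs[:, o]],
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.x[0]

X = np.array([[20., 30., 25., 40.],      # input 1: academic staff
              [5.,  8.,  6.,  9.]])      # input 2: budget
Y = np.array([[100., 140., 110., 150.],  # output 1: graduates
              [30.,  60.,  35.,  55.]])  # output 2: publications
for j in range(X.shape[1]):
    print(f"college {j}: efficiency = {ccr_efficiency(X, Y, j):.3f}")
```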

  19. Performance Evaluation of Polybenzimidazole for Potential Aerospace Applications

    NARCIS (Netherlands)

    Iqbal, H.M.S.

    2014-01-01

    With the increasing use of polymer-based composite materials, there is an increasing demand for polymeric resins with a high glass transition temperature, high thermal stability and excellent mechanical properties at high temperature. Polybenzimidazole (PBI) is a recently emerged high performance

  20. Application of objective clinical human reliability analysis (OCHRA) in assessment of technical performance in laparoscopic rectal cancer surgery.

    Science.gov (United States)

    Foster, J D; Miskovic, D; Allison, A S; Conti, J A; Ockrim, J; Cooper, E J; Hanna, G B; Francis, N K

    2016-06-01

    Laparoscopic rectal resection is technically challenging, with outcomes dependent upon technical performance. No robust objective assessment tool exists for laparoscopic rectal resection surgery. This study aimed to investigate the application of the objective clinical human reliability analysis (OCHRA) technique for assessing technical performance of laparoscopic rectal surgery and to explore the validity and reliability of this technique. Laparoscopic rectal cancer resection operations were described in the format of a hierarchical task analysis. Potential technical errors were defined. The OCHRA technique was used to identify technical errors enacted in videos of twenty consecutive laparoscopic rectal cancer resection operations from a single site. The procedural task, spatial location, and circumstances of all identified errors were logged. Clinical validity was assessed through correlation with clinical outcomes; reliability was assessed by test-retest. A total of 335 execution errors were identified, with a median of 15 per operation. More errors were observed during pelvic tasks compared with abdominal tasks (p technical performance of laparoscopic rectal surgery.

  1. Performances of Magnetic Fluid Seal and Application to Turbopumps

    OpenAIRE

    北洞, 貴也; 黒川, 淳一; 宮副, 雄貴; 林, 正悦

    1994-01-01

    A magnetic fluid shaft seal can achieve zero-leakage and operate stably against shaft vibration, but the sealing pressure is very low. In order to improve the pressure performance of a magnetic fluid seal and apply it to a turbopump, the seal pressure characteristics are studied theoretically and experimentally. The Poisson equation for magnetic vector potential is solved by FEM, and the seal performances are determined by use of the Bernoulli equation. The validity of the theory is confirmed...

  2. Performance prediction for families of data-intensive software applications

    NARCIS (Netherlands)

    Vasenev, A.

    2018-01-01

    Partial Networking, as a mechanism for moving-to-sleep and waking-up embedded systems, is beneficial for saving energy within a vehicle (or within other complex distributed systems). Even though a number of models exist which identify benefits of partial networking, they often address rather

  3. High Performance Computing for Solving Fractional Differential Equations with Applications

    OpenAIRE

    Zhang, Wei

    2014-01-01

    Fractional calculus is the generalization of integer-order calculus to rational order. This subject has at least three hundred years of history. However, it was traditionally regarded as a pure mathematical field and lacked real world applications for a very long time. In recent decades, fractional calculus has re-attracted the attention of scientists and engineers. For example, many researchers have found that fractional calculus is a useful tool for describing hereditary materials and p...

  4. Services Acquisition in the Department of Defense: Analysis of Operational and Performance Data to Identify Drivers of Success

    Science.gov (United States)

    2015-03-24

    improving the disclosure of CPARS program office audit results (Black et al., 2014, pp. 44–49). Recommendations: Based on our conclusions, we identified...

  5. Patient’s Cross-border Mobility Directive: Application, Performance and Perceptions Two Years after Transposition

    Directory of Open Access Journals (Sweden)

    Riedel Rafał

    2016-10-01

    Full Text Available This paper seeks to analyse the directive on the application of patients' rights in cross-border healthcare. Two years after its transposition, it is time for a first evaluation of its application, performance and perception. The analysis consists of three major elements: a reconstruction of the legal scope and subject matter of the new legislation, the conclusions of the evaluative reports monitoring its implementation and performance, and the public opinion polls revealing EU citizens' perception of its details. These three components combined deliver a picture of the state of play of pan-European cross-border patient mobility. The bottom-line conclusions negatively verify the supposition, present in some earlier literature on patients' cross-border mobility, that the directive has a transformative potential leading towards the creation of a truly competitive pan-European medical market. After two years of its operation, no increase in patients' mobility across EU internal borders has been observed. As regards speculation about the future, only some weak signs have been identified, and they may result in intensified cross-border mobility for healthcare.

  6. An evaluation of the applicability of the seismic refraction method in identifying shallow archaeological features: a case study at an archaeological site

    Science.gov (United States)

    Jahangardi, Morteza; Hafezi Moghaddas, Naser; Keivan Hosseini, Sayyed; Garazhian, Omran

    2015-04-01

    We applied the seismic refraction method at an archaeological site, Tepe Damghani, located in Sabzevar, NE Iran, in order to determine structures of archaeological interest. This pre-historical site has special conditions with respect to geographical location and geomorphological setting: it is an urban archaeological site, and in recent years it has been used as an agricultural field. In spring and summer of 2012, the third season of archaeological excavation was carried out. Test trenches of excavations at this site revealed that cultural layers were often adversely disturbed by human activities such as farming and road construction in recent years. Conditions of the archaeological cultural layers in the southern and eastern parts of the Tepe are slightly better; for instance, in the 3×3 m² test trench 1S03, the third test trench excavated in the southern part of the Tepe, an in situ adobe architectural structure was discovered that likely belongs to the cultural features of a complex with 5 graves. After conclusion of the third season of archaeological excavation, all of the test trenches were backfilled with the soil that had been excavated from them. The seismic refraction method was applied with 12 channels of P geophones in three lines with a geophone interval of 0.5 m and a 1.5 m distance between profiles over test trench 1S03. The goal of this operation was to evaluate the applicability of the seismic method in the identification of archaeological features, especially adobe wall structures. Processing of the seismic data was done with the seismic software SeisImager. Results were presented in the form of a seismic section for every profile, but the adobe wall structures could hardly be identified. This could be because the adobe wall had been built with the same materials as the natural surrounding earth. Thus, there is low contrast, which has an adverse effect on seismic processing and on identifying archaeological features. Hence the result could be that application of

  7. Performance Analysis of a Bunch and Track Identifier Prototype (BTI) for the CMS Barrel Muon Drift Chambers; Estudio de las Prestaciones de un Prototipo de Bunch and Track Identifier (BTI) para las Camaras de Deriva de CMS

    Energy Technology Data Exchange (ETDEWEB)

    Puerta Pelayo, J.

    2001-07-01

    This note contains a short description of the first step in the first-level trigger applied to the barrel muon drift chambers of CMS: the Bunch and Track Identifier (BTI). The test beam results obtained with a BTI prototype have also been analysed. BTI performance for different incidence angles and in the presence of an external magnetic field has been tested, as well as the BTI capability as a trigger device and track reconstructor. (Author) 30 refs.

  8. Application Performance Management Impact On Organizational Performance Local Company Studies In West Java - Indonesia

    Directory of Open Access Journals (Sweden)

    Teni Listiani

    2015-02-01

    Full Text Available Abstract This study aimed to analyze the effect of performance management on the performance of regional companies. Performance management is measured through three dimensions: performance planning, performance appraisal, and feedback. Organizational performance is measured through four dimensions: the financial perspective, the customer perspective, internal business processes, and the learning and growth perspective. The unit of analysis comprised 30 regional companies: 23 water utilities (PDAM), 6 PD Market, and 1 PD Health. The unit of observation was top-, middle-, and lower-level managers, totaling 360 people. To determine the influence of the variables studied, a variance-based Structural Equation Model (SEM) with Partial Least Squares (PLS) was used. The results showed that performance management affects organizational performance in the regional companies, but the effect is not very large.

  9. A panel of microsatellites to individually identify leopards and its application to leopard monitoring in human dominated landscapes

    Directory of Open Access Journals (Sweden)

    Selvaraj Velu

    2009-12-01

    Full Text Available Abstract Background Leopards are the most widely distributed of the large cats, ranging from Africa to the Russian Far East. Because of habitat fragmentation, high human population densities and the inherent adaptability of this species, they now occupy landscapes close to human settlements. As a result, they are the most common species involved in human wildlife conflict in India, necessitating their monitoring. However, their elusive nature makes such monitoring difficult. Recent advances in DNA methods along with non-invasive sampling techniques can be used to monitor populations and individuals across large landscapes including human dominated ones. In this paper, we describe a DNA-based method for leopard individual identification where we used fecal DNA samples to obtain genetic material. Further, we apply our methods to non-invasive samples collected in a human-dominated landscape to estimate the minimum number of leopards in this human-leopard conflict area in Western India. Results In this study, 25 of the 29 tested cross-specific microsatellite markers showed positive amplification in 37 wild-caught leopards. These loci revealed varied levels of polymorphism (4-12 alleles) and heterozygosity (0.05-0.79). Combining data on amplification success (including non-invasive samples) and locus specific polymorphisms, we showed that eight loci provide a sibling probability of identity of 0.0005, suggesting that this panel can be used to discriminate individuals in the wild. When this microsatellite panel was applied to fecal samples collected from a human-dominated landscape, we identified 7 individuals, with a sibling probability of identity of 0.001. Amplification success of field-collected scats was up to 72%, and genotype error ranged from 0-7.4%. Conclusion Our results demonstrated that the selected panel of eight microsatellite loci can conclusively identify leopards from various kinds of biological samples. Our methods can be used to
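
    A minimal sketch of how a multi-locus sibling probability of identity, the statistic quoted above, is typically obtained from per-locus allele frequencies using the commonly used P(ID)sib expression; the eight loci and their allele frequencies below are fabricated, not the study's markers:

```python
import numpy as np

def pid_sib_locus(freqs):
    """Sibling probability of identity at one locus from its allele frequencies."""
    p = np.asarray(freqs, dtype=float)
    p = p / p.sum()
    s2, s4 = np.sum(p ** 2), np.sum(p ** 4)
    return 0.25 + 0.5 * s2 + 0.5 * s2 ** 2 - 0.25 * s4

# Eight hypothetical microsatellite loci with 4-12 alleles each.
rng = np.random.default_rng(7)
loci = [rng.dirichlet(np.ones(k)) for k in (4, 6, 7, 8, 9, 10, 11, 12)]

# The multi-locus value is the product over independent loci.
pid_multilocus = np.prod([pid_sib_locus(f) for f in loci])
print(f"multi-locus P(ID)sib = {pid_multilocus:.2e}")
```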

  10. A Spreadsheet-Based Visualized Mindtool for Improving Students' Learning Performance in Identifying Relationships between Numerical Variables

    Science.gov (United States)

    Lai, Chiu-Lin; Hwang, Gwo-Jen

    2015-01-01

    In this study, a spreadsheet-based visualized Mindtool was developed for improving students' learning performance when finding relationships between numerical variables by engaging them in reasoning and decision-making activities. To evaluate the effectiveness of the proposed approach, an experiment was conducted on the "phenomena of climate…

  11. The application of DEA model in enterprise environmental performance auditing

    Science.gov (United States)

    Li, F.; Zhu, L. Y.; Zhang, J. D.; Liu, C. Y.; Qu, Z. G.; Xiao, M. S.

    2017-01-01

    As a part of society, enterprises have an inescapable responsibility for environmental protection and governance. This article discusses the feasibility and necessity of enterprise environmental performance auditing and uses a DEA model to calculate the environmental performance of Haier as an example. Most of the reference data are selected and sorted from Haier's environmental reports published in 2008, 2009, 2011 and 2015, and some of the data come from published articles and fieldwork. All the results are calculated with the DEAP software and have high credibility. The analysis results of this article can give corporate managers an idea of how to use environmental performance auditing to adjust their corporate environmental investment quotas and change their companies' environmental strategies.

  12. Orion: a web-based application designed to monitor resident and fellow performance on-call.

    Science.gov (United States)

    Itri, Jason N; Kim, Woojin; Scanlon, Mary H

    2011-10-01

    Radiology residency and fellowship training provides a unique opportunity to evaluate trainee performance and determine the impact of various educational interventions. We have developed a simple software application (Orion) using open-source tools to facilitate the identification and monitoring of resident and fellow discrepancies in on-call preliminary reports. Over a 6-month period, 19,200 on-call studies were interpreted by 20 radiology residents, and 13,953 on-call studies were interpreted by 25 board-certified radiology fellows representing eight subspecialties. Using standard review macros during faculty interpretation, each of these reports was classified as "agreement", "minor discrepancy", and "major discrepancy" based on the potential to impact patient management or outcome. Major discrepancy rates were used to establish benchmarks for resident and fellow performance by year of training, modality, and subspecialty, and to identify residents and fellows demonstrating a significantly higher major discrepancy rate compared with their classmates. Trends in discrepancies were used to identify subspecialty-specific areas of increased major discrepancy rates in an effort to tailor the didactic and case-based curriculum. A series of missed-case conferences were developed based on trends in discrepancies, and the impact of these conferences is currently being evaluated. Orion is a powerful information technology tool that can be used by residency program directors, fellowship programs directors, residents, and fellows to improve radiology education and training.
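
    A minimal sketch of the kind of discrepancy-rate monitoring such a tool supports: counting major discrepancies per trainee and flagging rates significantly above the group benchmark. The data, the threshold and the one-sided binomial test are assumptions, not part of Orion itself:

```python
from collections import defaultdict
from scipy.stats import binomtest

# (trainee, classification) pairs as the faculty review macros would yield.
reports = ([("res_a", "agreement")] * 465 + [("res_a", "major discrepancy")] * 35
           + [("res_b", "agreement")] * 490 + [("res_b", "major discrepancy")] * 10)

counts = defaultdict(lambda: [0, 0])          # trainee -> [major discrepancies, total]
for trainee, label in reports:
    counts[trainee][1] += 1
    counts[trainee][0] += label == "major discrepancy"

benchmark = sum(m for m, _ in counts.values()) / sum(t for _, t in counts.values())
for trainee, (major, total) in counts.items():
    p = binomtest(major, total, benchmark, alternative="greater").pvalue
    flag = "review" if p < 0.05 else "ok"
    print(f"{trainee}: rate = {major/total:.1%} (benchmark {benchmark:.1%}) -> {flag}")
```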

  13. Associations between Otolaryngology Applicant Characteristics and Future Performance in Residency or Practice: A Systematic Review.

    Science.gov (United States)

    Bowe, Sarah N; Laury, Adrienne M; Gray, Stacey T

    2017-06-01

    Objective This systematic review aims to evaluate which applicant characteristics available to an otolaryngology selection committee are associated with future performance in residency or practice. Data Sources PubMed, Scopus, ERIC, Health Business, Psychology and Behavioral Sciences Collection, and SocINDEX. Review Methods Study eligibility was performed by 2 independent investigators in accordance with the PRISMA protocol (Preferred Reporting Items for Systematic Reviews and Meta-analyses). Data obtained from each article included research questions, study design, predictors, outcomes, statistical analysis, and results/findings. Study bias was assessed with the Quality in Prognosis Studies tool. Results The initial search identified 439 abstracts. Six articles fulfilled all inclusion and exclusion criteria. All studies were retrospective cohort studies (level 4). Overall, the studies yielded relatively few criteria that correlated with residency success, with generally conflicting results. Most studies were found to have a high risk of bias. Conclusion Previous resident selection research has lacked a theoretical background, thus predisposing this work to inconsistent results and high risk of bias. The included studies provide historical insight into the predictors and criteria (eg, outcomes) previously deemed pertinent by the otolaryngology field. Additional research is needed, possibly integrating aspects of personnel selection, to engage in an evidence-based approach to identify highly qualified candidates who will succeed as future otolaryngologists.

  14. Cross sectional study of performance indicators for English Primary Care Trusts: testing construct validity and identifying explanatory variables

    Directory of Open Access Journals (Sweden)

    Lilford Richard

    2006-06-01

    Full Text Available Abstract Background The performance of Primary Care Trusts in England is assessed and published using a number of different performance indicators. Our study has two broad purposes. Firstly, to find out whether pairs of indicators that purport to measure similar aspects of quality are correlated (as would be expected if they are both valid measures of the same construct). Secondly, we wanted to find out whether broad (global) indicators correlated with any particular features of Primary Care Trusts, such as expenditure per capita. Methods Cross sectional quantitative analysis using data from six 2004/05 PCT performance indicators for 303 English Primary Care Trusts from four sources in the public domain: Star Rating, aggregated Quality and Outcomes Framework scores, Dr Foster mortality index, Dr Foster equity index (heart by-pass and hip replacements), NHS Litigation Authority Risk Management standards and Patient Satisfaction scores from the Star Ratings. Forward stepwise multiple regression analysis to determine the effect of Primary Care Trust characteristics on performance. Results Star Rating and Quality and Outcomes Framework total, both summary measures of global quality, were not correlated with each other (F = 0.66, p = 0.57). There were however positive correlations between Quality and Outcomes Framework total and patient satisfaction (r = 0.61, p Conclusion Performance assessment in healthcare remains on the Government's agenda, with new core and developmental standards set to replace the Star Ratings in 2006. Yet the results of this analysis provide little evidence that the current indicators have sufficient construct validity to measure the underlying concept of quality, except when the specific area of screening is considered.

  15. Techniques for Performance Improvement of Integer Multiplication in Cryptographic Applications

    Directory of Open Access Journals (Sweden)

    Robert Brumnik

    2014-01-01

    Full Text Available The problem of the performance of arithmetic operations in number fields is actively researched by many scientists, as evidenced by significant publications in this field. In this work, we offer some techniques to increase the performance of a software implementation of a finite field multiplication algorithm, for both 32-bit and 64-bit platforms. The developed technique, called the "delayed carry mechanism," avoids the need to handle the carry of the most significant bit at each iteration of the sum accumulation loop. This mechanism reduces the total number of additions and allows modern parallelization technologies to be applied effectively.
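
    A minimal sketch of a delayed-carry style multi-precision multiplication: partial products of 32-bit limbs are accumulated per result position without propagating carries inside the accumulation loop, and a single carry-propagation pass runs at the end. This illustrates the general idea only and is not the authors' exact algorithm or a constant-time implementation:

```python
LIMB_BITS = 32
MASK = (1 << LIMB_BITS) - 1

def to_limbs(x, n):
    """Split x into n little-endian 32-bit limbs."""
    return [(x >> (LIMB_BITS * i)) & MASK for i in range(n)]

def mul_delayed_carry(a_limbs, b_limbs):
    n, m = len(a_limbs), len(b_limbs)
    acc = [0] * (n + m)                            # wide accumulators, no carries yet
    for i in range(n):
        for j in range(m):
            acc[i + j] += a_limbs[i] * b_limbs[j]  # carry handling is delayed
    result, carry = [], 0                          # single carry-propagation pass
    for cell in acc:
        cell += carry
        result.append(cell & MASK)
        carry = cell >> LIMB_BITS
    while carry:
        result.append(carry & MASK)
        carry >>= LIMB_BITS
    return result

a, b = 0xDEADBEEFCAFEBABE, 0x0123456789ABCDEF
limbs = mul_delayed_carry(to_limbs(a, 2), to_limbs(b, 2))
assert sum(l << (LIMB_BITS * i) for i, l in enumerate(limbs)) == a * b
print([hex(l) for l in limbs])
```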

  16. A network application for modeling a centrifugal compressor performance map

    Science.gov (United States)

    Nikiforov, A.; Popova, D.; Soldatova, K.

    2017-08-01

    The approximation of the aerodynamic performance of a centrifugal compressor stage and vaneless diffuser by neural networks is presented. Advantages, difficulties and specific features of the method are described. An example of a neural network and its structure is shown. The performance in terms of efficiency, pressure ratio and work coefficient of 39 model stages within the range of flow coefficient from 0.01 to 0.08 was modeled with a mean squared error of 1.5%. In addition, the loss and friction coefficients of vaneless diffusers of relative widths 0.014-0.10 are modeled with a mean squared error of 2.45%.
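
    A minimal sketch of approximating a single stage performance curve (pressure ratio versus flow coefficient) with a small feed-forward neural network, in the spirit of the approach above; the network size, the synthetic data and the resulting accuracy are illustrative assumptions, not the authors' configuration:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(4)

# Synthetic "measured" stage map: pressure ratio versus flow coefficient.
phi = rng.uniform(0.01, 0.08, 400).reshape(-1, 1)
pressure_ratio = 1.6 - 120.0 * (phi[:, 0] - 0.045) ** 2 + rng.normal(0.0, 0.005, 400)

net = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000,
                                 random_state=0))
net.fit(phi, pressure_ratio)

for x in np.linspace(0.01, 0.08, 5):
    print(f"flow coefficient {x:.3f} -> predicted pressure ratio {net.predict([[x]])[0]:.3f}")
```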

  17. Improvements of phosphorescent white OLEDs performance for lighting application.

    Science.gov (United States)

    Lee, Jonghee; Chu, Hye Yong; Lee, Jeong-Ik; Song, Ki-Im; Lee, Su Jin

    2008-10-01

    We developed a white OLED device with high power efficiency, in which blue and orange phosphorescent emitters were used. By introducing a multi-functional interlayer with partial doping of the orange dopant inside the EBL, we obtained WOLEDs with peak external efficiencies of up to 14.1% EQE and 31.3 lm/W without any light out-coupling technique. At 1000 cd/m2, the performance achieved was 11.9% EQE and 18.7 lm/W with CIE = (0.39, 0.44). We also found that WOLED performance is related to the doping ratio of the orange dopant inserted inside the EBL.

  18. Monte Carlo Frameworks Building Customisable High-performance C++ Applications

    CERN Document Server

    Duffy, Daniel J

    2011-01-01

    This is one of the first books that describes all the steps needed to analyze, design and implement Monte Carlo applications. It discusses the financial theory as well as the mathematical and numerical background needed to write flexible and efficient C++ code using state-of-the-art design and system patterns, object-oriented and generic programming models in combination with standard libraries and tools. Includes a CD containing the source code for all examples. It is strongly advised that you experiment with the code by compiling it and extending it to suit your needs.

  19. Identifying beliefs underlying pre-drivers' intentions to take risks: An application of the Theory of Planned Behaviour.

    Science.gov (United States)

    Rowe, Richard; Andrews, Elizabeth; Harris, Peter R; Armitage, Christopher J; McKenna, Frank P; Norman, Paul

    2016-04-01

    Novice motorists are at high crash risk during the first few months of driving. Risky behaviours such as speeding and driving while distracted are well-documented contributors to crash risk during this period. To reduce this public health burden, effective road safety interventions need to target the pre-driving period. We use the Theory of Planned Behaviour (TPB) to identify the pre-driver beliefs underlying intentions to drive over the speed limit (N=77), and while over the legal alcohol limit (N=72), talking on a hand-held mobile phone (N=77) and feeling very tired (N=68). The TPB explained between 41% and 69% of the variance in intentions to perform these behaviours. Attitudes were strong predictors of intentions for all behaviours. Subjective norms and perceived behavioural control were significant, though weaker, independent predictors of speeding and mobile phone use. Behavioural beliefs underlying these attitudes could be separated into those reflecting perceived disadvantages (e.g., speeding increases my risk of crash) and advantages (e.g., speeding gives me a thrill). Interventions that can make these beliefs safer in pre-drivers may reduce crash risk once independent driving has begun. Copyright © 2015 Elsevier Ltd. All rights reserved.
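
    For readers unfamiliar with how TPB variance estimates of this kind are obtained, a hedged sketch of the underlying regression (with synthetic data and arbitrary coefficients, not the study's dataset) is:

```python
# Illustrative-only sketch of the kind of regression behind a TPB analysis:
# intentions regressed on attitude, subjective norm and perceived behavioural
# control. All values are synthetic.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 77
attitude, norm, pbc = rng.normal(size=(3, n))
intention = 0.6 * attitude + 0.2 * norm + 0.2 * pbc + rng.normal(scale=0.7, size=n)

X = sm.add_constant(np.column_stack([attitude, norm, pbc]))
model = sm.OLS(intention, X).fit()
print(model.rsquared)        # share of variance in intentions explained
print(model.params)          # relative weight of each TPB predictor
```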

  20. Theory and Application of an Economic Performance Measure of Risk

    NARCIS (Netherlands)

    C. Niu (Cuizhen); X. Guo (Xu); M.J. McAleer (Michael); W.-K. Wong (Wing-Keung)

    2017-01-01

    textabstractHomm and Pigorsch (2012a) use the Aumann and Serrano index to develop a new economic performance measure (EPM), which is well known to have advantages over other measures. In this paper, we extend the theory by constructing a one-sample confidence interval of EPM, and construct

  1. Truths and Lies: Exploring the Ethics of Performance Applications

    Science.gov (United States)

    Shaughnessy, Nicola

    2005-01-01

    This paper examines the ethics of the contract between the performer and client group in applied theatre practice. The paper examines the problematics of the conventional drama framework as a fictional space of pretence. What are the ethics of activities carried out in a context of disbelief? How can the contradictions between the agreement to…

  2. Modeling performance measurement applications and implementation issues in DEA

    CERN Document Server

    Cook, Wade D

    2005-01-01

    Addresses advanced/new DEA methodology and techniques that are developed for modeling unique and new performance evaluation issues. Presents new DEA methodology and techniques via discussions on how to solve managerial problems. Provides an easy-to-use DEA software package, DEAFrontier (www.deafrontier.com), which is an excellent tool for both DEA researchers and practitioners.

  3. The Relationship between Cost Leadership Strategy, Total Quality Management Applications and Financial Performance

    Directory of Open Access Journals (Sweden)

    Ali KURT

    2016-03-01

    Full Text Available Firms need to implement competition strategies and total quality management applications to overcome fierce competition. The purpose of this study is to show the relationship between cost leadership strategy, total quality management applications and firms’ financial performance through a literature review and empirical analysis. A total of 449 questionnaires were administered to the managers of 142 large firms. The data gathered were assessed with AMOS. As a result, a relationship between cost leadership strategy, total quality management applications and firms’ financial performance was found. In addition, a relationship between TQM applications and financial performance was also found.

  4. Ultra-thin lithium micro-batteries. Performances and applications; Microaccumulateurs ultra minces au lithium. Performances et applications

    Energy Technology Data Exchange (ETDEWEB)

    Martin, M.; Terrat, J.P. [Hydromecanique et frottement (HEF), 42 - Andrezieux Boutheon (France); Levasseur, A.; Vinatier, P.; Meunier, G. [Centre National de la Recherche Scientifique (CNRS), 33 - Talence (France). Institut de Chimie de la Matiere Condensee et Physique de Bordeaux

    1996-12-31

    This short paper (abstract) describes the characteristics and performances of prototypes of ultra-thin lithium micro-batteries (thickness < 0.2 mm) which can be incorporated into microelectronic circuits. (J.S.)

  5. Ultra-thin lithium micro-batteries. Performances and applications; Microaccumulateurs ultra minces au lithium. Performances et applications

    Energy Technology Data Exchange (ETDEWEB)

    Martin, M; Terrat, J P [Hydromecanique et frottement (HEF), 42 - Andrezieux Boutheon (France); Levasseur, A; Vinatier, P; Meunier, G [Centre National de la Recherche Scientifique (CNRS), 33 - Talence (France). Institut de Chimie de la Matiere Condensee et Physique de Bordeaux

    1997-12-31

    This short paper (abstract) describes the characteristics and performances of prototypes of ultra-thin lithium micro-batteries (thickness < 0.2 mm) which can be incorporated into microelectronic circuits. (J.S.)

  6. Advanced applications of natural language processing for performing information extraction

    CERN Document Server

    Rodrigues, Mário

    2015-01-01

    This book explains how information extraction (IE) applications can be created that are able to tap the vast amount of relevant information available in natural language sources: Internet pages, official documents such as laws and regulations, books and newspapers, and the social web. Readers are introduced to the problem of IE and its current challenges and limitations, supported with examples. The book discusses the need to fill the gap between documents, data, and people, and provides a broad overview of the technology supporting IE. The authors present a generic architecture for developing systems that are able to learn how to extract relevant information from natural language documents, and illustrate how to implement working systems using state-of-the-art and freely available software tools. The book also discusses concrete applications illustrating IE uses. · Provides an overview of state-of-the-art technology in information extraction (IE), discussing achievements and limitations for t...

  7. Multiple choice questions are superior to extended matching questions to identify medicine and biomedical sciences students who perform poorly.

    Science.gov (United States)

    Eijsvogels, Thijs M H; van den Brand, Tessa L; Hopman, Maria T E

    2013-11-01

    In recent years, medical faculties at Dutch universities have implemented a legally binding study advice for students of medicine and biomedical sciences during their propaedeutic phase. Appropriate examination is essential to discriminate between poor (grade <6) and excellent (average grade ≥8) students; we also examined the influence of sex, age and examination preference on this score. Data were collected for 452 first-year medical and biomedical science students during three distinct course examinations: one examination with EMQ only, one with MCQ only and one mixed examination (including EMQ and MCQ). Logistic regression analysis revealed that MCQ examination was 3 times better at identifying poor students compared with EMQ (RR 3.0, CI 2.0-4.5), whereas EMQ better detected excellent students (average grade ≥8) (RR 1.93, CI 1.47-2.53). The mixed examination had characteristics comparable to MCQ. Sex and examination preference did not impact the scores of the students. Students ≥20 years had a 4-fold higher risk ratio of obtaining a poor grade (<6) compared with students ≤18 years old (RR 4.1, CI 2.1-8.0). Given the strong discriminative capacity of MCQ examinations to identify poor students, we recommend the use of this type of examination during the propaedeutic phase of medicine and biomedical science study programmes, in light of the binding study advice.
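
    The risk ratios quoted above can be reproduced in form (not in value) from a 2x2 table; the sketch below uses the standard log-scale confidence interval and made-up counts, since the paper's raw counts are not shown here:

```python
# Sketch of how a risk ratio such as "RR 3.0, CI 2.0-4.5" is computed from a
# 2x2 table; the counts below are made up for illustration.
import math

def risk_ratio(events_a, total_a, events_b, total_b, z=1.96):
    """Risk ratio of group A vs group B with a 95% CI on the log scale."""
    risk_a, risk_b = events_a / total_a, events_b / total_b
    rr = risk_a / risk_b
    se_log = math.sqrt(1 / events_a - 1 / total_a + 1 / events_b - 1 / total_b)
    lower = math.exp(math.log(rr) - z * se_log)
    upper = math.exp(math.log(rr) + z * se_log)
    return rr, lower, upper

print(risk_ratio(60, 200, 20, 200))  # e.g. poor grades after MCQ vs after EMQ
```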

  8. On Performance of Linear Multiuser Detectors for Wireless Multimedia Applications

    Science.gov (United States)

    Agarwal, Rekha; Reddy, B. V. R.; Bindu, E.; Nayak, Pinki

    In this paper, the performance of different multi-rate schemes in a DS-CDMA system is evaluated. Multirate linear multiuser detectors with multiple processing gains are analyzed for synchronous Code Division Multiple Access (CDMA) systems. Variable data rates are achieved by varying the processing gain. Our conclusion is that the bit error rate for multirate and single-rate systems can be made the same, with a tradeoff in the number of users, in linear multiuser detectors.

  9. Evaluation of CFVS Performance with SPARC Model and Application

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Sung Il; Na, Young Su; Ha, Kwang Soon; Cho, Song Won [KAERI, Daejeon (Korea, Republic of)

    2016-05-15

    Containment Filtered Venting System (CFVS) is one of the important safety features to reduce the amount of fission products released into the environment by depressurizing the containment. KAERI has conducted an integrated performance verification test of the CFVS as part of a Ministry of Trade, Industry and Energy (MOTIE) project. Generally, codes such as SPARC, BUSCA and SUPRA are used in the case of wet-type filters. In particular, the SPARC model is included in MELCOR to calculate the fission product removal rate through pool scrubbing. In this study, CFVS performance is evaluated using the SPARC model in MELCOR according to the steam fraction in the containment. The calculation mainly focuses on the effect of the steam fraction in the containment, and the calculation result is explained with the aerosol removal model in SPARC. A previous study on the OPR 1000 is applied to the result. There were two CFVS valve opening periods, and it was found that the CFVS performance differed in each case. The results of the study provide fundamental data that can be used to decide the CFVS operation time; however, more calculation data are necessary to generalize the result.

  10. Application of the Performance Validation Tool for the Evaluation of NSSS Control System Performance

    International Nuclear Information System (INIS)

    Sohn, Suk-whun

    2011-01-01

    When a control system is supplied to nuclear power plant (NPP) under construction, static tests and dynamic tests are typically performed for evaluating its performance. The dynamic test is not realistic for validating the performance of the replaced hardware in operating NPPs because of potential risks and economic burden. We have, therefore, developed a performance validation tool which can evaluate the dynamic performances of the control system without undertaking real plant tests. The window-based nuclear plant performance analyzer (Win-NPA) is used as a virtual NPP in the developed tool and provides appropriate control loads to the target control system via hardwired cables in a manner that the interfaces are identical to the field wiring. The outputs from the control system are used as the simulation inputs of the plant model within the Win-NPA. With this closed-loop configuration, major transient events were simulated to check the performance of the implemented control system by comparing it with that of the control system model of the Win-NPA and that of the old hardware. The developed tool and the methodology were successfully applied to the hardware replacement project for Yonggwang (YGN) 3 and 4 feedwater control system (FWCS) in 2008. Several errors in the implemented control system were fixed through the performance validation tests and the operability tests. As a result, the control system of the YGN 3 and 4 has demonstrated an excellent control performance since then. On the basis of YGN 3 and 4 project experiences, we are performing a similar project in Ulchin (UCN) 3 and 4. This methodology can also be applied to other NPPs under construction as a tool for pre-operational dynamic tests. These performance tests before performing power ascension tests (PATs) are conducive to preventing unnecessary plant transients or unwanted reactor trips caused by hidden errors of control systems during PATs. (author)

  11. Capacitor performance limitations in high power converter applications

    DEFF Research Database (Denmark)

    El-Khatib, Walid Ziad; Holbøll, Joachim; Rasmussen, Tonny Wederberg

    2013-01-01

    High voltage, low inductance capacitors are used in converters as HVDC-links, snubber circuits and sub-module (MMC) capacitances. They allow for large peak currents under high-frequency or transient voltage applications. On the other hand, using capacitors with larger equivalent series inductances includes the risk of transient overvoltages, with a negative effect on the life time and reliability of the capacitors. The allowable limits of such current and voltage peaks are decided by the ability of the converter components, including the capacitors, to withstand them over the expected life time. In this paper results are described from investigations on the electrical environment of these capacitors, including all the conditions they would be exposed to, thereby trying to find the tradeoffs needed to select a suitable capacitor. Different types of capacitors with the same voltage...

  12. Dehalogenases: From Improved Performance to Potential Microbial Dehalogenation Applications

    Directory of Open Access Journals (Sweden)

    Thiau-Fu Ang

    2018-05-01

    Full Text Available The variety of halogenated substances and their derivatives widely used as pesticides, herbicides and other industrial products is of great concern due to the hazardous nature of these compounds, owing to their toxicity and persistent environmental pollution. Therefore, from the viewpoint of environmental technology, the need for environmentally relevant enzymes involved in the biodegradation of these pollutants has received a great boost. One result of this attention has been the identification of environmentally relevant bacteria that produce hydrolytic dehalogenases, key enzymes which are considered cost-effective and eco-friendly in the removal and detoxification of these pollutants. This group of enzymes, which catalyze the cleavage of the carbon-halogen bond of organohalogen compounds, has potential applications in the chemical industry and bioremediation. The dehalogenases make use of fundamentally different strategies with a common mechanism to cleave carbon-halogen bonds, whereby an active-site carboxylate group attacks the substrate C atom bound to the halogen atom to form an ester intermediate and a halide ion, with subsequent hydrolysis of the intermediate. Structurally, these dehalogenases have been characterized and shown to use substitution mechanisms that proceed via a covalent aspartyl intermediate. Moreover, the wide dehalogenation spectrum of electron acceptors tested with bacterial strains that could dehalogenate recalcitrant organohalides has further proven the versatility of bacterial dehalogenators, which should be considered when determining the fate of halogenated organics at contaminated sites. In this review, the general features of the most widely studied bacterial dehalogenases, their structural properties, the basis of the degradation of organohalides and their derivatives, and how they have been improved for various applications are discussed.

  13. Dehalogenases: From Improved Performance to Potential Microbial Dehalogenation Applications.

    Science.gov (United States)

    Ang, Thiau-Fu; Maiangwa, Jonathan; Salleh, Abu Bakar; Normi, Yahaya M; Leow, Thean Chor

    2018-05-07

    The variety of halogenated substances and their derivatives widely used as pesticides, herbicides and other industrial products is of great concern due to the hazardous nature of these compounds, owing to their toxicity and persistent environmental pollution. Therefore, from the viewpoint of environmental technology, the need for environmentally relevant enzymes involved in the biodegradation of these pollutants has received a great boost. One result of this attention has been the identification of environmentally relevant bacteria that produce hydrolytic dehalogenases, key enzymes which are considered cost-effective and eco-friendly in the removal and detoxification of these pollutants. This group of enzymes, which catalyze the cleavage of the carbon-halogen bond of organohalogen compounds, has potential applications in the chemical industry and bioremediation. The dehalogenases make use of fundamentally different strategies with a common mechanism to cleave carbon-halogen bonds, whereby an active-site carboxylate group attacks the substrate C atom bound to the halogen atom to form an ester intermediate and a halide ion, with subsequent hydrolysis of the intermediate. Structurally, these dehalogenases have been characterized and shown to use substitution mechanisms that proceed via a covalent aspartyl intermediate. Moreover, the wide dehalogenation spectrum of electron acceptors tested with bacterial strains that could dehalogenate recalcitrant organohalides has further proven the versatility of bacterial dehalogenators, which should be considered when determining the fate of halogenated organics at contaminated sites. In this review, the general features of the most widely studied bacterial dehalogenases, their structural properties, the basis of the degradation of organohalides and their derivatives, and how they have been improved for various applications are discussed.

  14. Sustainability Performance of Scandinavian Corporations and their Value Chains assessed by UN Global Compact and Global Reporting Initiative standards - a way to identify superior performers?

    DEFF Research Database (Denmark)

    Kjærgaard, Thomas

    2014-01-01

    The purpose of this study was to introduce a combination of the two most adopted multi- stakeholder standards for sustainability reporting as an alternate framework for assessing sustainability performance in Scandinavian corporations. This novel approach leverages numeric measures on the criteria...

  15. Application of a drug-induced apoptosis assay to identify treatment strategies in recurrent or metastatic breast cancer.

    Directory of Open Access Journals (Sweden)

    Linda Bosserman

    Full Text Available A drug-induced apoptosis assay has been developed to determine which chemotherapy drugs or regimens can produce higher cell killing in vitro. This study was done to determine whether this assay could be performed in patients with recurrent or metastatic breast cancer, to characterize the patterns of drug-induced apoptosis, and to evaluate the clinical utility of the assay. A secondary goal was to correlate assay use with clinical outcomes. In a prospective, non-blinded, multi-institutional controlled trial, 30 evaluable patients with recurrent or metastatic breast cancer who were treated with chemotherapy had tumor samples submitted for the MiCK drug-induced apoptosis assay. After receiving results within 72 hours after biopsy, physicians could use the test to determine therapy (users), or elect not to use the test (non-users). The assay was able to characterize drug-induced apoptosis in tumor specimens from breast cancer patients and identified which drugs or combinations gave the highest levels of apoptosis. Patterns of drug activity were also analyzed in triple negative breast cancer. Different drugs from a single class of agents often produced significantly different amounts of apoptosis. Physicians frequently (73%) used the assay to help select chemotherapy treatments. Patients whose physicians were users had a higher response (CR+PR) rate compared to non-users (38.1% vs 0%, p = 0.04) and a higher disease control (CR+PR+Stable) rate (81% vs 25%, p<0.01). Time to relapse was longer in users (7.4 mo) compared to non-users (2.2 mo, p<0.01). The MiCK assay can be performed on breast cancer specimens, and results are often used by physicians in breast cancer patients with recurrent or metastatic disease. These results from a good laboratory phase II study can be the basis for a future larger prospective multicenter study to more definitively establish the value of the assay. Clinicaltrials.gov NCT00901264.
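
    The abstract does not state which statistical test produced the quoted p-values; as one plausible way to compare such response proportions, a Fisher exact test on approximate, reconstructed counts might be sketched as follows (illustrative only):

```python
# One way (not necessarily the authors' method) to compare the reported response
# rates between assay users and non-users; counts are reconstructed roughly from
# the percentages and are illustrative only.
from scipy.stats import fisher_exact

responders_users, nonresponders_users = 8, 13        # roughly 38.1% of 21 users
responders_nonusers, nonresponders_nonusers = 0, 8   # 0% of non-users (assumed n)

odds_ratio, p_value = fisher_exact([[responders_users, nonresponders_users],
                                    [responders_nonusers, nonresponders_nonusers]])
print(p_value)
```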

  16. Polythiophene nanocomposites as high performance electrode material for supercapacitor application

    Science.gov (United States)

    Vijeth, H.; Niranjana, M.; Yesappa, L.; Ashokkumar, S. P.; Devendrappa, H.

    2018-04-01

    A polythiophene-aluminium oxide nanocomposite is prepared by in situ chemical polymerisation in the presence of the anionic surfactant camphor sulfonic acid (CSA). The nanocomposite was characterised by X-ray Diffraction (XRD), and its surface morphology was studied using Atomic Force Microscopy (AFM). The electrochemical performance is evaluated using cyclic voltammetry in 1 M H2SO4. As an electroactive material, it exhibits high specific capacitances of 654.5 and 757 F/g for PTH and the PTHA nanocomposite, respectively, at a scan rate of 30 mV s-1.

  17. Performance Analysis of 8-bit ODACs for VLC Applications

    Directory of Open Access Journals (Sweden)

    A. Dobesch

    2017-06-01

    Full Text Available A discrete optical power level stepping technique in visible light communication (VLC), also known as optical digital-to-analog conversion (ODAC), has been proposed. This is an alternative concept for VLC front-end design, able to mitigate the intrinsic non-linearity of the LED. Additionally, it removes the need for an electrical digital-to-analog conversion (EDAC) in the driver stage. This paper provides an experimental evaluation of two different ODAC front-ends. The results investigate the spatial relation between the optical front-end and the optical receiver. In addition, the performance evaluation employs dynamic test metrics rather than the conventional static metrics previously reported in the literature.
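
    Conceptually, an 8-bit ODAC maps each bit of the digital sample onto a binary-weighted group of LEDs so that the emitted optical power steps through discrete levels. A toy sketch of that mapping (group weights and drive power are assumptions, not the tested front-ends) is:

```python
# Conceptual sketch of an 8-bit optical DAC: each bit of the digital sample
# drives one binary-weighted LED group, so the radiated optical power steps
# through discrete levels. Group sizes and per-LED power are hypothetical.
def odac_drive_pattern(sample: int, bits: int = 8):
    """Return which LED groups are switched on for an 8-bit input sample."""
    assert 0 <= sample < 2 ** bits
    return [(sample >> b) & 1 for b in range(bits)]  # LSB group first

def optical_power(sample: int, unit_power_mw: float = 0.5):
    """Total emitted power if group b contains 2**b identical LEDs."""
    return sum(bit * (2 ** b) * unit_power_mw
               for b, bit in enumerate(odac_drive_pattern(sample)))

print(odac_drive_pattern(170), optical_power(170))  # 0b10101010 -> 85.0 mW
```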

  18. Application of Advanced Technology to Improve Plant Performance. Safety and Performance in Current NPPs

    International Nuclear Information System (INIS)

    Hashemian, H.M.

    2011-01-01

    Advances in computer technologies, signal processing, analytical modeling, and the advent of wireless sensors have provided the nuclear industry with ample means to automate and optimize maintenance activities and improve safety, efficiency, and availability, while reducing costs and radiation exposure to maintenance personnel. This paper provides a review of these developments and presents examples of their use in the nuclear power industry and the financial and safety benefits that they have produced. As the current generation of nuclear power plants have passed their mid-life, increased monitoring of their health is critical to their safe operation. This is especially true now that license renewal of nuclear power plants has accelerated, allowing some plants to operate up to 60 years or more. Furthermore, many utilities are maximizing their power output through uprating projects and retrofits. This puts additional demand and more stress on the plant equipment such as the instrumentation and control (I and C) systems and the reactor internal components making them more vulnerable to the effects of aging, degradation, and failure. In the meantime, the nuclear power industry is working to reduce generation costs by adopting condition-based maintenance strategies and automation of testing activities. These developments have stimulated great interest in on-line monitoring (OLM) technologies and new diagnostic and prognostic methods to anticipate, identify, and resolve equipment and process problems and ensure plant safety, efficiency, and immunity to accidents. The foundation for much of the required technologies has already been established through 40 years of research and development (R and D) efforts performed by numerous organizations, scientists, and engineers around the world including the author. This paper provides examples of these technologies and demonstrates how the gap between some of the more important R and D efforts and end users have been filled

  19. Assessing the Ability of Vegetation Indices to Identify Shallow Subsurface Water Flow Pathways from Hyperspectral Imagery Using Machine Learning: Application

    Science.gov (United States)

    Doctor, K.; Byers, J. M.

    2017-12-01

    Shallow underground water flow pathways expressed as slight depressions are common in the land surface. Under conditions of saturated overland flow, such as during heavy rain or snow melt, these areas of preferential flow might appear on the surface as very shallow flowing streams. When there is no water flowing in these ephemeral channels it can be difficult to identify them. It is especially difficult to discern the slight depressions above the subsurface water flow pathways (SWFP) when the area is covered by vegetation. Since the soil moisture content in these SWFP is often greater than the surrounding area, the vegetation growing on top of these channels shows different vigor and moisture content than the vegetation growing above the non-SWFP area. Vegetation indices (VI) are used in visible and near infrared (VNIR) hyperspectral imagery to enhance biophysical properties of vegetation, and so the brightness values between vegetation atop SWFP and the surrounding vegetation were highlighted. We performed supervised machine learning using ground-truth class labels to determine the conditional probability of a SWFP at a given pixel given either the spectral distribution or VI at that pixel. The training data estimates the probability distributions to a determined finite sampling accuracy for a binary Naïve Bayes classifier between SWFP and non-SWFP. The ground-truth data provides a test bed for understanding the ability to build SWFP classifiers using hyperspectral imagery. SWFP were distinguishable in the imagery within corn and grass fields and in areas with low-lying vegetation. However, the training data is limited to particular types of terrain and vegetation cover in the Shenandoah Valley, Virginia and this would limit the resulting classifier. Further training data could extend its use to other environments.
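
    As a simplified sketch of the classification step described (not the authors' pipeline), a binary Gaussian Naive Bayes classifier on a single vegetation-index feature could be set up as follows, with synthetic NDVI values standing in for the hyperspectral data:

```python
# Simplified sketch of the classification idea: a binary Naive Bayes classifier
# separating SWFP from non-SWFP pixels using a vegetation-index feature such as
# NDVI. The feature values below are synthetic, not the Shenandoah Valley data.
import numpy as np
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(0)
ndvi_swfp = rng.normal(0.75, 0.05, 200)       # wetter soil -> more vigorous canopy
ndvi_other = rng.normal(0.60, 0.05, 200)

X = np.concatenate([ndvi_swfp, ndvi_other]).reshape(-1, 1)
y = np.array([1] * 200 + [0] * 200)            # 1 = pixel above a flow pathway

clf = GaussianNB().fit(X, y)
print(clf.predict_proba([[0.72]]))             # P(non-SWFP), P(SWFP) for one pixel
```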

  20. Applicability and performance of an imaging plate at subzero temperatures

    International Nuclear Information System (INIS)

    Sakoda, Akihiro; Ishimori, Yuu; Hanamoto, Katsumi; Kawabe, Atsushi; Kataoka, Takahiro; Nagamatsu, Tomohiro; Yamaoka, Kiyonori

    2010-01-01

    The performance of imaging plates (IPs) has not been studied at temperatures lower than 0°C. In the present study, an IP was irradiated with gamma rays emitted from the mineral monazite at temperatures between -80 and 30°C to determine its fundamental properties. The IP response as a function of irradiation time was found to be linear, suggesting that the IP works properly at low temperatures. Fading, an effect which should be considered at temperatures of more than 0°C, was not observed at -30 and -80°C. Furthermore, the fading-corrected PSL value of the IP irradiated at -80°C was lower than at other temperatures (30, 5 and -30°C). This can be explained by thermostimulated luminescence (TSL). Since the only intensive TSL peak in the temperature range from -80 to 30°C is present at about -43°C, some of the electrons trapped at F centers recombine with holes through the process of TSL before the stored radiation image is read out at room temperature. This finding suggests that the apparent sensitivity of the IP is lower at -80°C although it is similar to sensitivities between -30 and 30°C. This low sensitivity should be corrected to perform quantitative measurements.

  1. Performance evaluation of integrated fuel processor for residential PEMFCs application

    International Nuclear Information System (INIS)

    Yu Taek Seo; Dong Joo Seo; Young-Seog Seo; Hyun-Seog Roh; Wang Lai Yoon; Jin Hyeok Jeong

    2006-01-01

    KIER has been developing a natural gas fuel processor to produce hydrogen-rich gas for residential PEMFC systems. To realize a compact design and high efficiency, the unit processes of steam reforming, water gas shift, and preferential oxidation are chemically and physically integrated in a package. The current fuel processor, designed for 1 kW class PEMFCs, shows a thermal efficiency of 78% on an HHV basis with a methane conversion of 90% at rated load operation. A CO concentration below 10 ppm in the produced gas is achieved with a preferential oxidation unit using Pt and Ru based catalysts under the condition [O2]/[CO] = 2.0. Partial load operation has been carried out to test the performance of the fuel processor from 40% to 80% load, showing stable methane conversion and a CO concentration below 10 ppm. The durability test for the daily start-stop and 8 hr operation procedure is under investigation and shows no deterioration of performance after 40 start-stop cycles. (authors)

  2. Predicting Resident Performance from Preresidency Factors: A Systematic Review and Applicability to Neurosurgical Training.

    Science.gov (United States)

    Zuckerman, Scott L; Kelly, Patrick D; Dewan, Michael C; Morone, Peter J; Yengo-Kahn, Aaron M; Magarik, Jordan A; Baticulon, Ronnie E; Zusman, Edie E; Solomon, Gary S; Wellons, John C

    2018-02-01

    Neurosurgical educators strive to identify the best applicants, yet formal study of resident selection has proved difficult. We conducted a systematic review to answer the following question: What objective and subjective preresidency factors predict resident success? PubMed, ProQuest, Embase, and the CINAHL databases were queried from 1952 to 2015 for literature reporting the impact of preresidency factors (PRFs) on outcomes of residency success (RS), among neurosurgery and all surgical subspecialties. Due to heterogeneity of specialties and outcomes, a qualitative summary and heat map of significant findings were constructed. From 1489 studies, 21 articles met inclusion criteria, which evaluated 1276 resident applicants across five surgical subspecialties. No neurosurgical studies met the inclusion criteria. Common objective PRFs included standardized testing (76%), medical school performance (48%), and Alpha Omega Alpha (43%). Common subjective PRFs included aggregate rank scores (57%), letters of recommendation (38%), research (33%), interviews (19%), and athletic or musical talent (19%). Outcomes of RS included faculty evaluations, in-training/board exams, chief resident status, and research productivity. Among objective factors, standardized test scores correlated well with in-training/board examinations but poorly correlated with faculty evaluations. Among subjective factors, aggregate rank scores, letters of recommendation, and athletic or musical talent demonstrated moderate correlation with faculty evaluations. Standardized testing most strongly correlated with future examination performance but correlated poorly with faculty evaluations. Moderate predictors of faculty evaluations were aggregate rank scores, letters of recommendation, and athletic or musical talent. The ability to predict success of neurosurgical residents using an evidence-based approach is limited, and few factors have correlated with future resident performance. Given the importance of

  3. Mechanical performance of pyrolytic carbon in prosthetic heart valve applications.

    Science.gov (United States)

    Cao, H

    1996-06-01

    An experimental procedure has been developed for rigorous characterization of the fracture resistance and fatigue crack extension in pyrolytic carbon for prosthetic heart valve application. Experiments were conducted under sustained and cyclic loading in a simulated biological environment using Carbomedics Pyrolite carbon. While the material was shown to have modest fracture toughness, it exhibited excellent resistance to subcritical crack growth. The crack growth kinetics in pyrolytic carbon were formulated using a phenomenological description. A fatigue threshold was observed below which the crack growth rate diminishes. A damage tolerance concept based on fracture mechanics was used to develop an engineering design approach for mechanical heart valve prostheses. In particular, a new quantity, referred to as the safe-life index, was introduced to assess the design adequacy against subcritical crack growth in brittle materials. In addition, a weakest-link statistical description of the fracture strength is provided and used in the design of component proof-tests. It is shown that the structural reliability of mechanical heart valves can be assured by combining effective flaw detection and manufacturing quality control with adequate damage tolerance design.
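
    The weakest-link statistical description mentioned above is commonly expressed as a two-parameter Weibull distribution; a hedged sketch of that form, with placeholder modulus and scale values rather than measured pyrolytic carbon properties, is:

```python
# Hedged sketch of a weakest-link (two-parameter Weibull) strength description;
# the modulus and scale values are placeholders, not measured properties of
# pyrolytic carbon.
import math

def failure_probability(stress_mpa, scale_mpa=350.0, modulus=10.0):
    """P_f = 1 - exp(-(sigma / sigma_0)**m) for a unit effective volume."""
    return 1.0 - math.exp(-((stress_mpa / scale_mpa) ** modulus))

# A proof test screens out components weaker than the proof stress, truncating
# the strength distribution below that level.
for stress in (200, 300, 350):
    print(stress, "MPa ->", round(failure_probability(stress), 4))
```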

  4. High performance photovoltaic applications using solution-processed small molecules.

    Science.gov (United States)

    Chen, Yongsheng; Wan, Xiangjian; Long, Guankui

    2013-11-19

    Energy remains a critical issue for the survival and prosperity of human civilization. Many experts believe that the eventual solution for sustainable energy is the use of direct solar energy as the main energy source. Among the options for renewable energy, photovoltaic technologies that harness solar energy offer a way to harness an unlimited resource with minimal environmental impact, in contrast with other alternatives such as water, nuclear, and wind energy. Currently, almost all commercial photovoltaic technologies use Si-based technology, which has a number of disadvantages including high cost, lack of flexibility, and the serious environmental impact of the Si industry. Other technologies, such as organic photovoltaic (OPV) cells, can overcome some of these issues. Today, polymer-based OPV (P-OPV) devices have achieved power conversion efficiencies (PCEs) that exceed 9%. Compared with P-OPV, small-molecule based OPV (SM-OPV) offers further advantages, including a defined structure for more reproducible performance, higher mobility and open circuit voltage, and easier synthetic control that leads to more diversified structures. Therefore, while largely undeveloped, SM-OPV is an important emerging technology with performance comparable to P-OPV. In this Account, we summarize our recent results on solution-processed SM-OPV. We believe that solution processing is essential for taking full advantage of OPV technologies. Our work started with the synthesis of oligothiophene derivatives with an acceptor-donor-acceptor (A-D-A) structure. Both the backbone conjugation length and electron withdrawing terminal groups play an important role in the light absorption, energy levels and performance of the devices. Among those molecules, devices using a 7-thiophene-unit backbone and a 3-ethylrhodanine (RD) terminal unit produced a 6.1% PCE. With the optimized conjugation length and terminal unit, we borrowed from the results with P-OPV devices to optimize the backbone. Thus we

  5. Performance and applications of the ORNL local electrode atom probe

    International Nuclear Information System (INIS)

    Miller, M.K.; Russell, K.F.

    2004-01-01

    Full text: The commercial introduction in 2003 of the local electrode atom probe (LEAP) developed by Imago Scientific Instruments has made dramatic, orders of magnitude improvements in the data acquisition rate and the size of the analyzed volume compared to previous types of three-dimensional atom probes and other scanning atom probes. This state-of-the-art instrument may be used for the analysis of traditional needle-shaped specimens and specimens fabricated from 'flat' specimens with focused ion beam (FIB) techniques. The advantage of this local electrode configuration is that significantly lower (∼50 %) standing and pulse voltages are required to produce the field strength required to field evaporate ions from the specimen. New high speed (200 kHz) pulse generators coupled with crossed delay line detectors and faster timing systems also enable significantly faster (up to 300 times) data acquisition rates to be achieved. This new design also permits a significantly larger field of view to be analyzed and results in data sets containing up to 10⁸ atoms. In the local electrode atom probe, a ∼10-50 μm diameter aperture is typically positioned approximately one aperture diameter in front of the specimen. In order to accurately align the specimen to the aperture in the funnel-shaped electrode, the specimen is mounted on a three axis nanopositioning stage. An approximate alignment is performed while viewing the relative positions of the specimen and the local electrode with a pair of low magnification video cameras and then a pair of higher magnification video cameras attached to long range microscopes. The final alignment is performed with the use of the field evaporated ions from the specimen. A discussion on the alignment of the specimen with the local electrode, the effects of the fields on the specimen, and the effects of aperture size on aperture lifetime will be presented. The performance of the ORNL local electrode atom probe will be described. The

  6. Practical applications and methods in performing cardiac digital subtraction angiography

    International Nuclear Information System (INIS)

    Markovic, D.M.; Withrow, S.; Moodie, D.S.

    1986-01-01

    One of the purposes of this book is to outline the utility of digital subtraction angiography (DSA) in common clinical practice. No text has dealt with the actual physical setup of the room or the patient prior and during a digital subtraction angiographic study at rest and with exercise. This chapter outlines the steps commonly used when cardiac DSA is performed on patients in the authors' laboratory. The authors have learned over the last few years the best way to prepare the patient and the equipment and it is hoped that utilizing this experience, other centers may avoid the mistakes the authors have made in the past and develop new techniques for the future

  7. Application of fuzzy expert system on LILW performance assessment

    International Nuclear Information System (INIS)

    Lemos, F.L. de; Sullivan, T.

    2002-01-01

    A complete LILW repository performance assessment requires the involvement of several experts in many fields of science. Many sources of uncertainty arise due to the complexity of interactions between environmental parameters, lack of data and ignorance; this makes predictive analysis and interpretation difficult. The difficulty in understanding the impact of these ambiguities is even higher when the public and decision makers are involved. Traditional methods of data analysis, while having a strong mathematical basis, are often not adequate to deal with ambiguous data. These ambiguities can be an obstacle to making the results understandable and defensible. A decision-making methodology based on fuzzy logic can help the interaction between experts, decision makers and the public. This method is the basis of an expert system which can help the analysis of very complex and ambiguous processes. (author)
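
    As a minimal sketch of the fuzzy-logic building blocks such an expert system relies on (the linguistic categories, breakpoints and rule below are invented examples, not the authors' rule base):

```python
# Minimal sketch of the fuzzy-logic idea: parameter values are mapped to
# membership degrees in linguistic categories and combined by simple rules.
# The categories, breakpoints and rule here are invented examples.
def triangular(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b (a < b < c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def release_risk(distribution_coefficient, infiltration_rate):
    low_kd = triangular(distribution_coefficient, 0.0, 1.0, 10.0)
    high_infiltration = triangular(infiltration_rate, 0.2, 1.0, 1.8)
    # Rule: IF sorption is low AND infiltration is high THEN release risk is high.
    return min(low_kd, high_infiltration)

print(release_risk(distribution_coefficient=2.0, infiltration_rate=0.8))
```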

  8. Inflated applicants: attribution errors in performance evaluation by professionals.

    Directory of Open Access Journals (Sweden)

    Samuel A Swift

    Full Text Available When explaining others' behaviors, achievements, and failures, it is common for people to attribute too much influence to disposition and too little influence to structural and situational factors. We examine whether this tendency leads even experienced professionals to make systematic mistakes in their selection decisions, favoring alumni from academic institutions with high grade distributions and employees from forgiving business environments. We find that candidates benefiting from favorable situations are more likely to be admitted and promoted than their equivalently skilled peers. The results suggest that decision-makers take high nominal performance as evidence of high ability and do not discount it by the ease with which it was achieved. These results clarify our understanding of the correspondence bias using evidence from both archival studies and experiments with experienced professionals. We discuss implications for both admissions and personnel selection practices.

  9. Performance evaluation of direct methanol fuel cells for portable applications

    Energy Technology Data Exchange (ETDEWEB)

    Rashidi, R.; Dincer, I.; Naterer, G.F. [Faculty of Engineering and Applied Science, University of Ontario Institute of Technology, 2000 Simcoe Street North, Oshawa, Ontario (Canada); Berg, P. [Faculty of Science, University of Ontario Institute of Technology, 2000 Simcoe Street North, Oshawa, Ontario (Canada)

    2009-02-15

    This study examines the feasibility of powering a range of portable devices with a direct methanol fuel cell (DMFC). The analysis includes a comparison between a Li-ion battery and DMFC to supply the power for a laptop, camcorder and a cell phone. A parametric study of the systems for an operational period of 4 years is performed. Under the assumptions made for both the Li-ion battery and DMFC system, the battery cost is lower than the DMFC during the first year of operation. However, by the end of 4 years of operational time, the DMFC system would cost less. The weight and cost comparisons show that the fuel cell system occupies less space than the battery to store a higher amount of energy. The weight of both systems is almost identical. Finally, the CO2 emissions can be decreased by a higher exergetic efficiency of the DMFC, which leads to improved sustainability. (author)

  10. Hybrid Building Performance Simulation Models for Industrial Energy Efficiency Applications

    Directory of Open Access Journals (Sweden)

    Peter Smolek

    2018-06-01

    Full Text Available In the challenge of achieving environmental sustainability, industrial production plants, as large contributors to the overall energy demand of a country, are prime candidates for applying energy efficiency measures. A modelling approach using cubes is used to decompose a production facility into manageable modules. All aspects of the facility are considered, classified into the building, energy system, production and logistics. This approach leads to specific challenges for building performance simulations since all parts of the facility are highly interconnected. To meet this challenge, models for the building, thermal zones, energy converters and energy grids are presented and the interfaces to the production and logistics equipment are illustrated. The advantages and limitations of the chosen approach are discussed. In an example implementation, the feasibility of the approach and models is shown. Different scenarios are simulated to highlight the models and the results are compared.

  11. Quality New Mexico Performance Excellence Award Roadrunner Application 2016

    Energy Technology Data Exchange (ETDEWEB)

    Petru, Ernest Frank [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-07-06

    The Human Resources (HR) Division is a critical part of Los Alamos National Laboratory, an internationally recognized science and R&D facility with a specialized workforce of more than 10,000. The Laboratory's mission is to solve national security challenges through scientific excellence. The HR Division partners with employees and managers to support the Laboratory in hiring, retaining, and motivating an exceptional workforce. The Laboratory is owned by the U.S. Department of Energy (DOE), with oversight by the DOE's National Nuclear Security Administration (NNSA). In 2006, NNSA awarded the contract for managing and operating the Laboratory to Los Alamos National Security, LLC (LANS), a for-profit consortium. This report expounds on performance excellence efforts, presenting a strategic plan and operations.

  12. Does Vitamin D Supplementation Enhance Musculoskeletal Performance in Individuals Identified as Vitamin D Deficient through Blood Spot Testing?

    Science.gov (United States)

    Murphy, Kellie A.

    This thesis investigated possible changes in performance after one month of vitamin D supplementation in individuals found to be vitamin D deficient or insufficient through blood spot testing. Thirty-two males, ages 18-32, participated. Each subject visited the lab three times in one month, completing four performance tests each session, including an isometric mid-thigh pull and a vertical jump on a force plate, an isometric 90-degree elbow flexion test using a load cell, and a psychomotor vigilance test on a palm pilot. The initial lab visit included blood spot tests to find vitamin D levels. In a single blind manner, 16 subjects were assigned vitamin D and 16 the placebo. Repeated measures ANOVA did not reveal any main effects for time (F=2.626, p=0.364), treatment (vitamin D3 vs placebo; F=1.282, p=0.999), or interaction effects for treatment by time (F=0.304, p=0.999) for maximum force production during an isometric mid-thigh pull. Repeated measures ANOVA did not reveal any main effects for time (F=1.323, p=0.999), treatment (vitamin D3 vs placebo; F=0.510, p=0.999), or interaction effects for treatment by time (F=1.625, p=0.860) for rate of force production during a vertical jump. Repeated measures ANOVA did not reveal any main effects for time (F=0.194, p=0.999), treatment (vitamin D3 vs placebo; F=2.452, p=0.513), or interaction effects for treatment by time (F=1.179, p=0.999) for maximal force production during a 90-degree isometric elbow flexion. Repeated measures ANOVA did not reveal any main effects for time (F=1.710, p=0.804), treatment (vitamin D3 vs placebo; F=1.471, p=0.94), or interaction effects for treatment by time (F=0.293, p=0.999) for mean reaction time to random stimuli during the psychomotor vigilance test. Repeated measures ANOVA did not reveal any main effects for time (F=0.530, p=0.999), treatment (vitamin D3 vs placebo; F=0.141, p=0.999), or interaction effects for treatment by time (F=0.784, p=0
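
    As an illustration of the analysis style reported (simplified to the within-subject "visit" factor only, with synthetic values; the study itself used a mixed design including the treatment factor), a repeated-measures ANOVA in Python could be run as:

```python
# Simplified illustration of a repeated-measures ANOVA on the within-subject
# "visit" factor (three lab visits); all values below are synthetic and the
# between-subject treatment factor is omitted for brevity.
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(0)
records = [{"subject": s, "visit": v, "peak_force": 2500 + rng.normal(scale=150)}
           for s in range(32) for v in ("visit1", "visit2", "visit3")]
df = pd.DataFrame(records)

print(AnovaRM(df, depvar="peak_force", subject="subject", within=["visit"]).fit())
```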

  13. Development and application of high performance resins for crud removal

    International Nuclear Information System (INIS)

    Deguchi, Tatsuya; Izumi, Takeshi; Hagiwara, Masahiro

    1998-01-01

    The development of crud removal technology started with the finding of the resin aging effect: an old ion exchange resin, aged by long years of use in the condensate demineralizer, had an enhanced crud removal capability. It was confirmed that some physical properties such as specific surface area and water retention capacity were increased due to degradation caused by long years of contact with active oxygen species in the condensate water. So, it was speculated that this degradation of the resin matrix enhanced the adsorption of crud particulates onto the resin surface, and hence the crud removal capability. Based on this, a crud removal resin with greater surface area was first developed. This resin has shown an excellent crud removal efficiency in an actual power plant, and the crud iron concentration in the condensate effluent was drastically reduced by this application. However, the cross-linkage of the cation resin had to be lowered in a delicate manner for that specific purpose, and this has caused higher organic leachables from the resin, and the sulfate level in the reactor was raised accordingly. Our major goal, therefore, has been to develop a crud resin with as few organic leachables as possible while keeping the original crud removal efficiency. It was revealed through the evaluation of the first generation crud resin and its improved version installed in the actual condensate demineralizers that there was a good correlation between crud removal efficiency and organic leaching rate. The best one among a number of developmental resins has shown an organic leaching rate of 1/10 of that of the original crud resin (ETR-C), and a crud removal efficiency of 90%. So far as we understand, the resin was considered to have the best overall balance between crud removal and leaching characteristics. The results of a six-month evaluation of this developmental resin, ETR-C3, in one vessel of the condensate demineralizer of a power plant will be presented. (J.P.N.)

  14. New spallation neutron sources, their performance and applications

    International Nuclear Information System (INIS)

    1985-01-01

    Pulsed spallation sources now operating in the world are at the KEK Laboratory in Japan (the KENS source), at Los Alamos National Laboratory (WNR) and at Argonne National Laboratory (IPNS), both the latter being in the US. The Intense Pulsed Neutron Source (IPNS) is currently the world's most intense source with a peak neutron flux of 4 × 10¹⁴ n cm⁻² s⁻¹ at a repetition rate of 30 Hz, and globally producing approx. 1.5 × 10¹⁵ n/sec. Present pulsed sources are still relatively weak compared to their potential. In 1985 the Rutherford Spallation Neutron Source will come on line, and eventually be approx. 30 times more intense than the present IPNS. Later, in 1986 the WNR/PSR option at Los Alamos will make that facility of comparable intensity, while a subcritical fission booster at IPNS will keep IPNS competitive. These new sources will expand the applications of pulsed neutrons but are still based on accelerators built for other scientific purposes, usually nuclear or high-energy physics. Accelerator physicists are now designing machines expressly for spallation neutron research, and the proton currents attainable appear to be in the milliamp range (IPNS now runs at 0.5 GeV and 14 μA). Such design teams are at the KFA Laboratory Julich, Argonne National Laboratory and KEK. Characteristics, particularly the different time structure of the pulses, of these new sources will be discussed. Such machines will be expensive and require national, if not international, collaboration across a wide spectrum of scientific disciplines. The new opportunities for neutron research will, of course, be dramatic with these new sources.

  15. Diagnostic performance of dental students in identifying mandibular condyle fractures by panoramic radiography and the usefulness of reference images

    International Nuclear Information System (INIS)

    Cho, Bong Hae

    2011-01-01

    The purpose of this study was to evaluate the diagnostic performance of dental students in detection of mandibular condyle fractures and the effectiveness of reference panoramic images. Forty-six undergraduates evaluated 25 panoramic radiographs for condylar fractures and the data were analyzed through receiver operating characteristic (ROC) analysis. After a month, they were divided into two homogeneous groups based on the first results and re-evaluated the images with (group A) or without (group B) reference images. Eight reference images included indications showing either typical condylar fractures or anatomic structures which could be confused with fractures. Paired t-test was used for statistical analysis of the difference between the first and the second evaluations for each group, and student's t-test was used between the two groups in the second evaluation. The intra- and inter-observer agreements were evaluated with Kappa statistics. Intra- and inter-observer agreements were substantial (k=0.66) and moderate (k=0.53), respectively. The area under the ROC curve (Az) in the first evaluation was 0.802. In the second evaluation, it was increased to 0.823 for group A and 0.814 for group B. The difference between the first and second evaluations for group A was statistically significant (p<0.05), however there was no statistically significant difference between the two groups in the second evaluation. Providing reference images to less experienced clinicians would be a good way to improve the diagnostic ability in detecting condylar fracture.
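
    The two summary statistics reported, the area under the ROC curve (Az) and Cohen's kappa, can be computed as in the sketch below; the ratings and reference standard shown are made up for illustration:

```python
# Sketch of the two summary statistics reported in the study, computed with
# scikit-learn on made-up observer ratings (not the actual reading data).
from sklearn.metrics import roc_auc_score, cohen_kappa_score

truth       = [1, 1, 1, 0, 0, 0, 1, 0, 1, 0]   # 1 = condylar fracture present
confidence  = [5, 4, 3, 2, 1, 3, 5, 2, 4, 1]   # observer's 5-point rating
second_read = [1, 1, 0, 0, 0, 1, 1, 0, 1, 0]   # same observer, second session

print("Az (area under ROC curve):", roc_auc_score(truth, confidence))
first_read = [int(c >= 3) for c in confidence]  # dichotomize the first reading
print("intra-observer kappa:", cohen_kappa_score(first_read, second_read))
```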

  16. Identifying a key physical factor sensitive to the performance of Madden-Julian oscillation simulation in climate models

    Science.gov (United States)

    Kim, Go-Un; Seo, Kyong-Hwan

    2018-01-01

    A key physical factor in regulating the performance of Madden-Julian oscillation (MJO) simulation is examined by using 26 climate model simulations from the World Meteorological Organization's Working Group for Numerical Experimentation/Global Energy and Water Cycle Experiment Atmospheric System Study (WGNE and MJO-Task Force/GASS) global model comparison project. For this, the intraseasonal moisture budget equation is analyzed and a simple, efficient physical quantity is developed. The result shows that MJO skill is most sensitive to vertically integrated intraseasonal zonal wind convergence (ZC). In particular, a specific threshold value of the strength of the ZC can be used to distinguish between good and poor models. An additional finding is that good models exhibit the correct simultaneous convection and large-scale circulation phase relationship. In poor models, however, the peak circulation response appears 3 days after peak rainfall, suggesting unfavorable coupling between convection and circulation. To improve the simulation of the MJO in climate models, we propose that this delay of circulation in response to convection needs to be corrected in the cumulus parameterization scheme.
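
    The diagnostic named in the abstract, vertically integrated intraseasonal zonal wind convergence, can be sketched as -(1/g) times the pressure integral of du/dx; the grid, wind field and sign convention below are illustrative assumptions, not the paper's exact definition:

```python
# Rough sketch of a vertically integrated (mass-weighted) zonal wind convergence,
# -(1/g) * integral of du/dx over pressure. The grid and wind field are synthetic
# and the exact definition used in the paper may differ.
import numpy as np

g = 9.81                                   # m s^-2
p = np.linspace(100e2, 1000e2, 10)         # pressure levels, Pa (top of column down)
x = np.arange(0, 40e5, 2.5e5)              # zonal distance, m
u = np.sin(x / 4e6)[None, :] * np.linspace(0.2, 1.0, 10)[:, None]  # u(p, x), m/s

du_dx = np.gradient(u, x, axis=1)          # zonal divergence at each level
zc = -np.trapz(du_dx, p, axis=0) / g       # integrate over pressure, flip the sign
print(zc.shape, zc.max())
```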

  17. Identifying the Gene Signatures from Gene-Pathway Bipartite Network Guarantees the Robust Model Performance on Predicting the Cancer Prognosis

    Directory of Open Access Journals (Sweden)

    Li He

    2014-01-01

    Full Text Available For the purpose of improving the prediction of cancer prognosis in clinical research, various algorithms have been developed to construct predictive models with the gene signatures detected by DNA microarrays. Due to the heterogeneity of the clinical samples, the list of differentially expressed genes (DEGs) generated by the statistical methods or the machine learning algorithms often involves a number of false positive genes, which are not associated with the phenotypic differences between the compared clinical conditions, and subsequently impacts the reliability of the predictive models. In this study, we proposed a strategy, which combined the statistical algorithm with the gene-pathway bipartite networks, to generate reliable lists of cancer-related DEGs and constructed the models by using support vector machine for predicting the prognosis of three types of cancers, namely, breast cancer, acute myeloid leukemia, and glioblastoma. Our results demonstrated that, combined with the gene-pathway bipartite networks, our proposed strategy can efficiently generate reliable cancer-related DEG lists for constructing the predictive models. In addition, the model performance in the swap analysis was similar to that in the original analysis, indicating the robustness of the models in predicting the cancer outcomes.
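
    A bare-bones sketch of the modelling step described, training a support vector machine on a filtered gene signature to predict outcome, is given below; the expression matrix is synthetic and the gene-pathway network filtering itself is not shown:

```python
# Bare-bones sketch of the SVM modelling step: train a support vector machine on
# the expression of a filtered DEG list to predict outcome. Expression values and
# labels are synthetic; the network-based gene filtering is not reproduced here.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_patients, n_signature_genes = 120, 30
expression = rng.normal(size=(n_patients, n_signature_genes))
outcome = (expression[:, :5].mean(axis=1) + rng.normal(scale=0.5, size=n_patients) > 0)

clf = SVC(kernel="linear", C=1.0)
print(cross_val_score(clf, expression, outcome.astype(int), cv=5).mean())
```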

  18. The European ALMA production antennas: new drive applications for better performances and low cost management

    Science.gov (United States)

    Giacomel, L.; Manfrin, C.; Marchiori, G.

    2008-07-01

    From its first application on the VLT telescopes until today, the linear motor has proven to be the best solution in terms of quality/cost for technological applications in the astronomical field. Its application in the radio-astronomy sector with the ALMA project combines forefront technology, high reliability and minimal maintenance. The adoption of embedded electronics on each motor sector makes the system modular and redundant, with the elimination of EMC problems.

  19. Diagnostic performance of dental students in identifying mandibular condyle fractures by panoramic radiography and the usefulness of reference images

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Bong Hae [School of Dentistry, Pusan National University, Pusan (Korea, Republic of)

    2011-06-15

    The purpose of this study was to evaluate the diagnostic performance of dental students in detecting mandibular condyle fractures and the effectiveness of reference panoramic images. Forty-six undergraduates evaluated 25 panoramic radiographs for condylar fractures, and the data were analyzed through receiver operating characteristic (ROC) analysis. After a month, they were divided into two homogeneous groups based on the first results and re-evaluated the images with (group A) or without (group B) reference images. The eight reference images included indications showing either typical condylar fractures or anatomic structures that could be confused with fractures. A paired t-test was used for statistical analysis of the difference between the first and the second evaluations for each group, and Student's t-test was used between the two groups in the second evaluation. The intra- and inter-observer agreements were evaluated with kappa statistics. Intra- and inter-observer agreements were substantial (k=0.66) and moderate (k=0.53), respectively. The area under the ROC curve (Az) in the first evaluation was 0.802. In the second evaluation, it increased to 0.823 for group A and 0.814 for group B. The difference between the first and second evaluations for group A was statistically significant (p<0.05); however, there was no statistically significant difference between the two groups in the second evaluation. Providing reference images to less experienced clinicians would be a good way to improve diagnostic ability in detecting condylar fractures.
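
    As a rough, hedged sketch of the statistics reported above (not the authors' code), the area under the ROC curve, the paired comparison between reading sessions, and a kappa agreement can be computed with scikit-learn and SciPy; the ratings and fracture labels below are invented placeholders.

        # Sketch: ROC area, paired t-test and kappa agreement for observer performance data; values are invented.
        import numpy as np
        from sklearn.metrics import roc_auc_score, cohen_kappa_score
        from scipy.stats import ttest_rel

        truth = np.array([1, 1, 0, 0, 1, 0, 1, 0])        # 1 = condylar fracture present (hypothetical)
        first_eval = np.array([4, 5, 2, 1, 3, 2, 5, 1])   # confidence ratings in session 1 (1-5 scale)
        second_eval = np.array([5, 5, 1, 1, 4, 2, 5, 2])  # ratings after reference images were provided

        az_first = roc_auc_score(truth, first_eval)       # area under the ROC curve (Az)
        az_second = roc_auc_score(truth, second_eval)
        t_stat, p_value = ttest_rel(first_eval, second_eval)   # paired comparison of the two sessions

        # Kappa agreement between the binary calls of the two sessions (threshold at rating >= 3)
        kappa = cohen_kappa_score(first_eval >= 3, second_eval >= 3)
        print(az_first, az_second, p_value, kappa)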

  20. Thermal performance of a PCB embedded pulsating heat pipe for power electronics applications

    International Nuclear Information System (INIS)

    Kearney, Daniel J.; Suleman, Omar; Griffin, Justin; Mavrakis, Georgios

    2016-01-01

    Highlights: • Planar, compact PCB embedded pulsating heat pipe for heat spreading applications. • Embedded heat pipe operates at sub-ambient pressure with environmentally compatible fluids. • Range of optimum operating conditions, orientations and fill ratios identified. - Abstract: Low voltage power electronics applications (<1.2 kV) are pushing the design envelope towards increased functionality, better reliability, low profile and reduced cost. One packaging method to enable these constraints is the integration of active power electronic devices into the printed circuit board, improving electrical and thermal performance. This development requires a reliable passive thermal management solution to mitigate hot spots due to the increased heat flux density. To this end, a 44 channel open looped pulsating heat pipe (OL-PHP) is experimentally investigated for two independent dielectric working fluids – Novec™ 649 and Novec™ 774 – due to their lower pressure operation and low global warming potential compared to traditional two-phase coolants. The OL-PHP is investigated in the vertical (90°) orientation with fill ratios ranging from 0.30 to 0.70. The results highlight the steady state operating conditions for each working fluid with instantaneous plots of pressure, temperature, and thermal resistance; the minimum potential bulk thermal resistance for each fill ratio; and the effective thermal conductivity achievable for the OL-PHP.

  1. Cocoa pulp in beer production: Applicability and fermentative process performance.

    Science.gov (United States)

    Nunes, Cassiane da Silva Oliveira; de Carvalho, Giovani Brandão Mafra; da Silva, Marília Lordêlo Cardoso; da Silva, Gervásio Paulo; Machado, Bruna Aparecida Souza; Uetanabaro, Ana Paula Trovatti

    2017-01-01

    This work evaluated the effect of cocoa pulp as a malt adjunct on the parameters of fermentation for beer production on a pilot scale. For this purpose, yeast isolated from the spontaneous fermentation of cachaça (SC52), belonging to the strain bank of the State University of Feira de Santana-Ba (Brazil), and a commercial strain of ale yeast (Safale S-04 Belgium) were used. The beer produced was subjected to acceptance and purchase intention tests for sensorial analysis. At the beginning of fermentation, 30% cocoa pulp (adjunct) was added to the wort at 12°P concentration. The production of beer on a pilot scale was carried out in a bioreactor with a 100-liter capacity, a usable volume of 60 liters, a temperature of 22°C and a fermentation time of 96 hours. The fermentation parameters evaluated were consumption of fermentable sugars and production of ethanol, glycerol and esters. The beer produced using the adjunct and yeast SC52 showed better fermentation performance and better acceptance according to sensorial analysis.

  2. Cocoa pulp in beer production: Applicability and fermentative process performance.

    Directory of Open Access Journals (Sweden)

    Cassiane da Silva Oliveira Nunes

    Full Text Available This work evaluated the effect of cocoa pulp as a malt adjunct on the parameters of fermentation for beer production on a pilot scale. For this purpose, yeast isolated from the spontaneous fermentation of cachaça (SC52), belonging to the strain bank of the State University of Feira de Santana-Ba (Brazil), and a commercial strain of ale yeast (Safale S-04 Belgium) were used. The beer produced was subjected to acceptance and purchase intention tests for sensorial analysis. At the beginning of fermentation, 30% cocoa pulp (adjunct) was added to the wort at 12°P concentration. The production of beer on a pilot scale was carried out in a bioreactor with a 100-liter capacity, a usable volume of 60 liters, a temperature of 22°C and a fermentation time of 96 hours. The fermentation parameters evaluated were consumption of fermentable sugars and production of ethanol, glycerol and esters. The beer produced using the adjunct and yeast SC52 showed better fermentation performance and better acceptance according to sensorial analysis.

  3. [Hearing aid application performance evaluation questionnaire to presbycusis].

    Science.gov (United States)

    Chen, Xianghong; Zhou, Huifang; Zhang, Jing; Wang, Liqun

    2011-02-01

    Patients with presbycusis were fitted with hearing aids and completed a hearing aid performance assessment questionnaire so that the effect of use could be evaluated, problems encountered during use could be addressed in a targeted way, and the quality of life of older persons could be improved. Patients fitted with hearing aids were investigated and analysed face to face; a total of 30 subjects were included in the analysis, and SPSS software was used to analyse the collected data for a preliminary assessment of hearing aid use in patients with presbycusis and of the problems encountered during use. Statistical analysis of the HHIE questionnaire showed that the problems experienced by patients with hearing loss improved significantly after hearing aid use, and statistical analysis of the SADL questionnaire led to the conclusion that overall satisfaction was relatively high. The performance evaluation questionnaire can summarize the improvement experienced by hearing-impaired patients after hearing aid use and their satisfaction with the devices, and can serve as a reliable subjective indicator for assessing the initial rehabilitation effect.

  4. Application of an expert system to optimize reservoir performance

    International Nuclear Information System (INIS)

    Gharbi, Ridha

    2005-01-01

    The main challenge of oil displacement by an injected fluid, such as in Enhanced Oil Recovery (EOR) processes, is to reduce the cost and improve reservoir performance. An optimization methodology, combined with an economic model, is implemented into an expert system to optimize the net present value of full field development with an EOR process. The approach is automated and combines an economic package and existing numerical reservoir simulators to optimize the design of a selected EOR process using sensitivity analysis. The EOR expert system includes three stages of consultations: (1) select an appropriate EOR process on the basis of the reservoir characteristics, (2) prepare appropriate input data sets to design the selected EOR process using existing numerical simulators, and (3) apply the discounted-cash-flow methods to the optimization of the selected EOR process to find out under what conditions at current oil prices this EOR process might be profitable. The project profitability measures were used as the decision-making variables in an iterative approach to optimize the design of the EOR process. The economic analysis is based on the estimated recovery, residual oil in-place, oil price, and operating costs. Two case studies are presented for two reservoirs that have already been produced to their economic limits and are potential candidates for surfactant/polymer flooding, and carbon-dioxide flooding, respectively, or otherwise subject to abandonment. The effect of several design parameters on the project profitability of these EOR processes was investigated

  5. CDTE alloys and their application for increasing solar cell performance

    Science.gov (United States)

    Swanson, Drew E.

    Cadmium telluride (CdTe) thin-film solar is the largest manufactured solar cell technology in the United States and is responsible for one of the lowest costs of utility-scale solar electricity, at a purchase agreement of $0.0387/kWh. However, this cost could be further reduced by increasing the cell efficiency. To bridge the gap between the high efficiency technology and low cost manufacturing, a research and development tool and process was built and tested. This fully automated single vacuum PV manufacturing tool utilizes multiple inline close space sublimation (CSS) sources with automated substrate control. This maintains the proven scalability of the CSS technology and CSS source design but with the added versatility of independent substrate motion. This combination of a scalable deposition technology with increased cell fabrication flexibility has allowed high efficiency cells to be manufactured and studied. The record efficiency of CdTe solar cells is lower than fundamental limitations due to a significant deficit in voltage. It has been modeled that there are two potential methods of decreasing this voltage deficiency. The first method is the incorporation of a high band gap film at the back contact to induce a conduction-band barrier that can reduce recombination by reflecting electrons from the back surface. The addition of a Cd1-x MgxTe (CMT) layer at the back of a CdTe solar cell should induce this desired offset and reflect both photoelectrons and forward-current electrons away from the rear surface. Higher collection of photoelectrons will increase the cell's current, and the reduction of forward current will increase the cell's voltage. To have the optimal effect, CdTe must have reasonable carrier lifetimes and be fully depleted. To achieve this experimentally, CdTe layers have been grown sufficiently thin to help produce a fully depleted cell. A variety of measurements including performance curves, transmission electron microscopy, x

  6. 12 CFR 563e.29 - Effect of CRA performance on applications.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 5 2010-01-01 2010-01-01 false Effect of CRA performance on applications. 563e.29 Section 563e.29 Banks and Banking OFFICE OF THRIFT SUPERVISION, DEPARTMENT OF THE TREASURY COMMUNITY REINVESTMENT Standards for Assessing Performance § 563e.29 Effect of CRA performance on...

  7. 45 CFR 305.33 - Determination of applicable percentages based on performance levels.

    Science.gov (United States)

    2010-10-01

    ..., DEPARTMENT OF HEALTH AND HUMAN SERVICES PROGRAM PERFORMANCE MEASURES, STANDARDS, FINANCIAL INCENTIVES, AND PENALTIES § 305.33 Determination of applicable percentages based on performance levels. (a) A State's... performance levels. 305.33 Section 305.33 Public Welfare Regulations Relating to Public Welfare OFFICE OF...

  8. Mobile NBM - Android medical mobile application designed to help in learning how to identify the different regions of interest in the brain's white matter.

    Science.gov (United States)

    Sánchez-Rola, Iskander; Zapirain, Begoña García

    2014-07-18

    One of the most critical tasks when conducting neurological studies is identifying the different regions of interest in the brain's white matter. Currently few programs or applications are available that serve as an interactive guide in this process. This is why a mobile application has been designed and developed in order to teach users how to identify the referred regions of the brain. It also enables users to share the results obtained and take an examination on the knowledge thus learnt. In order to provide direct user-user or user-developer contact, the project includes a website and a Twitter account. An application has been designed with a basic, minimalist look, which anyone can access easily in order to learn to identify a specific region in the brain's white matter. A survey has also been conducted on people who have used it, which has shown that the application is attractive both in the student (final mean satisfaction of 4.2/5) and in the professional (final mean satisfaction of 4.3/5) environment. The response obtained in the online part of the project reflects the high practical value and quality of the application, as shown by the fact that the website has seen a large number of visitors (over 1000 visitors) and the Twitter account has a high number of followers (over 280 followers). Mobile NBM is the first mobile application to be used as a guide in the process of identifying a region of interest in the brain's white matter. Although initially not many areas are available in the application, new ones can be added as required by users in their respective studies. Apart from the application itself, the online resources provided (website and Twitter account) significantly enhance users' experience.

  9. Investigating And Evaluating Of Network Failures And Performance Over Distributed WAN In Application Protocol Layer

    Directory of Open Access Journals (Sweden)

    Enoch Okoh Kofi

    2015-08-01

    Full Text Available The experiment was carried out to find the relationship between network failures and application performance over a distributed Wide Area Network (WAN). In order to access related applications over the cloud there must be internet connectivity, which allows the respective workstations to access the remote server for applications deployed over the network. Bandwidth improvement helps reduce utilization over the network and also helps improve the efficiency of these applications in terms of response time. Routers were configured under the Enhanced Interior Gateway Routing Protocol (EIGRP) to reduce utilization and to ensure load sharing over the network. Three scenarios were modeled and their performance efficiency was evaluated: a computer network with and without a failed router was modeled under different scenarios, and the network was simulated with emphasis on application performance. The experiment was done for fifty workstations under three scenarios, and these three scenarios were assessed and evaluated experimentally using Riverbed Modeler to show the effect on application and network performance. The performance results show that increasing the bandwidth reduces utilization, and that with the failure of one communication link users can still access network applications at minimal cost.

  10. Application of otolith shape analysis in identifying different ecotypes of Coilia ectenes in the Yangtze Basin, China

    Digital Repository Service at National Institute of Oceanography (India)

    Radhakrishnan, K.V.; Li, Y.; Jayalakshmy, K.V.; Liu, M.; Murphy, B.R.; Xie, S.

    The variability in otolith shape of the tapertail anchovy Coilia ectenes was investigated as a tool for identifying its different ecotypes. The outlines of 350 sagittal otoliths of known ecotypes collected from seven sampling areas, covering most...

  11. Multiple Genes Related to Muscle Identified through a Joint Analysis of a Two-stage Genome-wide Association Study for Racing Performance of 1,156 Thoroughbreds

    Directory of Open Access Journals (Sweden)

    Dong-Hyun Shin

    2015-06-01

    Full Text Available Thoroughbred, a relatively recent horse breed, is best known for its use in horse racing. Although myostatin (MSTN) variants have been reported to be highly associated with horse racing performance, the trait is more likely to be polygenic in nature. The purpose of this study was to identify genetic variants strongly associated with racing performance by using the estimated breeding value (EBV) for race time as a phenotype. We conducted a two-stage genome-wide association study to search for genetic variants associated with the EBV. In the first stage of the genome-wide association study, a relatively large number of markers (~54,000 single-nucleotide polymorphisms, SNPs) were evaluated in a small number of samples (240 horses). In the second stage, a relatively small number of markers identified to have large effects (170 SNPs) were evaluated in a much larger number of samples (1,156 horses). We also validated the SNPs related to MSTN known to have large effects on racing performance and found significant associations in the stage two analysis, but not in stage one. We identified 28 significant SNPs related to 17 genes. Among these, six genes have a function related to myogenesis and five genes are involved in muscle maintenance. To our knowledge, these genes are newly reported for their genetic association with racing performance of Thoroughbreds. This complements a recent horse genome-wide association study of racing performance that identified other SNPs and genes as the most significant variants. These results will help to expand our knowledge of the polygenic nature of racing performance in Thoroughbreds.

  12. A Cross-Platform Infrastructure for Scalable Runtime Application Performance Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Jack Dongarra; Shirley Moore; Bart Miller, Jeffrey Hollingsworth; Tracy Rafferty

    2005-03-15

    The purpose of this project was to build an extensible cross-platform infrastructure to facilitate the development of accurate and portable performance analysis tools for current and future high performance computing (HPC) architectures. Major accomplishments include tools and techniques for multidimensional performance analysis, as well as improved support for dynamic performance monitoring of multithreaded and multiprocess applications. Previous performance tool development has been limited by the burden of having to re-write a platform-dependent low-level substrate for each architecture/operating system pair in order to obtain the necessary performance data from the system. Manual interpretation of performance data is not scalable for large-scale long-running applications. The infrastructure developed by this project provides a foundation for building portable and scalable performance analysis tools, with the end goal being to provide application developers with the information they need to analyze, understand, and tune the performance of terascale applications on HPC architectures. The backend portion of the infrastructure provides runtime instrumentation capability and access to hardware performance counters, with thread-safety for shared memory environments and a communication substrate to support instrumentation of multiprocess and distributed programs. Front-end interfaces provide tool developers with a well-defined, platform-independent set of calls for requesting performance data. End-user tools have been developed that demonstrate runtime data collection, on-line and off-line analysis of performance data, and multidimensional performance analysis. The infrastructure is based on two underlying performance instrumentation technologies. These technologies are the PAPI cross-platform library interface to hardware performance counters and the cross-platform Dyninst library interface for runtime modification of executable images. The Paradyn and KOJAK

  13. Performance modeling of hybrid MPI/OpenMP scientific applications on large-scale multicore supercomputers

    KAUST Repository

    Wu, Xingfu; Taylor, Valerie

    2013-01-01

    In this paper, we present a performance modeling framework based on memory bandwidth contention time and a parameterized communication model to predict the performance of OpenMP, MPI and hybrid applications with weak scaling on three large-scale multicore supercomputers: IBM POWER4, POWER5+ and BlueGene/P, and analyze the performance of these MPI, OpenMP and hybrid applications. We use STREAM memory benchmarks and Intel's MPI benchmarks to provide initial performance analysis and model validation of MPI and OpenMP applications on these multicore supercomputers because the measured sustained memory bandwidth can provide insight into the memory bandwidth that a system should sustain on scientific applications with the same amount of workload per core. In addition to using these benchmarks, we also use a weak-scaling hybrid MPI/OpenMP large-scale scientific application: Gyrokinetic Toroidal Code (GTC) in magnetic fusion to validate our performance model of the hybrid application on these multicore supercomputers. The validation results for our performance modeling method show less than 7.77% error rate in predicting the performance of hybrid MPI/OpenMP GTC on up to 512 cores on these multicore supercomputers. © 2013 Elsevier Inc.

  14. Performance modeling of hybrid MPI/OpenMP scientific applications on large-scale multicore supercomputers

    KAUST Repository

    Wu, Xingfu

    2013-12-01

    In this paper, we present a performance modeling framework based on memory bandwidth contention time and a parameterized communication model to predict the performance of OpenMP, MPI and hybrid applications with weak scaling on three large-scale multicore supercomputers: IBM POWER4, POWER5+ and BlueGene/P, and analyze the performance of these MPI, OpenMP and hybrid applications. We use STREAM memory benchmarks and Intel's MPI benchmarks to provide initial performance analysis and model validation of MPI and OpenMP applications on these multicore supercomputers because the measured sustained memory bandwidth can provide insight into the memory bandwidth that a system should sustain on scientific applications with the same amount of workload per core. In addition to using these benchmarks, we also use a weak-scaling hybrid MPI/OpenMP large-scale scientific application: Gyrokinetic Toroidal Code (GTC) in magnetic fusion to validate our performance model of the hybrid application on these multicore supercomputers. The validation results for our performance modeling method show less than 7.77% error rate in predicting the performance of hybrid MPI/OpenMP GTC on up to 512 cores on these multicore supercomputers. © 2013 Elsevier Inc.

  15. Performance analysis of CRF-based learning for processing WoT application requests expressed in natural language.

    Science.gov (United States)

    Yoon, Young

    2016-01-01

    In this paper, we investigate the effectiveness of a CRF-based learning method for identifying necessary Web of Things (WoT) application components that would satisfy the users' requests issued in natural language. For instance, a user request such as "archive all sports breaking news" can be satisfied by composing a WoT application that consists of ESPN breaking news service and Dropbox as a storage service. We built an engine that can identify the necessary application components by recognizing a main act (MA) or named entities (NEs) from a given request. We trained this engine with the descriptions of WoT applications (called recipes) that were collected from IFTTT WoT platform. IFTTT hosts over 300 WoT entities that offer thousands of functions referred to as triggers and actions. There are more than 270,000 publicly-available recipes composed with those functions by real users. Therefore, the set of these recipes is well-qualified for the training of our MA and NE recognition engine. We share our unique experience of generating the training and test set from these recipe descriptions and assess the performance of the CRF-based language method. Based on the performance evaluation, we introduce further research directions.
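
    As a rough sketch of a CRF-based tagger of the kind described above, using the sklearn-crfsuite package as a stand-in; the feature functions, labels and the single toy request are assumptions and not the authors' recipe corpus.

        # Sketch: CRF sequence labelling of a main act (MA) and named entities (NE) in a WoT request.
        import sklearn_crfsuite

        def token_features(tokens, i):
            # Minimal per-token feature dict; a real system would add context windows, POS tags, etc.
            return {"lower": tokens[i].lower(),
                    "is_first": i == 0,
                    "prev": tokens[i - 1].lower() if i > 0 else "<s>"}

        # Toy training data: one request tagged with a main act and a named entity.
        sentences = [["archive", "all", "sports", "breaking", "news"]]
        labels = [["MA", "O", "O", "NE", "NE"]]

        X_train = [[token_features(s, i) for i in range(len(s))] for s in sentences]
        crf = sklearn_crfsuite.CRF(algorithm="lbfgs", c1=0.1, c2=0.1, max_iterations=100)
        crf.fit(X_train, labels)
        print(crf.predict(X_train))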

  16. Identifying the Gifted Child Humorist.

    Science.gov (United States)

    Fern, Tami L.

    1991-01-01

    This study attempted to identify gifted child humorists among 1,204 children in grades 3-6. Final identification of 13 gifted child humorists was determined through application of such criteria as funniness, originality, and exemplary performance or product. The influence of intelligence, development, social factors, sex differences, family…

  17. Design and performance evaluation of a hall effect magnetic compass for oceanographic and meteorological applications

    Digital Repository Service at National Institute of Oceanography (India)

    Joseph, A.; Desai, R.G.P.; Agarvadekar, Y.; Tengali, T.; Mishra, M.; Fadate, C.; Gomes, L.

    A Hall Effect magnetic compass, suitable for oceanographic and meteorological applications, has been designed and its performance characteristics have been evaluated. Slope of the least-squares-fitted linear graph was found to be close to the ideal...

  18. High Performance Multi-GPU SpMV for Multi-component PDE-Based Applications

    KAUST Repository

    Abdelfattah, Ahmad; Ltaief, Hatem; Keyes, David E.

    2015-01-01

    -block structure. While these optimizations are important for high performance dense kernel executions, they are even more critical when dealing with sparse linear algebra operations. The most time-consuming phase of many multicomponent applications, such as models

  19. Using Modeling and Simulation to Analyze Application and Network Performance at the Radioactive Waste and Nuclear Material Disposition Facility

    International Nuclear Information System (INIS)

    LIFE, ROY A.; MAESTAS, JOSEPH H.; BATEMAN, DENNIS B.

    2003-01-01

    Telecommunication services customers at the Radioactive Waste and Nuclear Material Disposition Facility (RWNMDF) have endured regular service outages that seem to be associated with a custom Microsoft Access Database. In addition, the same customers have noticed periods when application response times are noticeably worse than at other times. To the customers, the two events appear to be correlated. Although many network design activities can be accomplished using trial-and-error methods, there are as many, if not more, occasions where computerized analysis is necessary to verify the benefits of implementing one design alternative versus another. This is particularly true when network design is performed with application flows and response times in mind. More often than not, it is unclear whether upgrading certain aspects of the network will provide sufficient benefit to justify the corresponding costs, and network modeling tools can be used to help staff make these decisions. This report summarizes our analysis of the situation at the RWNMDF, in which computerized analysis was used to accomplish four objectives: (1) identify the source of the problem; (2) identify areas where improvements make the most sense; (3) evaluate various scenarios ranging from upgrading the network infrastructure, to installing an additional fiber trunk as a way to improve local network performance, to re-locating the RWNMDF database onto corporate servers; and (4) demonstrate a methodology for network design using actual application response times to predict, select, and implement the design alternatives that provide the best performance and cost benefits

  20. Where to from here? Future applications of mental models of complex performance

    International Nuclear Information System (INIS)

    Hahn, H.A.; Nelson, W.R.; Blackman, H.S.

    1988-01-01

    The purpose of this paper is to raise issues for discussion regarding the applications of mental models in the study of complex performance. Applications for training, expert systems and decision aids, job selection, workstation design, and other complex environments are considered. 1 ref

  1. 2003 Wastewater Land Application Site Performance Reports for the Idaho National Engineering and Environmental Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Teresa R. Meachum

    2004-02-01

    The 2003 Wastewater Land Application Site Performance Reports for the Idaho National Engineering and Environmental Laboratory describe the conditions for the facilities with State of Idaho Wastewater Land Application Permits. Permit-required monitoring data are summarized, and permit exceedences or environmental impacts relating to the operations of the facilities during the 2003 permit year are discussed.

  2. Remote Core Locking: Migrating Critical-Section Execution to Improve the Performance of Multithreaded Applications

    OpenAIRE

    Lozi , Jean-Pierre; David , Florian; Thomas , Gaël; Lawall , Julia; Muller , Gilles

    2014-01-01

    The scalability of multithreaded applications on current multicore systems is hampered by the performance of lock algorithms, due to the costs of access contention and cache misses. In this paper, we propose a new lock algorithm, Remote Core Locking (RCL), that aims to improve the performance of critical sections in legacy applications on multicore architectures. The idea of RCL is to replace lock acquisitions by optimized remote procedure calls to a dedicated server core. ...

  3. APPLICATION OF GIS AND GROUNDWATER MODELLING TECHNIQUES TO IDENTIFY THE PERCHED AQUIFERS TO DEMARKATE WATER LOGGING CONDITIONS IN PARTS OF MEHSANA

    Directory of Open Access Journals (Sweden)

    D. Rawal

    2016-06-01

    The study highlights the application of GIS in establishing the basic parameters of soil, land use, and the distribution of waterlogging over a period of time. The groundwater modelling identifies the groundwater regime of the area, estimates the total recharge to the area due to surface water irrigation and rainfall, and suggests a suitable method to control waterlogging in the area.

  4. Revivification of a method for identifying longleaf pine timber and its application to southern pine relicts in southeastern Virginia

    Science.gov (United States)

    Thomas L. Eberhardt; Philip M. Sheridan; Arvind A.R. Bhuta

    2011-01-01

    Abstract: Longleaf pine (Pinus palustris Mill.) cannot be distinguished from the other southern pines based on wood anatomy alone. A method that involves measuring pith and second annual ring diameters, reported by Arthur Koehler in 1932 (The Southern Lumberman, 145: 36–37), was revisited as an option for identifying longleaf pine timbers and stumps. Cross-section...

  5. Identifying consumer segments in health services markets: an application of conjoint and cluster analyses to the ambulatory care pharmacy market.

    Science.gov (United States)

    Carrol, N V; Gagon, J P

    1983-01-01

    Because of increasing competition, it is becoming more important that health care providers pursue consumer-based market segmentation strategies. This paper presents a methodology for identifying and describing consumer segments in health service markets, and demonstrates the use of the methodology by presenting a study of consumer segments in the ambulatory care pharmacy market.
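
    A minimal sketch of the segmentation idea (clustering consumers on conjoint-derived attribute importances); the attribute names and data are hypothetical, and k-means is used here only as one common clustering choice, not necessarily the method of the paper.

        # Sketch: cluster consumers on conjoint attribute importances to form market segments; data are synthetic.
        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(1)
        # Columns: relative importance of price, waiting time and pharmacist counselling (hypothetical attributes)
        importances = rng.dirichlet(alpha=[2, 2, 2], size=200)

        kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(importances)
        for k in range(3):
            members = importances[kmeans.labels_ == k]
            print(f"segment {k}: size={len(members)}, mean importances={members.mean(axis=0).round(2)}")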

  6. Application of biclustering of gene expression data and gene set enrichment analysis methods to identify potentially disease causing nanomaterials

    Directory of Open Access Journals (Sweden)

    Andrew Williams

    2015-12-01

    Full Text Available Background: The presence of diverse types of nanomaterials (NMs) in commerce is growing at an exponential pace. As a result, human exposure to these materials in the environment is inevitable, necessitating rapid and reliable toxicity testing methods to accurately assess the potential hazards associated with NMs. In this study, we applied biclustering and gene set enrichment analysis methods to derive essential features of the altered lung transcriptome following exposure to NMs that are associated with lung-specific diseases. Several datasets from public microarray repositories describing pulmonary diseases in mouse models following exposure to a variety of substances were examined, and functionally related biclusters of genes showing similar expression profiles were identified. The identified biclusters were then used to conduct a gene set enrichment analysis on pulmonary gene expression profiles derived from mice exposed to nano-titanium dioxide (nano-TiO2), carbon black (CB) or carbon nanotubes (CNTs) to determine the disease significance of these data-driven gene sets. Results: Biclusters representing inflammation (chemokine activity), DNA binding, cell cycle, apoptosis, reactive oxygen species (ROS) and fibrosis processes were identified. All of the NM studies were significant with respect to the bicluster related to chemokine activity (DAVID; FDR p-value = 0.032). The bicluster related to pulmonary fibrosis was enriched in studies where toxicity induced by CNTs and CB was investigated, suggesting the potential for these materials to induce lung fibrosis. The pro-fibrogenic potential of CNTs is well established. Although CB has not been shown to induce fibrosis, it induces stronger inflammatory, oxidative stress and DNA damage responses than nano-TiO2 particles. Conclusion: The results of the analysis correctly identified all NMs to be inflammogenic and only CB and CNTs as potentially fibrogenic. In addition to identifying several
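
    A hedged sketch of the enrichment step described above: testing whether a data-driven bicluster gene set is over-represented among genes responsive to a nanomaterial exposure using Fisher's exact test; all gene counts are invented.

        # Sketch: Fisher's exact test for enrichment of a bicluster gene set in an exposure signature.
        from scipy.stats import fisher_exact

        background = 20000    # genes measured on the array (hypothetical)
        bicluster = 150       # genes in the chemokine-activity bicluster (hypothetical)
        responsive = 400      # genes altered by the nanomaterial exposure (hypothetical)
        overlap = 25          # bicluster genes that are also responsive (hypothetical)

        table = [[overlap, bicluster - overlap],
                 [responsive - overlap, background - bicluster - responsive + overlap]]
        odds_ratio, p_value = fisher_exact(table, alternative="greater")
        print(odds_ratio, p_value)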

  7. The Epidemiology of Injuries Identified at the National Football League Scouting Combine and their Impact on Professional Sport Performance: 2203 athletes, 2009-2015

    Science.gov (United States)

    Price, Mark D.; Rossy, William H.; Sanchez, George; McHale, Kevin Jude; Logan, Catherine; Provencher, Matthew T.

    2017-01-01

    Objectives: At the annual National Football League (NFL) Scouting Combine, the medical staff of each NFL franchise performs a comprehensive medical evaluation of all athletes potentially entering the NFL. Currently, little is known regarding the overall epidemiology of injuries identified at the Combine and their impact on NFL performance. The purpose of this study is to determine the epidemiology of injuries identified at the Combine and their impact on future NFL performance. Methods: All previous musculoskeletal injuries identified at the NFL combine (2009-2015) were retrospectively reviewed. Medical records and imaging reports were examined. Game statistics for the first two seasons of NFL play were obtained for all players from 2009 to 2013. Analysis of injury prevalence and overall impact on draft status and position-specific performance metrics of each injury was performed and compared versus a position-matched control group with no history of injury and surgery. Results: A total of 2,203 athletes over seven years were evaluated, including 1,490 (67.6%) drafted athletes and 1,040 (47.2%) who ultimately played at least two years in the NFL. The most common sites of injury were the ankle (1160, 52.7%), shoulder (1143, 51.9%), knee (1128, 51.2%), spine (785, 35.6%), and hand (739, 33.5%). Odds ratios (OR) demonstrated quarterbacks were most at risk of shoulder injury (OR 2.78, p=0.001) while running backs most commonly sustained ankle (OR 1.49, p=0.038) and shoulder injuries (OR 1.55, p=0.022). Ultimately, defensive players demonstrated a more negative impact than offensive players following injury with multiple performance metrics impacted for each defensive position analyzed whereas skilled offensive players (i.e. quarterbacks, running backs) demonstrated only one metric affected at each position. Conclusion: The most common sites of injury identified at the Combine were: (1) ankle, (2) shoulder, (3) knee, (4) spine, and (5) hand. Overall, performance

  8. Epidemiology of Injuries Identified at the NFL Scouting Combine and Their Impact on Performance in the National Football League: Evaluation of 2203 Athletes From 2009 to 2015.

    Science.gov (United States)

    Beaulieu-Jones, Brendin R; Rossy, William H; Sanchez, George; Whalen, James M; Lavery, Kyle P; McHale, Kevin J; Vopat, Bryan G; Van Allen, Joseph J; Akamefula, Ramesses A; Provencher, Matthew T

    2017-07-01

    At the annual National Football League (NFL) Scouting Combine, the medical staff of each NFL franchise performs a comprehensive medical evaluation of all athletes potentially entering the NFL. Currently, little is known regarding the overall epidemiology of injuries identified at the combine and their impact on NFL performance. To determine the epidemiology of injuries identified at the combine and their impact on initial NFL performance. Cohort study; Level of evidence, 3. All previous musculoskeletal injuries identified at the NFL Combine from 2009 to 2015 were retrospectively reviewed. Medical records and imaging reports were examined. Game statistics for the first 2 seasons of NFL play were obtained for all players from 2009 to 2013. Analysis of injury prevalence and overall impact on the draft status and position-specific performance metrics of each injury was performed and compared with a position-matched control group with no history of injury or surgery. A total of 2203 athletes over 7 years were evaluated, including 1490 (67.6%) drafted athletes and 1040 (47.2%) who ultimately played at least 2 years in the NFL. The most common sites of injury were the ankle (1160, 52.7%), shoulder (1143, 51.9%), knee (1128, 51.2%), spine (785, 35.6%), and hand (739, 33.5%). Odds ratios (ORs) demonstrated that quarterbacks were most at risk of shoulder injury (OR, 2.78; P = .001), while running backs most commonly sustained ankle (OR, 1.39; P = .040) and shoulder injuries (OR, 1.55; P = .020) when compared with all other players. Ultimately, defensive players demonstrated a greater negative impact due to injury than offensive players, with multiple performance metrics significantly affected for each defensive position analyzed, whereas skilled offensive players (eg, quarterbacks, running backs) demonstrated only 1 metric significantly affected at each position. The most common sites of injury identified at the combine were (1) ankle, (2) shoulder, (3) knee, (4) spine, and
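
    For illustration only (the counts below are invented, not the study's data), a position-specific odds ratio with a 95% confidence interval can be computed from a 2x2 table as follows:

        # Sketch: odds ratio and 95% CI for injury at one position versus all other players; counts are invented.
        import numpy as np
        from scipy.stats import norm

        a, b = 60, 40      # quarterbacks: with shoulder injury, without (hypothetical)
        c, d = 500, 900    # all other players: with shoulder injury, without (hypothetical)

        odds_ratio = (a * d) / (b * c)
        se_log_or = np.sqrt(1/a + 1/b + 1/c + 1/d)
        z = norm.ppf(0.975)
        low, high = np.exp(np.log(odds_ratio) + np.array([-z, z]) * se_log_or)
        print(f"OR = {odds_ratio:.2f}, 95% CI = ({low:.2f}, {high:.2f})")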

  9. Application of balanced score card in the development of performance indicator system in nuclear power plant

    International Nuclear Information System (INIS)

    Shen Shuguang; Huang Fang; Fang Zhaoxia

    2013-01-01

    Performance indicator, which is one of ten performance monitoring tools recommended by the WANO performance improvement model, has become an effective tool for performance improvement of nuclear power plants. At present, performance indicator systems have been built in nuclear power plants. However, how to establish a performance indicator system that is reasonable and applicable for a plant is still a question to be discussed. Performance indicators are closely tied to the strategic direction of a corporation through a balanced score card, and the performance indicator system is established from the point of view of performance management and strategic development. The performance indicator system of a nuclear power plant is developed by introducing the balanced score card, and can serve as a reference for other domestic nuclear power plants. (authors)

  10. The performance of blood pressure-to-height ratio as a screening measure for identifying children and adolescents with hypertension: a meta-analysis.

    Science.gov (United States)

    Ma, Chunming; Liu, Yue; Lu, Qiang; Lu, Na; Liu, Xiaoli; Tian, Yiming; Wang, Rui; Yin, Fuzai

    2016-02-01

    The blood pressure-to-height ratio (BPHR) has been shown to be an accurate index for screening hypertension in children and adolescents. The aim of the present study was to perform a meta-analysis to assess the performance of BPHR for the assessment of hypertension. Electronic and manual searches were performed to identify studies of the BPHR. After methodological quality assessment and data extraction, pooled estimates of the sensitivity, specificity, positive likelihood ratio, negative likelihood ratio, diagnostic odds ratio, area under the receiver operating characteristic curve and summary receiver operating characteristics were assessed systematically. The extent of heterogeneity was also assessed. Six studies were identified for analysis. The pooled sensitivity, specificity, positive likelihood ratio, negative likelihood ratio and diagnostic odds ratio values of BPHR, for assessment of hypertension, were 96% [95% confidence interval (CI)=0.95-0.97], 90% (95% CI=0.90-0.91), 10.68 (95% CI=8.03-14.21), 0.04 (95% CI=0.03-0.07) and 247.82 (95% CI=114.50-536.34), respectively. The area under the receiver operating characteristic curve was 0.9472. The BPHR had high diagnostic accuracy for identifying hypertension in children and adolescents.
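
    As a simplified, hedged sketch of the pooling step (a fixed-effect inverse-variance average on the logit scale, which is only one of several ways such diagnostic meta-analyses are done); the per-study counts below are invented, not the six included studies.

        # Sketch: fixed-effect pooled sensitivity on the logit scale from per-study counts; data are invented.
        import numpy as np

        tp = np.array([180, 220, 95, 150, 210, 130])   # true positives per study (hypothetical)
        fn = np.array([10, 8, 6, 5, 12, 4])            # false negatives per study (hypothetical)

        sens = tp / (tp + fn)
        logit = np.log(sens / (1 - sens))
        weights = 1 / (1/tp + 1/fn)                    # inverse of the variance of the logit proportion

        pooled_logit = np.sum(weights * logit) / np.sum(weights)
        se = np.sqrt(1 / np.sum(weights))
        pooled = 1 / (1 + np.exp(-pooled_logit))
        ci = 1 / (1 + np.exp(-(pooled_logit + np.array([-1.96, 1.96]) * se)))
        print(f"pooled sensitivity = {pooled:.3f}, 95% CI = ({ci[0]:.3f}, {ci[1]:.3f})")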

  11. Identifying mRNA targets of microRNA dysregulated in cancer: with application to clear cell Renal Cell Carcinoma

    Directory of Open Access Journals (Sweden)

    Liou Louis S

    2010-04-01

    Full Text Available Abstract Background MicroRNA regulate mRNA levels in a tissue specific way, either by inducing degradation of the transcript or by inhibiting translation or transcription. Putative mRNA targets of microRNA identified from seed sequence matches are available in many databases. However, such matches have a high false positive rate and cannot identify tissue specificity of regulation. Results We describe a simple method to identify direct mRNA targets of microRNA dysregulated in cancers from expression level measurements in patient matched tumor/normal samples. The word "direct" is used here in a strict sense to: a) represent mRNA which have an exact seed sequence match to the microRNA in their 3'UTR, b) the seed sequence match is strictly conserved across mouse, human, rat and dog genomes, c) the mRNA and microRNA expression levels can distinguish tumor from normal with high significance and d) the microRNA/mRNA expression levels are strongly and significantly anti-correlated in tumor and/or normal samples. We apply and validate the method using clear cell Renal Cell Carcinoma (ccRCC) and matched normal kidney samples, limiting our analysis to mRNA targets which undergo degradation of the mRNA transcript because of a perfect seed sequence match. Dysregulated microRNA and mRNA are first identified by comparing their expression levels in tumor vs normal samples. Putative dysregulated microRNA/mRNA pairs are identified from these using seed sequence matches, requiring that the seed sequence be conserved in human/dog/rat/mouse genomes. These are further pruned by requiring a strong anti-correlation signature in tumor and/or normal samples. The method revealed many new regulations in ccRCC. For instance, loss of miR-149, miR-200c and mir-141 causes gain of function of oncogenes (KCNMA1, LOX, VEGFA and SEMA6A respectively) and increased levels of miR-142-3p, miR-185, mir-34a, miR-224, miR-21 cause loss of function of tumor suppressors LRRC2, PTPN13, SFRP1

  12. Review of the performance assessment in the WIPP draft compliance application

    International Nuclear Information System (INIS)

    Lee, W.W.L.

    1996-01-01

    On March 31, 1995, the U.S. Department of Energy (USDOE) filed a draft compliance certification application (DCCA) with the U.S. Environmental Protection Agency (USEPA) to show the Waste Isolation Pilot Plant's compliance with the USEPA's environmental standards for disposal of high-level and transuranic waste. Demonstration of compliance is by a performance assessment. This paper is an early review of the performance assessment in the draft application, by the Environmental Evaluation Group, an oversight group. The performance assessment in the draft application is incomplete. Not all relevant scenarios have been analyzed. The calculation of potential consequences often does not use experimental data but rather estimates by workers developing additional data. The final compliance application, scheduled for October 1996, needs to consider additional scenarios, and be fully based on experimental data

  13. Modified Principal Component Analysis for Identifying Key Environmental Indicators and Application to a Large-Scale Tidal Flat Reclamation

    Directory of Open Access Journals (Sweden)

    Kejian Chu

    2018-01-01

    Full Text Available Identification of the key environmental indicators (KEIs) from a large number of environmental variables is important for environmental management in tidal flat reclamation areas. In this study, a modified principal component analysis approach (MPCA) has been developed for determining the KEIs. The MPCA accounts for the two important attributes of the environmental variables: pollution status and temporal variation, in addition to the commonly considered numerical divergence attribute. It also incorporates the distance correlation (dCor) to replace the Pearson's correlation to measure the nonlinear interrelationship between the variables. The proposed method was applied to the Tiaozini sand shoal, a large-scale tidal flat reclamation region in China. Five KEIs were identified as dissolved inorganic nitrogen, Cd, petroleum in the water column, Hg, and total organic carbon in the sediment. The identified KEIs were shown to respond well to the biodiversity of phytoplankton. This demonstrated that the identified KEIs adequately represent the environmental condition in the coastal marine system. Therefore, the MPCA is a practicable method for extracting effective indicators that have key roles in the coastal and marine environment.
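
    A minimal NumPy sketch of the distance correlation (dCor) measure mentioned above, which captures nonlinear dependence between two variables; the implementation follows the standard sample definition and the data are synthetic, not the Tiaozini monitoring variables.

        # Sketch: sample distance correlation between two 1-D variables (Szekely-Rizzo V-statistic form).
        import numpy as np

        def distance_correlation(x, y):
            x, y = np.asarray(x, float), np.asarray(y, float)
            a = np.abs(x[:, None] - x[None, :])                   # pairwise distance matrices
            b = np.abs(y[:, None] - y[None, :])
            A = a - a.mean(0) - a.mean(1)[:, None] + a.mean()     # double-centred distances
            B = b - b.mean(0) - b.mean(1)[:, None] + b.mean()
            dcov2 = (A * B).mean()
            dvar_x, dvar_y = (A * A).mean(), (B * B).mean()
            return np.sqrt(dcov2) / (dvar_x * dvar_y) ** 0.25

        rng = np.random.default_rng(0)
        x = rng.normal(size=500)
        print(distance_correlation(x, x ** 2))   # strong nonlinear dependence despite near-zero Pearson r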

  14. The application of visceral adiposity index in identifying type 2 diabetes risks based on a prospective cohort in China.

    Science.gov (United States)

    Chen, Chen; Xu, Yan; Guo, Zhi-rong; Yang, Jie; Wu, Ming; Hu, Xiao-shu

    2014-07-08

    Visceral adiposity index (VAI), a novel sex-specific index for visceral fat measurement, has been proposed recently. We evaluate the efficacy of VAI in identifying diabetes risk in Chinese people, and compare the predictive ability between VAI and other body fatness indices, i.e., waist circumference (WC), body mass index (BMI) and waist-to-height ratio (WHtR). Participants (n=3,461) were recruited from an ongoing cohort study in Jiangsu Province, China. Hazard ratios (HR) and corresponding 95% confidence intervals (CI) between diabetes risk and the different body fatness indices were evaluated by Cox proportional hazard regression models. Receiver operating characteristic (ROC) curves and the area under the curve (AUC) were applied to compare the ability to identify diabetes risk between VAI, WC, WHtR and BMI. A total number of 160 new diabetic cases occurred during the follow-up, with an incidence of 4.6%. Significant positive associations were observed for VAI with blood pressure, fasting plasma glucose, triglyceride, WC, BMI and WHtR. Moreover, increased VAI was observed to be associated with higher diabetes risk, with a positive dose-response trend. VAI could be a convenient surrogate marker for visceral adiposity measurement and could be used in identifying the risk of diabetes in large-scale epidemiologic studies.
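
    A hedged sketch of the survival-analysis step (Cox proportional hazards for incident diabetes against baseline VAI), using the lifelines package on synthetic data; the column names, covariates and follow-up structure are assumptions, not the cohort's actual variables.

        # Sketch: Cox proportional hazards model of incident diabetes versus baseline VAI; data are synthetic.
        import numpy as np
        import pandas as pd
        from lifelines import CoxPHFitter

        rng = np.random.default_rng(0)
        n = 3461
        df = pd.DataFrame({
            "vai": rng.lognormal(mean=0.5, sigma=0.4, size=n),   # visceral adiposity index at baseline
            "age": rng.normal(50, 10, size=n),
            "followup_years": rng.uniform(1, 6, size=n),         # time to event or censoring
            "diabetes": (rng.random(n) < 0.05).astype(int),      # incident diabetes indicator
        })

        cph = CoxPHFitter()
        cph.fit(df, duration_col="followup_years", event_col="diabetes")
        cph.print_summary()   # exp(coef) is the hazard ratio per unit increase in each covariate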

  15. Pay for Performance Proposals in Race to the Top Round II Applications. Briefing Memo

    Science.gov (United States)

    Rose, Stephanie

    2010-01-01

    The Education Commission of the States reviewed all 36 Race to the Top (RttT) round II applications. Each of the 36 states that applied for round II funding referenced pay for performance under the heading of "Improving teacher and principal effectiveness based on performance." The majority of states outlined pay for performance…

  16. Methods for Identifying Part Quality Issues and Estimating Their Cost with an Application Using the UH-60

    Science.gov (United States)

    2014-01-01

    Tools such as Pareto and Weibull analysis allow practitioners to monitor system performance over time and assess improvements or degradation in time

  17. Results of data base management system parameterized performance testing related to GSFC scientific applications

    Science.gov (United States)

    Carchedi, C. H.; Gough, T. L.; Huston, H. A.

    1983-01-01

    The results of a variety of tests designed to demonstrate and evaluate the performance of several commercially available data base management system (DBMS) products compatible with the Digital Equipment Corporation VAX 11/780 computer system are summarized. The tests were performed on the INGRES, ORACLE, and SEED DBMS products employing applications that were similar to scientific applications under development by NASA. The objectives of this testing included determining the strengths and weaknesses of the candidate systems, the performance trade-offs of various design alternatives, and the impact of some installation and environmental (computer-related) influences.

  18. Application of the Analytic Hierarchy Process to Identify the Most Suitable Lessor of Freight Car Finance Leasing

    Directory of Open Access Journals (Sweden)

    Lei Lei

    2017-01-01

    Full Text Available Finance leasing (also called "equipment leasing") saves cost, improves efficiency and benefit, and broadens manufacturers' supply channels, which makes it an optimal solution for equipment supply under uncertain freight demand. The article collects the definitions of finance leasing based on the four pillars theory of finance leasing, and divides the lessors in freight car finance leasing into three categories according to their major business: manufacturers, banks as the representative financial institutions, and firms specialized in finance leasing. To identify the most suitable lessor for each railway department, an indicator system is built and operated with Yaahp (software based on the Analytic Hierarchy Process).
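
    As an illustrative sketch of the calculation that an AHP tool such as Yaahp performs internally (the standard eigenvector method, not Yaahp's own code); the pairwise comparison values for the three lessor types are invented.

        # Sketch: AHP priority weights and consistency ratio from a pairwise comparison matrix; values are invented.
        import numpy as np

        # Hypothetical pairwise comparisons of manufacturer vs bank vs specialized leasing firm.
        A = np.array([[1.0, 3.0, 5.0],
                      [1/3, 1.0, 2.0],
                      [1/5, 1/2, 1.0]])

        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)
        weights = np.abs(eigvecs[:, k].real)
        weights /= weights.sum()                      # priority vector over the three lessor types

        n = A.shape[0]
        ci = (eigvals.real[k] - n) / (n - 1)          # consistency index
        ri = 0.58                                     # Saaty's random index for n = 3
        print("weights:", weights.round(3), "consistency ratio:", round(ci / ri, 3))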

  19. Development of GEM detector for plasma diagnostics application: simulations addressing optimization of its performance

    Science.gov (United States)

    Chernyshova, M.; Malinowski, K.; Kowalska-Strzęciwilk, E.; Czarski, T.; Linczuk, P.; Wojeński, A.; Krawczyk, R. D.

    2017-12-01

    The advanced Soft X-ray (SXR) diagnostics setup devoted to studies of SXR plasma emissivity is at present highly relevant and important for ITER/DEMO applications, especially when focusing on the energy range of tungsten emission lines, since plasma contamination by W and its transport in the plasma must be understood and monitored for a W plasma-facing material. The SXR radiation detection system under development by our group, based on a Gas Electron Multiplier with a spatially and energy-resolved photon-detecting chamber, may become such a diagnostic setup once many physical, technical and technological aspects have been considered and solved. This work presents the results of simulations aimed at optimizing the design of the detector's internal chamber and its performance. The study of the effect of electrode alignment allowed choosing the gap distances which maximize electron transmission and choosing the optimal magnitudes of the applied electric fields. Finally, the optimal readout structure design was identified as suitable for collecting the total formed charge effectively, based on the range of the simulated electron cloud at the readout plane, which was of the order of ~ 2 mm.

  20. Predictors of academic performance for applicants to an international dental studies program in the United States.

    Science.gov (United States)

    Pitigoi-Aron, Gabriela; King, Patricia A; Chambers, David W

    2011-12-01

    The number of U.S. and Canadian dental schools offering programs for dentists with degrees from other countries leading to the D.D.S. or D.M.D. degree has increased recently. This fact, along with the diversity of educational systems represented by candidates for these programs, increases the importance of identifying valid admissions predictors of success in international dental student programs. Data from 148 students accepted into the international dental studies program at the University of the Pacific from 1994 through 2004 were analyzed. Dependent variables were comprehensive cumulative GPA at the end of both the first and second years of the two-year program. The Test of English as a Foreign Language (TOEFL) and both Parts I and II of the National Board Dental Examination (NBDE) were significant positive predictors of success. Performance on laboratory tests of clinical skill in operative dentistry and in fixed prosthodontics and ratings from interviewers were not predictive of overall success in the program. Although this study confirms the predictive value of written tests such as the TOEFL and NBDE, it also contributes to the literature documenting inconsistent results regarding other types of predictors. It may be the case that characteristics of individual programs or features of the applicant pools for each may require use of admissions predictors that are unique to schools.

  1. Medical School Applicant Characteristics Associated With Performance in Multiple Mini-Interviews Versus Traditional Interviews: A Multi-Institutional Study.

    Science.gov (United States)

    Henderson, Mark C; Kelly, Carolyn J; Griffin, Erin; Hall, Theodore R; Jerant, Anthony; Peterson, Ellena M; Rainwater, Julie A; Sousa, Francis J; Wofsy, David; Franks, Peter

    2017-10-31

    To examine applicant characteristics associated with multiple mini-interview (MMI) or traditional interview (TI) performance at five California public medical schools. Of the five California Longitudinal Evaluation of Admissions Practices (CA-LEAP) consortium schools, three used TIs and two used MMIs. Schools provided the following retrospective data on all 2011-2013 admissions cycle interviewees: age, gender, race/ethnicity (under-represented in medicine [UIM] or not), self-identified disadvantaged (DA) status, undergraduate GPA, Medical College Admission Test (MCAT) score, and interview score (standardized as z-score, mean = 0, SD = 1). Adjusted linear regression analyses, stratified by interview type, examined associations with interview performance. The 4,993 applicants who completed 7,516 interviews included 931 (18.6%) UIM and 962 (19.3%) DA individuals; 3,226 (64.6%) had one interview. Mean age was 24.4 (SD = 2.7); mean GPA and MCAT score were 3.72 (SD = 0.22) and 33.6 (SD = 3.7), respectively. Older age, female gender, and number of prior interviews were associated with better performance on both MMIs and TIs. Higher GPA was associated with lower MMI scores (z-score, per unit GPA = -0.26, 95% CI [-0.45, -0.06]), but unrelated to TI scores. DA applicants had higher TI scores (z-score = 0.17, 95% CI [0.07, 0.28]), but lower MMI scores (z-score = -0.18, 95% CI [-0.28, -0.08]) than non-DA applicants. Neither UIM status nor MCAT score was associated with interview performance. These findings have potentially important workforce implications, particularly regarding DA applicants, and illustrate the need for other multi-institutional studies of medical school admissions processes.

  2. Correlation of Behavioral Interviewing Performance With Obstetrics and Gynecology Residency Applicant Characteristics.

    Science.gov (United States)

    Breitkopf, Daniel M; Vaughan, Lisa E; Hopkins, Matthew R

    To determine which individual residency applicant characteristics were associated with improved performance on standardized behavioral interviews. Behavioral interviewing has become a common technique for assessing resident applicants. Few data exist on factors that predict success during the behavioral interview component of the residency application process. Interviewers were trained in behavioral interviewing techniques before each application season. Standardized questions were used. Behavioral interview scores and Electronic Residency Application Service data from residency applicants were collected prospectively for 3 years. The study setting was the Accreditation Council for Graduate Medical Education-accredited obstetrics and gynecology residency program at a Midwestern academic medical center. Medical students applying to a single obstetrics-gynecology residency program from 2012 to 2014 participated in the study. Data were collected from 104 applicants during 3 successive interview seasons. Applicant age was associated with higher overall scores on questions about leadership, coping, and conflict management (for applicants aged ≤25, 26-27, or ≥28y, mean scores were 15.2, 16.0, and 17.2, respectively; p = 0.03), as was a history of employment before medical school (16.8 vs 15.5; p = 0.03). Applicants who participated in collegiate team sports scored lower on questions assessing influence/persuasion, initiative, and relationship management compared with those who did not (mean, 15.5 vs 17.1; p = 0.02). Advanced applicant age and a history of work experience before medical school may improve skills in dealing with difficult situations and offer opportunities in leadership. In the behavioral interview format, having relevant examples from life experience to share during the interviews may improve the quality of the applicant's responses. Increased awareness of the factors predicting interview performance helps inform the selection process and allows program directors to

  3. Application of Network Analysis to Identify and Map Relationships between Information Systems in the context of Arctic Sustainability

    Science.gov (United States)

    Kontar, Y. Y.

    2017-12-01

    The Arctic Council is an intergovernmental forum promoting cooperation, coordination and interaction among the Arctic States and indigenous communities on issues of sustainable development and environmental protection in the North. The work of the Council is primarily carried out by six Working Groups: Arctic Contaminants Action Program, Arctic Monitoring and Assessment Programme, Conservation of Arctic Flora and Fauna, Emergency Prevention, Preparedness and Response, Protection of the Arctic Marine Environment, and Sustainable Development Working Group. The Working Groups are composed of researchers and representatives from government agencies. Each Working Group issues numerous scientific assessments and reports on a broad field of subjects, from climate change to emergency response in the Arctic. A key goal of these publications is to contribute to policy-making in the Arctic. Complex networks of information systems, and the connections between the diverse elements within those systems, have been identified via network analysis. This made it possible to distinguish the data sources used in composing the primary publications of the Working Groups. The next step is to apply network analysis to identify and map the relationships between the Working Groups and policy makers in the Arctic.
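    As an illustration of the network-analysis step described above (a sketch, not the author's pipeline), publications and the data sources they draw on can be modelled as a directed graph; the node names below are invented.

```python
# Sketch: map Working Group publications to the data sources they cite.
import networkx as nx

# Hypothetical (publication, data_source) edges extracted from reports.
edges = [
    ("AMAP_climate_report", "NOAA_sea_ice_index"),
    ("AMAP_climate_report", "NSIDC_archive"),
    ("CAFF_biodiversity_report", "NSIDC_archive"),
]

G = nx.DiGraph()
G.add_edges_from(edges)

# Data sources shared by several publications stand out by in-degree.
print(sorted(G.in_degree(), key=lambda kv: kv[1], reverse=True))
```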

  4. Combining Methods to Describe Important Marine Habitats for Top Predators: Application to Identify Biological Hotspots in Tropical Waters.

    Science.gov (United States)

    Thiers, Laurie; Louzao, Maite; Ridoux, Vincent; Le Corre, Matthieu; Jaquemet, Sébastien; Weimerskirch, Henri

    2014-01-01

    In tropical waters resources are usually scarce and patchy, and predatory species generally show specific adaptations for foraging. Tropical seabirds often forage in association with sub-surface predators that create feeding opportunities by bringing prey close to the surface, and the birds often aggregate in large multispecific flocks. Here we hypothesize that frigatebirds, a tropical seabird adapted to foraging with low energetic costs, could be a good predictor of the distribution of their associated predatory species, including other seabirds (e.g. boobies, terns) and subsurface predators (e.g., dolphins, tunas). To test this hypothesis, we compared distribution patterns of marine predators in the Mozambique Channel based on a long-term dataset of both vessel-based and aerial surveys, as well as tracking data of frigatebirds. By developing species distribution models (SDMs), we identified key marine areas for tropical predators in relation to contemporaneous oceanographic features to investigate multi-species spatial overlap areas and identify predator hotspots in the Mozambique Channel. SDMs reasonably matched observed patterns and both static (e.g. bathymetry) and dynamic (e.g. Chlorophyll a concentration and sea surface temperature) factors were important in explaining predator distribution patterns. We found that the distribution of frigatebirds included the distributions of the associated species. The central part of the channel appeared to be the best habitat for the four groups of species considered in this study (frigatebirds, brown terns, boobies and sub-surface predators).

  5. Combining Methods to Describe Important Marine Habitats for Top Predators: Application to Identify Biological Hotspots in Tropical Waters.

    Directory of Open Access Journals (Sweden)

    Laurie Thiers

    Full Text Available In tropical waters resources are usually scarce and patchy, and predatory species generally show specific adaptations for foraging. Tropical seabirds often forage in association with sub-surface predators that create feeding opportunities by bringing prey close to the surface, and the birds often aggregate in large multispecific flocks. Here we hypothesize that frigatebirds, a tropical seabird adapted to foraging with low energetic costs, could be a good predictor of the distribution of their associated predatory species, including other seabirds (e.g. boobies, terns) and subsurface predators (e.g., dolphins, tunas). To test this hypothesis, we compared distribution patterns of marine predators in the Mozambique Channel based on a long-term dataset of both vessel-based and aerial surveys, as well as tracking data of frigatebirds. By developing species distribution models (SDMs), we identified key marine areas for tropical predators in relation to contemporaneous oceanographic features to investigate multi-species spatial overlap areas and identify predator hotspots in the Mozambique Channel. SDMs reasonably matched observed patterns and both static (e.g. bathymetry) and dynamic (e.g. Chlorophyll a concentration and sea surface temperature) factors were important in explaining predator distribution patterns. We found that the distribution of frigatebirds included the distributions of the associated species. The central part of the channel appeared to be the best habitat for the four groups of species considered in this study (frigatebirds, brown terns, boobies and sub-surface predators).

  6. Comparing performance on the MNREAD iPad application with the MNREAD acuity chart.

    Science.gov (United States)

    Calabrèse, Aurélie; To, Long; He, Yingchen; Berkholtz, Elizabeth; Rafian, Paymon; Legge, Gordon E

    2018-01-01

    Our purpose was to compare reading performance measured with the MNREAD Acuity Chart and an iPad application (app) version of the same test for both normally sighted and low-vision participants. Our methods included 165 participants with normal vision and 43 participants with low vision tested on the standard printed MNREAD and on the iPad app version of the test. Maximum Reading Speed, Critical Print Size, Reading Acuity, and Reading Accessibility Index were compared using linear mixed-effects models to identify any potential differences in test performance between the printed chart and the iPad app. Our results showed the following: For normal vision, chart and iPad yield similar estimates of Critical Print Size and Reading Acuity. The iPad provides significantly slower estimates of Maximum Reading Speed than the chart, with a greater difference for faster readers. The difference was on average 3% at 100 words per minute (wpm), 6% at 150 wpm, 9% at 200 wpm, and 12% at 250 wpm. For low vision, Maximum Reading Speed, Reading Accessibility Index, and Critical Print Size are equivalent on the iPad and chart. Only the Reading Acuity is significantly smaller (i.e., better) when measured on the digital version of the test, but by only 0.03 logMAR (p = 0.013). Our conclusions were that, overall, MNREAD parameters measured with the printed chart and the iPad app are very similar. The difference found in Maximum Reading Speed for the normally sighted participants can be explained by differences in the method for timing the reading trials.
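    A minimal sketch of the kind of linear mixed-effects comparison described above, assuming a long-format table with one row per participant and test version; the column names are hypothetical and the model mirrors the analysis in spirit only.

```python
# Sketch: mixed-effects model of a reading parameter with a fixed effect of
# test version (chart vs. iPad) and a random intercept per participant.
import pandas as pd
import statsmodels.formula.api as smf

def compare_devices(df: pd.DataFrame, outcome: str = "max_reading_speed"):
    model = smf.mixedlm(
        f"{outcome} ~ device",      # fixed effect: printed chart vs. iPad app
        data=df,
        groups=df["subject"],       # random intercept per participant
    )
    return model.fit()

# result = compare_devices(mnread_long_format)
# print(result.summary())   # the 'device' coefficient is the chart-iPad gap
```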

  7. Identifying the environmental support and constraints to the Chinese economic growth—An application of the Emergy Accounting method

    International Nuclear Information System (INIS)

    Lou, Bo; Ulgiati, Sergio

    2013-01-01

    The economy of China keeps growing at a high rate, although a bit more slowly recently than in the past due to the international economic turmoil. The Chinese economic performance affects the world economy in many ways (from increased primary resource and commodity imports to a more active financial role of China worldwide). Not unexpectedly, several and diverse environmental problems are coupled with economic growth, linked to resource availability, competition for energy resources and the overall carrying capacity of the environment as a source and a sink. Monodimensional assessments of either economic growth or environmental aspects are unlikely to provide the needed understanding of development opportunities and potential environmental loading. We suggest in this paper an assessment of the evolution of the Chinese economy based on the Emergy Accounting method, developed by H.T. Odum in the 1980s and further refined more recently. The emergy approach is being increasingly applied worldwide, and in China as well, to study individual production processes, sectors and whole economies, and provides a comprehensive picture of the interaction of economic growth and the environment that is highly useful for economic and environmental policy making. A set of emergy-based performance indicators was calculated with reference to the year 2009 and compared with previous studies from the literature, by means of a standardization procedure to ensure consistency. The 2009 national Emergy/GDP ratio, an indicator of the emergy investment per unit of economic product generated, was calculated as 8.61E+11 solar equivalent joules/Yuan RMB (equivalent to 5.88E+12 sej/US$), showing a decreasing trend from 1975 to the present, similar to other countries over their development path. The Emergy Sustainability Index (ESI), an aggregate measure of economic performance and environmental load, also shows a decreasing trend signaling that the Chinese economic development is strictly coupled to
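    As a quick consistency check of the two Emergy/GDP figures quoted above, the sej/Yuan value can be converted to sej/US$ using the contemporaneous exchange rate; the rate used below (~6.83 CNY/USD) is an assumption inferred from the abstract's numbers, not stated in it.

```python
# Back-of-the-envelope unit conversion for the 2009 Emergy/GDP ratio.
emergy_per_yuan = 8.61e11      # sej per Yuan RMB (from the abstract)
yuan_per_usd = 6.83            # assumed 2009 CNY/USD exchange rate
emergy_per_usd = emergy_per_yuan * yuan_per_usd
print(f"{emergy_per_usd:.2e} sej/US$")   # ~5.88e12, matching the abstract
```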

  8. Web-based application on employee performance assessment using exponential comparison method

    Science.gov (United States)

    Maryana, S.; Kurnia, E.; Ruyani, A.

    2017-02-01

    Employee performance assessment, also called a performance review, performance evaluation, or employee appraisal, is an effort to assess staff achievement with the aim of increasing the productivity of employees and companies. This application supports the assessment of employee performance using five criteria: Presence, Quality of Work, Quantity of Work, Discipline, and Teamwork. The system uses the Exponential Comparison Method and Eckenrode weighting. Calculation results are presented as graphs so that each employee's assessment can be reviewed. The system was developed with Notepad++ as the code editor and MySQL as the database. Testing showed that the application conforms to its design and runs properly. The tests conducted were structural testing, functional testing, validation, sensitivity analysis, and SUMI testing.
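    A toy sketch of the Exponential Comparison Method scoring step (not the authors' implementation): each employee's total value is the sum of criterion ratings raised to the criterion weights; the ratings and weights below are made-up stand-ins for the Eckenrode-derived values.

```python
# Exponential Comparison Method (MPE) scoring sketch with invented data.
criteria_weights = {            # e.g. produced by Eckenrode rank weighting
    "presence": 5, "work_quality": 4, "work_quantity": 3,
    "discipline": 2, "teamwork": 1,
}

def mpe_score(ratings: dict) -> float:
    """ratings: criterion -> rating on an ordinal scale (e.g. 1-9)."""
    return sum(ratings[c] ** w for c, w in criteria_weights.items())

employee = {"presence": 8, "work_quality": 7, "work_quantity": 6,
            "discipline": 9, "teamwork": 5}
print(mpe_score(employee))      # employees are ranked by this total
```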

  9. Network analysis of patient flow in two UK acute care hospitals identifies key sub-networks for A&E performance.

    Science.gov (United States)

    Bean, Daniel M; Stringer, Clive; Beeknoo, Neeraj; Teo, James; Dobson, Richard J B

    2017-01-01

    The topology of the patient flow network in a hospital is complex, comprising hundreds of overlapping patient journeys, and is a determinant of operational efficiency. To understand the network architecture of patient flow, we performed a data-driven network analysis of patient flow through two acute hospital sites of King's College Hospital NHS Foundation Trust. Administration databases were queried for all intra-hospital patient transfers in an 18-month period and modelled as a dynamic weighted directed graph. A 'core' subnetwork containing only 13-17% of all edges channelled 83-90% of the patient flow, while an 'ephemeral' network constituted the remainder. Unsupervised cluster analysis and differential network analysis identified sub-networks where traffic is most associated with A&E performance. Increased flow to clinical decision units was associated with the best A&E performance in both sites. The component analysis also detected a weekend effect on patient transfers which was not associated with performance. We have performed the first data-driven hypothesis-free analysis of patient flow which can enhance understanding of whole healthcare systems. Such analysis can drive transformation in healthcare as it has in industries such as manufacturing.
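    The 'core' subnetwork idea can be illustrated in a few lines (a sketch under assumed definitions, not the authors' code): rank transfer edges by weight and keep the heaviest edges until a chosen share of total patient flow is covered.

```python
# Sketch: extract the heaviest edges that carry a target share of total flow
# from a weighted directed graph of intra-hospital patient transfers.
import networkx as nx

def core_subnetwork(G: nx.DiGraph, flow_share: float = 0.85) -> nx.DiGraph:
    edges = sorted(G.edges(data="weight"), key=lambda e: e[2], reverse=True)
    total = sum(w for _, _, w in edges)
    core, kept = nx.DiGraph(), 0.0
    for u, v, w in edges:
        if kept / total >= flow_share:
            break
        core.add_edge(u, v, weight=w)
        kept += w
    return core

# Example with made-up ward-to-ward transfer counts:
G = nx.DiGraph()
G.add_edge("A&E", "CDU", weight=1200)
G.add_edge("A&E", "AMU", weight=800)
G.add_edge("AMU", "Ward 3", weight=50)
print(core_subnetwork(G).edges(data=True))
```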

  10. An Application-Based Performance Evaluation of NASA's Nebula Cloud Computing Platform

    Science.gov (United States)

    Saini, Subhash; Heistand, Steve; Jin, Haoqiang; Chang, Johnny; Hood, Robert T.; Mehrotra, Piyush; Biswas, Rupak

    2012-01-01

    The high performance computing (HPC) community has shown tremendous interest in exploring cloud computing as it promises high potential. In this paper, we examine the feasibility, performance, and scalability of production quality scientific and engineering applications of interest to NASA on NASA's cloud computing platform, called Nebula, hosted at Ames Research Center. This work represents a comprehensive evaluation of Nebula using NUTTCP, HPCC, NPB, I/O, and MPI function benchmarks as well as four applications representative of the NASA HPC workload. Specifically, we compare Nebula performance on some of these benchmarks and applications to that of NASA's Pleiades supercomputer, a traditional HPC system. We also investigate the impact of virtIO and jumbo frames on interconnect performance. Overall results indicate that on Nebula (i) virtIO and jumbo frames improve network bandwidth by a factor of 5x, (ii) there is a significant virtualization layer overhead of about 10% to 25%, (iii) write performance is lower by a factor of 25x, (iv) latency for short MPI messages is very high, and (v) overall performance is 15% to 48% lower than that on Pleiades for NASA HPC applications. We also comment on the usability of the cloud platform.

  11. Performance Evaluation of UML2-Modeled Embedded Streaming Applications with System-Level Simulation

    Directory of Open Access Journals (Sweden)

    Arpinen Tero

    2009-01-01

    Full Text Available This article presents an efficient method to capture abstract performance model of streaming data real-time embedded systems (RTESs). Unified Modeling Language version 2 (UML2) is used for the performance modeling and as a front-end for a tool framework that enables simulation-based performance evaluation and design-space exploration. The adopted application meta-model in UML resembles the Kahn Process Network (KPN) model and it is targeted at simulation-based performance evaluation. The application workload modeling is done using UML2 activity diagrams, and platform is described with structural UML2 diagrams and model elements. These concepts are defined using a subset of the profile for Modeling and Analysis of Realtime and Embedded (MARTE) systems from OMG and custom stereotype extensions. The goal of the performance modeling and simulation is to achieve early estimates on task response times, processing element, memory, and on-chip network utilizations, among other information that is used for design-space exploration. As a case study, a video codec application on multiple processors is modeled, evaluated, and explored. In comparison to related work, this is the first proposal that defines transformation between UML activity diagrams and streaming data application workload meta models and successfully adopts it for RTES performance evaluation.

  12. MaMR: High-performance MapReduce programming model for material cloud applications

    Science.gov (United States)

    Jing, Weipeng; Tong, Danyu; Wang, Yangang; Wang, Jingyuan; Liu, Yaqiu; Zhao, Peng

    2017-02-01

    With the increasing data size in materials science, existing programming models no longer satisfy the application requirements. MapReduce is a programming model that enables the easy development of scalable parallel applications to process big data on cloud computing systems. However, this model does not directly support the processing of multiple related data, and the processing performance does not reflect the advantages of cloud computing. To enhance the capability of workflow applications in material data processing, we defined a programming model for material cloud applications, called MaMR, that supports multiple different Map and Reduce functions running concurrently, based on a hybrid shared-memory BSP model. An optimized data sharing strategy to supply the shared data to the different Map and Reduce stages was also designed. We added a new merge phase to MapReduce that can efficiently merge data from the map and reduce modules. Experiments showed that the model and framework present effective performance improvements compared to previous work.
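    To make the "multiple map/reduce pipelines plus a merge phase" idea concrete, here is a plain-Python toy (not the MaMR framework or its API): two map/reduce passes over related, invented material datasets, followed by a merge that joins their outputs on a shared key.

```python
# Toy MapReduce with a merge phase joining two pipelines on a shared key.
from collections import defaultdict

def map_reduce(records, map_fn, reduce_fn):
    groups = defaultdict(list)
    for rec in records:
        for key, value in map_fn(rec):
            groups[key].append(value)
    return {key: reduce_fn(key, values) for key, values in groups.items()}

# Hypothetical material datasets keyed by material id.
raw_energies = [{"id": "Si", "energy": -5.42}, {"id": "Si", "energy": -5.40}]
raw_volumes = [{"id": "Si", "volume": 20.4}]

energies = map_reduce(raw_energies, lambda r: [(r["id"], r["energy"])],
                      lambda k, vs: min(vs))             # lowest energy
volumes = map_reduce(raw_volumes, lambda r: [(r["id"], r["volume"])],
                     lambda k, vs: sum(vs) / len(vs))    # mean volume

# Merge phase: join the two reduce outputs on the shared key.
merged = {k: (energies[k], volumes[k]) for k in energies.keys() & volumes.keys()}
print(merged)
```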

  13. Application of Distribution-free Methods of Study for Identifying the Degree of Reliability of Ukrainian Banks

    Directory of Open Access Journals (Sweden)

    Burkina Natalia V.

    2014-03-01

    Full Text Available Bank ratings are integral elements of the information infrastructure that ensures sound development of the banking business. One of the key issues that clients of banking structures worry about is identifying the degree of reliability of, and trust in, a bank. As of now there are no commonly accepted methods of bank rating, and the issue of bank reliability remains rather problematic. The article considers data envelopment analysis (DEA), a modern method of economic and mathematical analysis that is a popular instrument for assessing the quality of services of different entities and has become widely used in foreign econometric studies. The article demonstrates the application of DEA for developing new methods of constructing bank ratings and identifies input and output indicators for building a DEA model as applied to the Ukrainian banking system. The authors also discuss some methodological problems that might appear when applying component indicators for ranking the subjects and offer methods for their elimination.
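    For readers unfamiliar with DEA, the following is a minimal sketch of an input-oriented CCR efficiency score computed by linear programming with SciPy; the two-input/one-output data are invented and the formulation is a generic textbook one, not the model proposed in the article.

```python
# Sketch: CCR (constant returns to scale) DEA efficiency via linear programming.
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """X: (n_dmu, n_inputs), Y: (n_dmu, n_outputs); o: index of evaluated DMU."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.concatenate([-Y[o], np.zeros(m)])             # maximize u.y_o
    A_eq = np.concatenate([np.zeros(s), X[o]])[None, :]  # v.x_o = 1
    A_ub = np.hstack([Y, -X])                            # u.y_j - v.x_j <= 0
    res = linprog(c, A_ub=A_ub, b_ub=np.zeros(n),
                  A_eq=A_eq, b_eq=[1.0], bounds=(0, None))
    return -res.fun                                      # efficiency in (0, 1]

X = np.array([[3.0, 5.0], [4.0, 2.0], [5.0, 6.0]])       # e.g. costs, staff
Y = np.array([[4.0], [3.0], [5.0]])                       # e.g. loan volume
print([round(ccr_efficiency(X, Y, i), 3) for i in range(3)])
```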

  14. Application of Satellite Remote Sensing to Identify Climatic and Anthropogenic Changes Related to Water and Health Conditions in Emerging Megacities

    Science.gov (United States)

    Akanda, A. S.; Serman, E. A.; Jutla, A.

    2014-12-01

    By 2050, more than 70% of the world's population is expected to be living in a city. In many of the urbanizing regions in Asia and Africa, most new development is taking place without adequate urban or regional planning, and a majority of the population is crowded into densely populated unplanned settlements, also known as slums. During the same period, precipitation and temperature patterns are likely to see significant changes in many of these regions, while coastal megacities will have to accommodate sea-level rise in their ecosystems. The rapid increase in population is usually observed on the fringes of the urban sprawl without adequate water or sanitation facilities or access to other municipal amenities (such as utilities, healthcare, and education). Collectively, these issues make the ever-increasing number of slum dwellers in emerging megacities significantly vulnerable to a combination of climatic and anthropogenic threats. However, how the growth of unplanned urban and peri-urban sprawl and simultaneous changes in climatic patterns have impacted public health in emerging megacities remains largely unexplored due to a lack of readily available and usable data. We employ a number of Remote Sensing products (GRACE, LANDSAT, MODIS) to bridge the above knowledge gaps and to identify relevant hydrologic and anthropogenic changes in emerging megacities that are most vulnerable due to the climate-water-health nexus. We explore one of the largest and fastest-growing megacities in the world - Dhaka, Bangladesh - identifying and investigating changes in the water environment and the growth of slum areas, and their impact on water services and health outcomes. The hydroclimatology of South Asia is highly seasonal and the asymmetric availability of water affects vast areas of Bangladesh differently in space and time, exposing the population of the Dhaka region to both droughts and floods and periodic spring-fall outbreaks of diarrheal diseases, such as cholera and rotavirus. This research

  15. Impact of power limitations on the performance of WLANs for home networking applications

    OpenAIRE

    Armour, SMD; Lee, BS; Doufexi, A; Nix, AR; Bull, DR

    2001-01-01

    This paper considers the application of 5 GHz wireless LAN technology to home networking applications. An assessment of physical layer performance is presented in the form of the achievable data rate as a function of received signal to noise ratio. The transmit power limitations imposed by the relevant regulatory bodies are also summarised. Based on this information, a state of the art propagation modelling tool is used to evaluate the coverage achieved by a WLAN system in an example resident...

  16. The Relationship between Cost Leadership Strategy, Total Quality Management Applications and Financial Performance

    OpenAIRE

    Ali KURT; Cemal ZEHİR

    2016-01-01

    Firms need to implement competition strategies and total quality management applications to overcome fierce competition. The purpose of this study is to show the relationship between cost leadership strategy, total quality management applications and firms’ financial performance through a literature review and empirical analysis. 449 questionnaires were administered to the managers of 142 large firms. The data gathered were analysed with AMOS. As a result, the relationship between ...

  17. Develop feedback system for intelligent dynamic resource allocation to improve application performance.

    Energy Technology Data Exchange (ETDEWEB)

    Gentile, Ann C.; Brandt, James M.; Tucker, Thomas (Open Grid Computing, Inc., Austin, TX); Thompson, David

    2011-09-01

    This report provides documentation for the completion of the Sandia Level II milestone 'Develop feedback system for intelligent dynamic resource allocation to improve application performance'. This milestone demonstrates the use of a scalable data collection, analysis, and feedback system that enables insight into how an application is utilizing the hardware resources of a high performance computing (HPC) platform in a lightweight fashion. Further, we demonstrate how the same mechanisms used to transport data for remote analysis and visualization can provide low-latency run-time feedback to applications. The ultimate goal of this body of work is performance optimization in the face of the ever-increasing size and complexity of HPC systems.

  18. Failure Mode and Effect Analysis (FMEA) Applications to Identify Iron Sand Reject and Losses in Cement Industry : A Case Study

    Science.gov (United States)

    Helia, V. N.; Wijaya, W. N.

    2017-06-01

    One of the main raw materials required in the manufacture of cement is iron sand. Data from the Procurement Department of XYZ Company show that the amount of defective (rejected) iron sand fluctuates every month. Because iron sand is an important raw material in the cement production process, iron sand rejects and losses have both financial and non-financial impacts. This study aims to determine the most dominant activities causing rejection and losses of iron sand and to suggest improvements using the FMEA (Failure Mode and Effect Analysis) approach. Data were collected through observation, interviews, and focus group discussions (FGD), together with expert assessment. The study found four dominant activities causing iron sand defects (mining, acceptance, examination and delivery). A recommendation for overcoming these problems (vendor improvement) is presented.

  19. Using Ecological Momentary Assessment to Identify Mechanisms of Change: An Application From a Pharmacotherapy Trial With Adolescent Cannabis Users.

    Science.gov (United States)

    Treloar Padovano, Hayley; Miranda, Robert

    2018-03-01

    The present study used youth's in vivo reports of subjective responses to cannabis while smoking in their natural environments to identify real-world mechanisms of topiramate treatment for cannabis misuse. Participants were 40 cannabis users (≥ twice weekly in past 30 days), ages 15-24 years (47.5% female), with at least one cannabis use episode during the final 3 weeks of a 6-week, randomized clinical trial. Youth reported subjective "high" while smoking, stimulation, sedation, stress, craving, and grams of marijuana used in the natural environment via wireless electronic devices. Bayesian multilevel structural equation modeling (MSEM) evaluated mediation via indirect effect tests. Significant within (daily) and between (person) variability and distinctive within and between effects supported the MSEM approach. Subjective high while smoking was significantly reduced for youth in the topiramate condition, relative to placebo, and the indirect effect of reduced subjective high on total grams of cannabis smoked that day was significant. Indirect effects through other subjective responses were not significant. The results of this initial study suggest that altering subjective responses to smoking, specifically subjective high, may be a key target for developing adjunctive pharmacotherapies for cannabis misuse. More generally, this work provides an example for applying ecological momentary assessment and analytic techniques to evaluate mechanisms of behavior change in longitudinal data.

  20. APPLICATION OF MULTIPLE LOGISTIC REGRESSION, BAYESIAN LOGISTIC AND CLASSIFICATION TREE TO IDENTIFY THE SIGNIFICANT FACTORS INFLUENCING CRASH SEVERITY

    Directory of Open Access Journals (Sweden)

    MILAD TAZIK

    2017-11-01

    Full Text Available Identifying cases in which road crashes result in fatality or injury of drivers may help improve their safety. In this study, datasets of crashes that occurred on the Tehran-Qom freeway, Iran, were examined by three models (multiple logistic regression, Bayesian logistic and classification tree) to analyse the contribution of several variables to fatal accidents. For the multiple logistic regression and Bayesian logistic models, the odds ratio was calculated for each variable. The model which best suited the identification of accident severity was determined based on AIC and DIC criteria. Based on the results of these two models, rollover crashes (OR = 14.58, 95% CI: 6.8-28.6), not using a seat belt (OR = 5.79, 95% CI: 3.1-9.9), exceeding speed limits (OR = 4.02, 95% CI: 1.8-7.9) and being female (OR = 2.91, 95% CI: 1.1-6.1) were the most important factors in driver fatalities. In addition, the results of the classification tree model verified the findings of the other models.
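    The odds-ratio step described above can be reproduced in outline as follows; the crash data are simulated and the variable names are stand-ins, so this only illustrates how ORs and 95% CIs are obtained from a fitted multiple logistic regression.

```python
# Sketch: odds ratios and 95% CIs from a logistic regression of a binary
# fatality outcome. All data below are simulated for illustration only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2000
crashes = pd.DataFrame({
    "rollover": rng.integers(0, 2, n),
    "no_seat_belt": rng.integers(0, 2, n),
    "speeding": rng.integers(0, 2, n),
    "female": rng.integers(0, 2, n),
})
# Synthetic outcome with arbitrary positive effects.
logit_p = (-2.5 + 1.5 * crashes.rollover + 1.0 * crashes.no_seat_belt
           + 0.8 * crashes.speeding + 0.5 * crashes.female)
crashes["fatal"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

fit = smf.logit("fatal ~ rollover + no_seat_belt + speeding + female",
                data=crashes).fit(disp=False)
print(np.exp(fit.params))      # odds ratios
print(np.exp(fit.conf_int()))  # 95% confidence intervals
```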

  1. The coupling of the neutron transport application RATTLESNAKE to the nuclear fuels performance application BISON under the MOOSE framework

    Energy Technology Data Exchange (ETDEWEB)

    Gleicher, Frederick N.; Williamson, Richard L.; Ortensi, Javier; Wang, Yaqi; Spencer, Benjamin W.; Novascone, Stephen R.; Hales, Jason D.; Martineau, Richard C.

    2014-10-01

    The MOOSE neutron transport application RATTLESNAKE was coupled to the fuels performance application BISON to provide a higher fidelity tool for fuel performance simulation. This project is motivated by the desire to couple a high fidelity core analysis program (based on the self-adjoint angular flux equations) to a high fidelity fuel performance program, both of which can simulate on unstructured meshes. RATTLESNAKE solves the self-adjoint angular flux transport equation and provides sub-pin level resolution of the multigroup neutron flux with resonance treatment during burnup or a fast transient. BISON solves the coupled thermomechanical equations for the fuel on a sub-millimeter scale. Both applications are able to solve their respective systems on aligned and unaligned unstructured finite element meshes. The power density and local burnup were transferred from RATTLESNAKE to BISON with the MOOSE MultiApp transfer system. Multiple depletion cases were run with one-way data transfer from RATTLESNAKE to BISON. The eigenvalues are shown to agree well with values obtained from the lattice physics code DRAGON. The one-way data transfer of power density is shown to agree with the power density obtained from an internal Lassman-style model in BISON.

  2. Criteria for confirming sequence periodicity identified by Fourier transform analysis: application to GCR2, a candidate plant GPCR?

    Science.gov (United States)

    Illingworth, Christopher J R; Parkes, Kevin E; Snell, Christopher R; Mullineaux, Philip M; Reynolds, Christopher A

    2008-03-01

    Methods to determine periodicity in protein sequences are useful for inferring function. Fourier transformation is one approach, but care is required to ensure the periodicity is genuine. Here we have shown that empirically derived statistical tables can be used as a measure of significance. Genuine protein sequence data, rather than randomly generated sequences, were used as the statistical backdrop. The method has been applied to G-protein coupled receptor (GPCR) sequences, by Fourier transformation of hydrophobicity values, codon frequencies and the extent of over-representation of codon pairs; the latter being related to translational step times. Genuine periodicity was observed in the hydrophobicity, whereas the apparent periodicity (as inferred from previously reported measures) in the translation step times was not validated statistically. GCR2 has recently been proposed as the plant GPCR receptor for the hormone abscisic acid. It has homology to the Lanthionine synthetase C-like family of proteins, an observation confirmed by fold recognition. Application of the Fourier transform algorithm to the GCR2 family revealed strongly predicted sevenfold periodicity in hydrophobicity, suggesting why GCR2 has been reported to be a GPCR, despite negative indications in most transmembrane prediction algorithms. The underlying multiple sequence alignment, also required for the Fourier transform analysis of periodicity, indicated that the hydrophobic regions around the 7 GXXG motifs commence near the C-terminal end of each of the 7 inner helices of the alpha-toroid and continue to the N-terminal region of the helix. The results clearly explain why GCR2 has been understandably but erroneously predicted to be a GPCR.
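    The core Fourier-transform step, taking the power spectrum of a hydrophobicity profile to look for periodicity, can be sketched as below. This omits the empirically derived significance tables that are the paper's actual contribution, and the Kyte-Doolittle scale values are quoted from memory, so treat them as an assumption.

```python
# Sketch: Fourier power spectrum of a protein hydrophobicity profile.
import numpy as np

KD = {"A": 1.8, "R": -4.5, "N": -3.5, "D": -3.5, "C": 2.5, "Q": -3.5,
      "E": -3.5, "G": -0.4, "H": -3.2, "I": 4.5, "L": 3.8, "K": -3.9,
      "M": 1.9, "F": 2.8, "P": -1.6, "S": -0.8, "T": -0.7, "W": -0.9,
      "Y": -1.3, "V": 4.2}   # Kyte-Doolittle hydropathy (from memory)

def periodicity_spectrum(seq: str):
    profile = np.array([KD[a] for a in seq], dtype=float)
    profile -= profile.mean()                  # remove the DC component
    power = np.abs(np.fft.rfft(profile)) ** 2
    freqs = np.fft.rfftfreq(len(profile))      # cycles per residue
    return freqs, power

freqs, power = periodicity_spectrum("LLKKLLKKLLKKLLKKLLKK")   # toy repeat
peak = freqs[np.argmax(power[1:]) + 1]
print(f"dominant period = {1 / peak:.1f} residues")            # 4.0 here
```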

  3. Whole genome association study identifies regions of the bovine genome and biological pathways involved in carcass trait performance in Holstein-Friesian cattle.

    Science.gov (United States)

    Doran, Anthony G; Berry, Donagh P; Creevey, Christopher J

    2014-10-01

    Four traits related to carcass performance have been identified as economically important in beef production: carcass weight, carcass fat, carcass conformation of progeny and cull cow carcass weight. Although Holstein-Friesian cattle are primarily utilized for milk production, they are also an important source of meat for beef production and export. Because of this, there is great interest in understanding the underlying genomic structure influencing these traits. Several genome-wide association studies have identified regions of the bovine genome associated with growth or carcass traits, however, little is known about the mechanisms or underlying biological pathways involved. This study aims to detect regions of the bovine genome associated with carcass performance traits (employing a panel of 54,001 SNPs) using measures of genetic merit (as predicted transmitting abilities) for 5,705 Irish Holstein-Friesian animals. Candidate genes and biological pathways were then identified for each trait under investigation. Following adjustment for false discovery (q-value carcass traits using a single SNP regression approach. Using a Bayesian approach, 46 QTL were associated (posterior probability > 0.5) with at least one of the four traits. In total, 557 unique bovine genes, which mapped to 426 human orthologs, were within 500kbs of QTL found associated with a trait using the Bayesian approach. Using this information, 24 significantly over-represented pathways were identified across all traits. The most significantly over-represented biological pathway was the peroxisome proliferator-activated receptor (PPAR) signaling pathway. A large number of genomic regions putatively associated with bovine carcass traits were detected using two different statistical approaches. Notably, several significant associations were detected in close proximity to genes with a known role in animal growth such as glucagon and leptin. Several biological pathways, including PPAR signaling, were
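    A stripped-down sketch of the single-SNP regression idea with false-discovery-rate control follows; the genotypes and trait values are simulated, so it only illustrates the mechanics, not the study's 54,001-SNP analysis or its Bayesian counterpart.

```python
# Sketch: per-SNP linear regression of a trait PTA with Benjamini-Hochberg FDR.
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(1)
n_animals, n_snps = 200, 1000
genotypes = rng.integers(0, 3, size=(n_animals, n_snps))    # 0/1/2 allele counts
pta = 0.3 * genotypes[:, 0] + rng.normal(size=n_animals)    # one causal SNP

pvals = []
for j in range(n_snps):
    X = sm.add_constant(genotypes[:, j].astype(float))
    pvals.append(sm.OLS(pta, X).fit().pvalues[1])

reject, qvals, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
print("SNPs passing FDR control:", np.flatnonzero(reject))
```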

  4. Application of Microarrays and qPCR to Identify Phylogenetic and Functional Biomarkers Diagnostic of Microbial Communities that Biodegrade Chlorinated Solvents to Ethene

    Science.gov (United States)

    2012-01-01

    appropriate and cost-effective biomarkers to assess, monitor, and optimize performance. Commonly, biomarker development has focused on identifying ... continuous-flow chemostat, and environmental samples from contaminated field sites. Firmicutes (mostly Clostridium spp.), Bacteroidetes (mostly Bacteroides spp.), as well as Proteobacteria (mostly sulfate-reducers) ...

  5. RAPPORT: running scientific high-performance computing applications on the cloud.

    Science.gov (United States)

    Cohen, Jeremy; Filippis, Ioannis; Woodbridge, Mark; Bauer, Daniela; Hong, Neil Chue; Jackson, Mike; Butcher, Sarah; Colling, David; Darlington, John; Fuchs, Brian; Harvey, Matt

    2013-01-28

    Cloud computing infrastructure is now widely used in many domains, but one area where there has been more limited adoption is research computing, in particular for running scientific high-performance computing (HPC) software. The Robust Application Porting for HPC in the Cloud (RAPPORT) project took advantage of existing links between computing researchers and application scientists in the fields of bioinformatics, high-energy physics (HEP) and digital humanities, to investigate running a set of scientific HPC applications from these domains on cloud infrastructure. In this paper, we focus on the bioinformatics and HEP domains, describing the applications and target cloud platforms. We conclude that, while there are many factors that need consideration, there is no fundamental impediment to the use of cloud infrastructure for running many types of HPC applications and, in some cases, there is potential for researchers to benefit significantly from the flexibility offered by cloud platforms.

  6. Application of controllable unit approach (CUA) to performance-criterion-based nuclear material control and accounting

    International Nuclear Information System (INIS)

    Foster, K.W.; Rogers, D.R.

    1979-01-01

    The Nuclear Regulatory Commission is considering the use of maximum-loss performance criteria as a means of controlling SNM in nuclear plants. The Controllable Unit Approach to material control and accounting (CUA) was developed by Mound to determine the feasibility of controlling a plant to a performance criterion. The concept was tested with the proposed Anderson, SC, mixed-oxide plant, and it was shown that CUA is indeed a feasible method for controlling a complex process to a performance criterion. The application of CUA to an actual low-enrichment plant to assist the NRC in establishing performance criteria for uranium processes is discussed. 5 refs

  7. The application of structure from motion (SfM) to identify the geological structure and outcrop studies

    Science.gov (United States)

    Saputra, Aditya; Rahardianto, Trias; Gomez, Christopher

    2017-07-01

    Adequate knowledge of geological structure is essential for most studies in geoscience, mineral exploration, geo-hazard and disaster management. The geological map is still one of the datasets most commonly used to obtain information about geological structures such as faults, joints, folds, and unconformities; however, in rural areas such as Central Java, data are still sparse. Recent progress in data acquisition technologies and computing has increased interest in how to capture high-resolution geological data effectively and at relatively low cost. Methods such as Airborne Laser Scanning (ALS), Terrestrial Laser Scanning (TLS), and Unmanned Aerial Vehicles (UAVs) have been widely used to obtain this information; however, these methods require significant investment in hardware, software, and time. Addressing some of those issues, structure from motion (SfM) is an image-based photogrammetric method that can provide solutions equivalent to laser technologies at relatively low cost and with minimal time, specialization and financial investment. Using SfM photogrammetry, it is possible to generate high-resolution 3D images of rock surfaces and outcrops in order to improve the geological understanding of Indonesia. In the present contribution, it is shown that information about faults and joints can be obtained at high resolution and in a shorter time than with conventional grid mapping and remotely sensed topographic surveying. The SfM method produces a point cloud through image matching and computation. This task can be run with open-source or commercial image processing and 3D reconstruction software. As the point cloud has 3D information as well as RGB values, it allows for further analysis such as DEM extraction and image orthorectification. The present paper describes some examples of SfM used to identify faults in outcrops and also highlights future possibilities in terms of earthquake hazard assessment, based on

  8. Transcriptome analysis and its application in identifying genes associated with fruiting body development in basidiomycete Hypsizygus marmoreus.

    Directory of Open Access Journals (Sweden)

    Jinjing Zhang

    Full Text Available To elucidate the mechanisms of fruit body development in H. marmoreus, a total of 43,609,521 high-quality RNA-seq reads were obtained from four developmental stages, including the mycelial knot (H-M), mycelial pigmentation (H-V), primordium (H-P) and fruiting body (H-F) stages. These reads were assembled to obtain 40,568 unigenes with an average length of 1,074 bp. A total of 26,800 (66.06%) unigenes were annotated and analyzed with the Kyoto Encyclopedia of Genes and Genomes (KEGG), Gene Ontology (GO), and Eukaryotic Orthologous Group (KOG) databases. Differentially expressed genes (DEGs) from the four transcriptomes were analyzed. The KEGG enrichment analysis revealed that the mycelium pigmentation stage was associated with the MAPK, cAMP, and blue light signal transduction pathways. In addition, expression of the two-component system members changed with the transition from H-M to H-V, suggesting that light affected the expression of genes related to fruit body initiation in H. marmoreus. During the transition from H-V to H-P, stress signals associated with MAPK, cAMP and ROS signals might be the most important inducers. Our data suggested that nitrogen starvation might be one of the most important factors in promoting fruit body maturation, and nitrogen metabolism and mTOR signaling pathway were associated with this process. In addition, 30 genes of interest were analyzed by quantitative real-time PCR to verify their expression profiles at the four developmental stages. This study advances our understanding of the molecular mechanism of fruiting body development in H. marmoreus by identifying a wealth of new genes that may play important roles in mushroom morphogenesis.

  9. Identifying dynamic functional connectivity biomarkers using GIG-ICA: Application to schizophrenia, schizoaffective disorder, and psychotic bipolar disorder.

    Science.gov (United States)

    Du, Yuhui; Pearlson, Godfrey D; Lin, Dongdong; Sui, Jing; Chen, Jiayu; Salman, Mustafa; Tamminga, Carol A; Ivleva, Elena I; Sweeney, John A; Keshavan, Matcheri S; Clementz, Brett A; Bustillo, Juan; Calhoun, Vince D

    2017-05-01

    Functional magnetic resonance imaging (fMRI) studies have shown altered brain dynamic functional connectivity (DFC) in mental disorders. Here, we aim to explore DFC across a spectrum of symptomatically-related disorders including bipolar disorder with psychosis (BPP), schizoaffective disorder (SAD), and schizophrenia (SZ). We introduce a group information guided independent component analysis procedure to estimate both group-level and subject-specific connectivity states from DFC. Using resting-state fMRI data of 238 healthy controls (HCs), 140 BPP, 132 SAD, and 113 SZ patients, we identified measures differentiating groups from the whole-brain DFC and traditional static functional connectivity (SFC), separately. Results show that DFC provided more informative measures than SFC. Diagnosis-related connectivity states were evident using DFC analysis. For the dominant state consistent across groups, we found 22 instances of hypoconnectivity (with decreasing trends from HC to BPP to SAD to SZ) mainly involving post-central, frontal, and cerebellar cortices as well as 34 examples of hyperconnectivity (with increasing trends HC through SZ) primarily involving thalamus and temporal cortices. Hypoconnectivities/hyperconnectivities also showed negative/positive correlations, respectively, with clinical symptom scores. Specifically, hypoconnectivities linking postcentral and frontal gyri were significantly negatively correlated with the PANSS positive/negative scores. For frontal connectivities, BPP resembled HC while SAD and SZ were more similar. Three connectivities involving the left cerebellar crus differentiated SZ from other groups and one connection linking frontal and fusiform cortices showed a SAD-unique change. In summary, our method is promising for assessing DFC and may yield imaging biomarkers for quantifying the dimension of psychosis. Hum Brain Mapp 38:2683-2708, 2017. © 2017 Wiley Periodicals, Inc.
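    For orientation, the sliding-window step that commonly underlies dynamic functional connectivity is sketched below with simulated signals; the group information guided ICA used in the study to estimate connectivity states is a separate, more involved procedure and is not reproduced here.

```python
# Sketch: sliding-window dynamic functional connectivity from simulated fMRI.
import numpy as np

def sliding_window_dfc(timeseries: np.ndarray, width: int = 30, step: int = 1):
    """timeseries: (timepoints x regions). Returns one vectorized
    upper-triangular correlation matrix per window."""
    t, r = timeseries.shape
    iu = np.triu_indices(r, k=1)
    windows = []
    for start in range(0, t - width + 1, step):
        corr = np.corrcoef(timeseries[start:start + width].T)
        windows.append(corr[iu])
    return np.array(windows)

timeseries = np.random.default_rng(2).normal(size=(150, 10))
dfc = sliding_window_dfc(timeseries)
print(dfc.shape)   # (n_windows, n_region_pairs)
```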

  10. Modified performance test of vented lead acid batteries for stationary applications

    International Nuclear Information System (INIS)

    Uhlir, K.W.; Fletcher, R.J.

    1995-01-01

    The concept of a modified performance test for vented lead acid batteries in stationary applications has been developed by the IEEE Battery Working Group. The modified performance test is defined as a test, in the 'as found' condition of the battery, of capacity and of the ability to provide a high-rate, short-duration load (usually the highest rate of the duty cycle) that will confirm the battery's ability to meet the critical period of the load duty cycle, in addition to determining its percentage of rated capacity. This paper will begin by reviewing performance and service test requirements and concerns associated with both types of tests. The paper will then discuss the rationale for developing a modified performance test along with the benefits that can be derived from performing a modified performance test in lieu of a capacity test and/or a service test. The paper will conclude with an example of how to apply a modified performance test and test acceptance criteria

  11. Performance measurement in the UK construction industry and its role in supporting the application of lean construction concepts

    Directory of Open Access Journals (Sweden)

    Saad Sarhan

    2013-03-01

    Full Text Available Performance measurement has received substantial attention from researchers and the construction industry over the past two decades. This study sought to assess UK practitioners’ awareness of the importance of the use of appropriate performance measures and its role in supporting the application of Lean Construction (LC) concepts. To enable the study to achieve its objectives, a review was conducted of a range of measurements developed to evaluate project performance, including those devoted to supporting LC efforts. Consequently, a questionnaire survey was developed and sent to 198 professionals in the UK construction industry as well as a small sample of academics with an interest in LC. Results indicated that although practitioners recognise the importance of the selection of non-financial performance measures, it has not been properly and widely implemented. The study identified the most common techniques used by UK construction organisations for performance measurement, and ranked a number of non-financial key performance indicators as significant. Some professed to have embraced the Last Planner System methodology as a means for performance measurement and organisational learning, while further questioning suggested otherwise. It was also suggested that substance thinking amongst professionals could be a significant hidden barrier that militates against the successful implementation of LC.

  12. Development and Performance Analysis of a Photonics-Assisted RF Converter for 5G Applications

    Science.gov (United States)

    Borges, Ramon Maia; Muniz, André Luiz Marques; Sodré Junior, Arismar Cerqueira

    2017-03-01

    This article presents a simple, ultra-wideband and tunable radiofrequency (RF) converter for 5G cellular networks. The proposed optoelectronic device performs broadband photonics-assisted upconversion and downconversion using a single optical modulator. Experimental results demonstrate RF conversion from DC to millimeter waves, including 28 and 38 GHz that are potential frequency bands for 5G applications. Narrow linewidth and low phase noise characteristics are observed in all generated RF carriers. An experimental digital performance analysis using different modulation schemes illustrates the applicability of the proposed photonics-based device in reconfigurable optical wireless communications.

  13. An Allometric Modelling Approach to Identify the Optimal Body Shape Associated with, and Differences between Brazilian and Peruvian Youth Motor Performance.

    Directory of Open Access Journals (Sweden)

    Simonete Silva

    Full Text Available Children from developed and developing countries differ in their body size and shape due to marked differences across their life history caused by social, economic and cultural differences which are also linked to their motor performance (MP). We used allometric models to identify size/shape characteristics associated with MP tests between Brazilian and Peruvian schoolchildren. A total of 4,560 subjects, 2,385 girls and 2,175 boys aged 9-15 years were studied. Height and weight were measured; biological maturation was estimated with the maturity offset technique; MP measures included the 12 minute run (12MR), handgrip strength (HG), standing long jump (SLJ) and the shuttle run speed (SR) tests; physical activity (PA) was assessed using the Baecke questionnaire. A multiplicative allometric model was adopted to adjust for body size differences across countries. Reciprocal ponderal index (RPI) was found to be the most suitable body shape indicator associated with the 12MR, SLJ, HG and SR performance. A positive maturation offset parameter was also associated with a better performance in SLJ, HG and SR tests. Sex differences were found in all motor tests. Brazilian youth showed better scores in MP than their Peruvian peers, even when controlling for their body size differences. The current study identified the key body size associated with four body mass-dependent MP tests. Biological maturation and PA were associated with strength and motor performance. Sex differences were found in all motor tests, as well as across countries favoring Brazilian children even when accounting for their body size/shape differences.

  14. An Allometric Modelling Approach to Identify the Optimal Body Shape Associated with, and Differences between Brazilian and Peruvian Youth Motor Performance

    Science.gov (United States)

    Silva, Simonete; Bustamante, Alcibíades; Nevill, Alan; Katzmarzyk, Peter T.; Freitas, Duarte; Prista, António; Maia, José

    2016-01-01

    Children from developed and developing countries differ in their body size and shape due to marked differences across their life history caused by social, economic and cultural differences which are also linked to their motor performance (MP). We used allometric models to identify size/shape characteristics associated with MP tests between Brazilian and Peruvian schoolchildren. A total of 4,560 subjects, 2,385 girls and 2,175 boys aged 9–15 years were studied. Height and weight were measured; biological maturation was estimated with the maturity offset technique; MP measures included the 12 minute run (12MR), handgrip strength (HG), standing long jump (SLJ) and the shuttle run speed (SR) tests; physical activity (PA) was assessed using the Baecke questionnaire. A multiplicative allometric model was adopted to adjust for body size differences across countries. Reciprocal ponderal index (RPI) was found to be the most suitable body shape indicator associated with the 12MR, SLJ, HG and SR performance. A positive maturation offset parameter was also associated with a better performance in SLJ, HG and SR tests. Sex differences were found in all motor tests. Brazilian youth showed better scores in MP than their Peruvian peers, even when controlling for their body size differences The current study identified the key body size associated with four body mass-dependent MP tests. Biological maturation and PA were associated with strength and motor performance. Sex differences were found in all motor tests, as well as across countries favoring Brazilian children even when accounting for their body size/shape differences. PMID:26939118

  15. An Allometric Modelling Approach to Identify the Optimal Body Shape Associated with, and Differences between Brazilian and Peruvian Youth Motor Performance.

    Science.gov (United States)

    Silva, Simonete; Bustamante, Alcibíades; Nevill, Alan; Katzmarzyk, Peter T; Freitas, Duarte; Prista, António; Maia, José

    2016-01-01

    Children from developed and developing countries differ in their body size and shape due to marked differences across their life history caused by social, economic and cultural differences which are also linked to their motor performance (MP). We used allometric models to identify size/shape characteristics associated with MP tests between Brazilian and Peruvian schoolchildren. A total of 4,560 subjects, 2,385 girls and 2,175 boys aged 9-15 years were studied. Height and weight were measured; biological maturation was estimated with the maturity offset technique; MP measures included the 12 minute run (12MR), handgrip strength (HG), standing long jump (SLJ) and the shuttle run speed (SR) tests; physical activity (PA) was assessed using the Baecke questionnaire. A multiplicative allometric model was adopted to adjust for body size differences across countries. Reciprocal ponderal index (RPI) was found to be the most suitable body shape indicator associated with the 12MR, SLJ, HG and SR performance. A positive maturation offset parameter was also associated with a better performance in SLJ, HG and SR tests. Sex differences were found in all motor tests. Brazilian youth showed better scores in MP than their Peruvian peers, even when controlling for their body size differences The current study identified the key body size associated with four body mass-dependent MP tests. Biological maturation and PA were associated with strength and motor performance. Sex differences were found in all motor tests, as well as across countries favoring Brazilian children even when accounting for their body size/shape differences.
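    The multiplicative allometric adjustment referred to in the records above can be sketched on the log scale as follows; the column names and model terms are hypothetical stand-ins rather than the study's exact specification.

```python
# Sketch: multiplicative allometric model fitted as a log-linear regression,
# log(MP) = a + k1*log(mass) + k2*log(height) + covariates.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def fit_allometric(df: pd.DataFrame, outcome: str = "standing_long_jump"):
    df = df.assign(log_mp=np.log(df[outcome]),
                   log_mass=np.log(df["mass_kg"]),
                   log_height=np.log(df["height_m"]))
    # Country and sex contrasts are tested after adjusting for body size.
    return smf.ols(
        "log_mp ~ log_mass + log_height + maturity_offset + sex + country",
        data=df,
    ).fit()

# model = fit_allometric(youth_data)   # hypothetical long-format dataset
# print(model.params)                  # k1, k2 are the allometric exponents
```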

  16. Enhancing Application Performance Using Mini-Apps: Comparison of Hybrid Parallel Programming Paradigms

    Science.gov (United States)

    Lawson, Gary; Sosonkina, Masha; Baurle, Robert; Hammond, Dana

    2017-01-01

    In many fields, real-world applications for High Performance Computing have already been developed. For these applications to stay up-to-date, new parallel strategies must be explored to yield the best performance; however, restructuring or modifying a real-world application may be daunting depending on the size of the code. In this case, a mini-app may be employed to quickly explore such options without modifying the entire code. In this work, several mini-apps have been created to enhance a real-world application performance, namely the VULCAN code for complex flow analysis developed at the NASA Langley Research Center. These mini-apps explore hybrid parallel programming paradigms with Message Passing Interface (MPI) for distributed memory access and either Shared MPI (SMPI) or OpenMP for shared memory accesses. Performance testing shows that MPI+SMPI yields the best execution performance, while requiring the largest number of code changes. A maximum speedup of 23 was measured for MPI+SMPI, but only 11 was measured for MPI+OpenMP.

  17. The electronic residency application service application can predict accreditation council for graduate medical education competency-based surgical resident performance.

    Science.gov (United States)

    Tolan, Amy M; Kaji, Amy H; Quach, Chi; Hines, O Joe; de Virgilio, Christian

    2010-01-01

    Program directors often struggle to determine which factors in the Electronic Residency Application Service (ERAS) application are important in the residency selection process. With the establishment of the Accreditation Council for Graduate Medical Education (ACGME) competencies, it would be important to know whether information available in the ERAS application can predict subsequent competency-based performance of general surgery residents. This study is a retrospective correlation of data points found in the ERAS application with core competency-based clinical rotation evaluations. ACGME competency-based evaluations as well as technical skills assessments from all rotations during residency were collected. The overall competency score was defined as an average of all 6 competencies and technical skills. A total of 77 residents from two general surgery residency programs (one university and one community-based university affiliate) were included in the analysis. Receiving honors for many of the third-year clerkships and AOA membership were associated with a number of the individual competencies. USMLE scores were predictive only of Medical Knowledge (p = 0.004). Factors associated with higher overall competency were female gender (p = 0.02), AOA (p = 0.06), overall number of honors received (p = 0.04), and honors in Ob/Gyn (p = 0.03) and Pediatrics (p = 0.05). Multivariable analysis showed honors in Ob/Gyn, female gender, older age, and total number of honors to be predictive of a number of individual core competencies. USMLE scores were only predictive of Medical Knowledge. The ERAS application is useful for predicting subsequent competency-based performance in surgical residents. Receiving honors in the surgery clerkship, which has traditionally carried weight when evaluating a potential surgery resident, may not be as strong a predictor of future success.

  18. Research on dynamic performance design of mobile phone application based on context awareness

    Science.gov (United States)

    Bo, Zhang

    2018-05-01

    This study explores how the dynamic (motion) design of different mobile phone applications relates to users' cognitive differences, with the aim of reducing cognitive load and enhancing the sense of experience. It analyses dynamic design performance in four different interactive contexts, constructs a framework for the information service process under interactive context awareness, and proposes two perception principles for cognitive consensus between designer and user, together with two kinds of knowledge consistent with those principles. Analysis of the context helps users perceive the dynamic performance more intuitively, so that the details of interaction are presented more vividly and smoothly, enhancing the user's experience of the interactive process. A shared perceptual experience enables designers and users to achieve emotional resonance in different interactive contexts and helps them quickly understand the interactive content and perceive its logic, hierarchy, and structure, thereby improving the effectiveness of mobile applications.

  19. Performance Engineering for a Medical Imaging Application on the Intel Xeon Phi Accelerator

    OpenAIRE

    Hofmann, Johannes; Treibig, Jan; Hager, Georg; Wellein, Gerhard

    2013-01-01

    We examine the Xeon Phi, which is based on Intel's Many Integrated Cores architecture, for its suitability to run the FDK algorithm--the most commonly used algorithm to perform the 3D image reconstruction in cone-beam computed tomography. We study the challenges of efficiently parallelizing the application and means to enable sensible data sharing between threads despite the lack of a shared last level cache. Apart from parallelization, SIMD vectorization is critical for good performance on t...

  20. Performance of large-scale scientific applications on the IBM ASCI Blue-Pacific system

    International Nuclear Information System (INIS)

    Mirin, A.

    1998-01-01

    The IBM ASCI Blue-Pacific System is a scalable, distributed/shared memory architecture designed to reach multi-teraflop performance. The IBM SP pieces together a large number of nodes, each having a modest number of processors. The system is designed to accommodate a mixed programming model as well as a pure message-passing paradigm. We examine a number of applications on this architecture and evaluate their performance and scalability

  1. Sentinel nodes are identifiable in formalin-fixed specimens after surgeon-performed ex vivo sentinel lymph node mapping in colorectal cancer.

    LENUS (Irish Health Repository)

    Smith, Fraser McLean

    2012-02-03

    BACKGROUND: In recent years, the technique of sentinel lymph node (SLN) mapping has been applied to colorectal cancer. One aim was to ultrastage patients who were deemed node negative by routine pathologic processing but who went on to develop systemic disease. Such a group may benefit from adjuvant chemotherapy. METHODS: With fully informed consent and ethical approval, 37 patients with primary colorectal cancer and 3 patients with large adenomas were prospectively mapped. Isosulfan blue dye (1 to 2 mL) was injected around tumors within 5 to 10 minutes of resection. After gentle massage to recreate in vivo lymph flow, specimens were placed directly into formalin. During routine pathologic analysis, all nodes were bivalved, and blue-staining nodes were noted. These later underwent multilevel step sectioning with hematoxylin and eosin and cytokeratin staining. RESULTS: SLNs were found in 39 of 40 patients (98% sensitivity), with an average of 4.1 SLNs per patient (range, 1-8). In 14 of 16 (88% specificity) patients with nodal metastases on routine reporting, SLN status was in accordance. Focused examination of SLNs identified occult tumor deposits in 6 (29%) of 21 node-negative patients. No metastatic cells were found in SLNs draining the three adenomas. CONCLUSIONS: The ability to identify SLNs after formalin fixation increases the ease and applicability of SLN mapping in colorectal cancer. Furthermore, the sensitivity and specificity of this simple ex vivo method for establishing regional lymph node status were directly comparable to those in previously published reports.

  2. A Fast Optimization Method for Reliability and Performance of Cloud Services Composition Application

    Directory of Open Access Journals (Sweden)

    Zhao Wu

    2013-01-01

    Full Text Available At present, cloud computing is one of the newest trends in distributed computation and is propelling another important revolution in the software industry. Cloud services composition is one of the key techniques in software development. The optimization of reliability and performance for cloud services composition applications, a typical stochastic optimization problem, faces severe challenges due to its randomness and long transactions, as well as characteristics of cloud computing resources such as openness and dynamism. Traditional reliability and performance optimization techniques, for example, Markov models and state space analysis, have defects such as being time consuming, being prone to state space explosion, and not satisfying the assumption of component execution independence. To overcome these defects, we propose in this paper a fast optimization method for the reliability and performance of cloud services composition applications based on the universal generating function and a genetic algorithm. First, a reliability and performance model for cloud service composition applications based on multi-state system theory is presented. Then a reliability and performance definition based on the universal generating function is proposed. Building on this, a fast reliability and performance optimization algorithm is presented. Finally, illustrative examples are given.
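
    To illustrate the universal generating function (UGF) formalism on which the above model rests, the sketch below (a minimal Python example with hypothetical component states and probabilities, not the authors' implementation) composes two multi-state services connected in series, where the composite performance is the minimum of the component performances and state probabilities multiply.

        from collections import defaultdict
        from itertools import product

        # A UGF is a set of (performance, probability) terms for one multi-state component.
        # The states and probabilities below are hypothetical, for illustration only.
        service_a = [(100.0, 0.90), (60.0, 0.08), (0.0, 0.02)]   # throughput in requests/s
        service_b = [(80.0, 0.95), (0.0, 0.05)]

        def compose_series(ugf_x, ugf_y):
            """Series composition: joint performance is limited by the slower
            component (min), and state probabilities multiply."""
            combined = defaultdict(float)
            for (gx, px), (gy, py) in product(ugf_x, ugf_y):
                combined[min(gx, gy)] += px * py
            return sorted(combined.items())

        def expected_performance(ugf):
            return sum(g * p for g, p in ugf)

        def reliability(ugf, demand):
            """Probability that the composite performance meets the demand level."""
            return sum(p for g, p in ugf if g >= demand)

        composite = compose_series(service_a, service_b)
        print("composite UGF terms:", composite)
        print("expected throughput:", expected_performance(composite))
        print("P(throughput >= 60 req/s):", round(reliability(composite, 60.0), 4))

    A genetic algorithm, as in the paper, would then search over alternative component selections to maximize such a reliability measure under the application's constraints.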

  3. SELECTION OF ENDOCRINOLOGY SUBSPECIALTY TRAINEES: WHICH APPLICANT CHARACTERISTICS ARE ASSOCIATED WITH PERFORMANCE DURING FELLOWSHIP TRAINING?

    Science.gov (United States)

    Natt, Neena; Chang, Alice Y; Berbari, Elie F; Kennel, Kurt A; Kearns, Ann E

    2016-01-01

    To determine which residency characteristics are associated with performance during endocrinology fellowship training as measured by competency-based faculty evaluation scores and faculty global ratings of trainee performance. We performed a retrospective review of interview applications from endocrinology fellows who graduated from a single academic institution between 2006 and 2013. Performance measures included competency-based faculty evaluation scores and faculty global ratings. The association between applicant characteristics and measures of performance during fellowship was examined by linear regression. The presence of a laudatory comparative statement in the residency program director's letter of recommendation (LoR) or experience as a chief resident was significantly associated with competency-based faculty evaluation scores (β = 0.22, P = .001; and β = 0.24, P = .009, respectively) and faculty global ratings (β = 0.85, P = .006; and β = 0.96, P = .015, respectively). The presence of a laudatory comparative statement in the residency program director's LoR or experience as a chief resident were significantly associated with overall performance during subspecialty fellowship training. Future studies are needed in other cohorts to determine the broader implications of these findings in the application and selection process.

  4. Diagnostic performance of Body Mass Index, Waist Circumference and the Waist-to-Height Ratio for identifying cardiometabolic risk in Scottish pre-adolescents.

    Science.gov (United States)

    Buchan, Duncan S; McLellan, Gillian; Donnelly, Samantha; Arthur, Rosie

    2017-06-01

    Limited studies have examined the diagnostic performance of body mass index (BMI), waist circumference (WC) or waist-to-height ratio (WHtR) for identifying cardiometabolic risk (increased clustered glucose, triglycerides, mean arterial pressure and inv-HDL-cholesterol) in pre-adolescent youth. To compare the utility of BMI, WC and WHtR as predictors of cardiometabolic risk (CMR) in Scottish pre-adolescent children, a cross-sectional analysis of 223 Scottish children (55.2% boys, mean age = 8.4 years) was undertaken. BMI, WC and WHtR were used as exposure variables within multivariate logistic regression and ROC analyses to examine the utility of these anthropometric indices in identifying those at cardiometabolic risk. Individuals with an elevated WHtR, WC and BMI were 3.51 (95% CI = 1.71-7.23; p < .001), 2.34 (95% CI = 1.35-4.06; p = .002) and 2.59 (95% CI = 1.42-4.73; p = .002) times more likely to be at cardiometabolic risk, respectively. The areas under the curves (AUCs) for identifying children with cardiometabolic risk were significant and similar among the anthropometric indices (AUCs = 0.60-0.65). When stratified by BMI, both WC and WHtR demonstrated a fair-to-good ability to identify those at cardiometabolic risk (AUC = 0.75-0.81). Findings suggest that the combination of BMI with either WC or WHtR may provide an added benefit in the assessment of cardiometabolic risk amongst pre-adolescents.
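
    The analysis reported above combines logistic regression (odds ratios for an elevated index) with ROC curves; the sketch below shows that pipeline in Python on synthetic data (the generated values are stand-ins, not the study data; NumPy and scikit-learn are assumed to be available).

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(0)
        n = 223                                   # same sample size as the study; values are synthetic
        whtr_elevated = rng.integers(0, 2, n)     # 1 = elevated waist-to-height ratio
        # Synthetic binary outcome loosely associated with the exposure
        risk = (rng.random(n) < np.where(whtr_elevated == 1, 0.45, 0.20)).astype(int)

        # Large C makes the fit effectively unpenalized, so the coefficient is a plain odds ratio
        model = LogisticRegression(C=1e6).fit(whtr_elevated.reshape(-1, 1), risk)
        odds_ratio = np.exp(model.coef_[0][0])
        auc = roc_auc_score(risk, model.predict_proba(whtr_elevated.reshape(-1, 1))[:, 1])

        print(f"odds ratio for elevated WHtR: {odds_ratio:.2f}")
        print(f"area under the ROC curve:     {auc:.2f}")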

  5. Performance limitations of piezoelectric and force feedback electrostatic transducers in different applications

    International Nuclear Information System (INIS)

    Hadjiloucas, S; Walker, G C; Bowen, J W; Karatzas, L S

    2009-01-01

    Current limitations in piezoelectric and electrostatic transducers are discussed. A force-feedback electrostatic transducer capable of operating at bandwidths up to 20 kHz is described. Advantages of the proposed design are a linearised operation which simplifies the feedback control aspects and robustness of the performance characteristics to environmental perturbations. Applications in nanotechnology, optical sciences and acoustics are discussed.

  6. Development of high performance Schottky barrier diode and its application to plasma diagnostics

    International Nuclear Information System (INIS)

    Fujita, Junji; Kawahata, Kazuo; Okajima, Shigeki

    1993-10-01

    At the conclusion of the Supporting Collaboration Research on 'Development of High Performance Detectors in the Far Infrared Range' carried out from FY1990 to FY1992, the results of developing Schottky barrier diode and its application to plasma diagnostics are summarized. Some remarks as well as technical know-how for the correct use of diodes are also described. (author)

  7. Performance Evaluation and Community Application of Low-Cost Sensors for Ozone and Nitrogen Dioxide

    Science.gov (United States)

    This study reports on the performance of electrochemical-based low-cost sensors and their use in a community application. CairClip sensors were collocated with federal reference and equivalent methods and operated in a network of sites by citizen scientists (community members) in...

  8. Application of secondary ion mass spectrometry for the characterization of commercial high performance materials

    International Nuclear Information System (INIS)

    Gritsch, M.

    2000-09-01

    Industry today offers a vast number of high performance materials that have to meet the highest standards. Commercial high performance materials, though often sold in large quantities, still require ongoing research and development to keep up with increasing requirements and decreasing tolerances. Furthermore, a variety of materials on the market are not fully understood in their microstructure, in the way they react under application conditions, and in the mechanisms responsible for their degradation. Secondary Ion Mass Spectrometry (SIMS) is an analytical method that has been in commercial use for over 30 years. Its main advantages are its very high detection sensitivity (down to ppb), the ability to measure all elements with isotopic sensitivity, the ability to obtain laterally resolved images, and an inherent capability for depth profiling. These features make it an ideal tool for a wide field of applications within advanced materials science. The present work gives an introduction to the principles of SIMS and shows its successful application to the characterization of commercially used high performance materials. Finally, a selected collection of my publications in reviewed journals illustrates the state of the art in applied materials research and development with dynamic SIMS. All publications focus on the application of dynamic SIMS to analytical questions arising during the production and improvement of high-performance materials. (author)

  9. Construction Project Performance Improvement through Radio Frequency Identification Technology Application on a Project Supply Chain

    Science.gov (United States)

    Wang, Heng

    2017-01-01

    Construction project productivity typically lags other industries and it has been the focus of numerous studies in order to improve the project performance. This research investigated the application of Radio Frequency Identification (RFID) technology on construction projects' supply chain and determined that RFID technology can improve the…

  10. Application of computational fluid dynamics in building performance simulation for the outdoor environment: an overview

    NARCIS (Netherlands)

    Blocken, B.J.E.; Stathopoulos, T.; Carmeliet, J.; Hensen, J.L.M.

    2011-01-01

    This paper provides an overview of the application of CFD in building performance simulation for the outdoor environment, focused on four topics: (1) pedestrian wind environment around buildings, (2) wind-driven rain on building facades, (3) convective heat transfer coefficients at exterior building

  11. Performance limitations of piezoelectric and force feedback electrostatic transducers in different applications

    Energy Technology Data Exchange (ETDEWEB)

    Hadjiloucas, S; Walker, G C; Bowen, J W [Cybernetics, School of Systems Engineering, University of Reading, RG6 6AY (United Kingdom); Karatzas, L S, E-mail: s.hadjiloucas@reading.ac.u [Temasek Polytechnic, School of Engineering, 21 Tampines Avenue 1, Singapore, 529757 (Singapore)

    2009-07-01

    Current limitations in piezoelectric and electrostatic transducers are discussed. A force-feedback electrostatic transducer capable of operating at bandwidths up to 20 kHz is described. Advantages of the proposed design are a linearised operation which simplifies the feedback control aspects and robustness of the performance characteristics to environmental perturbations. Applications in nanotechnology, optical sciences and acoustics are discussed.

  12. Bayesian Comparison of Alternative Graded Response Models for Performance Assessment Applications

    Science.gov (United States)

    Zhu, Xiaowen; Stone, Clement A.

    2012-01-01

    This study examined the relative effectiveness of Bayesian model comparison methods in selecting an appropriate graded response (GR) model for performance assessment applications. Three popular methods were considered: deviance information criterion (DIC), conditional predictive ordinate (CPO), and posterior predictive model checking (PPMC). Using…

  13. Application of performance assessment as a tool for guiding project work

    International Nuclear Information System (INIS)

    McCombie, C.; Zuidema, P.

    1992-01-01

    The ultimate aim of the performance assessment methodology developed over the last 10-15 years is to predict quantitatively the behavior of disposal systems over periods of time extending into the far future. The methodology can, however, also be applied to a range of tasks during repository development and is used in many programmes as a tool for improving or optimizing the design of subsystem components or for guiding the course of project planning. In the Swiss waste management programme, there are several examples of the use of performance assessment as a tool in this manner. The interaction between research models, assessment models and simplified models is considered to be of key importance, and corresponding measures are taken to properly structure the process and to track the data: first, the results of all applications of the models are included in a consistent manner in the scenario analyses for the different sites and systems and, second, consistency in the underlying assumptions and in the data used in the different model calculations is assured by the consistent application of a configuration data management system (CDM). Almost all the applications of performance assessment have been included in Swiss work, but for this paper only two examples have been selected: applications of performance assessment in both the HLW and the LLW programmes, and acceptance of specific waste types and their allocation to an appropriate repository on the basis of simplified safety analyses.

  14. Applicability of an Indirect VO2max Test: Its Association with the 400 Meters Freestyle Performance

    Directory of Open Access Journals (Sweden)

    Adalberto Veronese da Costa

    Full Text Available Abstract The aim of this study was to evaluate the VO2max using a previously validated indirect test for non-expert adult swimmers and to verify its connection with the 400 meters freestyle test. A total of 17 non-expert male swimmers (21.5 ± 3.12 years) were evaluated. Body composition measurements included body weight (74 ± 9.41 kg), height (172.9 ± 5.21 cm) and body fat percentage (15.2 ± 4.15%). Two tests were conducted on different days: the 400 meters freestyle (400 MF) and the Progressive Swim Test (PSwT), respectively. The participants' heart rate before and after the test (BHR and AHR) was analyzed, as well as the subjective perception of effort (RPE), the number of laps covered (NLP), and the time of test execution measured in minutes. Significant differences were identified in all variables (p < 0.05). Correlations stronger than -0.60 were found between AHR and execution time (r > -0.70), as well as between the VO2max estimated by the PSwT and the 400 MF performance test (r > -0.70). The Bland-Altman plot showed that the values found were within the established 95% limits of agreement (±1.96 SD). A negative correlation between a swimming test and a test that estimates the VO2max was observed, and the PSwT showed results closely approximating the aerobic power of non-expert swimmers. In conclusion, the PSwT is applicable for non-expert adult swimmers.
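
    The correlation and Bland-Altman agreement analysis used above can be outlined as follows (a sketch on simulated paired measurements, not the study data; NumPy and SciPy are assumed).

        import numpy as np
        from scipy import stats

        # Hypothetical paired VO2max estimates for 17 swimmers (mL/kg/min)
        vo2_pswt = np.array([42.1, 45.3, 39.8, 50.2, 44.0, 47.5, 41.2, 43.9, 46.1,
                             40.5, 48.3, 44.7, 42.8, 45.9, 43.1, 46.8, 41.9])
        vo2_400mf = vo2_pswt + np.random.default_rng(1).normal(0.5, 1.5, vo2_pswt.size)

        r, p = stats.pearsonr(vo2_pswt, vo2_400mf)
        print(f"Pearson r = {r:.2f}, p = {p:.4f}")

        # Bland-Altman limits of agreement: mean difference +/- 1.96 SD
        diff = vo2_pswt - vo2_400mf
        mean_diff, sd_diff = diff.mean(), diff.std(ddof=1)
        lower, upper = mean_diff - 1.96 * sd_diff, mean_diff + 1.96 * sd_diff
        print(f"bias = {mean_diff:.2f}, 95% limits of agreement = [{lower:.2f}, {upper:.2f}]")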

  15. Performance of a novel micro force vector sensor and outlook into its biomedical applications

    Science.gov (United States)

    Meiss, Thorsten; Rossner, Tim; Minamisava Faria, Carlos; Völlmeke, Stefan; Opitz, Thomas; Werthschützky, Roland

    2011-05-01

    For the HapCath system, which provides haptic feedback of the forces acting on a guide wire's tip during vascular catheterization, very small piezoresistive force sensors of 200 × 200 × 640 μm³ have been developed. This paper focuses on the characterization of the measurement performance and on possible new applications. Besides the determination of the dynamic measurement performance, special focus is put on the results of the 3-component force vector calibration. The article addresses the particularly advantageous characteristics of the sensor as well as the limits of its applicability. Building on these characteristics, the second part of the article demonstrates new applications that can be opened up with the novel force sensor, such as automatic navigation of medical or biological instruments without impacting surrounding tissue, surface roughness evaluation in biomedical systems, needle insertion with tactile or higher-level feedback, or even building tactile hairs for artificial organisms.

  16. Technology, Performance, and Market Report of Wind-Diesel Applications for Remote and Island Communities: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Baring-Gould, I.; Dabo, M.

    2009-02-01

    This paper describes the current status of wind-diesel technology and its applications, the current research activities, and the remaining system technical and commercial challenges. System architectures, dispatch strategies, and operating experience from a variety of wind-diesel systems will be discussed, as well as how recent development to explore distributed energy generation solutions for wind generation can benefit from the performance experience of operating systems. The paper also includes a detailed discussion of the performance of wind-diesel applications in Alaska, where 10 wind-diesel stations are operating and additional systems are currently being implemented. Additionally, because this application represents an international opportunity, a community of interest committed to sharing technical and operating developments is being formed. The authors hope to encourage this expansion while allowing communities and nations to investigate the wind-diesel option for reducing their dependence on diesel-driven energy sources.

  17. Technology, Performance, and Market Report of Wind-Diesel Applications for Remote and Island Communities: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Baring-Gould, I.; Dabo, M.

    2009-05-01

    This paper describes the current status of wind-diesel technology and its applications, the current research activities, and the remaining system technical and commercial challenges. System architectures, dispatch strategies, and operating experience from a variety of wind-diesel systems will be discussed, as well as how recent development to explore distributed energy generation solutions for wind generation can benefit from the performance experience of operating systems. The paper also includes a detailed discussion of the performance of wind-diesel applications in Alaska, where 10 wind-diesel stations are operating and additional systems are currently being implemented. Additionally, because this application represents an international opportunity, a community of interest committed to sharing technical and operating developments is being formed. The authors hope to encourage this expansion while allowing communities and nations to investigate the wind-diesel option for reducing their dependence on diesel-driven energy sources.

  18. Diagnostic performance of neck circumference to identify overweight and obesity as defined by body mass index in children and adolescents: systematic review and meta-analysis.

    Science.gov (United States)

    Ma, Chunming; Wang, Rui; Liu, Yue; Lu, Qiang; Liu, Xiaoli; Yin, Fuzai

    2017-05-01

    The neck circumference (NC) has been shown to be an accurate index for screening overweight and obesity in children and adolescents. To perform a meta-analysis to assess the performance of NC for the assessment of overweight and obesity. Data sources were PubMed and EMBASE up to March 2016. Studies providing measures of diagnostic performance of NC and using body mass index as reference standard were included. Six eligible studies that evaluated 11 214 children and adolescents aged 6-18 years were included in the meta-analysis. NC showed pooled sensitivity to detect high body mass index of 0.780 (95% confidence interval [CI] = 0.765-0.794), specificity of 0.746 (95% CI = 0.736-0.756) and a diagnostic odds ratio of 17.343 (95% CI = 8.743-34.405). The NC had moderate diagnostic accuracy for identifying overweight and obesity in children and adolescents.
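
    For any single study entering such a meta-analysis, sensitivity, specificity and the diagnostic odds ratio come from the 2 × 2 table of NC screening against the BMI reference standard; the sketch below shows the arithmetic on invented counts (the pooled estimates reported above are combined across studies, not computed from one table).

        # Hypothetical 2x2 table: neck circumference screening vs. BMI reference standard
        tp, fp, fn, tn = 312, 95, 88, 705   # invented counts, not from any included study

        sensitivity = tp / (tp + fn)
        specificity = tn / (tn + fp)
        diagnostic_odds_ratio = (tp * tn) / (fp * fn)

        print(f"sensitivity           = {sensitivity:.3f}")
        print(f"specificity           = {specificity:.3f}")
        print(f"diagnostic odds ratio = {diagnostic_odds_ratio:.1f}")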

  19. Evaluation of Software Quality to Improve Application Performance Using McCall Model

    Directory of Open Access Journals (Sweden)

    Inda D Lestantri

    2018-04-01

    Full Text Available Beyond its primary function of automation, software should add value by improving the performance of the organization. Before being implemented in an operational environment, software must pass staged testing to ensure that it functions properly, meets user needs and is convenient for users to use. This testing was performed on a web-based application, taking the e-SAP application as a test case. e-SAP is an application used to monitor teaching and learning activities at a university in Jakarta. To measure software quality, testing can be done on randomly selected users. The users sampled in this test were aged 18 to 25 years and had an information technology background. The test was conducted on 30 respondents using the McCall model. The McCall testing model consists of 11 dimensions grouped into 3 categories. This paper describes testing with reference to the product operation category, which includes 5 dimensions: correctness, usability, efficiency, reliability, and integrity. The paper discusses testing on each dimension to measure software quality as an effort to improve performance. The result is that the e-SAP application has good quality, with a product operation value of 85.09%. This indicates that the e-SAP application is of high quality, so the application deserves to be examined in the next stage in the operational environment.
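
    The product operation value reported above is, in essence, a weighted aggregation of the questionnaire scores for the five dimensions; a minimal sketch follows (the per-dimension scores and the equal weighting are assumptions, since the paper's raw data and weights are not reproduced here).

        # Hypothetical mean questionnaire scores per dimension, on a 0-100 scale
        dimension_scores = {
            "correctness": 88.0,
            "usability":   84.5,
            "efficiency":  82.0,
            "reliability": 86.0,
            "integrity":   85.0,
        }
        # Equal weights are assumed here; McCall-style evaluations often weight dimensions differently
        weights = {name: 1 / len(dimension_scores) for name in dimension_scores}

        product_operation = sum(weights[name] * score for name, score in dimension_scores.items())
        print(f"product operation score = {product_operation:.2f}%")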

  20. Research and Application of New Type of High Performance Titanium Alloy

    Directory of Open Access Journals (Sweden)

    ZHU Zhishou

    2016-06-01

    Full Text Available With the continuous extension of the application volume and range of titanium alloy in national aviation, space, weaponry, marine and chemical industries, ever more critical requirements have been raised for the comprehensive mechanical properties, low cost and processing properties of titanium alloy. Through alloying based on microstructure parameter design, combined with the comprehensive strengthening and toughening technologies of fine-grain strengthening, phase transformation and process control for high toughness, a new type of high performance titanium alloy has been researched and manufactured that offers a good combination of high strength and toughness, fatigue resistance, failure resistance and impact resistance. The new titanium alloy has extended the application volume and application level in high-end fields, realized industrial upgrading and transformation, and met the application requirements of next-generation equipment.

  1. Performance analysis of InSb based QWFET for ultra high speed applications

    International Nuclear Information System (INIS)

    Subash, T. D.; Gnanasekaran, T.; Divya, C.

    2015-01-01

    An indium antimonide based QWFET (quantum well field effect transistor) with a gate length down to 50 nm has been designed and investigated for the first time for L-band radar applications at 230 GHz. The QWFETs are designed to meet the drive-current requirements of the high performance node of the International Technology Roadmap for Semiconductors (ITRS) (Semiconductor Industry Association 2010). The performance of the device is investigated using SYNOPSYS technology CAD (TCAD) software. The InSb based QWFET could be a promising device technology for very low power and ultra-high speed performance with 5-10 times lower DC power dissipation. (semiconductor devices)

  2. Assessment of the performance of containment and surveillance equipment part 2: trial application

    International Nuclear Information System (INIS)

    Rezniczek, A.; Richter, B.; Jussofie, A.

    2009-01-01

    The adopted methodological approach for assessing the performance of Containment and Surveillance (C/S) equipment resulted from an account of work performed for and in cooperation with the ESARDA Working Group on C/S. It was applied on a trial basis to a dry storage facility for spent nuclear fuel and consisted of the following steps: (1) Acquisition and analysis of design information and operational characteristics of the facility under consideration, (2) assumptions on diversion and misuse scenarios, (3) assumptions on safeguards approach and definition of safeguards requirements, (4) compilation and characterisation of candidate C/S equipment, (5) performance assessment of C/S equipment. The candidate equipment taken into account was routinely used by the IAEA: DCM14-type camera, Type E cap-and-wire seal, COBRA fibre optic seal, and VACOSS electronic seal. Four applications were considered: camera mounted in the reception area, seal on secondary lid of transport and storage cask, seal on protective lid, and seal on group of casks. For these applications, requirements were defined and requirement levels were attributed. The assignment of performance levels was carried out by using the technical specifications and design basis tolerances provided by the equipment manufacturers. The results were entered into four performance assessment tables. Although the assessment methodology was not yet fully developed, its trial application yielded promising results with regard to the selection of appropriate C/S equipment.

  3. In search of novel, high performance and intelligent materials for applications in severe and unconditioned environments

    International Nuclear Information System (INIS)

    Gyeabour Ayensu, A. I.; Normeshie, C. M. K.

    2007-01-01

    For extreme operating conditions in aerospace, nuclear power plants and medical applications, novel materials have become more competitive than traditional materials because of their unique characteristics. Extensive research programmes are being undertaken to develop high performance and knowledge-intensive new materials, since existing materials cannot meet the stringent technological requirements of advanced materials for emerging industries. The technologies of intermetallic compounds, nanostructural materials, advanced composites, and photonics materials are presented. In addition, medical biomaterial implants of high functional performance, based on biocompatibility and resistance against corrosion and degradation, for applications in the hostile environment of the human body are discussed. The opportunities for African researchers to collaborate in international research programmes to develop local raw materials into high performance materials are also highlighted. (au)

  4. Screening applicants for risk of poor academic performance: a novel scoring system using preadmission grade point averages and graduate record examination scores.

    Science.gov (United States)

    Luce, David

    2011-01-01

    The purpose of this study was to develop an effective screening tool for identifying physician assistant (PA) program applicants at highest risk for poor academic performance. Prior to reviewing applications for the class of 2009, a retrospective analysis of preadmission data took place for the classes of 2006, 2007, and 2008. A single composite score was calculated for each student who matriculated (number of subjects, N=228), incorporating the total undergraduate grade point average (UGPA), the science GPA (SGPA), and the three component Graduate Record Examination (GRE) scores: verbal (GRE-V), quantitative (GRE-Q), and analytical (GRE-A). Individual applicant scores for each of the five parameters were ranked in descending quintiles. Each applicant's five quintile scores were then added, yielding a total quintile score ranging from 25, which indicated excellent performance, to 5, which indicated poorer performance. Thirteen of the 228 students had academic difficulty (dismissal, suspension, or one quarter on academic warning or probation). Twelve of the 13 students having academic difficulty had a preadmission total quintile score of 12 or lower (range, 6-14). In response to this descriptive analysis, when selecting applicants for the class of 2009, the admissions committee used the total quintile score for screening applicants for interviews. Analysis of correlations in preadmission, graduate, and postgraduate performance data for the classes of 2009-2013 will continue and may help identify those applicants at risk for academic difficulty. Establishing a threshold total quintile score of applicant GPA and GRE scores may significantly decrease the number of entering PA students at risk for poor academic performance.
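
    The total quintile score described above can be computed by ranking each of the five preadmission measures into quintiles (scored 1-5) and summing them; the sketch below shows one way to do this (the applicant values are fabricated, and NumPy and pandas are assumed to be available).

        import numpy as np
        import pandas as pd

        rng = np.random.default_rng(42)
        n = 228   # cohort size from the study; the values themselves are fabricated
        applicants = pd.DataFrame({
            "ugpa":  rng.normal(3.4, 0.30, n).clip(2.0, 4.0),
            "sgpa":  rng.normal(3.3, 0.35, n).clip(2.0, 4.0),
            "gre_v": rng.normal(153, 7, n),
            "gre_q": rng.normal(155, 7, n),
            "gre_a": rng.normal(4.0, 0.6, n),
        })

        # Rank each measure into quintiles scored 1 (lowest) to 5 (highest)
        quintiles = applicants.apply(lambda col: pd.qcut(col, 5, labels=False) + 1)

        # Total quintile score ranges from 5 (poorest) to 25 (strongest)
        applicants["total_quintile_score"] = quintiles.sum(axis=1)
        print(applicants["total_quintile_score"].describe())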

  5. Performance characteristics of the Fermilab 15-foot bubble chamber with a 1/3-scale internal picket fence (IPF) and a two-plane external muon identifier (EMI)

    Energy Technology Data Exchange (ETDEWEB)

    Stevenson, M.L.

    1978-06-01

    The Fermilab 15-foot bubble chamber has been exposed to a quadrupole triplet neutrino beam. During this exposure, a 2-plane EMI and a 1/3-scale IPF were in operation downstream of the bubble chamber. The IPF consisted of sixteen 0.1 m² drift chambers (pickets) placed inside the vacuum tank of the bubble chamber to record temporal information from neutrino interactions. When a greater than or equal to 5-fold time coincidence between one or more of the pickets of the IPF and the EMI was formed, one was able to search the magnetic tapes for dimuon candidates. Even with 1/3 geometrical coverage by the IPF, this system identified 70% of the dimuon candidates before the film was scanned. Other performance characteristics of the system will be presented, with emphasis on the usefulness of the IPF.

  6. Performance characteristics of the Fermilab 15-foot bubble chamber with a 1/3-scale internal picket fence (IPF) and a two-plane external muon identifier (EMI)

    International Nuclear Information System (INIS)

    Stevenson, M.L.

    1978-06-01

    The Fermilab 15-foot bubble chamber has been exposed to a quadrupole triplet neutrino beam. During this exposure, a 2-plane EMI and a 1/3-scale IPF were in operation downstream of the bubble chamber. The IPF consisted of sixteen 0.1 m² drift chambers (pickets) placed inside the vacuum tank of the bubble chamber to record temporal information from neutrino interactions. When a greater than or equal to 5-fold time coincidence between one or more of the pickets of the IPF and the EMI was formed, one was able to search the magnetic tapes for dimuon candidates. Even with 1/3 geometrical coverage by the IPF, this system identified 70% of the dimuon candidates before the film was scanned. Other performance characteristics of the system will be presented, with emphasis on the usefulness of the IPF.

  7. Evaluation of Smartphone Inertial Sensor Performance for Cross-Platform Mobile Applications

    Directory of Open Access Journals (Sweden)

    Anton Kos

    2016-04-01

    Full Text Available Smartphone sensors are being increasingly used in mobile applications. The performance of sensors varies considerably among different smartphone models and the development of a cross-platform mobile application might be a very complex and demanding task. A publicly accessible resource containing real-life-situation smartphone sensor parameters could be of great help for cross-platform developers. To address this issue we have designed and implemented a pilot participatory sensing application for measuring, gathering, and analyzing smartphone sensor parameters. We start with smartphone accelerometer and gyroscope bias and noise parameters. The application database presently includes sensor parameters of more than 60 different smartphone models of different platforms. It is a modest, but important start, offering information on several statistical parameters of the measured smartphone sensors and insights into their performance. The next step, a large-scale cloud-based version of the application, is already planned. The large database of smartphone sensor parameters may prove particularly useful for cross-platform developers. It may also be interesting for individual participants who would be able to check-up and compare their smartphone sensors against a large number of similar or identical models.
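
    The accelerometer bias and noise parameters that the application gathers can be estimated from a short recording taken while the phone rests motionless; a sketch of that computation follows (the samples are simulated, and gravity is assumed to lie along the z axis).

        import numpy as np

        GRAVITY = 9.80665  # m/s^2

        # Simulated 10 s recording at 100 Hz from a phone resting flat on a table
        rng = np.random.default_rng(7)
        true_bias = np.array([0.05, -0.03, 0.12])            # hypothetical per-axis bias
        samples = rng.normal(0.0, 0.02, (1000, 3)) + true_bias
        samples[:, 2] += GRAVITY                              # z axis carries gravity when the phone lies flat

        reference = np.array([0.0, 0.0, GRAVITY])
        bias = samples.mean(axis=0) - reference               # static offset per axis
        noise = samples.std(axis=0, ddof=1)                   # noise as per-axis standard deviation

        for axis, b, s in zip("xyz", bias, noise):
            print(f"{axis}: bias = {b:+.4f} m/s^2, noise (1 sigma) = {s:.4f} m/s^2")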

  8. Evaluation of Smartphone Inertial Sensor Performance for Cross-Platform Mobile Applications

    Science.gov (United States)

    Kos, Anton; Tomažič, Sašo; Umek, Anton

    2016-01-01

    Smartphone sensors are being increasingly used in mobile applications. The performance of sensors varies considerably among different smartphone models and the development of a cross-platform mobile application might be a very complex and demanding task. A publicly accessible resource containing real-life-situation smartphone sensor parameters could be of great help for cross-platform developers. To address this issue we have designed and implemented a pilot participatory sensing application for measuring, gathering, and analyzing smartphone sensor parameters. We start with smartphone accelerometer and gyroscope bias and noise parameters. The application database presently includes sensor parameters of more than 60 different smartphone models of different platforms. It is a modest, but important start, offering information on several statistical parameters of the measured smartphone sensors and insights into their performance. The next step, a large-scale cloud-based version of the application, is already planned. The large database of smartphone sensor parameters may prove particularly useful for cross-platform developers. It may also be interesting for individual participants who would be able to check-up and compare their smartphone sensors against a large number of similar or identical models. PMID:27049391

  9. A Performance/Cost Evaluation for a GPU-Based Drug Discovery Application on Volunteer Computing

    Science.gov (United States)

    Guerrero, Ginés D.; Imbernón, Baldomero; García, José M.

    2014-01-01

    Bioinformatics is an interdisciplinary research field that develops tools for the analysis of large biological databases, and, thus, the use of high performance computing (HPC) platforms is mandatory for the generation of useful biological knowledge. The latest generation of graphics processing units (GPUs) has democratized the use of HPC as they push desktop computers to cluster-level performance. Many applications within this field have been developed to leverage these powerful and low-cost architectures. However, these applications still need to scale to larger GPU-based systems to enable remarkable advances in the fields of healthcare, drug discovery, genome research, etc. The inclusion of GPUs in HPC systems exacerbates power and temperature issues, increasing the total cost of ownership (TCO). This paper explores the benefits of volunteer computing to scale bioinformatics applications as an alternative to owning large GPU-based local infrastructures. We use as a benchmark a GPU-based drug discovery application called BINDSURF, whose computational requirements go beyond a single desktop machine. Volunteer computing is presented as a cheap and valid HPC system for those bioinformatics applications that need to process huge amounts of data and where the response time is not a critical factor. PMID:25025055

  10. A Performance/Cost Evaluation for a GPU-Based Drug Discovery Application on Volunteer Computing

    Directory of Open Access Journals (Sweden)

    Ginés D. Guerrero

    2014-01-01

    Full Text Available Bioinformatics is an interdisciplinary research field that develops tools for the analysis of large biological databases, and, thus, the use of high performance computing (HPC) platforms is mandatory for the generation of useful biological knowledge. The latest generation of graphics processing units (GPUs) has democratized the use of HPC as they push desktop computers to cluster-level performance. Many applications within this field have been developed to leverage these powerful and low-cost architectures. However, these applications still need to scale to larger GPU-based systems to enable remarkable advances in the fields of healthcare, drug discovery, genome research, etc. The inclusion of GPUs in HPC systems exacerbates power and temperature issues, increasing the total cost of ownership (TCO). This paper explores the benefits of volunteer computing to scale bioinformatics applications as an alternative to owning large GPU-based local infrastructures. We use as a benchmark a GPU-based drug discovery application called BINDSURF, whose computational requirements go beyond a single desktop machine. Volunteer computing is presented as a cheap and valid HPC system for those bioinformatics applications that need to process huge amounts of data and where the response time is not a critical factor.

  11. A Secure Web Application Providing Public Access to High-Performance Data Intensive Scientific Resources - ScalaBLAST Web Application

    International Nuclear Information System (INIS)

    Curtis, Darren S.; Peterson, Elena S.; Oehmen, Chris S.

    2008-01-01

    This work presents the ScalaBLAST Web Application (SWA), a web-based application implemented using the PHP scripting language, the MySQL DBMS, and the Apache web server on a GNU/Linux platform. SWA is an application built as part of the Data Intensive Computing for Complex Biological Systems (DICCBS) project at the Pacific Northwest National Laboratory (PNNL). SWA delivers accelerated throughput of bioinformatics analysis via high-performance computing through a convenient, easy-to-use web interface. This approach greatly enhances emerging fields of study in biology such as ontology-based homology and multiple whole-genome comparisons, which, in the absence of a tool like SWA, require a heroic effort to overcome the computational bottleneck associated with genome analysis. The current version of SWA includes a user account management system, a web-based user interface, and a backend process that generates the files necessary for the Internet scientific community to submit a ScalaBLAST parallel processing job on a dedicated cluster.

  12. Application essays and future performance in medical school: are they related?

    Science.gov (United States)

    Dong, Ting; Kay, Allen; Artino, Anthony R; Gilliland, William R; Waechter, Donna M; Cruess, David; DeZee, Kent J; Durning, Steven J

    2013-01-01

    There is a paucity of research on whether application essays are a valid indicator of medical students' future performance. The goal is to score medical school application essays systematically and examine the correlations between these essay scores and several indicators of student performance during medical school and internship. A journalist created a scoring rubric based on the journalism literature and scored 2 required essays of students admitted to our university in 1 year (N = 145). We picked 7 indicators of medical school and internship performance and correlated these measures with overall essay scores: preclinical medical school grade point average (GPA), clinical medical school GPA, cumulative medical school GPA, U.S. Medical Licensing Exam (USMLE) Step 1 and 2 scores, and scores on a program director's evaluation measuring intern professionalism and expertise. We then examined the Pearson and Spearman correlations between essay scores and the outcomes. Essay scores did not vary widely. American Medical College Application Service essay scores ranged from 3.3 to 4.5 (M = 4.11, SD = 0.15), and Uniformed Services University of the Health Sciences essay scores ranged from 2.9 to 4.5 (M = 4.09, SD = 0.17). None of the medical school or internship performance indicators was significantly correlated with the essay scores. These findings raise questions about the utility of matriculation essays, a resource-intensive admission requirement.

  13. Environmental performance of electricity storage systems for grid applications, a life cycle approach

    International Nuclear Information System (INIS)

    Oliveira, L.; Messagie, M.; Mertens, J.; Laget, H.; Coosemans, T.; Van Mierlo, J.

    2015-01-01

    Highlights: • Large energy storage systems: environmental performance under different scenarios. • ReCiPe midpoint and endpoint impact assessment results are analyzed. • Energy storage systems can replace peak power generation units. • Energy storage systems and renewable energy have the best environmental scores. • Environmental performance of storage systems is application dependent. - Abstract: In this paper, the environmental performance of electricity storage technologies for grid applications is assessed. Using a life cycle assessment methodology, we analyze the impacts of the construction, disposal/end of life, and usage of each of the systems. Pumped hydro and compressed air storage are studied as mechanical storage, and advanced lead-acid, sodium-sulfur, lithium-ion and nickel-sodium-chloride batteries are addressed as electrochemical storage systems. Hydrogen production from electrolysis and subsequent usage in a proton exchange membrane fuel cell are also analyzed. The selected electricity storage systems mimic real-world installations in terms of capacity, power rating, lifetime, technology and application. The functional unit is one kW h of energy delivered back to the grid from the storage system. The environmental impacts assessed are climate change, human toxicity, particulate matter formation, and fossil resource depletion. Different electricity mixes are used in order to exemplify scenarios where the selected technologies meet specific applications. Results indicate that the performance of the storage systems is tied to the electricity feedstocks used during the use stage. Renewable energy sources have lower impacts throughout the use stage of the storage technologies. Using the Belgian electricity mix of 2011 as benchmark, the sodium-sulfur battery is shown to be the best performer for all the impacts analyzed. Pumped hydro storage follows in second place. Regarding infrastructure and end of life, results indicate that battery systems

  14. Characterization of high performance silicon-based VMJ PV cells for laser power transmission applications

    Science.gov (United States)

    Perales, Mico; Yang, Mei-huan; Wu, Cheng-liang; Hsu, Chin-wei; Chao, Wei-sheng; Chen, Kun-hsien; Zahuranec, Terry

    2016-03-01

    Continuing improvements in the cost and power of laser diodes have been critical in launching the emerging fields of power over fiber (PoF), and laser power beaming. Laser power is transmitted either over fiber (for PoF), or through free space (power beaming), and is converted to electricity by photovoltaic cells designed to efficiently convert the laser light. MH GoPower's vertical multi-junction (VMJ) PV cell, designed for high intensity photovoltaic applications, is fueling the emergence of this market, by enabling unparalleled photovoltaic receiver flexibility in voltage, cell size, and power output. Our research examined the use of the VMJ PV cell for laser power transmission applications. We fully characterized the performance of the VMJ PV cell under various laser conditions, including multiple near IR wavelengths and light intensities up to tens of watts per cm2. Results indicated VMJ PV cell efficiency over 40% for 9xx nm wavelengths, at laser power densities near 30 W/cm2. We also investigated the impact of the physical dimensions (length, width, and height) of the VMJ PV cell on its performance, showing similarly high performance across a wide range of cell dimensions. We then evaluated the VMJ PV cell performance within the power over fiber application, examining the cell's effectiveness in receiver packages that deliver target voltage, intensity, and power levels. By designing and characterizing multiple receivers, we illustrated techniques for packaging the VMJ PV cell for achieving high performance (> 30%), high power (> 185 W), and target voltages for power over fiber applications.
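
    The conversion efficiencies quoted above follow from dividing the cell's electrical output power by the optical power incident on the cell; a small sketch of that calculation is shown below (the voltage, current, intensity and area values are illustrative placeholders, not measurements from the paper).

        def conversion_efficiency(v_out, i_out, intensity_w_per_cm2, cell_area_cm2):
            """Electrical output power divided by incident optical power."""
            p_electrical = v_out * i_out
            p_optical = intensity_w_per_cm2 * cell_area_cm2
            return p_electrical / p_optical

        # Illustrative values only: a small VMJ cell under ~30 W/cm2 of 9xx nm laser light
        eff = conversion_efficiency(v_out=24.0, i_out=0.5,
                                    intensity_w_per_cm2=30.0, cell_area_cm2=1.0)
        print(f"conversion efficiency = {eff:.1%}")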

  15. Accelerating Scientific Applications using High Performance Dense and Sparse Linear Algebra Kernels on GPUs

    KAUST Repository

    Abdelfattah, Ahmad

    2015-01-15

    High performance computing (HPC) platforms are evolving to more heterogeneous configurations to support the workloads of various applications. The current hardware landscape is composed of traditional multicore CPUs equipped with hardware accelerators that can handle high levels of parallelism. Graphical Processing Units (GPUs) are popular high performance hardware accelerators in modern supercomputers. GPU programming has a different model than that for CPUs, which means that many numerical kernels have to be redesigned and optimized specifically for this architecture. GPUs usually outperform multicore CPUs in some compute intensive and massively parallel applications that have regular processing patterns. However, most scientific applications rely on crucial memory-bound kernels and may witness bottlenecks due to the overhead of the memory bus latency. They can still take advantage of the GPU compute power capabilities, provided that an efficient architecture-aware design is achieved. This dissertation presents a uniform design strategy for optimizing critical memory-bound kernels on GPUs. Based on hierarchical register blocking, double buffering and latency hiding techniques, this strategy leverages the performance of a wide range of standard numerical kernels found in dense and sparse linear algebra libraries. The work presented here focuses on matrix-vector multiplication kernels (MVM) as repre- sentative and most important memory-bound operations in this context. Each kernel inherits the benefits of the proposed strategies. By exposing a proper set of tuning parameters, the strategy is flexible enough to suit different types of matrices, ranging from large dense matrices, to sparse matrices with dense block structures, while high performance is maintained. Furthermore, the tuning parameters are used to maintain the relative performance across different GPU architectures. Multi-GPU acceleration is proposed to scale the performance on several devices. The

  16. Assessment of applicability of portable HPGe detector with in situ object counting system based on performance evaluation of thyroid radiobioassays

    Energy Technology Data Exchange (ETDEWEB)

    Park, Min Seok; Kwon, Tae Eun; Pak, Min Jung; Park, Se Young; Ha, Wi Ho; Jin, Young Woo [National Radiation Emergency Medical Center, Korea Institute of Radiological and Medical Sciences, Seoul (Korea, Republic of)

    2017-06-15

    Different cases exist in the measurement of thyroid radiobioassays owing to the individual characteristics of the subjects, especially the potential variation in the counting efficiency. An In situ Object Counting System (ISOCS) was developed to perform an efficiency calibration based on the Monte Carlo calculation, as an alternative to conventional calibration methods. The purpose of this study is to evaluate the applicability of ISOCS to thyroid radiobioassays by comparison with a conventional thyroid monitoring system. The efficiency calibration of a portable high-purity germanium (HPGe) detector was performed using ISOCS software. In contrast, the conventional efficiency calibration, which needed a radioactive material, was applied to a scintillator-based thyroid monitor. Four radioiodine samples that contained 125I and 131I in both aqueous solution and gel forms were measured to evaluate radioactivity in the thyroid. ANSI/HPS N13.30 performance criteria, which included the relative bias, relative precision, and root-mean-squared error, were applied to evaluate the performance of the measurement system. The portable HPGe detector could measure both radioiodines with ISOCS but the thyroid monitor could not measure 125I because of the limited energy resolution of the NaI(Tl) scintillator. The 131I results from both detectors agreed to within 5% with the certified results. Moreover, the 125I results from the portable HPGe detector agreed to within 10% with the certified results. All measurement results complied with the ANSI/HPS N13.30 performance criteria. The results of the intercomparison program indicated the feasibility of applying ISOCS software to direct thyroid radiobioassays. The portable HPGe detector with ISOCS software can provide the convenience of efficiency calibration and higher energy resolution for identifying photopeaks, compared with a conventional thyroid monitor with a NaI(Tl) scintillator. The application of ISOCS software in a radiation
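
    The ANSI/HPS N13.30 figures of merit mentioned above (relative bias, relative precision and root-mean-squared error) are computed from the relative differences between measured and certified activities; the sketch below shows the arithmetic on invented measurement results.

        import numpy as np

        # Invented measured vs. certified 131I activities (Bq) for a set of test samples
        measured  = np.array([1030.0, 985.0, 1012.0, 1047.0, 968.0])
        certified = np.array([1000.0, 1000.0, 1000.0, 1000.0, 1000.0])

        relative_error = (measured - certified) / certified   # per-sample relative bias
        relative_bias = relative_error.mean()
        relative_precision = relative_error.std(ddof=1)
        rmse = np.sqrt(relative_bias**2 + relative_precision**2)

        print(f"relative bias           = {relative_bias:+.3f}")
        print(f"relative precision      = {relative_precision:.3f}")
        print(f"root-mean-squared error = {rmse:.3f}")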

  17. Integrated Approach Towards the Application of Horizontal Wells to Improve Waterflooding Performance

    Energy Technology Data Exchange (ETDEWEB)

    Kelkar, Mohan; Liner, Chris; Kerr, Dennis

    1999-10-15

    This final report describes the progress during the six years of the project on ''Integrated Approach Towards the Application of Horizontal Wells to Improve Waterflooding Performance.'' This report is funded under the Department of Energy's (DOE's) Class I program, which is targeted at improving the reservoir performance of mature oil fields located in fluvially-dominated deltaic deposits. The project involves using an integrated approach to characterize the reservoir, followed by drilling of horizontal injection wells to improve production performance. The project was divided into two budget periods. In the first budget period, many modern technologies were used to develop a detailed reservoir management plan; in the second budget period, conventional data were used to develop a reservoir management plan. The idea was to determine the cost effectiveness of various technologies in improving the performance of mature oil fields.

  18. Quantitative performance targets by using balanced scorecard system: application to waste management and public administration.

    Science.gov (United States)

    Mendes, Paula; Nunes, Luis Miguel; Teixeira, Margarida Ribau

    2014-09-01

    This article demonstrates how decision-makers can be guided in the process of defining performance target values in the balanced scorecard system. We apply a method based on sensitivity analysis with Monte Carlo simulation to the municipal solid waste management system in Loulé Municipality (Portugal). The method includes two steps: sensitivity analysis of performance indicators to identify those performance indicators with the highest impact on the balanced scorecard model outcomes; and sensitivity analysis of the target values for the previously identified performance indicators. Sensitivity analysis shows that four strategic objectives (IPP1: Comply with the national waste strategy; IPP4: Reduce nonrenewable resources and greenhouse gases; IPP5: Optimize the life-cycle of waste; and FP1: Meet and optimize the budget) alone contribute 99.7% of the variability in overall balanced scorecard value. Thus, these strategic objectives had a much stronger impact on the estimated balanced scorecard outcome than did others, with the IPP1 and the IPP4 accounting for over 55% and 22% of the variance in overall balanced scorecard value, respectively. The remaining performance indicators contribute only marginally. In addition, a change in the value of a single indicator's target value made the overall balanced scorecard value change by as much as 18%. This may lead to involuntarily biased decisions by organizations regarding performance target-setting, if not prevented with the help of methods such as that proposed and applied in this study. © The Author(s) 2014.
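
    The Monte Carlo sensitivity analysis described above can be sketched as follows: each indicator score is sampled from an assumed range, the weighted balanced scorecard value is recomputed for every draw, and the share of output variance attributable to each indicator is estimated (the indicator names echo the abstract, but the weights and score ranges below are hypothetical).

        import numpy as np

        rng = np.random.default_rng(3)
        n_runs = 10_000

        # Hypothetical indicators: (weight, low score, high score) on a 0-100 scale
        indicators = {
            "IPP1_national_waste_strategy": (0.40, 40, 95),
            "IPP4_ghg_reduction":           (0.25, 30, 90),
            "IPP5_waste_life_cycle":        (0.20, 50, 85),
            "FP1_budget":                   (0.15, 60, 95),
        }

        samples = {name: rng.uniform(lo, hi, n_runs) for name, (_, lo, hi) in indicators.items()}
        bsc_value = sum(w * samples[name] for name, (w, _, _) in indicators.items())

        total_var = bsc_value.var()
        for name, (w, _, _) in indicators.items():
            # First-order variance contribution of an independent, linearly weighted input
            share = (w ** 2) * samples[name].var() / total_var
            print(f"{name}: {share:.1%} of output variance")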

  19. Energy technologies for distributed utility applications: Cost and performance trends, and implications for photovoltaics

    International Nuclear Information System (INIS)

    Eyer, J.M.

    1994-01-01

    Utilities are evaluating several electric generation and storage (G ampersand S) technologies for distributed utility (DU) applications. Attributes of leading DU technologies and implications for photovoltaics (PV) are described. Included is a survey of present and projected cost and performance for: (1) small, advanced combustion turbines (CTs); (2) advanced, natural gas-fired, diesel engines (diesel engines); and (3) advanced lead-acid battery systems (batteries). Technology drivers and relative qualitative benefits are described. A levelized energy cost-based cost target for PV for DU applications is provided. The analysis addresses only relative cost, for PV and for three selected alternative DU technologies. Comparable size, utility, and benefits are assumed, although relative value is application-specific and often technology- and site-specific

  20. The application of cloud computing to scientific workflows: a study of cost and performance.

    Science.gov (United States)

    Berriman, G Bruce; Deelman, Ewa; Juve, Gideon; Rynge, Mats; Vöckler, Jens-S

    2013-01-28

    The current model of transferring data from data centres to desktops for analysis will soon be rendered impractical by the accelerating growth in the volume of science datasets. Processing will instead often take place on high-performance servers co-located with data. Evaluations of how new technologies such as cloud computing would support such a new distributed computing model are urgently needed. Cloud computing is a new way of purchasing computing and storage resources on demand through virtualization technologies. We report here the results of investigations of the applicability of commercial cloud computing to scientific computing, with an emphasis on astronomy, including investigations of what types of applications can be run cheaply and efficiently on the cloud, and an example of an application well suited to the cloud: processing a large dataset to create a new science product.
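
    A first-order version of the cost/performance comparison such a study performs can be sketched as follows: for each execution environment, estimate wall-clock time from the workflow's compute demand and multiply by the hourly and data-egress prices (all prices, rates and workload figures below are hypothetical placeholders, not values from the paper).

        # Hypothetical workload for a data-intensive astronomy workflow (all figures are placeholders)
        CPU_HOURS = 120.0          # total compute demand
        DATA_OUT_GB = 40.0         # data transferred out of the provider

        environments = {
            #                 (cores, price per instance-hour, price per GB egress)
            "cloud_small":    (4,    0.20, 0.09),
            "cloud_large":    (32,   1.60, 0.09),
            "local_cluster":  (64,   0.55, 0.00),   # amortized ownership cost, no egress fee
        }

        for name, (cores, hourly_price, egress_price) in environments.items():
            wall_clock_hours = CPU_HOURS / cores          # assumes perfect parallel scaling
            cost = wall_clock_hours * hourly_price + DATA_OUT_GB * egress_price
            print(f"{name:13s}: {wall_clock_hours:6.2f} h wall clock, ${cost:7.2f} total")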