WorldWideScience

Sample records for metrical task systems

  1. Eye metrics for task-dependent automation

    NARCIS (Netherlands)

    Imants, P.; Greef, T.E. de

    2014-01-01

    Future air traffic is expected to keep growing, opening up a gap for task-dependent automation and adaptive interfaces that help the Air Traffic Controller cope with fluctuating workloads. One of the challenging factors in the application of such intelligent systems concerns the question what

  2. Eye Metrics for Task-Dependent Automation

    NARCIS (Netherlands)

    Imants, P.; de Greef, T.F.A.

    2014-01-01

    Future air traffic is expected to keep growing, opening up a gap for task-dependent automation and adaptive interfaces that help the Air Traffic Controller cope with fluctuating workloads. One of the challenging factors in the application of such intelligent systems concerns the question what

  3. The metric system: An introduction

    Energy Technology Data Exchange (ETDEWEB)

    Lumley, S.M.

    1995-05-01

    On July 13, 1992, Deputy Director Duane Sewell restated the Laboratory's policy on conversion to the metric system, which was established in 1974. Sewell's memo announced the Laboratory's intention to continue metric conversion on a reasonable and cost-effective basis. Copies of the 1974 and 1992 Administrative Memos are contained in the Appendix. There are three primary reasons behind the Laboratory's conversion to the metric system. First, Public Law 100-418, passed in 1988, states that by the end of fiscal year 1992 the Federal Government must begin using metric units in grants, procurements, and other business transactions. Second, on July 25, 1991, President George Bush signed Executive Order 12770, which urged Federal agencies to expedite conversion to metric units. Third, the contract between the University of California and the Department of Energy calls for the Laboratory to convert to the metric system. Thus, conversion to the metric system is a legal requirement and a contractual mandate with the University of California. Public Law 100-418 and Executive Order 12770 are discussed in more detail later in this section, but first the report examines the reasons behind the nation's conversion to the metric system. The second part of this report is on applying the metric system.

  4. The metric system: An introduction

    Science.gov (United States)

    Lumley, Susan M.

    On 13 Jul. 1992, Deputy Director Duane Sewell restated the Laboratory's policy on conversion to the metric system, which was established in 1974. Sewell's memo announced the Laboratory's intention to continue metric conversion on a reasonable and cost-effective basis. Copies of the 1974 and 1992 Administrative Memos are contained in the Appendix. There are three primary reasons behind the Laboratory's conversion to the metric system. First, Public Law 100-418, passed in 1988, states that by the end of fiscal year 1992 the Federal Government must begin using metric units in grants, procurements, and other business transactions. Second, on 25 Jul. 1991, President George Bush signed Executive Order 12770, which urged Federal agencies to expedite conversion to metric units. Third, the contract between the University of California and the Department of Energy calls for the Laboratory to convert to the metric system. Thus, conversion to the metric system is a legal requirement and a contractual mandate with the University of California. Public Law 100-418 and Executive Order 12770 are discussed in more detail later in this section, but first the report examines the reasons behind the nation's conversion to the metric system. The second part of this report is on applying the metric system.

  5. Software metrics: Software quality metrics for distributed systems. [reliability engineering

    Science.gov (United States)

    Post, J. V.

    1981-01-01

    Software quality metrics was extended to cover distributed computer systems. Emphasis is placed on studying embedded computer systems and on viewing them within a system life cycle. The hierarchy of quality factors, criteria, and metrics was maintained. New software quality factors were added, including survivability, expandability, and evolvability.

  6. Invariant metrics for Hamiltonian systems

    International Nuclear Information System (INIS)

    Rangarajan, G.; Dragt, A.J.; Neri, F.

    1991-05-01

    In this paper, invariant metrics are constructed for Hamiltonian systems. These metrics give rise to norms on the space of homogeneous polynomials of phase-space variables. For an accelerator lattice described by a Hamiltonian, these norms characterize the nonlinear content of the lattice. Therefore, the performance of the lattice can be improved by minimizing the norm as a function of parameters describing the beam-line elements in the lattice. A four-fold increase in the dynamic aperture of a model FODO cell is obtained using this procedure. 7 refs
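    The abstract does not reproduce the construction, but a generic coefficient norm on homogeneous polynomials of the phase-space variables would read as follows (a minimal illustration only; the paper's invariant metric differs in detail):

```latex
% A homogeneous polynomial of degree m in phase-space variables z,
% and a simple coefficient norm on that space (illustrative):
\[
  f(z) = \sum_{|\alpha| = m} c_{\alpha}\, z^{\alpha},
  \qquad
  \|f\|^{2} = \sum_{|\alpha| = m} |c_{\alpha}|^{2}.
\]
% Minimizing such a norm over lattice parameters suppresses the
% nonlinear content the polynomial represents.
```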

  7. Performance metrics for the evaluation of hyperspectral chemical identification systems

    Science.gov (United States)

    Truslow, Eric; Golowich, Steven; Manolakis, Dimitris; Ingle, Vinay

    2016-02-01

    Remote sensing of chemical vapor plumes is a difficult but important task for many military and civilian applications. Hyperspectral sensors operating in the long-wave infrared regime have well-demonstrated detection capabilities. However, the identification of a plume's chemical constituents, based on a chemical library, is a multiple hypothesis testing problem which standard detection metrics do not fully describe. We propose using an additional performance metric for identification based on the so-called Dice index. Our approach partitions and weights a confusion matrix to develop both the standard detection metrics and identification metric. Using the proposed metrics, we demonstrate that the intuitive system design of a detector bank followed by an identifier is indeed justified when incorporating performance information beyond the standard detection metrics.
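    In its generic set form, the Dice index mentioned above is twice the overlap between two sets divided by the sum of their sizes. The paper's confusion-matrix partitioning and weighting are not reproduced here; this is a minimal Python sketch, with function name and chemical labels chosen for illustration:

```python
def dice_index(predicted, actual):
    """Dice similarity between the identified chemicals and the true
    plume constituents: 2|A ∩ B| / (|A| + |B|)."""
    predicted, actual = set(predicted), set(actual)
    if not predicted and not actual:
        return 1.0  # both empty: perfect agreement by convention
    return 2 * len(predicted & actual) / (len(predicted) + len(actual))
```

A perfect identification gives 1.0; disjoint sets give 0.0, so the index penalizes both missed constituents and false identifications.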

  8. Metrics for border management systems.

    Energy Technology Data Exchange (ETDEWEB)

    Duggan, Ruth Ann

    2009-07-01

    There are as many unique and disparate manifestations of border systems as there are borders to protect. Border Security is a highly complex system analysis problem with global, regional, national, sector, and border element dimensions for land, water, and air domains. The complexity increases with the multiple, and sometimes conflicting, missions for regulating the flow of people and goods across borders, while securing them for national security. These systems include frontier border surveillance, immigration management and customs functions that must operate in a variety of weather, terrain, operational conditions, cultural constraints, and geopolitical contexts. As part of a Laboratory Directed Research and Development Project 08-684 (Year 1), the team developed a reference framework to decompose this complex system into international/regional, national, and border elements levels covering customs, immigration, and border policing functions. This generalized architecture is relevant to both domestic and international borders. As part of year two of this project (09-1204), the team determined relevant relative measures to better understand border management performance. This paper describes those relative metrics and how they can be used to improve border management systems.

  9. Metrical Phonology: German Sound System.

    Science.gov (United States)

    Tice, Bradley S.

    Metrical phonology, a linguistic process of phonological stress assessment and diagrammatic simplification of sentence and word stress, is discussed as it is found in the English and German languages. The objective is to promote use of metrical phonology as a tool for enhancing instruction in stress patterns in words and sentences, particularly in…

  10. Validation of Metrics for Collaborative Systems

    Directory of Open Access Journals (Sweden)

    Ion IVAN

    2008-01-01

    This paper describes the new concepts of collaborative systems metrics validation. It defines the quality characteristics of collaborative systems, proposes a metric to estimate their quality level, and reports measurements of collaborative systems quality made with specially designed software.

  11. Validation of Metrics for Collaborative Systems

    OpenAIRE

    Ion IVAN; Cristian CIUREA

    2008-01-01

    This paper describes the new concepts of collaborative systems metrics validation. It defines the quality characteristics of collaborative systems, proposes a metric to estimate their quality level, and reports measurements of collaborative systems quality made with specially designed software.

  12. Analysis of Skeletal Muscle Metrics as Predictors of Functional Task Performance

    Science.gov (United States)

    Ryder, Jeffrey W.; Buxton, Roxanne E.; Redd, Elizabeth; Scott-Pandorf, Melissa; Hackney, Kyle J.; Fiedler, James; Ploutz-Snyder, Robert J.; Bloomberg, Jacob J.; Ploutz-Snyder, Lori L.

    2010-01-01

    PURPOSE: The ability to predict task performance using physiological performance metrics is vital to ensure that astronauts can execute their jobs safely and effectively. This investigation used a weighted suit to evaluate task performance at various ratios of strength, power, and endurance to body weight. METHODS: Twenty subjects completed muscle performance tests and functional tasks representative of those that would be required of astronauts during planetary exploration (see table for specific tests/tasks). Subjects performed functional tasks while wearing a weighted suit with additional loads ranging from 0-120% of initial body weight. Performance metrics were time to completion for all tasks except hatch opening, which consisted of total work. Task performance metrics were plotted against muscle metrics normalized to "body weight" (subject weight + external load; BW) for each trial. Fractional polynomial regression was used to model the relationship between muscle and task performance. CONCLUSION: LPMIF/BW is the best predictor of performance for predominantly lower-body tasks that are ambulatory and of short duration. LPMIF/BW is a very practical predictor of occupational task performance as it is quick and relatively safe to perform. Accordingly, bench press work best predicts hatch-opening work performance.
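    The normalization used in the study divides a muscle performance metric by "body weight", defined as subject weight plus the external load from the weighted suit. A one-line sketch (function and argument names are illustrative):

```python
def normalized_metric(muscle_metric, subject_weight, external_load):
    """Normalize a muscle performance metric (e.g. LPMIF, in N) to
    'body weight' as defined in the study:
    BW = subject weight + external load from the weighted suit."""
    bw = subject_weight + external_load
    return muscle_metric / bw
```

For example, an isometric force of 1500 N for a 75 kg subject carrying a 25 kg suit load normalizes to 15 N per kg of loaded body weight.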

  13. A Metrics Approach for Collaborative Systems

    Directory of Open Access Journals (Sweden)

    Cristian CIUREA

    2009-01-01

    This article presents different types of collaborative systems, their structure, and their classification. It defines the concept of a virtual campus as a collaborative system and builds an architecture for a virtual campus oriented on collaborative training processes. It analyses the quality characteristics of collaborative systems and proposes techniques for metrics construction and validation in order to evaluate them. The article also analyzes different ways to increase the efficiency and the performance level in collaborative banking systems.

  14. Alternative kinetic energy metrics for Lagrangian systems

    Science.gov (United States)

    Sarlet, W.; Prince, G.

    2010-11-01

    We examine Lagrangian systems on R^n with standard kinetic energy terms for the possibility of additional, alternative Lagrangians with kinetic energy metrics different to the Euclidean one. Using the techniques of the inverse problem in the calculus of variations we find necessary and sufficient conditions for the existence of such Lagrangians. We illustrate the problem in two and three dimensions with quadratic and cubic potentials. As an aside we show that the well-known anomalous Lagrangians for the Coulomb problem can be removed by switching on a magnetic field, providing an appealing resolution of the ambiguous quantizations of the hydrogen atom.
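    The setting can be sketched as follows (a schematic of the problem statement, not the paper's conditions): the question is when the Euclidean kinetic metric can be replaced by another metric while preserving the equations of motion.

```latex
% Standard kinetic energy (Euclidean metric \delta_{ij}) versus an
% alternative kinetic energy metric g_{ij}(x) for the same dynamics:
\[
  L = \tfrac{1}{2}\,\delta_{ij}\,\dot{x}^{i}\dot{x}^{j} - V(x),
  \qquad
  \tilde{L} = \tfrac{1}{2}\,g_{ij}(x)\,\dot{x}^{i}\dot{x}^{j} - W(x).
\]
% \tilde{L} is an admissible alternative when its Euler--Lagrange
% equations reproduce the original equations of motion:
\[
  \frac{d}{dt}\frac{\partial \tilde{L}}{\partial \dot{x}^{i}}
  - \frac{\partial \tilde{L}}{\partial x^{i}} = 0 .
\]
```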

  15. Ideal Based Cyber Security Technical Metrics for Control Systems

    Energy Technology Data Exchange (ETDEWEB)

    W. F. Boyer; M. A. McQueen

    2007-10-01

    Much of the world's critical infrastructure is at risk from attack through electronic networks connected to control systems. Security metrics are important because they provide the basis for management decisions that affect the protection of the infrastructure. A cyber security technical metric is the security relevant output from an explicit mathematical model that makes use of objective measurements of a technical object. A specific set of technical security metrics is proposed for use by the operators of control systems. Our proposed metrics are based on seven security ideals associated with seven corresponding abstract dimensions of security. We have defined at least one metric for each of the seven ideals. Each metric is a measure of how nearly the associated ideal has been achieved. These seven ideals provide a useful structure for further metrics development. A case study shows how the proposed metrics can be applied to an operational control system.

  16. Chaos of discrete dynamical systems in complete metric spaces

    International Nuclear Information System (INIS)

    Shi Yuming; Chen Guanrong

    2004-01-01

    This paper is concerned with chaos of discrete dynamical systems in complete metric spaces. Discrete dynamical systems governed by continuous maps in general complete metric spaces are first discussed, and two criteria of chaos are then established. As a special case, two corresponding criteria of chaos for discrete dynamical systems in compact subsets of metric spaces are obtained. These results have extended and improved the existing relevant results of chaos in finite-dimensional Euclidean spaces
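    The paper's specific chaos criteria are not reproduced in the abstract; one standard ingredient of such criteria in a metric space is sensitive dependence on initial conditions, which can be stated as follows (a textbook definition, included only to make the setting concrete):

```latex
% Sensitive dependence on initial conditions for a continuous map f
% on a metric space (X, d):
\[
  \exists\,\delta > 0 \;\; \forall x \in X \;\; \forall \varepsilon > 0
  \;\; \exists\, y \in X,\; n \in \mathbb{N} :\quad
  d(x, y) < \varepsilon
  \;\;\text{and}\;\;
  d\!\big(f^{\,n}(x),\, f^{\,n}(y)\big) \ge \delta .
\]
% However close two initial states are, some iterate separates them
% by at least \delta.
```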

  17. 20 CFR 435.15 - Metric system of measurement.

    Science.gov (United States)

    2010-04-01

    ... 20 Employees' Benefits 2 2010-04-01 2010-04-01 false Metric system of measurement. 435.15 Section 435.15 Employees' Benefits SOCIAL SECURITY ADMINISTRATION UNIFORM ADMINISTRATIVE REQUIREMENTS FOR... metric system is the preferred measurement system for U.S. trade and commerce. The Act requires each...

  18. Measurable Control System Security through Ideal Driven Technical Metrics

    Energy Technology Data Exchange (ETDEWEB)

    Miles McQueen; Wayne Boyer; Sean McBride; Marie Farrar; Zachary Tudor

    2008-01-01

    The Department of Homeland Security National Cyber Security Division supported development of a small set of security ideals as a framework to establish measurable control systems security. Based on these ideals, a draft set of proposed technical metrics was developed to allow control systems owner-operators to track improvements or degradations in their individual control systems security posture. The technical metrics development effort included review and evaluation of over thirty metrics-related documents. On the basis of complexity, ambiguity, or misleading and distorting effects, the metrics identified during the reviews were determined to be weaker than necessary to aid defense against the myriad threats posed by cyber-terrorism to human safety, as well as to economic prosperity. Using the results of our metrics review and the set of security ideals as a starting point for metrics development, we identified thirteen potential technical metrics - with at least one metric supporting each ideal. Two case study applications of the ideals and thirteen metrics to control systems were then performed to establish potential difficulties in applying both the ideals and the metrics. The case studies resulted in no changes to the ideals, and only a few deletions and refinements to the thirteen potential metrics. This led to a final proposed set of ten core technical metrics. To further validate the security ideals, the modifications made to the original thirteen potential metrics, and the final proposed set of ten core metrics, seven separate control systems security assessments performed over the past three years were reviewed for findings and recommended mitigations. These findings and mitigations were then mapped to the security ideals and metrics to assess gaps in their coverage. The mappings indicated that there are no gaps in the security ideals and that the ten core technical metrics provide significant coverage of standard security issues with 87% coverage.

  19. 22 CFR 226.15 - Metric system of measurement.

    Science.gov (United States)

    2010-04-01

    ... 22 Foreign Relations 1 2010-04-01 2010-04-01 false Metric system of measurement. 226.15 Section 226.15 Foreign Relations AGENCY FOR INTERNATIONAL DEVELOPMENT ADMINISTRATION OF ASSISTANCE AWARDS TO U.S. NON-GOVERNMENTAL ORGANIZATIONS Pre-award Requirements § 226.15 Metric system of measurement. (a...

  20. Economic Metrics for Commercial Reusable Space Transportation Systems

    Science.gov (United States)

    Shaw, Eric J.; Hamaker, Joseph (Technical Monitor)

    2000-01-01

    The success of any effort depends upon the effective initial definition of its purpose, in terms of the needs to be satisfied and the goals to be fulfilled. If the desired product is "A System" that is well-characterized, these high-level need and goal statements can be transformed into system requirements by traditional systems engineering techniques. The satisfaction of well-designed requirements can be tracked by fairly straightforward cost, schedule, and technical performance metrics. Unfortunately, some types of efforts, including those that NASA terms "Programs," tend to resist application of traditional systems engineering practices. In the NASA hierarchy of efforts, a "Program" is often an ongoing effort with broad, high-level goals and objectives. A NASA "project" is a finite effort, in terms of budget and schedule, that usually produces or involves one System. Programs usually contain more than one project and thus more than one System. Special care must be taken in the formulation of NASA Programs and their projects, to ensure that lower-level project requirements are traceable to top-level Program goals, feasible with the given cost and schedule constraints, and measurable against top-level goals. NASA Programs and projects are tasked to identify the advancement of technology as an explicit goal, which introduces more complicating factors. The justification for funding of technology development may be based on the technology's applicability to more than one System, Systems outside that Program or even external to NASA. Application of systems engineering to broad-based technology development, leading to effective measurement of the benefits, can be valid, but it requires that potential beneficiary Systems be organized into a hierarchical structure, creating a "system of Systems." In addition, these Systems evolve with the successful application of the technology, which creates the necessity for evolution of the benefit metrics to reflect the changing

  1. Wireless Sensor Network Metrics for Real-Time Systems

    Science.gov (United States)

    2009-05-20

    Phoebus Wei-Chih Chen, Electrical Engineering and Computer Sciences, University of California at... Research on wireless sensor networks (WSNs) is moving from studies of WSNs in isolation toward studies where the WSN is treated as a component of a larger system

  2. Resilient Control Systems Practical Metrics Basis for Defining Mission Impact

    Energy Technology Data Exchange (ETDEWEB)

    Craig G. Rieger

    2014-08-01

    "Resilience” describes how systems operate at an acceptable level of normalcy despite disturbances or threats. In this paper we first consider the cognitive, cyber-physical interdependencies inherent in critical infrastructure systems and how resilience differs from reliability to mitigate these risks. Terminology and metrics basis are provided to integrate the cognitive, cyber-physical aspects that should be considered when defining solutions for resilience. A practical approach is taken to roll this metrics basis up to system integrity and business case metrics that establish “proper operation” and “impact.” A notional chemical processing plant is the use case for demonstrating how the system integrity metrics can be applied to establish performance, and

  3. User Metrics in NASA Earth Science Data Systems

    Science.gov (United States)

    Lynnes, Chris

    2018-01-01

    This presentation describes the collection and use of user metrics in NASA's Earth Science data systems. A variety of collection methods is discussed, with particular emphasis given to the American Customer Satisfaction Index (ACSI). User sentiment on potential use of cloud computing is presented, with generally positive responses. The presentation also discusses various forms of automatically collected metrics, including an example of the relative usage of different functions within the Giovanni analysis system.

  4. Primer Control System Cyber Security Framework and Technical Metrics

    Energy Technology Data Exchange (ETDEWEB)

    Wayne F. Boyer; Miles A. McQueen

    2008-05-01

    The Department of Homeland Security National Cyber Security Division supported development of a control system cyber security framework and a set of technical metrics to aid owner-operators in tracking control systems security. The framework defines seven relevant cyber security dimensions and provides the foundation for thinking about control system security. Based on the developed security framework, a set of ten technical metrics are recommended that allow control systems owner-operators to track improvements or degradations in their individual control systems security posture.

  5. Mental workload and cognitive task automaticity: an evaluation of subjective and time estimation metrics.

    Science.gov (United States)

    Liu, Y; Wickens, C D

    1994-11-01

    The evaluation of mental workload is becoming increasingly important in system design and analysis. The present study examined the structure and assessment of mental workload in performing decision and monitoring tasks by focusing on two mental workload measurements: subjective assessment and time estimation. The task required the assignment of a series of incoming customers to the shortest of three parallel service lines displayed on a computer monitor. The subject was either in charge of the customer assignment (manual mode) or was monitoring an automated system performing the same task (automatic mode). In both cases, the subjects were required to detect the non-optimal assignments that they or the computer had made. Time pressure was manipulated by the experimenter to create fast and slow conditions. The results revealed a multi-dimensional structure of mental workload and a multi-step process of subjective workload assessment. The results also indicated that subjective workload was more influenced by the subject's participatory mode than by the factor of task speed. The time estimation intervals produced while performing the decision and monitoring tasks had significantly greater length and larger variability than those produced while either performing no other tasks or performing a well practised customer assignment task. This result seemed to indicate that time estimation was sensitive to the presence of perceptual/cognitive demands, but not to response related activities to which behavioural automaticity has developed.

  6. Metric learning

    CERN Document Server

    Bellet, Aurelien; Sebban, Marc

    2015-01-01

    Similarity between objects plays an important role in both human cognitive processes and artificial systems for recognition and categorization. How to appropriately measure such similarities for a given task is crucial to the performance of many machine learning, pattern recognition and data mining methods. This book is devoted to metric learning, a set of techniques to automatically learn similarity and distance functions from data that has attracted a lot of interest in machine learning and related fields in the past ten years. In this book, we provide a thorough review of the metric learnin
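    A common object of metric learning is a Mahalanobis-type distance parameterized by a positive semi-definite matrix M. The sketch below only evaluates such a distance; learning M from data, the subject of the book, is not shown:

```python
import numpy as np

def mahalanobis(x, y, M):
    """Distance under a positive semi-definite matrix M:
    d_M(x, y) = sqrt((x - y)^T M (x - y))."""
    d = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
    return float(np.sqrt(d @ M @ d))

# M = I recovers the Euclidean distance; a diagonal M re-weights
# features; a full M additionally rotates the feature space.
```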

  7. 41 CFR 105-72.205 - Metric system of measurement.

    Science.gov (United States)

    2010-07-01

    ... Management Regulations System (Continued) GENERAL SERVICES ADMINISTRATION Regional Offices-General Services Administration 72-UNIFORM ADMINISTRATIVE REQUIREMENTS FOR GRANTS AND AGREEMENTS WITH INSTITUTIONS OF HIGHER... system of measurement. The Metric Conversion Act, as amended by the Omnibus Trade and Competitiveness Act...

  8. [Clinical trial data management and quality metrics system].

    Science.gov (United States)

    Chen, Zhao-hua; Huang, Qin; Deng, Ya-zhong; Zhang, Yue; Xu, Yu; Yu, Hao; Liu, Zong-fan

    2015-11-01

    A data quality management system is essential to ensure accurate, complete, consistent, and reliable data collection in clinical research. This paper is devoted to various choices of data quality metrics. They are categorized by study status, e.g. study start-up, conduct, and close-out. In each category, metrics for different purposes are listed according to ALCOA+ principles such as completeness, accuracy, timeliness, traceability, etc. Some frequently used general quality metrics are also introduced. This paper provides as much detail as possible for each metric: definition, purpose, evaluation, referenced benchmark, and recommended targets in favor of real practice. It is important that sponsors and data management service providers establish a robust integrated clinical trial data quality management system to ensure sustainable high quality of clinical trial deliverables. It will also support enterprise-level data evaluation and benchmarking of data quality across projects, sponsors, and data management service providers by using objective metrics from real clinical trials. We hope this will be a significant input to accelerate the improvement of clinical trial data quality in the industry.
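    A completeness metric of the kind listed under the ALCOA+ principles can be as simple as the fraction of required fields that are filled. This sketch is illustrative only; the field names and the exact definition are not taken from the paper:

```python
def completeness(records, required_fields):
    """Fraction of required fields that are non-empty across all
    records: a simple instance of a completeness metric."""
    expected = len(records) * len(required_fields)
    filled = sum(
        1
        for record in records
        for field in required_fields
        if record.get(field) not in (None, "")
    )
    return filled / expected if expected else 1.0
```

A benchmark or target (e.g. "at least 95% complete before database lock") would then be compared against this value.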

  9. Joining of Ukraine to the European scientific and metric systems

    Directory of Open Access Journals (Sweden)

    O.M. Sazonets

    2015-09-01

    At the present stage of development it is necessary to form knowledge structures that treat knowledge as an object of management. In conditions of technological globalism there are structural changes in the information environment of countries. Scientific metrics is sufficiently developed in other countries, especially in the EU. The article contains a description of the first citation-index system, the Science Citation Index (SCI). The main advantage of this project was searching for information not only by author and thematic categories, but also by the list of cited literature. The authors define the scientific and metric base in the following way: a scientific and metric database (SMBD) is a bibliographic and abstract database with tools for tracking citations of articles published in scientific journals. The most prominent European scientific and metric bases are examined. The authors show that the bases have performance assessment tools which track the impact of scientific papers and publications of individual scientists and research institutions. The state of crisis in scientific and technological activities in Ukraine, as well as in the economy as a whole, needs immediate organization of a national scientific and metric system.

  10. Evaluation of Subjective and Objective Performance Metrics for Haptically Controlled Robotic Systems

    Directory of Open Access Journals (Sweden)

    Cong Dung Pham

    2014-07-01

    This paper studies in detail how different evaluation methods perform when it comes to describing the performance of haptically controlled mobile manipulators. Particularly, we investigate how well subjective metrics perform compared to objective metrics. To find the best metrics to describe the performance of a control scheme is challenging when human operators are involved; how the user perceives the performance of the controller does not necessarily correspond to the directly measurable metrics normally used in controller evaluation. It is therefore important to study whether there is any correspondence between how the user perceives the performance of a controller, and how it performs in terms of directly measurable metrics such as the time used to perform a task, number of errors, accuracy, and so on. To perform these tests we choose a system that consists of a mobile manipulator that is controlled by an operator through a haptic device. This is a good system for studying different performance metrics as the performance can be determined by subjective metrics based on feedback from the users, and also as objective and directly measurable metrics. The system consists of a robotic arm which provides for interaction and manipulation, which is mounted on a mobile base which extends the workspace of the arm. The operator thus needs to perform both interaction and locomotion using a single haptic device. While the position of the on-board camera is determined by the base motion, the principal control objective is the motion of the manipulator arm. This calls for intelligent control allocation between the base and the manipulator arm in order to obtain intuitive control of both the camera and the arm. We implement three different approaches to the control allocation problem, i.e., whether the vehicle or manipulator arm actuation is applied to generate the desired motion. The performance of the different control schemes is evaluated, and our

  11. An entropy generation metric for non-energy systems assessments

    International Nuclear Information System (INIS)

    Sekulic, Dusan P.

    2009-01-01

    Processes in non-energy systems have not been as frequent a subject of sustainability studies based on Thermodynamics as have processes in energy systems. This paper offers insight into thermodynamic thinking devoted to selection of a sustainability energy-related metric based on entropy balancing of a non-energy system. An underlying objective in this sustainability oriented study is product quality involving thermal processing during manufacturing vs. resource utilization (say, energy). The product quality for the considered family of materials processing for manufacturing is postulated as inherently controlled by the imposed temperature non-uniformity margins. These temperature non-uniformities can be converted into a thermodynamic metric which can be related to either destruction of exergy of the available resource or, on a more fundamental level of process quality, to entropy generation inherent to the considered manufacturing system. Hence, a manufacturing system can be considered as if it were an energy system, although in the later case the system objective would be quite different. In a non-energy process, a metric may indicate the level of perfection of the process (not necessarily energy efficiency) and may be related to the sustainability footprint or, as advocated in this paper, it may be related to product quality. Controlled atmosphere brazing (CAB) of aluminum, a state-of-the-art manufacturing process involving mass production of compact heat exchangers for automotive, aerospace and process industries, has been used as an example.
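    The entropy-generation idea behind the metric can be illustrated with the textbook case of heat flow across a temperature difference; the paper's CAB-specific metric is more elaborate, so this is a minimal illustration only:

```latex
% Entropy generated when heat Q flows from a hot region at T_h to a
% cold region at T_c (second law of thermodynamics):
\[
  S_{\mathrm{gen}} = \frac{Q}{T_c} - \frac{Q}{T_h} \;\ge\; 0,
  \qquad T_h \ge T_c .
\]
% Temperature non-uniformity (T_h - T_c) drives S_gen; a perfectly
% uniform process (T_h = T_c) generates no entropy, which is why
% temperature non-uniformity margins can serve as a quality metric.
```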

  12. Systems Engineering Metrics: Organizational Complexity and Product Quality Modeling

    Science.gov (United States)

    Mog, Robert A.

    1997-01-01

    Innovative organizational complexity and product quality models applicable to performance metrics for NASA-MSFC's Systems Analysis and Integration Laboratory (SAIL) missions and objectives are presented. An intensive research effort focuses on the synergistic combination of stochastic process modeling, nodal and spatial decomposition techniques, organizational and computational complexity, systems science and metrics, chaos, and proprietary statistical tools for accelerated risk assessment. This is followed by the development of a preliminary model, which is uniquely applicable and robust for quantitative purposes. Exercise of the preliminary model using a generic system hierarchy and the AXAF-I architectural hierarchy is provided. The Kendall test for positive dependence provides an initial verification and validation of the model. Finally, the research and development of the innovation is revisited, prior to peer review. This research and development effort results in near-term, measurable SAIL organizational and product quality methodologies, enhanced organizational risk assessment and evolutionary modeling results, and improved statistical quantification of SAIL productivity interests.

  13. Using Genetic Algorithms for Building Metrics of Collaborative Systems

    Directory of Open Access Journals (Sweden)

    Cristian CIUREA

    2011-01-01

    Full Text Available The paper's objective is to reveal the importance of genetic algorithms in building robust metrics of collaborative systems. The main types of collaborative systems in the economy are presented and some characteristics of genetic algorithms are described. A genetic algorithm was implemented in order to determine the local maximum and minimum points of the relative complexity function associated with a collaborative banking system. The intelligent collaborative systems based on genetic algorithms, representing the new generation of collaborative systems, are analyzed, and the implementation of auto-adaptive interfaces in a banking application is described.
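The extremum search described above can be sketched with a minimal genetic algorithm. The one-dimensional function below is a hypothetical stand-in for the banking system's relative complexity function, not the one from the paper:

```python
import random

def complexity(x):
    # Hypothetical stand-in for a collaborative system's relative complexity.
    return (x - 2.0) ** 2 + 1.0

def genetic_minimise(fitness, lo, hi, pop_size=30, generations=60, seed=7):
    rng = random.Random(seed)
    pop = [rng.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)                   # ascending: best first
        parents = pop[: pop_size // 2]          # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            child = (a + b) / 2.0               # arithmetic crossover
            child += rng.gauss(0.0, 0.1)        # Gaussian mutation
            children.append(min(hi, max(lo, child)))
        pop = parents + children
    return min(pop, key=fitness)

best = genetic_minimise(complexity, -10.0, 10.0)
print(round(best, 1))  # converges close to the true minimum at x = 2.0
```

Maximum points can be located the same way by negating the fitness function.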

  14. Metrics required for Power System Resilient Operations and Protection

    Energy Technology Data Exchange (ETDEWEB)

    Eshghi, K.; Johnson, B. K.; Rieger, C. G.

    2016-08-01

    Today’s complex grid involves many interdependent systems. Various layers of hierarchical control and communication systems are coordinated, both spatially and temporally, to achieve grid reliability. As new communication-network-based control system technologies are deployed, the interconnected nature of these systems is becoming more complex. Deployment of smart grid concepts promises effective integration of renewable resources, especially if combined with energy storage. However, without a philosophical focus on resilience, a smart grid will potentially lead to higher magnitude and/or duration of disruptive events. The effectiveness of a resilient infrastructure depends upon its ability to anticipate, absorb, adapt to, and/or rapidly recover from a potentially catastrophic event. Future system operations can be enhanced with a resilient philosophy through architecting the complexity with state awareness metrics that recognize changing system conditions and provide for an agile and adaptive response. The starting point for metrics lies in first understanding the attributes of performance that will be qualified. In this paper, we will overview those attributes and describe how they will be characterized by designing a distributed agent that can be applied to the power grid.

  15. Metric-based approach and tool for modeling the I and C system using Markov chains

    International Nuclear Information System (INIS)

    Butenko, Valentyna; Kharchenko, Vyacheslav; Odarushchenko, Elena; Butenko, Dmitriy

    2015-01-01

    Markov chains (MC) are well-known and widely applied in dependability and performability analysis of safety-critical systems because of their flexible representation of system component dependencies and synchronization. There are a few roadblocks to greater application of MCs: accounting for additional system components increases the model state space and complicates analysis, and a non-numerically sophisticated user may find it difficult to decide among the variety of numerical methods to determine the most suitable and accurate for their application. Obtaining highly accurate and trusted modeling results thus becomes a nontrivial task. In this paper, we present a metric-based approach for selecting the applicable solution approach, based on analysis of the MC's stiffness, decomposability, sparsity and fragmentedness. Using this selection procedure the modeler can verify earlier obtained results. The presented approach was implemented in the utility MSMC, which supports MC construction, metric-based analysis, recommendation shaping and model solution. The model can be exported to well-known off-the-shelf mathematical packages for verification. The paper presents a case study of an industrial NPP I and C system, manufactured by RPC Radiy, and shows the application of the metric-based approach and the MSMC tool for dependability and safety analysis of an RTS, along with the procedure for verifying results. (author)
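As a small illustration of the kind of Markov model such tools solve, consider a hypothetical two-state repairable component (this example is not taken from the paper). When the failure and repair rates differ by several orders of magnitude the chain becomes stiff, which is exactly the property the stiffness metric is meant to flag when selecting a solution method:

```python
# Two-state repairable component: UP --lam--> DOWN --mu--> UP.
lam, mu = 1e-3, 1e-1          # failure and repair rates (illustrative values)

# Closed-form steady-state availability for this small chain:
availability = mu / (lam + mu)

# Numerical route: uniformise the CTMC into a DTMC and iterate to stationarity.
q = lam + mu                  # uniformisation constant >= max exit rate
P = [[1 - lam / q, lam / q],
     [mu / q, 1 - mu / q]]
pi = [1.0, 0.0]               # start in the UP state
for _ in range(10000):
    pi = [pi[0] * P[0][0] + pi[1] * P[1][0],
          pi[0] * P[0][1] + pi[1] * P[1][1]]

print(abs(pi[0] - availability) < 1e-9)  # prints True
```

For realistic multi-component models the state space grows combinatorially, which is the scalability roadblock the abstract mentions.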

  16. 43 CFR 12.915 - Metric system of measurement.

    Science.gov (United States)

    2010-10-01

    ... procurements, grants, and other business-related activities. Metric implementation may take longer where the... recipient, such as when foreign competitors are producing competing products in non-metric units. (End of...

  17. Chart of conversion factors: From English to metric system and metric to English system

    Science.gov (United States)

    ,

    1976-01-01

    The conversion factors in the following tables are for conversion of our customary (English) units of measurement to SI units, and for convenience, reciprocals are shown for converting SI units back to the English system. The first table contains rule-of-thumb figures, useful for "getting the feel" of SI units or mental estimation. The succeeding tables contain factors accurate to 3 or more significant figures. Please refer to known reference volumes for additional accuracy, as well as for factors dealing with other scientific notation involving SI units.

  18. An Introduction to the SI Metric System. Inservice Guide for Teaching Measurement, Kindergarten Through Grade Eight.

    Science.gov (United States)

    California State Dept. of Education, Sacramento.

    This handbook was designed to serve as a reference for teacher workshops that: (1) introduce the metric system and help teachers gain confidence with metric measurement, and (2) develop classroom measurement activities. One chapter presents the history and basic features of SI metrics. A second chapter presents a model for the measurement program.…

  19. Safety Metrics for Human-Computer Controlled Systems

    Science.gov (United States)

    Leveson, Nancy G; Hatanaka, Iwao

    2000-01-01

    The rapid growth of computer technology and innovation has played a significant role in the rise of computer automation of human tasks in modern production systems across all industries. Although the rationale for automation has been to eliminate "human error" or to relieve humans from manual repetitive tasks, various computer-related hazards and accidents have emerged as a direct result of increased system complexity attributed to computer automation. The risk assessment techniques utilized for electromechanical systems are not suitable for today's software-intensive systems or complex human-computer controlled systems. This thesis will propose a new systemic model-based framework for analyzing risk in safety-critical systems where both computers and humans are controlling safety-critical functions. A new systems accident model will be developed based upon modern systems theory and human cognitive processes to better characterize system accidents, the role of human operators, and the influence of software in its direct control of significant system functions. Better risk assessments will then be achievable through the application of this new framework to complex human-computer controlled systems.

  20. 10 CFR 600.306 - Metric system of measurement.

    Science.gov (United States)

    2010-01-01

    ... cause significant inefficiencies or loss of markets to United States firms. (b) Recipients are... Requirements for Grants and Cooperative Agreements With For-Profit Organizations General § 600.306 Metric... Competitiveness Act of 1988 (15 U.S.C. 205) and implemented by Executive Order 12770, states that: (1) The metric...

  1. 48 CFR 611.002-70 - Metric system implementation.

    Science.gov (United States)

    2010-10-01

    ... with security, operations, economic, technical, logistical, training and safety requirements. (3) The... total cost of the retrofit, including redesign costs, exceeds $50,000; (ii) Metric is not the accepted... office with an explanation for the disapproval. (7) The in-house operating metric costs shall be...

  2. Flight Tasks and Metrics to Evaluate Laser Eye Protection in Flight Simulators

    Science.gov (United States)

    2017-07-07

    Acronyms: IFR (Instrument Flight Rules), LED (Light Emitting Diode), LEP (Laser Eye Protection), MAPP (Model Assessing Pilot Performance), OD (Optical Density). The objective was to develop flight tasks and metrics to evaluate LEP and then use them to assess the impact of wearing LEP in a flight simulator environment. LEP has the potential to alter distinct characteristics of the visual environment, giving rise to concerns over the impact on flight tasks.

  3. Launch vehicle tracking enhancement through Global Positioning System Metric Tracking

    Science.gov (United States)

    Moore, T. C.; Li, Hanchu; Gray, T.; Doran, A.

    United Launch Alliance (ULA) initiated operational flights of both the Atlas V and Delta IV launch vehicle families in 2002. The Atlas V and Delta IV launch vehicles were developed jointly with the US Air Force (USAF) as part of the Evolved Expendable Launch Vehicle (EELV) program. Both Launch Vehicle (LV) families have provided 100% mission success since their respective inaugural launches and demonstrated launch capability from both Vandenberg Air Force Base (VAFB) on the Western Test Range and Cape Canaveral Air Force Station (CCAFS) on the Eastern Test Range. However, the current EELV fleet communications, tracking, & control architecture & technology, which date back to the origins of the space launch business, require support by a large and high cost ground footprint. The USAF has embarked on an initiative known as Future Flight Safety System (FFSS) that will significantly reduce Test Range Operations and Maintenance (O& M) cost by closing facilities and decommissioning ground assets. In support of the FFSS, a Global Positioning System Metric Tracking (GPS MT) System based on the Global Positioning System (GPS) satellite constellation has been developed for EELV which will allow both Ranges to divest some of their radar assets. The Air Force, ULA and Space Vector have flown the first 2 Atlas Certification vehicles demonstrating the successful operation of the GPS MT System. The first Atlas V certification flight was completed in February 2012 from CCAFS, the second Atlas V certification flight from VAFB was completed in September 2012 and the third certification flight on a Delta IV was completed October 2012 from CCAFS. The GPS MT System will provide precise LV position, velocity and timing information that can replace ground radar tracking resource functionality. 
The GPS MT system will provide an independent position/velocity S-Band telemetry downlink to support the current man-in-the-loop ground-based commanded destruct of an anomalous flight.

  4. Metrical theorems on systems of small inhomogeneous linear forms

    DEFF Research Database (Denmark)

    Hussain, Mumtaz; Kristensen, Simon

    In this paper we establish complete Khintchine-Groshev and Schmidt type theorems for inhomogeneous small linear forms in the so-called doubly metric case, in which the inhomogeneous parameter is not fixed.

  5. Metrical results on systems of small linear forms

    DEFF Research Database (Denmark)

    Hussain, M.; Kristensen, Simon

    In this paper the metric theory of Diophantine approximation associated with the small linear forms is investigated. Khintchine-Groshev theorems are established along with Hausdorff measure generalization without the monotonic assumption on the approximating function.

  6. Task oriented evaluation system for maintenance robots

    International Nuclear Information System (INIS)

    Asame, Hajime; Endo, Isao; Kotosaka, Shin-ya; Takata, Shozo; Hiraoka, Hiroyuki; Kohda, Takehisa; Matsumoto, Akihiro; Yamagishi, Kiichiro.

    1994-01-01

    The adaptability evaluation of maintenance robots to autonomous plants is discussed. In this paper, a new concept of an autonomous plant with maintenance robots is introduced, and a framework for an autonomous maintenance system is proposed. Task-oriented evaluation of robot arms is then discussed as a means of evaluating their adaptability to maintenance tasks, and a new criterion called operability is proposed for adaptability evaluation. The task-oriented evaluation system is implemented and applied to the structural design of robot arms. Using a genetic algorithm, an optimal structure adaptable to a pump disassembly task is obtained. (author)

  7. Grading the Metrics: Performance-Based Funding in the Florida State University System

    Science.gov (United States)

    Cornelius, Luke M.; Cavanaugh, Terence W.

    2016-01-01

    A policy analysis of Florida's 10-factor Performance-Based Funding system for state universities. The focus of the article is on the system of performance metrics developed by the state Board of Governors and their impact on institutions and their missions. The paper also discusses problems and issues with the metrics, their ongoing evolution, and…

  8. An Evaluation of the IntelliMetric[SM] Essay Scoring System

    Science.gov (United States)

    Rudner, Lawrence M.; Garcia, Veronica; Welch, Catherine

    2006-01-01

    This report provides a two-part evaluation of the IntelliMetric[SM] automated essay scoring system based on its performance scoring essays from the Analytic Writing Assessment of the Graduate Management Admission Test[TM] (GMAT[TM]). The IntelliMetric system performance is first compared to that of individual human raters, a Bayesian system…

  9. Survey of source code metrics for evaluating testability of object oriented systems

    OpenAIRE

    Shaheen , Muhammad Rabee; Du Bousquet , Lydie

    2010-01-01

    Software testing is costly in terms of time and funds. Testability is a software characteristic that aims at producing systems easy to test. Several metrics have been proposed to identify the testability weaknesses. But it is sometimes difficult to be convinced that those metrics are really related with testability. This article is a critical survey of the source-code based metrics proposed in the literature for object-oriented software testability. It underlines the necessity to provide test...

  10. The analysis of timing metrics for synchronisation purposes in OFDM systems

    NARCIS (Netherlands)

    Hoeksema, F.W.; Slump, Cornelis H.

    2006-01-01

    In joint timing and carrier offset estimation algorithms for Time Division Duplexing (TDD) OFDM systems, different timing metrics are proposed to determine the beginning of a burst or symbol. In this contribution we present the different timing metrics. Generally speaking, analysis is done by
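One widely cited timing metric of the kind such comparisons cover is the Schmidl-Cox correlation metric, which exploits a preamble built from two identical halves. The sketch below uses synthetic data; normalising by the full-window energy is one common variant (chosen here because it bounds the metric by 1), and none of the specifics are taken from the paper itself:

```python
import cmath
import random

rng = random.Random(0)
L = 16                                   # half-length of the repeated preamble

def awgn(n):
    # Complex white noise samples (illustrative 0.1 amplitude scale).
    return [0.1 * complex(rng.gauss(0, 1), rng.gauss(0, 1)) for _ in range(n)]

# Training symbol with two identical halves, embedded between noise-only spans.
half = [cmath.exp(2j * cmath.pi * rng.random()) for _ in range(L)]
r = awgn(40) + half + half + awgn(40)

def timing_metric(r, d, L):
    # P(d): correlation between the two half-windows starting at sample d;
    # normalised by the energy of the full 2L window.
    P = sum(r[d + m].conjugate() * r[d + m + L] for m in range(L))
    E = 0.5 * sum(abs(r[d + m]) ** 2 for m in range(2 * L))
    return abs(P) ** 2 / E ** 2

metrics = [timing_metric(r, d, L) for d in range(len(r) - 2 * L)]
peak = metrics.index(max(metrics))
print(peak)  # the peak coincides with the preamble start (sample 40 here)
```

A receiver declares the burst/symbol boundary where this metric crosses or peaks above a threshold.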

  11. Task planning systems with natural language interface

    International Nuclear Information System (INIS)

    Kambayashi, Shaw; Uenaka, Junji

    1989-12-01

    In this report, a natural language analyzer and two different task planning systems are described. In 1988, we introduced a Japanese language analyzer named CS-PARSER for the input interface of the task planning system in the Human Acts Simulation Program (HASP). For high-speed analysis, we have modified the dictionary system of the CS-PARSER using a C language description. The new dictionary system proves very useful for high-speed analysis and efficient maintenance of the dictionary. For the study of the task planning problem, we have modified a story-generating system named Micro TALE-SPIN to generate stories written in Japanese sentences. We have also constructed a planning system with a natural language interface using the CS-PARSER. Task planning processes and the related knowledge bases of these systems are explained. A concept design for a new task planning system is also discussed based on evaluations of the above-mentioned systems. (author)

  12. Metrics Feedback Cycle: measuring and improving user engagement in gamified eLearning systems

    Directory of Open Access Journals (Sweden)

    Adam Atkins

    2017-12-01

    Full Text Available This paper presents the identification, design and implementation of a set of metrics of user engagement in a gamified eLearning application. The 'Metrics Feedback Cycle' (MFC) is introduced as a formal process prescribing the iterative evaluation and improvement of application-wide engagement, using data collected from metrics as input to improve related engagement features. This framework was showcased using a gamified eLearning application as a case study. In this paper, we designed a prototype and tested it with thirty-six (N=36) students to validate the effectiveness of the MFC. The analysis and interpretation of metrics data shows that the gamification features had a positive effect on user engagement and helped identify areas in which this could be improved. We conclude that the MFC has applications in gamified systems that seek to maximise engagement by iteratively evaluating implemented features against a set of evolving metrics.

  13. A lighting metric for quantitative evaluation of accent lighting systems

    Science.gov (United States)

    Acholo, Cyril O.; Connor, Kenneth A.; Radke, Richard J.

    2014-09-01

    Accent lighting is critical for artwork and sculpture lighting in museums, and for subject lighting on stage and in film and television. The research problem of designing effective lighting in such settings has been revived recently with the rise of light-emitting-diode-based solid state lighting. In this work, we propose an easy-to-apply quantitative measure of the scene's visual quality as perceived by human viewers. We consider a well-accent-lit scene as one which maximizes the information about the scene (in an information-theoretic sense) available to the user. We propose a metric based on the entropy of the distribution of colors, which are extracted from an image of the scene from the viewer's perspective. We demonstrate that optimizing the metric as a function of illumination configuration (i.e., position, orientation, and spectral composition) results in natural, pleasing accent lighting. We use a photorealistic simulation tool to validate the functionality of our proposed approach, showing its successful application to two- and three-dimensional scenes.
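The entropy-of-colour-distribution idea can be sketched directly. The pixel lists below are illustrative stand-ins for an image captured from the viewer's perspective, and the 8-level channel quantisation is an arbitrary choice for the sketch, not the paper's:

```python
from collections import Counter
from math import log2

def color_entropy(pixels, bins=8):
    # Quantise each RGB channel into `bins` levels and take the Shannon
    # entropy of the resulting colour distribution.
    quantised = [tuple(min(c * bins // 256, bins - 1) for c in p) for p in pixels]
    counts = Counter(quantised)
    n = len(pixels)
    return -sum((k / n) * log2(k / n) for k in counts.values())

# A flatly lit scene (one colour everywhere) vs. an accent-lit scene revealing
# several distinct colours. Illustrative pixel data, not real renders.
flat = [(128, 128, 128)] * 100
accent = ([(200, 40, 40)] * 25 + [(40, 200, 40)] * 25
          + [(40, 40, 200)] * 25 + [(220, 220, 220)] * 25)

print(color_entropy(flat) < color_entropy(accent))  # prints True
```

An optimiser would then adjust luminaire position, orientation, and spectrum to maximise this entropy for the rendered viewpoint.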

  14. Development and Implementation of a Design Metric for Systems Containing Long-Term Fluid Loops

    Science.gov (United States)

    Steele, John W.

    2016-01-01

    John Steele, a chemist and technical fellow from United Technologies Corporation, provided a water quality module to assist engineers and scientists with a metric tool to evaluate risks associated with the design of space systems with fluid loops. This design metric is a methodical, quantitative, lessons-learned based means to evaluate the robustness of a long-term fluid loop system design. The tool was developed by a cross-section of engineering disciplines who had decades of experience and problem resolution.

  15. Integrated Metrics for Improving the Life Cycle Approach to Assessing Product System Sustainability

    Directory of Open Access Journals (Sweden)

    Wesley Ingwersen

    2014-03-01

    Full Text Available Life cycle approaches are critical for identifying and reducing environmental burdens of products. While these methods can indicate potential environmental impacts of a product, current Life Cycle Assessment (LCA) methods fail to integrate the multiple impacts of a system into unified measures of social, economic or environmental performance related to sustainability. Integrated metrics that combine multiple aspects of system performance based on a common scientific or economic principle have proven to be valuable for sustainability evaluation. In this work, we propose methods of adapting four integrated metrics for use with LCAs of product systems: ecological footprint, emergy, green net value added, and Fisher information. These metrics provide information on the full product system in land, energy, and monetary equivalents, and as a unitless information index; each is bundled with one or more indicators for reporting. When used together and for relative comparison, integrated metrics provide broader coverage of sustainability aspects from multiple theoretical perspectives and are more likely to illuminate potential issues than individual impact indicators. These integrated metrics are recommended for use in combination with traditional indicators used in LCA. Future work will test and demonstrate the value of using these integrated metrics and combinations to assess product system sustainability.

  16. Heimdall System for MSSS Sensor Tasking

    Science.gov (United States)

    Herz, A.; Jones, B.; Herz, E.; George, D.; Axelrad, P.; Gehly, S.

    In Norse Mythology, Heimdall uses his foreknowledge and keen eyesight to keep watch for disaster from his home near the Rainbow Bridge. Orbit Logic and the Colorado Center for Astrodynamics Research (CCAR) at the University of Colorado (CU) have developed the Heimdall System to schedule observations of known and uncharacterized objects and search for new objects from the Maui Space Surveillance Site. Heimdall addresses the current need for automated and optimized SSA sensor tasking driven by factors associated with improved space object catalog maintenance. Orbit Logic and CU developed an initial baseline prototype SSA sensor tasking capability for select sensors at the Maui Space Surveillance Site (MSSS) using STK and STK Scheduler, and then added a new Track Prioritization Component for FiSST-inspired computations for predicted Information Gain and Probability of Detection, and a new SSA-specific Figure-of-Merit (FOM) for optimized SSA sensor tasking. While the baseline prototype addresses automation and some of the multi-sensor tasking optimization, the SSA-improved prototype addresses all of the key elements required for improved tasking leading to enhanced object catalog maintenance. The Heimdall proof-of-concept was demonstrated for MSSS SSA sensor tasking for a 24 hour period to attempt observations of all operational satellites in the unclassified NORAD catalog, observe a small set of high priority GEO targets every 30 minutes, make a sky survey of the GEO belt region accessible to MSSS sensors, and observe particular GEO regions that have a high probability of finding new objects with any excess sensor time. This Heimdall prototype software paves the way for further R&D that will integrate this technology into the MSSS systems for operational scheduling, improve the software's scalability, and further tune and enhance schedule optimization. The Heimdall software for SSA sensor tasking provides greatly improved performance over manual tasking, improved

  17. Cyber threat metrics.

    Energy Technology Data Exchange (ETDEWEB)

    Frye, Jason Neal; Veitch, Cynthia K.; Mateski, Mark Elliot; Michalski, John T.; Harris, James Mark; Trevino, Cassandra M.; Maruoka, Scott

    2012-03-01

    Threats are generally much easier to list than to describe, and much easier to describe than to measure. As a result, many organizations list threats. Fewer describe them in useful terms, and still fewer measure them in meaningful ways. This is particularly true in the dynamic and nebulous domain of cyber threats - a domain that tends to resist easy measurement and, in some cases, appears to defy any measurement. We believe the problem is tractable. In this report we describe threat metrics and models for characterizing threats consistently and unambiguously. The purpose of this report is to support the Operational Threat Assessment (OTA) phase of risk and vulnerability assessment. To this end, we focus on the task of characterizing cyber threats using consistent threat metrics and models. In particular, we address threat metrics and models for describing malicious cyber threats to US FCEB agencies and systems.

  18. Calculations of two new dose metrics proposed by AAPM Task Group 111 using the measurements with standard CT dosimetry phantoms

    International Nuclear Information System (INIS)

    Li, Xinhua; Zhang, Da; Liu, Bob

    2013-01-01

    Purpose: AAPM Task Group 111 proposed to measure the equilibrium dose-pitch product D̂_eq for scan modes involving table translation, and the midpoint dose D_L(0) for stationary-table modes, on the central and peripheral axes of sufficiently long (e.g., at least 40 cm) phantoms. This paper presents an alternative approach to calculate both metrics using measurements from scanning the standard computed tomographic (CT) dosimetry phantoms on CT scanners. Methods: D̂_eq was calculated from CTDI_100 and ε(CTDI_100) (the CTDI_100 efficiency), and D_L(0) was calculated from D̂_eq and the approach-to-equilibrium function H(L) = D_L(0)/D_eq, where D_eq is the equilibrium dose. CTDI_100 may be obtained directly from several sources (such as a medical physicist's CT scanner performance evaluation or the IMPACT CT patient dosimetry calculator), or derived from CTDI_vol using the central-to-peripheral CTDI_100 ratio (R_100). The authors have provided the required ε(CTDI_100) and H(L) data in two previous papers [X. Li, D. Zhang, and B. Liu, Med. Phys. 39, 901-905 (2012); and ibid. 40, 031903 (10pp.) (2013)]. R_100 was assessed for a series of GE, Siemens, Philips, and Toshiba CT scanners with multiple settings of scan field of view, tube voltage, and bowtie filter. Results: The calculated D_L(0) and D_L(0)/D_eq in PMMA and water cylinders were consistent with the measurements on two GE CT scanners (LightSpeed 16 and VCT) by Dixon and Ballard [Med. Phys. 34, 3399-3413 (2007)], the measurements on a Siemens CT scanner (SOMATOM Spirit Power) by Descamps et al. [J. Appl. Clin. Med. Phys. 13, 293-302 (2012)], and the Monte Carlo simulations by Boone [Med. Phys. 36, 4547-4554 (2009)]. Conclusions: D̂_eq and D_L(0) can be calculated using the alternative approach. The authors have provided the required ε(CTDI_100) and H(L) data in two previous papers. R_100 is presented for a majority of multidetector CT scanners currently on the market, and can be

  19. Generic metrics and quantitative approaches for system resilience as a function of time

    International Nuclear Information System (INIS)

    Henry, Devanandham; Emmanuel Ramirez-Marquez, Jose

    2012-01-01

    Resilience is generally understood as the ability of an entity to recover from an external disruptive event. In the system domain, a formal definition and quantification of the concept of resilience has been elusive. This paper proposes generic metrics and formulae for quantifying system resilience. The discussions and graphical examples illustrate that the quantitative model is aligned with the fundamental concept of resilience. Based on the approach presented it is possible to analyze resilience as a time dependent function in the context of systems. The paper describes the metrics of network and system resilience, time for resilience and total cost of resilience. Also the paper describes the key parameters necessary to analyze system resilience such as the following: disruptive events, component restoration and overall resilience strategy. A road network example is used to demonstrate the applicability of the proposed resilience metrics and how these analyses form the basis for developing effective resilience design strategies. The metrics described are generic enough to be implemented in a variety of applications as long as appropriate figures-of-merit and the necessary system parameters, system decomposition and component parameters are defined. - Highlights: ► Propose a graphical model for the understanding of the resilience process. ► Mathematical description of resilience as a function of time. ► Identification of necessary concepts to define and evaluate network resilience. ► Development of cost and time to recovery metrics based on resilience formulation.
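The time-dependent view described above can be sketched as a simple ratio: the fraction of performance lost at the disruption that has been recovered by time t, measured by some figure-of-merit. The throughput trajectory below is hypothetical, and the exact functional form is an assumption in the spirit of the paper, not its formula:

```python
def resilience(F, t0, td, t):
    # Fraction of the performance lost at disruption time td that has been
    # recovered by time t, relative to the nominal level at time t0.
    return (F(t) - F(td)) / (F(t0) - F(td))

# Hypothetical figure-of-merit trajectory: nominal delivery drops at t=10
# and recovers linearly until full restoration at t=20.
def throughput(t):
    if t < 10:
        return 100.0                     # nominal operation
    if t < 20:
        return 40.0 + 6.0 * (t - 10)     # degraded, recovering
    return 100.0                         # fully restored

print(resilience(throughput, t0=5, td=10, t=15))  # 0.5: half the loss recovered
```

Evaluating the ratio over a grid of t values yields the resilience-versus-time curve that the paper's graphical examples illustrate.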

  20. Resilience Metrics for the Electric Power System: A Performance-Based Approach.

    Energy Technology Data Exchange (ETDEWEB)

    Vugrin, Eric D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Castillo, Andrea R [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Silva-Monroy, Cesar Augusto [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-02-01

    Grid resilience is a concept related to a power system's ability to continue operating and delivering power even in the event that low probability, high-consequence disruptions such as hurricanes, earthquakes, and cyber-attacks occur. Grid resilience objectives focus on managing and, ideally, minimizing potential consequences that occur as a result of these disruptions. Currently, no formal grid resilience definitions, metrics, or analysis methods have been universally accepted. This document describes an effort to develop and describe grid resilience metrics and analysis methods. The metrics and methods described herein extend upon the Resilience Analysis Process (RAP) developed by Watson et al. for the 2015 Quadrennial Energy Review. The extension allows for both outputs from system models and for historical data to serve as the basis for creating grid resilience metrics and informing grid resilience planning and response decision-making. This document describes the grid resilience metrics and analysis methods. Demonstration of the metrics and methods is shown through a set of illustrative use cases.

  1. A metric and frameworks for resilience analysis of engineered and infrastructure systems

    International Nuclear Information System (INIS)

    Francis, Royce; Bekera, Behailu

    2014-01-01

    In this paper, we have reviewed various approaches to defining resilience and the assessment of resilience. We have seen that while resilience is a useful concept, its diversity in usage complicates its interpretation and measurement. In this paper, we have proposed a resilience analysis framework and a metric for measuring resilience. Our analysis framework consists of system identification, resilience objective setting, vulnerability analysis, and stakeholder engagement. The implementation of this framework is focused on the achievement of three resilience capacities: adaptive capacity, absorptive capacity, and recoverability. These three capacities also form the basis of our proposed resilience factor and uncertainty-weighted resilience metric. We have also identified two important unresolved discussions emerging in the literature: the idea of resilience as an epistemological versus inherent property of the system, and design for ecological versus engineered resilience in socio-technical systems. While we have not resolved this tension, we have shown that our framework and metric promote the development of methodologies for investigating “deep” uncertainties in resilience assessment while retaining the use of probability for expressing uncertainties about highly uncertain, unforeseeable, or unknowable hazards in design and management activities. - Highlights: • While resilience is a useful concept, its diversity in usage complicates its interpretation and measurement. • We proposed a resilience analysis framework whose implementation is encapsulated within resilience metric incorporating absorptive, adaptive, and restorative capacities. • We have shown that our framework and metric can support the investigation of “deep” uncertainties in resilience assessment or analysis. • We have discussed the role of quantitative metrics in design for ecological versus engineered resilience in socio-technical systems. • Our resilience metric supports

  2. Strategic Human Resource Metrics: A Perspective of the General Systems Theory

    Directory of Open Access Journals (Sweden)

    Chux Gervase Iwu

    2016-04-01

    Measuring and quantifying strategic human resource outcomes in relation to key performance criteria is essential to developing value-adding metrics. Objectives: This paper posits (using a general systems lens) that strategic human resource metrics should interpret the relationship between attitudinal human resource outcomes and performance criteria such as profitability, quality or customer service. Approach: Using the general systems model as underpinning theory, the study assesses the variation in response to a Likert-type questionnaire with twenty-four (24) items measuring the major attitudinal dispositions of HRM outcomes (employee commitment, satisfaction, engagement and embeddedness). Results: A Chi-square test (Chi-square test statistic = 54.898, p = 0.173) showed that variation in responses to the attitudinal statements occurred due to chance. This was interpreted to mean that attitudinal human resource outcomes influence performance as a unit of system components. The neutral response was found to be more closely associated with the ‘reject’ response than with the ‘acceptance’ response. Value: The study offers suggestions on the determination of strategic HR metrics and recommends the use of systems theory in HRM-related studies. Implications: This study provides another dimension to human resource metrics by arguing that strategic human resource metrics should measure the relationship between attitudinal human resource outcomes and performance using a systems perspective.

  3. Implementation of the Automated Numerical Model Performance Metrics System

    Science.gov (United States)

    2011-09-26

    question. As of this writing, the DSRC IBM AIX machines DaVinci and Pascal, and the Cray XT Einstein all use the PBS batch queuing system for...3.3). 12 Appendix A – General Automation System This system provides general purpose tools and a general way to automatically run

  4. SU-E-I-40: New Method for Measurement of Task-Specific, High-Resolution Detector System Performance

    Energy Technology Data Exchange (ETDEWEB)

    Loughran, B; Singh, V; Jain, A; Bednarek, D; Rudin, S [University at Buffalo, Buffalo, NY (United States)

    2014-06-01

    Purpose: Although generalized linear system analytic metrics such as the GMTF and GDQE can evaluate the performance of the whole imaging system, including the detector, scatter, and focal spot, a simplified task-specific measured metric may help to better compare detector systems. Methods: Low quantum-noise images of a neuro-vascular stent with a modified ANSI head phantom were obtained from the average of many exposures taken with the high-resolution Micro-Angiographic Fluoroscope (MAF) and with a Flat Panel Detector (FPD). The square of the Fourier Transform of each averaged image, equivalent to the measured product of the system GMTF and the object function in spatial-frequency space, was then divided by the normalized noise power spectrum (NNPS) for each respective system to obtain a task-specific generalized signal-to-noise ratio. A generalized measured relative object detectability (GM-ROD) was obtained by taking the ratio of the integral of the resulting expressions for each detector system, to give an overall metric that enables a realistic systems comparison for the given detection task. Results: The GM-ROD provides a comparison of the relative performance of detector systems from actual measurements of the object function as imaged by those detector systems. This metric includes noise correlations and spatial frequencies relevant to the specific object. Additionally, the integration bounds for the GM-ROD can be selected to emphasize the higher frequency band of each detector if high-resolution image details are to be evaluated. Examples of this new metric are discussed with a comparison of the MAF to the FPD for neuro-vascular interventional imaging. Conclusion: The GM-ROD is a new directly measured, task-specific metric that can provide clinically relevant comparison of the relative performance of imaging systems. Supported by NIH Grant 2R01EB002873 and an equipment grant from Toshiba Medical Systems Corporation.
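The GM-ROD construction described in this record can be sketched numerically. The toy below works in 1-D for simplicity: the squared Fourier transform of each detector's averaged image is divided by that detector's NNPS, integrated over a chosen frequency band, and the two integrals are compared as a ratio. All arrays are synthetic stand-ins, not measured MAF/FPD data, and the function name is ours, not the paper's.

```python
# Illustrative 1-D sketch of the GM-ROD metric (synthetic data).
import numpy as np

def gm_rod(img_a, nnps_a, img_b, nnps_b, band=slice(None)):
    """Ratio of band-integrated generalized SNR of detector A to detector B."""
    snr_a = np.abs(np.fft.rfft(img_a)) ** 2 / nnps_a   # |FT|^2 / NNPS per bin
    snr_b = np.abs(np.fft.rfft(img_b)) ** 2 / nnps_b
    return snr_a[band].sum() / snr_b[band].sum()       # integrate, then ratio

img = np.sin(np.linspace(0, 8 * np.pi, 64))   # stand-in for an averaged object image
nnps_flat = np.ones(33)                        # rfft of 64 samples -> 33 bins
ratio = gm_rod(img, nnps_flat, img, 2 * nnps_flat)  # detector B is twice as noisy
print(ratio)  # 2.0: detector A detects this object twice as well
```

With a narrower `band`, the same ratio emphasizes the higher-frequency content, as the abstract suggests for high-resolution detail.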

  5. Metric for Calculation of System Complexity based on its Connections

    Directory of Open Access Journals (Sweden)

    João Ricardo Braga de Paiva

    2017-02-01

    This paper proposes a methodology based on a system's connections to calculate its complexity. Two case studies are presented: the dining Chinese philosophers’ problem and a distribution center. Both are modeled using the theory of Discrete Event Systems, and simulations in different contexts were performed in order to measure their complexities. The obtained results present (i) the static complexity as a limiting factor for the dynamic complexity, (ii) the lowest cost in terms of complexity for each unit of measure of the system performance, and (iii) the output sensitivity to the input parameters. The associated complexity and performance measures aggregate knowledge about the system.

  6. Methods and metrics challenges of delivery-system research

    Directory of Open Access Journals (Sweden)

    Alexander Jeffrey A

    2012-03-01

    Background: Many delivery-system interventions are fundamentally about change in social systems (both planned and unplanned). This systems perspective raises a number of methodological challenges for studying the effects of delivery-system change, particularly for answering questions related to whether the change will work under different conditions and how the change is integrated (or not) into the operating context of the delivery system. Methods: The purpose of this paper is to describe the methodological and measurement challenges posed by five key issues in delivery-system research: (1) modeling intervention context; (2) measuring readiness for change; (3) assessing intervention fidelity and sustainability; (4) assessing complex, multicomponent interventions; and (5) incorporating time in delivery-system models. For each issue, we provide recommendations for how research may be designed and implemented to overcome these challenges. Results and conclusions: We suggest that a more refined understanding of the mechanisms underlying delivery-system interventions (treatment theory) and of the ways in which outcomes for different classes of individuals change over time are fundamental starting points for capturing the heterogeneity in samples of individuals exposed to delivery-system interventions. To support the research recommendations outlined in this paper and to advance understanding of the "why" and "how" questions of delivery-system change and their effects, funding agencies should consider supporting studies with larger organizational sample sizes; longer duration; and nontraditional, mixed-methods designs. A version of this paper was prepared under contract with the Agency for Healthcare Research and Quality (AHRQ), US Department of Health and Human Services, for presentation and discussion at a meeting on "The Challenge and Promise of Delivery System Research," held in Sterling, VA, on

  7. Transactive System: Part II: Analysis of Two Pilot Transactive Systems using Foundational Theory and Metrics

    Energy Technology Data Exchange (ETDEWEB)

    Lian, Jianming [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Sun, Y. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kalsi, Karanjit [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Widergren, Steven E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wu, Di [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Ren, Huiying [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2018-01-24

    This document is the second of a two-part report. Part 1 reviewed several demonstrations of transactive control and compared them in terms of their payoff functions, control decisions, information privacy, and mathematical solution concepts. It was suggested in Part 1 that these four components should be adopted for meaningful comparison and design of future transactive systems. Part 2 proposes qualitative and quantitative metrics that will be needed to compare alternative transactive systems. It then uses the analysis and design principles from Part 1 while conducting a more in-depth analysis of two transactive demonstrations: the American Electric Power (AEP) gridSMART Demonstration, which used a double-auction market mechanism, and a consensus method like that used in the Pacific Northwest Smart Grid Demonstration. Ultimately, metrics must be devised and used to meaningfully compare alternative transactive systems. One significant contribution of this report is the observation that the decision function used for thermostat control in the AEP gridSMART Demonstration performs better if it is recast to more accurately reflect the power that will be used for thermostatic control under alternative market outcomes.

  8. Metrics for Systems Thinking in the Human Dimension

    Science.gov (United States)

    2016-11-01

    proportion serves as a proxy for the potential for systems thinking. This methodology can also be used to survey and visualize a collection of research... topic in table form and Fig. 3 visualizes the topic in a word cloud. While the dimensions of systems thinking as enumerated by Whitehead et al. do not... [excerpted term-frequency table: Requirements 515, Cost 496, Development 491, Factors 449, Decision 427, Thinking 372] Fig. 4 visualizes the modeled graded corpus with seed

  9. Guidelines and Metrics for Assessing Space System Cost Estimates

    Science.gov (United States)

    2008-01-01

    dump momentum from mechanical reaction control systems, and de-orbit at the end of the mission. Various approaches are used to accelerate the... launch vehicle and cargo power system: the necessary generation, storage, and distribution of electrical power and signals, hydraulic power, and any other... service, transport, hoist, repair, overhaul, assemble, disassemble, test, inspect, or otherwise maintain mission equipment; any production of

  10. Synchronization of multi-agent systems with metric-topological interactions.

    Science.gov (United States)

    Wang, Lin; Chen, Guanrong

    2016-09-01

    A hybrid multi-agent systems model integrating the advantages of both metric and topological interaction rules, called the metric-topological model, is developed. This model describes planar motions of mobile agents, where each agent can interact with all the agents within a circle of a constant radius, and can furthermore interact with some distant agents to reach a pre-assigned number of neighbors, if needed. Some sufficient conditions imposed only on system parameters and agent initial states are presented, which ensure achieving synchronization of the whole group of agents. The analysis reveals the intrinsic relationships among the interaction range, the speed, the initial heading, and the density of the group. Moreover, robustness against variations of interaction range, density, and speed is investigated by comparing the motion patterns and performances of the hybrid metric-topological interaction model with the conventional metric-only and topological-only interaction models. In practically all cases, the hybrid metric-topological interaction model has the best performance in the sense of achieving the highest frequency of synchronization, the fastest convergence rate, and the smallest heading difference.
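The hybrid neighbor rule described in this record can be sketched directly: take every agent within a fixed radius, and if that yields fewer than a pre-assigned number k of neighbors, top up with the nearest agents beyond the radius. The names (`radius`, `k`) and the planar positions below are our own illustrative choices, not the paper's notation.

```python
# Minimal sketch of the metric-topological (hybrid) neighbor selection rule.
import math

def hybrid_neighbors(positions, i, radius, k):
    """Indices of agent i's neighbors: all in-range agents, topped up to k."""
    dists = sorted(
        (math.dist(positions[i], positions[j]), j)
        for j in range(len(positions)) if j != i
    )
    metric = [j for d, j in dists if d <= radius]   # metric rule: all within radius
    if len(metric) >= k:
        return metric
    # topological top-up: nearest agents outside the radius, until k neighbors
    extra = [j for d, j in dists if d > radius]
    return metric + extra[: k - len(metric)]

pts = [(0, 0), (1, 0), (0, 1), (5, 5), (6, 5)]
print(hybrid_neighbors(pts, 0, radius=1.5, k=3))  # -> [1, 2, 3]: two in-range + nearest distant
```

With `k=2` the in-range agents already suffice and no distant agent is added, matching the "if needed" clause in the abstract.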

  11. Engineering task plan for purged light system

    International Nuclear Information System (INIS)

    BOGER, R.M.

    1999-01-01

    A purged, closed-circuit television system is currently used to video the inside of waste tanks. The video is used to support inspection and assessment of the tank interiors, waste residues, and deployed hardware. The system is also used to facilitate deployment of new equipment. A new light source has been requested by Characterization Project Operations (CPO) for the video system. The current light is mounted on the camera and provides 75 watts of light, which is insufficient for clear video. Other light sources currently in use on the Hanford site either cannot be deployed in a 4-inch riser or do not meet the ignition source controls. The scope of this Engineering Task Plan is to address all activities associated with the specification and procurement of a light source for use with the existing CPO video equipment. The installation design change to tank farm facilities is not within the scope of this ETP.

  12. Benchmarking the performance of fixed-image receptor digital radiography systems. Part 2: system performance metric.

    Science.gov (United States)

    Lee, Kam L; Bernardo, Michael; Ireland, Timothy A

    2016-06-01

    This is part two of a two-part study on benchmarking the system performance of fixed digital radiographic systems. The study compares the system performance of seven fixed digital radiography systems based on quantitative metrics such as the system modulation transfer function (sMTF), normalised noise power spectrum (sNNPS), detective quantum efficiency (sDQE) and entrance surface air kerma (ESAK). It was found that the most efficient image receptors (greatest sDQE) were not necessarily operating at the lowest ESAK. In part one of this study, sMTF was shown to depend on system configuration, while sNNPS was shown to be relatively consistent across systems. Systems are ranked on their signal-to-noise ratio efficiency (sDQE) and their ESAK. Systems using the same equipment configuration do not necessarily have the same system performance. This implies that radiographic practice at the site will have an impact on the overall system performance. In general, systems are more dose-efficient at low dose settings.

  13. Distributed consensus for metamorphic systems using a gossip algorithm for CAT(0) metric spaces

    Science.gov (United States)

    Bellachehab, Anass; Jakubowicz, Jérémie

    2015-01-01

    We present an application of distributed consensus algorithms to metamorphic systems. A metamorphic system is a set of identical units that can self-assemble to form a rigid structure. For instance, one can think of a robotic arm composed of multiple links connected by joints. The system can change its shape in order to adapt to different environments via reconfiguration of its constituent units. We assume in this work that several metamorphic systems form a network: two systems are connected whenever they are able to communicate with each other. The aim of this paper is to propose a distributed algorithm that synchronizes all the systems in the network. Synchronizing means that all the systems should end up having the same configuration. This aim is achieved in two steps: (i) we cast the problem as a consensus problem on a metric space, and (ii) we use a recent distributed consensus algorithm that only makes use of metric notions.
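The pairwise-gossip idea behind such metric-space consensus can be illustrated in the simplest CAT(0) space, the Euclidean plane, where the geodesic midpoint is the ordinary average. Real metamorphic configurations live in more general CAT(0) spaces; this toy only shows the update rule (a randomly chosen connected pair repeatedly moves to its midpoint), and all names and values are our own illustrative assumptions.

```python
# Toy pairwise-gossip consensus: connected pairs repeatedly move to their midpoint.
import random

def gossip_consensus(values, edges, steps=1000, seed=0):
    """Run midpoint gossip over the given edges; returns final positions."""
    rng = random.Random(seed)
    vals = [list(v) for v in values]
    for _ in range(steps):
        i, j = rng.choice(edges)                      # pick a random connected pair
        mid = [(a + b) / 2 for a, b in zip(vals[i], vals[j])]
        vals[i], vals[j] = mid, list(mid)             # both move to the midpoint
    return vals

# Three systems on a line graph 0-1-2 converge to a common configuration.
final = gossip_consensus([(0.0, 0.0), (4.0, 0.0), (8.0, 4.0)],
                         edges=[(0, 1), (1, 2)])
print(final)
```

In the Euclidean case each midpoint update preserves the coordinate sums, so the common limit is the average of the initial points; in a general CAT(0) space the midpoint is taken along the geodesic instead.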

  14. 41 CFR 101-29.102 - Use of metric system of measurement in Federal product descriptions.

    Science.gov (United States)

    2010-07-01

    ... PROCUREMENT 29-FEDERAL PRODUCT DESCRIPTIONS 29.1-General § 101-29.102 Use of metric system of measurement in... measurement in Federal product descriptions. 101-29.102 Section 101-29.102 Public Contracts and Property... Federal agencies to: (a) Maintain close liaison with other Federal agencies, State and local governments...

  15. 48 CFR 3410.701 - Policy of the Department of Education with respect to use of the metric system.

    Science.gov (United States)

    2010-10-01

    ... of Education with respect to use of the metric system. 3410.701 Section 3410.701 Federal Acquisition Regulations System DEPARTMENT OF EDUCATION ACQUISITION REGULATION COMPETITION AND ACQUISITION PLANNING... of Education with respect to use of the metric system. It is the policy of the Department of...

  16. 48 CFR 3410.703 - Responsibilities of the Department of Education with respect to use of the metric system.

    Science.gov (United States)

    2010-10-01

    ... Department of Education with respect to use of the metric system. 3410.703 Section 3410.703 Federal Acquisition Regulations System DEPARTMENT OF EDUCATION ACQUISITION REGULATION COMPETITION AND ACQUISITION... Responsibilities of the Department of Education with respect to use of the metric system. (a) Consistent with the...

  17. Assessing precision, bias and sigma-metrics of 53 measurands of the Alinity ci system.

    Science.gov (United States)

    Westgard, Sten; Petrides, Victoria; Schneider, Sharon; Berman, Marvin; Herzogenrath, Jörg; Orzechowski, Anthony

    2017-12-01

    Assay performance depends on the accuracy and precision of a given method. These attributes can be combined into an analytical Sigma-metric, providing a simple value for laboratorians to use in evaluating a test method's capability to meet its analytical quality requirements. Sigma-metrics were determined for 37 clinical chemistry assays, 13 immunoassays, and 3 ICT methods on the Alinity ci system. Analytical performance specifications were defined for the assays, following a rationale of using CLIA goals first, then Ricos Desirable goals when CLIA did not regulate the method, and then other sources if the Ricos Desirable goal was unrealistic. A precision study was conducted at Abbott on each assay using the Alinity ci system, following the CLSI EP05-A2 protocol. Bias was estimated following the CLSI EP09-A3 protocol, using samples with concentrations spanning the assay's measuring interval tested in duplicate on the Alinity ci system and on ARCHITECT c8000 and i2000SR systems, where testing was also performed at Abbott. Using the regression model, the %bias was estimated at an important medical decision point. The Sigma-metric was then estimated for each assay and plotted on a method decision chart, using the equation: Sigma-metric = (%TEa - |%bias|) / %CV. The Sigma-metrics and normalized method decision charts demonstrate that a majority of the Alinity assays perform at five Sigma or higher, at or near critical medical decision levels. More than 90% of the assays performed at five or six Sigma. None performed below three Sigma. Sigma-metrics plotted on normalized method decision charts provide useful evaluations of performance. The majority of Alinity ci system assays had Sigma values >5, and thus laboratories can expect excellent or world-class performance. Laboratorians can use these tools as aids in choosing high-quality products, further contributing to the delivery of excellent-quality healthcare for patients.
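The Sigma-metric equation quoted in this record can be computed directly. A minimal sketch follows; the assay numbers are invented for illustration, not values from the Alinity ci study.

```python
# Sigma-metric = (%TEa - |%bias|) / %CV, as given in the abstract:
# %TEa  = allowable total error for the assay,
# %bias = estimated bias at a medical decision level,
# %CV   = observed imprecision.

def sigma_metric(tea_pct: float, bias_pct: float, cv_pct: float) -> float:
    """Return the analytical Sigma-metric for one assay."""
    if cv_pct <= 0:
        raise ValueError("%CV must be positive")
    return (tea_pct - abs(bias_pct)) / cv_pct

# Example: an assay with 10% allowable error, 1% bias, and 1.5% CV
sigma = sigma_metric(10.0, 1.0, 1.5)
print(f"Sigma-metric: {sigma:.1f}")  # Sigma-metric: 6.0
```

A value of six or above is conventionally read as "world class" performance, matching the abstract's framing of five and six Sigma assays.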

  18. A Tale of Three District Energy Systems: Metrics and Future Opportunities

    Energy Technology Data Exchange (ETDEWEB)

    Pass, Rebecca Zarin; Wetter, Michael; Piette, Mary Ann

    2017-08-01

    Improving the sustainability of cities is crucial for meeting climate goals in the next several decades. One way this is being tackled is through innovation in district energy systems, which can take advantage of local resources and economies of scale to improve the performance of whole neighborhoods in ways infeasible for individual buildings. These systems vary in physical size, end-use services, primary energy resources, and sophistication of control. They also vary enormously in their choice of optimization metrics, all under the umbrella goal of improved sustainability. This paper explores the implications of the choice of metric on district energy systems using three case studies: Stanford University, the University of California at Merced, and the Richmond Bay campus of the University of California at Berkeley. Each has a centralized authority able to implement large-scale projects quickly while maintaining data records, which makes them relatively effective at achieving their respective goals. Comparing the systems using several common energy metrics reveals significant differences in relative system merit. Additionally, a novel bidirectional heating and cooling system is presented. This system is highly energy-efficient and, while more analysis is required, may be the basis of the next generation of district energy systems.

  19. An Investigation of the Relationship Between Automated Machine Translation Evaluation Metrics and User Performance on an Information Extraction Task

    Science.gov (United States)

    2007-01-01

    more reliable than BLEU and that it is easier to understand in terms familiar to NLP researchers. ... 2.2.3 METEOR: Researchers at Carnegie Mellon... essential elements of information from output generated by three types of Arabic-English MT engines. The information extraction experiment was one of three... reviewing the task hierarchy and examining the MT output of several engines. A small, prior pilot experiment to evaluate Arabic-English MT engines for

  20. Findings of the 2010 Joint Workshop on Statistical Machine Translation and Metrics for Machine Translation

    NARCIS (Netherlands)

    Callison-Burch, C.; Koehn, P.; Monz, C.; Peterson, K.; Przybocki, M.; Zaidan, O.F.

    2010-01-01

    This paper presents the results of the WMT10 and MetricsMATR10 shared tasks, which included a translation task, a system combination task, and an evaluation task. We conducted a large-scale manual evaluation of 104 machine translation systems and 41 system combination entries. We used the ranking of

  1. Timing Metrics of Joint Timing and Carrier-Frequency Offset Estimation Algorithms for TDD-based OFDM systems

    NARCIS (Netherlands)

    Hoeksema, F.W.; Srinivasan, R.; Schiphorst, Roelof; Slump, Cornelis H.

    2004-01-01

    In joint timing and carrier offset estimation algorithms for Time Division Duplexing (TDD) OFDM systems, different timing metrics are proposed to determine the beginning of a burst or symbol. In this contribution we investigated the different timing metrics in order to establish their impact on the

  2. Systems resilience for multihazard environments: definition, metrics, and valuation for decision making.

    Science.gov (United States)

    Ayyub, Bilal M

    2014-02-01

    The United Nations Office for Disaster Risk Reduction reported that the 2011 natural disasters, including the earthquake and tsunami that struck Japan, resulted in $366 billion in direct damages and 29,782 fatalities worldwide. Storms and floods accounted for up to 70% of the 302 natural disasters worldwide in 2011, with earthquakes producing the greatest number of fatalities. Average annual losses in the United States amount to about $55 billion. Enhancing community and system resilience could lead to massive savings through risk reduction and expeditious recovery. The rational management of such reduction and recovery is facilitated by an appropriate definition of resilience and associated metrics. In this article, a resilience definition is provided that meets a set of requirements with clear relationships to the metrics of the relevant abstract notions of reliability and risk. Those metrics also meet logically consistent requirements drawn from measure theory, and provide a sound basis for the development of effective decision-making tools for multihazard environments. Improving the resiliency of a system to meet target levels requires the examination of system enhancement alternatives in economic terms, within a decision-making framework. Relevant decision analysis methods would typically require the examination of resilience based on its valuation by society at large. The article provides methods for valuation and benefit-cost analysis based on concepts from risk analysis and management. © 2013 Society for Risk Analysis.

  3. Automatic Generation of Safe Handlers for Multi-Task Systems

    OpenAIRE

    Rutten , Éric; Marchand , Hervé

    2004-01-01

    We are interested in the programming of real-time control systems, such as in robotic, automotive or avionic systems. They are designed with multiple tasks, each with multiple modes. It is complex to design task handlers that control the switching of activities in order to ensure safety properties of the global system. We propose a model of tasks in terms of transition systems, designed especially with the purpose of applying existing discrete controller synthesis techniques. This provides us...

  4. Relevance of motion-related assessment metrics in laparoscopic surgery.

    Science.gov (United States)

    Oropesa, Ignacio; Chmarra, Magdalena K; Sánchez-González, Patricia; Lamata, Pablo; Rodrigues, Sharon P; Enciso, Silvia; Sánchez-Margallo, Francisco M; Jansen, Frank-Willem; Dankelman, Jenny; Gómez, Enrique J

    2013-06-01

    Motion metrics have become an important source of information when addressing the assessment of surgical expertise. However, their direct relationship with the different surgical skills has not been fully explored. The purpose of this study is to investigate the relevance of motion-related metrics in the evaluation of basic psychomotor laparoscopic skills and their correlation with the abilities they are intended to measure. A framework for task definition and metric analysis is proposed. An explorative survey was first conducted with a board of experts to identify metrics to assess basic psychomotor skills. Based on the output of that survey, 3 novel tasks for surgical assessment were designed. Face and construct validation was performed, with focus on motion-related metrics. Tasks were performed by 42 participants (16 novices, 22 residents, and 4 experts). Movements of the laparoscopic instruments were registered with the TrEndo tracking system and analyzed. Time, path length, and depth showed construct validity for all 3 tasks. Motion smoothness and idle time also showed validity for tasks involving bimanual coordination and tasks requiring a more tactical approach, respectively. Additionally, motion smoothness and average speed showed a high internal consistency, proving them to be the most task-independent of all the metrics analyzed. Motion metrics are complementary and valid for assessing basic psychomotor skills, and their relevance depends on the skill being evaluated. A larger clinical implementation, combined with quality performance information, will give more insight into the relevance of the results shown in this study.

  5. Metric Learning Method Aided Data-Driven Design of Fault Detection Systems

    Directory of Open Access Journals (Sweden)

    Guoyang Yan

    2014-01-01

    Fault detection is fundamental to many industrial applications. With the growth of system complexity, the number of sensors is increasing, which makes traditional fault detection methods less efficient. Metric learning is an efficient way to build the relationship between feature vectors and the categories of instances. In this paper, we first propose a metric learning-based fault detection framework. Meanwhile, a novel feature extraction method based on the wavelet transform is used to obtain the feature vector from detection signals. Experiments on Tennessee Eastman (TE) chemical process datasets demonstrate that the proposed method has better performance compared with existing methods, for example, principal component analysis (PCA) and Fisher discriminant analysis (FDA).
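The general idea of detecting faults under a learned distance can be shown with a toy: feature vectors are compared under a weighted distance whose per-feature weights are fitted from normal-operation data. The inverse-variance weighting below is a deliberately crude stand-in for a full metric-learning method, and the data are synthetic, not the Tennessee Eastman benchmark used in the paper.

```python
# Toy weighted-distance fault detector (not the paper's actual method).
import math

def fit_weights(samples):
    """Per-feature inverse-variance weights: a crude 'learned' metric."""
    n, d = len(samples), len(samples[0])
    means = [sum(s[k] for s in samples) / n for k in range(d)]
    var = [sum((s[k] - means[k]) ** 2 for s in samples) / n for k in range(d)]
    return means, [1.0 / (v + 1e-9) for v in var]

def weighted_dist(x, centre, weights):
    return math.sqrt(sum(w * (a - b) ** 2
                         for a, b, w in zip(x, centre, weights)))

# Synthetic 2-feature data: normal operation clusters near (0, 0).
normal = [(0.1, -0.2), (-0.1, 0.1), (0.2, 0.0), (0.0, 0.2)]
centre, weights = fit_weights(normal)

def is_fault(x, threshold=3.0):
    """Flag a sample whose weighted distance from normal exceeds the threshold."""
    return weighted_dist(x, centre, weights) > threshold

print(is_fault((0.05, 0.05)), is_fault((5.0, 5.0)))  # False True
```

A real metric-learning approach would fit a full (possibly non-diagonal) metric from labelled instances; the sketch only conveys why a fitted distance separates faulty samples better than a raw Euclidean one.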

  6. Generalized two-dimensional (2D) linear system analysis metrics (GMTF, GDQE) for digital radiography systems including the effect of focal spot, magnification, scatter, and detector characteristics.

    Science.gov (United States)

    Jain, Amit; Kuhls-Gilcrist, Andrew T; Gupta, Sandesh K; Bednarek, Daniel R; Rudin, Stephen

    2010-03-01

    The MTF, NNPS, and DQE are standard linear system metrics used to characterize intrinsic detector performance. To evaluate total system performance for actual clinical conditions, generalized linear system metrics (GMTF, GNNPS and GDQE) that include the effect of the focal spot distribution, scattered radiation, and geometric unsharpness are more meaningful and appropriate. In this study, a two-dimensional (2D) generalized linear system analysis was carried out for a standard flat panel detector (FPD) (194-micron pixel pitch and 600-micron thick CsI) and a newly-developed, high-resolution, micro-angiographic fluoroscope (MAF) (35-micron pixel pitch and 300-micron thick CsI). Realistic clinical parameters and x-ray spectra were used. The 2D detector MTFs were calculated using the new Noise Response method and slanted edge method and 2D focal spot distribution measurements were done using a pin-hole assembly. The scatter fraction, generated for a uniform head equivalent phantom, was measured and the scatter MTF was simulated with a theoretical model. Different magnifications and scatter fractions were used to estimate the 2D GMTF, GNNPS and GDQE for both detectors. Results show spatial non-isotropy for the 2D generalized metrics which provide a quantitative description of the performance of the complete imaging system for both detectors. This generalized analysis demonstrated that the MAF and FPD have similar capabilities at lower spatial frequencies, but that the MAF has superior performance over the FPD at higher frequencies even when considering focal spot blurring and scatter. This 2D generalized performance analysis is a valuable tool to evaluate total system capabilities and to enable optimized design for specific imaging tasks.

  7. Field installation versus local integration of photovoltaic systems and their effect on energy evaluation metrics

    International Nuclear Information System (INIS)

    Halasah, Suleiman A.; Pearlmutter, David; Feuermann, Daniel

    2013-01-01

    In this study we employ Life-Cycle Assessment to evaluate the energy-related impacts of photovoltaic systems at different scales of integration, in an arid region with especially high solar irradiation. Based on the electrical output and embodied energy of a selection of fixed and tracking systems, including concentrator photovoltaic (CPV) and varying cell technologies, we calculate a number of energy evaluation metrics, including the energy payback time (EPBT), energy return factor (ERF), and life-cycle CO2 emissions offset per unit aperture and land area. Studying these metrics in the context of a regionally limited setting, it was found that utilizing existing infrastructure such as building roofs and shade structures does significantly reduce the embodied energy requirements (by 20-40%) and in turn the EPBT of flat-plate PV systems, due to the avoidance of energy-intensive balance-of-system (BOS) components like foundations. Still, high-efficiency CPV field installations were found to yield the shortest EPBT, the highest ERF and the largest life-cycle CO2 offsets, under the condition that land availability is not a limitation. A greater life-cycle energy return and carbon offset per unit land area is yielded by locally-integrated non-concentrating systems, despite their lower efficiency per unit module area. Highlights:
    ► We evaluate life-cycle energy impacts of PV systems at different scales.
    ► We calculate the energy payback time, return factor and CO2 emissions offset.
    ► Utilizing existing structures significantly improves metrics of flat-plate PV.
    ► High-efficiency CPV installations yield the best return and offset per aperture area.
    ► Locally-integrated flat-plate systems yield the best return and offset per land area.

  8. Integrating Robot Task Planning into Off-Line Programming Systems

    DEFF Research Database (Denmark)

    Sun, Hongyan; Kroszynski, Uri

    1988-01-01

    The addition of robot task planning in off-line programming systems aims at improving the capability of current state-of-the-art commercially available off-line programming systems, by integrating modeling, task planning, programming and simulation together under one platform. This article proposes a system architecture for integrated robot task planning. It identifies and describes the components considered necessary for implementation. The focus is on the functionality of these elements as well as on the information flow. A pilot implementation of such an integrated system architecture for a robot assembly task is discussed.

  9. Overall Environmental Equipment Effectiveness as a Metric of a Lean and Green Manufacturing System

    Directory of Open Access Journals (Sweden)

    Rosario Domingo

    2015-07-01

    Full Text Available This paper presents a new metric for describing the sustainability improvements achieved, relative to the company’s initial situation, after implementing a lean and green manufacturing system. The final value of this metric is identified as the Overall Environmental Equipment Effectiveness (OEEE), which is used to analyze the evolution of the Overall Equipment Effectiveness (OEE) and sustainability together between two identified states, and references the production steps both globally and individually. The OEE is a known measure of equipment utilization, which includes the availability, quality and performance of each production step. In addition to these factors, the OEEE incorporates the concept of sustainability, based on the calculated environmental impact of the complete product life cycle. Action research based on the different manufacturing processes of a tube fabrication company is conducted to assess the potential impact of this new indicator. The case study demonstrates the compatibility between green and lean manufacturing, using a common metric. The OEEE allows sustainability to be integrated into business decisions, and compares the environmental impact of two states by identifying the improvements undertaken within the company’s processes.
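
    The record does not reproduce the aggregation formula, so the sketch below assumes the common multiplicative OEE form (availability × performance × quality) extended by a hypothetical 0–1 environmental factor; the paper's exact OEEE formulation may differ:

```python
def oee(availability, performance, quality):
    """Classical multiplicative OEE for one production step."""
    return availability * performance * quality

def oeee(availability, performance, quality, environmental):
    """OEE extended by a hypothetical 0-1 environmental (life-cycle) factor;
    the paper's exact aggregation is not reproduced here."""
    return oee(availability, performance, quality) * environmental

# Two identified states of the same production step, before and after
# lean-and-green improvements (numbers made up).
before = oeee(0.90, 0.95, 0.98, 0.70)
after  = oeee(0.92, 0.95, 0.99, 0.85)
```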

  10. MetrIntSimil—An Accurate and Robust Metric for Comparison of Similarity in Intelligence of Any Number of Cooperative Multiagent Systems

    Directory of Open Access Journals (Sweden)

    Laszlo Barna Iantovics

    2018-02-01

    Full Text Available Intelligent cooperative multiagent systems are applied for solving a large range of real-life problems, including in domains like biology and healthcare. There are very few metrics able to make an effective measure of the machine intelligence quotient. The most important drawbacks of the metrics presented in the scientific literature are their limited universality, accuracy, and robustness. In this paper, we propose a novel universal metric called MetrIntSimil capable of making an accurate and robust symmetric comparison of the similarity in intelligence of any number of cooperative multiagent systems specialized in difficult problem solving. Universality is an important necessary property, given the large variety of designed intelligent systems. MetrIntSimil makes a comparison by taking into consideration the variability in intelligence in the problem solving of the compared cooperative multiagent systems, and allows a classification of the cooperative multiagent systems based on their similarity in intelligence. A cooperative multiagent system has variability in its problem solving intelligence, and can manifest lower or higher intelligence in different problem solving tasks. Multiple cooperative multiagent systems with similar intelligence can be included in the same class. For the evaluation of the proposed metric, we conducted a case study on several cooperative multiagent systems composed of simple computing agents applied to solving the Symmetric Travelling Salesman Problem (STSP), a class of NP-hard problems. The STSP is the problem of finding the shortest Hamiltonian cycle/tour in a weighted undirected graph that has no loops or multiple edges; the distance between two cities is the same in each direction. Two classes of similar intelligence, denoted IntClassA and IntClassB, were identified.
The experimental results show that the agent belonging to the IntClassA intelligence class is less
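
    The STSP benchmark used in the case study is easy to state in code; the brute-force solver below (with a made-up four-city instance) is only feasible for tiny inputs, which is precisely why STSP serves as a demanding NP-hard test for the agents:

```python
import itertools

def tour_length(dist, tour):
    """Length of a closed tour; dist is symmetric: dist[i][j] == dist[j][i]."""
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def brute_force_stsp(dist):
    """Exact shortest Hamiltonian cycle by enumerating all (n-1)! tours."""
    n = len(dist)
    best = min(itertools.permutations(range(1, n)),
               key=lambda p: tour_length(dist, (0,) + p))
    return (0,) + best, tour_length(dist, (0,) + best)

# Made-up symmetric 4-city instance.
d = [[0, 2, 9, 10],
     [2, 0, 6, 4],
     [9, 6, 0, 3],
     [10, 4, 3, 0]]
tour, length = brute_force_stsp(d)   # shortest tour has length 18
```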

  11. Realization of parking task based on affine system modeling

    International Nuclear Information System (INIS)

    Kim, Young Woo; Narikiyo, Tatsuo

    2007-01-01

    This paper presents a motion control system for an unmanned vehicle, in which a parallel parking task is realized based on self-organizing affine system modeling and a quadratic-programming-based robust controller. Because of the non-linearity of the vehicle system and the complexity of the task, the control objective cannot always be realized with a single algorithm or control mode. This paper presents a hybrid model for the parallel parking task in which seven modes describing sub-tasks constitute the entire model.

  12. A Framework for the Cognitive Task Analysis in Systems Design

    DEFF Research Database (Denmark)

    Rasmussen, Jens

    The present rapid development of advanced information technology and its use for support of operators of complex technical systems are changing the content of task analysis towards the analysis of mental activities in decision making. Automation removes the humans from routine tasks, and operators are left with disturbance control and critical diagnostic tasks, for which computers are suitable for support, if it is possible to match the computer strategies and interface formats dynamically to the requirements of the current task by means of an analysis of the cognitive task.

  13. Real-time multi-task operators support system

    International Nuclear Information System (INIS)

    Wang He; Peng Minjun; Wang Hao; Cheng Shouyu

    2005-01-01

    The development in computer software and hardware technology and information processing, as well as the accumulated design experience and feedback from Nuclear Power Plant (NPP) operation, created a good opportunity to develop an integrated operator support system. The Real-time Multi-task Operator Support System (RMOSS) has been built to support the operator's decision making process during normal and abnormal operations. RMOSS consists of five subtasks: the Data Collection and Validation Task (DCVT), Operation Monitoring Task (OMT), Fault Diagnostic Task (FDT), Operation Guideline Task (OGT) and Human Machine Interface Task (HMIT). RMOSS uses a rule-based expert system and an Artificial Neural Network (ANN). The rule-based expert system is used to identify predefined events in static conditions and track the operation guideline through data processing. In dynamic conditions, a Back-Propagation Neural Network trained with a genetic algorithm is adopted for fault diagnosis. The embedded real-time operating system VxWorks and its integrated environment Tornado II are used for RMOSS software cross-development, VxGUI is used to design the HMI, and all task programs are written in C. The task tests and function evaluation of RMOSS have been done in a real-time full scope simulator. Evaluation results show that each task of RMOSS is capable of accomplishing its functions. (authors)

  14. Software Architecture Coupling Metric for Assessing Operational Responsiveness of Trading Systems

    Directory of Open Access Journals (Sweden)

    Claudiu VINTE

    2012-01-01

    Full Text Available The empirical observation that motivates our research relies on the difficulty of assessing the performance of a trading architecture beyond a few synthetic indicators like response time, system latency, availability or volume capacity. Trading systems involve complex software architectures of distributed resources. However, in the context of a large brokerage firm, which offers global coverage from both market and client perspectives, the term distributed gains a critical significance. Offering a low-latency ordering system by today's standards is relatively easily achievable, but integrating it in a flexible manner within the broader information system architecture of a broker/dealer requires operational aspects to be factored in. We propose a metric for measuring the coupling level within software architecture, and employ it to identify architectural designs that can offer a higher level of operational responsiveness, which ultimately would raise the overall real-world performance of a trading system.
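
    The article's coupling metric is not reproduced in this record; as a rough illustration of the idea, the sketch below scores coupling as the density of directed dependencies among components (module names hypothetical):

```python
def coupling_level(components, dependencies):
    """Fraction of possible directed dependencies that are present:
    0 = fully decoupled, 1 = every component depends on every other."""
    n = len(components)
    if n < 2:
        return 0.0
    return len(set(dependencies)) / (n * (n - 1))

# Hypothetical trading-system modules and their directed dependencies.
modules = ["order_entry", "matching", "risk", "settlement"]
deps = [("order_entry", "risk"), ("order_entry", "matching"),
        ("matching", "settlement"), ("risk", "matching")]
level = coupling_level(modules, deps)   # 4 of 12 possible edges -> 1/3
```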

  15. Experiment with expert system guidance of an engineering analysis task

    International Nuclear Information System (INIS)

    Ransom, V.H.; Fink, R.K.; Callow, R.A.

    1986-01-01

    An experiment is being conducted in which expert systems are used to guide the performance of an engineering analysis task. The task chosen for experimentation is the application of a large thermal hydraulic transient simulation computer code. The expectation from this work is that the expert system will result in an improved analytical result with a reduction in the amount of human labor and expertise required. The code-associated functions of model formulation, data input, code execution, and analysis of the computed output have all been identified as candidate tasks that could benefit from the use of expert systems. Expert system modules have been built for the model building and data input tasks. Initial results include the observation that human expectations of an intelligent environment rapidly escalate, and that structured or stylized tasks which are tolerated in the unaided system become frustrating within the intelligent environment.

  16. Maintaining a Distributed File System by Collection and Analysis of Metrics

    Science.gov (United States)

    Bromberg, Daniel

    1997-01-01

    AFS (originally, the Andrew File System) is a widely-deployed distributed file system product used by companies, universities, and laboratories world-wide. However, it is not trivial to operate: running an AFS cell is a formidable task. It requires a team of dedicated and experienced system administrators who must manage a user base numbering in the thousands, rather than the smaller range of 10 to 500 faced by the typical system administrator.

  17. On the use of LDA performance as a metric of feature extraction methods for a P300 BCI classification task

    International Nuclear Information System (INIS)

    Gareis, Iván; Atum, Yanina; Gentiletti, Gerardo; Acevedo, Rubén; Bañuelos, Verónica Medina; Rufiner, Leonardo

    2011-01-01

    Brain computer interfaces (BCIs) translate brain activity into computer commands. To enhance the performance of a BCI, it is necessary to improve the feature extraction techniques being applied to decode the users' intentions. Objective comparison methods are needed to analyze different feature extraction techniques. One possibility is to use the classifier performance as a comparative measure. In this work the effect of several variables that affect the behaviour of linear discriminant analysis (LDA) has been studied when used to distinguish between electroencephalographic signals with and without the presence of event related potentials (ERPs). The error rate (ER) and the area under the receiver operating characteristic curve (AUC) were used as performance estimators of LDA. The results show that the number of characteristics, the degree of balance of the training patterns set and the number of averaged trials affect the classifier's performance and therefore, must be considered in the design of the integrated system.
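
    The two performance estimators named above have simple sample-based forms; the sketch below computes the AUC as the Mann–Whitney pairwise-comparison statistic and the ER as the misclassification fraction, with made-up discriminant scores:

```python
def auc(scores_pos, scores_neg):
    """AUC via the Mann-Whitney statistic: the probability that a randomly
    chosen ERP trial scores higher than a randomly chosen non-ERP trial."""
    wins = sum((p > n) + 0.5 * (p == n) for p in scores_pos for n in scores_neg)
    return wins / (len(scores_pos) * len(scores_neg))

def error_rate(labels, predictions):
    """ER: fraction of misclassified trials."""
    return sum(l != p for l, p in zip(labels, predictions)) / len(labels)

# Made-up LDA discriminant scores for trials with and without an ERP.
erp    = [1.2, 0.8, 2.1, 1.5]
no_erp = [0.3, 1.0, -0.4, 0.1]
score  = auc(erp, no_erp)   # 15 of 16 pairs ordered correctly -> 0.9375
```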

  18. The software product assurance metrics study: JPL's software systems quality and productivity

    Science.gov (United States)

    Bush, Marilyn W.

    1989-01-01

    The findings are reported of the Jet Propulsion Laboratory (JPL)/Software Product Assurance (SPA) Metrics Study, conducted as part of a larger JPL effort to improve software quality and productivity. Until recently, no comprehensive data had been assembled on how JPL manages and develops software-intensive systems. The first objective was to collect data on software development from as many projects and for as many years as possible. Results from five projects are discussed. These results reflect 15 years of JPL software development, representing over 100 data points (systems and subsystems), over a third of a billion dollars, over four million lines of code and 28,000 person months. Analysis of this data provides a benchmark for gauging the effectiveness of past, present and future software development work. In addition, the study is meant to encourage projects to record existing metrics data and to gather future data. The SPA long term goal is to integrate the collection of historical data and ongoing project data with future project estimations.

  19. Moment-based metrics for global sensitivity analysis of hydrological systems

    Directory of Open Access Journals (Sweden)

    A. Dell'Oca

    2017-12-01

    Full Text Available We propose new metrics to assist global sensitivity analysis, GSA, of hydrological and Earth systems. Our approach allows assessing the impact of uncertain parameters on main features of the probability density function, pdf, of a target model output, y. These include the expected value of y, the spread around the mean and the degree of symmetry and tailedness of the pdf of y. Since reliable assessment of higher-order statistical moments can be computationally demanding, we couple our GSA approach with a surrogate model, approximating the full model response at a reduced computational cost. Here, we consider the generalized polynomial chaos expansion (gPCE), other model reduction techniques being fully compatible with our theoretical framework. We demonstrate our approach through three test cases, including an analytical benchmark, a simplified scenario mimicking pumping in a coastal aquifer and a laboratory-scale conservative transport experiment. Our results allow ascertaining which parameters can impact some moments of the model output pdf while being uninfluential to others. We also investigate the error associated with the evaluation of our sensitivity metrics by replacing the original system model through a gPCE. Our results indicate that the construction of a surrogate model with increasing level of accuracy might be required depending on the statistical moment considered in the GSA. The approach is fully compatible with (and can assist the development of) analysis techniques employed in the context of reduction of model complexity, model calibration, design of experiment, uncertainty quantification and risk assessment.
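
    As a toy version of the first-moment case, the sketch below estimates by plain Monte Carlo how much fixing one parameter shifts the output mean, using the kind of analytical benchmark mentioned above (y = 5·x0 + x1; the paper's actual metrics and gPCE surrogate are not reproduced):

```python
import random
import statistics

def moment_sensitivity(f, sampler, i, n_outer=200, n_inner=200, seed=1):
    """Monte Carlo estimate of a first-moment sensitivity index: the average
    absolute shift of the output mean when parameter i is fixed, normalised
    by the unconditional mean. (The metrics proposed above also cover
    variance, skewness and kurtosis; this sketch shows only the mean.)"""
    rng = random.Random(seed)
    base = statistics.mean(f(sampler(rng)) for _ in range(n_outer * n_inner))
    shift = 0.0
    for _ in range(n_outer):
        fixed = sampler(rng)[i]           # condition on one sampled value of x_i
        cond = statistics.mean(
            f([fixed if j == i else v for j, v in enumerate(sampler(rng))])
            for _ in range(n_inner))
        shift += abs(base - cond)
    return shift / (n_outer * abs(base))

# Analytical benchmark: y = 5*x0 + x1 with x0, x1 ~ U(1, 2); x0 dominates.
f = lambda x: 5 * x[0] + x[1]
u = lambda rng: [rng.uniform(1, 2), rng.uniform(1, 2)]
s0 = moment_sensitivity(f, u, 0)
s1 = moment_sensitivity(f, u, 1)   # s0 comes out several times larger than s1
```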

  20. Techniques and Methods to Improve the Audit Process of the Distributed Informatics Systems Based on Metric System

    Directory of Open Access Journals (Sweden)

    Marius POPA

    2011-01-01

    Full Text Available The paper presents how an assessment system is implemented to evaluate the IT&C audit process quality. Theoretical and practical issues are presented, together with a brief presentation of the metrics and indicators developed in previous research. The implementation process of an indicator system is highlighted and linked to the specifications stated in international standards regarding the measurement process. The effects of an assessment system on the IT&C audit process quality are also emphasized, to demonstrate the importance of such an assessment system. Improving the audit process quality is an iterative process consisting of repetitive improvements based on objective measures established on analytical models of the indicators.

  1. Task Management in the New ATLAS Production System

    CERN Document Server

    De, K; The ATLAS collaboration; Klimentov, A; Potekhin, M; Vaniachine, A

    2013-01-01

    The ATLAS Production System is the top level workflow manager which translates physicists' needs for production level processing into actual workflows executed across about a hundred processing sites used globally by ATLAS. As the production workload increased in volume and complexity in recent years (the ATLAS production tasks count is above one million, with each task containing hundreds or thousands of jobs) there is a need to upgrade the Production System to meet the challenging requirements of the next LHC run while minimizing the operating costs. Providing a front-end and a management layer for petascale data processing and analysis, the new Production System contains generic subsystems that can be used in a wider range of applications. The main subsystems are the Database Engine for Tasks (DEFT) and the Job Execution and Definition Interface (JEDI). Based on users' requests, the DEFT subsystem manages inter-dependent groups of tasks (Meta-Tasks) and generates corresponding data processing workflows. Th...

  2. Task Management in the New ATLAS Production System

    CERN Document Server

    De, K; The ATLAS collaboration; Klimentov, A; Potekhin, M; Vaniachine, A

    2014-01-01

    The ATLAS Production System is the top level workflow manager which translates physicists' needs for production level processing into actual workflows executed across about a hundred processing sites used globally by ATLAS. As the production workload increased in volume and complexity in recent years (the ATLAS production tasks count is above one million, with each task containing hundreds or thousands of jobs) there is a need to upgrade the Production System to meet the challenging requirements of the next LHC run while minimizing the operating costs. Providing a front-end and a management layer for petascale data processing and analysis, the new Production System contains generic subsystems that can be used in a wider range of applications. The main subsystems are the Database Engine for Tasks (DEFT) and the Job Execution and Definition Interface (JEDI). Based on users' requests, the DEFT subsystem manages inter-dependent groups of tasks (Meta-Tasks) and generates corresponding data processing workflows. Th...

  3. Initial development of an automated task analysis profiling system

    International Nuclear Information System (INIS)

    Jorgensen, C.C.

    1984-01-01

    A program for automated task analysis is described. Called TAPS (task analysis profiling system), the program accepts normal English prose and outputs skills, knowledges, attitudes, and abilities (SKAAs) along with specific guidance and recommended ability measurement tests for nuclear power plant operators. A new method for defining SKAAs is presented along with a sample program output

  4. Systems Engineering Design Via Experimental Operation Research: Complex Organizational Metric for Programmatic Risk Environments (COMPRE)

    Science.gov (United States)

    Mog, Robert A.

    1999-01-01

    Unique and innovative graph theory, neural network, organizational modeling, and genetic algorithms are applied to the design and evolution of programmatic and organizational architectures. Graph theory representations of programs and organizations increase modeling capabilities and flexibility, while illuminating preferable programmatic/organizational design features. Treating programs and organizations as neural networks results in better system synthesis, and more robust data modeling. Organizational modeling using covariance structures enhances the determination of organizational risk factors. Genetic algorithms improve programmatic evolution characteristics, while shedding light on rulebase requirements for achieving specified technological readiness levels, given budget and schedule resources. This program of research improves the robustness and verifiability of systems synthesis tools, including the Complex Organizational Metric for Programmatic Risk Environments (COMPRE).

  5. Task-Specific Optimization of Mammographic Systems

    National Research Council Canada - National Science Library

    Saunders, Robert

    2005-01-01

    .... This model was verified by a human observer performance experiment. The next objective explored the physical properties of a digital mammographic system, including resolution, noise, efficiency, and lag...

  6. Large solar energy systems within IEA task 14

    NARCIS (Netherlands)

    Geus, A.C. de; Isakson, P.; Bokhoven, T.P.; Vanoli, K.; Tepe, R.

    1996-01-01

    Within IEA Task 14 (Advanced Solar Systems) a working group was established dealing with large advanced solar energy systems (the Large Systems Working group). The goal of this working group was to generate a common base of experiences for the design and construction of advanced large solar systems.

  7. Modeling of Task Planning for Multirobot System Using Reputation Mechanism

    Directory of Open Access Journals (Sweden)

    Zhiguo Shi

    2014-01-01

    Full Text Available Modeling of task planning for a multirobot system comprises two parts: task decomposition and task allocation. In the task decomposition part, the conditions and processes of decomposition are elaborated. In the task allocation part, the collaboration strategy, the framework of the reputation mechanism, and three types of reputation are defined in detail: robot individual reputation, robot group reputation, and robot direct reputation. A time calibration function and a group calibration function are designed to improve the effectiveness of the proposed method, and are proved to have the properties of time attenuation, dependence on historical experience, and reward for newly joined robots. Tasks are preferentially assigned to the robot with the highest overall reputation, which helps to increase the success rate of task execution, thereby reducing the time spent on task recovery and redistribution. Player/Stage is used as the simulation platform, and three biped robots serve as the experimental apparatus. The experimental results of task planning are compared with those of other allocation methods. Simulation and experiment results illustrate the effectiveness of the proposed method for multirobot collaboration systems.
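
    One simple way to obtain the time-attenuation and newly-joined-robot-reward properties described above is exponential decay with a neutral prior; the sketch below is illustrative only and does not reproduce the paper's calibration functions:

```python
import math

def decayed_reputation(events, now, half_life=10.0):
    """Reputation with time attenuation: each past (time, score) event is
    weighted by exponential decay so that recent behaviour dominates."""
    weights = [math.exp(-math.log(2) * (now - t) / half_life) for t, _ in events]
    if not weights:
        return 0.5   # neutral prior: gives newly joined robots a fair start
    return sum(w * s for w, (_, s) in zip(weights, events)) / sum(weights)

def assign_task(histories, now):
    """Assign the task to the robot with the highest overall reputation."""
    return max(histories, key=lambda r: decayed_reputation(histories[r], now))

robots = {
    "r1": [(0, 1.0), (5, 1.0), (18, 0.0)],   # good long ago, failed recently
    "r2": [(15, 1.0), (19, 1.0)],            # consistently good lately
}
chosen = assign_task(robots, now=20)   # "r2"
```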

  8. Testing Quality and Metrics for the LHC Magnet Powering System throughout Past and Future Commissioning

    CERN Document Server

    Anderson, D; Charifoulline, Z; Dragu, M; Fuchsberger, K; Garnier, JC; Gorzawski, AA; Koza, M; Krol, K; Rowan, S; Stamos, K; Zerlauth, M

    2014-01-01

    The LHC magnet powering system is composed of thousands of individual components to assure a safe operation when operating with stored energies as high as 10GJ in the superconducting LHC magnets. Each of these components has to be thoroughly commissioned following interventions and machine shutdown periods to assure their protection function in case of powering failures. As well as having dependable tracking of test executions it is vital that the executed commissioning steps and applied analysis criteria adequately represent the operational state of each component. The Accelerator Testing (AccTesting) framework in combination with a domain specific analysis language provides the means to quantify and improve the quality of analysis for future campaigns. Dedicated tools were developed to analyse in detail the reasons for failures and success of commissioning steps in past campaigns and to compare the results with newly developed quality metrics. Observed shortcomings and discrepancies are used to propose addi...

  9. Gap-metric-based robustness analysis of nonlinear systems with full and partial feedback linearisation

    Science.gov (United States)

    Al-Gburi, A.; Freeman, C. T.; French, M. C.

    2018-06-01

    This paper uses gap metric analysis to derive robustness and performance margins for feedback linearising controllers. Distinct from previous robustness analysis, it incorporates the case of output unstructured uncertainties, and is shown to yield general stability conditions which can be applied to both stable and unstable plants. It then expands on existing feedback linearising control schemes by introducing a more general robust feedback linearising control design which classifies the system nonlinearity into stable and unstable components and cancels only the unstable plant nonlinearities. This is done in order to preserve the stabilising action of the inherently stabilising nonlinearities. Robustness and performance margins are derived for this control scheme, and are expressed in terms of bounds on the plant nonlinearities and the accuracy of the cancellation of the unstable plant nonlinearity by the controller. Case studies then confirm reduced conservatism compared with standard methods.

  10. Task Delegation Based Access Control Models for Workflow Systems

    Science.gov (United States)

    Gaaloul, Khaled; Charoy, François

    e-Government organisations are facilitated and conducted using workflow management systems. Role-based access control (RBAC) is recognised as an efficient access control model for large organisations. The application of RBAC in workflow systems cannot, however, grant permissions to users dynamically while business processes are being executed. We currently observe a move away from predefined strict workflow modelling towards approaches supporting flexibility on the organisational level. One specific approach is that of task delegation. Task delegation is a mechanism that supports organisational flexibility, and ensures delegation of authority in access control systems. In this paper, we propose a Task-oriented Access Control (TAC) model based on RBAC to address these requirements. We aim to reason about task from organisational perspectives and resources perspectives to analyse and specify authorisation constraints. Moreover, we present a fine grained access control protocol to support delegation based on the TAC model.
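
    A minimal sketch of the idea (role permissions plus run-time delegation of authority), with hypothetical names and structures not taken from the paper:

```python
# Static role permissions (RBAC) and a run-time delegation list (TAC-style).
role_permissions = {
    "clerk":   {"register_claim"},
    "manager": {"register_claim", "approve_claim"},
}
user_roles = {"alice": ["manager"], "bob": ["clerk"]}
delegations = []   # (delegator, delegate, task) tuples granted at run time

def can_execute(user, task):
    """A user may execute a task via a role permission or via a delegation."""
    by_role = any(task in role_permissions.get(r, set())
                  for r in user_roles.get(user, []))
    return by_role or any(d == user and t == task for _, d, t in delegations)

def delegate(delegator, delegate_to, task):
    """Delegation of authority: only users who can execute a task may delegate it."""
    if not can_execute(delegator, task):
        return False
    delegations.append((delegator, delegate_to, task))
    return True

assert not can_execute("bob", "approve_claim")
delegate("alice", "bob", "approve_claim")   # alice temporarily delegates to bob
assert can_execute("bob", "approve_claim")
```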

  11. Task management in the new ATLAS production system

    International Nuclear Information System (INIS)

    De, K; Golubkov, D; Klimentov, A; Potekhin, M; Vaniachine, A

    2014-01-01

    This document describes the design of the new Production System of the ATLAS experiment at the LHC [1]. The Production System is the top level workflow manager which translates physicists' needs for production level processing and analysis into actual workflows executed across over a hundred Grid sites used globally by ATLAS. As the production workload increased in volume and complexity in recent years (the ATLAS production tasks count is above one million, with each task containing hundreds or thousands of jobs) there is a need to upgrade the Production System to meet the challenging requirements of the next LHC run while minimizing the operating costs. In the new design, the main subsystems are the Database Engine for Tasks (DEFT) and the Job Execution and Definition Interface (JEDI). Based on users' requests, DEFT manages inter-dependent groups of tasks (Meta-Tasks) and generates corresponding data processing workflows. The JEDI component then dynamically translates the task definitions from DEFT into actual workload jobs executed in the PanDA Workload Management System [2]. We present the requirements, design parameters, basics of the object model and concrete solutions utilized in building the new Production System and its components.
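
    The inter-dependent task groups (Meta-Tasks) managed by DEFT form a dependency graph whose execution order is a topological sort; the sketch below uses a hypothetical task chain, not ATLAS's actual schema:

```python
from graphlib import TopologicalSorter

# Hypothetical Meta-Task: a DEFT-style group of inter-dependent tasks; JEDI
# would expand each task into workload jobs. Task names are illustrative only.
meta_task = {
    "evgen": set(),           # event generation has no prerequisites
    "simul": {"evgen"},       # simulation consumes generated events
    "reco":  {"simul"},       # reconstruction follows simulation
    "merge": {"reco"},        # merging and derivation both need reconstruction
    "deriv": {"reco"},
}
order = list(TopologicalSorter(meta_task).static_order())
```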

  12. Semantic metrics

    OpenAIRE

    Hu, Bo; Kalfoglou, Yannis; Dupplaw, David; Alani, Harith; Lewis, Paul; Shadbolt, Nigel

    2006-01-01

    In the context of the Semantic Web, many ontology-related operations, e.g. ontology ranking, segmentation, alignment, articulation, reuse, evaluation, can be boiled down to one fundamental operation: computing the similarity and/or dissimilarity among ontological entities, and in some cases among ontologies themselves. In this paper, we review standard metrics for computing distance measures and we propose a series of semantic metrics. We give a formal account of semantic metrics drawn from a...

  13. Deliverable No. 1.3: Sustainability metrics for the EU food system: a review across economic, environmental and social considerations

    NARCIS (Netherlands)

    Zurek, Monika; Leip, Adrian; Kuijsten, Anneleen; Wijnands, Jo; Terluin, Ida; Shutes, Lindsay; Hebinck, Aniek; Zimmermann, Andrea; Götz, Christian; Hornborg, Sara; Zanten, van Hannah; Ziegler, Friederike; Havlik, Petr; Garrone, Maria; Geleijnse, Marianne; Kuiper, Marijke; Turrini, Aida; Dofkova, Marcela; Trolle, Ellen; Mistura, Lorenza; Dubuisson, Carine; Veer, van 't Pieter; Achterbosch, Thom; Ingram, John; Brem-Wilson, Joshua; Franklin, Alex; Fried, Jana; Guzman Rodriguez, Paola; Owen, Luke; Saxena, Lopa; Trenchard, Liz; Wright, Julia

    2017-01-01

    One of the main objectives of the SUSFANS project is to develop a set of concepts and tools to help policy and decision makers across Europe make sense of the outcomes and trends of the EU food system. This paper proposes a set of metrics for assessing the performance of the EU food system in

  14. A framework for cognitive task analysis in systems design

    International Nuclear Information System (INIS)

    Rasmussen, J.

    1985-08-01

    The present rapid development of advanced information technology and its use for support of operators of complex technical systems are changing the content of task analysis towards the analysis of mental activities in decision making. Automation removes the humans from routine tasks, and operators are left with disturbance control and critical diagnostic tasks, for which computers are suitable for support, if it is possible to match the computer strategies and interface formats dynamically to the requirements of the current task by means of an analysis of the cognitive task. Such a cognitive task analysis will not aim at a description of the information processes suited for particular control situations. It will rather aim at an analysis in order to identify the requirements to be considered along various dimensions of the decision tasks, in order to give the user - i.e. a decision maker - the freedom to adapt his performance to system requirements in a way which matches his process resources and subjective preferences. To serve this purpose, a number of analyses at various levels are needed to relate the control requirements of the system to the information processes and to the processing resources offered by computers and humans. The paper discusses the cognitive task analysis in terms of the following domains: The problem domain, which is a representation of the functional properties of the system giving a consistent framework for identification of the control requirements of the system; the decision sequences required for typical situations; the mental strategies and heuristics which are effective and acceptable for the different decision functions; and the cognitive control mechanisms used, depending upon the level of skill which can/will be applied. Finally, the end-users' criteria for choice of mental strategies in the actual situation are considered, and the need for development of criteria for judging the ultimate user acceptance of computer support is

  15. Resilience Attributes of Social-Ecological Systems: Framing Metrics for Management

    Directory of Open Access Journals (Sweden)

    David A. Kerner

    2014-12-01

    Full Text Available If resilience theory is to be of practical value for policy makers and resource managers, the theory must be translated into sensible decision-support tools. We present herein a set of resilience attributes, developed to characterize human-managed systems, that helps system stakeholders to make practical use of resilience concepts in tangible applications. In order to build and maintain resilience, these stakeholders must be able to understand what qualities or attributes enhance—or detract from—a system’s resilience. We describe standardized resilience terms that can be incorporated into resource management plans and decision-support tools to derive metrics that help managers assess the current resilience status of their systems, make rational resource allocation decisions, and track progress toward meeting goals. Our intention is to provide an approachable set of terms for both specialists and non-specialists alike to apply to programs that would benefit from a resilience perspective. These resilience terms can facilitate the modeling of resilience behavior within systems, as well as support those lacking access to sophisticated models. Our goal is to enable policy makers and resource managers to put resilience theory to work in the real world.

  16. Memory systems, processes, and tasks: taxonomic clarification via factor analysis.

    Science.gov (United States)

    Bruss, Peter J; Mitchell, David B

    2009-01-01

    The nature of various memory systems was examined using factor analysis. We reanalyzed data from 11 memory tasks previously reported in Mitchell and Bruss (2003). Four well-defined factors emerged, closely resembling episodic and semantic memory and conceptual and perceptual implicit memory, in line with both memory systems and transfer-appropriate processing accounts. To explore taxonomic issues, we ran separate analyses on the implicit tasks. Using a cross-format manipulation (pictures vs. words), we identified 3 prototypical tasks. Word fragment completion and picture fragment identification tasks were "factor pure," tapping perceptual processes uniquely. Category exemplar generation revealed its conceptual nature, yielding both cross-format priming and a picture superiority effect. In contrast, word stem completion and picture naming were more complex, revealing attributes of both processes.
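Exploratory factor analysis of this kind can be sketched numerically. The following is an illustrative reconstruction, not the authors' analysis: synthetic scores for six hypothetical memory tasks are generated from two latent factors, and the principal-factor method with the Kaiser criterion (retain eigenvalues > 1) recovers the factor count.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic scores for 200 participants on 6 memory tasks:
# tasks 0-2 load on one latent factor, tasks 3-5 on another.
f1 = rng.normal(size=(200, 1))
f2 = rng.normal(size=(200, 1))
noise = 0.5 * rng.normal(size=(200, 6))
scores = np.hstack([f1, f1, f1, f2, f2, f2]) + noise

# Principal-factor extraction: eigendecompose the correlation matrix
# and retain factors with eigenvalue > 1 (Kaiser criterion).
corr = np.corrcoef(scores, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]
n_factors = int(np.sum(eigvals > 1.0))

# Loadings of each task on the retained factors; a "factor pure" task
# would show a large loading on exactly one factor.
loadings = eigvecs[:, :n_factors] * np.sqrt(eigvals[:n_factors])
```

With real task data the rotated loading pattern, not just the factor count, is what distinguishes prototypical from mixed tasks.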

  17. Systems Maintenance Automated Repair Tasks (SMART)

    Science.gov (United States)

    Schuh, Joseph; Mitchell, Brent; Locklear, Louis; Belson, Martin A.; Al-Shihabi, Mary Jo Y.; King, Nadean; Norena, Elkin; Hardin, Derek

    2010-01-01

    SMART is a uniform automated discrepancy analysis and repair-authoring platform that improves technical accuracy and timely delivery of repair procedures for a given discrepancy (see figure a). SMART will minimize data errors, create uniform repair processes, and enhance the existing knowledge base of engineering repair processes. This innovation is the first tool developed that links the hardware specification requirements with the actual repair methods, sequences, and required equipment. SMART is flexibly designed to be usable by multiple engineering groups requiring decision analysis, and by any work authorization and disposition platform (see figure b). The organizational logic creates the link between specification requirements of the hardware, and specific procedures required to repair discrepancies. The first segment in the SMART process uses a decision analysis tree to define all the permutations between component/subcomponent/discrepancy/repair on the hardware. The second segment uses a repair matrix to define what the steps and sequences are for any repair defined in the decision tree. This segment also allows for the selection of specific steps from multivariable steps. SMART will also be able to interface with outside databases and to store information from them to be inserted into the repair-procedure document. Some of the steps will be identified as optional, and would only be used based on the location and the current configuration of the hardware. The output from this analysis would be sent to a work authoring system in the form of a predefined sequence of steps containing required actions, tools, parts, materials, certifications, and specific requirements controlling quality, functional requirements, and limitations.

  18. Entropy as a Metric Generator of Dissipation in Complete Metriplectic Systems

    Directory of Open Access Journals (Sweden)

    Massimo Materassi

    2016-08-01

    Full Text Available This lecture is a short review on the role entropy plays in those classical dissipative systems whose equations of motion may be expressed via a Leibniz Bracket Algebra (LBA). This means that the time derivative of any physical observable f of the system is calculated by putting this f in a “bracket” together with a “special observable” F, referred to as a Leibniz generator of the dynamics. While conservative dynamics is given an LBA formulation in the Hamiltonian framework, so that F is the Hamiltonian H of the system that generates the motion via classical Poisson brackets or quantum commutation brackets, an LBA formulation can be given to classical dissipative dynamics through the Metriplectic Bracket Algebra (MBA): the conservative component of the dynamics is still generated via Poisson algebra by the total energy H, while S, the entropy of the degrees of freedom statistically encoded in friction, generates dissipation via a metric bracket. The motivation for expressing the dynamics through a bracket algebra and a motion-generating function F is to endow the theory of the system at hand with all the powerful machinery of Hamiltonian systems, in terms of symmetries that become evident and readable. A (necessarily partial) overview of the types of systems subject to MBA formulation is presented, and the physical meaning of the quantity S involved in each is discussed. The aim is to review the different MBAs for isolated systems in a synoptic way. At the end of this collection of examples, the fact that dissipative dynamics may be constructed also in the absence of friction with microscopic degrees of freedom is stressed. This reasoning is a hint to introduce dissipation at a more fundamental level.

  19. Automated personnel data base system specifications, Task V. Final report

    International Nuclear Information System (INIS)

    Bartley, H.J.; Bocast, A.K.; Deppner, F.O.; Harrison, O.J.; Kraas, I.W.

    1978-11-01

    The full title of this study is 'Development of Qualification Requirements, Training Programs, Career Plans, and Methodologies for Effective Management and Training of Inspection and Enforcement Personnel.' Task V required the development of an automated personnel data base system for NRC/IE. This system is identified as the NRC/IE Personnel, Assignment, Qualifications, and Training System (PAQTS). This Task V report provides the documentation for PAQTS including the Functional Requirements Document (FRD), the Data Requirements Document (DRD), the Hardware and Software Capabilities Assessment, and the Detailed Implementation Schedule. Specific recommendations to facilitate implementation of PAQTS are also included

  20. An intention driven hand functions task training robotic system.

    Science.gov (United States)

    Tong, K Y; Ho, S K; Pang, P K; Hu, X L; Tam, W K; Fung, K L; Wei, X J; Chen, P N; Chen, M

    2010-01-01

    A novel design of a hand functions task training robotic system was developed for stroke rehabilitation. It detects the intention of hand opening or hand closing from the stroke person using electromyography (EMG) signals measured from the hemiplegic side. This training system consists of an embedded controller and a robotic hand module. Each hand robot has 5 individual finger assemblies capable of driving 2 degrees of freedom (DOFs) of each finger at the same time. Powered by a linear actuator, each finger assembly achieves a 55 degree range of motion (ROM) at the metacarpophalangeal (MCP) joint and a 65 degree ROM at the proximal interphalangeal (PIP) joint. Each finger assembly can also be adjusted to fit different finger lengths. With this task training system, stroke subjects can open and close their impaired hand using their own intention to carry out some daily living tasks.

  1. Czech Verse Processing System KVĚTA: Phonetic and Metrical Components

    Czech Academy of Sciences Publication Activity Database

    Plecháč, Petr

    2016-01-01

    Roč. 7, č. 2 (2016), s. 159-174 ISSN 1337-7892 Institutional support: RVO:68378068 Keywords : Verse Processing * KVĚTA * Czech language * phonetic and metrical annotation Subject RIV: AJ - Letters, Mass-media, Audiovision

  2. Evidence-based Metrics Toolkit for Measuring Safety and Efficiency in Human-Automation Systems

    Data.gov (United States)

    National Aeronautics and Space Administration — APRIL 2016 NOTE: Principal Investigator moved to Rice University in mid-2015. Project continues at Rice with the same title (Evidence-based Metrics Toolkit for...

  3. Predictive analytics tools to adjust and monitor performance metrics for the ATLAS Production System

    CERN Document Server

    Titov, Mikhail; The ATLAS collaboration

    2017-01-01

    Every scientific workflow involves an organizational part whose purpose is to plan the analysis process thoroughly according to a defined schedule, thus keeping work progress efficient. Information such as an estimate of the processing time or the possibility of a system outage (abnormal behaviour) will improve the planning process, help monitor system performance, and predict its next state. The ATLAS Production System is an automated scheduling system that is responsible for central production of Monte-Carlo data, highly specialized production for physics groups, as well as data pre-processing and analysis using such facilities as grid infrastructures, clouds and supercomputers. With its next generation (ProdSys2) the processing rate is around 2M tasks per year, which is more than 365M jobs per year. ProdSys2 evolves to accommodate a growing number of users and new requirements from the ATLAS Collaboration, physics groups and individual users. ATLAS Distributed Computing in its current stat...

  4. Quantifying the Metrics That Characterize Safety Culture of Three Engineered Systems

    International Nuclear Information System (INIS)

    Tucker, Julie; Ernesti, Mary; Tokuhiro, Akira

    2002-01-01

    With potential energy shortages and increasing electricity demand, the nuclear energy option is being reconsidered in the United States. Public opinion will have a considerable voice in policy decisions that will 'road-map' the future of nuclear energy in this country. This report is an extension of the last author's work on the 'safety culture' associated with three engineered systems (automobiles, commercial airplanes, and nuclear power plants) in Japan and the United States. Safety culture, in brief, is defined as a specifically developed culture based on societal and individual interpretations of the balance of real, perceived, and imagined risks versus the benefits drawn from utilizing a given engineered system. The method of analysis is a modified scale analysis, with two fundamental Eigen-metrics, time- (t) and number-scales (N), that describe both engineered systems and human factors. The scale analysis approach is appropriate because human perception of risk, perception of benefit and level of (technological) acceptance are inherently subjective, therefore 'fuzzy' and rarely quantifiable in exact magnitude. Perception of risk, expressed in terms of the psychometric factors 'dread risk' and 'unknown risk', contains both time- and number-scale elements. Various engineering system accidents with fatalities, as reported by the mass media, are characterized by t and N, and are presented in this work using the scale analysis method. We contend that level of acceptance implies a perception of benefit at least two orders of magnitude larger than perception of risk. The 'amplification' influence of mass media is also deduced to be 100- to 1000-fold the actual number of fatalities/serious injuries in a nuclear-related accident. (authors)

  5. NASA Aviation Safety Program Systems Analysis/Program Assessment Metrics Review

    Science.gov (United States)

    Louis, Garrick E.; Anderson, Katherine; Ahmad, Tisan; Bouabid, Ali; Siriwardana, Maya; Guilbaud, Patrick

    2003-01-01

    The goal of this project is to evaluate the metrics and processes used by NASA's Aviation Safety Program in assessing technologies that contribute to NASA's aviation safety goals. There were three objectives for reaching this goal. First, NASA's main objectives for aviation safety were documented and their consistency was checked against the main objectives of the Aviation Safety Program. Next, the metrics used for technology investment by the Program Assessment function of AvSP were evaluated. Finally, other metrics that could be used by the Program Assessment Team (PAT) were identified and evaluated. This investigation revealed that the objectives are in fact consistent across organizational levels at NASA and with the FAA. Some of the major issues discussed in this study that should be further investigated are the removal of the Cost and Return-on-Investment metrics, the lack of metrics to measure the balance of investment and technology, the interdependencies between some of the metric risk driver categories, and the conflict between 'fatal accident rate' and 'accident rate' in the language of the Aviation Safety goal as stated in different sources.

  6. Alarm handling systems and techniques developed to match operator tasks

    Energy Technology Data Exchange (ETDEWEB)

    Bye, A; Moum, B R [Institutt for Energiteknikk, Halden (Norway). OECD Halden Reaktor Projekt

    1997-09-01

    This paper covers alarm handling methods and techniques explored at the Halden Project, and describes current status on the research activities on alarm systems. Alarm systems are often designed by application of a bottom-up strategy, generating alarms at component level. If no structuring of the alarms is applied, this may result in alarm avalanches in major plant disturbances, causing cognitive overload of the operator. An alarm structuring module should be designed using a top-down approach, analysing operator's tasks, plant states, events and disturbances. One of the operator's main tasks during plant disturbances is status identification, including determination of plant status and detection of plant anomalies. The main support of this is provided through the alarm systems, the process formats, the trends and possible diagnosis systems. The alarm system should both physically and conceptually be integrated with all these systems. 9 refs, 5 figs.

  7. Alarm handling systems and techniques developed to match operator tasks

    International Nuclear Information System (INIS)

    Bye, A.; Moum, B.R.

    1997-01-01

    This paper covers alarm handling methods and techniques explored at the Halden Project, and describes current status on the research activities on alarm systems. Alarm systems are often designed by application of a bottom-up strategy, generating alarms at component level. If no structuring of the alarms is applied, this may result in alarm avalanches in major plant disturbances, causing cognitive overload of the operator. An alarm structuring module should be designed using a top-down approach, analysing operator's tasks, plant states, events and disturbances. One of the operator's main tasks during plant disturbances is status identification, including determination of plant status and detection of plant anomalies. The main support of this is provided through the alarm systems, the process formats, the trends and possible diagnosis systems. The alarm system should both physically and conceptually be integrated with all these systems. 9 refs, 5 figs

  8. Assessing the performance of macroinvertebrate metrics in the Challhuaco-Ñireco System (Northern Patagonia, Argentina)

    Directory of Open Access Journals (Sweden)

    Melina Mauad

    2015-09-01

    Full Text Available ABSTRACT Seven sites were examined in the Challhuaco-Ñireco system, located in the reserve of the Nahuel Huapi National Park; part of the catchment, however, is urbanized, with San Carlos de Bariloche (150,000 inhabitants) located in the lower part of the basin. Physico-chemical variables were measured and benthic macroinvertebrates were collected during three consecutive years at seven sites from the headwater to the river outlet. Sites near the source of the river were characterised by Plecoptera, Ephemeroptera, Trichoptera and Diptera, whereas sites close to the river mouth were dominated by Diptera, Oligochaeta and Mollusca. Regarding functional feeding groups, collector-gatherers were dominant at all sites and this pattern was consistent among years. Ordination analysis (RDA) revealed that species assemblage distribution responded to the climatic and topographic gradient (temperature and elevation), but was also associated with variables related to human impact (conductivity, nitrate and phosphate contents). Species assemblages at headwaters were mostly represented by sensitive insects, whereas tolerant taxa such as Tubificidae, Lumbriculidae, Chironomidae and the crustacean Aegla sp. were dominant at urbanised sites. Regarding the macroinvertebrate metrics employed, total richness, EPT taxa, the Shannon diversity index and the Biotic Monitoring Patagonian Stream index proved fairly consistent and revealed different levels of disturbance in the stream, meaning that these metrics are suitable for evaluating the status of Patagonian mountain streams.

  9. A Metrics-Based Approach to Intrusion Detection System Evaluation for Distributed Real-Time Systems

    Science.gov (United States)

    2002-04-01

    A Metrics-Based Approach to Intrusion Detection System Evaluation for Distributed Real-Time Systems. Authors: G. A. Fink, B. L. Chappell, T. G. Turner, and... Keywords: Distributed, Security. Processing and cost requirements are driving future naval combat platforms to use distributed, real-time systems. As these systems grow more complex, the timing requirements do not diminish; indeed, they may become more constrained.

  10. Monitoring User-System Performance in Interactive Retrieval Tasks

    NARCIS (Netherlands)

    Boldareva, L.; de Vries, A.P.; Hiemstra, Djoerd

    Monitoring user-system performance in interactive search is a challenging task. Traditional measures of retrieval evaluation, based on recall and precision, are not of any use in real time, for they require a priori knowledge of relevant documents. This paper shows how a Shannon entropy-based
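The abstract is truncated, but the cited idea of a Shannon entropy-based performance measure is easy to illustrate. A minimal sketch (the distribution and numbers are hypothetical, not from the paper): as relevance feedback sharpens the system's belief over candidate documents, the entropy of that belief drops, giving a real-time progress signal that needs no a priori relevance judgments.

```python
import math

def shannon_entropy(probs):
    """Entropy in bits of a discrete distribution; 0*log(0) taken as 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A retrieval system's belief over four candidate documents:
# maximal uncertainty before feedback, sharper afterwards.
before = [0.25, 0.25, 0.25, 0.25]
after = [0.7, 0.1, 0.1, 0.1]

h_before = shannon_entropy(before)  # uniform over 4 outcomes: 2 bits
h_after = shannon_entropy(after)    # lower: the search is converging
```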

  11. Adaptive metric kernel regression

    DEFF Research Database (Denmark)

    Goutte, Cyril; Larsen, Jan

    2000-01-01

    Kernel smoothing is a widely used non-parametric pattern recognition technique. By nature, it suffers from the curse of dimensionality and is usually difficult to apply to high input dimensions. In this contribution, we propose an algorithm that adapts the input metric used in multivariate...... regression by minimising a cross-validation estimate of the generalisation error. This allows to automatically adjust the importance of different dimensions. The improvement in terms of modelling performance is illustrated on a variable selection task where the adaptive metric kernel clearly outperforms...
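The adaptive-metric idea can be sketched as follows. This is a simplified illustration under stated assumptions, not the authors' algorithm: Nadaraya-Watson kernel regression with a diagonal input metric, where per-dimension scales are chosen by minimising a leave-one-out estimate of the generalisation error (a crude grid search stands in for gradient-based minimisation). An irrelevant input dimension should end up with a small scale.

```python
import numpy as np

def nw_predict(X_train, y_train, X_query, scales):
    """Nadaraya-Watson regression with a diagonal input metric.

    scales[d] weights dimension d; a small scale effectively
    suppresses an irrelevant input, which is the point of
    adapting the metric."""
    Xs = X_train * scales
    Qs = X_query * scales
    d2 = ((Qs[:, None, :] - Xs[None, :, :]) ** 2).sum(-1)
    w = np.exp(-0.5 * d2)
    return (w @ y_train) / w.sum(axis=1)

def loo_error(X, y, scales):
    """Leave-one-out squared error for a given metric."""
    Xs = X * scales
    d2 = ((Xs[:, None, :] - Xs[None, :, :]) ** 2).sum(-1)
    w = np.exp(-0.5 * d2)
    np.fill_diagonal(w, 0.0)  # exclude each point from its own prediction
    pred = (w @ y) / w.sum(axis=1)
    return np.mean((pred - y) ** 2)

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(120, 2))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.normal(size=120)  # dimension 1 is pure noise

# Crude metric adaptation: grid-search per-dimension scales by LOO error.
candidates = [np.array([a, b]) for a in (0.1, 1.0, 5.0) for b in (0.1, 1.0, 5.0)]
best = min(candidates, key=lambda s: loo_error(X, y, s))

preds = nw_predict(X, y, X[:5], best)
```

On this toy problem the selected metric gives the relevant dimension a larger scale than the noise dimension, mirroring the variable-selection behaviour the abstract describes.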

  12. Adaptive Metric Kernel Regression

    DEFF Research Database (Denmark)

    Goutte, Cyril; Larsen, Jan

    1998-01-01

    Kernel smoothing is a widely used nonparametric pattern recognition technique. By nature, it suffers from the curse of dimensionality and is usually difficult to apply to high input dimensions. In this paper, we propose an algorithm that adapts the input metric used in multivariate regression...... by minimising a cross-validation estimate of the generalisation error. This allows one to automatically adjust the importance of different dimensions. The improvement in terms of modelling performance is illustrated on a variable selection task where the adaptive metric kernel clearly outperforms the standard...

  13. Using proliferation risk as a design metric in the development of nuclear systems

    International Nuclear Information System (INIS)

    Beard, C.; Lebouf, R.

    2001-01-01

    The necessity has arisen for newly proposed nuclear systems to be evaluated with regard to their potential to aid proliferation. Thus, a mechanism is needed to introduce nonproliferation as a measure in the design phase of a new nuclear system. To accomplish this, a methodology for quantifying and measuring the proliferation risk of proposed system options is required. Such quantification has its difficulties due to inherent uncertainty, e.g. what is the probability that a quantity of material will be stolen in a given situation? Also, the lack of data on such occurrences makes the task of quantification nearly insurmountable. A systematic approach is necessary to estimate the proliferation risk. Currently, an advanced nuclear power system, the Accelerator Transmutation of Waste (ATW) program has been initiated to develop a system that will concurrently generate electricity while destroying long-lived radioactive isotopes. Therefore, because of the issues noted above, an effort to introduce proliferation risk into the design phase has been started. The purpose of this paper is to review previous work in quantification of proliferation risk in an effort to develop the proper basis for the current work. It should be noted that while proliferation on a national level has been studied extensively, efforts to quantify proliferation risk of individual nuclear systems or processes have been limited. Consequently, the available literature base is relatively sparse. (author)

  14. Using proliferation risk as a design metric in the development of nuclear systems

    Energy Technology Data Exchange (ETDEWEB)

    Beard, C.; Lebouf, R. [Texas Univ., Austin, TX (United States). Nuclear Engineering Teaching Lab.

    2001-07-01

    The necessity has arisen for newly proposed nuclear systems to be evaluated with regard to their potential to aid proliferation. Thus, a mechanism is needed to introduce nonproliferation as a measure in the design phase of a new nuclear system. To accomplish this, a methodology for quantifying and measuring the proliferation risk of proposed system options is required. Such quantification has its difficulties due to inherent uncertainty, e.g. what is the probability that a quantity of material will be stolen in a given situation? Also, the lack of data on such occurrences makes the task of quantification nearly insurmountable. A systematic approach is necessary to estimate the proliferation risk. Currently, an advanced nuclear power system, the Accelerator Transmutation of Waste (ATW) program has been initiated to develop a system that will concurrently generate electricity while destroying long-lived radioactive isotopes. Therefore, because of the issues noted above, an effort to introduce proliferation risk into the design phase has been started. The purpose of this paper is to review previous work in quantification of proliferation risk in an effort to develop the proper basis for the current work. It should be noted that while proliferation on a national level has been studied extensively, efforts to quantify proliferation risk of individual nuclear systems or processes have been limited. Consequently, the available literature base is relatively sparse. (author)

  15. Degraded visual environment image/video quality metrics

    Science.gov (United States)

    Baumgartner, Dustin D.; Brown, Jeremy B.; Jacobs, Eddie L.; Schachter, Bruce J.

    2014-06-01

    A number of image quality metrics (IQMs) and video quality metrics (VQMs) have been proposed in the literature for evaluating techniques and systems for mitigating degraded visual environments. Some require both pristine and corrupted imagery. Others require patterned target boards in the scene. None of these metrics relates well to the task of landing a helicopter in conditions such as a brownout dust cloud. We have developed and used a variety of IQMs and VQMs related to the pilot's ability to detect hazards in the scene and to maintain situational awareness. Some of these metrics can be made agnostic to sensor type. Not only are the metrics suitable for evaluating algorithm and sensor variation, they are also suitable for choosing the most cost effective solution to improve operating conditions in degraded visual environments.
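None of the authors' task-related metrics is reproduced here, but a minimal, sensor-agnostic example of a no-reference image quality score (mean gradient magnitude, a common sharpness building block in such IQMs) conveys the flavour: degraded frames, such as those obscured by a dust cloud, score lower than crisp ones. The images below are synthetic.

```python
import numpy as np

def gradient_sharpness(img):
    """No-reference sharpness score: mean finite-difference gradient
    magnitude over the image. Blurred or obscured frames score lower."""
    img = img.astype(float)
    gx = np.diff(img, axis=1)[:-1, :]   # horizontal gradient
    gy = np.diff(img, axis=0)[:, :-1]   # vertical gradient
    return float(np.mean(np.hypot(gx, gy)))

rng = np.random.default_rng(2)
sharp = rng.uniform(0, 255, size=(64, 64))   # high-frequency content
# Crude 2x2 box blur (with wrap-around) as a stand-in for degradation.
blurred = (sharp + np.roll(sharp, 1, axis=0) + np.roll(sharp, 1, axis=1)
           + np.roll(np.roll(sharp, 1, axis=0), 1, axis=1)) / 4

score_sharp = gradient_sharpness(sharp)
score_blurred = gradient_sharpness(blurred)
```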

  16. Global Surgery System Strengthening: It Is All About the Right Metrics.

    Science.gov (United States)

    Watters, David A; Guest, Glenn D; Tangi, Viliami; Shrime, Mark G; Meara, John G

    2018-04-01

    Progress in achieving "universal access to safe, affordable surgery, and anesthesia care when needed" is dependent on consensus not only about the key messages but also on what metrics should be used to set goals and measure progress. The Lancet Commission on Global Surgery not only achieved consensus on key messages but also recommended 6 key metrics to inform national surgical plans and monitor scale-up toward 2030. These metrics measure access to surgery, as well as its timeliness, safety, and affordability: (1) Two-hour access to the 3 Bellwether procedures (cesarean delivery, emergency laparotomy, and management of an open fracture); (2) Surgeon, Anesthetist, and Obstetrician workforce >20/100,000; (3) Surgical volume of 5000 procedures/100,000; (4) Reporting of perioperative mortality rate; and (5 and 6) Risk rates of catastrophic expenditure and impoverishment when requiring surgery. This article discusses the definition, validity, feasibility, relevance, and progress with each of these metrics. The authors share their experience of introducing the metrics in the Pacific and sub-Saharan Africa. We identify appropriate messages for each potential stakeholder-the patients, practitioners, providers (health services and hospitals), public (community), politicians, policymakers, and payers. We discuss progress toward the metrics being included in core indicator lists by the World Health Organization and the World Bank and how they have been, or may be, used to inform National Surgical Plans in low- and middle-income countries to scale-up the delivery of safe, affordable, and timely surgical and anesthesia care to all who need it.

  17. Holographic Spherically Symmetric Metrics

    Science.gov (United States)

    Petri, Michael

    The holographic principle (HP) conjectures, that the maximum number of degrees of freedom of any realistic physical system is proportional to the system's boundary area. The HP has its roots in the study of black holes. It has recently been applied to cosmological solutions. In this article we apply the HP to spherically symmetric static space-times. We find that any regular spherically symmetric object saturating the HP is subject to tight constraints on the (interior) metric, energy-density, temperature and entropy-density. Whenever gravity can be described by a metric theory, gravity is macroscopically scale invariant and the laws of thermodynamics hold locally and globally, the (interior) metric of a regular holographic object is uniquely determined up to a constant factor and the interior matter-state must follow well defined scaling relations. When the metric theory of gravity is general relativity, the interior matter has an overall string equation of state (EOS) and a unique total energy-density. Thus the holographic metric derived in this article can serve as simple interior 4D realization of Mathur's string fuzzball proposal. Some properties of the holographic metric and its possible experimental verification are discussed. The geodesics of the holographic metric describe an isotropically expanding (or contracting) universe with a nearly homogeneous matter-distribution within the local Hubble volume. Due to the overall string EOS the active gravitational mass-density is zero, resulting in a coasting expansion with Ht = 1, which is compatible with the recent GRB-data.
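The bound that the HP places on degrees of freedom can be stated compactly in its standard Bekenstein-Hawking form (a textbook statement of the principle, not taken from the article itself), with A the boundary area and l_P the Planck length:

```latex
S_{\max} \;=\; \frac{k_B\, c^3\, A}{4\, G\, \hbar}
\qquad\Longrightarrow\qquad
N_{\mathrm{dof}} \;\propto\; \frac{A}{\ell_P^{\,2}},
\qquad \ell_P = \sqrt{\frac{G\hbar}{c^3}} .
```

An object "saturating the HP", as in the abstract, is one whose entropy actually attains this area bound.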

  18. Goals for a waste management system: a task force report

    International Nuclear Information System (INIS)

    Bishop, W.

    1976-01-01

    This task force set out in a holistic way to study societal concerns regarding nuclear waste management, and to seek places where the technology interacts with our social system. The procedures involved in the goals for safe waste management are outlined and the organizations needed to carry them out are considered. The task force concluded that the needs for disposing of the present waste should not dictate the nature of the systems to be designed for the future wastes, and that budgetary considerations should not slow down the waste management in the second time frame (wastes no longer being produced). Other desirable goals, such as independence of waste management system regarding the stability of social institutions, are also discussed

  19. Robotics/Automated Systems Task Analysis and Description of Required Job Competencies Report. Task Analysis and Description of Required Job Competencies of Robotics/Automated Systems Technicians.

    Science.gov (United States)

    Hull, Daniel M.; Lovett, James E.

    This task analysis report for the Robotics/Automated Systems Technician (RAST) curriculum project first provides a RAST job description. It then discusses the task analysis, including the identification of tasks, the grouping of tasks according to major areas of specialty, and the comparison of the competencies to existing or new courses to…

  20. Increasing Army Supply Chain Performance: Using an Integrated End to End Metrics System

    Science.gov (United States)

    2017-01-01

    Current metrics include delivery schedules, delinquent contracts, PQDRs/SDRs, forecasting accuracy, reliability, demand management, asset management strategies, and pipeline... are identified and characterized by statistical analysis. The study proposed a framework and tool for inventory management based on factors such as

  1. Unique sensor fusion system for coordinate-measuring machine tasks

    Science.gov (United States)

    Nashman, Marilyn; Yoshimi, Billibon; Hong, Tsai Hong; Rippey, William G.; Herman, Martin

    1997-09-01

    This paper describes a real-time hierarchical system that fuses data from vision and touch sensors to improve the performance of a coordinate measuring machine (CMM) used for dimensional inspection tasks. The system consists of sensory processing, world modeling, and task decomposition modules. It uses the strengths of each sensor -- the precision of the CMM scales and the analog touch probe and the global information provided by the low resolution camera -- to improve the speed and flexibility of the inspection task. In the experiment described, the vision module performs all computations in image coordinate space. The part's boundaries are extracted during an initialization process and then the probe's position is continuously updated as it scans and measures the part surface. The system fuses the estimated probe velocity and distance to the part boundary in image coordinates with the estimated velocity and probe position provided by the CMM controller. The fused information provides feedback to the monitor controller as it guides the touch probe to scan the part. We also discuss integrating information from the vision system and the probe to autonomously collect data for 2-D to 3-D calibration, and work to register computer aided design (CAD) models with images of parts in the workplace.
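The fusion step described above can be illustrated with the standard inverse-variance (minimum-variance) combination of two independent estimates of the same quantity; this is a generic sketch, and the numbers below are hypothetical rather than taken from the experiment.

```python
def fuse(est_a, var_a, est_b, var_b):
    """Inverse-variance fusion of two independent estimates.

    The fused estimate weights each input by its precision (1/variance);
    the fused variance is never larger than either input variance."""
    w_a = 1.0 / var_a
    w_b = 1.0 / var_b
    fused = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    fused_var = 1.0 / (w_a + w_b)
    return fused, fused_var

# Hypothetical probe-position estimates along one axis (mm):
# a coarse camera estimate vs. the high-precision CMM scales.
camera_est, camera_var = 102.4, 0.25     # sigma = 0.5 mm
cmm_est, cmm_var = 102.1, 0.0001         # sigma = 0.01 mm

pos, var = fuse(camera_est, camera_var, cmm_est, cmm_var)
```

As expected, the fused position stays very close to the precise CMM reading while the camera contributes global context rather than precision.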

  2. A Web-Based Graphical Food Frequency Assessment System: Design, Development and Usability Metrics.

    Science.gov (United States)

    Franco, Rodrigo Zenun; Alawadhi, Balqees; Fallaize, Rosalind; Lovegrove, Julie A; Hwang, Faustina

    2017-05-08

    Food frequency questionnaires (FFQs) are well established in the nutrition field, but there remain important questions around how to develop online tools in a way that can facilitate wider uptake. Also, FFQ user acceptance and evaluation have not been investigated extensively. This paper presents a Web-based graphical food frequency assessment system that addresses challenges of reproducibility, scalability, mobile friendliness, security, and usability and also presents the utilization metrics and user feedback from a deployment study. The application design employs a single-page application Web architecture with back-end services (database, authentication, and authorization) provided by Google Firebase's free plan. Its design and responsiveness take advantage of the Bootstrap framework. The FFQ was deployed in Kuwait as part of the EatWellQ8 study during 2016. The EatWellQ8 FFQ contains 146 food items (including drinks). Participants were recruited in Kuwait without financial incentive. Completion time was based on browser timestamps and usability was measured using the System Usability Scale (SUS), scoring between 0 and 100. Products with a SUS higher than 70 are considered to be good. A total of 235 participants created accounts in the system, and 163 completed the FFQ. Of those 163 participants, 142 reported their gender (93 female, 49 male) and 144 reported their date of birth (mean age of 35 years, range from 18-65 years). The mean completion time for all FFQs (n=163), excluding periods of interruption, was 14.2 minutes (95% CI 13.3-15.1 minutes). Female participants (n=93) completed in 14.1 minutes (95% CI 12.9-15.3 minutes) and male participants (n=49) completed in 14.3 minutes (95% CI 12.6-15.9 minutes). Participants using laptops or desktops (n=69) completed the FFQ in an average of 13.9 minutes (95% CI 12.6-15.1 minutes) and participants using smartphones or tablets (n=91) completed in an average of 14.5 minutes (95% CI 13.2-15.8 minutes). 
The median SUS
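
The System Usability Scale referenced above is the standard 10-item questionnaire scored on a 0-100 range. A minimal sketch of the conventional scoring rule, assuming the usual 1-5 Likert coding (generic SUS arithmetic, not code from the EatWellQ8 study):

```python
def sus_score(responses):
    """System Usability Scale: ten 1-5 Likert responses -> a 0-100 score.
    Odd-numbered items (index 0, 2, ...) are positively worded and contribute
    response-1; even-numbered items are negatively worded and contribute
    5-response. The summed contributions are scaled by 2.5 to 0-100."""
    assert len(responses) == 10 and all(1 <= r <= 5 for r in responses)
    total = sum((r - 1) if i % 2 == 0 else (5 - r)
                for i, r in enumerate(responses))
    return total * 2.5

print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # ideal answers -> 100.0
```

A product scoring above 70 under this rule would meet the "good" threshold used in the study.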

  3. Machine Learning for ATLAS DDM Network Metrics

    CERN Document Server

    Lassnig, Mario; The ATLAS collaboration; Vamosi, Ralf

    2016-01-01

The increasing volume of physics data is posing a critical challenge to the ATLAS experiment. In anticipation of high luminosity physics, automation of everyday data management tasks has become necessary. Previously, many of these tasks required human decision-making and operation. Recent advances in hardware and software have made it possible to entrust more complicated duties to automated systems using models trained by machine learning algorithms. In this contribution we show results from our ongoing automation efforts. First, we describe our framework, which joins distributed data management with network metrics, automatically extracts and aggregates data, trains models with various machine learning algorithms, and eventually scores the resulting models and parameters. Second, we use these models to forecast metrics relevant for network-aware job scheduling and data brokering. We show the characteristics of the data and evaluate the forecasting accuracy of our models.
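
The forecasting step can be illustrated with a deliberately simple model: a linear autoregression over a sliding window of past metric values, fitted by least squares. This is only a sketch under assumed names; the paper evaluates a range of machine learning algorithms rather than this particular model.

```python
import numpy as np

def fit_ar(series, lags=3):
    """Fit y_t ~ y_(t-1)..y_(t-lags) plus an intercept by least squares."""
    X = np.array([series[i - lags:i] for i in range(lags, len(series))], float)
    y = np.array(series[lags:], float)
    coef, *_ = np.linalg.lstsq(np.column_stack([X, np.ones(len(y))]),
                               y, rcond=None)
    return coef  # lag weights followed by the intercept

def forecast_next(series, coef, lags=3):
    """One-step-ahead forecast from the most recent `lags` observations."""
    window = np.array(series[-lags:], float)
    return float(window @ coef[:-1] + coef[-1])
```

In practice the forecast target would be an aggregated network metric (e.g. transfer throughput per link) rather than this toy series.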

  4. Non-metric close range photogrammetric system for mapping geologic structures in mines

    Energy Technology Data Exchange (ETDEWEB)

    Brandow, V D

    1976-01-01

    A stereographic close-range photogrammetric method of obtaining structural data for mine roof stability analyses is described. Stereo pairs were taken with 70 mm and 35 mm non-metric cameras. Photo co-ordinates were measured with a stereo-comparator and reduced by the direct linear transformation method. Field trials demonstrate that the technique is sufficiently accurate for geological work and is a practical method of mapping.
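
The direct linear transformation (DLT) reduction named above relates object-space coordinates (X, Y, Z) to photo coordinates (u, v) through eleven parameters that can be solved linearly from six or more control points. A sketch of that reduction (function names are illustrative, not from the report):

```python
import numpy as np

def dlt_calibrate(world, image):
    """Solve the 11 DLT parameters from >= 6 control points by least squares.
    world: list of (X, Y, Z) object-space points; image: matching (u, v)."""
    A, b = [], []
    for (X, Y, Z), (u, v) in zip(world, image):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z]); b.append(u)
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z]); b.append(v)
    L, *_ = np.linalg.lstsq(np.array(A, float), np.array(b, float), rcond=None)
    return L  # parameters L1..L11

def dlt_project(L, X, Y, Z):
    """Project an object point into photo coordinates with DLT parameters L."""
    d = L[8] * X + L[9] * Y + L[10] * Z + 1.0
    return ((L[0] * X + L[1] * Y + L[2] * Z + L[3]) / d,
            (L[4] * X + L[5] * Y + L[6] * Z + L[7]) / d)
```

With comparator-measured photo coordinates from a stereo pair, intersecting the two calibrated rays recovers the 3-D position of each structural feature.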

  5. MRS systems study, Task F: Transportation impacts of a monitored retrievable storage facility

    Energy Technology Data Exchange (ETDEWEB)

    Brentlinger, L.A.; Gupta, S.; Plummer, A.M.; Smith, L.A.; Tzemos, S.

    1989-05-01

    The passage of the Nuclear Waste Policy Amendments Act of 1987 (NWPAA) modified the basis from which the Office of Civilian Radioactive Waste Management (OCRWM) had derived and developed the configuration of major elements of the waste system (repository, monitored retrievable storage, and transportation). While the key aspects of the Nuclear Waste Policy Act of 1982 remain unaltered, NWPAA provisions focusing site characterization solely at Yucca Mountain, authorizing a monitored retrievable storage (MRS) facility with specific linkages to the repository, and establishing an MRS Review Commission make it prudent for OCRWM to update its analysis of the role of the MRS in the overall waste system configuration. This report documents the differences in transportation costs and radiological dose under alternative scenarios pertaining to a nuclear waste management system with and without an MRS, to include the effect of various MRS packaging functions and locations. The analysis is limited to the impacts of activities related directly to the hauling of high-level radioactive waste (HLW), including the capital purchase and maintenance costs of the transportation cask system. Loading and unloading impacts are not included in this study because they are treated as facility costs in the other task reports. Transportation costs are based on shipments of 63,000 metric tons of uranium (MTU) of spent nuclear fuel and 7,000 MTU equivalent of HLW. 10 refs., 41 tabs.

  6. Metrication manual

    International Nuclear Information System (INIS)

    Harper, A.F.A.; Digby, R.B.; Thong, S.P.; Lacey, F.

    1978-04-01

In April 1978, a meeting of senior metrication officers, convened by the Commonwealth Science Council of the Commonwealth Secretariat, was held in London. The participants were drawn from Australia, Bangladesh, Britain, Canada, Ghana, Guyana, India, Jamaica, Papua New Guinea, Solomon Islands and Trinidad and Tobago. Among other things, the meeting resolved to develop a set of guidelines to assist countries to change to SI and to compile such guidelines in the form of a working manual.

  7. Group performance and group learning at dynamic system control tasks

    International Nuclear Information System (INIS)

    Drewes, Sylvana

    2013-01-01

Proper management of dynamic systems (e.g. cooling systems of nuclear power plants or production and warehousing) is important to ensure public safety and economic success. So far, research has provided broad evidence for systematic shortcomings in individuals' control performance of dynamic systems. This research aims to investigate whether groups manifest synergy (Larson, 2010) and outperform individuals and, if so, what processes lead to these performance advantages. In three experiments - including simulations of a nuclear power plant and a business setting - I compare the control performance of three-person groups to the average individual performance and to nominal groups (N = 105 groups per experiment). The nominal group condition captures the statistical advantage of aggregated group judgements not due to social interaction. First, results show a superior performance of groups compared to individuals. Second, a meta-analysis across all three experiments shows interaction-based process gains in dynamic control tasks: interacting groups outperform the average individual performance as well as the nominal group performance. Third, group interaction leads to stable individual improvements of group members that exceed practice effects. In sum, these results provide the first unequivocal evidence for interaction-based performance gains of groups in dynamic control tasks and imply that employers should rely on groups to provide opportunities for individual learning and to foster dynamic system control at its best.

  8. TASK ALLOCATION IN GEO-DISTRIBUTED CYBER-PHYSICAL SYSTEMS

    Energy Technology Data Exchange (ETDEWEB)

    Aggarwal, Rachel; Smidts, Carol

    2017-03-01

This paper studies the task allocation algorithm for a distributed test facility (DTF), which aims to assemble geo-distributed cyber (software) and physical (hardware-in-the-loop) components into a prototype cyber-physical system (CPS). This allows low-cost testing on an early conceptual prototype (ECP) of the ultimate CPS (UCPS) to be developed. The DTF provides an instrumentation interface for carrying out reliability experiments remotely, such as fault propagation analysis and in-situ testing of hardware and software components in a simulated environment. Unfortunately, the geo-distribution introduces an overhead that is not inherent to the UCPS, i.e. a significant time delay in communication that threatens the stability of the ECP and is not an appropriate representation of the behavior of the UCPS. This can be mitigated by implementing a task allocation algorithm to find a suitable configuration and assign the software components to appropriate computational locations dynamically. This would allow the ECP to operate more efficiently with less probability of being unstable due to the delays introduced by geo-distribution. The task allocation algorithm proposed in this work uses a Monte Carlo approach along with dynamic programming to identify the optimal network configuration to keep the time delays to a minimum.
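
The Monte Carlo half of the approach can be caricatured as random sampling of component-to-site assignments, each scored by total inter-component communication delay. Everything below (the names and the additive cost model) is an assumption for illustration, not the authors' algorithm:

```python
import random

def allocate_components(components, sites, delay, trials=2000, seed=0):
    """Sample random component->site assignments and keep the cheapest,
    scoring each by summed pairwise communication delay between components.
    delay: dict site -> dict site -> one-way delay between the two sites."""
    rng = random.Random(seed)

    def cost(assign):
        return sum(delay[assign[a]][assign[b]]
                   for a in components for b in components if a != b)

    best = {c: rng.choice(sites) for c in components}
    best_cost = cost(best)
    for _ in range(trials):
        cand = {c: rng.choice(sites) for c in components}
        cand_cost = cost(cand)
        if cand_cost < best_cost:
            best, best_cost = cand, cand_cost
    return best, best_cost
```

A dynamic-programming pass, as in the paper, would replace the blind sampling with a structured search over partial assignments.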

  9. MO-G-BRE-06: Metrics of Success: Measuring Participation and Attitudes Related to Near-Miss Incident Learning Systems

    International Nuclear Information System (INIS)

    Nyflot, MJ; Kusano, AS; Zeng, J; Carlson, JC; Novak, A; Sponseller, P; Jordan, L; Kane, G; Ford, EC

    2014-01-01

Purpose: Interest in incident learning systems (ILS) for improving safety and quality in radiation oncology is growing, as evidenced by the upcoming release of the national ILS. However, an institution implementing such a system would benefit from quantitative metrics to evaluate performance and impact. We developed metrics to measure volume of reporting, severity of reported incidents, and changes in staff attitudes over time from implementation of our institutional ILS. Methods: We analyzed 2023 incidents from our departmental ILS from 2/2012–2/2014. Incidents were prospectively assigned a near-miss severity index (NMSI) at multidisciplinary review to evaluate the potential for error ranging from 0 to 4 (no harm to critical). Total incidents reported, unique users reporting, and average NMSI were evaluated over time. Additionally, departmental safety attitudes were assessed through a 26-point survey adapted from the AHRQ Hospital Survey on Patient Safety Culture before, 12 months, and 24 months after implementation of the incident learning system. Results: Participation in the ILS increased as demonstrated by total reports (approximately 2.12 additional reports/month) and unique users reporting (0.51 additional users reporting/month). Also, the average NMSI of reports trended lower over time, significantly decreasing after 12 months of reporting (p<0.001) but with no significant change at months 18 or 24. In survey data, significant improvements were noted in many dimensions, including perceived barriers to reporting incidents such as concern of embarrassment (37% to 18%; p=0.02) as well as knowledge of what incidents to report, how to report them, and confidence that these reports were used to improve safety processes. Conclusion: Over a two-year period, our departmental ILS was used more frequently, incidents became less severe, and staff confidence in the system improved. 
The metrics used here may be useful for other institutions seeking to create or evaluate
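
The "additional reports/month" and "additional users reporting/month" figures above are linear trend slopes over monthly tallies; the arithmetic is a plain least-squares slope:

```python
def monthly_trend(counts):
    """Ordinary least-squares slope of monthly counts: the average number of
    additional reports (or reporters) gained per month."""
    n = len(counts)
    mean_x = (n - 1) / 2
    mean_y = sum(counts) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(counts))
    den = sum((x - mean_x) ** 2 for x in range(n))
    return num / den
```

For example, monthly report counts of 0, 2, 4, 6 yield a trend of 2.0 additional reports per month.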

  10. MO-G-BRE-06: Metrics of Success: Measuring Participation and Attitudes Related to Near-Miss Incident Learning Systems

    Energy Technology Data Exchange (ETDEWEB)

    Nyflot, MJ; Kusano, AS; Zeng, J; Carlson, JC; Novak, A; Sponseller, P; Jordan, L; Kane, G; Ford, EC [University of Washington, Seattle, WA (United States)

    2014-06-15

Purpose: Interest in incident learning systems (ILS) for improving safety and quality in radiation oncology is growing, as evidenced by the upcoming release of the national ILS. However, an institution implementing such a system would benefit from quantitative metrics to evaluate performance and impact. We developed metrics to measure volume of reporting, severity of reported incidents, and changes in staff attitudes over time from implementation of our institutional ILS. Methods: We analyzed 2023 incidents from our departmental ILS from 2/2012–2/2014. Incidents were prospectively assigned a near-miss severity index (NMSI) at multidisciplinary review to evaluate the potential for error ranging from 0 to 4 (no harm to critical). Total incidents reported, unique users reporting, and average NMSI were evaluated over time. Additionally, departmental safety attitudes were assessed through a 26-point survey adapted from the AHRQ Hospital Survey on Patient Safety Culture before, 12 months, and 24 months after implementation of the incident learning system. Results: Participation in the ILS increased as demonstrated by total reports (approximately 2.12 additional reports/month) and unique users reporting (0.51 additional users reporting/month). Also, the average NMSI of reports trended lower over time, significantly decreasing after 12 months of reporting (p<0.001) but with no significant change at months 18 or 24. In survey data, significant improvements were noted in many dimensions, including perceived barriers to reporting incidents such as concern of embarrassment (37% to 18%; p=0.02) as well as knowledge of what incidents to report, how to report them, and confidence that these reports were used to improve safety processes. Conclusion: Over a two-year period, our departmental ILS was used more frequently, incidents became less severe, and staff confidence in the system improved. 
The metrics used here may be useful for other institutions seeking to create or evaluate

  11. Fault Management Metrics

    Science.gov (United States)

    Johnson, Stephen B.; Ghoshal, Sudipto; Haste, Deepak; Moore, Craig

    2017-01-01

    This paper describes the theory and considerations in the application of metrics to measure the effectiveness of fault management. Fault management refers here to the operational aspect of system health management, and as such is considered as a meta-control loop that operates to preserve or maximize the system's ability to achieve its goals in the face of current or prospective failure. As a suite of control loops, the metrics to estimate and measure the effectiveness of fault management are similar to those of classical control loops in being divided into two major classes: state estimation, and state control. State estimation metrics can be classified into lower-level subdivisions for detection coverage, detection effectiveness, fault isolation and fault identification (diagnostics), and failure prognosis. State control metrics can be classified into response determination effectiveness and response effectiveness. These metrics are applied to each and every fault management control loop in the system, for each failure to which they apply, and probabilistically summed to determine the effectiveness of these fault management control loops to preserve the relevant system goals that they are intended to protect.
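
The probabilistic summation described above can be illustrated for the detection-coverage metric: weight each failure mode by its probability and sum the weights of the detected ones. The data model here is an assumption for the sketch:

```python
def detection_coverage(failure_modes):
    """Probability-weighted detection coverage: the fraction of total failure
    probability covered by fault-management detections.
    failure_modes: iterable of (probability, detected) pairs."""
    total = sum(p for p, _ in failure_modes)
    detected = sum(p for p, d in failure_modes if d)
    return detected / total
```

Analogous probability-weighted sums apply to the other state-estimation and state-control metrics in the taxonomy.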

  12. Engineering performance metrics

    Science.gov (United States)

    Delozier, R.; Snyder, N.

    1993-03-01

Implementation of a Total Quality Management (TQM) approach to engineering work required the development of a system of metrics which would serve as a meaningful management tool for evaluating effectiveness in accomplishing project objectives and in achieving improved customer satisfaction. A team effort was chartered with the goal of developing a system of engineering performance metrics which would measure customer satisfaction, quality, cost effectiveness, and timeliness. The approach to developing this system involved normal systems design phases including conceptual design, detailed design, implementation, and integration. The lessons learned from this effort will be explored in this paper. These lessons learned may provide a starting point for other large engineering organizations seeking to institute a performance measurement system. To facilitate this effort, a team was chartered to assist in the development of the metrics system. This team, consisting of customers and Engineering staff members, was utilized to ensure that the needs and views of the customers were considered in the development of performance measurements. The development of a system of metrics is no different than the development of any type of system. It includes the steps of defining performance measurement requirements, measurement process conceptual design, performance measurement and reporting system detailed design, and system implementation and integration.

  13. Shared Task System Description: Frustratingly Hard Compositionality Prediction

    DEFF Research Database (Denmark)

    Johannsen, Anders Trærup; Martinez Alonso, Hector; Rishøj, Christian

    2011-01-01

We considered a wide range of features for the DiSCo 2011 shared task about compositionality prediction for word pairs, including COALS-based endocentricity scores, compositionality scores based on distributional clusters, statistics about wordnet-induced paraphrases, hyphenation, and the likelihood of long translation equivalents in other languages. Many of the features we considered correlated significantly with human compositionality scores, but in support vector regression experiments we obtained the best results using only COALS-based endocentricity scores. Our system was nevertheless...

  14. TSORT - an automated tool for allocating tasks to training strategies

    International Nuclear Information System (INIS)

    Carter, R.J.; Jorgensen, C.C.

    1986-01-01

An automated tool (TSORT) that can aid training system developers in determining which training strategy should be applied to a particular task and in grouping similar tasks into training categories has been developed. This paper describes the rationale for TSORT's development and addresses its structure, including training categories, task description dimensions, and categorization metrics. It also provides some information on TSORT's application.

  15. Exploring Asynchronous Many-Task Runtime Systems toward Extreme Scales

    Energy Technology Data Exchange (ETDEWEB)

    Knight, Samuel [O8953; Baker, Gavin Matthew; Gamell, Marc [Rutgers U; Hollman, David [08953; Sjaardema, Gregor [SNL; Kolla, Hemanth [SNL; Teranishi, Keita; Wilke, Jeremiah J; Slattengren, Nicole [SNL; Bennett, Janine Camille

    2015-10-01

Major exascale computing reports indicate a number of software challenges to meet the dramatic change of system architectures in the near future. While a several-orders-of-magnitude increase in parallelism is the most commonly cited of those, hurdles also include performance heterogeneity of compute nodes across the system, increased imbalance between computational capacity and I/O capabilities, frequent system interrupts, and complex hardware architectures. Asynchronous task-parallel programming models show great promise in addressing these issues, but are not yet fully understood nor developed sufficiently for computational science and engineering application codes. We address these knowledge gaps through quantitative and qualitative exploration of leading candidate solutions in the context of engineering applications at Sandia. In this poster, we evaluate the MiniAero code ported to three leading candidate programming models (Charm++, Legion and UINTAH) to examine whether these models permit insertion of new programming model elements into an existing code base.

  16. Operating status of TARN vacuum system and future tasks

    International Nuclear Information System (INIS)

    Chida, Katsuhisa; Tsujikawa, Hiroshi; Mizobuchi, Akira

    1981-01-01

TARN (Test Accumulation Ring for Numatron) was constructed for the purpose of obtaining the fundamental data for the high energy heavy ion accelerator (Numatron) project, which accelerates heavy ions up to uranium to 1 GeV/nucleon. Its vacuum is required to be 1 × 10⁻¹⁰ Torr or less on beam. In February, 1972, only the vacuum system was temporarily assembled, and a vacuum of 2 × 10⁻¹¹ Torr was realized by baking at 300 deg C alone. In July, 1972, the assembling of the vacuum chamber into the magnets was completed, and several test experiments were performed using the H₂⁺ beam from the SF cyclotron. In this report, first the outline of the vacuum system and then its operation are described. Because of the purpose of the ring, the vacuum system must be brought up to atmospheric pressure to attach beam monitors and other measuring instruments just before the machine time. Therefore, it is an important task to make the evacuation time as short as possible. Future tasks include the examination of the material and shape of the chamber, the investigation of the pump system (an appropriate combination of ion pump, titanium sublimation pump, cryo-pump, molecular pump, etc.), the study of the measuring and control systems (accurate measurement of total pressure and partial pressure and the feedback to the protecting system), and studies of problems on the vacuum wall surface (surface treatment prior to assembling the chamber into the ring, and the methods and effects of baking and electric discharge cleaning). (Wakatsuki, Y.)

  17. Instrument Motion Metrics for Laparoscopic Skills Assessment in Virtual Reality and Augmented Reality.

    Science.gov (United States)

    Fransson, Boel A; Chen, Chi-Ya; Noyes, Julie A; Ragle, Claude A

    2016-11-01

    To determine the construct and concurrent validity of instrument motion metrics for laparoscopic skills assessment in virtual reality and augmented reality simulators. Evaluation study. Veterinarian students (novice, n = 14) and veterinarians (experienced, n = 11) with no or variable laparoscopic experience. Participants' minimally invasive surgery (MIS) experience was determined by hospital records of MIS procedures performed in the Teaching Hospital. Basic laparoscopic skills were assessed by 5 tasks using a physical box trainer. Each participant completed 2 tasks for assessments in each type of simulator (virtual reality: bowel handling and cutting; augmented reality: object positioning and a pericardial window model). Motion metrics such as instrument path length, angle or drift, and economy of motion of each simulator were recorded. None of the motion metrics in a virtual reality simulator showed correlation with experience, or to the basic laparoscopic skills score. All metrics in augmented reality were significantly correlated with experience (time, instrument path, and economy of movement), except for the hand dominance metric. The basic laparoscopic skills score was correlated to all performance metrics in augmented reality. The augmented reality motion metrics differed between American College of Veterinary Surgeons diplomates and residents, whereas basic laparoscopic skills score and virtual reality metrics did not. Our results provide construct validity and concurrent validity for motion analysis metrics for an augmented reality system, whereas a virtual reality system was validated only for the time score. © Copyright 2016 by The American College of Veterinary Surgeons.
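
Motion metrics such as instrument path length and economy of motion are computed from sampled instrument-tip positions. A sketch, noting that simulators differ in their exact economy-of-motion definition (the ratio below is one common choice, assumed here):

```python
import math

def path_length(points):
    """Total instrument-tip path length over sampled 3-D positions."""
    return sum(math.dist(p, q) for p, q in zip(points, points[1:]))

def economy_of_motion(points):
    """Straight-line start-to-end distance divided by travelled path length;
    1.0 is perfectly economical. Definitions vary between simulators."""
    travelled = path_length(points)
    return math.dist(points[0], points[-1]) / travelled if travelled else 1.0
```

A detour-heavy trajectory scores low on this ratio even when its endpoints match those of a direct movement.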

  18. Efficient control of mechatronic systems in dynamic motion tasks

    Directory of Open Access Journals (Sweden)

    Despotova Desislava

    2018-01-01

Robots and powered exoskeletons often have complex and non-linear dynamics due to friction, elasticity, and changing load. The proposed study addresses various-type robots that have to perform dynamic point-to-point motion tasks (PTPMT). The performance demands are for faster motion, higher positioning accuracy, and lower energy consumption. For a given motion task, it is of primary importance to study the structure and controllability of the corresponding controlled system. The following natural decentralized controllability condition is assumed: the signs of any control input and the corresponding output (the acceleration) are the same, at least when the control input is at its maximum absolute value. Then we find explicit necessary and sufficient conditions on the control transfer matrix that can guarantee robust controllability in the face of arbitrary, but bounded disturbances. Further on, we propose a generic optimisation approach for control learning synthesis of various-type robotic systems in PTPMT. Our procedure for iterative learning control (LC) has the following main steps: (1) choose a set of appropriate test control functions; (2) define the most relevant input-output pairs; and (3) solve shooting equations and perform control parameter optimisation. We will give several examples to explain our controllability and optimisation concepts.

  19. System structure and cognitive ability as predictors of performance in dynamic system control tasks

    Directory of Open Access Journals (Sweden)

    Jan Hundertmark

    2015-12-01

In dynamic system control, cognitive mechanisms and abilities underlying performance may vary depending on the nature of the task. We therefore investigated the effects of system structure and its interaction with cognitive abilities on system control performance. A sample of 127 university students completed a series of different system control tasks that were manipulated in terms of system size and recurrent feedback, either with or without a cognitive load manipulation. Cognitive abilities assessed included reasoning ability, working memory capacity, and cognitive reflection. System size and recurrent feedback affected overall performance as expected. Overall, the results support that cognitive ability is a good predictor of performance in dynamic system control tasks but predictiveness is reduced when the system structure contains recurrent feedback. We discuss this finding from a cognitive processing perspective as well as its implications for individual differences research in dynamic systems.

  20. Method and system for assigning a confidence metric for automated determination of optic disc location

    Science.gov (United States)

    Karnowski, Thomas P [Knoxville, TN; Tobin, Jr., Kenneth W.; Muthusamy Govindasamy, Vijaya Priya [Knoxville, TN; Chaum, Edward [Memphis, TN

    2012-07-10

    A method for assigning a confidence metric for automated determination of optic disc location that includes analyzing a retinal image and determining at least two sets of coordinates locating an optic disc in the retinal image. The sets of coordinates can be determined using first and second image analysis techniques that are different from one another. An accuracy parameter can be calculated and compared to a primary risk cut-off value. A high confidence level can be assigned to the retinal image if the accuracy parameter is less than the primary risk cut-off value and a low confidence level can be assigned to the retinal image if the accuracy parameter is greater than the primary risk cut-off value. The primary risk cut-off value being selected to represent an acceptable risk of misdiagnosis of a disease having retinal manifestations by the automated technique.
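
A minimal sketch of the confidence-assignment logic, reading the accuracy parameter as the disagreement (Euclidean pixel distance) between the two independent localizations; that specific choice is an assumption, as the record does not pin the parameter to a particular formula:

```python
import math

def optic_disc_confidence(coords_a, coords_b, risk_cutoff):
    """Two image-analysis techniques each propose an optic-disc location.
    The accuracy parameter (assumed here) is the pixel distance between them:
    below the risk cutoff -> high confidence; otherwise -> low confidence."""
    accuracy = math.dist(coords_a, coords_b)
    return "high" if accuracy < risk_cutoff else "low"
```

The cutoff itself would be chosen to represent an acceptable risk of misdiagnosis, as the record describes.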

  1. High resolution metric imaging payload

    Science.gov (United States)

    Delclaud, Y.

    2017-11-01

Alcatel Space Industries has become Europe's leader in the field of high and very high resolution optical payloads, in the framework of earth observation systems able to provide military and government customers with metric images from space. This leadership allowed ALCATEL to propose for the export market, within a French collaboration framework, a complete space-based system for metric observation.

  2. Software Quality Assurance Metrics

    Science.gov (United States)

    McRae, Kalindra A.

    2004-01-01

Software Quality Assurance (SQA) is a planned and systematic set of activities that ensures that software life cycle processes and products conform to requirements, standards, and procedures. In software development, software quality means meeting requirements and a degree of excellence and refinement of a project or product. Software Quality is a set of attributes of a software product by which its quality is described and evaluated. The set of attributes includes functionality, reliability, usability, efficiency, maintainability, and portability. Software Metrics help us understand the technical process that is used to develop a product. The process is measured to improve it and the product is measured to increase quality throughout the life cycle of software. Software Metrics are measurements of the quality of software. Software is measured to indicate the quality of the product, to assess the productivity of the people who produce the product, to assess the benefits derived from new software engineering methods and tools, to form a baseline for estimation, and to help justify requests for new tools or additional training. Any part of the software development can be measured. If Software Metrics are implemented in software development, they can save time and money, and allow the organization to identify the causes of defects which have the greatest effect on software development. In the summer of 2004, I worked with Cynthia Calhoun and Frank Robinson in the Software Assurance/Risk Management department. My task was to research, collect, compile, and analyze SQA Metrics that have been used in other projects that are not currently being used by the SA team and report them to the Software Assurance team to see if any metrics can be implemented in their software assurance life cycle process.

  3. Metrics for energy resilience

    International Nuclear Information System (INIS)

    Roege, Paul E.; Collier, Zachary A.; Mancillas, James; McDonagh, John A.; Linkov, Igor

    2014-01-01

    Energy lies at the backbone of any advanced society and constitutes an essential prerequisite for economic growth, social order and national defense. However there is an Achilles heel to today's energy and technology relationship; namely a precarious intimacy between energy and the fiscal, social, and technical systems it supports. Recently, widespread and persistent disruptions in energy systems have highlighted the extent of this dependence and the vulnerability of increasingly optimized systems to changing conditions. Resilience is an emerging concept that offers to reconcile considerations of performance under dynamic environments and across multiple time frames by supplementing traditionally static system performance measures to consider behaviors under changing conditions and complex interactions among physical, information and human domains. This paper identifies metrics useful to implement guidance for energy-related planning, design, investment, and operation. Recommendations are presented using a matrix format to provide a structured and comprehensive framework of metrics relevant to a system's energy resilience. The study synthesizes previously proposed metrics and emergent resilience literature to provide a multi-dimensional model intended for use by leaders and practitioners as they transform our energy posture from one of stasis and reaction to one that is proactive and which fosters sustainable growth. - Highlights: • Resilience is the ability of a system to recover from adversity. • There is a need for methods to quantify and measure system resilience. • We developed a matrix-based approach to generate energy resilience metrics. • These metrics can be used in energy planning, system design, and operations

  4. Technical Privacy Metrics: a Systematic Survey

    OpenAIRE

    Wagner, Isabel; Eckhoff, David

    2018-01-01

The goal of privacy metrics is to measure the degree of privacy enjoyed by users in a system and the amount of protection offered by privacy-enhancing technologies. In this way, privacy metrics contribute to improving user privacy in the digital world. The diversity and complexity of privacy metrics in the literature makes an informed choice of metrics challenging. As a result, instead of using existing metrics, n...

  5. Distributed Task Rescheduling With Time Constraints for the Optimization of Total Task Allocations in a Multirobot System.

    Science.gov (United States)

    Turner, Joanna; Meng, Qinggang; Schaefer, Gerald; Whitbrook, Amanda; Soltoggio, Andrea

    2017-09-28

    This paper considers the problem of maximizing the number of task allocations in a distributed multirobot system under strict time constraints, where other optimization objectives need also be considered. It builds upon existing distributed task allocation algorithms, extending them with a novel method for maximizing the number of task assignments. The fundamental idea is that a task assignment to a robot has a high cost if its reassignment to another robot creates a feasible time slot for unallocated tasks. Multiple reassignments among networked robots may be required to create a feasible time slot and an upper limit to this number of reassignments can be adjusted according to performance requirements. A simulated rescue scenario with task deadlines and fuel limits is used to demonstrate the performance of the proposed method compared with existing methods, the consensus-based bundle algorithm and the performance impact (PI) algorithm. Starting from existing (PI-generated) solutions, results show up to a 20% increase in task allocations using the proposed method.
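
The reassignment idea can be caricatured with scalar capacities standing in for the paper's time-slot feasibility: if no robot can take an unallocated task directly, move one already-assigned task to another robot to open a slot. All names and the capacity model below are hypothetical, and the single-level chain corresponds to the adjustable reassignment limit:

```python
def try_insert(task, capacity, load, assignments, limit=1):
    """Place `task` (a scalar cost) on some robot; if no robot has room,
    reassign at most `limit` existing tasks to other robots to open a slot.
    capacity/load: dicts robot -> number; assignments: dict robot -> list."""
    robots = list(capacity)
    for r in robots:  # direct placement first
        if load[r] + task <= capacity[r]:
            load[r] += task
            assignments.setdefault(r, []).append(task)
            return True
    if limit == 0:
        return False
    for r in robots:  # one-level reassignment chain
        for t in list(assignments.get(r, [])):
            for s in robots:
                if s != r and load[s] + t <= capacity[s] \
                        and load[r] - t + task <= capacity[r]:
                    assignments[r].remove(t); load[r] -= t   # move t off r
                    assignments.setdefault(s, []).append(t); load[s] += t
                    assignments[r].append(task); load[r] += task
                    return True
    return False
```

Raising `limit` (and recursing on the displaced task) would model the longer reassignment chains the paper trades off against computation time.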

  6. A practical exposure-equivalent metric for instrumentation noise in x-ray imaging systems

    International Nuclear Information System (INIS)

    Yadava, G K; Kuhls-Gilcrist, A T; Rudin, S; Patel, V K; Hoffmann, K R; Bednarek, D R

    2008-01-01

    The performance of high-sensitivity x-ray imagers may be limited by additive instrumentation noise rather than by quantum noise when operated at the low exposure rates used in fluoroscopic procedures. Equipment-invasive instrumentation noise measurements (in terms of electrons) are generally difficult to make and are potentially not as helpful in clinical practice as a direct radiological representation of such noise that can be determined in the field. In this work, we define a clinically relevant representation for instrumentation noise in terms of noise-equivalent detector entrance exposure, termed the instrumentation noise-equivalent exposure (INEE), which can be determined through experimental measurements of noise variance or signal-to-noise ratio (SNR). The INEE was measured for various detectors, thus demonstrating its usefulness in terms of providing information about the effective operating range of the various detectors. A simulation study is presented to demonstrate the robustness of this metric against post-processing, and its dependence on inherent detector blur. These studies suggest that the INEE may be a practical gauge to determine and compare the range of quantum-limited performance for clinical x-ray detectors of different design, with the implication that detector performance at exposures below the INEE will be instrumentation-noise limited rather than quantum-noise limited.
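
    One plausible way to estimate such an exposure-equivalent noise figure from noise-variance measurements (an illustrative sketch, not the paper's procedure; it assumes Poisson quantum noise, whose variance grows linearly with exposure, plus additive instrumentation noise):

    ```python
    import numpy as np

    def inee_from_variance(exposure, variance):
        """Fit total noise variance = slope * exposure + intercept, where the
        slope term models Poisson quantum noise and the intercept models
        additive instrumentation noise; return the exposure at which the two
        contributions are equal (the instrumentation-noise-equivalent exposure)."""
        slope, intercept = np.polyfit(exposure, variance, 1)
        return intercept / slope
    ```

    Below this crossover exposure the intercept (instrumentation) term dominates the fitted variance, which is the "instrumentation-noise limited" regime described in the abstract.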

  7. Metric-adjusted skew information

    DEFF Research Database (Denmark)

    Liang, Cai; Hansen, Frank

    2010-01-01

    We give a truly elementary proof of the convexity of metric-adjusted skew information following an idea of Effros. We extend earlier results of weak forms of superadditivity to general metric-adjusted skew information. Recently, Luo and Zhang introduced the notion of semi-quantum states on a bipartite system and proved superadditivity of the Wigner-Yanase-Dyson skew informations for such states. We extend this result to the general metric-adjusted skew information. We finally show that a recently introduced extension to parameter values 1 ... of (unbounded) metric-adjusted skew information...

  8. Considerations of the Software Metric-based Methodology for Software Reliability Assessment in Digital I and C Systems

    International Nuclear Information System (INIS)

    Ha, J. H.; Kim, M. K.; Chung, B. S.; Oh, H. C.; Seo, M. R.

    2007-01-01

    Analog I and C systems have been replaced by digital I and C systems because the digital systems have many potential benefits to nuclear power plants in terms of operational and safety performance. For example, digital systems are essentially free of drift, have higher data handling and storage capabilities, and provide improved performance in accuracy and computational capability. In addition, analog replacement parts have become more difficult to obtain since they are obsolete and discontinued. There are, however, challenges to the introduction of digital technology into nuclear power plants because digital systems are more complex than analog systems and their operation and failure modes are different. In particular, software, which can be the core of functionality in digital systems, does not wear out physically like hardware, and its failure modes are not yet clearly defined. Thus, research efforts to develop methodologies for software reliability assessment are still proceeding in safety-critical areas such as nuclear systems, aerospace, and medical devices. Among them, a software metric-based methodology has been considered for the digital I and C systems of Korean nuclear power plants. Advantages and limitations of that methodology are identified, and requirements for its application to digital I and C systems are considered in this study.

  9. Development of a Cognitive Robotic System for Simple Surgical Tasks

    Directory of Open Access Journals (Sweden)

    Riccardo Muradore

    2015-04-01

    The introduction of robotic surgery within the operating rooms has significantly improved the quality of many surgical procedures. Recently, the research on medical robotic systems focused on increasing the level of autonomy in order to give them the possibility to carry out simple surgical actions autonomously. This paper reports on the development of technologies for introducing automation within the surgical workflow. The results have been obtained during the ongoing FP7 European funded project Intelligent Surgical Robotics (I-SUR). The main goal of the project is to demonstrate that autonomous robotic surgical systems can carry out simple surgical tasks effectively and without major intervention by surgeons. To fulfil this goal, we have developed innovative solutions (both in terms of technologies and algorithms) for the following aspects: fabrication of soft organ models starting from CT images, surgical planning and execution of movement of robot arms in contact with a deformable environment, designing a surgical interface minimizing the cognitive load of the surgeon supervising the actions, intra-operative sensing and reasoning to detect normal transitions and unexpected events. All these technologies have been integrated using a component-based software architecture to control a novel robot designed to perform the surgical actions under study. In this work we provide an overview of our system and report on preliminary results of the automatic execution of needle insertion for the cryoablation of kidney tumours.

  10. A comparison of community and trophic structure in five marine ecosystems based on energy budgets and system metrics

    Science.gov (United States)

    Gaichas, Sarah; Skaret, Georg; Falk-Petersen, Jannike; Link, Jason S.; Overholtz, William; Megrey, Bernard A.; Gjøsæter, Harald; Stockhausen, William T.; Dommasnes, Are; Friedland, Kevin D.; Aydin, Kerim

    2009-04-01

    Energy budget models for five marine ecosystems were compared to identify differences and similarities in trophic and community structure. We examined the Gulf of Maine and Georges Bank in the northwest Atlantic Ocean, the combined Norwegian/Barents Seas in the northeast Atlantic Ocean, and the eastern Bering Sea and the Gulf of Alaska in the northeast Pacific Ocean. Comparable energy budgets were constructed for each ecosystem by aggregating information for similar species groups into consistent functional groups. Several ecosystem indices (e.g., functional group production, consumption and biomass ratios, cumulative biomass, food web macrodescriptors, and network metrics) were compared for each ecosystem. The comparative approach clearly identified data gaps for each ecosystem, an important outcome of this work. Commonalities across the ecosystems included overall high primary production and energy flow at low trophic levels, high production and consumption by carnivorous zooplankton, and similar proportions of apex predator to lower trophic level biomass. Major differences included distinct biomass ratios of pelagic to demersal fish, ranging from highest in the combined Norwegian/Barents ecosystem to lowest in the Alaskan systems, and notable differences in primary production per unit area, highest in the Alaskan and Georges Bank/Gulf of Maine ecosystems, and lowest in the Norwegian ecosystems. While comparing a disparate group of organisms across a wide range of marine ecosystems is challenging, this work demonstrates that standardized metrics both elucidate properties common to marine ecosystems and identify key distinctions useful for fisheries management.

  11. Stability of Switched Feedback Time-Varying Dynamic Systems Based on the Properties of the Gap Metric for Operators

    Directory of Open Access Journals (Sweden)

    M. De la Sen

    2012-01-01

    This paper focuses on the stabilization of switched dynamic control systems, based on an operator formulation. It is assumed that the controlled object and the controller are described by sequences of closed operator pairs (L,C) on a Hilbert space H of the input and output spaces, and stabilization is related to the existence of the inverse of the resulting input-output operator being admissible and bounded. The technical mechanism used to obtain the results is the appropriate use of the fact that closed operators that are sufficiently close to bounded operators, in terms of the gap metric, are also bounded. That philosophy is followed for the operators describing the input-output relations in switched feedback control systems so as to guarantee closed-loop stabilization.
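
    The gap metric invoked here has a standard definition (given as general background, not quoted from this record): for closed operators $T_1, T_2$ on a Hilbert space $H$ with graphs $G(T_i) = \{(x, T_i x) : x \in D(T_i)\} \subset H \times H$,

    ```latex
    \hat{\delta}(T_1, T_2) \;=\; \bigl\| P_{G(T_1)} - P_{G(T_2)} \bigr\|,
    ```

    where $P_{G(T_i)}$ denotes the orthogonal projection onto the graph of $T_i$. A classical perturbation result states that a closed operator sufficiently close in this metric to a bounded, everywhere-defined operator is itself bounded, which is the fact the paper exploits for the switched input-output operators.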

  12. The Pharmacogenomics Research Network Translational Pharmacogenetics Program: Outcomes and Metrics of Pharmacogenetic Implementations Across Diverse Healthcare Systems.

    Science.gov (United States)

    Luzum, J A; Pakyz, R E; Elsey, A R; Haidar, C E; Peterson, J F; Whirl-Carrillo, M; Handelman, S K; Palmer, K; Pulley, J M; Beller, M; Schildcrout, J S; Field, J R; Weitzel, K W; Cooper-DeHoff, R M; Cavallari, L H; O'Donnell, P H; Altman, R B; Pereira, N; Ratain, M J; Roden, D M; Embi, P J; Sadee, W; Klein, T E; Johnson, J A; Relling, M V; Wang, L; Weinshilboum, R M; Shuldiner, A R; Freimuth, R R

    2017-09-01

    Numerous pharmacogenetic clinical guidelines and recommendations have been published, but barriers have hindered the clinical implementation of pharmacogenetics. The Translational Pharmacogenetics Program (TPP) of the National Institutes of Health (NIH) Pharmacogenomics Research Network was established in 2011 to catalog and contribute to the development of pharmacogenetic implementations at eight US healthcare systems, with the goal to disseminate real-world solutions for the barriers to clinical pharmacogenetic implementation. The TPP collected and normalized pharmacogenetic implementation metrics through June 2015, including gene-drug pairs implemented, interpretations of alleles and diplotypes, numbers of tests performed and actionable results, and workflow diagrams. TPP participant institutions developed diverse solutions to overcome many barriers, but the use of Clinical Pharmacogenetics Implementation Consortium (CPIC) guidelines provided some consistency among the institutions. The TPP also collected some pharmacogenetic implementation outcomes (scientific, educational, financial, and informatics), which may inform healthcare systems seeking to implement their own pharmacogenetic testing programs. © 2017, The American Society for Clinical Pharmacology and Therapeutics.

  13. Tank waste remediation system retrieval authorization basis amendment task plan

    International Nuclear Information System (INIS)

    Goetz, T.G.

    1998-01-01

    This task plan is a documented agreement between Nuclear Safety and Licensing and the Process Development group within the Waste Feed Delivery organization. The purpose of this task plan is to identify the scope of work, tasks and deliverables, responsibilities, manpower, and schedules associated with an authorization basis amendment as a result of the Waste Feed Waste Delivery Program, Project W-211, and Project W-TBD

  14. Component reliability criticality or importance metrics for systems with degrading components

    NARCIS (Netherlands)

    Peng, H.; Coit, D.W.; Feng, Q.

    2012-01-01

    This paper proposes two new importance measures: one new importance measure for systems with -independent degrading components, and another one for systems with -correlated degrading components. Importance measures in previous research are inadequate for systems with degrading components because

  15. Clustering of maintenance tasks for the danish railway system

    DEFF Research Database (Denmark)

    M. Pour, Shahrzad; Benlic, Una

    2017-01-01

    In this paper, we present a mathematical model for the allocation of maintenance tasks to maintenance team members, which is a variant of the Generalized Assignment Problem. The aim is to optimise the following three criteria: (i) the total distance travelled from depots to tasks, (ii) the maximal distance between any maintenance task and its allocated crew member, and (iii) the imbalance in workload among crew members. As test cases, we use a set of instances that simulate the distribution of tasks in the Jutland peninsula, the largest region of Denmark.
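
    The three criteria above can be evaluated directly for any candidate allocation; a minimal sketch (function names and the Euclidean-distance assumption are illustrative, not from the paper):

    ```python
    from math import dist

    def allocation_metrics(crew_pos, task_pos, assignment):
        """Score a task-to-crew allocation on the record's three criteria:
        (i) total depot-to-task travel distance, (ii) the worst single
        travel distance, and (iii) workload imbalance across crew members.
        `assignment` maps task index -> crew index."""
        travel = [dist(crew_pos[c], task_pos[t]) for t, c in assignment.items()]
        loads = [list(assignment.values()).count(c) for c in range(len(crew_pos))]
        return sum(travel), max(travel), max(loads) - min(loads)
    ```

    An optimiser for the assignment problem would then trade these three objectives off, e.g. via a weighted sum or lexicographic ordering.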

  16. Connectivity of Multi-Channel Fluvial Systems: A Comparison of Topology Metrics for Braided Rivers and Delta Networks

    Science.gov (United States)

    Tejedor, A.; Marra, W. A.; Addink, E. A.; Foufoula-Georgiou, E.; Kleinhans, M. G.

    2016-12-01

    Advancing quantitative understanding of the structure and dynamics of complex networks has transformed research in fields as diverse as protein interactions in a cell, page connectivity in the World Wide Web, and relationships in human societies. However, the Geosciences have not benefited much from this new conceptual framework, although connectivity is at the center of many processes in hydro-geomorphology. One of the first efforts in this direction was the seminal work of Smart and Moruzzi (1971), proposing the use of graph theory for studying the intricate structure of delta channel networks. In recent years, this preliminary work has precipitated a body of research that examines the connectivity of multiple-channel fluvial systems, such as delta networks and braided rivers. In this work, we compare two approaches recently introduced in the literature: (1) Marra et al. (2014) utilized network centrality measures to identify important channels in a braided section of the Jamuna River, and used the changes of bifurcations within the network over time to explain the overall river evolution; and (2) Tejedor et al. (2015a,b) developed a set of metrics to characterize the complexity of deltaic channel networks, as well as defined a vulnerability index that quantifies the relative change of sediment and water delivery to the shoreline outlets in response to upstream perturbations. Here we present a comparative analysis of metrics of centrality and vulnerability applied to both braided and deltaic channel networks to depict critical channels in those systems, i.e., channels where a change would contribute more substantially to overall system changes, and to understand what attributes of interest in a channel network are most succinctly depicted in what metrics. References: Marra, W. A., Kleinhans, M. G., & Addink, E. A. (2014). Earth Surface Processes and Landforms, doi:10.1002/esp.3482; Smart, J. S., & Moruzzi, V. L. (1971). Quantitative properties of delta channel networks.

  17. Common Metrics for Human-Robot Interaction

    Science.gov (United States)

    Steinfeld, Aaron; Lewis, Michael; Fong, Terrence; Scholtz, Jean; Schultz, Alan; Kaber, David; Goodrich, Michael

    2006-01-01

    This paper describes an effort to identify common metrics for task-oriented human-robot interaction (HRI). We begin by discussing the need for a toolkit of HRI metrics. We then describe the framework of our work and identify important biasing factors that must be taken into consideration. Finally, we present suggested common metrics for standardization and a case study. Preparation of a larger, more detailed toolkit is in progress.

  18. Engineering Task Plan for Fourth Generation Hanford Corrosion Monitoring System

    International Nuclear Information System (INIS)

    NORMAN, E.C.

    2000-01-01

    This Engineering Task Plan (ETP) describes the activities associated with the installation of cabinets containing corrosion monitoring equipment on tanks 241-AN-102 and 241-AN-107. The new cabinets (one per tank) will be installed adjacent to existing corrosion probes already installed in riser WST-RISER-016 on both tanks. The corrosion monitoring equipment to be installed utilizes the technique of electrochemical noise (EN) for monitoring waste tank corrosion. Typically, EN consists of low-frequency (4 Hz), small-amplitude signals that are spontaneously generated by electrochemical reactions occurring at corroding or other surfaces. EN analysis is well suited for monitoring and identifying the onset of localized corrosion, and for measuring uniform corrosion rates. A typical EN-based corrosion-monitoring system measures instantaneous fluctuations in corrosion current and potential between three nominally identical electrodes of the material of interest immersed in the environment of interest. Time-dependent fluctuations in corrosion current are described by electrochemical current noise, and time-dependent fluctuations of corrosion potential by electrochemical potential noise. The corrosion monitoring systems are designed to detect the onset of localized corrosion phenomena if tank conditions should change to allow these phenomena to occur. In addition to the EN technique, the systems also facilitate the use of the Linear Polarization Resistance (LPR) technique to collect uniform corrosion rate information. LPR measures the linearity at the origin of the polarization curve for overvoltages up to a few millivolts away from the rest potential or natural corrosion potential. The slope of the current vs. voltage plot gives information on uniform corrosion rates.
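
    The LPR step described above can be sketched numerically (illustrative values and the textbook Stern-Geary relation, not data or code from this task plan; the Tafel slopes `beta_a`/`beta_c` are assumed defaults):

    ```python
    import numpy as np

    def polarization_resistance(current_A, overvoltage_V):
        """Slope dE/dI of the polarization curve near the rest potential,
        estimated by a least-squares line through small-overvoltage points."""
        slope, _ = np.polyfit(current_A, overvoltage_V, 1)
        return slope  # ohms

    def corrosion_current(rp_ohm, beta_a=0.12, beta_c=0.12):
        """Stern-Geary relation i_corr = B / Rp, with
        B = beta_a * beta_c / (2.303 * (beta_a + beta_c))."""
        b = beta_a * beta_c / (2.303 * (beta_a + beta_c))
        return b / rp_ohm  # amperes
    ```

    The corrosion current obtained this way is converted to a uniform corrosion rate via Faraday's law and the electrode area; larger polarization resistance means a lower uniform corrosion rate.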

  19. A support system for water system isolation task in NPP by using augmented reality and RFID

    Energy Technology Data Exchange (ETDEWEB)

    Shimoda, Hiroshi; Ishii, Hirotake; Yamazaki, Yuichiro [Kyoto Univ., Uji (Japan). Graduate School of Energy Science; Wu, Wei [Mitsubishi Electric Corp., Amagasaki, Hyogo (Japan); Yoshikawa, Hidekazu [Kyoto Univ., Kyoto (Japan). Graduate School of Energy Science

    2004-07-01

    Aiming at improving task performance and reducing human error in the water system isolation task during NPP periodic maintenance, a support system using state-of-the-art information technology, Augmented Reality (AR) and Radio Frequency Identification (RFID), has been proposed, and a prototype system has been developed. The system has a navigation function in which an AR indication is superimposed directly on the user's view to help find the designated valves. It also has a valve confirmation function that scans an RFID tag attached to the valve. For practical application, the information presentation device is important because it affects task performance. In this study, therefore, a suitable information presentation device was sought by conducting subject experiments employing psychological experimental techniques. The candidate devices were a one-eye video see-through HMD (SCOPO) and a both-eye video see-through HMD (Glasstron) as wearable system configurations, and a tablet PC and a compact TV as handheld system configurations. In the experiment, task completion time, number of errors, NASA-TLX score as the subjects' mental workload, and a subjective usability questionnaire were measured when using the above devices. As a result, it was found that the one-eye video see-through head-mounted display (SCOPO) was the most suitable wearable configuration, and the compact TV the most suitable handheld configuration. (author)

  20. A support system for water system isolation task in NPP by using augmented reality and RFID

    International Nuclear Information System (INIS)

    Shimoda, Hiroshi; Ishii, Hirotake; Yamazaki, Yuichiro; Yoshikawa, Hidekazu

    2004-01-01

    Aiming at improving task performance and reducing human error in the water system isolation task during NPP periodic maintenance, a support system using state-of-the-art information technology, Augmented Reality (AR) and Radio Frequency Identification (RFID), has been proposed, and a prototype system has been developed. The system has a navigation function in which an AR indication is superimposed directly on the user's view to help find the designated valves. It also has a valve confirmation function that scans an RFID tag attached to the valve. For practical application, the information presentation device is important because it affects task performance. In this study, therefore, a suitable information presentation device was sought by conducting subject experiments employing psychological experimental techniques. The candidate devices were a one-eye video see-through HMD (SCOPO) and a both-eye video see-through HMD (Glasstron) as wearable system configurations, and a tablet PC and a compact TV as handheld system configurations. In the experiment, task completion time, number of errors, NASA-TLX score as the subjects' mental workload, and a subjective usability questionnaire were measured when using the above devices. As a result, it was found that the one-eye video see-through head-mounted display (SCOPO) was the most suitable wearable configuration, and the compact TV the most suitable handheld configuration. (author)

  1. Assessment of multi-version NPP I and C systems safety. Metric-based approach, technique and tool

    International Nuclear Information System (INIS)

    Kharchenko, Vyacheslav; Volkovoy, Andrey; Bakhmach, Eugenii; Siora, Alexander; Duzhyi, Vyacheslav

    2011-01-01

    The challenges related to the problem of assessing the actual diversity level and evaluating the safety of diversity-oriented NPP I and C systems are analyzed. There are risks of inaccurate assessment and problems of insufficiently decreasing the probability of common cause failures (CCFs). The CCF probability of safety-critical systems may be substantially decreased through the application of several different types of diversity (multi-diversity). Different diversity types of FPGA-based NPP I and C systems, the general approach, and the stages of diversity and safety assessment as a whole are described. The objectives of the report are: (a) analysis of the challenges caused by use of the diversity approach in NPP I and C systems in the context of FPGA and other modern technologies; (b) development of a multi-version NPP I and C systems assessment technique and tool based on a check-list and metric-oriented approach; (c) a case study of the technique: assessment of a multi-version FPGA-based NPP I and C system developed using the Radiy™ Platform. (author)

  2. System and method for seamless task-directed autonomy for robots

    Science.gov (United States)

    Nielsen, Curtis; Bruemmer, David; Few, Douglas; Walton, Miles

    2012-09-18

    Systems, methods, and user interfaces are used for controlling a robot. An environment map and a robot designator are presented to a user. The user may place, move, and modify task designators on the environment map. The task designators indicate a position in the environment map and indicate a task for the robot to achieve. A control intermediary links task designators with robot instructions issued to the robot. The control intermediary analyzes a relative position between the task designators and the robot. The control intermediary uses the analysis to determine a task-oriented autonomy level for the robot and communicates target achievement information to the robot. The target achievement information may include instructions for directly guiding the robot if the task-oriented autonomy level indicates low robot initiative and may include instructions for directing the robot to determine a robot plan for achieving the task if the task-oriented autonomy level indicates high robot initiative.

  3. When Unbiased Probabilistic Learning Is Not Enough: Acquiring a Parametric System of Metrical Phonology

    Science.gov (United States)

    Pearl, Lisa S.

    2011-01-01

    Parametric systems have been proposed as models of how humans represent knowledge about language, motivated in part as a way to explain children's rapid acquisition of linguistic knowledge. Given this, it seems reasonable to examine if children with knowledge of parameters could in fact acquire the adult system from the data available to them.…

  4. A Runtime Testability Metric for Dynamic High-Availability Component-based Systems

    NARCIS (Netherlands)

    Gonzales-Sanchez, A.; Piel, E.A.B.; Gross, H.G.; Van Gemund, A.J.C.

    2011-01-01

    Runtime testing is emerging as the solution for the integration and assessment of highly dynamic, high availability software systems where traditional development-time integration testing cannot be performed. A prerequisite for runtime testing is the knowledge about to which extent the system can be

  5. Using Chronic Absence in a Multi-Metric Accountability System. Policy Memo 16-1

    Science.gov (United States)

    Hough, Heather

    2016-01-01

    With the passage of the Every Student Succeeds Act (ESSA) of 2015, California must integrate additional measures of student and school performance into the state-wide accountability system. To support the conversation as policymakers consider if/how to include chronic absenteeism data in the state's accountability system, PACE has conducted an…

  6. Cognitive Styles, Demographic Attributes, Task Performance and Affective Experiences: An Empirical Investigation into Astrophysics Data System (ADS) Core Users

    Science.gov (United States)

    Tong, Rong

    As a primary digital library portal for astrophysics researchers, the SAO/NASA ADS (Astrophysics Data System) 2.0 interface features several visualization tools such as Author Network and Metrics. This research study involves 20 long-term ADS users who participated in a usability and eye-tracking research session. Participants first completed a cognitive test and then performed five tasks in ADS 2.0 in which they explored its multiple visualization tools. Results show that over half of the participants were Imagers and half were Analytic. Cognitive styles were found to have significant impacts on several efficiency-based measures. Analytic-oriented participants were observed to spend less time on web pages and apps and to make fewer web page changes than less Analytic-oriented participants in performing common tasks, whereas AI (Analytic-Imagery) participants also completed their five tasks faster than non-AI participants. Meanwhile, self-identified Imagery participants were found to be more efficient in their task completion across multiple measures, including total time on task, number of mouse clicks, and number of query revisions made. Imagery scores were negatively associated with the frequency of confusion and the observed counts of being surprised. Compared to those who did not claim to be a visual person, self-identified Imagery participants were observed to show significantly less frustration and hesitation during their task performance. Both demographic variables and past user experience were found to correlate with task performance; query revision also correlated with multiple time-based measurements. Considered an indicator of efficiency, query revisions were found to correlate negatively with the rate of completion with ease, and positively with several time-based efficiency measures, the rate of completion with some difficulty, and the frequency of frustration. These results provide rich insights into the cognitive styles of ADS' core

  7. Plastics piping systems for industrial applications – Acrylonitrile-butadiene-styrene (ABS), unplasticized poly(vinyl chloride) (PVC-U) and chlorinated poly(vinyl chloride) (PVC-C) – Specifications for components and the systemMetric series

    CERN Document Server

    Deutsches Institut für Normung. Berlin

    2003-01-01

    Plastics piping systems for industrial applications – Acrylonitrile-butadiene-styrene (ABS), unplasticized poly(vinyl chloride) (PVC-U) and chlorinated poly(vinyl chloride) (PVC-C) – Specifications for components and the systemMetric series

  8. Plastics piping systems for industrial applications : acrylonitrile-butadiene- styrene (ABS), unplasticized poly(vinyl chloride) (PVC-U) and chlorinated poly(vinyl chloride) (PVC-C) : specifications for components and the system : metric series

    CERN Document Server

    International Organization for Standardization. Geneva

    2003-01-01

    Plastics piping systems for industrial applications : acrylonitrile-butadiene- styrene (ABS), unplasticized poly(vinyl chloride) (PVC-U) and chlorinated poly(vinyl chloride) (PVC-C) : specifications for components and the system : metric series

  9. Studies on load metric and communication for a load balancing algorithm in a distributed data acquisition system

    International Nuclear Information System (INIS)

    Simon, M; Kozielski, S; Sakulin, H

    2011-01-01

    The proposed method is designed for a data acquisition system acquiring data from n independent sources. The data sources are supposed to produce fragments that together constitute a logical whole. These fragments are produced with the same frequency and in the same sequence. The discussed algorithm aims to balance the data dynamically between m logically autonomous processing units (consisting of computing nodes) in case of variation in their processing power, which could be caused by faults such as failing computing nodes or broken network connections. As a case study we consider the Data Acquisition System of the Compact Muon Solenoid Experiment at CERN's new Large Hadron Collider. The system acquires data from about 500 sources and combines them into full events. Each data source is expected to deliver event fragments of an average size of 2 kB at a frequency of 100 kHz. In this paper we present the results of applying the proposed load metric and load communication pattern. Moreover, we discuss their impact on the algorithm's overall efficiency and scalability, as well as on the fault tolerance of the whole system. We also propose a general concept of an algorithm that allows all source nodes to choose the destination processing unit asynchronously while ensuring that all fragments of the same logical data always go to the same unit.
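
    The last idea — every source independently computing the same destination for a given event — can be sketched as a deterministic, weight-driven schedule (a hypothetical illustration, not the algorithm from the paper): each source derives an identical event-to-unit mapping from shared load weights, so fragments of the same event meet at the same unit without per-event coordination.

    ```python
    def build_schedule(weights, length=1000):
        """Deterministic event->unit schedule derived from load weights,
        using a smooth weighted round-robin over `length` slots. Every
        source computes the same schedule from the same weights."""
        credit = [0.0] * len(weights)
        schedule = []
        for _ in range(length):
            # Each unit accrues credit in proportion to its weight; the
            # richest unit serves the slot and pays back one full round.
            credit = [c + w for c, w in zip(credit, weights)]
            dest = max(range(len(credit)), key=credit.__getitem__)
            credit[dest] -= sum(weights)
            schedule.append(dest)
        return schedule

    def destination(event_id, schedule):
        # All sources index the shared schedule by event number.
        return schedule[event_id % len(schedule)]
    ```

    Rebalancing then amounts to broadcasting new weights and an activation event number, after which all sources switch to the new schedule in lockstep.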

  10. Outage performance of two-way DF relaying systems with a new relay selection metric

    KAUST Repository

    Hyadi, Amal; Benjillali, Mustapha; Alouini, Mohamed-Slim

    2012-01-01

    This paper investigates a new constrained relay selection scheme for two-way relaying systems where two end terminals communicate simultaneously via a relay. The introduced technique is based on the maximization of the weighted sum rate of both

  11. Compact Wireless BioMetric Monitoring and Real Time Processing System, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — BioWATCH is a modular ambulatory compact wireless biomedical data acquisition system. More specifically, it is a data acquisition unit for acquiring signals from...

  12. Reliability Evaluation Metrics for Internet of Things, Car Tracking System: A Review

    OpenAIRE

    Michael Onuoha Thomas; Babak Bashari Rad

    2017-01-01

    As technology continues to advance, the need to create benchmarks or standards for systems becomes a necessity, so as to ensure that these new advanced systems function at their maximum capacity over a long period of time without any failure, fault or error occurring. The internet of things technology promises a broad range of exciting products and services, with car tracking technology as part of the broad range of technological concept under the internet of things para...

  13. An Integrated Information System for Supporting Quality Management Tasks

    Science.gov (United States)

    Beyer, N.; Helmreich, W.

    2004-08-01

    In a competitive environment, well defined processes become the strategic advantage of a company. Hence, targeted Quality Management ensures efficiency, transparency and, ultimately, customer satisfaction. In the particular context of a Space Test Centre, a number of specific Quality Management standards have to be applied. According to the revision of ISO 9001 during 2000, and due to the adaptation of ECSS-Q20-07, process orientation and data analysis are key tasks for ensuring and evaluating the efficiency of a company's processes. In line with these requirements, an integrated management system for accessing the necessary information to support Quality Management and other processes has been established. Some of its test-related features are presented here. Easy access to the integrated management system from any work place at IABG's Space Test Centre is ensured by means of an intranet portal. It comprises a full set of quality-related process descriptions, information on test facilities, emergency procedures, and other relevant information. The portal's web interface provides direct access to a couple of external applications. Moreover, easy updating of all information and low cost maintenance are features of this integrated information system. The timely and transparent management of non-conformances is covered by a dedicated NCR database which incorporates full documentation capability, electronic signature and e-mail notification of concerned staff. A search interface allows for queries across all documented non-conformances. Furthermore, print versions can be generated at any stage in the process, e.g. for distribution to customers. Feedback on customer satisfaction is sought through a web-based questionnaire. The process is initiated by the responsible test manager through submission of an e-mail that contains a hyperlink to a secure website, asking the customer to complete the brief online form, which is directly fed to a database.

  14. Quantitative metrics for evaluating the phased roll-out of clinical information systems.

    Science.gov (United States)

    Wong, David; Wu, Nicolas; Watkinson, Peter

    2017-09-01

    We introduce a novel quantitative approach for evaluating the order of roll-out during phased introduction of clinical information systems. Such roll-outs are associated with unavoidable risk due to patients transferring between clinical areas using both the old and new systems. We proposed a simple graphical model of patient flow through a hospital. Using a simple instance of the model, we showed how a roll-out order can be generated by minimising the flow of patients from the new system to the old system. The model was applied to admission and discharge data acquired from 37,080 patient journeys at the Churchill Hospital, Oxford between April 2013 and April 2014. The resulting order was evaluated empirically and produced acceptable orders. The development of data-driven approaches to clinical information system roll-out provides insights that may not necessarily be ascertained through clinical judgment alone. Such methods could make a significant contribution to the smooth running of an organisation during the roll-out of a potentially disruptive technology. Unlike previous approaches, which are based on clinical opinion, the approach described here quantitatively assesses the appropriateness of competing roll-out strategies. The data-driven approach was shown to produce strategies that matched clinical intuition and provides a flexible framework that may be used to plan and monitor clinical information system roll-out. Copyright © 2017 The Author(s). Published by Elsevier B.V. All rights reserved.
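The ordering idea can be sketched on a toy model: pick the roll-out order that minimises total patient flow from already-converted ("new") wards to not-yet-converted ("old") wards. Ward names, transfer counts, and the exhaustive search below are illustrative assumptions; the paper derives orders from real admission and discharge data, where a greedy or flow-based heuristic would be needed at scale.

```python
import itertools

# Hypothetical ward-to-ward transfer counts (from, to) -> patients/year.
wards = ["ED", "ICU", "Surgery", "Ward A"]
transfers = {
    ("ED", "ICU"): 120, ("ED", "Surgery"): 80, ("ED", "Ward A"): 200,
    ("ICU", "Ward A"): 60, ("Surgery", "Ward A"): 90, ("Ward A", "ED"): 5,
}

def new_to_old_flow(order):
    """Total transfers from converted ('new') wards to not-yet-converted
    ('old') wards, summed over every phase of the roll-out."""
    cost, converted = 0, set()
    for ward in order:
        converted.add(ward)
        old = [w for w in wards if w not in converted]
        cost += sum(transfers.get((n, o), 0) for n in converted for o in old)
    return cost

# Exhaustive search is fine for four wards.
best = min(itertools.permutations(wards), key=new_to_old_flow)
print(best, new_to_old_flow(best))
```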

  15. Outage performance of two-way DF relaying systems with a new relay selection metric

    KAUST Repository

    Hyadi, Amal

    2012-04-01

    This paper investigates a new constrained relay selection scheme for two-way relaying systems where two end terminals communicate simultaneously via a relay. The introduced technique is based on the maximization of the weighted sum rate of both users. To evaluate the performance of the proposed system, the outage probability is derived in a general case (where an arbitrary channel is considered), and then over independently but not necessarily identically distributed (i.n.i.d.) Rayleigh fading channels. The analytical results are verified through simulations. © 2012 IEEE.
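As a hedged numerical illustration of the kind of outage analysis described (not the paper's two-way DF derivation), the snippet below checks a Monte Carlo estimate of a single Rayleigh-faded link's outage probability against the closed form 1 - exp(-(2^R - 1)/(SNR·Ω)); the SNR, rate, and channel parameters are invented.

```python
import math
import random

random.seed(0)

def outage_prob_exact(snr_db, rate, omega=1.0):
    """P(log2(1 + SNR*|h|^2) < rate) for a Rayleigh link, |h|^2 ~ Exp(1/omega)."""
    snr = 10 ** (snr_db / 10)
    return 1 - math.exp(-(2 ** rate - 1) / (snr * omega))

def outage_prob_mc(snr_db, rate, omega=1.0, trials=100_000):
    """Monte Carlo estimate of the same outage probability."""
    snr = 10 ** (snr_db / 10)
    thr = (2 ** rate - 1) / (snr * omega)  # outage iff normalized gain < thr
    return sum(random.expovariate(1.0) < thr for _ in range(trials)) / trials

print(outage_prob_exact(10, 1.0), outage_prob_mc(10, 1.0))
```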

  16. Predictive analytics tools to adjust and monitor performance metrics for the ATLAS Production System

    CERN Document Server

    Barreiro Megino, Fernando Harald; The ATLAS collaboration

    2017-01-01

    Having information such as an estimate of the processing time or of the possibility of a system outage (abnormal behaviour) helps to monitor system performance and to predict its next state. The current cyber-infrastructure presents computing conditions in which contention for resources among high-priority data analyses happens routinely, which might lead to significant workload and data handling interruptions. The inability to monitor and predict the behaviour of the analysis process (its duration) and of the system's state itself motivated the design of built-in situational awareness analytic tools.

  17. Observationally-based Metrics of Ocean Carbon and Biogeochemical Variables are Essential for Evaluating Earth System Model Projections

    Science.gov (United States)

    Russell, J. L.; Sarmiento, J. L.

    2017-12-01

    The Southern Ocean is central to the climate's response to increasing levels of atmospheric greenhouse gases as it ventilates a large fraction of the global ocean volume. Global coupled climate models and earth system models, however, vary widely in their simulations of the Southern Ocean and its role in, and response to, the ongoing anthropogenic forcing. Due to its complex water-mass structure and dynamics, Southern Ocean carbon and heat uptake depend on a combination of winds, eddies, mixing, buoyancy fluxes and topography. Understanding how the ocean carries heat and carbon into its interior and how the observed wind changes are affecting this uptake is essential to accurately projecting transient climate sensitivity. Observationally-based metrics are critical for discerning processes and mechanisms, and for validating and comparing climate models. As the community shifts toward Earth system models with explicit carbon simulations, more direct observations of important biogeochemical parameters, like those obtained from the biogeochemically-sensored floats that are part of the Southern Ocean Carbon and Climate Observations and Modeling project, are essential. One goal of future observing systems should be to create observationally-based benchmarks that will lead to reducing uncertainties in climate projections, and especially uncertainties related to oceanic heat and carbon uptake.

  18. Toward quantifying metrics for rail-system resilience : Identification and analysis of performance weak resilience signals

    NARCIS (Netherlands)

    Regt, A. de; Siegel, A.W.; Schraagen, J.M.C.

    2016-01-01

    This paper aims to enhance tangibility of the resilience engineering concept by facilitating understanding and operationalization of weak resilience signals (WRSs) in the rail sector. Within complex socio-technical systems, accidents can be seen as unwanted outcomes emerging from uncontrolled

  19. Engineering Task Plan for a vapor treatment system on Tank 241-C-103

    International Nuclear Information System (INIS)

    Conrad, R.B.

    1995-01-01

    This Engineering Task Plan describes tasks and responsibilities for the design, fabrication, test, and installation of a vapor treatment system (mixing system) on Tank 241-C-103. The mixing system is to be installed downstream of the breather filter and will use a mixing blower to reduce the chemical concentrations to below allowable levels

  20. Task-Oriented Spoken Dialog System for Second-Language Learning

    Science.gov (United States)

    Kwon, Oh-Woog; Kim, Young-Kil; Lee, Yunkeun

    2016-01-01

    This paper introduces a Dialog-Based Computer Assisted second-Language Learning (DB-CALL) system using task-oriented dialogue processing technology. The system promotes dialogue with a second-language learner for a specific task, such as purchasing tour tickets, ordering food, passing through immigration, etc. The dialog system plays a role of a…

  1. Status Report on Activities of the Systems Assessment Task Force, OECD-NEA Expert Group on Accident Tolerant Fuels for LWRs

    Energy Technology Data Exchange (ETDEWEB)

    Bragg-Sitton, Shannon Michelle [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-09-01

    The Organization for Economic Cooperation and Development /Nuclear Energy Agency (OECD/NEA) Nuclear Science Committee approved the formation of an Expert Group on Accident Tolerant Fuel (ATF) for LWRs (EGATFL) in 2014. Chaired by Kemal Pasamehmetoglu, INL Associate Laboratory Director for Nuclear Science and Technology, the mandate for the EGATFL defines work under three task forces: (1) Systems Assessment, (2) Cladding and Core Materials, and (3) Fuel Concepts. Scope for the Systems Assessment task force (TF1) includes definition of evaluation metrics for ATF, technology readiness level definition, definition of illustrative scenarios for ATF evaluation, and identification of fuel performance and system codes applicable to ATF evaluation. The Cladding and Core Materials (TF2) and Fuel Concepts (TF3) task forces will identify gaps and needs for modeling and experimental demonstration; define key properties of interest; identify the data necessary to perform concept evaluation under normal conditions and illustrative scenarios; identify available infrastructure (internationally) to support experimental needs; and make recommendations on priorities. Where possible, considering proprietary and other export restrictions (e.g., International Traffic in Arms Regulations), the Expert Group will facilitate the sharing of data and lessons learned across the international group membership. The Systems Assessment task force is chaired by Shannon Bragg-Sitton (Idaho National Laboratory [INL], U.S.), the Cladding Task Force is chaired by Marie Moatti (Electricite de France [EdF], France), and the Fuels Task Force is chaired by Masaki Kurata (Japan Atomic Energy Agency [JAEA], Japan). The original Expert Group mandate was established for June 2014 to June 2016. In April 2016 the Expert Group voted to extend the mandate one additional year to June 2017 in order to complete the task force deliverables; this request was subsequently approved by the Nuclear Science Committee.
This

  2. Status Report on Activities of the Systems Assessment Task Force, OECD-NEA Expert Group on Accident Tolerant Fuels for LWRs

    International Nuclear Information System (INIS)

    Bragg-Sitton, Shannon Michelle

    2016-01-01

    The Organization for Economic Cooperation and Development /Nuclear Energy Agency (OECD/NEA) Nuclear Science Committee approved the formation of an Expert Group on Accident Tolerant Fuel (ATF) for LWRs (EGATFL) in 2014. Chaired by Kemal Pasamehmetoglu, INL Associate Laboratory Director for Nuclear Science and Technology, the mandate for the EGATFL defines work under three task forces: (1) Systems Assessment, (2) Cladding and Core Materials, and (3) Fuel Concepts. Scope for the Systems Assessment task force (TF1) includes definition of evaluation metrics for ATF, technology readiness level definition, definition of illustrative scenarios for ATF evaluation, and identification of fuel performance and system codes applicable to ATF evaluation. The Cladding and Core Materials (TF2) and Fuel Concepts (TF3) task forces will identify gaps and needs for modeling and experimental demonstration; define key properties of interest; identify the data necessary to perform concept evaluation under normal conditions and illustrative scenarios; identify available infrastructure (internationally) to support experimental needs; and make recommendations on priorities. Where possible, considering proprietary and other export restrictions (e.g., International Traffic in Arms Regulations), the Expert Group will facilitate the sharing of data and lessons learned across the international group membership. The Systems Assessment task force is chaired by Shannon Bragg-Sitton (Idaho National Laboratory [INL], U.S.), the Cladding Task Force is chaired by Marie Moatti (Electricite de France [EdF], France), and the Fuels Task Force is chaired by Masaki Kurata (Japan Atomic Energy Agency [JAEA], Japan). The original Expert Group mandate was established for June 2014 to June 2016. In April 2016 the Expert Group voted to extend the mandate one additional year to June 2017 in order to complete the task force deliverables; this request was subsequently approved by the Nuclear Science Committee.
This

  3. Testing Quality and Metrics for the LHC Magnet Powering System throughout Past and Future Commissioning

    OpenAIRE

    Anderson, D; Audrain, M; Charifoulline, Z; Dragu, M; Fuchsberger, K; Garnier, JC; Gorzawski, AA; Koza, M; Krol, K; Rowan, S; Stamos, K; Zerlauth, M

    2014-01-01

    The LHC magnet powering system is composed of thousands of individual components to assure a safe operation when operating with stored energies as high as 10GJ in the superconducting LHC magnets. Each of these components has to be thoroughly commissioned following interventions and machine shutdown periods to assure their protection function in case of powering failures. As well as having dependable tracking of test executions it is vital that the executed commissioning steps and applied anal...

  4. Performance metric optimization advocates CPFR in supply chains: A system dynamics model based study

    OpenAIRE

    Balaji Janamanchi; James R. Burns

    2016-01-01

    Background: Supply Chain partners often find themselves in rather helpless positions, unable to improve their firm’s performance and profitability because their partners although willing to share production information do not fully collaborate in tackling customer order variations as they don’t seem to appreciate the benefits of such collaboration. Methods: We use a two-player (supplier-manufacturer) System Dynamics model to study the dynamics to assess the impact and usefulness of supply cha...

  5. Evaluation of mobile phone camera benchmarking using objective camera speed and image quality metrics

    Science.gov (United States)

    Peltoketo, Veli-Tapani

    2014-11-01

    When a mobile phone camera is tested and benchmarked, the significance of image quality metrics is widely acknowledged. There are also existing methods to evaluate camera speed. However, the speed or rapidity metrics of the mobile phone's camera system have not been used together with the quality metrics, even though camera speed has become a more and more important camera performance feature. There are several tasks in this work. First, the most important image quality and speed-related metrics of a mobile phone's camera system are collected from standards and papers, and novel speed metrics are also identified. Second, combinations of the quality and speed metrics are validated using mobile phones on the market. The measurements are made against the application programming interfaces of different operating systems. Finally, the results are evaluated and conclusions are made. The paper defines a solution to combine different image quality and speed metrics into a single benchmarking score. A proposal of the combined benchmarking metric is evaluated using measurements of 25 mobile phone cameras on the market. The paper is a continuation of a previous benchmarking work, expanded with visual noise measurement and updates for the latest mobile phone versions.
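One simple way to fold normalized quality and speed metrics into a single benchmarking score, in the spirit of the combination the paper proposes, is a weighted geometric mean. The metric names, weights, and normalization below are illustrative assumptions, not the paper's actual formula.

```python
import math

def benchmark_score(metrics, weights):
    """metrics: name -> value normalized to (0, 1], 1 = best.
    weights: name -> relative importance (renormalized to sum to 1).
    Returns the weighted geometric mean of the metrics."""
    total = sum(weights.values())
    return math.prod(metrics[k] ** (weights[k] / total) for k in metrics)

# Hypothetical phone: two quality metrics and one speed metric,
# with sharpness weighted twice as heavily.
phone = {"sharpness": 0.85, "visual_noise": 0.7, "shot_to_shot_speed": 0.6}
weights = {"sharpness": 2, "visual_noise": 1, "shot_to_shot_speed": 1}
print(round(benchmark_score(phone, weights), 3))
```

A geometric mean (rather than arithmetic) penalizes a camera that is excellent on one axis but very poor on another, which is usually the desired behaviour for a combined score.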

  6. Overall Bike Effectiveness as a Sustainability Metric for Bike Sharing Systems

    Directory of Open Access Journals (Sweden)

    Bernardo Nugroho Yahya

    2017-11-01

    Full Text Available Bike sharing systems (BSS) have been widely accepted as an urban transport scheme in many cities around the world. The concept has recently been expanded and followed by many cities to offer citizens a "green" and flexible transportation scheme in urban areas. Many works focus on the issue of bike availability, while bike performance, i.e., life cycle issues and sustainability, has been neglected in management. As a consequence, mismanagement of a BSS can lead to cost inefficiency and, in the worst case, end with operation termination. This study proposes a design science approach by developing an Overall Bike Effectiveness (OBE) framework. By incorporating the concept of overall equipment effectiveness (OEE), the proposed framework is used to measure bike utilization. Accordingly, the OBE is extended into a Theoretical OBE to measure the sustainability of an early-stage BSS. The framework has been verified and evaluated using a real dataset of a BSS. The proposed method provides valuable results for benchmarking, life cycle analysis, system expansion and strategic planning toward sustainability. The paper concludes with a discussion of the impact of the proposed approach on the real practices of BSS, including an outlook on the sustainability of BSS.
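A minimal sketch of an OEE-style effectiveness figure applied to a shared bike, assuming (as in classic OEE) a product of availability, performance, and quality ratios. The field names and numbers are illustrative, not the paper's actual OBE definition.

```python
def obe(available_h, scheduled_h, trips, ideal_trips, good_trips):
    """OEE-style product of three ratios for one bike over a period."""
    availability = available_h / scheduled_h  # not under repair/rebalancing
    performance = trips / ideal_trips         # actual vs. achievable trips
    quality = good_trips / trips              # trips completed without faults
    return availability * performance * quality

# Hypothetical bike: 20 of 24 scheduled hours available, 10 of 15
# achievable trips taken, 9 of those 10 trips fault-free.
print(obe(available_h=20, scheduled_h=24, trips=10, ideal_trips=15, good_trips=9))  # → 0.5
```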

  7. Statistical analysis and decoding of neural activity in the rodent geniculate ganglion using a metric-based inference system.

    Directory of Open Access Journals (Sweden)

    Wei Wu

    Full Text Available We analyzed the spike discharge patterns of two types of neurons in the rodent peripheral gustatory system, Na specialists (NS) and acid generalists (AG), in response to lingual stimulation with NaCl, acetic acid, and mixtures of the two stimuli. Previous computational investigations found that both spike rate and spike timing contribute to taste quality coding. These studies used commonly accepted computational methods, but they do not provide a consistent statistical evaluation of spike trains. In this paper, we adopted a new computational framework that treated each spike train as an individual data point for computing summary statistics such as mean and variance in the spike train space. We found that these statistical summaries properly characterized the firing patterns (e.g. template and variability) and quantified the differences between NS and AG neurons. The same framework was also used to assess the discrimination performance of NS and AG neurons and to remove spontaneous background activity or "noise" from the spike train responses. The results indicated that the new metric system provided the desired decoding performance and that noise removal improved stimulus classification accuracy, especially for neurons with high spontaneous rates. In summary, this new method naturally conducts statistical analysis and neural decoding under one consistent framework, and the results demonstrated that individual peripheral gustatory neurons generate a unique and reliable firing pattern during sensory stimulation and that this pattern can be reliably decoded.
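A common choice for treating spike trains as points in a metric space, in the spirit of the framework above, is the Victor-Purpura edit distance. The sketch below is a generic dynamic-programming implementation with an illustrative cost parameter q and invented spike times; it is not the authors' specific metric system.

```python
def vp_distance(s1, s2, q=1.0):
    """Victor-Purpura distance between two sorted spike-time lists:
    minimum cost of editing s1 into s2, where inserting or deleting a
    spike costs 1 and shifting a spike by dt costs q*|dt|."""
    n, m = len(s1), len(s2)
    d = [[0.0] * (m + 1) for _ in range(n + 1)]
    for i in range(n + 1):
        d[i][0] = i          # delete all remaining spikes of s1
    for j in range(m + 1):
        d[0][j] = j          # insert all remaining spikes of s2
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d[i][j] = min(d[i - 1][j] + 1,                      # delete
                          d[i][j - 1] + 1,                      # insert
                          d[i - 1][j - 1] + q * abs(s1[i - 1] - s2[j - 1]))  # shift
    return d[n][m]

# Two nearly identical trains: only the middle spike is shifted by 0.1 s.
print(vp_distance([0.1, 0.5, 0.9], [0.1, 0.6, 0.9]))
```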

  8. Affective feedback in a tutoring system for procedural tasks

    NARCIS (Netherlands)

    Heylen, Dirk K.J.; André, E.; Vissers, M.; Dybkjaer, L.; Minker, W.; op den Akker, Hendrikus J.A.; Heisterkamp, P.; Nijholt, Antinus

    2004-01-01

    We discuss the affective aspects of tutoring dialogues in an ITS -called INES- that helps students to practice nursing tasks using a haptic device and a virtual environment. Special attention is paid to affective control in the tutoring process by means of selecting the appropriate feedback, taking

  9. A flocking algorithm for multi-agent systems with connectivity preservation under hybrid metric-topological interactions.

    Science.gov (United States)

    He, Chenlong; Feng, Zuren; Ren, Zhigang

    2018-01-01

    In this paper, we propose a connectivity-preserving flocking algorithm for multi-agent systems in which the neighbor set of each agent is determined by the hybrid metric-topological distance so that the interaction topology can be represented as the range-limited Delaunay graph, which combines the properties of the commonly used disk graph and Delaunay graph. As a result, the proposed flocking algorithm has the following advantages over the existing ones. First, range-limited Delaunay graph is sparser than the disk graph so that the information exchange among agents is reduced significantly. Second, some links irrelevant to the connectivity can be dynamically deleted during the evolution of the system. Thus, the proposed flocking algorithm is more flexible than existing algorithms, where links are not allowed to be disconnected once they are created. Finally, the multi-agent system spontaneously generates a regular quasi-lattice formation without imposing the constraint on the ratio of the sensing range of the agent to the desired distance between two adjacent agents. With the interaction topology induced by the hybrid distance, the proposed flocking algorithm can still be implemented in a distributed manner. We prove that the proposed flocking algorithm can steer the multi-agent system to a stable flocking motion, provided the initial interaction topology of multi-agent systems is connected and the hysteresis in link addition is smaller than a derived upper bound. The correctness and effectiveness of the proposed algorithm are verified by extensive numerical simulations, where the flocking algorithms based on the disk and Delaunay graph are compared.
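The sparsification effect of a hybrid metric-topological neighbour rule can be illustrated with the Gabriel graph, a subgraph of the Delaunay graph that is easy to compute by brute force. This is a stand-in sketch, not the paper's range-limited Delaunay construction; agent positions and the sensing range are invented.

```python
import math

def disk_edges(pts, r):
    """All agent pairs within sensing range r (the disk graph)."""
    return {(i, j) for i in range(len(pts)) for j in range(i + 1, len(pts))
            if math.dist(pts[i], pts[j]) <= r}

def gabriel_edges(pts, r, eps=1e-9):
    """Keep a disk-graph edge (i, j) only if no third agent lies strictly
    inside the circle whose diameter is the segment from i to j."""
    kept = set()
    for i, j in disk_edges(pts, r):
        cx = (pts[i][0] + pts[j][0]) / 2
        cy = (pts[i][1] + pts[j][1]) / 2
        rad = math.dist(pts[i], pts[j]) / 2
        if all(math.dist((cx, cy), pts[k]) >= rad - eps
               for k in range(len(pts)) if k not in (i, j)):
            kept.add((i, j))
    return kept

agents = [(0, 0), (1, 0), (2, 0), (1, 1)]
print(len(disk_edges(agents, 2.5)), len(gabriel_edges(agents, 2.5)))  # → 6 5
```

The edge between the two collinear outer agents is dropped because the middle agent already relays between them, which is exactly the kind of redundant link a topology-aware rule removes while a pure disk graph keeps it.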

  10. A flocking algorithm for multi-agent systems with connectivity preservation under hybrid metric-topological interactions.

    Directory of Open Access Journals (Sweden)

    Chenlong He

    Full Text Available In this paper, we propose a connectivity-preserving flocking algorithm for multi-agent systems in which the neighbor set of each agent is determined by the hybrid metric-topological distance so that the interaction topology can be represented as the range-limited Delaunay graph, which combines the properties of the commonly used disk graph and Delaunay graph. As a result, the proposed flocking algorithm has the following advantages over the existing ones. First, range-limited Delaunay graph is sparser than the disk graph so that the information exchange among agents is reduced significantly. Second, some links irrelevant to the connectivity can be dynamically deleted during the evolution of the system. Thus, the proposed flocking algorithm is more flexible than existing algorithms, where links are not allowed to be disconnected once they are created. Finally, the multi-agent system spontaneously generates a regular quasi-lattice formation without imposing the constraint on the ratio of the sensing range of the agent to the desired distance between two adjacent agents. With the interaction topology induced by the hybrid distance, the proposed flocking algorithm can still be implemented in a distributed manner. We prove that the proposed flocking algorithm can steer the multi-agent system to a stable flocking motion, provided the initial interaction topology of multi-agent systems is connected and the hysteresis in link addition is smaller than a derived upper bound. The correctness and effectiveness of the proposed algorithm are verified by extensive numerical simulations, where the flocking algorithms based on the disk and Delaunay graph are compared.

  11. Metrics of quantum states

    International Nuclear Information System (INIS)

    Ma Zhihao; Chen Jingling

    2011-01-01

    In this work we study metrics of quantum states, which are natural generalizations of the usual trace metric and Bures metric. Some useful properties of the metrics are proved, such as the joint convexity and contractivity under quantum operations. Our result has a potential application in studying the geometry of quantum states as well as the entanglement detection.
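A small numerical illustration of the usual trace metric mentioned above (not the paper's generalized metrics): D(ρ, σ) = ½ Tr|ρ − σ|, computed for 2x2 density matrices via the closed-form eigenvalues of the Hermitian difference.

```python
import math

def trace_distance_qubit(rho, sigma):
    """(1/2) Tr |rho - sigma| for 2x2 density matrices (nested lists).
    The difference is Hermitian, so its eigenvalues follow from the
    quadratic formula: lambda = (tr ± sqrt(tr^2 - 4 det)) / 2."""
    a = (rho[0][0] - sigma[0][0]).real
    d = (rho[1][1] - sigma[1][1]).real
    b = rho[0][1] - sigma[0][1]
    tr, det = a + d, a * d - abs(b) ** 2
    disc = math.sqrt(max(tr * tr - 4 * det, 0.0))
    lam1, lam2 = (tr + disc) / 2, (tr - disc) / 2
    return (abs(lam1) + abs(lam2)) / 2

rho = [[1.0, 0.0], [0.0, 0.0]]    # |0><0|
sigma = [[0.5, 0.5], [0.5, 0.5]]  # |+><+|
print(trace_distance_qubit(rho, sigma))  # → 0.7071... = 1/sqrt(2)
```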

  12. Parametric sensitivity analysis for stochastic molecular systems using information theoretic metrics

    Energy Technology Data Exchange (ETDEWEB)

    Tsourtis, Anastasios, E-mail: tsourtis@uoc.gr [Department of Mathematics and Applied Mathematics, University of Crete, Crete (Greece); Pantazis, Yannis, E-mail: pantazis@math.umass.edu; Katsoulakis, Markos A., E-mail: markos@math.umass.edu [Department of Mathematics and Statistics, University of Massachusetts, Amherst, Massachusetts 01003 (United States); Harmandaris, Vagelis, E-mail: harman@uoc.gr [Department of Mathematics and Applied Mathematics, University of Crete, and Institute of Applied and Computational Mathematics (IACM), Foundation for Research and Technology Hellas (FORTH), GR-70013 Heraklion, Crete (Greece)

    2015-07-07

    In this paper, we present a parametric sensitivity analysis (SA) methodology for continuous time and continuous space Markov processes represented by stochastic differential equations. Particularly, we focus on stochastic molecular dynamics as described by the Langevin equation. The utilized SA method is based on the computation of the information-theoretic (and thermodynamic) quantity of relative entropy rate (RER) and the associated Fisher information matrix (FIM) between path distributions, and it is an extension of the work proposed by Y. Pantazis and M. A. Katsoulakis [J. Chem. Phys. 138, 054115 (2013)]. A major advantage of the pathwise SA method is that both RER and pathwise FIM depend only on averages of the force field; therefore, they are tractable and computable as ergodic averages from a single run of the molecular dynamics simulation both in equilibrium and in non-equilibrium steady state regimes. We validate the performance of the extended SA method on two different molecular stochastic systems, a standard Lennard-Jones fluid and an all-atom methane liquid, and compare the obtained parameter sensitivities with parameter sensitivities on three popular and well-studied observable functions, namely, the radial distribution function, the mean squared displacement, and the pressure. Results show that the RER-based sensitivities are highly correlated with the observable-based sensitivities.

  13. Sustainability Assessment of a Military Installation: A Template for Developing a Mission Sustainability Framework, Goals, Metrics and Reporting System

    Science.gov (United States)

    2009-08-01

    (Table excerpt) MSF Category: Neighbors and Stakeholders (NS). Conceptual metric NS1: "Walkable" on-base community design (clustering of facilities, presence of sidewalks, need for a car, access to public transit), scored by adapting LEED for Neighborhood Development (ND) as a 0-100 index of walkable-community indicators.

  14. The wasted energy: A metric to set up appropriate targets in our path towards fully renewable energy systems

    International Nuclear Information System (INIS)

    Vinagre Díaz, Juan José; Wilby, Mark Richard; Rodríguez González, Ana Belén

    2015-01-01

    By 2020 Europe has to increase its energy efficiency and share of renewables in 20%. However, even accomplishing these challenging objectives Europe will be effectively wasting energy as we demonstrate in this paper. In our way towards a fully renewable scenario, we need at least to stop wasting energy in order to guarantee the energy supply needed for growth and comfort. We waste energy when we employ more primary energy than the final energy we ultimately use and this excess cannot be reutilized. In this paper we propose the WE (wasted energy) as a novel metric to measure the performance of energy systems and set up appropriate targets. The WE incorporates information about energy efficiency and renewable sources. Unlike European legislation, the WE considers them in an integrated way. This approach will help Member States to exploit their intrinsic capabilities and design their optimum strategy to reach their objectives. Using the information in Eurostat, we calculate the WE of Member States in EU-28 and their evolution. We also analyze illustrative examples to highlight strategies to reduce the WE, study the connection between economic development and WE, and provide a tool to diagnose the potential of improvement of an energy system. - Highlights: • Even achieving the 2020 objectives, Europe will still be wasting energy. • We need to reduce wasted energy in our way towards 100% renewable energy systems. • The WE (wasted energy) integrates efficiency and renewable in a single target. • We provide the empirical WE of Member States in EU-28 and their evolution. • Finally we highlight best practices of real energy systems.
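A minimal numeric sketch of the WE idea described above: the excess of primary energy employed over the final energy ultimately used. The actual WE metric also incorporates the renewable share, which this toy version omits, and the TWh figures are invented, not Eurostat data.

```python
def wasted_energy(primary_twh, final_twh):
    """Primary energy employed beyond the final energy ultimately used
    (conversion and distribution losses that cannot be reutilized)."""
    return primary_twh - final_twh

# Hypothetical Member State: 1600 TWh of primary energy consumed to
# deliver 1100 TWh of final energy.
print(wasted_energy(1600, 1100))  # → 500
```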

  15. NASA education briefs for the classroom. Metrics in space

    Science.gov (United States)

    The use of metric measurement in space is summarized for classroom use. Advantages of the metric system over the English measurement system are described. Some common metric units are defined, as are special units for astronomical study. International system unit prefixes and a conversion table of metric/English units are presented. Questions and activities for the classroom are recommended.
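In the spirit of the brief's metric/English conversion table, a minimal conversion helper; the three factors used are the standard exact definitions (1 in = 2.54 cm, 1 mi = 1.609344 km, 1 lb = 0.45359237 kg).

```python
# Exact conversion factors from English to metric units.
CONVERSIONS = {
    ("in", "cm"): 2.54,
    ("mi", "km"): 1.609344,
    ("lb", "kg"): 0.45359237,
}

def convert(value, src, dst):
    """Convert value from unit src to unit dst using the table above."""
    return value * CONVERSIONS[(src, dst)]

print(convert(10, "in", "cm"))  # → 25.4
```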

  16. $\eta$-metric structures

    OpenAIRE

    Gaba, Yaé Ulrich

    2017-01-01

    In this paper, we discuss recent results about generalized metric spaces and fixed point theory. We introduce the notion of $\eta$-cone metric spaces, give some topological properties and prove some fixed point theorems for contractive type maps on these spaces. In particular, we show that these $\eta$-cone metric spaces are natural generalizations of both cone metric spaces and metric type spaces.

  17. Status Report on Activities of the Systems Assessment Task Force, OECD-NEA Expert Group on Accident Tolerant Fuels for LWRs

    Energy Technology Data Exchange (ETDEWEB)

    Bragg-Sitton, Shannon Michelle [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-09-01

    The Organization for Economic Cooperation and Development /Nuclear Energy Agency (OECD/NEA) Nuclear Science Committee approved the formation of an Expert Group on Accident Tolerant Fuel (ATF) for LWRs (EGATFL) in 2014. Chaired by Kemal Pasamehmetoglu, INL Associate Laboratory Director for Nuclear Science and Technology, the mandate for the EGATFL defines work under three task forces: (1) Systems Assessment, (2) Cladding and Core Materials, and (3) Fuel Concepts. Scope for the Systems Assessment task force includes definition of evaluation metrics for ATF, technology readiness level definition, definition of illustrative scenarios for ATF evaluation, parametric studies, and selection of system codes. The Cladding and Core Materials and Fuel Concepts task forces will identify gaps and needs for modeling and experimental demonstration; define key properties of interest; identify the data necessary to perform concept evaluation under normal conditions and illustrative scenarios; identify available infrastructure (internationally) to support experimental needs; and make recommendations on priorities. Where possible, considering proprietary and other export restrictions (e.g., International Traffic in Arms Regulations), the Expert Group will facilitate the sharing of data and lessons learned across the international group membership. The Systems Assessment Task Force is chaired by Shannon Bragg-Sitton (INL), while the Cladding Task Force will be chaired by a representative from France (Marie Moatti, Electricite de France [EdF]) and the Fuels Task Force will be chaired by a representative from Japan (Masaki Kurata, Japan Atomic Energy Agency [JAEA]). This report provides an overview of the Systems Assessment Task Force charter and status of work accomplishment.

  18. Coverage Metrics for Model Checking

    Science.gov (United States)

    Penix, John; Visser, Willem; Norvig, Peter (Technical Monitor)

    2001-01-01

    When using model checking to verify programs in practice, it is not usually possible to achieve complete coverage of the system. In this position paper we describe ongoing research within the Automated Software Engineering group at NASA Ames on the use of test coverage metrics to measure partial coverage and provide heuristic guidance for program model checking. We are specifically interested in applying and developing coverage metrics for concurrent programs that might be used to support certification of next generation avionics software.

  19. Informatization tools (means) for checking study effectiveness based on a hierarchical system of tasks

    Directory of Open Access Journals (Sweden)

    Сергей Викторович Криволапов

    2010-09-01

    Full Text Available A system for checking study effectiveness, based on a hierarchical system of tasks, is considered in this article. The introduced performance-score model not only saves time and eases the teacher's work but also gives a more objective appraisal of the student's knowledge.

  20. Safety and Mission Assurance (SMA) Automated Task Order Management System (ATOMS) Operation Manual

    Science.gov (United States)

    Wallace, Shawn; Fikes, Lou A.

    2016-01-01

    This document describes operational aspects of the ATOMS system. The information provided is limited to the functionality provided by ATOMS and does not include information provided in the contractor's proprietary financial and task management system.

  1. A farm platform approach to optimizing temperate grazing-livestock systems: metrics for trade-off assessments and future innovations

    Science.gov (United States)

    Harris, Paul; Takahashi, Taro; Blackwell, Martin; Cardenas, Laura; Collins, Adrian; Dungait, Jennifer; Eisler, Mark; Hawkins, Jane; Misselbrook, Tom; Mcauliffe, Graham; Mcfadzean, Jamie; Murray, Phil; Orr, Robert; Jordana Rivero, M.; Wu, Lianhai; Lee, Michael

    2017-04-01

    data on hydrology, emissions, nutrient cycling, biodiversity, productivity and livestock welfare/health for 2 years (April 2011 to March 2013). Since April 2013, the platform has been progressively modified across three distinct ca. 22 ha farmlets with the underlying principle being to improve the sustainability (economic, social and environmental) by comparing contrasting pasture-based systems (permanent pasture, grass and clover swards, and reseeding of high quality germplasm on a regular cycle). This modification or transitional period ended in July 2015, when the platform assumed full post-baseline status. In this paper, we summarise the sustainability trade-off metrics developed to compare the three systems, together with the farm platform data collections used to create them; collections that can be viewed as 'big data' when considered in their entirety. We concentrate on the baseline and transitional periods and discuss the potential innovations to optimise grazing livestock systems utilising an experimental farm platform approach.

  2. Model-based identification and use of task complexity factors of human integrated systems

    International Nuclear Information System (INIS)

    Ham, Dong-Han; Park, Jinkyun; Jung, Wondea

    2012-01-01

Task complexity is one of the conceptual constructs that are critical to explain and predict human performance in human integrated systems. A basic approach to evaluating the complexity of tasks is to identify task complexity factors and measure them. Although a great number of task complexity factors have been studied, there is still a lack of conceptual frameworks for identifying and organizing them analytically, which can be generally used irrespective of the types of domains and tasks. This study proposes a model-based approach to identifying and using task complexity factors, which has two facets: the design aspects of a task and complexity dimensions. Three levels of design abstraction, which are functional, behavioral, and structural aspects of a task, characterize the design aspect of a task. The behavioral aspect is further classified into five cognitive processing activity types. The complexity dimensions explain task complexity from different perspectives, which are size, variety, and order/organization. Twenty-one task complexity factors are identified by the combination of the attributes of each facet. Identification and evaluation of task complexity factors based on this model is believed to give insights for improving the design quality of tasks. This model for complexity factors can also be used as a referential framework for allocating tasks and designing information aids. The proposed approach is applied to procedure-based tasks of nuclear power plants (NPPs) as a case study to demonstrate its use. Lastly, we compare the proposed approach with other studies and then suggest some future research directions.
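The factor count in the abstract can be checked mechanically: the functional and structural aspects plus five behavioral activity types give seven design aspects, crossed with three complexity dimensions. The activity-type names below are illustrative placeholders, not the paper's terms.

```python
from itertools import product

# Design aspects: functional, structural, and the behavioral aspect
# refined into five cognitive processing activity types (names below
# are hypothetical stand-ins, not taken from the paper).
design_aspects = (
    ["functional", "structural"]
    + [f"behavioral/{a}" for a in
       ["perceiving", "interpreting", "deciding", "acting", "monitoring"]]
)

# Complexity dimensions as listed in the abstract.
dimensions = ["size", "variety", "order/organization"]

# Each factor is one (design aspect, dimension) combination.
factors = list(product(design_aspects, dimensions))
print(len(factors))  # 7 aspects x 3 dimensions = 21 factors
```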

  3. Machine learning of network metrics in ATLAS Distributed Data Management

    CERN Document Server

Lassnig, Mario; The ATLAS collaboration; Toler, Wesley; Vamosi, Ralf; Bogado Garcia, Joaquin Ignacio

    2017-01-01

    The increasing volume of physics data poses a critical challenge to the ATLAS experiment. In anticipation of high luminosity physics, automation of everyday data management tasks has become necessary. Previously many of these tasks required human decision-making and operation. Recent advances in hardware and software have made it possible to entrust more complicated duties to automated systems using models trained by machine learning algorithms. In this contribution we show results from one of our ongoing automation efforts that focuses on network metrics. First, we describe our machine learning framework built atop the ATLAS Analytics Platform. This framework can automatically extract and aggregate data, train models with various machine learning algorithms, and eventually score the resulting models and parameters. Second, we use these models to forecast metrics relevant for network-aware job scheduling and data brokering. We show the characteristics of the data and evaluate the forecasting accuracy of our m...

  4. Machine learning of network metrics in ATLAS Distributed Data Management

    Science.gov (United States)

    Lassnig, Mario; Toler, Wesley; Vamosi, Ralf; Bogado, Joaquin; ATLAS Collaboration

    2017-10-01

The increasing volume of physics data poses a critical challenge to the ATLAS experiment. In anticipation of high luminosity physics, automation of everyday data management tasks has become necessary. Previously many of these tasks required human decision-making and operation. Recent advances in hardware and software have made it possible to entrust more complicated duties to automated systems using models trained by machine learning algorithms. In this contribution we show results from one of our ongoing automation efforts that focuses on network metrics. First, we describe our machine learning framework built atop the ATLAS Analytics Platform. This framework can automatically extract and aggregate data, train models with various machine learning algorithms, and eventually score the resulting models and parameters. Second, we use these models to forecast metrics relevant for network-aware job scheduling and data brokering. We show the characteristics of the data and evaluate the forecasting accuracy of our models.

  5. An Adaptive Regulator for Space Teleoperation System in Task Space

    Directory of Open Access Journals (Sweden)

    Chao Ge

    2014-01-01

Full Text Available The problem that gravity information cannot be obtained in advance for bilateral teleoperation is studied. In outer space exploration, the gravity term changes as the position of the slave manipulator changes, so it is necessary to design an adaptive regulator controller to compensate for the unknown gravity signal. Moreover, to obtain more accurate position tracking, the controller is designed in the task space instead of the joint space. Additionally, the time delay considered in this paper is not only time-varying but also unsymmetrical. Finally, simulations are presented to show the effectiveness of the proposed approach.

  6. Analysis and Modeling of Control Tasks in Dynamic Systems

    DEFF Research Database (Denmark)

    Ursem, Rasmus Kjær; Krink, Thiemo; Jensen, Mikkel Thomas

    2002-01-01

Most applications of evolutionary algorithms deal with static optimization problems. However, in recent years, there has been a growing interest in time-varying (dynamic) problems, which are typically found in real-world scenarios. One major challenge in this field is the design of realistic test-case generators (TCGs), which requires a systematic analysis of dynamic optimization tasks. So far, only a few TCGs have been suggested. Our investigation leads to the conclusion that these TCGs are not capable of generating realistic dynamic benchmark tests. The result of our research is the design of a new TCG...

  7. Paired-Associate and Feedback-Based Weather Prediction Tasks Support Multiple Category Learning Systems

    OpenAIRE

    Li, Kaiyun; Fu, Qiufang; Sun, Xunwei; Zhou, Xiaoyan; Fu, Xiaolan

    2016-01-01

    It remains unclear whether probabilistic category learning in the feedback-based weather prediction task (FB-WPT) can be mediated by a non-declarative or procedural learning system. To address this issue, we compared the effects of training time and verbal working memory, which influence the declarative learning system but not the non-declarative learning system, in the FB and paired-associate (PA) WPTs, as the PA task recruits a declarative learning system. The results of Experiment 1 showed...

  8. Concurrent performance of two memory tasks: evidence for domain-specific working memory systems.

    Science.gov (United States)

    Cocchini, Gianna; Logie, Robert H; Della Sala, Sergio; MacPherson, Sarah E; Baddeley, Alan D

    2002-10-01

    Previous studies of dual-task coordination in working memory have shown a lack of dual-task interference when a verbal memory task is combined with concurrent perceptuomotor tracking. Two experiments are reported in which participants were required to perform pairwise combinations of (1) a verbal memory task, a visual memory task, and perceptuomotor tracking (Experiment 1), and (2) pairwise combinations of the two memory tasks and articulatory suppression (Experiment 2). Tracking resulted in no disruption of the verbal memory preload over and above the impact of a delay in recall and showed only minimal disruption of the retention of the visual memory load. Performing an ongoing verbal memory task had virtually no impact on retention of a visual memory preload or vice versa, indicating that performing two demanding memory tasks results in little mutual interference. Experiment 2 also showed minimal disruption when the two memory tasks were combined, although verbal memory (but not visual memory) was clearly disrupted by articulatory suppression interpolated between presentation and recall. These data suggest that a multiple-component working memory model provides a better account for performance in concurrent immediate memory tasks than do theories that assume a single processing and storage system or a limited-capacity attentional system coupled with activated memory traces.

  9. Application of robotic systems to nuclear power plant maintenance tasks

    International Nuclear Information System (INIS)

    Kok, K.D.; Bartilson, B.M.; Rosen, K.L.; Renner, G.F.; Law, T.M.

    1984-01-01

    Robotics technology has developed to where it can provide consistent performance of well-defined tasks. Although nuclear power plant maintenance tasks are characteristically unique, there are some common subtasks which have the consistency required for robots. Several maintenance activities were selected for further study. Concepts for robotic devices and rough scenarios for their use were developed and analyzed for their effect on maintenance costs. The results of the analysis, which was performed using a 10-year life and conservative estimates and procedures, indicate cost savings ranging from $100,000 to $1.5 M in net present value per robot. Projected purchase prices for the robots were less than $200,000. Although the robot concepts used commercially available technology, they are unlike any products either in use or widely required. Robot manufacturers are concentrating on mainstream applications in production, and are unlikely to develop such specialized products. The potential for cost savings indicates that developments should be funded by the nuclear industry

  10. A Practical Method for Collecting Social Media Campaign Metrics

    Science.gov (United States)

    Gharis, Laurie W.; Hightower, Mary F.

    2017-01-01

    Today's Extension professionals are tasked with more work and fewer resources. Integrating social media campaigns into outreach efforts can be an efficient way to meet work demands. If resources go toward social media, a practical method for collecting metrics is needed. Collecting metrics adds one more task to the workloads of Extension…

  11. Different Ways to Cue a Coherent Memory System: A Theory for Episodic, Semantic, and Procedural Tasks.

    Science.gov (United States)

    Humphreys, Michael S.; And Others

    1989-01-01

    An associative theory of memory is proposed to serve as a counterexample to claims that dissociations among episodic, semantic, and procedural memory tasks necessitate separate memory systems. The theory is based on task analyses of matching (recognition and familiarity judgments), retrieval (cued recall), and production (free association). (TJH)

  12. International Energy Agency Ocean Energy Systems Task 10 Wave Energy Converter Modeling Verification and Validation

    DEFF Research Database (Denmark)

    Wendt, Fabian F.; Yu, Yi-Hsiang; Nielsen, Kim

    2017-01-01

    This is the first joint reference paper for the Ocean Energy Systems (OES) Task 10 Wave Energy Converter modeling verification and validation group. The group is established under the OES Energy Technology Network program under the International Energy Agency. OES was founded in 2001 and Task 10 ...

  13. Evaluating Application-Layer Traffic Optimization Cost Metrics for P2P Multimedia Streaming

    DEFF Research Database (Denmark)

    Poderys, Justas; Soler, José

    2017-01-01

    To help users of P2P communication systems perform better-than-random selection of communication peers, Internet Engineering Task Force standardized the Application Layer Traffic Optimization (ALTO) protocol. The ALTO provided data-routing cost metric, can be used to rank peers in P2P communicati...
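Better-than-random peer selection with an ALTO-style cost map reduces to sorting candidate peers by the routing cost assigned to their network partition (PID). The PID names, addresses, and cost values below are hypothetical, and a real client would fetch the cost map from an ALTO server rather than hard-code it.

```python
# Hypothetical ALTO-style cost map: "routingcost" from the client's PID
# to each candidate peer's PID (provider-defined network partitions).
cost_map = {"pid1": 1, "pid2": 5, "pid3": 10}

# Candidate peers as returned by a tracker, each tagged with its PID.
peers = [
    {"addr": "10.0.0.5", "pid": "pid2"},
    {"addr": "10.0.1.7", "pid": "pid1"},
    {"addr": "10.0.2.9", "pid": "pid3"},
]

def rank_peers(peers, cost_map):
    """Prefer peers in network partitions with the lowest routing cost."""
    return sorted(peers, key=lambda p: cost_map[p["pid"]])

for p in rank_peers(peers, cost_map):
    print(p["addr"], cost_map[p["pid"]])
```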

  14. Comparative Study of Optical and RF Communication Systems for a Mars Mission - Part II. Unified Value Metrics

    Science.gov (United States)

Hemmati, H.; Layland, J.; Lesh, J.; Wilson, K.; Sue, M.; Rascoe, D.; Lansing, F.; Wilhelm, M.; Harcke, L.; Chen, C.; And Others

    1997-01-01

In this Part II report of the Advanced Communications Benefits study, two critical metrics for comparing the benefits of utilizing X-band, Ka-band, and optical frequencies to support generic classes of Martian exploration missions are evaluated.

  15. Task 11 - systems analysis of environmental management technologies

    Energy Technology Data Exchange (ETDEWEB)

    Musich, M.A.

    1997-06-01

    A review was conducted of three systems analysis (SA) studies performed by Lockheed Idaho Technologies Company (LITCO) on integrated thermal treatment systems (ITTs) and integrated nonthermal treatment systems (INTSs) for the remediation of mixed low-level waste (MLLW) stored throughout the U.S. Department of Energy (DOE) weapons complex. The review was performed by an independent team led by the Energy & Environment Research Center (EERC), including Science Applications International Corporation (SAIC), the Waste Policy Institute (WPI), and Virginia Tech.

  16. Task 11 - systems analysis of environmental management technologies. Topical report

    International Nuclear Information System (INIS)

    Musich, M.A.

    1997-06-01

A review was conducted of three systems analysis (SA) studies performed by Lockheed Idaho Technologies Company (LITCO) on integrated thermal treatment systems (ITTs) and integrated nonthermal treatment systems (INTSs) for the remediation of mixed low-level waste (MLLW) stored throughout the U.S. Department of Energy (DOE) weapons complex. The review was performed by an independent team led by the Energy & Environment Research Center (EERC), including Science Applications International Corporation (SAIC), the Waste Policy Institute (WPI), and Virginia Tech

  17. Tasks and structure of the WENDELSTEIN 7-X control system

    International Nuclear Information System (INIS)

    Schacht, Joerg; Niedermeyer, Helmut; Laqua, Heike; Spring, Anett; Mueller, Ina; Pingel, Steffen; Woelk, Andreas

    2006-01-01

The superconducting stellarator WENDELSTEIN 7-X will run pulses of up to 30 min duration with full heating power. Short pulses with arbitrary intervals, steady state long discharges and arbitrary sequences of short phases with different characteristics in one discharge will be supported by the control system. Each technical component and each diagnostic system including its data acquisition will have its own control system permitting autonomous operation for commissioning and testing. During the experimental sessions the activity of these devices will be coordinated by a central control system and the machine runs more or less automatically with predefined programs. A session leader program allows the leader of the experiment to choose and chain predefined segments, and to start or stop a segment chain as a discharge. The progress of the discharge is shown by a sequence monitor attached to the central sequence controller and the session leader program. W7-X has to be prepared for the experiment and monitored by means of the PLC-based operational management system. A safety system working independently of the operational management consists of local units responsible for the safety of each component and a central unit ensuring the safety of the whole W7-X system. This safety system provides interlocks and controls human access to the device. A safety analysis is the basis for the development of the safety system

  18. Energy Efficiency of Task Allocation for Embedded JPEG Systems

    Directory of Open Access Journals (Sweden)

    Yang-Hsin Fan

    2014-01-01

Full Text Available Embedded systems work everywhere, repeatedly performing a few particular functions. Well-known products include consumer electronics, smart home applications, and telematics devices. Recently, embedded system development methodology has been applied to the design of cloud embedded systems, making the applications of embedded systems more diverse. However, the more an embedded system works, the more energy it consumes. This study presents hyperrectangle technology (HT) for embedded systems to obtain energy savings. HT adopts a drift effect to construct embedded systems with more hardware circuits than software components, or vice versa. It can quickly construct an embedded system from a set of hardware circuits and software components, and it offers a great benefit for fast exploration of energy consumption across various embedded systems. The effects are presented by assessing JPEG benchmarks. Experimental results demonstrate that HT achieves average energy savings of 29.84%, 2.07%, and 68.80% relative to GA, GHO, and Lin, respectively.

  19. Energy efficiency of task allocation for embedded JPEG systems.

    Science.gov (United States)

    Fan, Yang-Hsin; Wu, Jan-Ou; Wang, San-Fu

    2014-01-01

Embedded systems work everywhere, repeatedly performing a few particular functions. Well-known products include consumer electronics, smart home applications, and telematics devices. Recently, embedded system development methodology has been applied to the design of cloud embedded systems, making the applications of embedded systems more diverse. However, the more an embedded system works, the more energy it consumes. This study presents hyperrectangle technology (HT) for embedded systems to obtain energy savings. HT adopts a drift effect to construct embedded systems with more hardware circuits than software components, or vice versa. It can quickly construct an embedded system from a set of hardware circuits and software components, and it offers a great benefit for fast exploration of energy consumption across various embedded systems. The effects are presented by assessing JPEG benchmarks. Experimental results demonstrate that HT achieves average energy savings of 29.84%, 2.07%, and 68.80% relative to GA, GHO, and Lin, respectively.

  20. Task-oriented control of Single-Master Multi-Slave Manipulator System

    International Nuclear Information System (INIS)

    Kosuge, Kazuhiro; Ishikawa, Jun; Furuta, Katsuhisa; Hariki, Kazuo; Sakai, Masaru.

    1994-01-01

    A master-slave manipulator system, in general, consists of a master arm manipulated by a human and a slave arm used for real tasks. Some tasks, such as manipulation of a heavy object, etc., require two or more slave arms operated simultaneously. A Single-Master Multi-Slave Manipulator System consists of a master arm with six degrees of freedom and two or more slave arms, each of which has six or more degrees of freedom. In this system, a master arm controls the task-oriented variables using Virtual Internal Model (VIM) based on the concept of 'Task-Oriented Control'. VIM is a reference model driven by sensory information and used to describe the desired relation between the motion of a master arm and task-oriented variables. The motion of slave arms are controlled based on the task oriented variables generated by VIM and tailors the system to meet specific tasks. A single-master multi-slave manipulator system, having two slave arms, is experimentally developed and illustrates the concept. (author)

  1. Communications data delivery system analysis task 2 report : high-level options for secure communications data delivery systems.

    Science.gov (United States)

    2012-05-16

    This Communications Data Delivery System Analysis Task 2 report describes and analyzes options for Vehicle to Vehicle (V2V) and Vehicle to Infrastructure (V2I) communications data delivery systems using various communication media (Dedicated Short Ra...

  2. A Verification Method of Inter-Task Cooperation in Embedded Real-time Systems and its Evaluation

    Science.gov (United States)

    Yoshida, Toshio

In the software development process for embedded real-time systems, the design of the task cooperation process is very important. The cooperation of such tasks is specified by task cooperation patterns. Adoption of unsuitable task cooperation patterns has a fatal influence on system performance, quality, and extendibility. In order to prevent repetitive work caused by inadequate task cooperation performance, it is necessary to verify task cooperation patterns at an early software development stage. However, this is very difficult at a stage where task program codes are not yet complete. Therefore, we propose a verification method using task skeleton program codes and a real-time kernel that records all events during software execution, such as system calls issued by task program codes, external interrupts, and timer interrupts. To evaluate the proposed verification method, we applied it to the software development process of a mechatronics control system.

  3. An Improved Task Scheduling Algorithm for Intelligent Control in Tiny Mechanical System

    Directory of Open Access Journals (Sweden)

    Jialiang Wang

    2014-01-01

Full Text Available Wireless sensor networks (WSNs) are already widely used in many fields, including industry, agriculture, and the military. Their basic building blocks are WSN nodes capable of processing, gathering information, and communicating with other connected nodes in the network. The main components of a WSN node are a microcontroller, a transceiver, and some sensors; actuators can also be added to form a tiny mechanical system. For such nodes, task preemption during operating system execution not only costs more energy but also makes task I/O delays inevitable, bringing extra overhead for the whole system and even unacceptable system states caused by vibrations. This paper considers the earliest deadline first (EDF) task preemption algorithm executed in a WSN OS and proposes an improved task preemption algorithm that lowers preemption overhead and I/O delay, thereby improving system performance. The experimental results show that the improved task preemption algorithm reduces I/O delay effectively, enhancing the real-time processing ability of the system.
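The baseline EDF preemption decision (not the paper's improved algorithm, which the abstract does not detail) can be sketched as follows: the ready queue is ordered by absolute deadline, and an arriving task preempts the running one only if its deadline is strictly earlier. Task names and deadlines are illustrative.

```python
import heapq

class EDFScheduler:
    """Minimal earliest-deadline-first dispatcher sketch."""

    def __init__(self):
        self.ready = []          # min-heap ordered by absolute deadline
        self.running = None      # (deadline, name) of the current task
        self.preemptions = 0

    def arrive(self, deadline, name):
        if self.running is None:
            self.running = (deadline, name)
        elif deadline < self.running[0]:
            # Strictly earlier deadline: preempt the running task.
            heapq.heappush(self.ready, self.running)
            self.running = (deadline, name)
            self.preemptions += 1
        else:
            heapq.heappush(self.ready, (deadline, name))

    def finish(self):
        """Complete the running task and dispatch the next-earliest deadline."""
        done = self.running
        self.running = heapq.heappop(self.ready) if self.ready else None
        return done

s = EDFScheduler()
s.arrive(100, "sense")
s.arrive(40, "actuate")   # earlier deadline, so "sense" is preempted
s.arrive(70, "log")
print(s.running, s.preemptions)
```

Counting preemptions this way makes the overhead the paper targets directly measurable: each preemption adds a context switch and, on a node driving actuators, a possible mechanical disturbance.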

  4. Vision system for diagnostic task | Merad | Global Journal of Pure ...

    African Journals Online (AJOL)

Due to degraded environmental conditions, direct measurements are not possible. ... Degraded conditions: vibrations, water, and metal-chip projections, ... Before tooling, the vision system has to answer: "is it the right piece at the right place?

  5. Defining a Progress Metric for CERT RMM Improvement

    Science.gov (United States)

    2017-09-14

Defining a Progress Metric for CERT-RMM Improvement. Gregory Crabb, Nader Mehravari, David Tobar. September 2017. ...fendable resource allocation decisions. Technical metrics measure aspects of controls implemented through technology (systems, software, hardware... implementation metric would be the percentage of users who have received anti-phishing training. • Effectiveness/efficiency metrics measure whether

  6. Report on task I: fire protection system study

    International Nuclear Information System (INIS)

    Bernard, E.A.; Cano, G.L.

    1977-02-01

This study (1) evaluates, on a comparative basis, the national and international regulatory and insurance standards that serve as guidance for fire protection within the nuclear power industry; (2) analyzes the recommendations contained in the major reports on the Browns Ferry fire; (3) proposes quantitative safety goals and evaluation methods for Nuclear Power Plant Fire Protection Systems (NPPFPS); (4) identifies potential improvements that may be incorporated into NPPFPS; and (5) recommends a plan of action for continuation of the fire protection systems study

  7. Metric learning for DNA microarray data analysis

    International Nuclear Information System (INIS)

    Takeuchi, Ichiro; Nakagawa, Masao; Seto, Masao

    2009-01-01

    In many microarray studies, gene set selection is an important preliminary step for subsequent main task such as tumor classification, cancer subtype identification, etc. In this paper, we investigate the possibility of using metric learning as an alternative to gene set selection. We develop a simple metric learning algorithm aiming to use it for microarray data analysis. Exploiting a property of the algorithm, we introduce a novel approach for extending the metric learning to be adaptive. We apply the algorithm to previously studied microarray data on malignant lymphoma subtype identification.
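As a generic illustration of substituting a learned metric for gene selection (not the authors' algorithm), one can weight each gene by its between-class to within-class variance ratio and then measure sample distances under those weights; a gene selection step would instead discard low-weight genes outright. The data below are synthetic.

```python
import numpy as np

# Synthetic "expression matrix": 40 samples x 5 genes, where only the
# first gene separates the two classes (illustrative data, not microarray).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (20, 5)),
               rng.normal([3, 0, 0, 0, 0], 1, (20, 5))])
y = np.array([0] * 20 + [1] * 20)

def diagonal_metric_weights(X, y):
    """Weight each feature by between-class / within-class variance."""
    overall = X.mean(axis=0)
    between = np.zeros(X.shape[1])
    within = np.zeros(X.shape[1])
    for c in np.unique(y):
        Xc = X[y == c]
        between += len(Xc) * (Xc.mean(axis=0) - overall) ** 2
        within += ((Xc - Xc.mean(axis=0)) ** 2).sum(axis=0)
    return between / within          # large weight = discriminative gene

w = diagonal_metric_weights(X, y)

def distance(a, b, w):
    """Distance under the learned diagonal (feature-weighted) metric."""
    return np.sqrt(np.sum(w * (a - b) ** 2))

print(w.argmax())  # the informative first gene receives the largest weight
```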

  8. Design criteria document, Fire Protection Task, K Basin Essential Systems Recovery, Project W-405

    International Nuclear Information System (INIS)

    Johnson, B.H.

    1994-01-01

The K Basins were constructed in the early 1950s with a 20-year design life. The K Basins are currently in their third design life and are serving as a near-term storage facility for irradiated N Reactor fuel until an interim fuel storage solution can be implemented. In April 1994, Project W-405, K Basin Essential Systems Recovery, was established to address (among other things) the immediate fire protection needs of the 100K Area. A Fire Barrier Evaluation was performed for the wall between the active and inactive areas of the 105KE and 105KW buildings. This evaluation concludes that the wall is capable of being upgraded to provide an equivalent level of fire resistance as a qualified barrier having a fire resistance rating of 2 hours. The Fire Protection Task is one of four separate tasks included within the scope of Project W-405, K Basin Essential Systems Recovery; the other three are the Water Distribution System Task, the Electrical System Task, and the Maintenance Shop/Support Facility Task. The purpose of Project W-405's Fire Protection Task is to correct Life Safety Code (NFPA 101) non-compliances and to provide fire protection features in Buildings 105KE, 105KW, and 190KE that are essential for assuring the safe operation and storage of spent nuclear fuel at the 100K Area Facilities' Irradiated Fuel Storage Basins (K Basins)

  9. Task path planning, scheduling and learning for free-ranging robot systems

    Science.gov (United States)

    Wakefield, G. Steve

    1987-01-01

    The development of robotics applications for space operations is often restricted by the limited movement available to guided robots. Free ranging robots can offer greater flexibility than physically guided robots in these applications. Presented here is an object oriented approach to path planning and task scheduling for free-ranging robots that allows the dynamic determination of paths based on the current environment. The system also provides task learning for repetitive jobs. This approach provides a basis for the design of free-ranging robot systems which are adaptable to various environments and tasks.

  10. Autonomous underwater handling system for service, measurement and cutting tasks for the decommissioning of nuclear facilities

    International Nuclear Information System (INIS)

    Hahn, M.; Haferkamp, H.; Bach, W.; Rose, N.

    1992-01-01

For about 10 years the Institute for Material Science at Hanover University has worked on underwater cutting and welding projects. Increasing tasks to be done in nuclear facilities led to the development of special handling systems to support and handle the cutting tools, with sensors and computers integrated for extensive and complex tasks. A small, free-diving handling system, equipped with two video cameras, ultrasonic and radiation sensors, and a plasma cutting torch, for inspection and decommissioning tasks in nuclear facilities is described in this paper. (Author)

  11. Regge calculus from discontinuous metrics

    International Nuclear Information System (INIS)

    Khatsymovsky, V.M.

    2003-01-01

Regge calculus is considered as a particular case of the more general system where the link lengths of any two neighbouring 4-tetrahedra do not necessarily coincide on their common face. This system is treated as one described by a metric that is discontinuous on the faces. In the superspace of all discontinuous metrics, the Regge calculus metrics form a hypersurface defined by continuity conditions. Quantum theory of the discontinuous metric system is assumed to be fixed somehow in the form of a quantum measure on (the space of functionals on) the superspace. The problem of reducing this measure to the Regge hypersurface is addressed. The quantum Regge calculus measure is defined from a discontinuous metric measure by inserting a δ-function-like phase factor. The requirement that continuity conditions be imposed in a 'face-independent' way fixes this factor uniquely. The term 'face-independent' means that the factor depends only on the (hyper)plane spanned by the face, not on its form or size. This requirement seems natural from the viewpoint of the existence of a well-defined continuum limit maximally free of lattice artefacts

  12. Using task analysis to improve the requirements elicitation in health information system.

    Science.gov (United States)

    Teixeira, Leonor; Ferreira, Carlos; Santos, Beatriz Sousa

    2007-01-01

    This paper describes the application of task analysis within the design process of a Web-based information system for managing clinical information in hemophilia care, in order to improve the requirements elicitation and, consequently, to validate the domain model obtained in a previous phase of the design process (system analysis). The use of task analysis in this case proved to be a practical and efficient way to improve the requirements engineering process by involving users in the design process.

  13. METRIC context unit architecture

    Energy Technology Data Exchange (ETDEWEB)

    Simpson, R.O.

    1988-01-01

    METRIC is an architecture for a simple but powerful Reduced Instruction Set Computer (RISC). Its speed comes from the simultaneous processing of several instruction streams, with instructions from the various streams being dispatched into METRIC's execution pipeline as they become available for execution. The pipeline is thus kept full, with a mix of instructions for several contexts in execution at the same time. True parallel programming is supported within a single execution unit, the METRIC Context Unit. METRIC's architecture provides for expansion through the addition of multiple Context Units and of specialized Functional Units. The architecture thus spans a range of size and performance from a single-chip microcomputer up through large and powerful multiprocessors. This research concentrates on the specification of the METRIC Context Unit at the architectural level. Performance tradeoffs made during METRIC's design are discussed, and projections of METRIC's performance are made based on simulation studies.

  14. Knowledge-based operation guidance system for nuclear power plants based on generic task methodology

    International Nuclear Information System (INIS)

    Yamada, Naoyuki; Chandrasekaran, B.; Bhatnager, R.

    1989-01-01

    A knowledge-based system for operation guidance of nuclear power plants is proposed. The Dynamic Procedure Management System (DPMS) is designed and developed to assist human operators interactively by selecting and modifying predefined operation procedures in a dynamic situation. Unlike most operation guidance systems, DPMS has been built based on Generic Task Methodology, which makes the overall framework of the system perspicuous and also lets domain knowledge be represented in a natural way. This paper describes the organization of the system, the definition of each task, and the form and organization of knowledge, followed by an application example. (author)

  15. Analysis of task-evoked systemic interference in fNIRS measurements: insights from fMRI.

    Science.gov (United States)

    Erdoğan, Sinem B; Yücel, Meryem A; Akın, Ata

    2014-02-15

    Functional near infrared spectroscopy (fNIRS) is a promising method for monitoring cerebral hemodynamics with a wide range of clinical applications. fNIRS signals are contaminated with systemic physiological interference from both the brain and superficial tissues, resulting in a poor estimation of the task-related neuronal activation. In this study, we use the anatomical resolution of functional magnetic resonance imaging (fMRI) to extract scalp and brain vascular signals separately and construct an optically weighted spatial average of the fMRI blood oxygen level-dependent (BOLD) signal for characterizing the scalp signal contribution to fNIRS measurements. We introduce an extended superficial signal regression (ESSR) method for canceling physiology-based systemic interference, in which the effects of cerebral and superficial systemic interference are treated separately. We apply and validate our method on the optically weighted BOLD signals, which are obtained by projecting the fMRI image onto optical measurement space by use of the optical forward problem. The performance of the ESSR method in removing physiological artifacts is compared to i) a global signal regression (GSR) method and ii) a superficial signal regression (SSR) method. The retrieved signals from each method are compared with the neural signals that represent the 'ground truth' brain activation cleaned from cerebral systemic fluctuations. We report significant improvements in the recovery of task-induced neural activation with the ESSR method when compared to the other two methods, as reflected in the Pearson R² coefficient and mean square error (MSE) metrics (two-tailed paired t-tests, p < …), together with a contrast-to-noise ratio (CNR) improvement (60%). 
Our findings suggest that, during a cognitive task i) superficial scalp signal contribution to fNIRS signals varies significantly among different regions on the forehead and ii) using an average scalp measurement together with a local measure of superficial hemodynamics better accounts

  16. Biological elements carry out optical tasks in coherent imaging systems

    Science.gov (United States)

    Ferraro, P.; Bianco, V.; Paturzo, M.; Miccio, L.; Memmolo, P.; Merola, F.; Marchesano, V.

    2016-03-01

    We show how biological elements, like live bacteria species and Red Blood Cells (RBCs), can accomplish optical functionalities in DH systems. Turbid media allow coherent microscopy despite the strong light scattering they provoke, acting on light just as moving diffusers do. Furthermore, a turbid medium can have positive effects on a coherent imaging system, providing resolution enhancement and mimicking the action of noise decorrelation devices, thus yielding an image quality significantly higher than that achievable through a transparent medium in similar recording conditions. Besides, suspended RBCs are demonstrated to behave as controllable liquid micro-lenses, opening new possibilities in biophotonics for endoscopy imaging purposes, as well as in telemedicine for point-of-care diagnostics in developing countries and low-resource settings.

  17. A task management system for compliance with health, safety, and environmental regulations

    International Nuclear Information System (INIS)

    Crump, J.J.; O'Gorman, T.P.

    1992-01-01

    Shell Western E and P Inc. (SWEPI) has developed a new computer system to help it comply with health, safety, and environmental (HS and E) regulations. It is a task management system that functions at the detailed inventory level. It schedules work, instructs operations, and records compliance status. This article discusses the design and development of the system.

  18. Using information systems while performing complex tasks: An example from architectural design

    NARCIS (Netherlands)

    de Vries, Erica; de Jong, Anthonius J.M.

    1997-01-01

    Nowadays, information systems, such as hypertexts, allow a variety of ways in which to structure information. Information systems are also used for an increasing number of purposes. In our study we examined two different purposes for using information systems in the context of a real task:

  19. [The Chilean Health Care System: the task ahead].

    Science.gov (United States)

    Goic, Alejandro

    2015-06-01

    The most important event in Chilean public health in the XXth Century was the creation of the National Health Service (NHS) in 1952. Systematic public policies for the promotion of health, disease prevention, medical care, and rehabilitation were implemented, while a number of more specific programs were introduced, such as those on infant malnutrition, complementary infant feeding, medical control of pregnant women and healthy infants, infant and adult vaccination, and essential sanitation services. In 1981, a parallel private health care system was introduced in the form of medical care financial institutions, which today cover 15% of the population, as contrasted with the public system, which covers about 80%. From 1952 to 2014, public health care policies made possible a remarkable improvement in Chile's health indexes: downward trends in the infant mortality rate (from 117.8 to 7.2 x 1,000 live births), maternal mortality (from 276 to 18.5 x 100,000), and undernourished children (…); purchasing power parity increased from US$ 3,827 to US$ 20,894 and poverty decreased from 60% to 14.4% of the population. Related indexes, such as illiteracy, average schooling, and years of primary school education, were significantly improved as well. Nevertheless, compared with OECD countries, Chile has a relatively low public investment in health (45.7% of total national investment), a deficit in the number of physicians (1.7 x 1,000 inhabitants) and nurses (4.8 x 1,000), in the number of hospital beds (2.1 x 1,000), and in the availability of generic drugs in the market (30%). Chile and the USA are the two OECD countries with the lowest public investment in health. A generalized dissatisfaction with the current Chilean health care model and the need of the vast majority of the population for timely access to acceptable quality medical care are powerful arguments which point to the need for a universal public health care system. 
The significant increase in public expenditure on health care

  20. Invariant metric for nonlinear symplectic maps

    Indian Academy of Sciences (India)

    In this paper, we construct an invariant metric in the space of homogeneous polynomials of a given degree (≥ 3). The homogeneous polynomials specify a nonlinear symplectic map which in turn represents a Hamiltonian system. By minimizing the norm constructed out of this metric as a function of system parameters, we ...

  1. Gravitational lensing in metric theories of gravity

    International Nuclear Information System (INIS)

    Sereno, Mauro

    2003-01-01

    Gravitational lensing in metric theories of gravity is discussed. I introduce a generalized approximate metric element, inclusive of both post-post-Newtonian contributions and a gravitomagnetic field. Following Fermat's principle and standard hypotheses, I derive the time delay function and deflection angle caused by an isolated mass distribution. Several astrophysical systems are considered. In most of the cases, the gravitomagnetic correction offers the best perspectives for an observational detection. Actual measurements distinguish only marginally different metric theories from each other
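
    For context, the standard first-order light deflection in a metric theory can be written with the parametrized post-Newtonian (PPN) parameter γ; this is a textbook result rather than a formula taken from the abstract, whose generalized metric element also carries post-post-Newtonian and gravitomagnetic terms not shown here:

    ```latex
    % Deflection of a light ray by a point mass M at impact parameter b,
    % with PPN parameter \gamma; general relativity has \gamma = 1,
    % recovering the familiar \hat{\alpha} = 4GM/(c^2 b).
    \hat{\alpha} \;=\; \frac{1+\gamma}{2}\,\frac{4GM}{c^{2}\,b}
    ```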

  2. An investigation on task-technology fit of mobile nursing information systems for nursing performance.

    Science.gov (United States)

    Hsiao, Ju-Ling; Chen, Rai-Fu

    2012-05-01

    This study investigates factors affecting the fit between nursing tasks and mobile nursing information systems, and the relationship between that task-technology fit and nursing performance, from the perspective of task-technology fit. Survey research recruited nursing staff as subjects from a selected case hospital. A total of 310 questionnaires were sent out, and 219 copies were obtained, indicating a valid response rate of 70.6%. Collected data were analyzed using the structural equation modeling technique. Our study found that dependence tasks have positive effects on information acquisition (γ=0.234, P<…), information identification (γ=0.478, P<…), and information acquisition (γ=0.213, P<…); … the introduction of mobile nursing information systems in assisting nursing practices can help facilitate both independent and dependent nursing tasks. Our study discovered that the supporting functions of mobile nursing information systems have positive effects on information integration and interpretation (γ=0.365, P<…) and information acquisition (γ=0.253, P<…); … information systems have positive effects on information acquisition (γ=0.318, P<…) and information integration and interpretation (γ=0.143, P<…); … information identification (β=.055, P<…), information acquisition (β=.176, P<…), and information integration and interpretation (β=.706, P<…); … information systems have positive effects on nursing performance, explaining 83.2% of the total variance. As shown, the use of mobile nursing information systems could provide nursing staff with real-time and accurate information to increase efficiency and effectiveness in patient-care duties, further improving nursing performance.

  3. Evaluation of modular robot system for maintenance tasks in hot cell

    Energy Technology Data Exchange (ETDEWEB)

    Pagala, Prithvi Sekhar, E-mail: ps.pagala@upm.es [Centre for Automation and Robotics UPM-CSIC (Spain); Ferre, Manuel, E-mail: m.ferre@upm.es [Centre for Automation and Robotics UPM-CSIC (Spain); Orona, Luis, E-mail: l.orona@gsi.de [GSI Helmholtzzentrum für Schwerionenforschung (Germany)

    2014-10-15

    Highlights: •Modular robot deployment inside hot cell for remote manipulation evaluated. •Flexible and adaptable system for variety of tasks presented. •Uses in large workspaces and evolving requirements shown. -- Abstract: This work assesses the use of a modular robot system to perform maintenance and inspection tasks such as remote flexible inspection, manipulation, and cooperation with systems deployed inside the hot cell. A flexible modular solution for inclusion in maintenance operations is presented. The proposed heterogeneous modular robotic system is evaluated using simulations of the prototype across selected robot configurations performing tasks. The results show the advantages and ability of the modular robot to perform the necessary tasks, as well as its ability to adapt and evolve depending on need. The simulation test case inside the hot cell shows a modular robot configuration: two modular arms performing tele-operation tasks in the workspace and a wheeled inspection platform collaborating on tasks. The advantage of using a re-configurable modular robot over conventional robot platforms is shown.

  4. Multiple-task real-time PDP-15 operating system for data acquisition and analysis

    International Nuclear Information System (INIS)

    Myers, W.R.

    1974-01-01

    The RAMOS operating system is capable of handling up to 72 simultaneous tasks in an interrupt-driven environment. The minimum viable hardware configuration includes a Digital Equipment Corporation PDP-15 computer with 16384 words of memory, extended arithmetic element, automatic priority interrupt, a 256K-word RS09 DECdisk, two DECtape transports, and an alphanumeric keyboard/typer. The monitor executes major tasks by loading disk-resident modules to memory for execution; modules are written in a format that allows page-relocation by the monitor, and can be loaded into any available page. All requests for monitor service by tasks, including input/output, floating point arithmetic, request for additional memory, task initiation, etc., are implemented by privileged monitor calls (CAL). All IO device handlers are capable of queuing requests for service, allowing several tasks ''simultaneous'' use of all resources. All alphanumeric IO (including the PC05) is completely buffered and handled by a single multiplexing routine. The floating point arithmetic software is re-entrant to all operating modules and includes matrix arithmetic functions. One of the system tasks can be a ''batch'' job, controlled by simulating an alphanumeric command terminal through cooperative functions of the disk handler and alphanumeric device software. An alphanumeric control sequence may be executed, automatically accessing disk-resident tasks in any prescribed order; a library of control sequences is maintained on bulk storage for access by the monitor. (auth)

  5. Efficiency improvements in pipeline transportation systems. Technical report, Task 3

    Energy Technology Data Exchange (ETDEWEB)

    Banks, W. F.; Horton, J. H.

    1977-01-01

    This report identifies those potential energy-conservative pipeline innovations that are most energy- and cost-effective, and formulates recommendations for the R, D, and D programs needed to exploit those opportunities. From a candidate field of over twenty classes of efficiency improvements, eight systems are recommended for pursuit. Most of these possess two highly important attributes: large potential energy savings and broad applicability outside the pipeline industry. The R, D, and D program for each improvement and the recommended immediate next step are described. The eight programs recommended for pursuit are: gas-fired combined-cycle compressor station; internally cooled internal combustion engine; methanol-coal slurry pipeline; methanol-coal slurry-fired and coal-fired engines; indirect-fired coal-burning combined-cycle pump station; fuel-cycle pump station; internal coatings in pipelines; and drag-reducing additives in liquid pipelines.

  6. Modeling Relationships between Surface Water Quality and Landscape Metrics Using the Adaptive Neuro-Fuzzy Inference System, A Case Study in Mazandaran Province

    Directory of Open Access Journals (Sweden)

    mohsen Mirzayi

    2016-03-01

    Full Text Available Landscape indices can be used as an approach for predicting water quality changes to monitor non-point source pollution. In the present study, the data collected over the period from 2012 to 2013 from 81 water quality stations along the rivers flowing in Mazandaran Province were analyzed. Upstream boundaries were drawn and landscape metrics were extracted for each of the sub-watersheds at class and landscape levels. Principal component analysis was used to single out the relevant water quality parameters, and forward linear regression was employed to determine the optimal metrics for the description of each parameter. The first five components were able to describe 96.61% of the variation in water quality in Mazandaran Province. An Adaptive Neuro-Fuzzy Inference System (ANFIS) and multiple linear regression were used to model the relationship between landscape metrics and water quality parameters. The results indicate that multiple regression was able to predict SAR, TDS, pH, NO3‒, and PO43‒ in the test step, with R2 values equal to 0.81, 0.56, 0.73, 0.44, and 0.63, respectively. The corresponding R2 values of ANFIS in the test step were 0.82, 0.79, 0.82, 0.31, and 0.36, respectively. Clearly, ANFIS exhibited a better performance in each case than did the linear regression model. This indicates a nonlinear relationship between the water quality parameters and landscape metrics. Since different land cover/uses have considerable impacts on both the outflow water quality and the available and dissolved pollutants in rivers, the method can be reasonably used for regional planning and environmental impact assessment in development projects in the region.
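
    The model comparison above is scored with the coefficient of determination R². A minimal stdlib-only sketch of the standard formula follows; the observed and predicted values here are invented for illustration, not taken from the study:

    ```python
    def r_squared(y_true, y_pred):
        """Coefficient of determination: R^2 = 1 - SS_res / SS_tot."""
        mean = sum(y_true) / len(y_true)
        ss_tot = sum((y - mean) ** 2 for y in y_true)        # total variance
        ss_res = sum((y - p) ** 2 for y, p in zip(y_true, y_pred))  # residuals
        return 1 - ss_res / ss_tot

    # Hypothetical observed vs. predicted water-quality values
    obs  = [2.0, 3.0, 5.0, 7.0, 11.0]
    pred = [2.2, 2.8, 5.5, 6.5, 10.9]
    print(round(r_squared(obs, pred), 3))
    ```

    An R² near 1 means the predictions track the observations closely; comparing R² on held-out test data, as the study does, guards against judging a model on the data it was fitted to.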

  7. An Efficient Framework for Development of Task-Oriented Dialog Systems in a Smart Home Environment.

    Science.gov (United States)

    Park, Youngmin; Kang, Sangwoo; Seo, Jungyun

    2018-05-16

    In recent times, with the increasing interest in conversational agents for smart homes, task-oriented dialog systems are being actively researched. However, most of these studies are focused on the individual modules of such a system, and there is an evident lack of research on a dialog framework that can integrate and manage the entire dialog system. Therefore, in this study, we propose a framework that enables the user to effectively develop an intelligent dialog system. The proposed framework ontologically expresses the knowledge required for the task-oriented dialog system's process and can build a dialog system by editing the dialog knowledge. In addition, the framework provides a module router that can indirectly run externally developed modules. Further, it enables a more intelligent conversation by providing a hierarchical argument structure (HAS) to manage the various argument representations included in natural language sentences. To verify the practicality of the framework, an experiment was conducted in which developers without any previous experience in developing a dialog system developed task-oriented dialog systems using the proposed framework. The experimental results show that even beginner dialog system developers can develop a high-level task-oriented dialog system.

  8. Metric diffusion along foliations

    CERN Document Server

    Walczak, Szymon M

    2017-01-01

    Up-to-date research in metric diffusion along compact foliations is presented in this book. Beginning with fundamentals from optimal transportation theory and the theory of foliations, the book moves on to cover the Wasserstein distance, the Kantorovich Duality Theorem, and the metrization of the weak topology by the Wasserstein distance. Metric diffusion is defined, the topology of the metric space is studied, and the limits of diffused metrics along compact foliations are discussed. Essentials on foliations, holonomy, heat diffusion, and compact foliations are detailed, and vital technical lemmas are proved to aid understanding. Graduate students and researchers in geometry, topology, and dynamics of foliations and laminations will find this supplement useful, as it presents facts about metric diffusion along non-compact foliations and provides a full description of the limit for metrics diffused along a foliation with at least one compact leaf in two dimensions.

  9. Metric modular spaces

    CERN Document Server

    Chistyakov, Vyacheslav

    2015-01-01

    Aimed toward researchers and graduate students familiar with elements of functional analysis, linear algebra, and general topology, this book contains a general study of modulars, modular spaces, and metric modular spaces. Modulars may be thought of as generalized velocity fields and serve two important purposes: they generate metric spaces in a unified manner and provide a weaker convergence, the modular convergence, whose topology is non-metrizable in general. Metric modular spaces are extensions of metric spaces, metric linear spaces, and classical modular linear spaces. The topics covered include the classification of modulars, metrizability of modular spaces, modular transforms and duality between modular spaces, and metric and modular topologies. Applications illustrated in this book include: the description of superposition operators acting in modular spaces, the existence of regular selections of set-valued mappings, new interpretations of spaces of Lipschitzian and absolutely continuous mappings, the existe...

  10. A Dynamic Intelligent Decision Approach to Dependency Modeling of Project Tasks in Complex Engineering System Optimization

    Directory of Open Access Journals (Sweden)

    Tinggui Chen

    2013-01-01

    Full Text Available Complex engineering system optimization usually involves multiple projects or tasks. On the one hand, dependency modeling among projects or tasks highlights structures in systems and their environments, which can help in understanding the implications of connectivity for different aspects of system performance and also assist in designing, optimizing, and maintaining complex systems. On the other hand, multiple projects or tasks either happen at the same time or are scheduled into a sequence in order to use common resources. In this paper, we propose a dynamic intelligent decision approach to dependency modeling of project tasks in complex engineering system optimization. The approach treats this decision process as a two-stage decision-making problem. In the first stage, a task clustering approach based on modularization is proposed in order to find a suitable decomposition scheme for a large-scale project. In the second stage, according to the decomposition result, a discrete artificial bee colony (ABC) algorithm inspired by the intelligent foraging behavior of honeybees is developed for the resource-constrained multiproject scheduling problem. Finally, a case from the engineering design of a chemical processing system is used to illustrate the proposed approach.
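
    The second stage above uses a discrete artificial bee colony. The sketch below is a heavily simplified ABC-style search on a toy single-machine sequencing objective, not the paper's algorithm: the employed and onlooker phases are merged, and the problem (minimizing the sum of completion times over 6 tasks) is an assumption chosen only to show the employed/scout mechanics:

    ```python
    import random

    def abc_minimize(cost, neighbor, init, n_bees=10, iters=200, limit=20, seed=1):
        """Minimal discrete ABC sketch: each food source is a candidate schedule;
        bees explore neighbors, and a source that fails to improve for `limit`
        trials is abandoned and re-seeded (scout phase)."""
        rng = random.Random(seed)
        sources = [init(rng) for _ in range(n_bees)]
        trials = [0] * n_bees
        best = min(sources, key=cost)
        for _ in range(iters):
            for i in range(n_bees):            # employed + onlooker phases (merged)
                cand = neighbor(sources[i], rng)
                if cost(cand) < cost(sources[i]):
                    sources[i], trials[i] = cand, 0
                else:
                    trials[i] += 1
                if trials[i] >= limit:         # scout phase: abandon stale source
                    sources[i], trials[i] = init(rng), 0
            best = min([best] + sources, key=cost)
        return best

    # Toy problem: order 6 tasks to minimize the sum of completion times.
    durations = [3, 1, 4, 2, 6, 5]
    def cost(perm):
        t = done = 0
        for j in perm:
            t += durations[j]
            done += t                           # completion time of task j
        return done
    def init(rng):
        p = list(range(6)); rng.shuffle(p); return p
    def neighbor(p, rng):
        q = p[:]; i, j = rng.sample(range(6), 2); q[i], q[j] = q[j], q[i]; return q

    best = abc_minimize(cost, neighbor, init)
    print(best, cost(best))
    ```

    For this objective the optimum is the shortest-processing-time order (cost 56), so the search quality is easy to check by eye; the real multiproject problem adds resource constraints and precedence relations on top of this skeleton.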

  11. Task Characterisation and Cross-Platform Programming Through System Identification

    Directory of Open Access Journals (Sweden)

    Theocharis Kyriacou

    2005-12-01

    Full Text Available Developing robust and reliable control code for autonomous mobile robots is difficult, because the interaction between a physical robot and the environment is highly complex, subject to noise and variation, and therefore partly unpredictable. This means that to date it is not possible to predict robot behaviour based on theoretical models. Instead, current methods to develop robot control code still require a substantial trial-and-error component in the software design process. Such iterative refinement could be reduced, we argue, if a more profound theoretical understanding of robot-environment interaction existed. In this paper, we therefore present a modelling method that generates a faithful model of a robot's interaction with its environment, based on data logged while observing a physical robot's behaviour. Because this modelling method — nonlinear modelling using polynomials — is commonly used in the engineering discipline of system identification, we refer to it here as “robot identification”. We show in this paper that using robot identification to obtain a computer model of robot-environment interaction offers several distinct advantages: (i) very compact representations (one-line programs) of the robot control program are generated; (ii) the model can be analysed, for example through sensitivity analysis, leading to a better understanding of the essential parameters underlying the robot's behaviour; and (iii) the generated, compact robot code can be used for cross-platform robot programming, allowing fast transfer of robot code from one type of robot to another. We demonstrate these points through experiments with a Magellan Pro and a Nomad 200 mobile robot.
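
    Polynomial system identification of the kind described can be sketched as a least-squares fit of logged sensor-to-actuation data. The data and the quadratic, single-input model below are hypothetical; real robot identification fits multivariate, time-lagged polynomial models (e.g. NARMAX-style), but the core fitting step is the same:

    ```python
    def polyfit(xs, ys, degree):
        """Least-squares polynomial fit via the normal equations (stdlib only).
        Returns coefficients c[0] + c[1]*x + ... + c[degree]*x**degree."""
        n = degree + 1
        # Normal equations A c = b, with A[i][j] = sum(x**(i+j)), b[i] = sum(y*x**i)
        A = [[sum(x ** (i + j) for x in xs) for j in range(n)] for i in range(n)]
        b = [sum(y * x ** i for x, y in zip(xs, ys)) for i in range(n)]
        # Gaussian elimination with partial pivoting
        for col in range(n):
            piv = max(range(col, n), key=lambda r: abs(A[r][col]))
            A[col], A[piv] = A[piv], A[col]
            b[col], b[piv] = b[piv], b[col]
            for r in range(col + 1, n):
                f = A[r][col] / A[col][col]
                for c2 in range(col, n):
                    A[r][c2] -= f * A[col][c2]
                b[r] -= f * b[col]
        coeffs = [0.0] * n
        for i in reversed(range(n)):
            coeffs[i] = (b[i] - sum(A[i][j] * coeffs[j] for j in range(i + 1, n))) / A[i][i]
        return coeffs

    # Hypothetical logged data: sensor reading -> steering command, generated
    # from 0.5 + 2.0*x - 0.3*x**2, so the fit should recover these coefficients.
    xs = [0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0]
    ys = [0.5 + 2.0 * x - 0.3 * x ** 2 for x in xs]
    c = polyfit(xs, ys, 2)
    print([round(v, 3) for v in c])   # ~[0.5, 2.0, -0.3]
    ```

    The fitted coefficient vector is the "one-line program" the abstract refers to: evaluating the polynomial on a new sensor reading reproduces the logged behaviour, and the coefficients themselves can be inspected for sensitivity analysis.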

  12. Generalization of Vaidya's radiation metric

    Energy Technology Data Exchange (ETDEWEB)

    Gleiser, R J; Kozameh, C N [Universidad Nacional de Cordoba (Argentina). Instituto de Matematica, Astronomia y Fisica

    1981-11-01

    In this paper it is shown that if Vaidya's radiation metric is considered from the point of view of kinetic theory in general relativity, the corresponding phase space distribution function can be generalized in a particular way. The new family of spherically symmetric radiation metrics obtained contains Vaidya's as a limiting situation. The Einstein field equations are solved in a ''comoving'' coordinate system. Two arbitrary functions of a single variable are introduced in the process of solving these equations. Particular examples considered are a stationary solution, a nonvacuum solution depending on a single parameter, and several limiting situations.

  13. Assessing drivers' response during automated driver support system failures with non-driving tasks.

    Science.gov (United States)

    Shen, Sijun; Neyens, David M

    2017-06-01

    With the increase in automated driver support systems, drivers are shifting from operating their vehicles to supervising their automation. As a result, it is important to understand how drivers interact with these automated systems and evaluate their effect on driver responses to safety critical events. This study aimed to identify how drivers responded when experiencing a safety critical event in automated vehicles while also engaged in non-driving tasks. In total 48 participants were included in this driving simulator study with two levels of automated driving: (a) driving with no automation and (b) driving with adaptive cruise control (ACC) and lane keeping (LK) systems engaged; and also two levels of a non-driving task (a) watching a movie or (b) no non-driving task. In addition to driving performance measures, non-driving task performance and the mean glance duration for the non-driving task were compared between the two levels of automated driving. Drivers using the automated systems responded worse than those manually driving in terms of reaction time, lane departure duration, and maximum steering wheel angle to an induced lane departure event. These results also found that non-driving tasks further impaired driver responses to a safety critical event in the automated system condition. In the automated driving condition, driver responses to the safety critical events were slower, especially when engaged in a non-driving task. Traditional driver performance variables may not necessarily effectively and accurately evaluate driver responses to events when supervising autonomous vehicle systems. Thus, it is important to develop and use appropriate variables to quantify drivers' performance under these conditions. Copyright © 2017 Elsevier Ltd and National Safety Council. All rights reserved.

  14. Prognostic Performance Metrics

    Data.gov (United States)

    National Aeronautics and Space Administration — This chapter presents several performance metrics for offline evaluation of prognostics algorithms. A brief overview of different methods employed for performance...

  15. Overview of journal metrics

    Directory of Open Access Journals (Sweden)

    Kihong Kim

    2018-02-01

    Full Text Available Various kinds of metrics used for the quantitative evaluation of scholarly journals are reviewed. The impact factor and related metrics, including the immediacy index and the aggregate impact factor, which are provided by the Journal Citation Reports, are explained in detail. The Eigenfactor score and the article influence score are also reviewed. In addition, journal metrics such as CiteScore, Source Normalized Impact per Paper, SCImago Journal Rank, h-index, and g-index are discussed. The limitations and problems of these metrics are pointed out. We should be cautious about relying too much on these quantitative measures when evaluating journals or researchers.
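
    Two of the metrics mentioned have simple, standard definitions that can be sketched directly; the citation counts below are invented for illustration:

    ```python
    def h_index(citations):
        """h-index: the largest h such that at least h items have
        been cited at least h times each."""
        cites = sorted(citations, reverse=True)
        h = 0
        while h < len(cites) and cites[h] >= h + 1:
            h += 1
        return h

    def impact_factor(cites_to_prev_two_years, items_prev_two_years):
        """Two-year impact factor: citations received this year to items
        published in the previous two years, divided by the number of
        citable items published in those two years."""
        return cites_to_prev_two_years / items_prev_two_years

    print(h_index([10, 8, 5, 4, 3]))   # -> 4
    print(impact_factor(250, 100))     # -> 2.5
    ```

    The two measure different things: the impact factor is a per-journal average over a short window, while the h-index balances productivity against citation depth, which is one reason the review above warns against relying on any single number.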

  16. An Efficient Framework for Development of Task-Oriented Dialog Systems in a Smart Home Environment

    Directory of Open Access Journals (Sweden)

    Youngmin Park

    2018-05-01

    Full Text Available In recent times, with the increasing interest in conversational agents for smart homes, task-oriented dialog systems are being actively researched. However, most of these studies are focused on the individual modules of such a system, and there is an evident lack of research on a dialog framework that can integrate and manage the entire dialog system. Therefore, in this study, we propose a framework that enables the user to effectively develop an intelligent dialog system. The proposed framework ontologically expresses the knowledge required for the task-oriented dialog system’s process and can build a dialog system by editing the dialog knowledge. In addition, the framework provides a module router that can indirectly run externally developed modules. Further, it enables a more intelligent conversation by providing a hierarchical argument structure (HAS) to manage the various argument representations included in natural language sentences. To verify the practicality of the framework, an experiment was conducted in which developers without any previous experience in developing a dialog system developed task-oriented dialog systems using the proposed framework. The experimental results show that even beginner dialog system developers can develop a high-level task-oriented dialog system.

  17. Frequency modulation system test procedure shuttle task 501 approach and landing test configuration

    Science.gov (United States)

    Doland, G. D.

    1976-01-01

    Shuttle Task 501 is an in-line task to test the performance and compatibility of radiofrequency links between the SSO and ground, and relay via a satellite. Under the Shuttle Task 501 approach and landing test (ALT) phase, only a limited portion of the communication and tracking (C&T) equipment is to be tested. The principal item to be tested is a frequency modulated (FM) data link. To test this RF link, an ALT FM system was designed and constructed, and the console wiring was verified. A step-by-step procedure for performing the ALT FM system test is presented. The ALT FM system test is to be performed prior to delivery of the equipment to the Electronic Systems Test Laboratory (ESTL).

  18. Analyzing the effect of gain time on soft task scheduling policies in real-time systems

    OpenAIRE

    Búrdalo Rapa, Luis Antonio; Terrasa Barrena, Andrés Martín; Espinosa Minguet, Agustín Rafael; García Fornes, Ana María

    2012-01-01

    In hard real-time systems, gain time is defined as the difference between the Worst Case Execution Time (WCET) of a hard task and its actual processor consumption at runtime. This paper presents the results of an empirical study about how the presence of a significant amount of gain time in a hard real-time system questions the advantages of using the most representative scheduling algorithms or policies for aperiodic or soft tasks in fixed-priority preemptive systems. The work presented here...

  19. Engineering task plan for Tanks 241-AN-103, 104, 105 color video camera systems

    International Nuclear Information System (INIS)

    Kohlman, E.H.

    1994-01-01

    This Engineering Task Plan (ETP) describes the design, fabrication, assembly, and installation of the video camera systems into the vapor space within tanks 241-AN-103, 104, and 105. The single-camera, remotely operated color video systems will be used to observe and record the activities within the vapor space. Activities may include, but are not limited to, core sampling, auger activities, crust layer examination, and monitoring of equipment installation/removal. The objective of this task is to provide a single camera system in each of the tanks for the Flammable Gas Tank Safety Program

  20. Task QA plan for Modified Prototypic Hydragard trademark Sampler Overflow System Demonstration at TNX

    International Nuclear Information System (INIS)

    Snyder, T.K.

    1993-01-01

    The primary objective of this task is to evaluate the proposed design modifications to the sample system, including the adequacy of the recommended eductor and the quality of samples obtained from the modified system. Presently, the sample streams are circulated from the originating tank, through a Hydragard trademark sampler system, and back to the originating tank. The overflow from the Hydragard trademark sampler flows to the Recycle Collection Tank (RCT). This report outlines the planned quality assurance controls for the design modification task, including organization and personnel, surveillances, and records package

  1. Nuclear power plant control room crew task analysis database: SEEK system. Users manual

    International Nuclear Information System (INIS)

    Burgy, D.; Schroeder, L.

    1984-05-01

    The Crew Task Analysis SEEK Users Manual was prepared for the Office of Nuclear Regulatory Research of the US Nuclear Regulatory Commission. It is designed for use with the existing computerized Control Room Crew Task Analysis Database. The SEEK system consists of a PR1ME computer with its associated peripherals and software augmented by General Physics Corporation SEEK database management software. The SEEK software programs provide the Crew Task Database user with rapid access to any number of records desired. The software uses English-like sentences to allow the user to construct logical sorts and outputs of the task data. Given the multiple-associative nature of the database, users can directly access the data at the plant, operating sequence, task or element level - or any combination of these levels. A complete description of the crew task data contained in the database is presented in NUREG/CR-3371, Task Analysis of Nuclear Power Plant Control Room Crews (Volumes 1 and 2)

  2. Agent-oriented Architecture for Task-based Information Search System

    NARCIS (Netherlands)

    Aroyo, Lora; de Bra, Paul M.E.; De Bra, P.; Hardman, L.

    1999-01-01

    The topic of the reported research discusses an agent-oriented architecture of an educational information search system AIMS - a task-based learner support system. It is implemented within the context of 'Courseware Engineering' on-line course at the Faculty of Educational Science and Technology,

  3. IEA Wind Task 37 System Modeling Framework and Ontology for Wind Turbines and Plants

    NARCIS (Netherlands)

    Dykes, K; Sanchez Perez Moreno, S.; Zahle, Frederik; Ning, A; McWilliam, M.; Zaayer, M B

    2017-01-01

    This presentation will provide an overview of progress to date in the development of a system modeling framework and ontology for wind turbines and plants as part of the larger IEA Wind Task 37 on wind energy systems engineering. The goals of the effort are to create a set of guidelines for a common

  4. THE CAPABILITIES OF USING THE THREE-DIMENSIONAL MODELING SYSTEM AUTOCAD IN TEACHING TO PERFORM GRAPHICS TASKS

    Directory of Open Access Journals (Sweden)

    A. V. Krasnyuk

    2008-03-01

    Full Text Available Three-dimensional design capabilities of the AutoCAD system for performing graphic tasks are presented in the article. On the basis of the studies conducted, features of applying the computer-aided design system are noted, and methods are offered that considerably decrease the number of errors when making drawings.

  5. Investigating the Effect of Voltage-Switching on Low-Energy Task Scheduling in Hard Real-Time Systems

    National Research Council Canada - National Science Library

    Swaminathan, Vishnu; Chakrabarty, Krishnendu

    2005-01-01

    We investigate the effect of voltage-switching on task execution times and energy consumption for dual-speed hard real-time systems, and present a new approach for scheduling workloads containing periodic tasks...

  6. Hybrid and dependent task scheduling algorithm for on-board system software

    Institute of Scientific and Technical Information of China (English)

    魏振华; 洪炳熔; 乔永强; 蔡则苏; 彭俊杰

    2003-01-01

    In order to solve the hybrid and dependent task scheduling and critical resource allocation problems, a task scheduling algorithm has been developed by first presenting the tasks, then describing the hybrid and dependent scheduling algorithm, and deriving the predictable schedulability condition. The performance of this algorithm was evaluated through simulation. It is concluded from the evaluation results that the hybrid task scheduling subalgorithm based on the comparison factor can solve the problem, present in traditional operating systems, of an aperiodic task being blocked by periodic tasks for a very long time, which results in poor scheduling predictability; that the resource allocation subalgorithm based on schedulability analysis can solve the problems of critical-section conflict, ceiling blocking and priority inversion; and that the scheduling algorithm is nearest to optimal when the abortable critical section is 0.6.

  7. Visual Cluster Analysis for Computing Tasks at Workflow Management System of the ATLAS Experiment

    CERN Document Server

    Grigoryeva, Maria; The ATLAS collaboration

    2018-01-01

    Hundreds of petabytes of experimental data in high energy and nuclear physics (HENP) have already been obtained by unique scientific facilities such as the LHC, RHIC and KEK. As the accelerators are modernized (energy and luminosity increased), data volumes grow rapidly and have reached the exabyte scale, which also increases the number of analysis and data processing tasks competing continuously for computational resources. This increase is met by raising the performance of the computing environment through the involvement of high-performance computing resources, forming a heterogeneous distributed computing environment (hundreds of distributed computing centers). In addition, errors occur while executing data analysis and processing tasks, caused by software and hardware failures. With a distributed model of data processing and analysis, the optimization of data management and workload systems becomes a fundamental task, and the ...

  8. Nuclear power plant personnel qualifications and training: TAPS: the task analysis profiling system. Volume 2

    International Nuclear Information System (INIS)

    Jorgensen, C.C.

    1985-06-01

    This report discusses an automated task analysis profiling system (TAPS) designed to provide a linking tool between the behaviors of nuclear power plant operators in performing their tasks and the measurement tools necessary to evaluate their in-plant performance. TAPS assists in the identification of the entry-level skill, knowledge, ability and attitude (SKAA) requirements for the various tasks and rapidly associates them with measurement tests and human factors principles. This report describes the development of TAPS and presents its first demonstration. It begins with characteristics of skilled human performance and proceeds to postulate a cognitive model to formally describe these characteristics. A method is derived for linking SKAA characteristics to measurement tests. The entire process is then automated in the form of a task analysis computer program. The development of the program is detailed and a user guide with annotated code listings and supporting test information is provided

  9. Metrics for Polyphonic Sound Event Detection

    Directory of Open Access Journals (Sweden)

    Annamaria Mesaros

    2016-05-01

    Full Text Available This paper presents and discusses various metrics proposed for evaluation of polyphonic sound event detection systems used in realistic situations where there are typically multiple sound sources active simultaneously. The system output in this case contains overlapping events, marked as multiple sounds detected as being active at the same time. The polyphonic system output requires a suitable procedure for evaluation against a reference. Metrics from neighboring fields such as speech recognition and speaker diarization can be used, but they need to be partially redefined to deal with the overlapping events. We present a review of the most common metrics in the field and the way they are adapted and interpreted in the polyphonic case. We discuss segment-based and event-based definitions of each metric and explain the consequences of instance-based and class-based averaging using a case study. In parallel, we provide a toolbox containing implementations of presented metrics.
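Segment-based metrics of the kind the paper reviews can be sketched in a few lines. The following is an illustrative implementation of segment-based precision, recall and F-score for overlapping (polyphonic) events, not the toolbox released with the paper:

```python
# Segment-based evaluation for polyphonic sound event detection: each
# fixed-length segment is represented as the set of event labels active
# in it, so overlapping events are handled naturally as set members.

def segment_based_f1(reference, estimated):
    """reference, estimated: lists of sets of active event labels per segment."""
    tp = fp = fn = 0
    for ref, est in zip(reference, estimated):
        tp += len(ref & est)   # correctly detected active events
        fp += len(est - ref)   # false alarms in this segment
        fn += len(ref - est)   # missed events in this segment
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

ref = [{"speech", "car"}, {"speech"}, set()]
est = [{"speech"}, {"speech", "dog"}, set()]
print(segment_based_f1(ref, est))
```

Event-based variants instead match whole events by onset/offset tolerance; the paper discusses how the two definitions can rank systems differently.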

  10. Brand metrics that matter

    NARCIS (Netherlands)

    Muntinga, D.; Bernritter, S.

    2017-01-01

    The brand is increasingly central to the organization. It is therefore essential to measure the brand's health, performance and development. Selecting the right brand metrics, however, is a challenge: an enormous number of metrics compete for brand managers' attention. But which ...

  11. Privacy Metrics and Boundaries

    NARCIS (Netherlands)

    L-F. Pau (Louis-François)

    2005-01-01

    This paper aims at defining a set of privacy metrics (quantitative and qualitative) for the relation between a privacy protector and an information gatherer. The aims of such metrics are: to allow assessing and comparing different user scenarios and their differences; for ...

  12. Metric Learning for Hyperspectral Image Segmentation

    Science.gov (United States)

    Bue, Brian D.; Thompson, David R.; Gilmore, Martha S.; Castano, Rebecca

    2011-01-01

    We present a metric learning approach to improve the performance of unsupervised hyperspectral image segmentation. Unsupervised spatial segmentation can assist both user visualization and automatic recognition of surface features. Analysts can use spatially-continuous segments to decrease noise levels and/or localize feature boundaries. However, existing segmentation methods use task-agnostic measures of similarity. Here we learn task-specific similarity measures from training data, improving segment fidelity to classes of interest. Multiclass Linear Discriminant Analysis produces a linear transform that optimally separates a labeled set of training classes. The transform defines a distance metric that generalizes to new scenes, enabling graph-based segmentation that emphasizes key spectral features. We describe tests based on data from the Compact Reconnaissance Imaging Spectrometer for Mars (CRISM) in which learned metrics improve segment homogeneity with respect to mineralogical classes.
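The pipeline the abstract describes (learn a discriminating linear transform from labeled classes, then measure distances in the transformed space) can be sketched with a two-class Fisher discriminant on synthetic data; the multiclass LDA and CRISM spectra of the paper are replaced here by illustrative stand-ins:

```python
import numpy as np

# A pure-NumPy sketch: learn a linear transform with a (two-class) Fisher
# discriminant, then measure distances in the transformed space. The data
# are synthetic stand-ins for labeled spectra.

rng = np.random.default_rng(0)
X0 = rng.normal(0.0, 1.0, (50, 5))   # class A training samples
X1 = rng.normal(3.0, 1.0, (50, 5))   # class B training samples

m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
Sw = np.cov(X0.T) + np.cov(X1.T)     # pooled within-class scatter
w = np.linalg.solve(Sw, m1 - m0)     # Fisher discriminant direction

def learned_distance(a, b):
    """Distance under the learned metric: project onto w, then compare."""
    return abs(float(w @ (a - b)))

d_within = learned_distance(X0[0], X0[1])   # a same-class pair
d_between = learned_distance(m0, m1)        # the two class means
print(d_between > d_within)
```

Under the learned metric, spectra from the same class sit closer together than spectra from different classes, which is what makes the subsequent graph-based segmentation honor class boundaries.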

  13. Choosing Your Poison: Optimizing Simulator Visual System Selection as a Function of Operational Tasks

    Science.gov (United States)

    Sweet, Barbara T.; Kaiser, Mary K.

    2013-01-01

    Although current technology simulator visual systems can achieve extremely realistic levels, they do not completely replicate the experience of a pilot sitting in the cockpit, looking at the outside world. Some differences in experience are due to visual artifacts, or perceptual features that would not be present in a naturally viewed scene. Others are due to features that are missing from the simulated scene. In this paper, these differences will be defined and discussed. The significance of these differences will be examined as a function of several particular operational tasks. A framework to facilitate the choice of visual system characteristics based on operational task requirements will be proposed.

  14. IEA Task 32: Wind Lidar Systems for Wind Energy Deployment (LIDAR)

    Energy Technology Data Exchange (ETDEWEB)

    Kuhn, Martin; Trabucchi, Davide; Clifton, Andrew; Courtney, Mike; Rettenmeier, Andreas

    2016-05-25

    Under the International Energy Agency Wind Implementing Agreement (IEA Wind) Task 11, researchers started examining novel applications for remote sensing and the issues around them during the 51st topical expert meeting about remote sensing in January 2007. The 59th topical expert meeting organized by Task 11 in October 2009 was also dedicated to remote sensing, and the first draft of the Task's recommended practices on remote sensing was published in January 2013. The results of the Task 11 topical expert meetings provided solid groundwork for a new IEA Wind Task 32 on wind lidar technologies. Members of the wind community identified the need to consolidate the knowledge about wind lidar systems to facilitate their use, and to investigate how to exploit the advantages offered by this technology. This was the motivation that led to the start of IEA Wind Task 32 'Lidar Application for Wind Energy Deployment' in November 2011. The kick-off meeting was held in May 2012.

  15. Multilevel Flow Modeling Based Decision Support System and Its Task Organization

    DEFF Research Database (Denmark)

    Zhang, Xinxin; Lind, Morten; Ravn, Ole

    2013-01-01

    For complex engineering systems, there is an increasing demand for safety and reliability. A decision support system (DSS) is designed to offer supervision and analysis of operational situations. A proper model representation is required for a DSS to understand the process knowledge. Multilevel ...... techniques of MFM reasoning and less mature yet relevant MFM concepts are considered. It also offers an architecture design of task organization for MFM software tools by using the concept of agent and the technology of multiagent software systems....

  16. Simulation of a nuclear measurement system around a multi-task mode real-time monitor

    International Nuclear Information System (INIS)

    De Grandi, G.; Ouiguini, R.

    1983-01-01

    When debugging and testing material and software for the automation of systems, the non-availability of this last one states important logistic problems. A simulator of the system to be automatized, conceived around a multi-task mode real-time monitor, allowing the debugging of the software of automation without the physical presence of the system to be automatized, is proposed in the present report

  17. Control system of the inspection robots group applying auctions and multi-criteria analysis for task allocation

    Science.gov (United States)

    Panfil, Wawrzyniec; Moczulski, Wojciech

    2017-10-01

    This paper presents a control system for a group of mobile robots intended for carrying out inspection missions. The main research problem was to define a control system that facilitates cooperation of the robots in realizing the committed inspection tasks. Many well-known control systems use auctions for task allocation, where the subject of an auction is a task to be allocated. It seems that, in the case of missions characterized by a much larger number of tasks than robots, it is better if robots (instead of tasks) are the subjects of auctions. The second identified problem concerns the one-sided robot-to-task fitness evaluation: simultaneous assessment of the robot-to-task fitness and of the task's attractiveness for the robot should positively affect the overall effectiveness of the multi-robot system's performance. The elaborated system allows tasks to be assigned to robots using various methods for evaluating fitness between robots and tasks, and using several task allocation methods. A method for multi-criteria analysis is proposed, composed of two assessments: the robot's competitive position for a task among other robots, and the task's attractiveness for the robot among other tasks. Furthermore, task allocation methods applying this multi-criteria analysis are proposed. Both the elaborated system and the proposed task allocation methods were verified in simulated experiments. The object under test was a group of inspection mobile robots, a virtual counterpart of a real mobile robot group.
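The two-sided assessment can be illustrated with a toy greedy allocator that scores each robot-task pair by the robot's competitive position for the task plus the task's attractiveness to that robot; the scores and the combination rule here are our simplification, not the authors' exact method:

```python
# Toy two-sided task allocation: each candidate (robot, task) pair is
# scored by (a) the robot's rank among still-free robots competing for
# that task, and (b) the task's rank among tasks still open to that
# robot. The highest combined score is assigned, then we repeat.

def allocate(fitness):
    """fitness[r][t] = fitness of robot r for task t (higher is better)."""
    n_robots, n_tasks = len(fitness), len(fitness[0])
    assigned, result = set(), {}
    for _ in range(min(n_robots, n_tasks)):
        best = None
        for r in range(n_robots):
            if r in result:
                continue
            for t in range(n_tasks):
                if t in assigned:
                    continue
                # competitive position of r among free robots for task t
                position = sum(fitness[o][t] <= fitness[r][t]
                               for o in range(n_robots) if o not in result)
                # attractiveness of t among tasks still open to robot r
                attract = sum(fitness[r][o] <= fitness[r][t]
                              for o in range(n_tasks) if o not in assigned)
                score = position + attract
                if best is None or score > best[0]:
                    best = (score, r, t)
        _, r, t = best
        result[r] = t
        assigned.add(t)
    return result

print(allocate([[0.9, 0.2], [0.4, 0.8]]))  # {0: 0, 1: 1}
```

Because both rankings shrink as robots and tasks are committed, the scores are recomputed each round, mimicking repeated auction rounds.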

  18. Metrics for image segmentation

    Science.gov (United States)

    Rees, Gareth; Greenway, Phil; Morray, Denise

    1998-07-01

    An important challenge in mapping image-processing techniques onto applications is the lack of quantitative performance measures. From a systems engineering perspective these are essential if system level requirements are to be decomposed into sub-system requirements which can be understood in terms of algorithm selection and performance optimization. Nowhere in computer vision is this more evident than in the area of image segmentation. This is a vigorous and innovative research activity, but even after nearly two decades of progress, it remains almost impossible to answer the question 'what would the performance of this segmentation algorithm be under these new conditions?' To begin to address this shortcoming, we have devised a well-principled metric for assessing the relative performance of two segmentation algorithms. This allows meaningful objective comparisons to be made between their outputs. It also estimates the absolute performance of an algorithm given ground truth. Our approach is an information theoretic one. In this paper, we describe the theory and motivation of our method, and present practical results obtained from a range of state of the art segmentation methods. We demonstrate that it is possible to measure the objective performance of these algorithms, and to use the information so gained to provide clues about how their performance might be improved.
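One standard information-theoretic way to compare two segmentations (not necessarily the paper's exact metric, which is not reproduced in the abstract) is the variation of information, which is zero exactly when the two label maps agree up to a renaming of segments:

```python
import math
from collections import Counter

# Variation of information between two segmentations:
#   VI(A, B) = H(A) + H(B) - 2 I(A; B)
# computed from the empirical label distributions over pixels.

def variation_of_information(a, b):
    """a, b: flat lists of segment labels, one label per pixel."""
    n = len(a)
    pa, pb = Counter(a), Counter(b)
    joint = Counter(zip(a, b))
    h = lambda counts: -sum(c / n * math.log2(c / n) for c in counts.values())
    mi = sum(c / n * math.log2((c / n) / ((pa[x] / n) * (pb[y] / n)))
             for (x, y), c in joint.items())
    return h(pa) + h(pb) - 2 * mi

same = variation_of_information([0, 0, 1, 1], [5, 5, 9, 9])  # identical partition
diff = variation_of_information([0, 0, 1, 1], [0, 1, 0, 1])  # independent partition
print(same, diff)
```

A smaller value means the two segmentations carry more of the same information, which supports exactly the kind of objective algorithm-to-algorithm comparison the paper argues for.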

  19. Smart Grid Status and Metrics Report Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Balducci, Patrick J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Antonopoulos, Chrissi A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Clements, Samuel L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Gorrissen, Willy J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kirkham, Harold [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Ruiz, Kathleen A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Smith, David L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Weimar, Mark R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Gardner, Chris [APQC, Houston, TX (United States); Varney, Jeff [APQC, Houston, TX (United States)

    2014-07-01

    A smart grid uses digital power control and communication technology to improve the reliability, security, flexibility, and efficiency of the electric system, from large generation through the delivery systems to electricity consumers and a growing number of distributed generation and storage resources. To convey progress made in achieving the vision of a smart grid, this report uses a set of six characteristics derived from the National Energy Technology Laboratory Modern Grid Strategy. The Smart Grid Status and Metrics Report defines and examines 21 metrics that collectively provide insight into the grid’s capacity to embody these characteristics. This appendix presents papers covering each of the 21 metrics identified in Section 2.1 of the Smart Grid Status and Metrics Report. These metric papers were prepared in advance of the main body of the report and collectively form its informational backbone.

  20. Optimal task mapping in safety-critical real-time parallel systems

    International Nuclear Information System (INIS)

    Aussagues, Ch.

    1998-01-01

    This PhD thesis deals with the correct design of safety-critical real-time parallel systems. Such systems constitute a fundamental part of high-performance command and control systems found in the nuclear domain and, more generally, in parallel embedded systems. The verification of their temporal correctness is the core of this thesis. Our contribution lies mainly in the following three points: the analysis and extension of a programming model for such real-time parallel systems; the proposal of an original method based on a new operator of synchronized product of state-machine task graphs; and the validation of the approach by its implementation and evaluation. The work addresses in particular the problem of optimal task mapping on a parallel architecture such that the temporal constraints are globally guaranteed, i.e. the timeliness property holds. The results also incorporate optimality criteria for the sizing and correct dimensioning of a parallel system, for instance the number of processing elements; these criteria are connected with operational constraints of the application domain. Our approach is based on off-line analysis of the feasibility of the deadline-driven dynamic scheduling used to schedule tasks inside one processor. This leads us to define the synchronized product, from which a system of linear constraints is automatically generated, allowing a maximum load of a group of tasks to be calculated and their timeliness constraints to be verified. The communications, their timeliness verification and their incorporation into the mapping problem are the second main contribution of this thesis. Finally, the global solving technique, dealing with both task and communication aspects, has been implemented and evaluated in the framework of the OASIS project at the LETI research center at CEA/Saclay. (author)
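For the deadline-driven (EDF) scheduling that the thesis analyzes off-line, the classic single-processor feasibility test for independent periodic tasks with deadlines equal to periods is the utilization bound. The sketch below shows how a candidate mapping could be screened with it; this is a textbook simplification, not the thesis's synchronized-product method:

```python
# Per-processor EDF feasibility for independent periodic tasks with
# deadlines equal to periods: schedulable iff sum(C_i / T_i) <= 1.
# A task mapping is acceptable only if every processor passes the test.

def edf_feasible(tasks):
    """tasks: iterable of (wcet, period) pairs assigned to one processor."""
    return sum(c / t for c, t in tasks) <= 1.0

def mapping_feasible(mapping):
    """mapping: dict processor_id -> list of (wcet, period) tasks."""
    return all(edf_feasible(tasks) for tasks in mapping.values())

# Processor 1 is overloaded (3/4 + 3/8 = 1.125), so the mapping fails.
print(mapping_feasible({0: [(1, 4), (2, 8)], 1: [(3, 4), (3, 8)]}))  # False
```

The thesis's contribution is precisely to go beyond such per-processor tests by generating linear constraints that also capture inter-task synchronization and communication.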

  1. Indistinguishability Operators Applied to Task Allocation Problems in Multi-Agent Systems

    Directory of Open Access Journals (Sweden)

    José Guerrero

    2017-09-01

    Full Text Available In this paper we show an application of indistinguishability operators to model response functions. Such functions are used in the mathematical modeling of the task allocation problem in multi-agent systems when the stimulus, perceived by the agent, to perform a task is assessed by means of the response threshold model. In particular, we propose this kind of operator to represent a response function when the stimulus depends only on the distance between the agent and a given task, since we prove that two celebrated response functions used in the literature can be reproduced by appropriate indistinguishability operators in this case. Despite the fact that there is currently no systematic method to generate response functions, this paper provides, for the first time, a theoretical foundation to generate them and study their properties. To validate the theoretical results, the aforementioned indistinguishability operators have been used to simulate, with MATLAB, the allocation of a set of tasks in a multi-robot system with fuzzy Markov chains.
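The response threshold model the abstract refers to is commonly written as a power-law or exponential response function of a stimulus and the agent's threshold. The following sketch uses a distance-only stimulus as in the paper's setting; the specific functions and constants below are common textbook forms, not necessarily the two functions treated in the paper:

```python
import math

# Response threshold model: the probability that an agent engages a task
# grows with the stimulus s and falls with the agent's threshold theta.

def response_power(s, theta, n=2):
    """Classic power-law response function: s^n / (s^n + theta^n)."""
    return s**n / (s**n + theta**n)

def response_exponential(s, theta):
    """An exponential response variant: 1 - exp(-s / theta)."""
    return 1.0 - math.exp(-s / theta)

def stimulus_from_distance(distance, scale=1.0):
    """Distance-only stimulus: closer tasks produce stronger stimuli."""
    return scale / (1.0 + distance)

s = stimulus_from_distance(distance=1.0)   # s = 0.5
print(response_power(s, theta=0.5))        # 0.5: stimulus equals threshold
```

When the stimulus equals the threshold the power-law form always returns 0.5, which is the pivot behavior that an indistinguishability operator has to reproduce.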

  2. Training conquers multitasking costs by dividing task representations in the frontoparietal-subcortical system.

    Science.gov (United States)

    Garner, K G; Dux, Paul E

    2015-11-17

    Negotiating the information-rich sensory world often requires the concurrent management of multiple tasks. Despite this requirement, humans are thought to be poor at multitasking because of the processing limitations of frontoparietal and subcortical (FP-SC) brain regions. Although training is known to improve multitasking performance, it is unknown how the FP-SC system functionally changes to support improved multitasking. To address this question, we characterized the FP-SC changes that predict training outcomes using an individual differences approach. Participants (n = 100) performed single and multiple tasks in pre- and post-training functional magnetic resonance imaging (fMRI) sessions interspersed by either a multitasking or an active-control training regimen. Multivoxel pattern analyses (MVPA) revealed that training-induced multitasking improvements were predicted by divergence in the FP-SC blood oxygen level-dependent (BOLD) response patterns to the trained tasks. Importantly, this finding was only observed for participants who completed training on the component (single) tasks and their combination (multitask) and not for the control group. Therefore, the FP-SC system supports multitasking behavior by segregating constituent task representations.

  3. The effect of the sensitivity of the BIS and BAS motivational systems on performance in stroke rehabilitation tasks

    Directory of Open Access Journals (Sweden)

    Maja Milavec

    2012-03-01

    Full Text Available Stroke rehabilitation programs are often too short and not intensive enough, possibly due to a lack of patient motivation. This study examined whether the patient's mood, task success and psychophysiological responses are affected by the sensitivity of two motivational systems: the Behavioral Activation System (BAS) and the Behavioral Inhibition System (BIS). 22 subacute stroke patients participated in the study. They performed an easier and a harder version of a motor rehabilitation task as well as the Stroop task. The sensitivities of the two motivational systems were measured using the BIS/BAS scale. Additionally, psychophysiological measurements (heart rate, skin conductance, respiration and skin temperature) were taken, and the Self-Assessment Manikin was used to measure self-reported valence and arousal. Results showed that valence and arousal are not significantly correlated with the BIS/BAS subscales during the rehabilitation task. A negative correlation between valence and the BAS subscales was found in the Stroop task. Results also confirmed the initial hypothesis that the BAS would be correlated with task performance during the rehabilitation task while the BIS would be negatively correlated with task performance during the Stroop task. Only partial confirmation was found for the hypothesis that tasks that include a reward would affect heart rate in subjects with a sensitive BAS while tasks without a reward would affect skin conductance in subjects with a sensitive BIS. In both versions of the rehabilitation task, which includes a reward, the BAS reward subscale was negatively correlated with mean skin temperature. In the harder rehabilitation task, the BAS reward responsiveness subscale was positively correlated with mean heart rate. In the Stroop task, which has no reward, the BIS scale was positively correlated with mean heart rate. The BAS subscale was also negatively correlated with the RMSSD measure of heart rate variability.
The results of

  4. Interface Testing for RTOS System Tasks based on the Run-Time Monitoring

    International Nuclear Information System (INIS)

    Sung, Ahyoung; Choi, Byoungju

    2006-01-01

    Safety-critical embedded systems require high dependability of not only hardware but also software. It is difficult to modify embedded software once it is embedded. Therefore, rigorous regulations are necessary to assure the quality of safety-critical embedded software. The IEEE V and V (Verification and Validation) process is recommended for software dependability, but a more quantitative evaluation method, such as software testing, is also necessary. In the case of safety-critical embedded software, it is essential to have a test that reflects the unique features of the target hardware and its operating system. The safety-grade PLC (Programmable Logic Controller) is a safety-critical embedded system in which hardware and software are tightly coupled. The PLC has HdS (Hardware dependent Software) that is tightly coupled with an RTOS (Real Time Operating System). In particular, system tasks that are tightly coupled with the target hardware and the RTOS kernel have a large influence on the dependability of the entire PLC. Therefore, interface testing for system tasks that reflects the features of the target hardware and RTOS kernel becomes the core of the PLC integration test. Here, we define interfaces as the overlapping parts between two different layers of the system architecture. In this paper, we identify interfaces for system tasks and apply the identified interfaces to the safety-grade PLC. Finally, we show the test results through an empirical study.

  5. Interface Testing for RTOS System Tasks based on the Run-Time Monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Sung, Ahyoung; Choi, Byoungju [Ewha University, Seoul (Korea, Republic of)

    2006-07-01

    Safety-critical embedded systems require high dependability of not only hardware but also software. It is difficult to modify embedded software once it is embedded. Therefore, rigorous regulations are necessary to assure the quality of safety-critical embedded software. The IEEE V and V (Verification and Validation) process is recommended for software dependability, but a more quantitative evaluation method, such as software testing, is also necessary. In the case of safety-critical embedded software, it is essential to have a test that reflects the unique features of the target hardware and its operating system. The safety-grade PLC (Programmable Logic Controller) is a safety-critical embedded system in which hardware and software are tightly coupled. The PLC has HdS (Hardware dependent Software) that is tightly coupled with an RTOS (Real Time Operating System). In particular, system tasks that are tightly coupled with the target hardware and the RTOS kernel have a large influence on the dependability of the entire PLC. Therefore, interface testing for system tasks that reflects the features of the target hardware and RTOS kernel becomes the core of the PLC integration test. Here, we define interfaces as the overlapping parts between two different layers of the system architecture. In this paper, we identify interfaces for system tasks and apply the identified interfaces to the safety-grade PLC. Finally, we show the test results through an empirical study.

  6. Analyses of User Rationality and System Learnability: Performing Task Variants in User Tests

    Science.gov (United States)

    Law, Effie Lai-Chong; Blazic, Borka Jerman; Pipan, Matic

    2007-01-01

    No systematic empirical study on investigating the effects of performing task variants on user cognitive strategy and behaviour in usability tests and on learnability of the system being tested has been documented in the literature. The current use-inspired basic research work aims to identify the underlying cognitive mechanisms and the practical…

  7. Task Force on Energy Systems for Forward/Remote Operating Bases

    Science.gov (United States)

    2016-08-01

    nuclear power energy systems ... Radioisotope thermoelectric generators ... issue, the Task Force found efforts to provide the most efficient methods for power production at the prime-contract level have been hampered by ... management. Engineer Prime Power Operations describes theater-level power infrastructure and inter-service responsibilities and, although dated from ...

  8. Enhancing On-Task Behavior in Fourth-Grade Students Using a Modified Color Wheel System

    Science.gov (United States)

    Blondin, Carolyn; Skinner, Christopher; Parkhurst, John; Wood, Allison; Snyder, Jamie

    2012-01-01

    The authors used a withdrawal design to evaluate the effects of a modified Color Wheel System (M-CWS) on the on-task behavior of 7 students enrolled in the 4th grade. Standard CWS procedures were modified to include a 4th set of rules designed to set behavioral expectation for cooperative learning activities. Mean data showed that immediately…

  9. Degree of Schedulability of Mixed-Criticality Real-time Systems with Probabilistic Sporadic Tasks

    DEFF Research Database (Denmark)

    Boudjadar, Jalil; David, Alexandre; Kim, Jin Hyun

    2014-01-01

    We present the concept of degree of schedulability for mixed-criticality scheduling systems. This concept is given in terms of the two factors 1) Percentage of Missed Deadlines (PoMD), and 2) Degradation of the Quality of Service (DoQoS). The novel aspect is that we consider task arrival patterns...
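The first factor, Percentage of Missed Deadlines (PoMD), reduces to a simple ratio over observed jobs; a minimal sketch follows, under an assumed (finish time, deadline) log layout that is ours, not the paper's:

```python
# Percentage of Missed Deadlines (PoMD) over a log of completed jobs.

def pomd(jobs):
    """jobs: list of (finish_time, deadline) pairs; returns missed %."""
    missed = sum(finish > deadline for finish, deadline in jobs)
    return 100.0 * missed / len(jobs)

print(pomd([(3, 5), (6, 5), (9, 10), (12, 10)]))  # 50.0
```

The second factor (DoQoS) would additionally weight *how late* or *how degraded* each job was, rather than only counting misses.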

  10. Regulating task-monitoring systems in response to variable reward contingencies and outcomes in cocaine addicts.

    Science.gov (United States)

    Morie, Kristen P; De Sanctis, Pierfilippo; Garavan, Hugh; Foxe, John J

    2016-03-01

    We investigated anticipatory and consummatory reward processing in cocaine addiction. In addition, we set out to assess whether task-monitoring systems were appropriately recalibrated in light of variable reward schedules. We also examined neural measures of task-monitoring and reward processing as a function of hedonic tone, since anhedonia is a vulnerability marker for addiction that is obviously germane in the context of reward processing. High-density event-related potentials were recorded while participants performed a speeded response task that systematically varied anticipated probabilities of reward receipt. The paradigm dissociated feedback regarding task success (or failure) from feedback regarding the value of reward (or loss), so that task-monitoring and reward processing could be examined in partial isolation. Twenty-three active cocaine abusers and 23 age-matched healthy controls participated. Cocaine abusers showed amplified anticipatory responses to reward predictive cues, but crucially, these responses were not as strongly modulated by reward probability as in controls. Cocaine users also showed blunted responses to feedback about task success or failure and did not use this information to update predictions about reward. In turn, they showed clearly blunted responses to reward feedback. In controls and users, measures of anhedonia were associated with reward motivation. In cocaine users, anhedonia was also associated with diminished monitoring and reward feedback responses. Findings imply that reward anticipation and monitoring deficiencies in addiction are associated with increased responsiveness to reward cues but impaired ability to predict reward in light of task contingencies, compounded by deficits in responding to actual reward outcomes.

  11. Investigating the Effect of Voltage-Switching on Low-Energy Task Scheduling in Hard Real-Time Systems

    Science.gov (United States)

    2005-01-01

    We investigate the effect of voltage-switching on task execution times and energy consumption for dual-speed hard real-time systems, and present a...scheduling algorithm and apply it to two real-life task sets. Our results show that energy can be conserved in embedded real-time systems using energy-aware task scheduling. We also show that switching times have a significant effect on the energy consumed in hard real-time systems.

  12. Probabilistic metric spaces

    CERN Document Server

    Schweizer, B

    2005-01-01

    Topics include special classes of probabilistic metric spaces, topologies, and several related structures, such as probabilistic normed and inner-product spaces. 1983 edition, updated with 3 new appendixes. Includes 17 illustrations.
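
    For context, the central inequality in this setting (in the Menger form) replaces the numerical triangle inequality with one on distribution functions under a triangle function, commonly a t-norm T:

    ```latex
    % F_{pq}(x) is the probability that the distance between points p and q
    % is less than x; T is a t-norm.
    F_{pr}(x + y) \;\ge\; T\bigl(F_{pq}(x),\, F_{qr}(y)\bigr), \qquad x, y \ge 0.
    ```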

  13. Tracker Performance Metric

    National Research Council Canada - National Science Library

    Olson, Teresa; Lee, Harry; Sanders, Johnnie

    2002-01-01

    .... We have developed the Tracker Performance Metric (TPM) specifically for this purpose. It was designed to measure the output performance, on a frame-by-frame basis, using its output position and quality...

  14. Task-role-based Access Control Model in Smart Health-care System

    Directory of Open Access Journals (Sweden)

    Wang Peng

    2015-01-01

    Full Text Available With the development of computer science and smart health-care technology, there is a trend for patients to enjoy medical care at home. Taking the enormous number of users in the Smart Health-care System into consideration, access control is an important issue. Traditional access control models (discretionary access control, mandatory access control, and role-based access control) do not properly reflect the characteristics of the Smart Health-care System. This paper proposes an advanced access control model for the medical health-care environment, the task-role-based access control model, which overcomes the disadvantages of traditional access control models. The task-role-based access control (T-RBAC) model introduces a task concept, dividing tasks into four categories. It also supports supervision role hierarchy. T-RBAC is a proper access control model for the Smart Health-care System, and it improves the management of access rights. This paper also proposes an implementation of T-RBAC, a binary two-key-lock pair access control scheme using prime factorization.
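
    The general idea of a prime-factorization key-lock access check can be sketched as follows. The exact binary two-key-lock pair construction in the paper may differ; here each permission is simply mapped to a distinct prime, a key is the product of the primes for the permissions it grants, and checking a permission reduces to a divisibility test. The permission names are hypothetical.

    ```python
    # Illustrative prime-factorization access check (assumed simplification
    # of the key-lock pair idea mentioned in the abstract above).

    PERMISSION_PRIMES = {"read": 2, "write": 3, "delete": 5, "approve": 7}

    def make_key(permissions):
        """Encode a set of granted permissions as a product of primes."""
        key = 1
        for p in permissions:
            key *= PERMISSION_PRIMES[p]
        return key

    def has_permission(key, permission):
        """A permission is granted iff its prime divides the key."""
        return key % PERMISSION_PRIMES[permission] == 0

    nurse_key = make_key(["read", "write"])     # 2 * 3 = 6
    print(has_permission(nurse_key, "read"))    # True
    print(has_permission(nurse_key, "delete"))  # False
    ```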

  15. Semiportable load-cell-based weighing system prototype of 18.14-metric-ton (20-ton) capacity for UF6 cylinder weight verifications: description and testing procedure

    International Nuclear Information System (INIS)

    McAuley, W.A.

    1984-01-01

    The 18.14-metric-ton-capacity (20-ton) Load-Cell-Based Weighing System (LCBWS) prototype tested at the Oak Ridge (Tennessee) Gaseous Diffusion Plant March 20-30, 1984, is semiportable and has the potential for being highly accurate. Designed by Brookhaven National Laboratory, it can be moved to cylinders for weighing as opposed to the widely used operating philosophy of most enrichment facilities of moving cylinders to stationary accountability scales. Composed mainly of commercially available, off-the-shelf hardware, the system's principal elements are two load cells that sense the weight (i.e., force) of a uranium hexafluoride (UF6) cylinder suspended from the LCBWS while the cylinder is in the process of being weighed. Portability is achieved by its attachment to a double-hook, overhead-bridge crane. The LCBWS prototype is designed to weigh 9.07- and 12.70-metric-ton (10- and 14-ton) UF6 cylinders. A detailed description of the LCBWS is given, design information and criteria are supplied, a testing procedure is outlined, and initial test results are reported. A major objective of the testing is to determine the reliability and accuracy of the system. Other testing objectives include the identification of (1) potential areas for system improvements and (2) procedural modifications that will reflect an improved and more efficient system. The testing procedure described includes, but is not limited to, methods that account for temperature sensitivity of the instrumentation, the local variation in the acceleration due to gravity, and buoyancy effects. Operational and safety considerations are noted. A preliminary evaluation of the March test data indicates that the LCBWS prototype has the potential to have an accuracy in the vicinity of 1 kg

  16. IT Project Management Metrics

    Directory of Open Access Journals (Sweden)

    2007-01-01

    Full Text Available Many software and IT projects fail to complete their objectives for a variety of causes, among which project management carries a high weight. In order to have successful projects, lessons learned have to be used, historical data have to be collected, and metrics and indicators have to be computed and compared with those of past projects to keep failure from happening. This paper presents some metrics that can be used for IT project management.
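
    The abstract does not list the specific metrics here, so as an illustration the sketch below computes two standard earned-value indicators commonly used in IT project management; the paper's own metric set may differ, and the figures are made up.

    ```python
    # Two standard earned-value metrics (illustrative; not necessarily the
    # metrics proposed in the paper above).

    def cost_performance_index(earned_value, actual_cost):
        """CPI > 1 means the project is under budget for the work done."""
        return earned_value / actual_cost

    def schedule_performance_index(earned_value, planned_value):
        """SPI > 1 means the project is ahead of schedule."""
        return earned_value / planned_value

    ev, ac, pv = 80_000.0, 100_000.0, 90_000.0   # hypothetical project data
    print(round(cost_performance_index(ev, ac), 3))      # 0.8
    print(round(schedule_performance_index(ev, pv), 3))  # 0.889
    ```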

  17. Mass Customization Measurements Metrics

    DEFF Research Database (Denmark)

    Nielsen, Kjeld; Brunø, Thomas Ditlev; Jørgensen, Kaj Asbjørn

    2014-01-01

    A recent survey has indicated that 17 % of companies have ceased mass customizing less than 1 year after initiating the effort. This paper presents measurements for a company's mass customization performance, utilizing metrics within the three fundamental capabilities: robust process design, choice navigation, and solution space development. A mass customizer, when assessing performance with these metrics, can identify within which areas improvement would increase competitiveness the most and enable a more efficient transition to mass customization.

  18. Optimal task partition and state-dependent loading in heterogeneous two-element work sharing system

    International Nuclear Information System (INIS)

    Levitin, Gregory; Xing, Liudong; Ben-Haim, Hanoch; Dai, Yuanshun

    2016-01-01

    Many real-world systems such as multi-channel data communication, multi-path flow transmission and multi-processor computing systems have work sharing attributes where system elements perform different portions of the same task simultaneously. Motivated by these applications, this paper models a heterogeneous work-sharing system with two non-repairable elements. When one element fails, the other element takes over the uncompleted task of the failed element upon finishing its own part; the load level of the remaining operating element can change at the time of the failure, which further affects its performance, failure behavior and operation cost. Considering these dynamics, mission success probability (MSP), expected mission completion time (EMCT) and expected cost of successful mission (ECSM) are first derived. Further, optimization problems are formulated and solved, which find optimal task partition and element load levels maximizing MSP, minimizing EMCT or minimizing ECSM. Effects of element reliability, performance, operation cost on the optimal solutions are also investigated through examples. Results of this work can facilitate a tradeoff analysis of different mission performance indices for heterogeneous work-sharing systems. - Highlights: • A heterogeneous work-sharing system with two non-repairable elements is considered. • The optimal work distribution and element loading problem is formulated and solved. • Effects of element reliability, performance, operation cost on the optimal solutions are investigated.

  19. Paired-Associate and Feedback-Based Weather Prediction Tasks Support Multiple Category Learning Systems.

    Science.gov (United States)

    Li, Kaiyun; Fu, Qiufang; Sun, Xunwei; Zhou, Xiaoyan; Fu, Xiaolan

    2016-01-01

    It remains unclear whether probabilistic category learning in the feedback-based weather prediction task (FB-WPT) can be mediated by a non-declarative or procedural learning system. To address this issue, we compared the effects of training time and verbal working memory, which influence the declarative learning system but not the non-declarative learning system, in the FB and paired-associate (PA) WPTs, as the PA task recruits a declarative learning system. The results of Experiment 1 showed that the optimal accuracy in the PA condition was significantly decreased when the training time was reduced from 7 to 3 s, but this did not occur in the FB condition, although shortened training time impaired the acquisition of explicit knowledge in both conditions. The results of Experiment 2 showed that the concurrent working memory task impaired the optimal accuracy and the acquisition of explicit knowledge in the PA condition but did not influence the optimal accuracy or the acquisition of self-insight knowledge in the FB condition. The apparent dissociation results between the FB and PA conditions suggested that a non-declarative or procedural learning system is involved in the FB-WPT and provided new evidence for the multiple-systems theory of human category learning.

  20. Combining metric episodes with semantic event concepts within the Symbolic and Sub-Symbolic Robotics Intelligence Control System (SS-RICS)

    Science.gov (United States)

    Kelley, Troy D.; McGhee, S.

    2013-05-01

    This paper describes the ongoing development of a robotic control architecture that is inspired by computational cognitive architectures from the discipline of cognitive psychology. The Symbolic and Sub-Symbolic Robotics Intelligence Control System (SS-RICS) combines symbolic and sub-symbolic representations of knowledge into a unified control architecture. The new architecture leverages previous work in cognitive architectures, specifically the development of the Adaptive Character of Thought-Rational (ACT-R) and Soar. This paper details current work on learning from episodes or events. The use of episodic memory as a learning mechanism has, until recently, been largely ignored by computational cognitive architectures. This paper details work on metric-level episodic memory streams and methods for translating episodes into abstract schemas. The presentation will include research on learning through novelty and self-generated feedback mechanisms for autonomous systems.

  1. Flight Crew State Monitoring Metrics, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — eSky will develop specific crew state metrics based on the timeliness, tempo and accuracy of pilot inputs required by the H-mode Flight Control System (HFCS)....

  2. Main tasks of studying strong regulation of excitation of complex electrical system generators

    Energy Technology Data Exchange (ETDEWEB)

    Gruzdev, I.A.; Yekimova, M.M.

    1982-01-01

    A survey is made of the current state of studies of the damping properties of complex electricity systems. The stability calculation programs are based on frequency methods using the method of D-division. Now that strong-action automatic excitation regulators (ARVs) dominate at the synchronous generators (SGs), the task of coordinating their settings arises. Consequently, the following questions are discussed: the study of the properties of the quality functional with several points of regulation in circuits of different structure; the development of efficient procedures for coordinating the ARV settings of related energy systems; and the creation of resources for solving these tasks. Results are presented for coordinating the ARV settings of the generators of a 3-machine electricity system. As an example, nonlinear relationships are shown between the obtained degree of stability and the stabilization coefficient.

  3. Disintegration of power grid as part of the task of increasing functionality of electric system

    Directory of Open Access Journals (Sweden)

    Mukatov Bekzhan

    2017-01-01

    operation is inevitable with reduced reliability or, otherwise, with incomplete functionality, where functionality is the set of functions provided by the power system and the quality of their performance. With the mass introduction of distributed small generation in grids of almost all voltage classes, it is necessary to solve the problem of ensuring stability in previously passive distribution networks. The traditional approach, based on the "struggle" to maintain synchronism between power plants in the distribution networks, is associated with a number of difficulties, which calls for another approach to controlling modes in distribution networks. The complication of the power grid and automatic devices, the increase in possible mode variations, and the tendency to maximize the use of production assets lead to an increase in the complexity of the tasks solved by dispatch centers. In this regard, it is important to note that the occurrence of cascade failures in power systems speaks to the urgency of ensuring the survivability of energy supply systems both globally and locally. The paper shows how disintegration of the power grid can solve the task of ensuring the functionality of traditional power systems and help to create favorable conditions for integrating distributed small generation into the integrated electric power system.

  4. Assistance tools for generic definition of ITER maintenance tasks and scenarios in advanced supervisory control systems

    International Nuclear Information System (INIS)

    Zieba, Stéphane; Russotto, François-Xavier; Da Silva Simoes, Max; Measson, Yvan

    2013-01-01

    Highlights: ► Improve supervisory control systems for ITER in-vessel and hot cell maintenance. ► Optimize remote handling operations effectiveness, reliability and safety. ► Provide a generic description of the maintenance tasks and scenarios. ► Development of context-based assistances for operators and supervisor. ► Improvement of operator's situation awareness. -- Abstract: This paper concerns the improvement of supervisory control systems in the context of remote handling for maintenance tasks in ITER. This work aims at providing a single formalism and tools to define ITER maintenance tasks and scenarios in a generic way for in-vessel and hot cell operations. A three-layered approach is proposed to model these tasks and scenarios. Physical actions are defined for the scene elements. From these physical actions, behaviours are defined to represent high-level functionalities. Finally, interaction modes define the way that behaviours are achieved in terms of human-machine interactions. A case study of the blanket maintenance procedure is discussed with respect to the contributions of the descriptive model and the context-based assistances to supervisory control activities

  5. Complexity Management Using Metrics for Trajectory Flexibility Preservation and Constraint Minimization

    Science.gov (United States)

    Idris, Husni; Shen, Ni; Wing, David J.

    2011-01-01

    The growing demand for air travel is increasing the need for mitigating air traffic congestion and complexity problems, which are already at high levels. At the same time new surveillance, navigation, and communication technologies are enabling major transformations in the air traffic management system, including net-based information sharing and collaboration, performance-based access to airspace resources, and trajectory-based rather than clearance-based operations. The new system will feature different schemes for allocating tasks and responsibilities between the ground and airborne agents and between the human and automation, with potential capacity and cost benefits. Therefore, complexity management requires new metrics and methods that can support these new schemes. This paper presents metrics and methods for preserving trajectory flexibility that have been proposed to support a trajectory-based approach for complexity management by airborne or ground-based systems. It presents extensions to these metrics as well as to the initial research conducted to investigate the hypothesis that using these metrics to guide user and service provider actions will naturally mitigate traffic complexity. The analysis showed promising results in that: (1) Trajectory flexibility preservation mitigated traffic complexity as indicated by inducing self-organization in the traffic patterns and lowering traffic complexity indicators such as dynamic density and traffic entropy. (2) Trajectory flexibility preservation reduced the potential for secondary conflicts in separation assurance. (3) Trajectory flexibility metrics showed potential application to support user and service provider negotiations for minimizing the constraints imposed on trajectories without jeopardizing their objectives.
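
    One of the complexity indicators named in this abstract, traffic entropy, can be illustrated with a toy sketch: Shannon entropy of the distribution of aircraft headings binned into sectors. The binning scheme and the heading-based formulation are assumptions for illustration; the paper's actual metric definitions may differ.

    ```python
    import math
    from collections import Counter

    # Toy "traffic entropy" indicator: entropy of the heading distribution.
    # Organized traffic (similar headings) scores low; disordered traffic
    # (headings spread across sectors) scores high.

    def traffic_entropy(headings_deg, bin_width=45):
        bins = Counter(int(h % 360) // bin_width for h in headings_deg)
        n = len(headings_deg)
        # Shannon entropy in bits, written as sum p * log2(1/p).
        return sum((c / n) * math.log2(n / c) for c in bins.values())

    # All aircraft in one heading sector -> zero entropy.
    print(traffic_entropy([100, 110, 120, 95]))          # 0.0
    # Headings spread evenly over four sectors -> 2 bits.
    print(round(traffic_entropy([0, 90, 180, 270]), 1))  # 2.0
    ```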

  6. A convergence theory for probabilistic metric spaces | Jäger ...

    African Journals Online (AJOL)

    We develop a theory of probabilistic convergence spaces based on Tardiff's neighbourhood systems for probabilistic metric spaces. We show that the resulting category is a topological universe and we characterize a subcategory that is isomorphic to the category of probabilistic metric spaces. Keywords: Probabilistic metric ...

  7. Regional Sustainability: The San Luis Basin Metrics Project

    Science.gov (United States)

    There are a number of established, scientifically supported metrics of sustainability. Many of the metrics are data intensive and require extensive effort to collect data and compute. Moreover, individual metrics may not capture all aspects of a system that are relevant to sust...

  8. A Two-Level Task Scheduler on Multiple DSP System for OpenCL

    Directory of Open Access Journals (Sweden)

    Li Tian

    2014-04-01

    Full Text Available This paper addresses the problem that multiple-DSP systems do not support OpenCL programming. With the compiler, runtime, and kernel scheduler proposed, an OpenCL application becomes portable not only between multiple CPUs and GPUs, but also between embedded multiple-DSP systems. Firstly, the LLVM compiler was imported for source-to-source translation, in which the translated source was supported by CCS. Secondly, two-level schedulers were proposed to support efficient OpenCL kernel execution. The DSP/BIOS is used to schedule system-level tasks such as interrupts and drivers; however, its synchronization mechanism resulted in heavy overhead during task switching. So we designed an efficient second-level scheduler especially for OpenCL kernel work-item scheduling. The context switch process utilizes the 8 functional units and cross-path links, which is superior to DSP/BIOS in the aspect of task switching. Finally, dynamic loading and software-managed CACHE were redesigned for OpenCL running on a multiple-DSP system. We evaluated the performance using some common OpenCL kernels from the NVIDIA, AMD, NAS, and Parboil benchmarks. Experimental results show that DSP OpenCL can efficiently exploit the computing resources of multiple cores.

  9. Task V of the IEA Photovoltaic Power Systems Program: Accomplishments and Activities

    International Nuclear Information System (INIS)

    Bower, Ward

    1999-01-01

    The International Energy Agency (IEA) is an energy forum for 24 industrialized countries and was established in 1974 as an autonomous body within the Organization for Economic Cooperation and Development (OECD). The IEA Photovoltaic Power Systems (PVPS) program implementing agreement was signed in 1993, and renewed for another five years in 1998. Twenty-two countries are collaborating under the auspices of the IEA in the PVPS to address common technical and informational barriers that often limit the rate at which photovoltaic technologies advance into the markets. Task V of the IEA PVPS is entitled ''Grid Interconnection of Building-Integrated and Other Dispersed Photovoltaic Power Systems.'' The task sponsored a workshop in September 1997 on grid-interconnection of photovoltaic systems and is planning a second workshop to address impacts of more penetration of dispersed systems into the utility grid. This paper will summarize the accomplishments of Task V over the last five years and will detail the planned work for the next three years

  10. Overview of the ID, EPI and REL tasks of BioNLP Shared Task 2011

    Directory of Open Access Journals (Sweden)

    Pyysalo Sampo

    2012-06-01

    of performance sufficient for user-facing applications. In this study, we extend on previously reported results and perform further analyses of the outputs of the participating systems. We place specific emphasis on aspects of system performance relating to real-world applicability, considering alternate evaluation metrics and performing additional manual analysis of system outputs. We further demonstrate that the strengths of extraction systems can be combined to improve on the performance achieved by any system in isolation. The manually annotated corpora, supporting resources, and evaluation tools for all tasks are available from http://www.bionlp-st.org and the tasks continue as open challenges for all interested parties.

  11. A support system for water system isolation task of nuclear power plant by using augmented reality and RFID

    International Nuclear Information System (INIS)

    Shimoda, Hiroshi; Ishii, Hirotake; Yamazaki, Yuichiro; Yoshikawa, Hidekazu

    2004-01-01

    Aiming at improvement of task performance and reduction of human error in the water system isolation task in NPP periodic maintenance, a support system using state-of-the-art information technologies, Augmented Reality (AR) and Radio Frequency Identification (RFID), has been proposed under the concept of an off-site operation and maintenance support center, and a prototype system has been developed. The system has a navigation function in which an indication is superimposed directly on the user's view to help find the designated valves by AR. It also has a valve confirmation function performed by scanning an RFID tag attached to the valve. Using the prototype system, an evaluation experiment has been conducted in order to confirm its effectiveness and to reveal its problems. As the result of the experiment, it was found that the system improved the efficiency and reliability of the water system isolation task; it was also found that the visibility of the HMD and the discomfort of wearing it were the main problems of the system. (author)

  12. A support system for water system isolation task of nuclear power plant by using augmented reality and RFID

    Energy Technology Data Exchange (ETDEWEB)

    Shimoda, Hiroshi; Ishii, Hirotake; Yamazaki, Yuichiro; Yoshikawa, Hidekazu [Kyoto Univ., Graduate School of Energy Science, Uji, Kyoto (Japan)

    2004-07-15

    Aiming at improvement of task performance and reduction of human error in the water system isolation task in NPP periodic maintenance, a support system using state-of-the-art information technologies, Augmented Reality (AR) and Radio Frequency Identification (RFID), has been proposed under the concept of an off-site operation and maintenance support center, and a prototype system has been developed. The system has a navigation function in which an indication is superimposed directly on the user's view to help find the designated valves by AR. It also has a valve confirmation function performed by scanning an RFID tag attached to the valve. Using the prototype system, an evaluation experiment has been conducted in order to confirm its effectiveness and to reveal its problems. As the result of the experiment, it was found that the system improved the efficiency and reliability of the water system isolation task; it was also found that the visibility of the HMD and the discomfort of wearing it were the main problems of the system. (author)

  13. CETF Space Station payload pointing system design and analysis feasibility study. [Critical Evaluation Task Force

    Science.gov (United States)

    Smagala, Tom; Mcglew, Dave

    1988-01-01

    The expected pointing performance of an attached payload coupled to the Critical Evaluation Task Force Space Station via a payload pointing system (PPS) is determined. The PPS is a 3-axis gimbal which provides the capability for maintaining inertial pointing of a payload in the presence of disturbances associated with the Space Station environment. A system where the axes of rotation were offset from the payload center of mass (CM) by 10 in. in the Z axis was studied as well as a system having the payload CM offset by only 1 inch. There is a significant improvement in pointing performance when going from the 10 in. to the 1 in. gimbal offset.

  14. IEA Wind Task 37: Systems Modeling Framework and Ontology for Wind Turbines and Plants

    Energy Technology Data Exchange (ETDEWEB)

    Dykes, Katherine L [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Zahle, Frederik [Technical University of Denmark; Merz, Karl [SINTEF Energy Research; McWilliam, Mike [Technical University of Denmark; Bortolotti, Pietro [Technical University Munich

    2017-08-14

    This presentation will provide an overview of progress to date in the development of a system modeling framework and ontology for wind turbines and plants as part of the larger IEA Wind Task 37 on wind energy systems engineering. The goals of the effort are to create a set of guidelines for a common conceptual architecture for wind turbines and plants so that practitioners can more easily share descriptions of wind turbines and plants across multiple parties and reduce the effort for translating descriptions between models; integrate different models together and collaborate on model development; and translate models among different levels of fidelity in the system.

  15. Deep Transfer Metric Learning.

    Science.gov (United States)

    Junlin Hu; Jiwen Lu; Yap-Peng Tan; Jie Zhou

    2016-12-01

    Conventional metric learning methods usually assume that the training and test samples are captured in similar scenarios so that their distributions are assumed to be the same. This assumption does not hold in many real visual recognition applications, especially when samples are captured across different data sets. In this paper, we propose a new deep transfer metric learning (DTML) method to learn a set of hierarchical nonlinear transformations for cross-domain visual recognition by transferring discriminative knowledge from the labeled source domain to the unlabeled target domain. Specifically, our DTML learns a deep metric network by maximizing the inter-class variations and minimizing the intra-class variations, and minimizing the distribution divergence between the source domain and the target domain at the top layer of the network. To better exploit the discriminative information from the source domain, we further develop a deeply supervised transfer metric learning (DSTML) method by including an additional objective on DTML, where the output of both the hidden layers and the top layer are optimized jointly. To preserve the local manifold of input data points in the metric space, we present two new methods, DTML with autoencoder regularization and DSTML with autoencoder regularization. Experimental results on face verification, person re-identification, and handwritten digit recognition validate the effectiveness of the proposed methods.
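
    The loss structure described in this abstract (maximize inter-class variation, minimize intra-class variation, and penalize the source-target distribution divergence) can be illustrated with a toy numpy sketch. The function names, the mean-difference stand-in for the divergence term, and the weights `alpha` and `beta` are assumptions for illustration, not the authors' DTML network or training procedure.

    ```python
    import numpy as np

    # Toy illustration of a transfer metric learning objective: pull
    # same-class samples together, push different classes apart, and
    # penalize the gap between source and target feature distributions.

    def pairwise_sq_dists(X):
        """Squared Euclidean distances between all rows of X."""
        d = X[:, None, :] - X[None, :, :]
        return (d ** 2).sum(-1)

    def metric_loss(X, y, X_target, alpha=1.0, beta=1.0):
        D = pairwise_sq_dists(X)
        same = y[:, None] == y[None, :]
        intra = D[same].mean()    # intra-class variation (minimize)
        inter = D[~same].mean()   # inter-class variation (maximize)
        # Simple mean-difference proxy for the domain divergence term.
        mmd = ((X.mean(0) - X_target.mean(0)) ** 2).sum()
        return intra - alpha * inter + beta * mmd

    rng = np.random.default_rng(0)
    X = rng.normal(size=(6, 4))            # labeled source features
    y = np.array([0, 0, 0, 1, 1, 1])
    X_target = rng.normal(size=(5, 4))     # unlabeled target features
    print(float(metric_loss(X, y, X_target)))
    ```

    A lower loss corresponds to tighter classes, wider class separation, and better-aligned domains; a real DTML implementation would minimize such an objective over the parameters of a deep network rather than over raw features.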

  16. General relativity: An erfc metric

    Science.gov (United States)

    Plamondon, Réjean

    2018-06-01

    This paper proposes an erfc potential to incorporate in a symmetric metric. One key feature of this model is that it relies on the existence of an intrinsic physical constant σ, a star-specific proper length that scales all its surroundings. Based thereon, the new metric is used to study the space-time geometry of a static symmetric massive object, as seen from its interior. The analytical solutions to the Einstein equation are presented, highlighting the absence of singularities and discontinuities in such a model. The geodesics are derived in their second- and first-order differential formats. Recalling the slight impact of the new model on the classical general relativity tests in the solar system, a number of facts and open problems are briefly revisited on the basis of a heuristic definition of σ. A special attention is given to gravitational collapses and non-singular black holes.

  17. Examining the Use of a Visual Analytics System for Sensemaking Tasks: Case Studies with Domain Experts.

    Science.gov (United States)

    Kang, Youn-Ah; Stasko, J

    2012-12-01

    While the formal evaluation of systems in visual analytics is still relatively uncommon, particularly rare are case studies of prolonged system use by domain analysts working with their own data. Conducting case studies can be challenging, but it can be a particularly effective way to examine whether visual analytics systems are truly helping expert users to accomplish their goals. We studied the use of a visual analytics system for sensemaking tasks on documents by six analysts from a variety of domains. We describe their application of the system along with the benefits, issues, and problems that we uncovered. Findings from the studies identify features that visual analytics systems should emphasize as well as missing capabilities that should be addressed. These findings inform design implications for future systems.

  18. Small Engine Technology (SET) Task 24 Business and Regional Aircraft System Studies

    Science.gov (United States)

    Lieber, Lysbeth

    2003-01-01

    This final report has been prepared by Honeywell Engines & Systems, Phoenix, Arizona, a unit of Honeywell International Inc., documenting work performed during the period June 1999 through December 1999 for the National Aeronautics and Space Administration (NASA) Glenn Research Center, Cleveland, Ohio, under the Small Engine Technology (SET) Program, Contract No. NAS3-27483, Task Order 24, Business and Regional Aircraft System Studies. The work performed under SET Task 24 consisted of evaluating the noise reduction benefits compared to the baseline noise levels of representative 1992 technology aircraft, obtained by applying different combinations of noise reduction technologies to five business and regional aircraft configurations. This report focuses on the selection of the aircraft configurations and noise reduction technologies, the prediction of noise levels for those aircraft, and the comparison of the noise levels with those of the baseline aircraft.

  19. Contrasting single and multi-component working-memory systems in dual tasking.

    Science.gov (United States)

    Nijboer, Menno; Borst, Jelmer; van Rijn, Hedderik; Taatgen, Niels

    2016-05-01

Working memory can be a major source of interference in dual tasking. However, there is no consensus on whether this interference is the result of a single working memory bottleneck, or of interactions between different working memory components that together form a complete working-memory system. We report a behavioral and an fMRI dataset in which working memory requirements are manipulated during multitasking. We show that a computational cognitive model that assumes a distributed version of working memory accounts for both behavioral and neuroimaging data better than a model that takes a more centralized approach. The model's working memory consists of an attentional focus, declarative memory, and a subvocalized rehearsal mechanism. Thus, the data and model favor an account where working memory interference in dual tasking is the result of interactions between different resources that together form a working-memory system.

  20. RF model of the distribution system as a communication channel, phase 2. Volume 2: Task reports

    Science.gov (United States)

    Rustay, R. C.; Gajjar, J. T.; Rankin, R. W.; Wentz, R. C.; Wooding, R.

    1982-01-01

Based on the established feasibility of predicting, via a model, the propagation of power line carrier signals on radial type distribution feeders, verification studies comparing model predictions against measurements were undertaken using more complicated feeder circuits and situations. Detailed accounts of the major tasks are presented. These include: (1) verification of the model; (2) extension, implementation, and verification of perturbation theory; (3) parameter sensitivity; (4) transformer modeling; and (5) compensation of power distribution systems for enhancement of power line carrier communication reliability.

  1. International Energy Agency Ocean Energy Systems Task 10 Wave Energy Converter Modeling Verification and Validation: Preprint

    Energy Technology Data Exchange (ETDEWEB)

Wendt, Fabian F [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Yu, Yi-Hsiang [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Nielsen, Kim [Ramboll, Copenhagen (Denmark); Ruehl, Kelley [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bunnik, Tim [MARIN (Netherlands); Touzon, Imanol [Tecnalia (Spain); Nam, Bo Woo [KRISO (Korea, Rep. of); Kim, Jeong Seok [KRISO (Korea, Rep. of); Janson, Carl Erik [Chalmers University (Sweden); Jakobsen, Ken-Robert [EDRMedeso (Norway); Crowley, Sarah [WavEC (Portugal); Vega, Luis [Hawaii Natural Energy Institute (United States); Rajagopalan, Krishnakimar [Hawaii Natural Energy Institute (United States); Mathai, Thomas [Glosten (United States); Greaves, Deborah [Plymouth University (United Kingdom); Ransley, Edward [Plymouth University (United Kingdom); Lamont-Kane, Paul [Queen's University Belfast (United Kingdom); Sheng, Wanan [University College Cork (Ireland); Costello, Ronan [Wave Venture (United Kingdom); Kennedy, Ben [Wave Venture (United Kingdom); Thomas, Sarah [Floating Power Plant (Denmark); Heras, Pilar [Floating Power Plant (Denmark); Bingham, Harry [Technical University of Denmark (Denmark); Kurniawan, Adi [Aalborg University (Denmark); Kramer, Morten Mejlhede [Aalborg University (Denmark); Ogden, David [INNOSEA (France); Girardin, Samuel [INNOSEA (France); Babarit, Aurelien [EC Nantes (France); Wuillaume, Pierre-Yves [EC Nantes (France); Steinke, Dean [Dynamic Systems Analysis (Canada); Roy, Andre [Dynamic Systems Analysis (Canada); Beatty, Scott [Cascadia Coast Research (Canada); Schofield, Paul [ANSYS (United States); Kim, Kyong-Hwan [KRISO (Korea, Rep. of); Jansson, Johan [KTH Royal Inst. of Technology, Stockholm (Sweden); BCAM (Spain); Hoffman, Johan [KTH Royal Inst. of Technology, Stockholm (Sweden)

    2017-10-16

    This is the first joint reference paper for the Ocean Energy Systems (OES) Task 10 Wave Energy Converter modeling verification and validation group. The group is established under the OES Energy Technology Network program under the International Energy Agency. OES was founded in 2001 and Task 10 was proposed by Bob Thresher (National Renewable Energy Laboratory) in 2015 and approved by the OES Executive Committee EXCO in 2016. The kickoff workshop took place in September 2016, wherein the initial baseline task was defined. Experience from similar offshore wind validation/verification projects (OC3-OC5 conducted within the International Energy Agency Wind Task 30) [1], [2] showed that a simple test case would help the initial cooperation to present results in a comparable way. A heaving sphere was chosen as the first test case. The team of project participants simulated different numerical experiments, such as heave decay tests and regular and irregular wave cases. The simulation results are presented and discussed in this paper.

  2. Perceived Task-Difficulty Recognition from Log-File Information for the Use in Adaptive Intelligent Tutoring Systems

    Science.gov (United States)

    Janning, Ruth; Schatten, Carlotta; Schmidt-Thieme, Lars

    2016-01-01

    Recognising students' emotion, affect or cognition is a relatively young field and still a challenging task in the area of intelligent tutoring systems. There are several ways to use the output of these recognition tasks within the system. The approach most often mentioned in the literature is using it for giving feedback to the students. The…

  3. Operationally efficient propulsion system study (OEPSS) data book. Volume 6; Space Transfer Propulsion Operational Efficiency Study Task of OEPSS

    Science.gov (United States)

    Harmon, Timothy J.

    1992-01-01

This document is the final report for the Space Transfer Propulsion Operational Efficiency Study Task of the Operationally Efficient Propulsion System Study (OEPSS) conducted by the Rocketdyne Division of Rockwell International. This study task evaluated and identified design concepts and technologies that minimize launch and in-space operations and optimize in-space vehicle propulsion system operability.

  4. Complementary roles of systems representing sensory evidence and systems detecting task difficulty during perceptual decision making

    Directory of Open Access Journals (Sweden)

    Douglas A Ruff

    2010-11-01

Perceptual decision making is a multi-stage process where incoming sensory information is used to select one option from several alternatives. Researchers typically have adopted one of two conceptual frameworks to define the criteria for determining whether a brain region is involved in decision computations. One framework, building on single-unit recordings in monkeys, posits that activity in a region involved in decision making reflects the accumulation of evidence toward a decision threshold, thus showing the lowest level of BOLD signal during the hardest decisions. The other framework instead posits that activity in a decision-making region reflects the difficulty of a decision, thus showing the highest level of BOLD signal during the hardest decisions. We had subjects perform a face detection task on degraded face images while we simultaneously recorded BOLD activity. We searched for brain regions where changes in BOLD activity during this task supported either of these frameworks by calculating the correlation of BOLD activity with reaction time, a measure of task difficulty. We found that the right supplementary eye field, right frontal eye field and right inferior frontal gyrus had increased activity relative to baseline that positively correlated with reaction time, while the left superior frontal sulcus and left middle temporal gyrus had decreased activity relative to baseline that negatively correlated with reaction time. We propose that a simple mechanism that scales a region's activity based on task demands can explain our results.
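The core computation described in this abstract is a correlation between a region's per-trial BOLD amplitude and reaction time. A minimal illustrative sketch (not the authors' analysis pipeline; the function name and data layout are assumptions):

```python
# Illustrative sketch only: a plain Pearson correlation between a region's
# per-trial BOLD amplitude and reaction time (the difficulty measure).
# A positive r suggests a difficulty-tracking region; a negative r suggests
# an evidence-accumulation region, per the two frameworks in the abstract.

def pearson_r(bold, reaction_time):
    """Pearson correlation between two equal-length sequences of floats."""
    n = len(bold)
    mean_b = sum(bold) / n
    mean_rt = sum(reaction_time) / n
    cov = sum((b - mean_b) * (rt - mean_rt) for b, rt in zip(bold, reaction_time))
    var_b = sum((b - mean_b) ** 2 for b in bold)
    var_rt = sum((rt - mean_rt) ** 2 for rt in reaction_time)
    return cov / (var_b * var_rt) ** 0.5
```

In practice such correlations would be computed voxel- or region-wise across trials, with the sign of r distinguishing the two candidate frameworks.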

  5. A FAST AND ELITIST BI-OBJECTIVE EVOLUTIONARY ALGORITHM FOR SCHEDULING INDEPENDENT TASKS ON HETEROGENEOUS SYSTEMS

    Directory of Open Access Journals (Sweden)

G. Subashini

    2010-07-01

To meet increasing computational demands, geographically distributed resources need to be logically coupled to make them work as a unified resource. In analyzing the performance of such distributed heterogeneous computing systems, scheduling a set of tasks to the available set of resources for execution is highly important. Task scheduling being an NP-complete problem, the use of metaheuristics is more appropriate for obtaining optimal solutions. Schedules thus obtained can be evaluated using several criteria that may conflict with one another, which requires a multi-objective problem formulation. This paper investigates the application of an elitist Nondominated Sorting Genetic Algorithm (NSGA-II) to efficiently schedule a set of independent tasks in a heterogeneous distributed computing system. The objectives considered in this paper include minimizing makespan and average flowtime simultaneously. The implementation of the NSGA-II algorithm and a Weighted-Sum Genetic Algorithm (WSGA) has been tested on benchmark instances for distributed heterogeneous systems. As NSGA-II generates a set of Pareto optimal solutions, to verify the effectiveness of NSGA-II over WSGA, a fuzzy-based membership value assignment method is employed to choose the best compromise solution from the obtained Pareto solution set.
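The two objectives named here, makespan and average flowtime, are standard scheduling measures. A minimal sketch of how they could be evaluated for one candidate schedule (the data structures are illustrative assumptions, not the paper's implementation):

```python
# Hedged sketch: evaluating the two objectives the NSGA-II scheduler
# minimizes. A schedule assigns each independent task to one machine,
# and tasks on a machine run back to back.

def makespan_and_flowtime(schedule, exec_time):
    """schedule: {machine: [task ids in run order]};
    exec_time[task][machine]: execution time of that task on that machine."""
    completion = []       # completion time of every task
    machine_finish = []   # finish time of every machine
    for machine, tasks in schedule.items():
        t = 0.0
        for task in tasks:
            t += exec_time[task][machine]
            completion.append(t)
        machine_finish.append(t)
    makespan = max(machine_finish)                    # when the last machine finishes
    avg_flowtime = sum(completion) / len(completion)  # mean task completion time
    return makespan, avg_flowtime
```

A multi-objective GA would score every candidate schedule in its population with this pair of numbers and rank candidates by Pareto dominance, since improving one objective often worsens the other.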

  6. Remotely controlled inspection and handling systems for decommissioning tasks in nuclear facilities

    International Nuclear Information System (INIS)

    Schreck, G.; Bach, W.; Haferkamp, H.

    1993-01-01

The Institut für Werkstoffkunde at the University of Hanover has recently developed three remotely controlled systems for different underwater inspection and dismantling tasks. ODIN I is a tool guiding device, particularly designed for the dismantling of the steam dryer housing of the KRB A power plant at Gundremmingen, Germany. After being approved by the licencing organization TÜV Bayern, hot operation started in November 1992. The seven-axis remotely controlled handling system ZEUS, consisting of a three translatory axes guiding machine and a tool handling device with four rotatory axes, has been developed for the demonstration of underwater plasma arc cutting of spherical metallic components with great wall thicknesses. A specially designed twin sensor system and a modular torch, exchanged by means of a remotely controlled tool changing device, will be used for different complex cutting tasks. FAUST, an autonomous, free-diving underwater vehicle, was designed for complex inspection, maintenance and dismantling tasks. It is equipped with two video cameras, an ultrasonic and a radiologic sensor and a small plasma torch. A gripper and a subsidiary vehicle for inspection may be attached. (author)

  7. Quench detection system of the EURATOM coil for the Large Coil Task

    International Nuclear Information System (INIS)

    Noether, G.; Gauss, S.; Maurer, W.; Siewerdt, L.; Ulbricht, A.; Wuechner, F.

    1989-01-01

A special quench detection system has been developed for the EURATOM Large Coil Task (LCT) coil. The system is based on a bridge circuit which uses a special 'two in hand' winding technique for the pancakes of the EURATOM LCT coil. The electronic circuit was designed in a fail-safe way to prevent failure of the quench detector due to failure of one of its components. A method for quick balancing of the quench detection system in a large toroidal magnet system was applied. The quench detection system worked very reliably during the experimental phase of the LCT within the quench detection level setting of 50 mV, i.e., the system was not sensitive to poloidal field transients at or below this level. Non-electrical methods for quench detection were also investigated. (author)

  8. Human cognitive task distribution model for maintenance support system of a nuclear power plant

    Energy Technology Data Exchange (ETDEWEB)

    Park, Young Ho

    2007-02-15

In human factors research, more attention has been devoted to the operation of nuclear power plants (NPPs) than to their maintenance. However, maintenance-related human error accounts for 45% of all human errors in Korean nuclear power plants from 1990 to 2005. Therefore, it is necessary to study human factors in the maintenance of an NPP. There is a current trend toward introducing digital technology into both safety and non-safety systems in NPPs. A variety of information about plant conditions can be used digitally. In the future, maintenance support systems will be developed based on an information-oriented NPP. In this context, it is necessary to study the cognitive tasks of the personnel involved in maintenance and the interaction between the personnel and maintenance support systems. The fundamental purpose of this work is to determine how to distribute the cognitive tasks of the personnel involved in maintenance in order to develop a maintenance support system that considers human factors. The second purpose is to find the causes of errors due to engineers or maintainers and propose system functions that are countermeasures to reduce these errors. In this paper, a cognitive task distribution model of the personnel involved in maintenance is proposed using Rasmussen's decision making model. First, the personnel were divided into three groups: the operators (inspectors), engineers, and maintainers. Second, human cognitive tasks related to maintenance were distributed based on these groups. The operators' cognitive tasks are detection and observation; the engineers' cognitive tasks are identification, evaluation, target state, select target, and procedure; and the maintainers' cognitive task is execution. The case study is an analysis of failure reports related to human error in maintenance over a period of 15 years. By using error classification based on the information processing approach, the human errors involved in maintenance were classified

  9. Human cognitive task distribution model for maintenance support system of a nuclear power plant

    International Nuclear Information System (INIS)

    Park, Young Ho

    2007-02-01

In human factors research, more attention has been devoted to the operation of nuclear power plants (NPPs) than to their maintenance. However, maintenance-related human error accounts for 45% of all human errors in Korean nuclear power plants from 1990 to 2005. Therefore, it is necessary to study human factors in the maintenance of an NPP. There is a current trend toward introducing digital technology into both safety and non-safety systems in NPPs. A variety of information about plant conditions can be used digitally. In the future, maintenance support systems will be developed based on an information-oriented NPP. In this context, it is necessary to study the cognitive tasks of the personnel involved in maintenance and the interaction between the personnel and maintenance support systems. The fundamental purpose of this work is to determine how to distribute the cognitive tasks of the personnel involved in maintenance in order to develop a maintenance support system that considers human factors. The second purpose is to find the causes of errors due to engineers or maintainers and propose system functions that are countermeasures to reduce these errors. In this paper, a cognitive task distribution model of the personnel involved in maintenance is proposed using Rasmussen's decision making model. First, the personnel were divided into three groups: the operators (inspectors), engineers, and maintainers. Second, human cognitive tasks related to maintenance were distributed based on these groups. The operators' cognitive tasks are detection and observation; the engineers' cognitive tasks are identification, evaluation, target state, select target, and procedure; and the maintainers' cognitive task is execution. The case study is an analysis of failure reports related to human error in maintenance over a period of 15 years. By using error classification based on the information processing approach, the human errors involved in maintenance were classified

  10. Space Station data system analysis/architecture study. Task 1: Functional requirements definition, DR-5

    Science.gov (United States)

    1985-01-01

    The initial task in the Space Station Data System (SSDS) Analysis/Architecture Study is the definition of the functional and key performance requirements for the SSDS. The SSDS is the set of hardware and software, both on the ground and in space, that provides the basic data management services for Space Station customers and systems. The primary purpose of the requirements development activity was to provide a coordinated, documented requirements set as a basis for the system definition of the SSDS and for other subsequent study activities. These requirements should also prove useful to other Space Station activities in that they provide an indication of the scope of the information services and systems that will be needed in the Space Station program. The major results of the requirements development task are as follows: (1) identification of a conceptual topology and architecture for the end-to-end Space Station Information Systems (SSIS); (2) development of a complete set of functional requirements and design drivers for the SSIS; (3) development of functional requirements and key performance requirements for the Space Station Data System (SSDS); and (4) definition of an operating concept for the SSIS. The operating concept was developed both from a Space Station payload customer and operator perspective in order to allow a requirements practicality assessment.

  11. Performance metrics for state-of-the-art airborne magnetic and electromagnetic systems for mapping and detection of unexploded ordnance

    Science.gov (United States)

    Doll, William E.; Bell, David T.; Gamey, T. Jeffrey; Beard, Les P.; Sheehan, Jacob R.; Norton, Jeannemarie

    2010-04-01

    Over the past decade, notable progress has been made in the performance of airborne geophysical systems for mapping and detection of unexploded ordnance in terrestrial and shallow marine environments. For magnetometer systems, the most significant improvements include development of denser magnetometer arrays and vertical gradiometer configurations. In prototype analyses and recent Environmental Security Technology Certification Program (ESTCP) assessments using new production systems the greatest sensitivity has been achieved with a vertical gradiometer configuration, despite model-based survey design results which suggest that dense total-field arrays would be superior. As effective as magnetometer systems have proven to be at many sites, they are inadequate at sites where basalts and other ferrous geologic formations or soils produce anomalies that approach or exceed those of target ordnance items. Additionally, magnetometer systems are ineffective where detection of non-ferrous ordnance items is of primary concern. Recent completion of the Battelle TEM-8 airborne time-domain electromagnetic system represents the culmination of nearly nine years of assessment and development of airborne electromagnetic systems for UXO mapping and detection. A recent ESTCP demonstration of this system in New Mexico showed that it was able to detect 99% of blind-seeded ordnance items, 81mm and larger, and that it could be used to map in detail a bombing target on a basalt flow where previous airborne magnetometer surveys had failed. The probability of detection for the TEM-8 in the blind-seeded study area was better than that reported for a dense-array total-field magnetometer demonstration of the same blind-seeded site, and the TEM-8 system successfully detected these items with less than half as many anomaly picks as the dense-array total-field magnetometer system.
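The headline figures quoted above (99% detection of blind-seeded items, fewer anomaly picks than the competing magnetometer system) reduce to simple ratios over a blind-seeded test area. An illustrative sketch, with assumed function and parameter names that are not from the report:

```python
# Illustrative only: the detection performance measures used in blind-seeded
# UXO demonstrations are ratios over the seeded ground truth.

def detection_metrics(n_seeded, n_detected, n_anomaly_picks):
    """n_seeded: blind-seeded ordnance items emplaced;
    n_detected: how many of them the system's anomaly picks covered;
    n_anomaly_picks: total anomalies flagged (a proxy for dig-list cost)."""
    pod = n_detected / n_seeded                    # probability of detection
    picks_per_detection = n_anomaly_picks / n_detected
    return pod, picks_per_detection
```

Comparing systems at a fixed probability of detection then comes down to which one achieves it with fewer anomaly picks, the comparison the abstract makes between the TEM-8 and the dense-array total-field magnetometer.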

  12. Assessing the Greenness of Chemical Reactions in the Laboratory Using Updated Holistic Graphic Metrics Based on the Globally Harmonized System of Classification and Labeling of Chemicals

    Science.gov (United States)

    Ribeiro, M. Gabriela T. C.; Yunes, Santiago F.; Machado, Adelio A. S. C.

    2014-01-01

    Two graphic holistic metrics for assessing the greenness of synthesis, the "green star" and the "green circle", have been presented previously. These metrics assess the greenness by the degree of accomplishment of each of the 12 principles of green chemistry that apply to the case under evaluation. The criteria for assessment…

  13. Metrication: An economic wake-up call for US industry

    Science.gov (United States)

    Carver, G. P.

    1993-03-01

    As the international standard of measurement, the metric system is one key to success in the global marketplace. International standards have become an important factor in international economic competition. Non-metric products are becoming increasingly unacceptable in world markets that favor metric products. Procurement is the primary federal tool for encouraging and helping U.S. industry to convert voluntarily to the metric system. Besides the perceived unwillingness of the customer, certain regulatory language, and certain legal definitions in some states, there are no major impediments to conversion of the remaining non-metric industries to metric usage. Instead, there are good reasons for changing, including an opportunity to rethink many industry standards and to take advantage of size standardization. Also, when the remaining industries adopt the metric system, they will come into conformance with federal agencies engaged in similar activities.

  14. Task analysis and structure scheme for center manager station in large container inspection system

    International Nuclear Information System (INIS)

    Li Zheng; Gao Wenhuan; Wang Jingjin; Kang Kejun; Chen Zhiqiang

    1997-01-01

LCIS works as follows: the accelerator generates beam pulses, which are formed into a fan shape; the scanning system drags a lorry with a container through the beam at constant speed; the detector array detects the beam penetrating the lorry; and the projection data acquisition system reads the projections and completes an inspection image of the lorry. All of these operations are controlled and synchronized by the center manager station. The author describes the process of projection data acquisition in scanning mode and the methods of real-time projection data processing. The task analysis and structure scheme of the center manager station are presented

  15. ExM:System Support for Extreme-Scale, Many-Task Applications

    Energy Technology Data Exchange (ETDEWEB)

    Katz, Daniel S

    2011-05-31

The ever-increasing power of supercomputer systems is both driving and enabling the emergence of new problem-solving methods that require the efficient execution of many concurrent and interacting tasks. Methodologies such as rational design (e.g., in materials science), uncertainty quantification (e.g., in engineering), parameter estimation (e.g., for chemical and nuclear potential functions, and in economic energy systems modeling), massive dynamic graph pruning (e.g., in phylogenetic searches), Monte-Carlo-based iterative fixing (e.g., in protein structure prediction), and inverse modeling (e.g., in reservoir simulation) all have these requirements. These many-task applications frequently have aggregate computing needs that demand the fastest computers. For example, proposed next-generation climate model ensemble studies will involve 1,000 or more runs, each requiring 10,000 cores for a week, to characterize model sensitivity to initial condition and parameter uncertainty. The goal of the ExM project is to achieve the technical advances required to execute such many-task applications efficiently, reliably, and easily on petascale and exascale computers. In this way, we will open up extreme-scale computing to new problem-solving methods and application classes. In this document, we report on the combined technical progress of the collaborative ExM project, and the institutional financial status of the portion of the project at the University of Chicago, over the first 8 months (through April 30, 2011).

  16. Student Task Analysis for the Development of E-Learning Lectural System in Basic Chemistry Courses in FKIP UMMY Solok

    Science.gov (United States)

    Afrahamiryano, A.; Ariani, D.

    2018-04-01

The student task analysis is one part of the define stage in development research using the 4-D development model. This task analysis is useful for determining the level of students' understanding of the lecture materials that have been given. The results of the task analysis serve as a measuring tool to determine the level of success of learning and as a basis for the development of the lecture system. The analysis was done through observation and a documentation study of the tasks undertaken by students. The results of the analysis are then described, after which triangulation is done to draw conclusions. The results indicate that the students' level of understanding is high for theoretical material and low for material involving calculation. Based on the results of this task analysis, it can be concluded that the e-learning lecture system developed should be able to increase students' understanding of basic chemistry material involving calculation.

  17. METRICS DEVELOPMENT FOR PATENTS.

    Science.gov (United States)

    Veiga, Daniela Francescato; Ferreira, Lydia Masako

    2015-01-01

To develop a proposal for patent metrics to be applied in assessing the postgraduate programs of the Medicine III area of Capes. From a reading and analysis of the 2013 area documents of all 48 Capes areas, a proposal for patent metrics was developed to be applied to Medicine III programs. Except for the areas of Biotechnology, Food Science, Biological Sciences III, Physical Education, Engineering I, III and IV, and Interdisciplinary, most areas do not adopt a scoring system for patents. The proposal developed was based on the criteria of Biotechnology, with adaptations. In general, the deposit, the granting, and licensing/production will be valued, in ascending order. Higher scores will also be assigned to patents registered abroad and whenever there is participation of students. This proposal can be applied to the Intellectual Production item of the evaluation form, in the Technical Production/Patents subsection. The percentages of 10% for academic programs and 40% for professional Masters should be maintained. A program will be scored as Very Good when it reaches 400 points or more; Good, between 200 and 399 points; Regular, between 71 and 199 points; Weak, up to 70 points; and Insufficient, with no points.
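The proposed rating scale maps a program's total patent points to a grade. A sketch taken directly from the thresholds stated in the abstract (the function name is illustrative; the per-patent point values themselves are not specified there):

```python
# Rating thresholds as stated in the proposal's abstract; how individual
# patents accumulate points (deposit < granting < licensing/production,
# bonuses for foreign registration and student participation) is not
# quantified there, so only the final mapping is sketched.

def rate_program(points):
    """Map a program's total patent score to the proposed rating."""
    if points >= 400:
        return "Very Good"
    if points >= 200:
        return "Good"
    if points >= 71:
        return "Regular"
    if points >= 1:
        return "Weak"
    return "Insufficient"
```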

  18. Engineering task plan for flammable gas atmosphere mobile color video camera systems

    International Nuclear Information System (INIS)

    Kohlman, E.H.

    1995-01-01

This Engineering Task Plan (ETP) describes the design, fabrication, assembly, and testing of the mobile video camera systems. The color video camera systems will be used to observe and record the activities within the vapor space of a tank on a limited exposure basis. The units will be fully mobile and designed for operation in the single-shell flammable gas producing tanks. The objective of this task is to provide two mobile camera systems for use in flammable gas producing single-shell tanks (SSTs) for the Flammable Gas Tank Safety Program. The camera systems will provide observation, video recording, and monitoring of the activities that occur in the vapor space of the applied tanks. The camera systems will be designed to be totally mobile, capable of deployment up to 6.1 meters into a 4 inch (minimum) riser

  19. Electrodril system field test program. Phase II: Task C-1-deep drilling system demonstration. Final report for Phase II: Task C-1

    Energy Technology Data Exchange (ETDEWEB)

    Taylor, P D

    1981-04-01

The Electrodril Deep Drilling System field test demonstrations were aborted in July 1979 due to connector problems. Subsequent post-test analyses concluded that the field-replaceable connectors were the probable cause of the problems encountered. The designs for both the male and female connectors, together with their manufacturing processes, were subsequently modified, as were the acceptance test procedures. A total of nine male and nine female connectors were manufactured and delivered during the 2nd quarter of 1980. Exhaustive testing was then conducted on each connector as a precursor to formal qualification testing conducted during October 1980 at the Brown Oil Tool test facility located in Houston, Texas. With this report, requirements under Phase II, Task C-1 are satisfied. The report documents the results of the connector qualification test program, which was successfully completed October 28, 1980. In general, it was concluded that connector qualification had been achieved, and plans are now in progress to resume the field test demonstration program so that Electrodril System performance predictions and economic viability can be evaluated.

  20. The challenge of measuring emergency preparedness: integrating component metrics to build system-level measures for strategic national stockpile operations.

    Science.gov (United States)

    Jackson, Brian A; Faith, Kay Sullivan

    2013-02-01

Although significant progress has been made in measuring public health emergency preparedness, system-level performance measures are lacking. This report examines a potential approach to such measures for Strategic National Stockpile (SNS) operations. We adapted an engineering analytic technique used to assess the reliability of technological systems, failure mode and effects analysis, to assess preparedness. That technique, which includes systematic mapping of the response system and identification of possible breakdowns that affect performance, provides a path to use data from existing SNS assessment tools to estimate the likely future performance of the system overall. Systems models of SNS operations were constructed and failure mode analyses were performed for each component. Linking data from existing assessments, including the technical assistance review and functional drills, to reliability assessment was demonstrated using publicly available information. The use of failure mode and effects estimates to assess overall response system reliability was demonstrated with a simple simulation example. Reliability analysis appears to be an attractive way to integrate information from the substantial investment in detailed assessments for stockpile delivery and dispensing to provide a view of likely future response performance.
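Rolling component-level failure estimates up into a system-level reliability figure can be done in several ways; a minimal sketch, assuming independent failure modes in a series (all-components-must-work) system, which is not necessarily the report's own model:

```python
# Hedged sketch: if every response component (e.g., stockpile delivery,
# warehousing, dispensing) must succeed and failure modes are independent,
# system reliability is the product of the component reliabilities.

def system_reliability(failure_probs):
    """failure_probs: per-component probability of failure on demand,
    e.g., estimated from technical assistance reviews and drills."""
    reliability = 1.0
    for p in failure_probs:
        reliability *= 1.0 - p  # each component succeeds with probability 1 - p
    return reliability
```

This kind of roll-up makes the system-level consequence of any one weak component explicit: two components with 10% and 20% failure probabilities already cap system reliability at 72%.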

  1. Metrical Phonology and SLA.

    Science.gov (United States)

    Tice, Bradley S.

    Metrical phonology, a linguistic process of phonological stress assessment and diagrammatic simplification of sentence and word stress, is discussed as it is found in the English language with the intention that it may be used in second language instruction. Stress is defined by its physical and acoustical correlates, and the principles of…

  2. Metrics for Probabilistic Geometries

    DEFF Research Database (Denmark)

    Tosi, Alessandra; Hauberg, Søren; Vellido, Alfredo

    2014-01-01

    the distribution over mappings is given by a Gaussian process. We treat the corresponding latent variable model as a Riemannian manifold and we use the expectation of the metric under the Gaussian process prior to define interpolating paths and measure distance between latent points. We show how distances...

  3. Area of Concern: A new paradigm in life cycle assessment for the development of footprint metrics

    DEFF Research Database (Denmark)

    Ridoutt, Bradley G.; Pfister, Stephan; Manzardo, Alessandro

    2016-01-01

    As a class of environmental metrics, footprints have been poorly defined, have shared an unclear relationship to life cycle assessment (LCA), and the variety of approaches to quantification have sometimes resulted in confusing and contradictory messages in the marketplace. In response, a task force...... operating under the auspices of the UNEP/SETAC Life Cycle Initiative project on environmental life cycle impact assessment (LCIA) has been working to develop generic guidance for developers of footprint metrics. The purpose of this paper is to introduce a universal footprint definition and related...... terminology as well as to discuss modelling implications. The task force has worked from the perspective that footprints should be based on LCA methodology, underpinned by the same data systems and models as used in LCA. However, there are important differences in purpose and orientation relative to LCA...

  4. Feasibility of the adaptive and automatic presentation of tasks (ADAPT system for rehabilitation of upper extremity function post-stroke

    Directory of Open Access Journals (Sweden)

    Choi Younggeun

    2011-08-01

    Full Text Available Abstract Background Current guidelines for rehabilitation of arm and hand function after stroke recommend that motor training focus on realistic tasks that require reaching and manipulation and engage the patient intensively, actively, and adaptively. Here, we investigated the feasibility of a novel robotic task-practice system, ADAPT, designed in accordance with such guidelines. At each trial, ADAPT selects a functional task according to a training schedule and with difficulty based on previous performance. Once the task is selected, the robot picks up and presents the corresponding tool, simulates the dynamics of the task, and the patient interacts with the tool to perform the task. Methods Five participants with chronic stroke and mild to moderate impairments (> 9 months post-stroke; Fugl-Meyer arm score 49.2 ± 5.6) practiced four functional tasks (selected out of six in a pre-test) with ADAPT for about one and a half hours and 144 trials in a pseudo-random schedule of 3-trial blocks per task. Results No adverse events occurred, and ADAPT successfully presented the six functional tasks without human intervention for a total of 900 trials. Qualitative analysis of trajectories showed that ADAPT simulated the desired task dynamics adequately, and participants reported good, although not excellent, task fidelity. During training, the adaptive difficulty algorithm progressively increased task difficulty towards an optimal challenge point based on performance; difficulty was then continuously adjusted to keep performance around the challenge point. Furthermore, the time to complete all trained tasks decreased significantly from pretest to one-hour post-test. Finally, post-training questionnaires demonstrated positive patient acceptance of ADAPT. Conclusions ADAPT successfully provided adaptive progressive training for multiple functional tasks based on participants' performance. Our encouraging results establish the feasibility of ADAPT; its
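
The adaptive difficulty mechanism described here (raise difficulty after success, lower it after failure, so performance hovers around a challenge point) can be sketched as a simple one-up/one-down staircase. This is an illustrative sketch only: ADAPT's actual algorithm is not specified in the abstract, and the step size and ability threshold below are hypothetical.

```python
def adapt_difficulty(difficulty, success, step=0.1, lo=0.0, hi=1.0):
    """One-up/one-down staircase: raise difficulty after a success,
    lower it after a failure, clamped to [lo, hi]."""
    difficulty += step if success else -step
    return max(lo, min(hi, difficulty))

# Simulated session: the participant (hypothetically) succeeds whenever
# difficulty is below an ability level of 0.55, so the staircase should
# climb and then settle into oscillation around that level.
d, history = 0.2, []
for _ in range(20):
    success = d < 0.55
    d = adapt_difficulty(d, success)
    history.append(round(d, 2))
print(history)
```

A one-up/one-down rule converges to the difficulty at which success probability is about 50%; schemes targeting other success rates (e.g. two-up/one-down) use asymmetric step counts.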

  5. A documentation tool for product configuration systems - improving the documentation task

    DEFF Research Database (Denmark)

    Hvam, Lars; Jensen, Klaes Ladeby

    2005-01-01

    's experience with the procedure and the hitherto empirical experience from companies having applied the procedure have revealed that there is a need for an IT-based docu-mentation tool to support the process of constructing product configuration systems. Time can be saved by letting a documentation tool handle......Configuration systems are increasingly applied to automate the configuration of complex products. A configuration system is an expert system designed to combine specified modules according to constraints. The constraints are stored as product data and rules in a product model, and one of the most...... essential tasks is thus to develop a complete and consistent product model which can reflect the actual product. A procedure for building product models has been developed at the Centre for Product Modelling (CPM), and the pro-cedure has been successfully applied in several industrial companies. CPM...

  6. Rapid analysis of hay attributes using NIRS. Final report, Task II alfalfa supply system

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-10-24

    This final report provides technical information on the development of a near infrared reflectance spectroscopy (NIRS) system for the analysis of alfalfa hay. The purpose of the system is to provide consistent quality for processing alfalfa stems for fuel and alfalfa leaf meal products for livestock feed. Project tasks were to: (1) develop an NIRS driven analytical system for analysis of alfalfa hay and processed alfalfa products; (2) assist in hiring a qualified NIRS technician and recommend changes in testing equipment necessary to provide accurate analysis; (3) calibrate the NIRS instrument for accurate analyses; and (4) develop prototype equipment and sampling procedures as a first step towards development of a totally automated sampling system that would rapidly sample and record incoming feedstock and outbound product. An accurate hay testing program was developed, along with calibration equations for analyzing alfalfa hay and sun-cured alfalfa pellets. A preliminary leaf steam calibration protocol was also developed. 7 refs., 11 figs., 10 tabs.

  7. Different Neural Systems Contribute to Semantic Bias and Conflict Detection in the Inclusion Fallacy Task

    Directory of Open Access Journals (Sweden)

    Peipeng eLiang

    2014-10-01

    Full Text Available more general conclusion category is considered stronger than a generalization to a specific conclusion category nested within the more general set. Such inferences violate rational norms and are part of the reasoning fallacy literature that provides interesting tasks to explore cognitive and neural basis of reasoning. To explore the functional neuroanatomy of the inclusion fallacy, we used a 2×2 factorial design, with factors for Quantification (explicit and implicit and Response (fallacious and nonfallacious. It was found that a left fronto-temporal system, along with a superior medial frontal system, was specifically activated in response to fallacy responses consistent with a semantic biasing of judgment explanation. A right fronto-parietal system was specifically recruited in response to detecting conflict associated with the heightened fallacy condition. These results are largely consistent with previous studies of reasoning fallacy and support a multiple systems model of reasoning.

  8. Effect of the Enabling Perception of Costing Systems by Managers in the Performance of their Tasks

    Directory of Open Access Journals (Sweden)

    Guilherme Eduardo de Souza

    2017-12-01

    Full Text Available The goal of this study is to analyze the effect of managers' enabling perception of costing systems on the performance of their tasks, mediated by the intensity of use of these costing systems and the level of psychological empowerment. The research was carried out through a survey of 62 companies listed in 2014 in the Perfil das Empresas com Projetos Aprovados ou em Implantação na Zona Franca de Manaus (Profile of Companies with Projects Approved or Under Implementation in the Manaus Free Trade Zone). To test the hypotheses replicated from Mahama and Cheng's (2013) study, the Structural Equation Modeling technique was used. The research results show that managers' enabling perception of costing systems does not affect their intensity of use, but it does impact psychological empowerment, which is directly reflected in task performance, indicating that the greater the empowerment, the better manager performance will be. It is concluded that the model partially supports the relationships delineated and that the antecedents related to the costing systems require further study.

  9. Pragmatic security metrics applying metametrics to information security

    CERN Document Server

    Brotby, W Krag

    2013-01-01

    Other books on information security metrics discuss number theory and statistics in academic terms. Light on mathematics and heavy on utility, PRAGMATIC Security Metrics: Applying Metametrics to Information Security breaks the mold. This is the ultimate how-to-do-it guide for security metrics.Packed with time-saving tips, the book offers easy-to-follow guidance for those struggling with security metrics. Step by step, it clearly explains how to specify, develop, use, and maintain an information security measurement system (a comprehensive suite of metrics) to

  10. NASA Engineering Safety Center NASA Aerospace Flight Battery Systems Working Group 2007 Proactive Task Status

    Science.gov (United States)

    Manzo, Michelle A.

    2007-01-01

    In 2007, the NASA Engineering Safety Center (NESC) chartered the NASA Aerospace Flight Battery Systems Working Group to bring forth and address critical battery-related performance/manufacturing issues for NASA and the aerospace community. A suite of tasks identifying and addressing issues related to Ni-H2 and Li-ion battery chemistries was submitted and selected for implementation. The currently NESC-funded tasks are: (1) Wet Life of Ni-H2 Batteries (2) Binding Procurement (3) NASA Lithium-Ion Battery Guidelines (3a) Li-Ion Performance Assessment (3b) Li-Ion Guidelines Document (3b-i) Assessment of Applicability of Pouch Cells for Aerospace Missions (3b-ii) High Voltage Risk Assessment (3b-iii) Safe Charge Rates for Li-Ion Cells (4) Availability of Source Material for Li-Ion Cells (5) NASA Aerospace Battery Workshop. This presentation provides a brief overview of the tasks in the 2007 plan and serves as an introduction to more detailed discussions on each of the specific tasks.

  11. Metrix Matrix: A Cloud-Based System for Tracking Non-Relative Value Unit Value-Added Work Metrics.

    Science.gov (United States)

    Kovacs, Mark D; Sheafor, Douglas H; Thacker, Paul G; Hardie, Andrew D; Costello, Philip

    2018-03-01

    In the era of value-based medicine, it will become increasingly important for radiologists to provide metrics that demonstrate their value beyond clinical productivity. In this article the authors describe their institution's development of an easy-to-use system for tracking value-added but non-relative value unit (RVU)-based activities. Metrix Matrix is an efficient cloud-based system for tracking value-added work. A password-protected home page contains links to web-based forms created using Google Forms, with collected data populating Google Sheets spreadsheets. Value-added work metrics selected for tracking included interdisciplinary conferences, hospital committee meetings, consulting on nonbilled outside studies, and practice-based quality improvement. Over a period of 4 months, value-added work data were collected for all clinical attending faculty members in a university-based radiology department (n = 39). Time required for data entry was analyzed for 2 faculty members over the same time period. Thirty-nine faculty members (equivalent to 36.4 full-time equivalents) reported a total of 1,223.5 hours of value-added work time (VAWT). A formula was used to calculate "value-added RVUs" (vRVUs) from VAWT. VAWT amounted to 5,793.6 vRVUs or 6.0% of total work performed (vRVUs plus work RVUs [wRVUs]). Were vRVUs considered equivalent to wRVUs for staffing purposes, this would require an additional 2.3 full-time equivalents, on the basis of average wRVU calculations. Mean data entry time was 56.1 seconds per day per faculty member. As health care reimbursement evolves with an emphasis on value-based medicine, it is imperative that radiologists demonstrate the value they add to patient care beyond wRVUs. This free and easy-to-use cloud-based system allows the efficient quantification of value-added work activities. Copyright © 2017 American College of Radiology. Published by Elsevier Inc. All rights reserved.
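
This record reports converting value-added work time (VAWT) to "value-added RVUs" (vRVUs) by a formula the abstract does not spell out. A minimal sketch, assuming a simple linear conversion whose factor is implied by the reported totals (5,793.6 vRVUs from 1,223.5 hours, i.e. about 4.735 vRVUs per hour); the helper names below are hypothetical:

```python
# Conversion factor implied by the reported totals in the abstract
# (5,793.6 vRVUs / 1,223.5 h); the article's actual formula is not
# given here, so treat this as an assumption.
VRVU_PER_HOUR = 4.7353

def vrvus(value_added_hours):
    """Convert value-added work time (hours) to value-added RVUs."""
    return value_added_hours * VRVU_PER_HOUR

def value_added_share(vrvu_total, wrvu_total):
    """Fraction of total work (vRVUs + wRVUs) that is value-added."""
    return vrvu_total / (vrvu_total + wrvu_total)

total = vrvus(1223.5)
print(round(total, 1))  # close to the reported 5,793.6 vRVUs
```

With the abstract's 6.0% value-added share, `value_added_share` implies roughly 90,766 wRVUs of conventional clinical work over the same period.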

  12. Selection of an industrial natural-gas-fired advanced turbine system - Task 3A

    Energy Technology Data Exchange (ETDEWEB)

    Holloway, G.M.

    1997-05-01

    TASK OBJECTIVES: Identify a gas-fueled turbine and steam system which will meet the program goals for efficiency and emissions. TECHNICAL GOALS AND REQUIREMENTS: Goals for the Advanced Turbine System (ATS) Program were outlined in the statement of work for five basic categories: Cycle Efficiency - System heat rate to have a 15% improvement over 1991 vintage systems being offered to the market. Environmental - No post-combustion devices while meeting the following parameter targets: (1) Nitrogen oxide (NO{sub x}) emissions to equal 8 parts per million dry (ppmd) at 15% oxygen. (2) Carbon monoxide (CO) and unburned hydrocarbon (UHC) emissions to equal 20 parts per million dry (ppmd) each. Cost of Electricity - To be 10 percent less when compared to similar 1991 systems. Fuel Flexibility - Have the ability to burn coal or coal-derived fuels without extensive redesign. Reliability, Availability, Maintainability - Must be comparable to modern advanced power generation systems. For all cycle and system studies, analyses were done for the following engine system ambient conditions: Temperature - 59 F; Altitude - Sea Level; Humidity - 60%. For the 1991 reference system, GE Aircraft Engines used its LM6000 engine product offering for comparison of the Industrial System parameters developed under this program.

  13. The geometry of generalized force matching and related information metrics in coarse-graining of molecular systems

    International Nuclear Information System (INIS)

    Kalligiannaki, Evangelia; Harmandaris, Vagelis; Katsoulakis, Markos A.; Plecháč, Petr

    2015-01-01

    Using the probabilistic language of conditional expectations, we reformulate the force matching method for coarse-graining of molecular systems as a projection onto spaces of coarse observables. A practical outcome of this probabilistic description is the link of the force matching method with thermodynamic integration. This connection provides a way to systematically construct a local mean force and to optimally approximate the potential of mean force through force matching. We introduce a generalized force matching condition for the local mean force in the sense that allows the approximation of the potential of mean force under both linear and non-linear coarse graining mappings (e.g., reaction coordinates, end-to-end length of chains). Furthermore, we study the equivalence of force matching with relative entropy minimization which we derive for general non-linear coarse graining maps. We present in detail the generalized force matching condition through applications to specific examples in molecular systems

  14. The geometry of generalized force matching and related information metrics in coarse-graining of molecular systems

    Energy Technology Data Exchange (ETDEWEB)

    Kalligiannaki, Evangelia, E-mail: ekalligian@tem.uoc.gr [Department of Mathematics and Applied Mathematics, University of Crete, 70013 Heraklion (Greece); Harmandaris, Vagelis, E-mail: harman@uoc.gr [Department of Mathematics and Applied Mathematics, University of Crete, 70013 Heraklion (Greece); Institute of Applied and Computational Mathematics (IACM), Foundation for Research and Technology Hellas (FORTH), IACM/FORTH, GR-71110 Heraklion (Greece); Katsoulakis, Markos A., E-mail: markos@math.umass.edu [Department of Mathematics and Statistics, University of Massachusetts, Amherst, Massachusetts 01003 (United States); Plecháč, Petr, E-mail: plechac@math.udel.edu [Department of Mathematical Sciences, University of Delaware, Newark, Delaware 19716 (United States)

    2015-08-28

    Using the probabilistic language of conditional expectations, we reformulate the force matching method for coarse-graining of molecular systems as a projection onto spaces of coarse observables. A practical outcome of this probabilistic description is the link of the force matching method with thermodynamic integration. This connection provides a way to systematically construct a local mean force and to optimally approximate the potential of mean force through force matching. We introduce a generalized force matching condition for the local mean force in the sense that allows the approximation of the potential of mean force under both linear and non-linear coarse graining mappings (e.g., reaction coordinates, end-to-end length of chains). Furthermore, we study the equivalence of force matching with relative entropy minimization which we derive for general non-linear coarse graining maps. We present in detail the generalized force matching condition through applications to specific examples in molecular systems.

  15. Task forces for the European railway: trains and rail systems of the future; Task Force der EG: Zuege und Eisenbahnsysteme der Zukunft

    Energy Technology Data Exchange (ETDEWEB)

    Blonk, W. [European Commission, Brussels (Belgium). Directorate-General VII Transport

    1999-02-01

    To achieve the greatest benefit from the R and D activities of the EC, the Commission introduced the concept of task forces in 1995. One of these is concerned with 'trains and rail systems of the future'. This was to be seen as an element in a far-reaching political framework guideline, bringing together hitherto separate strands of railway-related transport with industrial and research policy and thereby contributing to the revival of the sector. The main objective of these joint efforts was to contribute to radical structural and cultural changes on the railway so as - on the basis of market orientation, quality demands and cost efficiency - to give greater weight among the population to future transport challenges. This article reviews the main results attained by the task force and how these will be implemented in the 5th Framework Programme of the European Community. (orig.) [German original, translated: To achieve the greatest benefit from the R and D activities of the European Community, the Commission established the task force concept in 1995. One of these task forces dealt with the topic 'Trains and railway systems of the future'. It was to be seen as an element of a wide-ranging political framework guideline, bringing together the previously separate strands of rail-bound transport with industrial and research policy, and could thereby contribute to the revival of the sector. (orig.)]

  16. Engineering task plan for the annual revision of the rotary mode core sampling system safety equipment list

    International Nuclear Information System (INIS)

    BOGER, R.M.

    1999-01-01

    This Engineering Task Plan addresses an effort to provide an update to the RMCS Systems 3 and 4 SEL and DCM in order to incorporate the changes to the authorization basis implemented by HNF-SD-WM-BIO-001, Rev. 0 (Draft), Addendum 5, Safety Analysis for Rotary Mode Core Sampling. Responsibilities, task description, cost estimate, and schedule are presented.

  17. Hybrid metric-Palatini stars

    Science.gov (United States)

    Danilǎ, Bogdan; Harko, Tiberiu; Lobo, Francisco S. N.; Mak, M. K.

    2017-02-01

    We consider the internal structure and the physical properties of specific classes of neutron, quark and Bose-Einstein condensate stars in the recently proposed hybrid metric-Palatini gravity theory, which is a combination of the metric and Palatini f(R) formalisms. It turns out that the theory is very successful in accounting for the observed phenomenology, since it unifies local constraints at the Solar System level and the late-time cosmic acceleration, even if the scalar field is very light. In this paper, we derive the equilibrium equations for a spherically symmetric configuration (mass continuity and Tolman-Oppenheimer-Volkoff) in the framework of the scalar-tensor representation of the hybrid metric-Palatini theory, and we investigate their solutions numerically for different equations of state of neutron and quark matter, by adopting for the scalar field potential a Higgs-type form. It turns out that the scalar-tensor definition of the potential can be represented as a Clairaut differential equation, and provides an explicit form for f(R) given by f(R) ~ R + Λ_eff, where Λ_eff is an effective cosmological constant. Furthermore, stellar models, described by the stiff fluid, radiation-like, bag model and the Bose-Einstein condensate equations of state are explicitly constructed in both general relativity and hybrid metric-Palatini gravity, thus allowing an in-depth comparison between the predictions of these two gravitational theories. As a general result it turns out that for all the considered equations of state, hybrid gravity stars are more massive than their general relativistic counterparts. Furthermore, two classes of stellar models corresponding to two particular choices of the functional form of the scalar field (constant value, and logarithmic form, respectively) are also investigated. Interestingly enough, in the case of a constant scalar field the equation of state of the matter takes the form of the bag model equation of state describing
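
For reference, the mass continuity and Tolman-Oppenheimer-Volkoff equations mentioned above take the following standard form in general relativity (units G = c = 1); the hybrid metric-Palatini versions studied in the paper add scalar-field terms not reproduced here:

```latex
\frac{dm}{dr} = 4\pi r^{2}\rho, \qquad
\frac{dp}{dr} = -\,\frac{\left(\rho + p\right)\left(m + 4\pi r^{3} p\right)}{r\left(r - 2m\right)}
```

Here m(r) is the mass enclosed within radius r, and ρ and p are the energy density and pressure supplied by the chosen equation of state.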

  18. Task Mapping and Bandwidth Reservation for Mixed Hard/Soft Fault-Tolerant Embedded Systems

    DEFF Research Database (Denmark)

    Saraswat, Prabhat Kumar; Pop, Paul; Madsen, Jan

    2010-01-01

    reserved for the servers determines the quality of service (QoS) for soft tasks. CBS enforces temporal isolation, such that soft task overruns do not affect the timing guarantees of hard tasks. Transient faults in hard tasks are tolerated using checkpointing with rollback recovery. We have proposed a Tabu...

  19. Changing societies and four tasks of schooling: Challenges for strongly differentiated educational systems

    Science.gov (United States)

    van de Werfhorst, Herman G.

    2014-05-01

    Changing labour markets, increased calls for selection and excellence, and increased diversity and individualisation have repercussions on how educational systems can prepare youth for work, optimise knowledge production, achieve equality of opportunity, and socialise students into active civic engagement. This paper discusses four central tasks of schooling and examines to what extent societal developments challenge education policy to deliver on the tasks at hand. Particular attention is given to the challenges Europe's strongly diversified educational systems are currently facing. Both the Netherlands and Germany, for example, have been offering vocationally-oriented pathways alongside traditional academic higher education for some time. But today's ongoing changes in job descriptions, mainly due to ever-accelerating technological developments, are causing a risk of skills obsolescence which can only be avoided by continuous upskilling and/or reskilling of a sufficiently flexible workforce. Overcoming differences of intelligence as well as differences of diverse socioeconomic, ethnic and linguistic backgrounds by way of education is another challenge, as is fostering "soft" skills and political awareness. This paper investigates the effectiveness of current education systems in preparing citizens for a functioning modern society.

  20. Work process and task-based design of intelligent assistance systems in German textile industry

    Science.gov (United States)

    Löhrer, M.; Ziesen, N.; Altepost, A.; Saggiomo, M.; Gloy, Y. S.

    2017-10-01

    The German textile industry, shaped by mid-sized companies, must face social challenges, e.g. demographic change and changing technical processes. Interaction with intelligent systems (on machines) and increasing automation change processes, working structures and employees' tasks on all levels. Work contents are getting more complex, resulting in the necessity for diversified and enhanced competencies. Mobile devices like tablets or smartphones are increasingly finding their way into the workplace. Employees who grew up with new forms of media have certain advantages regarding the usage of modern technologies compared to older employees. Therefore, it is necessary to design new systems which help to adapt the competencies of both younger and older employees to new automated production processes in the digital work environment. The key to successful integration of technical assistance systems is user-oriented design and development that includes concepts for competency development under consideration of, e.g., ethical and legal aspects.

  1. Enterprise Sustainment Metrics

    Science.gov (United States)

    2015-06-19

    are negatively impacting KPIs” (Parmenter, 2010: 31). In the current state, the Air Force’s AA and PBL metrics are once again split. AA does...must have the authority to “take immediate action to rectify situations that are negatively impacting KPIs” (Parmenter, 2010: 31). 3. Measuring...highest profitability and shareholder value for each company” (2014: 273). By systematically diagramming a process, either through a swim lane flowchart

  2. Reliability of steam-turbine rotors. Task 1. Lifetime prediction analysis system. Final report

    International Nuclear Information System (INIS)

    Nair, P.K.; Pennick, H.G.; Peters, J.E.; Wells, C.H.

    1982-12-01

    Task 1 of RP 502, Reliability of Steam Turbine Rotors, resulted in the development of a computerized lifetime prediction analysis system (STRAP) for the automatic evaluation of rotor integrity based upon the results of a boresonic examination of near-bore defects. Concurrently an advanced boresonic examination system (TREES), designed to acquire data automatically for lifetime analysis, was developed and delivered to the maintenance shop of a major utility. This system and a semi-automated, state-of-the-art system (BUCS) were evaluated on two retired rotors as part of the Task 2 effort. A modified nonproprietary version of STRAP, called SAFER, is now available for rotor lifetime prediction analysis. STRAP and SAFER share a common fracture analysis postprocessor for rapid evaluation of either conventional boresonic amplitude data or TREES cell data. The final version of this postprocessor contains general stress intensity correlations for elliptical cracks in a radial stress gradient and provision for elastic-plastic instability of the ligament between an imbedded crack and the bore surface. Both linear elastic and ligament rupture models were developed for rapid analysis of linkup within three-dimensional clusters of defects. Bore stress-rupture criteria are included, but a creep-fatigue crack growth data base is not available. Physical and mechanical properties of air-melt 1CrMoV forgings are built into the program; however, only bounding values of fracture toughness versus temperature are available. Owing to the lack of data regarding the probability of flaw detection for the boresonic systems and of quantitative verification of the flaw linkup analysis, automatic evaluation of boresonic results is not recommended, and the lifetime prediction system is currently restricted to conservative, deterministic analysis of specified flaw geometries.

  3. Applying dynamic priority scheduling scheme to static systems of pinwheel task model in power-aware scheduling.

    Science.gov (United States)

    Seol, Ye-In; Kim, Young-Kuk

    2014-01-01

    Power-aware scheduling reduces CPU energy consumption in hard real-time systems through dynamic voltage scaling (DVS). In this paper, we deal with the pinwheel task model, which is known as a static and predictable task model and can be applied to various embedded or ubiquitous systems. In the pinwheel task model, each task's priority is static and its execution sequence can be predetermined. There have been many static approaches to power-aware scheduling in the pinwheel task model. In this paper, however, we show that results from dynamic priority power-aware scheduling can be applied to the pinwheel task model. This method is more effective than adopting the previous static priority scheduling methods in saving energy consumption and, since the system remains static, it is more tractable and applicable to small-sized embedded or ubiquitous computing. Also, we introduce a novel power-aware scheduling algorithm which exploits all slacks under preemptive earliest-deadline-first scheduling, which is optimal in a uniprocessor system. The dynamic priority method presented in this paper can be applied directly to static systems of the pinwheel task model. The simulation results show that the proposed algorithm, with algorithmic complexity of O(n), reduces energy consumption by 10-80% over the existing algorithms.
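
The abstract builds on preemptive earliest-deadline-first (EDF) scheduling, which is optimal on a uniprocessor: at every instant, run the ready job with the earliest absolute deadline. A minimal sketch of that policy (without the DVS slack reclamation the paper adds) might look like the following; the event-driven simulation and job format are illustrative, not the paper's algorithm:

```python
import heapq

def edf_schedule(jobs):
    """Simulate preemptive EDF on one CPU.
    jobs: list of (release, wcet, deadline) tuples.
    Returns {job_index: finish_time}, or None if a job completes
    past its deadline."""
    jobs = sorted(jobs)                 # order by release time
    ready, t, i, finish = [], 0, 0, {}
    while i < len(jobs) or ready:
        if not ready and jobs[i][0] > t:
            t = jobs[i][0]              # CPU idles until the next release
        while i < len(jobs) and jobs[i][0] <= t:
            release, wcet, deadline = jobs[i]
            heapq.heappush(ready, (deadline, i, wcet))
            i += 1
        deadline, j, remaining = heapq.heappop(ready)
        # run until the job finishes or the next release may preempt it
        next_release = jobs[i][0] if i < len(jobs) else float("inf")
        run = min(remaining, next_release - t)
        t += run
        if remaining - run > 0:
            heapq.heappush(ready, (deadline, j, remaining - run))
        elif t > deadline:
            return None                 # deadline miss
        else:
            finish[j] = t
    return finish

# Three jobs: job 1 (deadline 5) preempts job 0 (deadline 10) at t=1.
print(edf_schedule([(0, 3, 10), (1, 2, 5), (4, 1, 6)]))  # → {1: 3, 2: 5, 0: 6}
```

A DVS extension in the spirit of the paper would, before each `run` segment, stretch the execution to fill any slack before the earliest pending deadline by lowering the clock frequency.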

  4. Parallel Task Processing on a Multicore Platform in a PC-based Control System for Parallel Kinematics

    Directory of Open Access Journals (Sweden)

    Harald Michalik

    2009-02-01

    Full Text Available Multicore platforms are those that have one physical processor chip with multiple cores interconnected via a chip-level bus. Because they deliver greater computing power through concurrency and offer greater system density, multicore platforms are well suited to address the performance bottleneck encountered in PC-based control systems for parallel kinematic robots with heavy CPU load. Heavy-load control tasks are generated by new control approaches that include features like singularity prediction, structure control algorithms, vision data integration and similar tasks. In this paper we introduce the parallel task scheduling extension of a communication architecture specially tailored for the development of PC-based control of parallel kinematics. The scheduling is specially designed for processing on a multicore platform. It breaks down the serial task processing of the robot control cycle and extends it with parallel task processing paths in order to enhance the overall control performance.

  5. System Statement of Tasks of Calculating and Providing the Reliability of Heating Cogeneration Plants in Power Systems

    Science.gov (United States)

    Biryuk, V. V.; Tsapkova, A. B.; Larin, E. A.; Livshiz, M. Y.; Sheludko, L. P.

    2018-01-01

    A set of mathematical models for calculating the reliability indexes (RI) of structurally complex multifunctional combined installations in heat and power supply systems was developed. Reliability of energy supply is considered a required condition for the creation and operation of heat and power supply systems. The optimal value of the power supply system coefficient F is based on an economic assessment of the consumers' losses caused by the under-supply of electric power and additional system expenses for the creation and operation of an emergency capacity reserve. Rationing of the RI of industrial heat supply is based on the concept of a technological margin of safety of technological processes. The definition of rationed RI values for the heat supply of communal consumers is based on the air temperature level inside the heated premises. The complex allows solving a number of practical tasks for providing reliability of heat supply for consumers. A probabilistic model is developed for calculating the reliability indexes of combined multipurpose heat and power plants in heat-and-power supply systems. The complex of models and calculation programs can be used to solve a wide range of specific tasks of optimization of schemes and parameters of combined heat and power plants and systems, as well as determining the efficiency of various redundancy methods to ensure specified reliability of power supply.

  6. Symmetries of the dual metrics

    International Nuclear Information System (INIS)

    Baleanu, D.

    1998-01-01

    The geometric duality between the metric g μν and a Killing tensor K μν is studied. Conditions are found under which the symmetries of the metric g μν and the dual metric K μν coincide. A dual spinning space is constructed without the introduction of torsion. The general results are applied to the case of the Kerr-Newman metric.

  7. Balanced metrics for vector bundles and polarised manifolds

    DEFF Research Database (Denmark)

    Garcia Fernandez, Mario; Ross, Julius

    2012-01-01

    We consider a notion of balanced metrics for triples (X, L, E) which depend on a parameter α, where X is a smooth complex manifold with an ample line bundle L and E is a holomorphic vector bundle over X. For generic choice of α, we prove that the limit of a convergent sequence of balanced metrics leads to a Hermitian-Einstein metric on E and a constant scalar curvature Kähler metric in c_1(L). For special values of α, limits of balanced metrics are solutions of a system of coupled equations relating a Hermitian-Einstein metric on E and a Kähler metric in c_1(L). For this, we compute the top two ...

  8. Jacobi-Maupertuis metric and Kepler equation

    Science.gov (United States)

    Chanda, Sumanto; Gibbons, Gary William; Guha, Partha

    This paper studies the application of the Jacobi-Eisenhart lift, Jacobi metric and Maupertuis transformation to the Kepler system. We start by reviewing fundamentals and the Jacobi metric. Then we study various ways to apply the lift to Kepler-related systems: first as conformal description and Bohlin transformation of Hooke’s oscillator, second in contact geometry and third in Houri’s transformation [T. Houri, Liouville integrability of Hamiltonian systems and spacetime symmetry (2016), www.geocities.jp/football_physician/publication.html], coupled with Milnor’s construction [J. Milnor, On the geometry of the Kepler problem, Am. Math. Mon. 90 (1983) 353-365] with eccentric anomaly.

  9. STATE-OF-THE-ART TASKS AND ACHIEVEMENTS OF PARALINGUISTIC SPEECH ANALYSIS SYSTEMS

    Directory of Open Access Journals (Sweden)

    A. A. Karpov

    2016-07-01

    Full Text Available We present an analytical survey of actual state-of-the-art tasks in the area of computational paralinguistics, as well as the recent achievements of automatic systems for paralinguistic analysis of conversational speech. Paralinguistics studies non-verbal aspects of human communication and speech such as: natural emotions, accents, psycho-physiological states, pronunciation features, speaker's voice parameters, etc. We describe the architecture of a baseline computer system for acoustical paralinguistic analysis, its main components and useful speech processing methods. We present some information on an international contest called the Computational Paralinguistics Challenge (ComParE), which has been held each year since 2009 in the framework of the INTERSPEECH conference organized by the International Speech Communication Association. We present the sub-challenges (tasks) that were proposed at the ComParE Challenges in 2009-2016, and analyze the winning computer systems for each sub-challenge and the obtained results. The last completed ComParE-2015 Challenge was organized in September 2015 in Germany and proposed 3 sub-challenges: (1) the Degree of Nativeness (DN) sub-challenge, determination of the nativeness degree of speakers based on acoustics; (2) the Parkinson's Condition (PC) sub-challenge, recognition of a degree of Parkinson's condition based on speech analysis; (3) the Eating Condition (EC) sub-challenge, determination of the eating condition state during speaking or a dialogue, and classification of the consumed food type (one of seven classes of food) by the speaker. In the last sub-challenge (EC), the winner was a joint Turkish-Russian team consisting of the authors of the given paper. We have developed the most efficient computer-based system for detection and classification of the corresponding (EC) acoustical paralinguistic events. The paper deals with the architecture of this system, its main modules and methods, as well as the description of used training and evaluation

  10. Kerr metric in cosmological background

    Energy Technology Data Exchange (ETDEWEB)

    Vaidya, P C [Gujarat Univ., Ahmedabad (India). Dept. of Mathematics

    1977-06-01

    A metric satisfying Einstein's equation is given which in the vicinity of the source reduces to the well-known Kerr metric and which at large distances reduces to the Robertson-Walker metric of a homogeneous cosmological model. The radius of the event horizon of the Kerr black hole in the cosmological background is determined.

  11. NASA's Functional Task Test: Providing Information for an Integrated Countermeasure System

    Science.gov (United States)

    Bloomberg, J. J.; Feiveson, A. H.; Laurie, S. S.; Lee, S. M. C.; Mulavara, A. P.; Peters, B. T.; Platts, S. H.; Ploutz-Snyder, L. L.; Reschke, M. F.; Ryder, J. W.; hide

    2015-01-01

    Exposure to the microgravity conditions of spaceflight causes astronauts to experience alterations in multiple physiological systems. These physiological changes include sensorimotor disturbances, cardiovascular deconditioning, and loss of muscle mass and strength. Some or all of these changes might affect the ability of crewmembers to perform critical mission tasks immediately after landing on a planetary surface. The goals of the Functional Task Test (FTT) study were to determine the effects of spaceflight on functional tests that are representative of critical exploration mission tasks and to identify the key physiological factors that contribute to decrements in performance. The FTT comprised seven functional tests and a corresponding set of interdisciplinary physiological measures targeting the sensorimotor, cardiovascular and muscular changes associated with exposure to spaceflight. Both Shuttle and ISS crewmembers participated in this study. Additionally, we conducted a supporting study using the FTT protocol on subjects before and after 70 days of 6° head-down bed rest. The bed rest analog allowed us to investigate the impact of body unloading in isolation on both functional tasks and on the underlying physiological factors that lead to decrements in performance, and then to compare them with the results obtained in our spaceflight study. Spaceflight data were collected on three sessions before flight, on landing day (Shuttle only) and 1, 6 and 30 days after landing. Bed rest subjects were tested three times before bed rest and immediately after getting up from bed rest as well as 1, 6, and 12 days after reambulation. We have shown that for Shuttle, ISS and bed rest subjects, functional tasks requiring a greater demand for dynamic control of postural equilibrium (i.e. fall recovery, seat egress/obstacle avoidance during walking, object translation, jump down) showed the greatest decrement in performance.
Functional tests with reduced requirements for

  12. A comparative study on assessment procedures and metric properties of two scoring systems of the Coma Recovery Scale-Revised items: standard and modified scores.

    Science.gov (United States)

    Sattin, Davide; Lovaglio, Piergiorgio; Brenna, Greta; Covelli, Venusia; Rossi Sebastiano, Davide; Duran, Dunja; Minati, Ludovico; Giovannetti, Ambra Mara; Rosazza, Cristina; Bersano, Anna; Nigri, Anna; Ferraro, Stefania; Leonardi, Matilde

    2017-09-01

    The study compared the metric characteristics (discriminant capacity and factorial structure) of two different methods for scoring the items of the Coma Recovery Scale-Revised and it analysed scale scores collected using the standard assessment procedure and a new proposed method. Cross sectional design/methodological study. Inpatient, neurological unit. A total of 153 patients with disorders of consciousness were consecutively enrolled between 2011 and 2013. All patients were assessed with the Coma Recovery Scale-Revised using standard (rater 1) and inverted (rater 2) procedures. Coma Recovery Scale-Revised score, number of cognitive and reflex behaviours and diagnosis. Regarding patient assessment, rater 1 using standard and rater 2 using inverted procedures obtained the same best scores for each subscale of the Coma Recovery Scale-Revised for all patients, so no clinical (and statistical) difference was found between the two procedures. In 11 patients (7.7%), rater 2 noted that some Coma Recovery Scale-Revised codified behavioural responses were not found during assessment, although higher response categories were present. A total of 51 (36%) patients presented the same Coma Recovery Scale-Revised scores of 7 or 8 using a standard score, whereas no overlap was found using the modified score. Unidimensionality was confirmed for both score systems. The Coma Recovery Scale Modified Score showed a higher discriminant capacity than the standard score and a monofactorial structure was also supported. The inverted assessment procedure could be a useful evaluation method for the assessment of patients with disorder of consciousness diagnosis.

  13. Task-oriented lossy compression of magnetic resonance images

    Science.gov (United States)

    Anderson, Mark C.; Atkins, M. Stella; Vaisey, Jacques

    1996-04-01

    A new task-oriented image quality metric is used to quantify the effects of distortion introduced into magnetic resonance images by lossy compression. This metric measures the similarity between a radiologist's manual segmentation of pathological features in the original images and the automated segmentations performed on the original and compressed images. The images are compressed using a general wavelet-based lossy image compression technique, embedded zerotree coding, and segmented using a three-dimensional stochastic model-based tissue segmentation algorithm. The performance of the compression system is then enhanced by compressing different regions of the image volume at different bit rates, guided by prior knowledge about the location of important anatomical regions in the image. Application of the new system to magnetic resonance images is shown to produce compression results superior to the conventional methods, both subjectively and with respect to the segmentation similarity metric.
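The record's quality metric is built on the similarity between manual and automated segmentations. A common overlap measure of this kind is the Dice coefficient; the following minimal sketch (the function name and the toy masks are illustrative, not taken from the paper) shows how such a similarity score behaves when compression slightly distorts a segmented region:

```python
import numpy as np

def dice_similarity(seg_a: np.ndarray, seg_b: np.ndarray) -> float:
    """Dice coefficient between two binary segmentation masks (1 = feature)."""
    a = seg_a.astype(bool)
    b = seg_b.astype(bool)
    intersection = np.logical_and(a, b).sum()
    total = a.sum() + b.sum()
    if total == 0:
        return 1.0  # both masks empty: perfect agreement by convention
    return 2.0 * intersection / total

# Compare the segmentation of an original slice with that of a compressed one.
original = np.zeros((8, 8), dtype=int)
original[2:5, 2:5] = 1
compressed = np.zeros((8, 8), dtype=int)
compressed[2:5, 2:6] = 1  # compression artefact enlarges the region

score = dice_similarity(original, compressed)
print(round(score, 3))
```

A score of 1.0 indicates identical segmentations; values drop toward 0 as compression-induced distortion grows, which is the intuition behind the task-oriented metric.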

  14. The TRIDEC System-of-Systems; Choreography of large-scale concurrent tasks in Natural Crisis Management

    Science.gov (United States)

    Häner, R.; Wächter, J.

    2012-04-01

    The project Collaborative, Complex, and Critical Decision-Support in Evolving Crises (TRIDEC), co-funded by the European Commission in its Seventh Framework Programme, aims at establishing a network of dedicated, autonomous legacy systems for large-scale concurrent management of natural crises utilising heterogeneous information resources. TRIDEC's architecture reflects the System-of-Systems (SoS) approach, which is based on task-oriented systems cooperatively interacting as a collective in a common environment. The design of the TRIDEC-SoS follows the principles of service-oriented and event-driven architectures (SOA & EDA), with a strong focus on loose coupling of the systems. The SoS approach in combination with SOA and EDA has the distinction of being able to provide novel and coherent behaviours and features resulting from a process of dynamic self-organisation. Self-organisation is a process without the need for a central or external coordinator controlling it through orchestration. It is the result of enacted concurrent tasks in a collaborative environment of geographically distributed systems. Although the individual systems act completely autonomously, their interactions expose emergent structures of an evolving nature. Particularly important is the fact that SoS are inherently able to evolve on all facets of intelligent information management. This includes adaptive properties, e.g. seamless integration of new resource types or the adoption of new fields in natural crisis management. In the case of TRIDEC, with various heterogeneous participants involved, concurrent information processing is of fundamental importance because of the achievable improvements regarding cooperative decision making. Collaboration within TRIDEC will be implemented with choreographies and conversations. Choreographies specify the expected behaviour between two or more participants; conversations describe the message exchange between all participants emphasising their logical

  15. Video Analytics Evaluation: Survey of Datasets, Performance Metrics and Approaches

    Science.gov (United States)

    2014-09-01

    people with different ethnicity and gender. Currently we have four subjects, but more can be added in the future. • Lighting Variations. We consider... is however not a proper distance as the triangular inequality condition is not met. For this reason, the next metric should be preferred. • the... and Alan F. Smeaton and Georges Quenot, An Overview of the Goals, Tasks, Data, Evaluation Mechanisms and Metrics, Proceedings of TRECVID 2011, NIST, USA

  16. The Tasks of Information Systems Professionals in the Philippines, Thailand, Indonesia and Malaysia

    Directory of Open Access Journals (Sweden)

    Graham Winley

    1997-05-01

    Full Text Available This paper presents the results of an empirical study into the present and future tasks expected of information systems professionals in a range of organisations in South-east Asian nations. Information was collected using a three-round Delphi study technique across a period of one year. The views of senior personnel working in information systems positions in organisations in these nations are analysed and compared. The results are related to the present and expected future profiles of these organisations and are of relevance to those responsible for developing IS curricula in academic institutions as well as commercial providers of education and training courses in these important South-east Asian nations. The study also contributes to an emerging body of knowledge concerning the development of the IS profession in the developing nations of South-east Asia.

  17. Metaheuristic Based Scheduling Meta-Tasks in Distributed Heterogeneous Computing Systems

    Directory of Open Access Journals (Sweden)

    Hesam Izakian

    2009-07-01

    Full Text Available Scheduling is a key problem in distributed heterogeneous computing systems in order to benefit from the large computing capacity of such systems, and is an NP-complete problem. In this paper, we present a metaheuristic technique, namely the Particle Swarm Optimization (PSO) algorithm, for this problem. PSO is a population-based search algorithm based on the simulation of the social behavior of bird flocking and fish schooling. Particles fly in the problem search space to find optimal or near-optimal solutions. The scheduler aims at minimizing makespan, the completion time of the last task to finish. Experimental studies show that the proposed method is more efficient than, and surpasses, previously reported PSO and GA approaches for this problem.
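The scheduling idea in this record can be sketched with a small PSO loop that minimizes makespan over an expected-time-to-compute (ETC) matrix. This is a generic illustration under stated assumptions (random toy ETC data, continuous particle positions decoded to machine indices by truncation); it is not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# ETC matrix: etc[i, j] = runtime of task i on machine j (toy data).
n_tasks, n_machines = 12, 3
etc = rng.uniform(1.0, 10.0, size=(n_tasks, n_machines))

def makespan(position: np.ndarray) -> float:
    """Decode a continuous particle position into a schedule and score it."""
    assignment = np.clip(position.astype(int), 0, n_machines - 1)
    loads = np.zeros(n_machines)
    for task, machine in enumerate(assignment):
        loads[machine] += etc[task, machine]
    return loads.max()  # makespan = heaviest machine load

# Standard PSO loop with inertia w and cognitive/social weights c1, c2.
n_particles, iters = 20, 100
w, c1, c2 = 0.7, 1.5, 1.5
pos = rng.uniform(0, n_machines, (n_particles, n_tasks))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_val = np.array([makespan(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()
gbest_val = pbest_val.min()

for _ in range(iters):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 0, n_machines - 1e-9)
    vals = np.array([makespan(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    if vals.min() < gbest_val:
        gbest_val = vals.min()
        gbest = pos[vals.argmin()].copy()

print(f"best makespan: {gbest_val:.2f}")
```

The decoded best schedule always beats the naive schedule that puts every task on one machine, illustrating why metaheuristics pay off on heterogeneous ETC matrices.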

  18. Report of the Census Task Force on beamline control system requirements

    International Nuclear Information System (INIS)

    Barsotti, E.J.; Bartlett, J.F.; Bogert, V.D.; Borcherding, F.O.; Butler, J.; Czarapata, P.C.; Spalding, W.J.; Thomas, A.D.

    1986-01-01

    A special task force was appointed to study the experience with the present beamline control system at Fermilab and to make recommendations in this area. The charge of the committee and the list of its members are appended. In order to carry out its assignment, the committee conducted a series of meetings in which it discussed the controls situation in general and the best way to approach the user community. The various groups of users were identified, and a letter was written to representatives of these groups asking questions concerning the present system and future needs. The committee met with each group to discuss the response to these questions. Written summaries of the discussions are appended. Conclusions are drawn regarding current problems, systematic upgrades and specific recommendations

  19. A task-based support architecture for developing point-of-care clinical decision support systems for the emergency department.

    Science.gov (United States)

    Wilk, S; Michalowski, W; O'Sullivan, D; Farion, K; Sayyad-Shirabad, J; Kuziemsky, C; Kukawka, B

    2013-01-01

    The purpose of this study was to create a task-based support architecture for developing clinical decision support systems (CDSSs) that assist physicians in making decisions at the point-of-care in the emergency department (ED). The backbone of the proposed architecture was established by a task-based emergency workflow model for a patient-physician encounter. The architecture was designed according to an agent-oriented paradigm. Specifically, we used the O-MaSE (Organization-based Multi-agent System Engineering) method that allows for iterative translation of functional requirements into architectural components (e.g., agents). The agent-oriented paradigm was extended with ontology-driven design to implement ontological models representing the knowledge required by specific agents to operate. The task-based architecture allows for the creation of a CDSS that is aligned with the task-based emergency workflow model. It facilitates decoupling of executable components (agents) from embedded domain knowledge (ontological models), thus supporting their interoperability, sharing, and reuse. The generic architecture was implemented as a pilot system, MET3-AE, a CDSS to help with the management of pediatric asthma exacerbation in the ED. The system was evaluated in a hospital ED. The architecture allows for the creation of a CDSS that integrates support for all tasks from the task-based emergency workflow model, and interacts with hospital information systems. The proposed architecture also allows for reusing and sharing system components and knowledge across disease-specific CDSSs.

  20. Learning Low-Dimensional Metrics

    OpenAIRE

    Jain, Lalit; Mason, Blake; Nowak, Robert

    2017-01-01

    This paper investigates the theoretical foundations of metric learning, focused on key questions that are not fully addressed in prior work: 1) we consider learning general low-dimensional (low-rank) metrics as well as sparse metrics; 2) we develop upper and lower (minimax) bounds on the generalization error; 3) we quantify the sample complexity of metric learning in terms of the dimension of the feature space and the dimension/rank of the underlying metric; 4) we also bound the accuracy ...
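The low-rank metrics discussed in this record are Mahalanobis-type distances parameterized by a factor of reduced rank. A minimal sketch (the triplet hinge loss, learning rate and toy data are illustrative assumptions, not the paper's algorithm) shows the basic learning step:

```python
import numpy as np

rng = np.random.default_rng(1)
d, rank = 5, 2                  # feature dimension and metric rank
L = rng.normal(size=(rank, d))  # low-rank factor; the learned metric is M = L.T @ L

def dist(x, y):
    """Low-rank Mahalanobis distance d_L(x, y) = ||L (x - y)||."""
    return np.linalg.norm(L @ (x - y))

def triplet_step(x, pos, neg, margin=1.0, lr=0.01):
    """One gradient step on a triplet hinge loss:
    pull (x, pos) together, push (x, neg) apart."""
    global L
    loss = dist(x, pos) ** 2 - dist(x, neg) ** 2 + margin
    if loss > 0:  # only update when the margin constraint is violated
        dp = (x - pos)[None, :]
        dn = (x - neg)[None, :]
        L -= lr * 2 * L @ (dp.T @ dp - dn.T @ dn)

x, pos, neg = rng.normal(size=(3, d))
before = dist(x, pos) - dist(x, neg)
for _ in range(100):
    triplet_step(x, pos, neg)
after = dist(x, pos) - dist(x, neg)
```

Because the rank of L caps the rank of M, the number of free parameters is rank × d rather than d², which is the source of the sample-complexity savings the paper quantifies.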

  1. Reward contingencies and the recalibration of task monitoring and reward systems: a high-density electrical mapping study.

    Science.gov (United States)

    Morie, K P; De Sanctis, P; Foxe, J J

    2014-07-25

    Task execution almost always occurs in the context of reward-seeking or punishment-avoiding behavior. As such, ongoing task-monitoring systems are influenced by reward anticipation systems. In turn, when a task has been executed either successfully or unsuccessfully, future iterations of that task will be re-titrated on the basis of the task outcome. Here, we examined the neural underpinnings of the task-monitoring and reward-evaluation systems to better understand how they govern reward-seeking behavior. Twenty-three healthy adult participants performed a task where they accrued points that equated to real world value (gift cards) by responding as rapidly as possible within an allotted timeframe, while success rate was titrated online by changing the duration of the timeframe dependent on participant performance. Informative cues initiated each trial, indicating the probability of potential reward or loss (four levels from very low to very high). We manipulated feedback by first informing participants of task success/failure, after which a second feedback signal indicated actual magnitude of reward/loss. High-density electroencephalography (EEG) recordings allowed for examination of event-related potentials (ERPs) to the informative cues and in turn, to both feedback signals. Distinct ERP components associated with reward cues, task-preparatory and task-monitoring processes, and reward feedback processes were identified. Unsurprisingly, participants displayed increased ERP amplitudes associated with task-preparatory processes following cues that predicted higher chances of reward. They also rapidly updated reward and loss prediction information dependent on task performance after the first feedback signal. Finally, upon reward receipt, initial reward probability was no longer taken into account. Rather, ERP measures suggested that only the magnitude of actual reward or loss was now processed. Reward and task-monitoring processes are clearly dissociable, but

  2. Software quality metrics aggregation in industry

    NARCIS (Netherlands)

    Mordal, K.; Anquetil, N.; Laval, J.; Serebrenik, A.; Vasilescu, B.N.; Ducasse, S.

    2013-01-01

    With the growing need for quality assessment of entire software systems in the industry, new issues are emerging. First, because most software quality metrics are defined at the level of individual software components, there is a need for aggregation methods to summarize the results at the system

  3. MRS [monitored retrievable storage] Systems Study Task 1 report: Waste management system reliability analysis

    International Nuclear Information System (INIS)

    Clark, L.L.; Myers, R.S.

    1989-04-01

    This is one of nine studies undertaken by contractors to the US Department of Energy (DOE), Office of Civilian Radioactive Waste Management (OCRWM), to provide a technical basis for re-evaluating the role of a monitored retrievable storage (MRS) facility. The study evaluates the relative reliabilities of systems with and without an MRS facility using current facility design bases. The principal finding of this report is that the MRS system has several operational advantages that enhance system reliability. These are: (1) the MRS system is likely to encounter fewer technical issues, (2) the MRS would assure adequate system surface storage capacity to accommodate repository construction and startup delays of up to five years or longer if the Nuclear Waste Policy Amendments Act (NWPAA) were amended, (3) the system with an MRS has two federal acceptance facilities with parallel transportation routing and surface storage capacity, and (4) the MRS system would allow continued waste acceptance for up to a year after a major disruption of emplacement operations at the repository

  4. Underwater-manipulation system for measuring- and cutting tasks in dismantling decommissioned nuclear facilities. Final report

    International Nuclear Information System (INIS)

    Stegemann, D.; Reimche, W.; Hansch, M.; Spitzer, M.

    1995-01-01

    Dismantling and inspection of structural parts in decommissioned nuclear facilities require not only manipulators but also flexible underwater vehicles. Free-diving underwater vehicles for inspection and dismantling tasks have so far not been developed and tested. The aim of the project is the development of sensors and devices for position determination and depth regulation. For inspection tasks, an ultrasonic measurement and dosimeter device shall be built. A measurement device has been developed which evaluates the ultrasonic time of flight from a transmitter at the vehicle to several receivers installed in the reactor pressure vessel. The depth regulation is based on a pressure sensor and the direct control of the thrusters. The ultrasonic measurements are realized by an adapted ultrasonic card, the γ-dosimetry with an ionization chamber and a pA-amplifier. An acoustic orientation system was built which measures very accurately with one transmitter mounted on the vehicle and four receivers. Problems occur due to reflections from the walls of the basin. The depth regulation works faultlessly. The ultrasonic device is preferably used for distance measurement. The radiation measurement device was tested and mounted in the vehicle. (orig./HP) [de

  5. Exploring the effects of task shifting for HIV through a systems thinking lens: the case of Burkina Faso.

    Science.gov (United States)

    Yaya Bocoum, Fadima; Kouanda, Seni; Kouyaté, Bocar; Hounton, Sennen; Adam, Taghreed

    2013-10-22

    While the impact of task shifting on quality of care and clinical outcomes has been demonstrated in several studies, evidence on its impact on the health system as a whole is limited. This study has two main objectives. The first is to conceptualize the wider range of effects of task shifting through a systems thinking lens. The second is to explore these effects using task shifting for HIV in Burkina Faso as a case study. We used a case study approach, using qualitative research methods. Data sources included document reviews, reviews of available data and records, as well as interviews with key informants and health workers. In addition to the traditional measures of impact of task shifting on health outcomes, our study identified 20 possible effects of the strategy on the system as a whole. Moreover, our analysis highlighted the importance of differentiating between two types of health systems effects. The first are effects inherent to the task shifting strategy itself, such as job satisfaction or better access to health services. The second are effects due to health system barriers, for example the unavailability of medicines and supplies, generating a series of effects on the various components of the health system, e.g., staff frustration.Among the health systems effects that we found are positive, mostly unintended, effects and synergies such as increased health workers' sense of responsibility and worthiness, increased satisfaction due to using the newly acquired skills in other non-HIV tasks, as well as improved patient-provider relationships. Among the negative unintended effects are staff frustration due to lack of medicines and supplies or lack of the necessary infrastructure to be able to perform the new tasks. 
Our analysis highlights the importance of adopting a systems thinking approach in designing, implementing and evaluating health policies to mitigate some of the design issues or system bottlenecks that may impede their successful implementation.

  6. Heuristic extension of the Schwarzschild metric

    International Nuclear Information System (INIS)

    Espinosa, J.M.

    1982-01-01

    The Schwarzschild solution of Einstein's equations of gravitation has several singularities. It is known that the singularity at r = 2Gm/c² is only apparent, a result of the coordinates in which the solution was found. Paradoxical results occurring near the singularity show the system of coordinates is incomplete. We introduce a simple, two-dimensional metric with an apparent singularity that makes it incomplete. By a straightforward, heuristic procedure we extend and complete this simple metric. We then use the same procedure to give a heuristic derivation of the Kruskal system of coordinates, which is known to extend the Schwarzschild manifold past its apparent singularity and produce a complete manifold.
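The Kruskal extension mentioned here can be stated explicitly. In the standard textbook form (units with G = c = 1; this is the conventional construction, not reproduced from the paper), the exterior region r > 2m is mapped to coordinates (T, X) by

```latex
% Kruskal coordinates for the exterior region r > 2m (units G = c = 1):
T = \left(\tfrac{r}{2m}-1\right)^{1/2} e^{r/4m}\,\sinh\!\left(\tfrac{t}{4m}\right),
\qquad
X = \left(\tfrac{r}{2m}-1\right)^{1/2} e^{r/4m}\,\cosh\!\left(\tfrac{t}{4m}\right),

% in which the line element becomes manifestly regular at r = 2m:
ds^2 = \frac{32\,m^3}{r}\, e^{-r/2m}\left(-dT^2 + dX^2\right) + r^2\, d\Omega^2 .
```

The conformal factor is finite and nonzero at r = 2m, so the apparent singularity disappears and only the curvature singularity at r = 0 remains.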

  7. Metric inhomogeneous Diophantine approximation in positive characteristic

    DEFF Research Database (Denmark)

    Kristensen, Simon

    2011-01-01

    We obtain asymptotic formulae for the number of solutions to systems of inhomogeneous linear Diophantine inequalities over the field of formal Laurent series with coefficients from a finite field, which are valid for almost every such system. Here `almost every' is with respect to the Haar measure of the coefficients of the homogeneous part when the number of variables is at least two (singly metric case), and with respect to the Haar measure of all coefficients for any number of variables (doubly metric case). As consequences, we derive zero-one laws in the spirit of the Khintchine-Groshev Theorem and zero...

  8. Metric inhomogeneous Diophantine approximation in positive characteristic

    DEFF Research Database (Denmark)

    Kristensen, S.

    We obtain asymptotic formulae for the number of solutions to systems of inhomogeneous linear Diophantine inequalities over the field of formal Laurent series with coefficients from a finite field, which are valid for almost every such system. Here 'almost every' is with respect to the Haar measure of the coefficients of the homogeneous part when the number of variables is at least two (singly metric case), and with respect to the Haar measure of all coefficients for any number of variables (doubly metric case). As consequences, we derive zero-one laws in the spirit of the Khintchine-Groshev Theorem and zero...

  9. Modeling, control and optimization of water systems systems engineering methods for control and decision making tasks

    CERN Document Server

    2016-01-01

    This book provides essential background knowledge on the development of model-based real-world solutions in the field of control and decision making for water systems. It presents system engineering methods for modelling surface water and groundwater resources as well as water transportation systems (rivers, channels and pipelines). The models in turn provide information on both the water quantity (flow rates, water levels) of surface water and groundwater and on water quality. In addition, methods for modelling and predicting water demand are described. Sample applications of the models are presented, such as a water allocation decision support system for semi-arid regions, a multiple-criteria control model for run-of-river hydropower plants, and a supply network simulation for public services.

  10. Tank Waste Remediation System (TWRS) Retrieval Authorization Basis Amendment Task Plan

    International Nuclear Information System (INIS)

    HARRIS, J.P.

    1999-01-01

    This task plan is a documented agreement between Nuclear Safety and Licensing and Retrieval Engineering. The purpose of this task plan is to identify the scope of work, tasks and deliverables, responsibilities, manpower, and schedules associated with an authorization basis amendment as a result of the Waste Feed Delivery Program, Project W-211, Project W-521, and Project W-522

  11. Landscape pattern metrics and regional assessment

    Science.gov (United States)

    O'Neill, R. V.; Riitters, K.H.; Wickham, J.D.; Jones, K.B.

    1999-01-01

    The combination of remote imagery data, geographic information systems software, and landscape ecology theory provides a unique basis for monitoring and assessing large-scale ecological systems. The unique feature of the work has been the need to develop and interpret quantitative measures of spatial pattern: the landscape indices. This article reviews what is known about the statistical properties of these pattern metrics and suggests some additional metrics based on island biogeography, percolation theory, hierarchy theory, and economic geography. Assessment applications of this approach have required interpreting the pattern metrics in terms of specific environmental endpoints, such as wildlife and water quality, and research into how to represent synergistic effects of many overlapping sources of stress.
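Typical landscape pattern metrics of the kind reviewed here (patch counts, class proportion, largest-patch index) can be computed directly from a classified raster. A small sketch, assuming a toy binary land-cover grid and 4-connected patches (the grid and metric choices are illustrative, not from the article):

```python
import numpy as np
from scipy import ndimage

# Toy land-cover grid: 1 = focal class (e.g. forest), 0 = other.
grid = np.array([
    [1, 1, 0, 0, 1],
    [1, 0, 0, 1, 1],
    [0, 0, 1, 1, 0],
    [0, 0, 0, 0, 0],
    [1, 0, 0, 0, 1],
])

# Label 4-connected patches of the focal class (default structuring element).
labels, n_patches = ndimage.label(grid)
sizes = ndimage.sum_labels(grid, labels, index=np.arange(1, n_patches + 1))

proportion = grid.mean()                        # class proportion p
largest_patch_index = sizes.max() / grid.size   # LPI-style metric

print(n_patches, proportion, largest_patch_index)
```

The same labeling step underlies percolation-theoretic metrics: as the class proportion p crosses a critical threshold, the largest patch abruptly spans the map, which is one of the theoretical anchors the article draws on.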

  12. Task-driven orbit design and implementation on a robotic C-arm system for cone-beam CT

    Science.gov (United States)

    Ouadah, S.; Jacobson, M.; Stayman, J. W.; Ehtiati, T.; Weiss, C.; Siewerdsen, J. H.

    2017-03-01

    Purpose: This work applies task-driven optimization to the design of non-circular orbits that maximize imaging performance for a particular imaging task. First implementation of task-driven imaging on a clinical robotic C-arm system is demonstrated, and a framework for orbit calculation is described and evaluated. Methods: We implemented a task-driven imaging framework to optimize orbit parameters that maximize detectability index d'. This framework utilizes a specified Fourier domain task function and an analytical model for system spatial resolution and noise. Two experiments were conducted to test the framework. First, a simple task was considered consisting of frequencies lying entirely on the fz-axis (e.g., discrimination of structures oriented parallel to the central axial plane), and a "circle + arc" orbit was incorporated into the framework as a means to improve sampling of these frequencies, and thereby increase task-based detectability. The orbit was implemented on a robotic C-arm (Artis Zeego, Siemens Healthcare). A second task considered visualization of a cochlear implant simulated within a head phantom, with spatial frequency response emphasizing high-frequency content in the (fy, fz) plane of the cochlea. An optimal orbit was computed using the task-driven framework, and the resulting image was compared to that for a circular orbit. Results: For the fz-axis task, the circle + arc orbit was shown to increase d' by a factor of 1.20, with an improvement of 0.71 mm in a 3D edge-spread measurement for edges located far from the central plane and a decrease in streak artifacts compared to a circular orbit. For the cochlear implant task, the resulting orbit favored complementary views of high tilt angles in a 360° orbit, and d' was increased by a factor of 1.83. Conclusions: This work shows that a prospective definition of imaging task can be used to optimize source-detector orbit and improve imaging performance. The method was implemented for execution of
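The detectability-index figure of merit described above can be sketched numerically. The following toy example assumes a non-prewhitening-style d' computed from an MTF, a noise-power spectrum, and a Fourier-domain task function; all three models below are invented placeholders, not the paper's measured system models.

```python
import numpy as np

# Toy detectability-index (d') calculation in the style of a
# non-prewhitening observer model. The MTF, NPS, and task function
# below are illustrative assumptions only.
f = np.linspace(0.01, 2.0, 200)               # spatial frequency (mm^-1)
df = f[1] - f[0]
mtf = np.exp(-f)                              # assumed system MTF
nps = 1e-6 * (1.0 + f)                        # assumed noise-power spectrum
w_task = f * np.exp(-((f - 1.0) ** 2) / 0.1)  # task emphasizing ~1 mm^-1 content

num = (np.sum(mtf**2 * w_task**2) * df) ** 2
den = np.sum(mtf**4 * w_task**2 * nps) * df
d_prime = np.sqrt(num / den)
print(d_prime > 0.0)  # → True
```

In a task-driven framework of this kind, changing the orbit changes the MTF and NPS entering the figure of merit, which is how candidate orbits are scored.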

  13. Heat pump concepts for nZEB Technology developments, design tools and testing of heat pump systems for nZEB in the USA: Country report IEA HPT Annex 40 Task 2, Task 3 and Task 4 of the USA

    Energy Technology Data Exchange (ETDEWEB)

    Baxter, Van D. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Payne, W. Vance [National Inst. of Standards and Technology (NIST), Gaithersburg, MD (United States); Ling, Jiazhen [Univ. of Maryland, College Park, MD (United States); Radermacher, Reinhard [Univ. of Maryland, College Park, MD (United States)

    2015-12-01

    The IEA HPT Annex 40 "Heat pump concepts for Nearly Zero Energy Buildings" deals with the application of heat pumps as a core component of the HVAC system for Nearly or Net Zero energy buildings (nZEB). This report covers Task 2 on system comparison and optimisation and Task 3, dedicated to the development of adapted technologies for nZEB and field monitoring results of heat pump systems in nZEB. Three US institutions are involved and have worked on the following projects. The Oak Ridge National Laboratory (ORNL) will summarize development activities, through the field demonstration stage, for several integrated heat pump (IHP) systems: electric ground-source (GS-IHP) and air-source (AS-IHP) versions and an engine-driven AS-IHP version. The first commercial GS-IHP product was introduced to the market in December 2012. This work is a contribution to Task 3 of the Annex. The University of Maryland will contribute a software development project to Task 2 of the Annex. The software, ThermCom, evaluates occupied-space thermal comfort conditions, accounting for all radiative and convective heat transfer effects as well as local air properties. The National Institute of Standards and Technology (NIST) is working on a field study at the NIST Net Zero Energy Residential Test Facility (NZERTF). This residential building was constructed on the NIST campus and officially opened in summer 2013. During the first year, between July 2013 and June 2014, baseline performance of the NZERTF was monitored under a simulated occupancy protocol. The house was equipped with an air-to-air heat pump which included a dedicated dehumidification operating mode. Outdoor conditions, internal loads and modes of heat pump operation were monitored. Field study results with respect to heat pump operation will be reported and recommendations on heat pump optimization for a net zero energy building will be provided. This work is a contribution to Task 3 of the Annex.

  14. Information Entropy-Based Metrics for Measuring Emergences in Artificial Societies

    Directory of Open Access Journals (Sweden)

    Mingsheng Tang

    2014-08-01

    Emergence is a common phenomenon, and it is also a general and important concept in complex dynamic systems like artificial societies. Usually, artificial societies are used to assist in resolving complex social issues (e.g., emergency management, intelligent transportation systems) with the aid of computer science. The level of an emergence may affect decision making, and the occurrence and degree of an emergence are generally perceived by human observers. However, because human observation is ambiguous and imprecise, proposing a quantitative method to measure emergences in artificial societies is a meaningful and challenging task. This article concentrates on three kinds of emergence in artificial societies: emergence of attribution, emergence of behavior, and emergence of structure. Based on information entropy, three metrics are proposed to measure emergences in a quantitative way. The correctness of these metrics has been verified through three case studies (the spread of an infectious influenza, a dynamic microblog network, and a flock of birds) with several experimental simulations on the Netlogo platform. These experimental results confirm that the metrics increase with the rising degree of emergence. In addition, the article discusses the limitations and extended applications of these metrics.
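As a rough illustration of the entropy idea behind such metrics (a sketch, not the article's exact formulation): the Shannon entropy of the distribution of agent states drops as an ordered pattern, such as flocking, emerges.

```python
import math
from collections import Counter

def shannon_entropy(states):
    """Shannon entropy (bits) of the distribution of agent states.

    Falling entropy over time is one simple indicator that an ordered
    pattern (an emergence) is forming; this is a toy illustration.
    """
    counts = Counter(states)
    n = len(states)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

disordered = ["N", "S", "E", "W"] * 25   # agents heading randomly
ordered = ["N"] * 90 + ["S"] * 10        # most agents aligned (flocking)

print(shannon_entropy(disordered))                             # → 2.0
print(shannon_entropy(ordered) < shannon_entropy(disordered))  # → True
```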

  15. An electrophysiological study of the impact of a Forward Collision Warning System in a simulator driving task.

    Science.gov (United States)

    Bueno, Mercedes; Fabrigoule, Colette; Deleurence, Philippe; Ndiaye, Daniel; Fort, Alexandra

    2012-08-27

    Driver distraction has been identified as the most important contributing factor in rear-end collisions. In this context, Forward Collision Warning Systems (FCWS) have been developed specifically to warn drivers of potential rear-end collisions. The main objective of this work is to evaluate the impact of a surrogate FCWS and of its reliability according to the driver's attentional state by recording both behavioral and electrophysiological data. Participants drove following a lead motorcycle in a simplified simulator with or without a warning system which gave forewarning of the preceding vehicle braking. Participants performed this driving task either alone (simple task) or simultaneously with a secondary cognitive task (dual task). Behavioral and electrophysiological data revealed a positive effect of the warning system. Participants were faster at detecting the brake light whether the system was perfect or imperfect, and the time and attentional resources required to process the target at a higher cognitive level were reduced when the system was completely reliable. When both tasks were performed simultaneously, warning effectiveness was considerably reduced at both the performance and neural levels; however, analysis of the brain activity revealed fewer differences between distracted and undistracted drivers when the warning system was used. These results show that electrophysiological data can be a valuable complement to behavioral data for a better understanding of how these systems affect the driver. Copyright © 2012 Elsevier B.V. All rights reserved.

  16. Engineering Task Plan for the 241-AN-105 Multi-Function Corrosion Monitoring System

    International Nuclear Information System (INIS)

    EDGEMON, G.L.

    1999-01-01

    This Engineering Task Plan (ETP) describes the activities associated with the installation of the corrosion probe assembly into riser WST-RISER-016 (formerly 15B) of tank 241-AN-105. The corrosion monitoring system utilizes the technique of electrochemical noise (EN) for monitoring waste tank corrosion. Typically, EN consists of low-frequency (4 Hz), small-amplitude signals that are spontaneously generated by electrochemical reactions occurring at corroding or other surfaces. EN analysis is well suited to monitoring and identifying the onset of localized corrosion and to measuring uniform corrosion rates. A typical EN-based corrosion-monitoring system measures instantaneous fluctuations in corrosion current and potential between three nominally identical electrodes of the material of interest immersed in the environment of interest. Time-dependent fluctuations in corrosion current are described by electrochemical current noise, and time-dependent fluctuations in corrosion potential are described by electrochemical potential noise. The corrosion monitoring system is designed to detect the onset of localized corrosion phenomena if tank conditions should change to allow these phenomena to occur. In addition to the EN technique, the system also facilitates the use of the Linear Polarization Resistance (LPR) technique to collect uniform corrosion rate information. LPR measures the linearity at the origin of the polarization curve for overvoltages up to a few millivolts away from the rest potential or natural corrosion potential. The slope of the current vs. voltage plot gives information on uniform corrosion rates.
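A minimal numerical sketch of the LPR idea described above: fit the slope of a near-linear current-voltage sweep around the rest potential to recover the polarization resistance R_p = dE/dI, then estimate a uniform corrosion current via the Stern-Geary relation i_corr = B / R_p. The data and the constant B here are illustrative assumptions, not values from this task plan.

```python
import numpy as np

# Toy LPR calculation; all numbers are illustrative assumptions.
overvoltage = np.linspace(-5e-3, 5e-3, 11)  # V, relative to rest potential
r_p_true = 2.0e3                            # ohm, assumed polarization resistance
current = overvoltage / r_p_true            # A, ideal linear response

slope, _ = np.polyfit(current, overvoltage, 1)  # recovered R_p = dE/dI
b_constant = 26e-3                              # V, a commonly assumed Stern-Geary constant
i_corr = b_constant / slope                     # A, uniform corrosion current estimate
print(abs(slope - r_p_true) / r_p_true < 1e-6)  # → True
```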

  17. Physiological responses and air consumption during simulated firefighting tasks in a subway system.

    Science.gov (United States)

    Williams-Bell, F Michael; Boisseau, Geoff; McGill, John; Kostiuk, Andrew; Hughson, Richard L

    2010-10-01

    Professional firefighters (33 men, 3 women), ranging in age from 30 to 53 years, participated in a simulation of a subway system search and rescue while breathing from their self-contained breathing apparatus (SCBA). We tested the hypothesis that during this task, established by expert firefighters to be of moderate intensity, the rate of air consumption would exceed the capacity of a nominal 30-min cylinder. Oxygen uptake, carbon dioxide output, and air consumption were measured with a portable breath-by-breath gas exchange analysis system, which was fully integrated with the expired port of the SCBA. The task involved descending a flight of stairs, walking, performing a search and rescue, retreat walking, then ascending a single flight of stairs to a safe exit. This scenario required between 9:56 and 13:24 min:s (mean, 12:10 ± 1:10 min:s) to complete, with an average oxygen uptake of 24.3 ± 4.5 mL kg(-1) min(-1) (47 ± 10 % peak oxygen uptake) and heart rate of 76% ± 7% of maximum. The highest energy requirement was during the final single-flight stair climb (30.4 ± 5.4 mL kg(-1) min(-1)). The average respiratory exchange ratio (carbon dioxide output/oxygen uptake) throughout the scenario was 0.95 ± 0.08, indicating a high carbon dioxide output for a relatively moderate average energy requirement. Air consumption from the nominal "30-min" cylinder averaged 51% (range, 26%-68%); however, extrapolation of these rates of consumption suggested that the low-air alarm, signalling that only 25% of the air remains, would have occurred as early as 11 min for an individual with the highest rate of air consumption, and at 16 min for the group average. These data suggest that even the moderate physical demands of walking combined with search and rescue while wearing full protective gear and breathing through the SCBA impose considerable physiological strain on professional firefighters. As well, the rate of air consumption in these tasks classed as moderate, compared
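The extrapolation described above is simple arithmetic; the helper below is a hypothetical back-of-envelope version using the abstract's group-average figures (51% of a cylinder consumed in a mean task time of 12:10, i.e. 12.17 min). The study's ~16-min group figure is based on individual-level consumption rates, so this crude group-average estimate lands a little higher.

```python
# Time until the low-air alarm (25% remaining, i.e. 75% consumed),
# extrapolated from the cylinder fraction used during a timed task.
def minutes_to_low_air_alarm(fraction_used, task_minutes):
    rate = fraction_used / task_minutes  # cylinder fraction per minute
    return 0.75 / rate

print(round(minutes_to_low_air_alarm(0.51, 12.17)))  # → 18
```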

  18. Task Migration for Fault-Tolerance in Mixed-Criticality Embedded Systems

    DEFF Research Database (Denmark)

    Saraswat, Prabhat Kumar; Pop, Paul; Madsen, Jan

    2009-01-01

    In this paper we are interested in mixed-criticality embedded applications implemented on distributed architectures. Depending on their time-criticality, tasks can be hard or soft real-time, and regarding safety-criticality, tasks can be fault-tolerant to transient faults, permanent faults, or have no dependability requirements. We use Earliest Deadline First (EDF) scheduling for the hard tasks and the Constant Bandwidth Server (CBS) for the soft tasks. The CBS parameters determine the quality of service (QoS) of soft tasks. Transient faults are tolerated using checkpointing with roll-back recovery … processors, such that the faults are tolerated, the deadlines for the hard real-time tasks are satisfied and the QoS for soft tasks is maximized. The proposed online adaptive approach has been evaluated using several synthetic benchmarks and a real-life case study.
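The EDF policy used for the hard tasks can be sketched in a few lines: at each decision point, the ready task with the earliest absolute deadline runs. Task names and deadlines below are invented for illustration.

```python
import heapq

# Minimal EDF dispatch sketch: a min-heap keyed on absolute deadline.
ready = [(12, "sensor_read"), (5, "actuator_cmd"), (9, "health_check")]
heapq.heapify(ready)

# Popping yields tasks in earliest-deadline-first order.
order = [heapq.heappop(ready)[1] for _ in range(3)]
print(order)  # → ['actuator_cmd', 'health_check', 'sensor_read']
```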

  19. Task 3.0 - Advanced power systems. Subtask 3.18 - Ash behavior in power systems

    International Nuclear Information System (INIS)

    Zygarlicke, Christopher J.; Mccollor, Donald P.; Kay, John P.; Swanson, Michael L.

    1998-01-01

    The overall goal of this initiative is to develop fundamental knowledge of ash behavior in power systems for the purpose of increasing power production efficiency, reducing operation and maintenance costs, and reducing greenhouse gas emissions into the atmosphere. The specific objectives of this initiative focus primarily on ash behavior related to advanced power systems and include the following: Determine the current status of the fundamental ash interactions and deposition formation mechanisms as already reported through previous or ongoing projects at the EERC or in the literature. Determine sintering mechanisms for temperatures and particle compositions that are less well known and remain for the most part undetermined. Identify the relationship between the temperature of critical viscosity (T_cv) as measured in a viscometer and the crystallization occurring in the melt. Perform a literature search on the use of heated-stage microscopy (HSM) for examining in situ ash-sintering phenomena and then validate the use of HSM in the determination of viscosity in spherical ash particles. Ascertain the formation and stability of specific mineral or amorphous phases in deposits typical of advanced power systems. Evaluate corrosion for alloys being used in supercritical combustion systems.

  20. Metrics with vanishing quantum corrections

    International Nuclear Information System (INIS)

    Coley, A A; Hervik, S; Gibbons, G W; Pope, C N

    2008-01-01

    We investigate solutions of the classical Einstein or supergravity equations that solve any set of quantum-corrected Einstein equations in which the Einstein tensor plus a multiple of the metric is equated to a symmetric conserved tensor T_μν(g_αβ, ∂_τ g_αβ, ∂_τ ∂_σ g_αβ, …) constructed from sums of terms involving contractions of the metric and powers of arbitrary covariant derivatives of the curvature tensor. A classical solution, such as an Einstein metric, is called universal if, when evaluated on that Einstein metric, T_μν is a multiple of the metric. A Ricci-flat classical solution is called strongly universal if, when evaluated on that Ricci-flat metric, T_μν vanishes. It is well known that pp-waves in four spacetime dimensions are strongly universal. We focus attention on a natural generalization: Einstein metrics with holonomy Sim(n - 2) in which all scalar invariants are zero or constant. In four dimensions we demonstrate that the generalized Ghanam-Thompson metric is weakly universal and that the Goldberg-Kerr metric is strongly universal; indeed, we show that universality extends to all four-dimensional Sim(2) Einstein metrics. We also discuss generalizations to higher dimensions.
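In the abstract's notation, the two universality conditions can be restated compactly (here λ is a constant depending on the particular correction tensor):

```latex
% Universality conditions for a correction tensor
% T_{\mu\nu}(g_{\alpha\beta}, \partial_\tau g_{\alpha\beta},
%            \partial_\tau \partial_\sigma g_{\alpha\beta}, \dots):
\text{universal (Einstein metric } g\text{):}\qquad
  T_{\mu\nu}\big\rvert_{g} = \lambda\, g_{\mu\nu},\quad \lambda \text{ constant};
\qquad
\text{strongly universal (Ricci-flat } g\text{):}\qquad
  T_{\mu\nu}\big\rvert_{g} = 0.
```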

  1. Reproducibility of graph metrics of human brain functional networks.

    Science.gov (United States)

    Deuker, Lorena; Bullmore, Edward T; Smith, Marie; Christensen, Soren; Nathan, Pradeep J; Rockstroh, Brigitte; Bassett, Danielle S

    2009-10-01

    Graph theory provides many metrics of complex network organization that can be applied to analysis of brain networks derived from neuroimaging data. Here we investigated the test-retest reliability of graph metrics of functional networks derived from magnetoencephalography (MEG) data recorded in two sessions from 16 healthy volunteers who were studied at rest and during performance of the n-back working memory task in each session. For each subject's data at each session, we used a wavelet filter to estimate the mutual information (MI) between each pair of MEG sensors in each of the classical frequency intervals from gamma to low delta in the overall range 1-60 Hz. Undirected binary graphs were generated by thresholding the MI matrix and 8 global network metrics were estimated: the clustering coefficient, path length, small-worldness, efficiency, cost-efficiency, assortativity, hierarchy, and synchronizability. Reliability of each graph metric was assessed using the intraclass correlation (ICC). Good reliability was demonstrated for most metrics applied to the n-back data (mean ICC=0.62). Reliability was greater for metrics in lower frequency networks. Higher frequency gamma- and beta-band networks were less reliable at a global level but demonstrated high reliability of nodal metrics in frontal and parietal regions. Performance of the n-back task was associated with greater reliability than measurements on resting state data. Task practice was also associated with greater reliability. Collectively these results suggest that graph metrics are sufficiently reliable to be considered for future longitudinal studies of functional brain network changes.
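The thresholding-and-metric pipeline described above can be sketched on a toy example. The 4-node "mutual information" matrix and threshold below are invented, and only one of the eight global metrics (the mean clustering coefficient) is computed.

```python
# Threshold a symmetric MI matrix into an undirected binary graph,
# then compute the mean clustering coefficient. Values are invented.
mi = [
    [0.0, 0.9, 0.8, 0.1],
    [0.9, 0.0, 0.7, 0.2],
    [0.8, 0.7, 0.0, 0.1],
    [0.1, 0.2, 0.1, 0.0],
]
tau = 0.5
adj = [[1 if i != j and mi[i][j] > tau else 0 for j in range(4)] for i in range(4)]

def mean_clustering(adj):
    n = len(adj)
    coeffs = []
    for i in range(n):
        nbrs = [j for j in range(n) if adj[i][j]]
        k = len(nbrs)
        if k < 2:
            coeffs.append(0.0)
            continue
        # Count edges among the neighbors of node i.
        links = sum(adj[u][v] for u in nbrs for v in nbrs if u < v)
        coeffs.append(2.0 * links / (k * (k - 1)))
    return sum(coeffs) / n

print(mean_clustering(adj))  # nodes 0-2 form a triangle; node 3 is isolated → 0.75
```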

  2. Applying systems thinking to task shifting for mental health using lay providers: a review of the evidence.

    Science.gov (United States)

    Javadi, D; Feldhaus, I; Mancuso, A; Ghaffar, A

    2017-01-01

    This paper seeks to review the available evidence to determine whether a systems approach is employed in the implementation and evaluation of task shifting for mental health using lay providers in low- and middle-income countries, and to highlight system-wide effects of task-shifting strategies in order to better inform efforts to strengthen community mental health systems. PubMed, CINAHL, and Cochrane Library databases were searched. Articles were screened by two independent reviewers, with a third reviewer resolving discrepancies. Screening was performed in two stages to ensure sensitivity. Studies were analysed using the World Health Organization's building blocks framework, with the addition of a community building block, and systems thinking characteristics to determine the extent to which system-wide effects had been considered. Thirty studies were included. Almost all studies reported positive findings on mental health using task shifting. One study showed no effect. No studies explicitly employed systems thinking tools, but some demonstrated systems thinking characteristics, such as exploring various stakeholder perspectives, capturing unintended consequences, and looking across sectors for system-wide impact. Twenty-five of the 30 studies captured elements other than the most directly relevant building blocks of service delivery and health workforce. There is a lack of systematic approaches to exploring complexity in the evaluation of task-shifting interventions. Systems thinking tools should support evidence-informed decision making for a more complete understanding of community-based systems strengthening interventions for mental health.

  3. Sharp metric obstructions for quasi-Einstein metrics

    Science.gov (United States)

    Case, Jeffrey S.

    2013-02-01

    Using the tractor calculus to study smooth metric measure spaces, we adapt results of Gover and Nurowski to give sharp metric obstructions to the existence of quasi-Einstein metrics on suitably generic manifolds. We do this by introducing an analogue of the Weyl tractor W to the setting of smooth metric measure spaces. The obstructions we obtain can be realized as tensorial invariants which are polynomial in the Riemann curvature tensor and its divergence. By taking suitable limits of their tensorial forms, we then find obstructions to the existence of static potentials, generalizing to higher dimensions a result of Bartnik and Tod, and to the existence of potentials for gradient Ricci solitons.

  4. Proxy Graph: Visual Quality Metrics of Big Graph Sampling.

    Science.gov (United States)

    Nguyen, Quan Hoang; Hong, Seok-Hee; Eades, Peter; Meidiana, Amyra

    2017-06-01

    Data sampling has been extensively studied for large scale graph mining. Many analyses and tasks become more efficient when performed on graph samples of much smaller size. The use of proxy objects is common in software engineering for analysis and interaction with heavy objects or systems. In this paper, we coin the term 'proxy graph' and empirically investigate how well a proxy graph visualization can represent a big graph. Our investigation focuses on proxy graphs obtained by sampling; this is one of the most common proxy approaches. Despite the plethora of data sampling studies, this is the first evaluation of sampling in the context of graph visualization. For an objective evaluation, we propose a new family of quality metrics for visual quality of proxy graphs. Our experiments cover popular sampling techniques. Our experimental results lead to guidelines for using sampling-based proxy graphs in visualization.
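One of the simplest sampling-based proxy-graph constructions, random node sampling with an induced subgraph, can be sketched as follows (the graph here is an invented 10-cycle):

```python
import random

# Build a proxy graph by sampling k nodes uniformly at random and
# keeping the subgraph induced on them.
def induced_sample(nodes, edges, k, seed=0):
    rng = random.Random(seed)  # fixed seed for reproducibility
    kept = set(rng.sample(nodes, k))
    return kept, [(u, v) for (u, v) in edges if u in kept and v in kept]

nodes = list(range(10))
edges = [(i, (i + 1) % 10) for i in range(10)]  # a 10-cycle
sample_nodes, sample_edges = induced_sample(nodes, edges, 5)

print(len(sample_nodes))  # → 5
print(all(u in sample_nodes and v in sample_nodes
          for u, v in sample_edges))  # → True
```

A visual-quality metric of the kind the paper proposes would then compare a drawing of this proxy graph against a drawing of the original.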

  5. The International Neuroblastoma Risk Group (INRG) staging system: an INRG Task Force report.

    Science.gov (United States)

    Monclair, Tom; Brodeur, Garrett M; Ambros, Peter F; Brisse, Hervé J; Cecchetto, Giovanni; Holmes, Keith; Kaneko, Michio; London, Wendy B; Matthay, Katherine K; Nuchtern, Jed G; von Schweinitz, Dietrich; Simon, Thorsten; Cohn, Susan L; Pearson, Andrew D J

    2009-01-10

    The International Neuroblastoma Risk Group (INRG) classification system was developed to establish a consensus approach for pretreatment risk stratification. Because the International Neuroblastoma Staging System (INSS) is a postsurgical staging system, a new clinical staging system was required for the INRG pretreatment risk classification system. To stage patients before any treatment, the INRG Task Force, consisting of neuroblastoma experts from Australia/New Zealand, China, Europe, Japan, and North America, developed a new INRG staging system (INRGSS) based on clinical criteria and image-defined risk factors (IDRFs). To investigate the impact of IDRFs on outcome, survival analyses were performed on 661 European patients with INSS stages 1, 2, or 3 disease for whom IDRFs were known. In the INRGSS, locoregional tumors are staged L1 or L2 based on the absence or presence of one or more of 20 IDRFs, respectively. Metastatic tumors are defined as stage M, except for stage MS, in which metastases are confined to the skin, liver, and/or bone marrow in children younger than 18 months of age. Within the 661-patient cohort, IDRFs were present (ie, stage L2) in 21% of patients with stage 1, 45% of patients with stage 2, and 94% of patients with stage 3 disease. Patients with INRGSS stage L2 disease had significantly lower 5-year event-free survival than those with INRGSS stage L1 disease (78% +/- 4% v 90% +/- 3%; P = .0010). Use of the new staging (INRGSS) and risk classification (INRG) of neuroblastoma will greatly facilitate the comparison of risk-based clinical trials conducted in different regions of the world.
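The INRGSS assignment rules summarized above can be sketched as a small decision function. This is a hedged reading of the abstract for illustration, not the Task Force's official staging algorithm.

```python
# Sketch of INRGSS staging as described in the abstract:
#   L1/L2 for locoregional tumors by absence/presence of any IDRF;
#   M for metastatic disease, or MS when metastases are confined to
#   skin/liver/bone marrow in a child younger than 18 months.
def inrgss_stage(metastatic, idrf_count, age_months=None,
                 mets_confined_to_skin_liver_marrow=False):
    if metastatic:
        if (mets_confined_to_skin_liver_marrow
                and age_months is not None and age_months < 18):
            return "MS"
        return "M"
    return "L2" if idrf_count >= 1 else "L1"

print(inrgss_stage(False, 0))  # → L1
print(inrgss_stage(False, 3))  # → L2
print(inrgss_stage(True, 0, age_months=12,
                   mets_confined_to_skin_liver_marrow=True))  # → MS
```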

  6. MO-A-16A-01: QA Procedures and Metrics: In Search of QA Usability

    Energy Technology Data Exchange (ETDEWEB)

    Sathiaseelan, V [Northwestern Memorial Hospital, Chicago, IL (United States); Thomadsen, B [University of Wisconsin, Madison, WI (United States)

    2014-06-15

    Radiation therapy has undergone considerable changes in the past two decades with a surge of new technology and treatment delivery methods. The complexity of radiation therapy treatments has increased, and there has been increased awareness and publicity about the associated risks. In response, there has been a proliferation of guidelines for medical physicists to adopt to ensure that treatments are delivered safely. Task Group recommendations are copious, and clinical physicists' hours are longer, stretched to various degrees between site planning and management, IT support, physics QA, and treatment planning responsibilities. Radiation oncology has many quality control practices in place to ensure the delivery of high-quality, safe treatments. Incident reporting systems have been developed to collect statistics about near-miss events at many radiation oncology centers. However, tools are lacking to assess the impact of these various control measures. A recent effort to address this shortcoming is the work of Ford et al (2012), who published a methodology for quality control quantification to measure the effectiveness of safety barriers. Over 4000 near-miss incidents reported from 2 academic radiation oncology clinics were analyzed using quality control quantification, and a profile of the most effective quality control measures (metrics) was identified. There is a critical need to identify QA metrics that help busy clinical physicists focus their limited time and resources most effectively in order to minimize or eliminate errors in the radiation treatment delivery process. In this symposium the usefulness of workflows and QA metrics to assure safe and high-quality patient care will be explored. Two presentations will be given: "Quality Metrics and Risk Management with High Risk Radiation Oncology Procedures" and "Strategies and Metrics for Quality Management in the TG-100 Era". Learning Objectives: Provide an overview of QA usability and the need for it.

  7. MO-A-16A-01: QA Procedures and Metrics: In Search of QA Usability

    International Nuclear Information System (INIS)

    Sathiaseelan, V; Thomadsen, B

    2014-01-01

    Radiation therapy has undergone considerable changes in the past two decades with a surge of new technology and treatment delivery methods. The complexity of radiation therapy treatments has increased, and there has been increased awareness and publicity about the associated risks. In response, there has been a proliferation of guidelines for medical physicists to adopt to ensure that treatments are delivered safely. Task Group recommendations are copious, and clinical physicists' hours are longer, stretched to various degrees between site planning and management, IT support, physics QA, and treatment planning responsibilities. Radiation oncology has many quality control practices in place to ensure the delivery of high-quality, safe treatments. Incident reporting systems have been developed to collect statistics about near-miss events at many radiation oncology centers. However, tools are lacking to assess the impact of these various control measures. A recent effort to address this shortcoming is the work of Ford et al (2012), who published a methodology for quality control quantification to measure the effectiveness of safety barriers. Over 4000 near-miss incidents reported from 2 academic radiation oncology clinics were analyzed using quality control quantification, and a profile of the most effective quality control measures (metrics) was identified. There is a critical need to identify QA metrics that help busy clinical physicists focus their limited time and resources most effectively in order to minimize or eliminate errors in the radiation treatment delivery process. In this symposium the usefulness of workflows and QA metrics to assure safe and high-quality patient care will be explored. Two presentations will be given: "Quality Metrics and Risk Management with High Risk Radiation Oncology Procedures" and "Strategies and Metrics for Quality Management in the TG-100 Era". Learning Objectives: Provide an overview of QA usability and the need for it.

  8. Evolving the JET virtual reality system for delivering the JET EP2 shutdown remote handling tasks

    Energy Technology Data Exchange (ETDEWEB)

    Williams, Adrian, E-mail: adrian.williams@oxfordtechnologies.co.uk [Oxford Technologies Ltd., 7 Nuffield Way, Abingdon, Oxon, OX14 1RJ (United Kingdom); JET-EFDA, Culham Science Centre, Abingdon, OX14 3DB (United Kingdom); Sanders, Stephen [Oxford Technologies Ltd., 7 Nuffield Way, Abingdon, Oxon, OX14 1RJ (United Kingdom); JET-EFDA, Culham Science Centre, Abingdon, OX14 3DB (United Kingdom); Weder, Gerard [Tree-C Technology BV, Buys Ballotstraat 8, 6716 BL Ede (Netherlands); JET-EFDA, Culham Science Centre, Abingdon, OX14 3DB (United Kingdom); Bastow, Roger; Allan, Peter; Hazel, Stuart [CCFE, Culham Science Centre, Abingdon, OX14 3DB (United Kingdom); JET-EFDA, Culham Science Centre, Abingdon, OX14 3DB (United Kingdom)

    2011-10-15

    The quality, functionality and performance of the virtual reality (VR) system used at JET for preparation and implementation of remote handling (RH) operations has been progressively enhanced since its first use in the original JET remote handling shutdown in 1998. As preparation began for the JET EP2 (Enhanced Performance 2) shutdown, it was recognised that the VR system being used was unable to cope with the increased functionality and the large number of 3D models needed to fully represent the JET in-vessel components and tooling planned for EP2. A bespoke VR software application was developed in collaboration with the OEM, which allowed enhancements to be made to the VR system to meet the requirements of JET remote handling in preparation for EP2. Performance improvements required to meet the challenges of EP2 could not be obtained from the development of the new VR software alone. New methodologies were also required to prepare source CATIA models for use in the VR system, using a collection of 3D software packages. In collaboration with the JET drawing office, techniques were developed within CATIA using polygon reduction tools to reduce model size while retaining surface detail at required user limits. This paper discusses how these developments played an essential part in facilitating EP2 remote handling task development and examines their impact during the EP2 shutdown.

  9. Evolving the JET virtual reality system for delivering the JET EP2 shutdown remote handling tasks

    International Nuclear Information System (INIS)

    Williams, Adrian; Sanders, Stephen; Weder, Gerard; Bastow, Roger; Allan, Peter; Hazel, Stuart

    2011-01-01

    The quality, functionality and performance of the virtual reality (VR) system used at JET for preparation and implementation of remote handling (RH) operations has been progressively enhanced since its first use in the original JET remote handling shutdown in 1998. As preparation began for the JET EP2 (Enhanced Performance 2) shutdown, it was recognised that the VR system being used was unable to cope with the increased functionality and the large number of 3D models needed to fully represent the JET in-vessel components and tooling planned for EP2. A bespoke VR software application was developed in collaboration with the OEM, which allowed enhancements to be made to the VR system to meet the requirements of JET remote handling in preparation for EP2. Performance improvements required to meet the challenges of EP2 could not be obtained from the development of the new VR software alone. New methodologies were also required to prepare source CATIA models for use in the VR system, using a collection of 3D software packages. In collaboration with the JET drawing office, techniques were developed within CATIA using polygon reduction tools to reduce model size while retaining surface detail at required user limits. This paper discusses how these developments played an essential part in facilitating EP2 remote handling task development and examines their impact during the EP2 shutdown.

  10. Completion of a Dislocated Metric Space

    Directory of Open Access Journals (Sweden)

    P. Sumati Kumari

    2015-01-01

    We provide a construction for the completion of a dislocated metric space (abbreviated d-metric space); we also prove that the completion of the metric associated with a d-metric coincides with the metric associated with the completion of the d-metric.
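For reference, a dislocated metric keeps the usual metric axioms except that the self-distance d(x, x) need not be zero:

```latex
% Axioms of a dislocated metric d : X \times X \to [0,\infty);
% note that d(x,x) = 0 is NOT required, only the implication (d1).
\begin{aligned}
&(\mathrm{d1})\quad d(x,y) = 0 \;\Longrightarrow\; x = y,\\
&(\mathrm{d2})\quad d(x,y) = d(y,x),\\
&(\mathrm{d3})\quad d(x,z) \le d(x,y) + d(y,z).
\end{aligned}
```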

  11. The Use of Human Modeling of EVA Tasks as a Systems Engineering Tool

    Science.gov (United States)

    Dischinger, H. Charles, Jr.; Schmidt, Henry J.; Kross, Dennis A. (Technical Monitor)

    2001-01-01

    Computer-generated human models have been used in aerospace design for a decade. They have come to be highly reliable for worksite analysis of certain types of EVA tasks. In many design environments, this analysis comes after the structural design is largely complete. However, the use of these models as a development tool is gaining acceptance within organizations that practice good systems engineering processes. The design of the United States Propulsion Module for the International Space Station provides an example of this application. The Propulsion Module will provide augmentation to the propulsion capability supplied by the Russian Service Module Zvezda. It is a late addition to the set of modules provided by the United States to the ISS Program, and as a result, faces design challenges that result from the level of immaturity of its integration into the Station. Among these are heat dissipation and physical envelopes. Since the rest of the Station was designed to maximize the use of the cooling system, little margin is available for the addition of another module. The Propulsion Module will attach at the forward end of the Station, and will be between the Orbiter and the rest of ISS. Since cargo must be removed from the Payload Bay and transferred to Station by the Canadarm, there is a potential for protrusions from the module, such as thruster booms, to interfere with robotic operations. These and similar engineering issues must be addressed as part of the development. In the implementation of good system design, all design solutions should be analyzed for compatibility with all affected subsystems. Human modeling has been used in this project to provide rapid input to system trades of design concepts. For example, the placement of radiators and avionics components for optimization of heat dissipation had to be examined for feasibility of EVA translation paths and worksite development. Likewise, the location of and mechanism for the retraction of thruster

  12. Metric adjusted skew information

    DEFF Research Database (Denmark)

    Hansen, Frank

    2008-01-01

    We extend the concept of Wigner-Yanase-Dyson skew information to something we call "metric adjusted skew information" (of a state with respect to a conserved observable). This "skew information" is intended to be a non-negative quantity bounded by the variance (of an observable in a state) that vanishes for observables commuting with the state. We show that the skew information is a convex function on the manifold of states. It also satisfies other requirements, proposed by Wigner and Yanase, for an effective measure-of-information content of a state relative to a conserved observable. We establish a connection between the geometrical formulation of quantum statistics as proposed by Chentsov and Morozova and measures of quantum information as introduced by Wigner and Yanase and extended in this article. We show that the set of normalized Morozova-Chentsov functions describing the possible...

  13. Neuroticism modulates brain visuo-vestibular and anxiety systems during a virtual rollercoaster task.

    Science.gov (United States)

    Riccelli, Roberta; Indovina, Iole; Staab, Jeffrey P; Nigro, Salvatore; Augimeri, Antonio; Lacquaniti, Francesco; Passamonti, Luca

    2017-02-01

    Different lines of research suggest that anxiety-related personality traits may influence the visual and vestibular control of balance, although the brain mechanisms underlying this effect remain unclear. To our knowledge, this is the first functional magnetic resonance imaging (fMRI) study that investigates how individual differences in neuroticism and introversion, two key personality traits linked to anxiety, modulate brain regional responses and functional connectivity patterns during an fMRI task simulating self-motion. Twenty-four healthy individuals with variable levels of neuroticism and introversion underwent fMRI while performing a virtual reality rollercoaster task that included two main types of trials: (1) trials simulating downward or upward self-motion (vertical motion), and (2) trials simulating self-motion in horizontal planes (horizontal motion). Regional brain activity and functional connectivity patterns when comparing vertical versus horizontal motion trials were correlated with personality traits of the Five Factor Model (i.e., neuroticism, extraversion-introversion, openness, agreeableness, and conscientiousness). When comparing vertical to horizontal motion trials, we found a positive correlation between neuroticism scores and regional activity in the left parieto-insular vestibular cortex (PIVC). For the same contrast, increased functional connectivity between the left PIVC and right amygdala was also detected as a function of higher neuroticism scores. Together, these findings provide new evidence that individual differences in personality traits linked to anxiety are significantly associated with changes in the activity and functional connectivity patterns within visuo-vestibular and anxiety-related systems during simulated vertical self-motion. Hum Brain Mapp 38:715-726, 2017. © 2016 The Authors Human Brain Mapping Published by Wiley Periodicals, Inc.

  14. Metrical and dynamical aspects in complex analysis

    CERN Document Server

    2017-01-01

    The central theme of this reference book is the metric geometry of complex analysis in several variables. Bridging a gap in the current literature, the text focuses on the fine behavior of the Kobayashi metric of complex manifolds and its relationships to dynamical systems, hyperbolicity in the sense of Gromov and operator theory, all very active areas of research. The modern points of view expressed in these notes, collected here for the first time, will be of interest to academics working in the fields of several complex variables and metric geometry. The different topics are treated coherently and include expository presentations of the relevant tools, techniques and objects, which will be particularly useful for graduate and PhD students specializing in the area.

  15. Engineering Task Plan for Routine Engineering Support for Core Sampler System

    International Nuclear Information System (INIS)

    BOGER, R.M.

    1999-01-01

    Routine engineering support is required during normal operation of the core sampler trucks and associated ancillary equipment. This engineering support consists of, but is not limited to, troubleshooting operation problems, correcting minor design problems, assistance with work package preparation, assistance with procurement, fabrication shop support, planning of engineering tasks and preparation of associated Engineering Task Plans (ETP) and Engineering Service Requests (ESR). This ETP is the management plan document for implementing routine engineering support. Any additional changes to the scope of this ETP shall require a Letter of Instruction from Lockheed Martin Hanford Corp (LMHC). This document will also be the Work Planning Document for Development Control (HNF 1999a). The scope of this task will be to provide routine engineering support for Characterization equipment as required to support Characterization Operations. A task by task decision will be made by management to determine which tasks will be done per this ETP and if additional ETPs and/or ESRs are required. Due to the unique nature of this task, the only identifiable deliverable is to provide support as requested. Deliverables will be recorded in a task logbook as activities are identified. ESRs will be generated for tasks that require more than 40 person hours to complete, per Characterization Engineering Desk Instructions (DI 1999a)

  16. Metric approach to quantum constraints

    International Nuclear Information System (INIS)

    Brody, Dorje C; Hughston, Lane P; Gustavsson, Anna C T

    2009-01-01

    A framework for deriving equations of motion for constrained quantum systems is introduced and a procedure for its implementation is outlined. In special cases, the proposed new method, which takes advantage of the fact that the space of pure states in quantum mechanics has both a symplectic structure and a metric structure, reduces to a quantum analogue of the Dirac theory of constraints in classical mechanics. Explicit examples involving spin-1/2 particles are worked out in detail: in the first example, our approach coincides with a quantum version of the Dirac formalism, while the second example illustrates how a situation that cannot be treated by Dirac's approach can nevertheless be dealt with in the present scheme.

  17. Instrumentation Needs for Integral Primary System Reactors (IPSRs) - Task 1 Final Report

    International Nuclear Information System (INIS)

    Gary D Storrick; Bojan Petrovic; Luca Oriani; Lawrence E Conway; Diego Conti

    2005-01-01

    This report presents the results of the Westinghouse work performed under Task 1 of this Financial Assistance Award and satisfies a Level 2 Milestone for the project. While most of the signals required for control of IPSRs are typical of other PWRs, the integral configuration poses some new challenges in the design or deployment of the sensors/instrumentation and, in some cases, requires completely new approaches. In response to this consideration, the overall objective of Task 1 was to establish the instrumentation needs for integral reactors, provide a review of the existing solutions where available, and, identify research and development needs to be addressed to enable successful deployment of IPSRs. The starting point for this study was to review and synthesize general characteristics of integral reactors, and then to focus on a specific design. Due to the maturity of its design and availability of design information to Westinghouse, IRIS (International Reactor Innovative and Secure) was selected for this purpose. The report is organized as follows. Section 1 is an overview. Section 2 provides background information on several representative IPSRs, including IRIS. A review of the IRIS safety features and its protection and control systems is used as a mechanism to ensure that all critical safety-related instrumentation needs are addressed in this study. Additionally, IRIS systems are compared against those of current advanced PWRs. The scope of this study is then limited to those systems where differences exist, since, otherwise, the current technology already provides an acceptable solution. Section 3 provides a detailed discussion on instrumentation needs for the representative IPSR (IRIS) with detailed qualitative and quantitative requirements summarized in the exhaustive table included as Appendix A. Section 3 also provides an evaluation of the current technology and the instrumentation used for measurement of required parameters in current PWRs. 

  18. Attack-Resistant Trust Metrics

    Science.gov (United States)

    Levien, Raph

    The Internet is an amazingly powerful tool for connecting people together, unmatched in human history. Yet, with that power comes great potential for spam and abuse. Trust metrics are an attempt to compute which people are trustworthy and which are likely attackers. This chapter presents two specific trust metrics developed and deployed on the Advogato Website, which is a community blog for free software developers. This real-world experience demonstrates that the trust metrics fulfilled their goals, but that for good results, it is important to match the assumptions of the abstract trust metric computation to the real-world implementation.
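    The Advogato metrics belong to the network-flow family: trust is pushed outward from a seed, with decreasing capacity at each level, so that a few bad certifications cannot capture the whole community. The following is a minimal capacity-propagation sketch of that idea, not the deployed Advogato algorithm; the `certs` graph and the per-level capacities are invented for the example.

```python
from collections import deque

def accepted_nodes(certs, seed, capacities):
    """Breadth-first capacity propagation from a trusted seed.

    certs: dict mapping each member to the members they certify.
    capacities: per-level budgets; a node reached at level i may
    propagate trust to at most capacities[i + 1] of its peers.
    Returns the set of members accepted as trustworthy.
    """
    accepted = {seed}
    frontier = deque([(seed, 0)])
    while frontier:
        node, level = frontier.popleft()
        if level + 1 >= len(capacities):
            continue  # trust has run out of capacity at this depth
        budget = capacities[level + 1]
        for peer in certs.get(node, [])[:budget]:
            if peer not in accepted:
                accepted.add(peer)
                frontier.append((peer, level + 1))
    return accepted

certs = {"seed": ["alice", "bob"], "alice": ["carol"], "carol": ["mallory"]}
print(sorted(accepted_nodes(certs, "seed", [4, 2, 1])))
```

Note how "mallory", certified only at a depth where capacity is exhausted, is rejected: limiting capacity with distance from the seed is what gives this family of metrics its attack resistance.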

  19. Two classes of metric spaces

    Directory of Open Access Journals (Sweden)

    Isabel Garrido

    2016-04-01

    Full Text Available The class of metric spaces (X,d known as small-determined spaces, introduced by Garrido and Jaramillo, are properly defined by means of some type of real-valued Lipschitz functions on X. On the other hand, B-simple metric spaces introduced by Hejcman are defined in terms of some kind of bornologies of bounded subsets of X. In this note we present a common framework where both classes of metric spaces can be studied which allows us to see not only the relationships between them but also to obtain new internal characterizations of these metric properties.

  20. Sympathetic nervous system activity measured by skin conductance quantifies the challenge of walking adaptability tasks after stroke.

    Science.gov (United States)

    Clark, David J; Chatterjee, Sudeshna A; McGuirk, Theresa E; Porges, Eric C; Fox, Emily J; Balasubramanian, Chitralakshmi K

    2018-02-01

    Walking adaptability tasks are challenging for people with motor impairments. The construct of perceived challenge is typically measured by self-report assessments, which are susceptible to subjective measurement error. The development of an objective physiologically-based measure of challenge may help to improve the ability to assess this important aspect of mobility function. The objective of this study was to investigate the use of sympathetic nervous system (SNS) activity measured by skin conductance to gauge the physiological stress response to challenging walking adaptability tasks in people post-stroke. Thirty adults with chronic post-stroke hemiparesis performed a battery of seventeen walking adaptability tasks. SNS activity was measured by skin conductance from the palmar surface of each hand. The primary outcome variable was the percent change in skin conductance level (ΔSCL) between the baseline resting and walking phases of each task. Task difficulty was measured by performance speed and by physical therapist scoring of performance. Walking function and balance confidence were measured by preferred walking speed and the Activities-specific Balance Confidence Scale, respectively. There was a statistically significant negative association between ΔSCL and task performance speed and between ΔSCL and clinical score, indicating that tasks with greater SNS activity had slower performance speed and poorer clinical scores. ΔSCL was significantly greater for low functioning participants versus high functioning participants, particularly during the most challenging walking adaptability tasks. This study supports the use of SNS activity measured by skin conductance as a valuable approach for objectively quantifying the perceived challenge of walking adaptability tasks in people post-stroke. Published by Elsevier B.V.
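    The primary outcome, ΔSCL, is a plain percent change between the two phases of a task; a minimal sketch of the computation (the function name and the sample values are illustrative, not taken from the study):

```python
def delta_scl(baseline_scl, walking_scl):
    """Percent change in skin conductance level (ΔSCL) from the
    baseline resting phase to the walking phase of a task."""
    if baseline_scl <= 0:
        raise ValueError("baseline SCL must be positive (microsiemens)")
    return 100.0 * (walking_scl - baseline_scl) / baseline_scl

# e.g. SCL rising from 4.0 to 5.0 microsiemens is a 25% increase
print(delta_scl(4.0, 5.0))
```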

  1. Locality-Aware Task Scheduling and Data Distribution for OpenMP Programs on NUMA Systems and Manycore Processors

    Directory of Open Access Journals (Sweden)

    Ananya Muddukrishna

    2015-01-01

    Full Text Available Performance degradation due to nonuniform data access latencies has worsened on NUMA systems and can now be felt on-chip in manycore processors. Distributing data across NUMA nodes and manycore processor caches is necessary to reduce the impact of nonuniform latencies. However, techniques for distributing data are error-prone and fragile and require low-level architectural knowledge. Existing task scheduling policies favor quick load-balancing at the expense of locality and ignore NUMA node/manycore cache access latencies while scheduling. Locality-aware scheduling, in conjunction with or as a replacement for existing scheduling, is necessary to minimize NUMA effects and sustain performance. We present a data distribution and locality-aware scheduling technique for task-based OpenMP programs executing on NUMA systems and manycore processors. Our technique relieves the programmer from thinking of NUMA system/manycore processor architecture details by delegating data distribution to the runtime system and uses task data dependence information to guide the scheduling of OpenMP tasks to reduce data stall times. We demonstrate our technique on a four-socket AMD Opteron machine with eight NUMA nodes and on the TILEPro64 processor and identify that data distribution and locality-aware task scheduling improve performance up to 69% for scientific benchmarks compared to default policies and yet provide an architecture-oblivious approach for programmers.
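    The core idea, placing each task on the NUMA node that holds its data and resorting to remote work only when the local queue runs dry, can be illustrated with a toy model. This is a simplified sketch of the concept, not the authors' OpenMP runtime; the class and task names are invented.

```python
from collections import deque

class LocalityAwareScheduler:
    """Toy model of locality-aware task scheduling: each task is
    queued on the NUMA node that holds its data; an idle node
    steals from the longest remote queue only when its own queue
    is empty (trading locality for load balance as a last resort)."""

    def __init__(self, num_nodes):
        self.queues = [deque() for _ in range(num_nodes)]

    def submit(self, task, data_node):
        self.queues[data_node].append(task)   # data affinity decides placement

    def next_task(self, node):
        """Return (task, was_local); (None, True) if no work anywhere."""
        if self.queues[node]:                 # local work first: no remote access
            return self.queues[node].popleft(), True
        victim = max(range(len(self.queues)), key=lambda n: len(self.queues[n]))
        if self.queues[victim]:
            return self.queues[victim].popleft(), False  # remote steal
        return None, True

sched = LocalityAwareScheduler(num_nodes=2)
sched.submit("init_A", data_node=0)
sched.submit("init_B", data_node=1)
print(sched.next_task(0))  # local hit
print(sched.next_task(0))  # steal from node 1
```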

  2. Data-driven management using quantitative metric and automatic auditing program (QMAP) improves consistency of radiation oncology processes.

    Science.gov (United States)

    Yu, Naichang; Xia, Ping; Mastroianni, Anthony; Kolar, Matthew D; Chao, Samuel T; Greskovich, John F; Suh, John H

    Process consistency in planning and delivery of radiation therapy is essential to maintain patient safety and treatment quality and efficiency. Ensuring the timely completion of each critical clinical task is one aspect of process consistency. The purpose of this work is to report our experience in implementing a quantitative metric and automatic auditing program (QMAP) with a goal of improving the timely completion of critical clinical tasks. Based on our clinical electronic medical records system, we developed a software program to automatically capture the completion timestamp of each critical clinical task while providing frequent alerts of potential delinquency. These alerts were directed to designated triage teams within a time window that would offer an opportunity to mitigate the potential for late completion. Since July 2011, 18 metrics were introduced in our clinical workflow. We compared the delinquency rates for 4 selected metrics before the implementation of the metrics with the delinquency rates of 2016. A one-tailed Student t test was used for statistical analysis. With an average of 150 daily patients on treatment at our main campus, the late treatment plan completion rate and late weekly physics check rate were reduced from 18.2% and 8.9% in 2011 to 4.2% and 0.1% in 2016, respectively (P < .01). The late weekly on-treatment physician visit rate was reduced from 7.2% in 2012 to <1.6% in 2016. The yearly late cone beam computed tomography review rate was reduced from 1.6% in 2011 to <0.1% in 2016. QMAP is effective in reducing late completions of critical tasks, which can positively impact treatment quality and patient safety by reducing the potential for errors resulting from distractions, interruptions, and rush in completion of critical tasks. Copyright © 2016 American Society for Radiation Oncology. Published by Elsevier Inc. All rights reserved.
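    The mechanism described, timestamp capture plus pre-deadline alerts plus a delinquency rate, can be sketched roughly as follows. This is a toy model of the auditing pass, not the clinical implementation; the task names, deadlines, and the alert window are illustrative.

```python
from datetime import datetime, timedelta

def audit_tasks(tasks, now, warn_before=timedelta(hours=24)):
    """Toy QMAP-style audit pass. tasks is a list of
    (name, deadline, completed_at-or-None) records. Returns the
    open tasks needing an alert (near or past deadline) and the
    delinquency rate among completed tasks (finished late)."""
    alerts, late, done = [], 0, 0
    for name, deadline, completed_at in tasks:
        if completed_at is None:
            if now >= deadline - warn_before:
                alerts.append(name)   # triage team still has time to act
        else:
            done += 1
            if completed_at > deadline:
                late += 1
    rate = late / done if done else 0.0
    return alerts, rate

now = datetime(2016, 6, 1, 12, 0)
tasks = [
    ("plan_check", now + timedelta(hours=6), None),                     # open, due soon
    ("weekly_physics", now + timedelta(days=3), None),                  # open, not urgent
    ("cbct_review", now - timedelta(days=1), now - timedelta(days=2)),  # done on time
    ("physician_visit", now - timedelta(days=2), now - timedelta(days=1)),  # done late
]
alerts, rate = audit_tasks(tasks, now)
print(alerts, rate)
```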

  3. Walking-adaptability assessments with the Interactive Walkway: Between-systems agreement and sensitivity to task and subject variations.

    Science.gov (United States)

    Geerse, Daphne J; Coolen, Bert H; Roerdink, Melvyn

    2017-05-01

    The ability to adapt walking to environmental circumstances is an important aspect of walking, yet difficult to assess. The Interactive Walkway was developed to assess walking adaptability by augmenting a multi-Kinect-v2 10-m walkway with gait-dependent visual context (stepping targets, obstacles) using real-time processed markerless full-body kinematics. In this study we determined Interactive Walkway's usability for walking-adaptability assessments in terms of between-systems agreement and sensitivity to task and subject variations. Under varying task constraints, 21 healthy subjects performed obstacle-avoidance, sudden-stops-and-starts and goal-directed-stepping tasks. Various continuous walking-adaptability outcome measures were concurrently determined with the Interactive Walkway and a gold-standard motion-registration system: available response time, obstacle-avoidance and sudden-stop margins, step length, stepping accuracy and walking speed. The same holds for dichotomous classifications of success and failure for obstacle-avoidance and sudden-stops tasks and performed short-stride versus long-stride obstacle-avoidance strategies. Continuous walking-adaptability outcome measures generally agreed well between systems (high intraclass correlation coefficients for absolute agreement, low biases and narrow limits of agreement) and were highly sensitive to task and subject variations. Success and failure ratings varied with available response times and obstacle types and agreed between systems for 85-96% of the trials while obstacle-avoidance strategies were always classified correctly. We conclude that Interactive Walkway walking-adaptability outcome measures are reliable and sensitive to task and subject variations, even in high-functioning subjects. We therefore deem Interactive Walkway walking-adaptability assessments usable for obtaining an objective and more task-specific examination of one's ability to walk, which may be feasible for both high
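    Between-systems agreement of the kind reported here (low bias, narrow limits of agreement) is conventionally summarised Bland-Altman style; a minimal sketch with made-up paired step-length data, assuming that convention:

```python
import statistics

def bland_altman(system_a, system_b):
    """Bias and 95% limits of agreement between two measurement
    systems (e.g. a markerless walkway vs. a marker-based gold
    standard) over paired trials of the same outcome measure."""
    diffs = [a - b for a, b in zip(system_a, system_b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# hypothetical paired step-length measurements (metres)
walkway = [0.62, 0.58, 0.71, 0.65, 0.60]
gold    = [0.61, 0.59, 0.70, 0.66, 0.60]
bias, (lo, hi) = bland_altman(walkway, gold)
print(round(bias, 3), round(lo, 3), round(hi, 3))
```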

  4. Design of an optimal automation system : Finding a balance between a human's task engagement and exhaustion

    NARCIS (Netherlands)

    Klein, Michel; van Lambalgen, Rianne

    2011-01-01

    In demanding tasks, human performance can seriously degrade as a consequence of increased workload and limited resources. In such tasks it is very important to maintain an optimal performance quality, therefore automation assistance is required. On the other hand, automation can also impose

  5. District heating and cooling systems for communities through power-plant retrofit and distribution network. Volume 2. Tasks 1-3. Final report. [Downtown Toledo steam system

    Energy Technology Data Exchange (ETDEWEB)

    Watt, J.R.; Sommerfield, G.A.

    1979-08-01

    Each of the tasks is described separately: Task 1 - Demonstration Team; Task 2 - Identify Thermal Energy Source(s) and Potential Service Area(s); and Task 3 - Energy Market Analysis. The purpose of the project is to establish and implement measures in the downtown Toledo steam system for conserving scarce fuel supplies through cogeneration, by retrofit of existing base- or intermediate-loaded electric-generating plants to provide for central heating and cooling systems, with the ultimate purpose of applying the results to other communities. For Task 1, Toledo Edison Company has organized a Demonstration Team (Battelle Columbus Laboratories; Stone and Webster; Ohio Dept. of Energy; Public Utilities Commission of Ohio; Toledo Metropolitan Area Council of Governments; and Toledo Edison) that it hopes has the expertise to evaluate the technical, legal, economic, and marketing issues related to the utilization of by-product heat from power generation to supply district heating and cooling services. Task 2 gives a complete technical description of the candidate plant(s), its thermodynamic cycle, role in load dispatch, ownership, and location. It is concluded that the Toledo steam distribution system can be the starting point for developing a new district-heating system to serve an expanding market. Battelle is a member of the team employed as a subcontractor to complete the energy market analysis. The work is summarized in Task 3. (MCW)

  6. Automatic intersection map generation task 10 report.

    Science.gov (United States)

    2016-02-29

    This report describes the work conducted in Task 10 of the V2I Safety Applications Development Project. The work was performed by the University of Michigan Transportation Research Institute (UMTRI) under contract to the Crash Avoidance Metrics Partn...

  7. Greenhouse gas balances of bioenergy systems: Programme and accomplishments of IEA Bioenergy Task XV, 1995-97

    International Nuclear Information System (INIS)

    Spitzer, J.

    1998-01-01

    The goal of IEA Bioenergy Task XV was to investigate all processes involved in using bioenergy systems, on a full fuel-cycle basis, with the aim of establishing overall greenhouse gas (GHG) balances. Task participants have been Austria, Canada, Finland, Sweden and the U.S.A. (Operating Agent: Austria). During its work period (1995-97), Task XV hosted five international workshops. The scientific achievements of the Task are documented in a number of published papers. Also, a bibliography on the research area was compiled. Much work was devoted to the question of carbon accounting in the context of the work of the Intergovernmental Panel on Climate Change (IPCC), and Task XV made contributions to a draft IPCC special report prepared for the IPCC Expert Group on Harvested Wood Products. The technical paper 'Forest harvests and wood products: sources and sinks of atmospheric carbon dioxide' (Forest Science, forthcoming) contrasts two carbon accounting approaches for considering wood products in the IPCC Guidelines (i.e., 'atmospheric-flow' vs. 'stock-change' method) and reports on estimated national carbon source-sink balances for selected countries, regions, and the world. Finally, progress was made in establishing a common analytical framework to compare different bioenergy options. The framework considers on-site carbon storage changes as well as GHG emissions from auxiliary fossil fuels, conversion efficiencies, and emission credits for by-products; comparisons between bioenergy systems and traditional fossil fuel and other energy systems as a reference are allowed, and reference land-uses accounted for. The continuation Task is Task 25 (1998-2000), with New Zealand joining the current partners. 9 refs, 2 tabs

  8. The WorkQueue project - a task queue for the CMS workload management system

    Science.gov (United States)

    Ryu, S.; Wakefield, S.

    2012-12-01

    We present the development and first experience of a new component (termed WorkQueue) in the CMS workload management system. This component provides a link between a global request system (Request Manager) and agents (WMAgents) which process requests at compute and storage resources (known as sites). These requests typically consist of creation or processing of a data sample (possibly terabytes in size). Unlike the standard concept of a task queue, the WorkQueue does not contain fully resolved work units (known typically as jobs in HEP). This would require the WorkQueue to run computationally heavy algorithms that are better suited to run in the WMAgents. Instead the request specifies an algorithm that the WorkQueue uses to split the request into reasonable size chunks (known as elements). An advantage of performing lazy evaluation of an element is that expanding datasets can be accommodated by having job details resolved as late as possible. The WorkQueue architecture consists of a global WorkQueue which obtains requests from the request system, expands them and forms an element ordering based on the request priority. Each WMAgent contains a local WorkQueue which buffers work close to the agent, this overcomes temporary unavailability of the global WorkQueue and reduces latency for an agent to begin processing. Elements are pulled from the global WorkQueue to the local WorkQueue and into the WMAgent based on the estimate of the amount of work within the element and the resources available to the agent. WorkQueue is based on CouchDB, a document oriented NoSQL database. The WorkQueue uses the features of CouchDB (map/reduce views and bi-directional replication between distributed instances) to provide a scalable distributed system for managing large queues of work. The project described here represents an improvement over the old approach to workload management in CMS which involved individual operators feeding requests into agents. This new approach allows for a
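    The behaviour described, lazy splitting of a request into coarse elements, a priority-ordered global queue, and agents pulling elements up to their spare capacity, can be sketched with a toy queue. This is an illustration of the concept only, not CMS code; the request names and sizes are invented.

```python
import heapq
import itertools

class GlobalWorkQueue:
    """Toy sketch of the WorkQueue idea: requests are split lazily
    into coarse elements (not fully resolved jobs), ordered by
    request priority, and pulled by agents up to a work estimate."""

    def __init__(self):
        self._heap = []
        self._counter = itertools.count()  # FIFO tie-break within a priority

    def add_request(self, name, total_units, chunk, priority):
        # split into coarse chunks now; job details are resolved later
        for start in range(0, total_units, chunk):
            units = min(chunk, total_units - start)
            element = (name, start, units)
            heapq.heappush(self._heap, (-priority, next(self._counter), element))

    def pull(self, capacity):
        """Hand out elements until the agent's spare capacity is used up."""
        pulled = []
        while self._heap and self._heap[0][2][2] <= capacity:
            _, _, element = heapq.heappop(self._heap)
            pulled.append(element)
            capacity -= element[2]
        return pulled

gq = GlobalWorkQueue()
gq.add_request("reprocess_2011_data", total_units=250, chunk=100, priority=5)
gq.add_request("urgent_skim", total_units=100, chunk=100, priority=9)
print(gq.pull(capacity=200))
```

The higher-priority request is handed out first even though it was added later, and the remaining elements stay queued for the next agent with spare capacity.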

  9. The WorkQueue project - a task queue for the CMS workload management system

    International Nuclear Information System (INIS)

    Ryu, S; Wakefield, S

    2012-01-01

    We present the development and first experience of a new component (termed WorkQueue) in the CMS workload management system. This component provides a link between a global request system (Request Manager) and agents (WMAgents) which process requests at compute and storage resources (known as sites). These requests typically consist of creation or processing of a data sample (possibly terabytes in size). Unlike the standard concept of a task queue, the WorkQueue does not contain fully resolved work units (known typically as jobs in HEP). This would require the WorkQueue to run computationally heavy algorithms that are better suited to run in the WMAgents. Instead the request specifies an algorithm that the WorkQueue uses to split the request into reasonable size chunks (known as elements). An advantage of performing lazy evaluation of an element is that expanding datasets can be accommodated by having job details resolved as late as possible. The WorkQueue architecture consists of a global WorkQueue which obtains requests from the request system, expands them and forms an element ordering based on the request priority. Each WMAgent contains a local WorkQueue which buffers work close to the agent, this overcomes temporary unavailability of the global WorkQueue and reduces latency for an agent to begin processing. Elements are pulled from the global WorkQueue to the local WorkQueue and into the WMAgent based on the estimate of the amount of work within the element and the resources available to the agent. WorkQueue is based on CouchDB, a document oriented NoSQL database. The WorkQueue uses the features of CouchDB (map/reduce views and bi-directional replication between distributed instances) to provide a scalable distributed system for managing large queues of work. The project described here represents an improvement over the old approach to workload management in CMS which involved individual operators feeding requests into agents. This new approach allows for a

  10. The WorkQueue project: A task queue for the CMS workload management system

    Energy Technology Data Exchange (ETDEWEB)

    Ryu, S. [Fermilab; Wakefield, Stuart [Imperial Coll., London

    2012-01-01

    We present the development and first experience of a new component (termed WorkQueue) in the CMS workload management system. This component provides a link between a global request system (Request Manager) and agents (WMAgents) which process requests at compute and storage resources (known as sites). These requests typically consist of creation or processing of a data sample (possibly terabytes in size). Unlike the standard concept of a task queue, the WorkQueue does not contain fully resolved work units (known typically as jobs in HEP). This would require the WorkQueue to run computationally heavy algorithms that are better suited to run in the WMAgents. Instead the request specifies an algorithm that the WorkQueue uses to split the request into reasonable size chunks (known as elements). An advantage of performing lazy evaluation of an element is that expanding datasets can be accommodated by having job details resolved as late as possible. The WorkQueue architecture consists of a global WorkQueue which obtains requests from the request system, expands them and forms an element ordering based on the request priority. Each WMAgent contains a local WorkQueue which buffers work close to the agent, this overcomes temporary unavailability of the global WorkQueue and reduces latency for an agent to begin processing. Elements are pulled from the global WorkQueue to the local WorkQueue and into the WMAgent based on the estimate of the amount of work within the element and the resources available to the agent. WorkQueue is based on CouchDB, a document oriented NoSQL database. The WorkQueue uses the features of CouchDB (map/reduce views and bi-directional replication between distributed instances) to provide a scalable distributed system for managing large queues of work. The project described here represents an improvement over the old approach to workload management in CMS which involved individual operators feeding requests into agents. This new approach allows for a

  11. Effects of Metric Change on Workers’ Tools and Training.

    Science.gov (United States)

    1981-07-01

    understanding of the metric system, and particularly a lack of fluency in converting customary measurements to metric measurements, may increase the... assembly, installing, and repairing occupations; 84 Painting, plastering, waterproofing, cementing, and related occupations; 85 Excavating, grading, paving, and related occupations; 86 Construction occupations, n.e.c.; 89 Structural work

  12. Predicting class testability using object-oriented metrics

    NARCIS (Netherlands)

    M. Bruntink (Magiel); A. van Deursen (Arie)

    2004-01-01

    In this paper we investigate factors of the testability of object-oriented software systems. The starting point is given by a study of the literature to obtain both an initial model of testability and existing OO metrics related to testability. Subsequently, these metrics are evaluated

  13. Rice by Weight, Other Produce by Bulk, and Snared Iguanas at So Much Per One. A Talk on Measurement Standards and on Metric Conversion.

    Science.gov (United States)

    Allen, Harold Don

    This script for a short radio broadcast on measurement standards and metric conversion begins by tracing the rise of the metric system in the international marketplace. Metric units are identified and briefly explained. Arguments for conversion to metric measures are presented. The history of the development and acceptance of the metric system is…

  14. Multimetric indices: How many metrics?

    Science.gov (United States)

    Multimetric indices (MMIs) often include 5 to 15 metrics, each representing a different attribute of assemblage condition, such as species diversity, tolerant taxa, and nonnative taxa. Is there an optimal number of metrics for MMIs? To explore this question, I created 1000 9-met...
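    A common way an MMI combines its component metrics is to rescale each to a fixed scoring range (oriented so that higher means closer to reference condition) and average; a minimal sketch under that assumption, with invented metric names and ranges:

```python
def mmi_score(metric_values, metric_ranges):
    """Combine assemblage metrics into a multimetric index by
    rescaling each metric to 0-10 (10 = reference condition) and
    averaging. metric_ranges maps metric name -> (worst, best),
    so metrics where lower is better just swap the endpoints."""
    scores = []
    for name, value in metric_values.items():
        worst, best = metric_ranges[name]
        frac = (value - worst) / (best - worst)
        scores.append(10 * min(max(frac, 0.0), 1.0))  # clamp to scoring range
    return sum(scores) / len(scores)

ranges = {"taxa_richness": (5, 30), "pct_tolerant": (80, 0)}  # tolerant: lower is better
site = {"taxa_richness": 20, "pct_tolerant": 20}
print(mmi_score(site, ranges))
```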

  15. Extending cosmology: the metric approach

    OpenAIRE

    Mendoza, S.

    2012-01-01

    Comment: 2012, Extending Cosmology: The Metric Approach, Open Questions in Cosmology; Review article for an Intech "Open questions in cosmology" book chapter (19 pages, 3 figures). Available from: http://www.intechopen.com/books/open-questions-in-cosmology/extending-cosmology-the-metric-approach

  16. Numerical Calabi-Yau metrics

    International Nuclear Information System (INIS)

    Douglas, Michael R.; Karp, Robert L.; Lukic, Sergio; Reinbacher, Rene

    2008-01-01

    We develop numerical methods for approximating Ricci flat metrics on Calabi-Yau hypersurfaces in projective spaces. Our approach is based on finding balanced metrics and builds on recent theoretical work by Donaldson. We illustrate our methods in detail for a one parameter family of quintics. We also suggest several ways to extend our results

  17. Weyl metrics and wormholes

    Energy Technology Data Exchange (ETDEWEB)

    Gibbons, Gary W. [DAMTP, University of Cambridge, Wilberforce Road, Cambridge, CB3 0WA U.K. (United Kingdom); Volkov, Mikhail S., E-mail: gwg1@cam.ac.uk, E-mail: volkov@lmpt.univ-tours.fr [Laboratoire de Mathématiques et Physique Théorique, LMPT CNRS—UMR 7350, Université de Tours, Parc de Grandmont, Tours, 37200 France (France)

    2017-05-01

    We study solutions obtained via applying dualities and complexifications to the vacuum Weyl metrics generated by massive rods and by point masses. Rescaling them and extending to complex parameter values yields axially symmetric vacuum solutions containing singularities along circles that can be viewed as singular matter sources. These solutions have wormhole topology with several asymptotic regions interconnected by throats and their sources can be viewed as thin rings of negative tension encircling the throats. For a particular value of the ring tension the geometry becomes exactly flat although the topology remains non-trivial, so that the rings literally produce holes in flat space. To create a single ring wormhole of one metre radius one needs a negative energy equivalent to the mass of Jupiter. Further duality transformations dress the rings with the scalar field, either conventional or phantom. This gives rise to large classes of static, axially symmetric solutions, presumably including all previously known solutions for a gravity-coupled massless scalar field, as for example the spherically symmetric Bronnikov-Ellis wormholes with phantom scalar. The multi-wormholes contain infinite struts everywhere at the symmetry axes, apart from solutions with locally flat geometry.
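The Jupiter-mass figure quoted for a one-metre ring can be checked with back-of-envelope arithmetic. The sketch below assumes the flat-geometry ring carries tension T = −c⁴/(4G) and that its energy is tension times circumference; both the tension formula and the constants are our assumptions for the check, not values taken from the paper.

```python
import math

G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8     # speed of light, m/s

# Assumed flat-ring tension T = -c^4 / (4G); energy = tension * circumference
R = 1.0                                   # one-metre ring radius
E = -(c**4 / (4 * G)) * 2 * math.pi * R   # joules (negative)
m_equiv = abs(E) / c**2                   # mass equivalent, kg

M_jupiter = 1.898e27  # kg
```

With these numbers m_equiv comes out near 2.1 × 10²⁷ kg, within roughly 15% of Jupiter's mass, consistent with the abstract's claim.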

  18. Task-based data-acquisition optimization for sparse image reconstruction systems

    Science.gov (United States)

    Chen, Yujia; Lou, Yang; Kupinski, Matthew A.; Anastasio, Mark A.

    2017-03-01

    Conventional wisdom dictates that imaging hardware should be optimized by use of an ideal observer (IO) that exploits full statistical knowledge of the class of objects to be imaged, without consideration of the reconstruction method to be employed. However, accurate and tractable models of the complete object statistics are often difficult to determine in practice. Moreover, in imaging systems that employ compressive sensing concepts, imaging hardware and (sparse) image reconstruction are innately coupled technologies. We have previously proposed a sparsity-driven ideal observer (SDIO) that can be employed to optimize hardware by use of a stochastic object model that describes object sparsity. The SDIO and sparse reconstruction method can therefore be "matched" in the sense that they both utilize the same statistical information regarding the class of objects to be imaged. To efficiently compute SDIO performance, the posterior distribution is estimated by use of computational tools developed recently for variational Bayesian inference. Subsequently, the SDIO test statistic can be computed semi-analytically. The advantages of employing the SDIO instead of a Hotelling observer are systematically demonstrated in case studies in which magnetic resonance imaging (MRI) data acquisition schemes are optimized for signal detection tasks.

  19. Energy-Based Metrics for Arthroscopic Skills Assessment.

    Science.gov (United States)

    Poursartip, Behnaz; LeBel, Marie-Eve; McCracken, Laura C; Escoto, Abelardo; Patel, Rajni V; Naish, Michael D; Trejos, Ana Luisa

    2017-08-05

    Minimally invasive skills assessment methods are essential in developing efficient surgical simulators and implementing consistent skills evaluation. Although numerous methods have been investigated in the literature, there is still a need to further improve the accuracy of surgical skills assessment. Energy expenditure can be an indication of motor skills proficiency. The goals of this study are to develop objective metrics based on energy expenditure, normalize these metrics, and investigate classifying trainees using these metrics. To this end, different forms of energy consisting of mechanical energy and work were considered and their values were divided by the related value of an ideal performance to develop normalized metrics. These metrics were used as inputs for various machine learning algorithms including support vector machines (SVM) and neural networks (NNs) for classification. The accuracy of the combination of the normalized energy-based metrics with these classifiers was evaluated through a leave-one-subject-out cross-validation. The proposed method was validated using 26 subjects at two experience levels (novices and experts) in three arthroscopic tasks. The results showed that there are statistically significant differences between novices and experts for almost all of the normalized energy-based metrics. The accuracy of classification using SVM and NN methods was between 70% and 95% for the various tasks. The results show that the normalized energy-based metrics and their combination with SVM and NN classifiers are capable of providing accurate classification of trainees. The assessment method proposed in this study can enhance surgical training by providing appropriate feedback to trainees about their level of expertise and can be used in the evaluation of proficiency.
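The idea of normalizing an energy measure by that of an ideal performance can be illustrated with a toy trajectory metric. The functions below are a hypothetical stand-in (time-integrated kinetic energy of a sampled tool path); the paper's actual energy terms and its SVM/NN classifiers are not reproduced here.

```python
import math

def kinetic_energy(positions, dt, mass=1.0):
    """Time-integrated kinetic energy along a sampled tool path (an effort proxy)."""
    e = 0.0
    for p0, p1 in zip(positions, positions[1:]):
        v = math.dist(p0, p1) / dt       # finite-difference speed between samples
        e += 0.5 * mass * v * v * dt
    return e

def normalized_energy_metric(trainee_path, ideal_path, dt=1.0):
    """Divide by the energy of an idealized performance, as the abstract describes."""
    return kinetic_energy(trainee_path, dt) / kinetic_energy(ideal_path, dt)
```

A wiggly trainee path scores above 1.0 against a straight ideal path; values near 1.0 would indicate expert-like economy of motion.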

  20. Metric regularity and subdifferential calculus

    International Nuclear Information System (INIS)

    Ioffe, A D

    2000-01-01

    The theory of metric regularity is an extension of two classical results: the Lyusternik tangent space theorem and the Graves surjection theorem. Developments in non-smooth analysis in the 1980s and 1990s paved the way for a number of far-reaching extensions of these results. It was also well understood that the phenomena behind the results are of metric origin, not connected with any linear structure. At the same time it became clear that some basic hypotheses of the subdifferential calculus are closely connected with the metric regularity of certain set-valued maps. The survey is devoted to the metric theory of metric regularity and its connection with subdifferential calculus in Banach spaces

  1. Improving Learning Tasks for Mentally Handicapped People Using AmI Environments Based on Cyber-Physical Systems

    Directory of Open Access Journals (Sweden)

    Diego Martín

    2018-01-01

    Full Text Available This research paper presents a prototype, based on cyber-physical systems and ambient intelligence techniques, for improving learning tasks for mentally handicapped people. The whole system is composed of a worktable, a cyber-glove (both with several RFID and NFC detection zones), and an AmI software application for modeling and workflow guidance. The authors carried out a case study involving sixteen mentally handicapped people and three trainers. The experiment consisted of the execution of several tasks requiring the memorization of object movements using the approach presented in this paper. The results obtained were very interesting, indicating that this kind of solution is feasible and enables some types of mentally handicapped people to learn complex tasks. The paper ends with some lessons learned from the experimentation.

  2. Artificial Immune Systems as a Modern Tool for Solving Multi-Purpose Optimization Tasks in the Field of Logistics

    Directory of Open Access Journals (Sweden)

    Skitsko Volodymyr I.

    2017-03-01

    Full Text Available The article investigates various aspects of the functioning of artificial immune systems and their use in solving different tasks. The analysis of the literature showed that combinations of artificial immune systems now exist, in particular with genetic algorithms, the particle swarm optimization method, artificial neural networks, etc., for solving different tasks; however, little attention has been paid to solving economic tasks. The article presents the basic terminology of artificial immune systems; describes the steps of the clonal selection algorithm; gives a brief description of the negative selection algorithm, the immune network algorithm and the dendritic algorithm; formulates conceptual aspects of the use of an artificial immune system for solving multi-purpose optimization problems; and describes an example of solving a problem in the field of logistics. Artificial immune systems, as a means of solving various weakly structured, multi-criteria and multi-purpose economic tasks, in particular in the sphere of logistics, are a promising tool that requires further research. It is therefore advisable to focus future work on applying existing immune algorithms to various economic problems.

  3. The role of domain-general frontal systems in language comprehension: evidence from dual-task interference and semantic ambiguity.

    Science.gov (United States)

    Rodd, Jennifer M; Johnsrude, Ingrid S; Davis, Matthew H

    2010-12-01

    Neuroimaging studies have shown that the left inferior frontal gyrus (LIFG) plays a critical role in semantic and syntactic aspects of speech comprehension. It appears to be recruited when listeners are required to select the appropriate meaning or syntactic role for words within a sentence. However, this region is also recruited during tasks not involving sentence materials, suggesting that the systems involved in processing ambiguous words within sentences are also recruited for more domain-general tasks that involve the selection of task-relevant information. We use a novel dual-task methodology to assess whether the cognitive system(s) that are engaged in selecting word meanings are also involved in non-sentential tasks. In Experiment 1, listeners were slower to decide whether a visually presented letter is in upper or lower case when the sentence that they are simultaneously listening to contains words with multiple meanings (homophones), compared to closely matched sentences without homophones. Experiment 2 indicates that this interference effect is not tied to the occurrence of the homophone itself, but rather occurs when listeners must reinterpret a sentence that was initially misparsed. These results suggest some overlap between the cognitive system involved in semantic disambiguation and the domain-general process of response selection required for the case-judgement task. This cognitive overlap may reflect neural overlap in the networks supporting these processes, and is consistent with the proposal that domain-general selection processes in inferior frontal regions are critical for language comprehension. Copyright © 2010 Elsevier Inc. All rights reserved.

  4. ITER task D316 (1996): design review of isotope separation system (WBS 3.2 B) and water detritiation system (WBS 3.2 E)

    International Nuclear Information System (INIS)

    Sood, S.K.; Fong, C.

    1997-05-01

    The design reviews performed on the ITER Isotope Separation System and the Water Detritiation System are summarized. The objectives of the task are: to produce a Design Description Document for the Feed Treatment and Vacuum System of the Water Detritiation System; to review the process system operation and control philosophy for the Water Detritiation System; and to review the equipment arrangement drawings where available. 1 fig., 3 refs

  5. Resource characterization and residuals remediation, Task 1.0: Air quality assessment and control, Task 2.0: Advanced power systems, Task 3.0: Advanced fuel forms and coproducts, Task 4.0

    Energy Technology Data Exchange (ETDEWEB)

    Hawthorne, S.B.; Timpe, R.C.; Hartman, J.H. [and others]

    1994-02-01

    This report addresses three subtasks related to the Resource Characterization and Residuals Remediation program: (1) sulfur forms in coal and their thermal transformations, (2) data resource evaluation and integration using GIS (Geographic Information Systems), and (3) supplementary research related to the Rocky Mountain 1 (RM1) UCG (Underground Coal Gasification) test program.

  6. Sustainable Energy Solutions Task 1.0: Networked Monitoring and Control of Small Interconnected Wind Energy Systems

    Energy Technology Data Exchange (ETDEWEB)

    Twomey, Janet (janet.twomey@wichita.edu) [Wichita State Univ., KS (United States)]

    2010-04-30

    This report presents accomplishments, results, and future work for one task of five in the Wichita State University Sustainable Energy Solutions Project: To develop a scale model laboratory distribution system for research into questions that arise from networked control and monitoring of low-wind energy systems connected to the AC distribution system. The lab models developed under this task are located in the Electric Power Quality Lab in the Engineering Research Building on the Wichita State University campus. The lab system consists of four parts: 1. A doubly-fed induction generator 2. A wind turbine emulator 3. A solar photovoltaic emulator, with battery energy storage 4. Distribution transformers, lines, and other components, and wireless and wired communications and control These lab elements will be interconnected and will function together to form a complete testbed for distributed resource monitoring and control strategies and smart grid applications testing. Development of the lab system will continue beyond this project.

  7. The SPAtial EFficiency metric (SPAEF): multiple-component evaluation of spatial patterns for optimization of hydrological models

    Science.gov (United States)

    Koch, Julian; Cüneyd Demirel, Mehmet; Stisen, Simon

    2018-05-01

    The process of model evaluation is not only an integral part of model development and calibration but also of paramount importance when communicating modelling results to the scientific community and stakeholders. The modelling community has a large and well-tested toolbox of metrics to evaluate temporal model performance. In contrast, spatial performance evaluation has not kept pace with the broad availability of spatial observations or with the sophisticated model codes simulating the spatial variability of complex hydrological processes. This study makes a contribution towards advancing spatial-pattern-oriented model calibration by rigorously testing a multiple-component performance metric. The promoted SPAtial EFficiency (SPAEF) metric reflects three equally weighted components: correlation, coefficient of variation and histogram overlap. This multiple-component approach is found to be advantageous in order to achieve the complex task of comparing spatial patterns. SPAEF, its three components individually and two alternative spatial performance metrics, i.e. connectivity analysis and fractions skill score, are applied in a spatial-pattern-oriented model calibration of a catchment model in Denmark. Results suggest the importance of multiple-component metrics because stand-alone metrics tend to fail to provide holistic pattern information. The three SPAEF components are found to be independent, which allows them to complement each other in a meaningful way. In order to optimally exploit spatial observations made available by remote sensing platforms, this study suggests applying bias-insensitive metrics which further allow for a comparison of variables which are related but may differ in unit. This study applies SPAEF in the hydrological context using the mesoscale Hydrologic Model (mHM; version 5.8), but we see great potential across disciplines related to spatially distributed earth system modelling.
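The published SPAEF formula combines the three components as SPAEF = 1 − √((α−1)² + (β−1)² + (γ−1)²), with α the Pearson correlation between observed and simulated patterns, β the ratio of their coefficients of variation, and γ the overlap of histograms of the z-scored patterns. A minimal pure-Python sketch; the bin count and equal-width binning scheme here are simplifying assumptions:

```python
import math

def _mean(v):
    return sum(v) / len(v)

def _std(v):
    m = _mean(v)
    return math.sqrt(sum((x - m) ** 2 for x in v) / len(v))

def pearson(a, b):
    ma, mb = _mean(a), _mean(b)
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    return cov / (len(a) * _std(a) * _std(b))

def zscore(v):
    m, s = _mean(v), _std(v)
    return [(x - m) / s for x in v]

def hist_overlap(a, b, bins=10):
    # shared bin edges over the pooled values; overlap = sum of bin-wise minima
    lo, hi = min(a + b), max(a + b)
    w = (hi - lo) / bins or 1.0
    def hist(v):
        h = [0] * bins
        for x in v:
            h[min(int((x - lo) / w), bins - 1)] += 1
        return h
    ha, hb = hist(a), hist(b)
    return sum(min(x, y) for x, y in zip(ha, hb)) / sum(ha)

def spaef(obs, sim, bins=10):
    alpha = pearson(obs, sim)                                    # pattern correlation
    beta = (_std(sim) / _mean(sim)) / (_std(obs) / _mean(obs))   # variability (CV) ratio
    gamma = hist_overlap(zscore(obs), zscore(sim), bins)         # histogram match
    return 1 - math.sqrt((alpha - 1) ** 2 + (beta - 1) ** 2 + (gamma - 1) ** 2)
```

A perfect match scores 1.0; each component degrades the score independently, which is the sense in which the three components complement each other.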

  8. Training conquers multitasking costs by dividing task representations in the frontoparietal-subcortical system

    OpenAIRE

    Garner, K. G.; Dux, Paul E.

    2015-01-01

    The problem of how the brain undertakes multiple tasks concurrently is one of the oldest in psychology and neuroscience. Although successful negotiation of the rich sensory world clearly requires the ongoing management of multiple tasks, humans show substantial multitasking impairments in the laboratory and everyday life. Fortunately, training facilitates multitasking. However, until now, the neural mechanisms driving this functional adaptation were not understood. Here, in a large-scale huma...

  9. Developing a Security Metrics Scorecard for Healthcare Organizations.

    Science.gov (United States)

    Elrefaey, Heba; Borycki, Elizabeth; Kushniruk, Andrea

    2015-01-01

    In healthcare, information security is a key aspect of protecting a patient's privacy and ensuring systems availability to support patient care. Security managers need to measure the performance of security systems and this can be achieved by using evidence-based metrics. In this paper, we describe the development of an evidence-based security metrics scorecard specific to healthcare organizations. Study participants were asked to comment on the usability and usefulness of a prototype of a security metrics scorecard that was developed based on current research in the area of general security metrics. Study findings revealed that scorecards need to be customized for the healthcare setting in order for the security information to be useful and usable in healthcare organizations. The study findings resulted in the development of a security metrics scorecard that matches the healthcare security experts' information requirements.

  10. Assessing Whether Students Seek Constructive Criticism: The Design of an Automated Feedback System for a Graphic Design Task

    Science.gov (United States)

    Cutumisu, Maria; Blair, Kristen P.; Chin, Doris B.; Schwartz, Daniel L.

    2017-01-01

    We introduce a choice-based assessment strategy that measures students' choices to seek constructive feedback and to revise their work. We present the feedback system of a game we designed to assess whether students choose positive or negative feedback and choose to revise their posters in the context of a poster design task, where they learn…

  11. Characteristic-Based, Task-Based, and Results-Based: Three Value Systems for Assessing Professionally Produced Technical Communication Products.

    Science.gov (United States)

    Carliner, Saul

    2003-01-01

    Notes that technical communicators have developed different methodologies for evaluating the effectiveness of their work, such as editing, usability testing, and determining the value added. Explains that at least three broad value systems underlie the assessment practices: characteristic-based, task-based, and results-based. Concludes that the…

  12. Automatic evaluation of task-focused parallel jaw gripper design

    DEFF Research Database (Denmark)

    Wolniakowski, Adam; Miatliuk, Konstantsin; Krüger, Norbert

    2014-01-01

    In this paper, we suggest gripper quality metrics that indicate the performance of a gripper given an object CAD model and a task description. Those, we argue, can be used in the design and selection of an appropriate gripper when the task is known. We present three different gripper metrics that...

  13. Comparing Resource Adequacy Metrics and Their Influence on Capacity Value: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Ibanez, E.; Milligan, M.

    2014-04-01

    Traditional probabilistic methods have been used to evaluate resource adequacy. The increasing presence of variable renewable generation in power systems presents a challenge to these methods because, unlike thermal units, variable renewable generation levels change over time as they are driven by meteorological events. Thus, capacity value calculations for these resources are often performed according to simple rules of thumb. This paper follows the recommendations of the North American Electric Reliability Corporation's Integration of Variable Generation Task Force to include variable generation in the calculation of resource adequacy and compares different reliability metrics. Examples are provided using the Western Interconnection footprint under different variable generation penetrations.
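The traditional probabilistic methods referred to here compute reliability indices such as the loss-of-load probability (LOLP) from unit outage statistics and load. A brute-force sketch for a handful of independent two-state thermal units follows; folding in variable generation (as a modification of the load series) is the harder part the task force addresses and is not shown.

```python
from itertools import product

def lolp(units, load):
    """Loss-of-load probability for independent two-state units.

    units: list of (capacity_mw, forced_outage_rate) pairs.
    load:  demand in MW.
    """
    p_loss = 0.0
    for states in product([0, 1], repeat=len(units)):  # 1 = unit available
        p, cap = 1.0, 0.0
        for (c_mw, f_or), up in zip(units, states):
            p *= (1 - f_or) if up else f_or
            cap += c_mw if up else 0.0
        if cap < load:
            p_loss += p
    return p_loss
```

Enumeration is exponential in the number of units; production tools instead build capacity-outage probability tables by convolution.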

  14. Task 5. Grid interconnection of building integrated and other dispersed photovoltaic power systems. International guideline for the certification of photovoltaic system components and grid-connected systems

    Energy Technology Data Exchange (ETDEWEB)

    Bower, W.

    2002-02-15

    This report for the International Energy Agency (IEA) made by Task 5 of the Photovoltaic Power Systems (PVPS) programme presents a guideline for the certification of photovoltaic system components and grid-connected systems. The mission of the Photovoltaic Power Systems Programme is to enhance the international collaboration efforts which accelerate the development and deployment of photovoltaic solar energy. Task 5 deals with issues concerning grid-interconnection and distributed PV power systems. This generic international guideline for the certification of photovoltaic system components and complete grid-connected photovoltaic systems describes a set of recommended methods and tests that may be used to verify the integrity of hardware and installations, compliance with applicable standards/codes and can be used to provide a measure of the performance of components or of entire systems. The guideline is to help ensure that photovoltaic installations are both safe for equipment as well as for personnel when used according to the applicable installation standards and codes. The guideline may be used in any country using the rules stipulated by the applicable standards and codes and by applying them to the guideline's recommended tests. This document uses examples for some tests but does not specify exact test set-ups, equipment accuracy, equipment manufacturers or calibration procedures.

  15. A Metric for Heterotic Moduli

    Science.gov (United States)

    Candelas, Philip; de la Ossa, Xenia; McOrist, Jock

    2017-12-01

    Heterotic vacua of string theory are realised, at large radius, by a compact threefold with vanishing first Chern class together with a choice of stable holomorphic vector bundle. These form a wide class of potentially realistic four-dimensional vacua of string theory. Despite all their phenomenological promise, there is little understanding of the metric on the moduli space of these. What is sought is the analogue of special geometry for these vacua. The metric on the moduli space is important in phenomenology as it normalises D-terms and Yukawa couplings. It is also of interest in mathematics, since it generalises the metric, first found by Kobayashi, on the space of gauge field connections, to a more general context. Here we construct this metric, correct to first order in α′, in two ways: first by postulating a metric that is invariant under background gauge transformations of the gauge field, and also by dimensionally reducing heterotic supergravity. These methods agree and the resulting metric is Kähler, as is required by supersymmetry. Checking the metric is Kähler is intricate and the anomaly cancellation equation for the H field plays an essential role. The Kähler potential nevertheless takes a remarkably simple form: it is the Kähler potential of special geometry with the Kähler form replaced by the α′-corrected hermitian form.

  16. Implications of Metric Choice for Common Applications of Readmission Metrics

    OpenAIRE

    Davies, Sheryl; Saynina, Olga; Schultz, Ellen; McDonald, Kathryn M; Baker, Laurence C

    2013-01-01

    Objective. To quantify the differential impact on hospital performance of three readmission metrics: all-cause readmission (ACR), 3M Potential Preventable Readmission (PPR), and Centers for Medicare and Medicaid 30-day readmission (CMS).

  17. Critical care providers refer to information tools less during communication tasks after a critical care clinical information system introduction.

    Science.gov (United States)

    Ballermann, Mark; Shaw, Nicola T; Mayes, Damon C; Gibney, R T Noel

    2011-01-01

    Electronic documentation methods may assist critical care providers with information management tasks in Intensive Care Units (ICUs). We conducted a quasi-experimental observational study to investigate patterns of information tool use by ICU physicians, nurses, and respiratory therapists during verbal communication tasks. Critical care providers used tools less at 3 months after the CCIS introduction. At 12 months, care providers referred to paper and permanent records, especially during shift changes. The results suggest potential areas of improvement for clinical information systems in assisting critical care providers in ensuring informational continuity around their patients.

  18. The task of official personal monitoring in Germany using electronic dosimetry systems

    International Nuclear Information System (INIS)

    Huebner, Stephan; Wahl, Wolfgang; Busch, Frank; Martini, Ekkehard

    2008-01-01

    Full text: Since the establishment of the first German personal monitoring services as competent measuring bodies in the year 1952, official personal dosimetry has been carried out solely with passive dosimeters such as film badges and RPL and TL dosimeters. On the other hand, electronic dosimeters are in use for operational purposes in some large institutions such as nuclear power plants, hospitals and industrial units. In most cases, these dosimeters are regulated by competent authorities. For more than 20 years, electronic dosimeters have proved themselves to be appropriate personal dosimeters. Since 2001, concepts for implementing electronic personal dosimeters into the official individual monitoring of occupationally exposed workers have been developed in different research projects. The EU market for personal dosimetry is changing into an open and competitive one, the number of outside workers is increasing, especially during the outages of nuclear power plants, and the customer landscape is becoming more and more heterogeneous. Facing these tasks of sustainable personal monitoring requires the introduction of modern electronic dosimeters into official monitoring. In doing so, the needed prompt exchange of dose data between different monitoring services, as well as between customers and the related monitoring service, can be guaranteed. In cooperation with industry, competent authorities and a research centre, a method for official dosimetry using electronic dosimetry systems was developed, realised and tested successfully by the three big monitoring services of Germany. These investigations are supported by the German Federal Ministry for the Environment, Nature Conservation and Nuclear Safety. For this purpose a network between customers and monitoring services was built up in order to monitor people who work in different places related to different measuring bodies within a single period of surveillance. (author)

  19. Issues in Benchmark Metric Selection

    Science.gov (United States)

    Crolotte, Alain

    It is true that a metric can influence a benchmark but will esoteric metrics create more problems than they will solve? We answer this question affirmatively by examining the case of the TPC-D metric which used the much debated geometric mean for the single-stream test. We will show how a simple choice influenced the benchmark and its conduct and, to some extent, DBMS development. After examining other alternatives our conclusion is that the “real” measure for a decision-support benchmark is the arithmetic mean.
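The debate is easy to reproduce: for query run times with the same total (and hence the same arithmetic mean), the geometric mean rewards a profile containing one very fast query. A small illustration with hypothetical timings:

```python
import math

def arithmetic_mean(times):
    return sum(times) / len(times)

def geometric_mean(times):
    # exp of the mean log; avoids overflow on long streams of values
    return math.exp(sum(math.log(t) for t in times) / len(times))

# Two hypothetical query-time profiles (seconds) with the same 40 s total:
balanced = [10.0, 10.0, 10.0, 10.0]
skewed   = [0.1, 9.9, 10.0, 20.0]   # one very fast outlier query
```

Both profiles have an arithmetic mean of 10 s, but the geometric mean of the skewed profile drops below 4 s: a vendor could improve the reported figure by tuning one already-fast query, which is the distortion an arithmetic mean avoids.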

  20. Background metric in supergravity theories

    International Nuclear Information System (INIS)

    Yoneya, T.

    1978-01-01

    In supergravity theories, we investigate the conformal anomaly of the path-integral determinant and the problem of fermion zero modes in the presence of a nontrivial background metric. Except in SO(3) -invariant supergravity, there are nonvanishing conformal anomalies. As a consequence, amplitudes around the nontrivial background metric contain unpredictable arbitrariness. The fermion zero modes which are explicitly constructed for the Euclidean Schwarzschild metric are interpreted as an indication of the supersymmetric multiplet structure of a black hole. The degree of degeneracy of a black hole is 2/sup 4n/ in SO(n) supergravity

  1. Generalized Painleve-Gullstrand metrics

    Energy Technology Data Exchange (ETDEWEB)

    Lin Chunyu [Department of Physics, National Cheng Kung University, Tainan 70101, Taiwan (China)], E-mail: l2891112@mail.ncku.edu.tw; Soo Chopin [Department of Physics, National Cheng Kung University, Tainan 70101, Taiwan (China)], E-mail: cpsoo@mail.ncku.edu.tw

    2009-02-02

    An obstruction to the implementation of spatially flat Painleve-Gullstrand (PG) slicings is demonstrated, and explicitly discussed for Reissner-Nordstroem and Schwarzschild-anti-deSitter spacetimes. Generalizations of PG slicings which are not spatially flat but which remain regular at the horizons are introduced. These metrics can be obtained from standard spherically symmetric metrics by physical Lorentz boosts. With these generalized PG metrics, problematic contributions to the imaginary part of the action in the Parikh-Wilczek derivation of Hawking radiation due to the obstruction can be avoided.
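For orientation, the standard spatially flat PG slicing of the Schwarzschild metric (in units G = c = 1) that these generalizations start from can be written as

```latex
ds^2 = -\left(1 - \frac{2M}{r}\right)dt^2
       + 2\sqrt{\frac{2M}{r}}\,dt\,dr + dr^2 + r^2\,d\Omega^2
     = -dt^2 + \left(dr + \sqrt{\frac{2M}{r}}\,dt\right)^2 + r^2\,d\Omega^2 ,
```

where the constant-t slices carry the flat metric dr² + r²dΩ². It is exactly this spatial flatness that fails to extend to the Reissner-Nordström and Schwarzschild-anti-de Sitter cases, motivating the non-flat generalizations of the abstract.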

  2. Daylight metrics and energy savings

    Energy Technology Data Exchange (ETDEWEB)

    Mardaljevic, John; Heschong, Lisa; Lee, Eleanor

    2009-12-31

    The drive towards sustainable, low-energy buildings has increased the need for simple, yet accurate methods to evaluate whether a daylit building meets minimum standards for energy and human comfort performance. Current metrics do not account for the temporal and spatial aspects of daylight, nor of occupants comfort or interventions. This paper reviews the historical basis of current compliance methods for achieving daylit buildings, proposes a technical basis for development of better metrics, and provides two case study examples to stimulate dialogue on how metrics can be applied in a practical, real-world context.

  3. Three propositions on why characteristics of performance management systems converge across policy areas with different levels of task complexity

    DEFF Research Database (Denmark)

    Bjørnholt, Bente; Lindholst, Andrej Christian; Agger Nielsen, Jeppe

    2014-01-01

    This article investigates the differences and similarities between performance management systems across public services. We offer three propositions as to why the characteristics of performance management systems may still converge across policy areas in the public sector with different levels of task complexity amidst a lack of formal and overarching, government-wide policies. We advance our propositions from a case study comparing the characteristics of performance management systems across social services (eldercare) and technical services (park services) in Denmark. Contrary to expectations for divergence due to differences in task complexity, the characteristics of performance management systems in the two policy areas are observed to converge. On the basis of the case study, we propose that convergence has occurred due to 1) similarities in policy-specific reforms, 2) institutional pressures, and 3...

  4. Assessing Software Quality Through Visualised Cohesion Metrics

    Directory of Open Access Journals (Sweden)

    Timothy Shih

    2001-05-01

    Cohesion is one of the most important factors for software quality, as well as for maintainability, reliability and reusability. Module cohesion is defined as a quality attribute that measures the singleness of purpose of a module; a module with poor cohesion can be a serious obstacle to system quality. To design software of good quality, software managers and engineers need cohesion metrics to measure and produce desirable software, since highly cohesive software is a desirable construction. In this paper, we propose a function-oriented cohesion metric based on the analysis of live variables, live spans and the visualization of the processing-element dependency graph. We measure six typical cohesion examples as our experiments and justification. The result is a well-defined, well-normalized, well-visualized and well-tested cohesion metric that indicates, and thus helps enhance, software cohesion strength. Furthermore, this cohesion metric can easily be incorporated into software CASE tools to help software engineers improve software quality.
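The paper's live-variable analysis is not reproduced here, but the underlying idea of scoring a module by how strongly its parts share data can be sketched with a much simpler stand-in: model each processing element as the set of variables it touches, and score cohesion as the fraction of element pairs that share at least one variable. The `pairwise_cohesion` function and the example sets below are illustrative assumptions, not the metric defined in the paper.

```python
from itertools import combinations

def pairwise_cohesion(elements):
    """elements: list of sets of variable names used by each processing element.

    Returns the fraction of element pairs sharing at least one variable.
    """
    pairs = list(combinations(elements, 2))
    if not pairs:
        return 1.0  # a single element is trivially cohesive
    linked = sum(1 for a, b in pairs if a & b)
    return linked / len(pairs)

# A cohesive function: every element works on the shared variable 'total'.
high = [{"total", "i"}, {"total", "x"}, {"total"}]
# A fragmented function: elements touch unrelated variables.
low = [{"a"}, {"b"}, {"c"}]

print(pairwise_cohesion(high))  # 1.0
print(pairwise_cohesion(low))   # 0.0
```

A real implementation would derive the variable sets from data-flow analysis (live variables and live spans) rather than hand-written sets.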

  5. Standardised metrics for global surgical surveillance.

    Science.gov (United States)

    Weiser, Thomas G; Makary, Martin A; Haynes, Alex B; Dziekan, Gerald; Berry, William R; Gawande, Atul A

    2009-09-26

    Public health surveillance relies on standardised metrics to evaluate disease burden and health system performance. Such metrics have not been developed for surgical services despite increasing volume, substantial cost, and high rates of death and disability associated with surgery. The Safe Surgery Saves Lives initiative of WHO's Patient Safety Programme has developed standardised public health metrics for surgical care that are applicable worldwide. We assembled an international panel of experts to develop and define metrics for measuring the magnitude and effect of surgical care in a population, while taking into account economic feasibility and practicability. This panel recommended six measures for assessing surgical services at a national level: number of operating rooms, number of operations, number of accredited surgeons, number of accredited anaesthesia professionals, day-of-surgery death ratio, and postoperative in-hospital death ratio. We assessed the feasibility of gathering such statistics at eight diverse hospitals in eight countries and incorporated them into the WHO Guidelines for Safe Surgery, in which methods for data collection, analysis, and reporting are outlined.
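Two of the six recommended measures are simple ratios. As a minimal sketch, they can be computed as deaths divided by operations; the exact WHO denominators and inclusion criteria are not given in this abstract, so the definitions and figures below are assumptions for illustration only.

```python
def day_of_surgery_death_ratio(deaths_on_day_of_surgery, operations):
    """Assumed definition: deaths on the day of surgery per operation performed."""
    return deaths_on_day_of_surgery / operations

def postop_inhospital_death_ratio(postop_deaths_before_discharge, operations):
    """Assumed definition: postoperative deaths before discharge per operation."""
    return postop_deaths_before_discharge / operations

# Hypothetical annual figures for one hospital:
ops = 12_500
print(f"{day_of_surgery_death_ratio(25, ops):.4%}")    # 0.2000%
print(f"{postop_inhospital_death_ratio(110, ops):.4%}")  # 0.8800%
```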

  6. Sigma Routing Metric for RPL Protocol

    Directory of Open Access Journals (Sweden)

    Paul Sanmartin

    2018-04-01

    This paper presents the adaptation of a specific metric for the RPL protocol in the objective function MRHOF. Among the functions standardized by the IETF we find OF0, which is based on the minimum hop count, as well as MRHOF, which is based on the Expected Transmission Count (ETX). However, when the network becomes denser or the number of nodes increases, both OF0 and MRHOF introduce long hops, which can generate a bottleneck that restricts the network. The adaptation is proposed to optimize both OFs through a new routing metric. To solve the above problem, the metrics of the minimum number of hops and the ETX are combined by designing a new routing metric called SIGMA-ETX, in which the best route is calculated using the standard deviation of the ETX values between each node, as opposed to working with the average ETX along the route. This method ensures better routing performance in dense sensor networks. The simulations are done with the Cooja simulator, based on the Contiki operating system, and show that the proposed optimization outperforms both OF0 and MRHOF by a wide margin in terms of network latency, packet delivery ratio, lifetime, and power consumption.
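The core idea described above — ranking a route by the spread of its per-hop ETX values rather than their mean — can be sketched in a few lines. This is a hedged illustration of the SIGMA-ETX concept only, not the metric as specified in the paper or as it would run inside an RPL implementation.

```python
import statistics

def sigma_etx(route_etx):
    """route_etx: per-hop Expected Transmission Count values along one route.

    Lower is better: a small standard deviation means uniformly good links.
    """
    return statistics.pstdev(route_etx)

# Two hypothetical routes with the same mean ETX (2.0):
uniform = [2.0, 2.0, 2.0]   # consistent links
spiky   = [1.0, 1.0, 4.0]   # one very lossy hop

best = min([uniform, spiky], key=sigma_etx)
print(best)  # [2.0, 2.0, 2.0] — the uniform route wins despite the equal mean
```

An average-ETX ranking would consider the two routes equivalent; the standard deviation penalizes the route containing the lossy hop.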

  7. Next-Generation Metrics: Responsible Metrics & Evaluation for Open Science

    Energy Technology Data Exchange (ETDEWEB)

    Wilsdon, J.; Bar-Ilan, J.; Peters, I.; Wouters, P.

    2016-07-01

    Metrics evoke a mixed reaction from the research community. A commitment to using data to inform decisions makes some enthusiastic about the prospect of granular, real-time analysis of research and its wider impacts. Yet we only have to look at the blunt use of metrics such as journal impact factors, h-indices and grant income targets to be reminded of the pitfalls. Some of the most precious qualities of academic culture resist simple quantification, and individual indicators often struggle to do justice to the richness and plurality of research. Too often, poorly designed evaluation criteria are “dominating minds, distorting behaviour and determining careers” (Lawrence, 2007). Metrics hold real power: they are constitutive of values, identities and livelihoods. How to exercise that power to more positive ends has been the focus of several recent and complementary initiatives, including the San Francisco Declaration on Research Assessment (DORA), the Leiden Manifesto and The Metric Tide (a UK government review of the role of metrics in research management and assessment). Building on these initiatives, the European Commission, under its new Open Science Policy Platform, is now looking to develop a framework for responsible metrics for research management and evaluation, which can be incorporated into the successor framework to Horizon 2020. (Author)

  8. The Glasgow Parallel Reduction Machine: Programming Shared-memory Many-core Systems using Parallel Task Composition

    Directory of Open Access Journals (Sweden)

    Ashkan Tousimojarad

    2013-12-01

    We present the Glasgow Parallel Reduction Machine (GPRM), a novel, flexible framework for parallel task-composition-based many-core programming. We allow the programmer to structure programs into task code, written as C++ classes, and communication code, written in a restricted subset of C++ with functional semantics and parallel evaluation. In this paper we discuss the GPRM, the virtual machine framework that enables the parallel task composition approach. We focus the discussion on GPIR, the functional language used as the intermediate representation of the bytecode running on the GPRM. Using examples in this language we show the flexibility and power of our task composition framework. We demonstrate the potential using an implementation of a merge sort algorithm on a 64-core Tilera processor, as well as on a conventional Intel quad-core processor and an AMD 48-core processor system. We also compare our framework with OpenMP tasks in a parallel pointer-chasing algorithm running on the Tilera processor. Our results show that the GPRM programs outperform the corresponding OpenMP codes on all test platforms, and can greatly facilitate the writing of parallel programs, in particular non-data-parallel algorithms such as reductions.
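GPRM itself composes C++ task classes via its GPIR intermediate representation; none of that is reproduced here. As a language-agnostic sketch of the same idea — structuring a reduction as independent tasks evaluated in parallel — here is a merge sort whose two halves are submitted as tasks to a thread pool. The recursion depth cap and pool size are illustrative choices, not GPRM parameters.

```python
from concurrent.futures import ThreadPoolExecutor

def merge(left, right):
    """Merge two sorted lists into one sorted list."""
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]

def merge_sort(xs, pool=None, depth=2):
    if len(xs) <= 1:
        return xs
    mid = len(xs) // 2
    if pool is not None and depth > 0:
        # Parallel task composition: each half is an independent task.
        l = pool.submit(merge_sort, xs[:mid], pool, depth - 1)
        r = pool.submit(merge_sort, xs[mid:], pool, depth - 1)
        return merge(l.result(), r.result())
    return merge(merge_sort(xs[:mid]), merge_sort(xs[mid:]))

# max_workers is sized so all in-flight tasks (2 + 4 at depth=2) get a thread.
with ThreadPoolExecutor(max_workers=8) as pool:
    print(merge_sort([5, 3, 8, 1, 9, 2], pool))  # [1, 2, 3, 5, 8, 9]
```

The depth cap matters: submitting every recursive level to a bounded pool while blocking on `result()` can deadlock, which is one reason dedicated runtimes like GPRM manage task scheduling themselves.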

  9. Scale-dependent effects of land cover on water physico-chemistry and diatom-based metrics in a major river system, the Adour-Garonne basin (South Western France)

    International Nuclear Information System (INIS)

    Tudesque, Loïc; Tisseuil, Clément; Lek, Sovan

    2014-01-01

    The scale dependence of ecological phenomena remains a central issue in ecology. In aquatic ecology particularly, considering the appropriate spatial scale when assessing the effects of landscape factors on stream condition is critical. In this context, our study aimed at assessing the relationships between multi-spatial-scale land cover patterns and a variety of water quality and diatom metrics measured at the stream-reach level. This investigation was conducted in a major European river system, the Adour-Garonne river basin, characterized by a wide range of ecological conditions. Redundancy analysis (RDA) and variance partitioning techniques were used to disentangle the different relationships between land cover, water chemistry and diatom metrics. Our results revealed a top-down “cascade effect” indirectly linking diatom metrics to land cover patterns through water physico-chemistry, which occurred at the largest spatial scales. In general, the strength of the relationships between land cover, physico-chemistry and diatoms was shown to increase with spatial scale, from the local to the basin scale, emphasizing the importance of continuous processes of accumulation throughout the river gradient. Unexpectedly, we established that the influence of land cover on the diatom metric was of primary importance both at the basin and the local scale, as a result of discontinuous but not necessarily antagonistic processes. The most detailed spatial grain of the Corine land cover classification appeared to be the most relevant for relating land cover to water chemistry and diatoms. Our findings provide suitable information to improve the implementation of effective diatom-based monitoring programs, especially within the scope of the European Water Framework Directive. - Highlights: •The spatial scale dependence of the “cascade effect” in a river system has been demonstrated. •The strength of the relationships between land cover and diatoms through …

  10. Let's Make Metric Ice Cream

    Science.gov (United States)

    Zimmerman, Marianna

    1975-01-01

    Describes a classroom activity which involved sixth grade students in a learning situation including making ice cream, safety procedures in a science laboratory, calibrating a thermometer, using metric units of volume and mass. (EB)

  11. Classification Systems for Individual Differences in Multiple-task Performance and Subjective Estimates of Workload

    Science.gov (United States)

    Damos, D. L.

    1984-01-01

    Human factors practitioners often are concerned with mental workload in multiple-task situations. Investigations of these situations have demonstrated repeatedly that individuals differ in their subjective estimates of workload. These differences may be attributed in part to individual differences in definitions of workload. However, after allowing for differences in the definition of workload, there are still unexplained individual differences in workload ratings. The relation between individual differences in multiple-task performance, subjective estimates of workload, information processing abilities, and the Type A personality trait were examined.

  12. Experiential space is hardly metric

    Czech Academy of Sciences Publication Activity Database

    Šikl, Radovan; Šimeček, Michal; Lukavský, Jiří

    2008-01-01

    Roč. 2008, č. 37 (2008), s. 58-58 ISSN 0301-0066. [European Conference on Visual Perception. 24.08-28.08.2008, Utrecht] R&D Projects: GA ČR GA406/07/1676 Institutional research plan: CEZ:AV0Z70250504 Keywords : visual space perception * metric and non-metric perceptual judgments * ecological validity Subject RIV: AN - Psychology

  13. Phantom metrics with Killing spinors

    Directory of Open Access Journals (Sweden)

    W.A. Sabra

    2015-11-01

    Full Text Available We study metric solutions of Einstein–anti-Maxwell theory admitting Killing spinors. The analogue of the IWP metric which admits a space-like Killing vector is found and is expressed in terms of a complex function satisfying the wave equation in flat (2+1-dimensional space–time. As examples, electric and magnetic Kasner spaces are constructed by allowing the solution to depend only on the time coordinate. Euclidean solutions are also presented.

  14. Engineering task plan for development, fabrication, and deployment of nested, fixed depth fluidic sampling and at-tank analysis systems

    International Nuclear Information System (INIS)

    REICH, F.R.

    1999-01-01

    An engineering task plan was developed that presents the resources, responsibilities, and schedules for the development, testing, and deployment of the nested, fixed-depth fluidic sampling and at-tank analysis system. The sampling system, deployed in the privatization-contract double-shell feed tank, will provide waste samples for assuring the readiness of the tank for shipment to the privatization contractor for vitrification. The at-tank analysis system will provide "real-time" assessments of the sampled wastes' chemical and physical properties. These systems support the Hanford Phase 1B Privatization Contract.

  15. Description of the tasks of control room operators in German nuclear power plants and support possibilities by advanced computer systems

    International Nuclear Information System (INIS)

    Buettner, W.E.

    1984-01-01

    As nuclear power plants have developed, the instrumentation and control systems and the amount of information in the control room have increased substantially. Against this background, this report describes which operator tasks might be supported by advanced computer-aided systems, with the main emphasis on safety-related information and diagnosis facilities. Some of these systems under development may also be helpful in normal operating modes. As far as possible, recommendations for the realization and testing of such systems are made. (orig.) [de

  16. Predicting class testability using object-oriented metrics

    OpenAIRE

    Bruntink, Magiel; Deursen, Arie

    2004-01-01

    In this paper we investigate factors of the testability of object-oriented software systems. The starting point is given by a study of the literature to obtain both an initial model of testability and existing OO metrics related to testability. Subsequently, these metrics are evaluated by means of two case studies of large Java systems for which JUnit test cases exist. The goal of this paper is to define and evaluate a set of metrics that can be used to assess the testability of t...

  17. Task 5. Grid interconnection of building integrated and other dispersed photovoltaic power systems. Grid-connected photovoltaic power systems: power value and capacity value of PV systems

    Energy Technology Data Exchange (ETDEWEB)

    Groppi, F.

    2002-02-15

    This report for the International Energy Agency (IEA), prepared by Task 5 of the Photovoltaic Power Systems (PVPS) programme, takes a look at the power value and capacity value of photovoltaic power systems. The mission of the Photovoltaic Power Systems Programme is to enhance the international collaboration efforts which accelerate the development and deployment of photovoltaic solar energy. Task 5 deals with issues concerning grid interconnection and dispersed PV power systems. This report summarises the results of a study aimed at assessing the benefits that may be obtained when distributed PV production systems are present in a low-voltage grid. The basic aspects concerning the power value and those related to the capacity value are discussed. Data obtained from simulations are presented and discussed. A simple concept shows that great variation occurs if varying load patterns are taken into account. The power value of PV generation in the grid varies instant by instant, depending on the current level of power production and on the surrounding load conditions. Although the three case studies considered do not cover all the possibilities of coupling between PV and loads, the results obtained show a good differentiation among users with PV production, which leads to interesting conclusions.
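The point that the power value of PV generation varies instant by instant with production and load can be illustrated with a toy calculation: the same PV output is worth more when it offsets local consumption than when it is exported. The value model, function name, and prices below are assumptions for illustration, not the report's methodology.

```python
def pv_power_value(pv_kw, load_kw, retail_price, export_price):
    """Illustrative instantaneous value rate (currency/h) of PV output.

    Self-consumed energy is valued at the retail price it avoids buying;
    the surplus is valued at the (usually lower) export price.
    """
    self_consumed = min(pv_kw, load_kw)   # offsets purchased energy
    exported = pv_kw - self_consumed      # sold to the grid
    return self_consumed * retail_price + exported * export_price

# The same 3 kW of PV output under two load conditions (assumed prices):
print(round(pv_power_value(3.0, 5.0, retail_price=0.25, export_price=0.08), 2))  # 0.75
print(round(pv_power_value(3.0, 1.0, retail_price=0.25, export_price=0.08), 2))  # 0.41
```

With high surrounding load the full output displaces retail purchases; with low load most of it is exported at a lower value, which is the differentiation among users the report describes.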

  18. Defense Science Board Task Force Report: The Role of Autonomy in DoD Systems

    Science.gov (United States)

    2012-07-01

    kind of best practice that contributed to the success of DARPA's Grand Challenge program, which aimed to create long-distance, driverless ... performing on an urban task, to now the existence of a driverless vehicle. The key is to have a community pushing the bounds of interacting with the real

  19. Mutual interaction effects between discomfort and cognitive task performance in clothing systems

    NARCIS (Netherlands)

    Hartog, E.A. den; Koerhuis, C.L.

    2016-01-01

    The focus of this study was to establish a relationship between physical discomfort and performance. Eleven healthy male subjects participated in this pilot study. The subjects performed a 2-h protocol without and with significant thermal and mechanical discomfort. Various cognitive tasks were …

  20. A software platform to develop and execute kitting tasks on industrial cyber-physical systems

    DEFF Research Database (Denmark)

    Rovida, Francesco

    2017-01-01

    The current material handling infrastructure associated with manufacturing and assembly operations still registers a great presence of human work for highly repetitive tasks. A major contributing factor to the low level of automation is that current manufacturing robots have little or no understanding of … A platform where skills, similarly to computer or smartphone applications, can be installed and removed from heterogeneous robots in a few elementary steps.