WorldWideScience

Sample records for analysis task volume

  1. Task analysis of nuclear power plant control room crews. Volume 3. Task data forms

    International Nuclear Information System (INIS)

    A task analysis of nuclear power plant control room crews was performed by General Physics Corporation and BioTechnology, Inc., for the Office of Nuclear Regulatory Research. The task analysis methodology used in the project is discussed and compared to traditional task analysis and job analysis methods. The objective of the project was to conduct a crew task analysis that would provide data for evaluating six areas: (1) human engineering design of control rooms and retrofitting of current control rooms; (2) the numbers and types of control room operators needed, with requisite skills and knowledge; (3) operator qualification and training requirements; (4) normal, off-normal, and emergency operating procedures; (5) job performance aids; and (6) communications. The data collection approach focused on a generic structural framework for assembling the multitude of task data that were observed. Control room crew task data were observed and recorded within the context of an operating sequence. The data collection was conducted at eight power plant sites (in simulators and/or in control rooms) by teams comprising human factors and operations personnel. Plants were sampled according to NSSS vendor, vintage, simulator availability, architect-engineer, and control room configuration. The results of the data collection effort were compiled in a computerized task database. Six demonstrations for suitability analysis were subsequently conducted, one in each of the above areas, and are described in this report. Volume 1 details the Project Approach and Methodology. Volume 2 provides the Data Results, including a description of the computerized task analysis data format. Volumes 3 and 4 present the Task Data Forms that resulted from the project and are available on a computerized database management system.
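
    The generic structural framework described above, in which observed task data are recorded within an operating sequence and compiled into a computerized database, can be sketched in miniature. Field names and sample records below are hypothetical illustrations; the report's actual data format is described in Volume 2.

```python
from dataclasses import dataclass

# Hypothetical record structure for one observed crew task; the real
# schema is documented in Volume 2 of the report.
@dataclass
class TaskRecord:
    operating_sequence: str   # context in which the task was observed
    crew_position: str        # e.g. reactor operator, shift supervisor
    task_description: str
    plant_vendor: str         # NSSS vendor, one of the plant sampling axes

def tasks_in_sequence(records, sequence):
    """Return all task records observed within one operating sequence."""
    return [r for r in records if r.operating_sequence == sequence]

records = [
    TaskRecord("loss of feedwater", "reactor operator",
               "verify auxiliary feedwater start", "B&W"),
    TaskRecord("plant startup", "shift supervisor",
               "authorize mode change", "CE"),
]
print(len(tasks_in_sequence(records, "loss of feedwater")))  # 1
```

    Grouping records by operating sequence in this way is what lets a single database serve all six evaluation areas, from control room design to training requirements.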

  2. Underground Test Area Subproject Phase I Data Analysis Task. Volume II - Potentiometric Data Document Package

    Energy Technology Data Exchange (ETDEWEB)

    1996-12-01

    Volume II of the documentation for the Phase I Data Analysis Task performed in support of the current Regional Flow Model, Transport Model, and Risk Assessment for the Nevada Test Site Underground Test Area Subproject contains the potentiometric data. Because of the size and complexity of the model area, a considerable quantity of data was collected and analyzed in support of the modeling efforts. The data analysis task was consequently broken into eight subtasks, and descriptions of each subtask's activities are contained in one of the eight volumes that comprise the Phase I Data Analysis Documentation.

  3. Underground Test Area Subproject Phase I Data Analysis Task. Volume VIII - Risk Assessment Documentation Package

    Energy Technology Data Exchange (ETDEWEB)

    1996-12-01

    Volume VIII of the documentation for the Phase I Data Analysis Task performed in support of the current Regional Flow Model, Transport Model, and Risk Assessment for the Nevada Test Site Underground Test Area Subproject contains the risk assessment documentation. Because of the size and complexity of the model area, a considerable quantity of data was collected and analyzed in support of the modeling efforts. The data analysis task was consequently broken into eight subtasks, and descriptions of each subtask's activities are contained in one of the eight volumes that comprise the Phase I Data Analysis Documentation.

  4. Underground Test Area Subproject Phase I Data Analysis Task. Volume VI - Groundwater Flow Model Documentation Package

    Energy Technology Data Exchange (ETDEWEB)

    1996-11-01

    Volume VI of the documentation for the Phase I Data Analysis Task performed in support of the current Regional Flow Model, Transport Model, and Risk Assessment for the Nevada Test Site Underground Test Area Subproject contains the groundwater flow model data. Because of the size and complexity of the model area, a considerable quantity of data was collected and analyzed in support of the modeling efforts. The data analysis task was consequently broken into eight subtasks, and descriptions of each subtask's activities are contained in one of the eight volumes that comprise the Phase I Data Analysis Documentation.

  5. Underground Test Area Subproject Phase I Data Analysis Task. Volume VII - Tritium Transport Model Documentation Package

    Energy Technology Data Exchange (ETDEWEB)

    1996-12-01

    Volume VII of the documentation for the Phase I Data Analysis Task performed in support of the current Regional Flow Model, Transport Model, and Risk Assessment for the Nevada Test Site Underground Test Area Subproject contains the tritium transport model documentation. Because of the size and complexity of the model area, a considerable quantity of data was collected and analyzed in support of the modeling efforts. The data analysis task was consequently broken into eight subtasks, and descriptions of each subtask's activities are contained in one of the eight volumes that comprise the Phase I Data Analysis Documentation.

  6. Space station data system analysis/architecture study. Task 3: Trade studies, DR-5, volume 2

    Science.gov (United States)

    1985-01-01

    Results of a Space Station Data System Analysis/Architecture Study for the Goddard Space Flight Center are presented. This study, which emphasized a system engineering design for a complete, end-to-end data system, was divided into six tasks: (1) functional requirements definition; (2) options development; (3) trade studies; (4) system definitions; (5) program plan; and (6) study maintenance. The task interrelationships and documentation flow are described. Volume 2 is devoted to Task 3, Trade Studies. Trade studies were carried out in the following areas: (1) software development, test, and integration capability; (2) fault-tolerant computing; (3) space-qualified computers; (4) distributed database management systems; (5) system integration, test, and verification; (6) crew workstations; (7) mass storage; (8) command and resource management; and (9) space communications. Results are presented for each study.

  7. Human factors evaluation of teletherapy: Function and task analysis. Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    Kaye, R.D.; Henriksen, K.; Jones, R. [Hughes Training, Inc., Falls Church, VA (United States); Morisseau, D.S.; Serig, D.I. [Nuclear Regulatory Commission, Washington, DC (United States). Div. of Systems Technology

    1995-07-01

    As a treatment methodology, teletherapy selectively destroys cancerous and other tissue by exposure to an external beam of ionizing radiation. Sources of radiation are either a radioactive isotope, typically Cobalt-60 (Co-60), or a linear accelerator. Records maintained by the NRC have identified instances of teletherapy misadministration where the delivered radiation dose has differed from the radiation prescription (e.g., instances where fractions were delivered to the wrong patient, to the wrong body part, or were too great or too little with respect to the defined treatment volume). Both human error and machine malfunction have led to misadministrations. Effective and safe treatment requires a concern for precision and consistency of human-human and human-machine interactions throughout the course of therapy. The present study is the first part of a series of human factors evaluations for identifying the root causes that lead to human error in the teletherapy environment. The human factors evaluations included: (1) a function and task analysis of teletherapy activities, (2) an evaluation of the human-system interfaces, (3) an evaluation of procedures used by teletherapy staff, (4) an evaluation of the training and qualifications of treatment staff (excluding the oncologists), (5) an evaluation of organizational practices and policies, and (6) an identification of problems and alternative approaches for NRC and industry attention. The present report addresses the function and task analysis of teletherapy activities and provides the foundation for the conduct of the subsequent evaluations. The report includes sections on background, methodology, a description of the function and task analysis, and use of the task analysis findings for the subsequent tasks. The function and task analysis database is also included.

  8. Human factors evaluation of teletherapy: Function and task analysis. Volume 2

    International Nuclear Information System (INIS)

    As a treatment methodology, teletherapy selectively destroys cancerous and other tissue by exposure to an external beam of ionizing radiation. Sources of radiation are either a radioactive isotope, typically Cobalt-60 (Co-60), or a linear accelerator. Records maintained by the NRC have identified instances of teletherapy misadministration where the delivered radiation dose has differed from the radiation prescription (e.g., instances where fractions were delivered to the wrong patient, to the wrong body part, or were too great or too little with respect to the defined treatment volume). Both human error and machine malfunction have led to misadministrations. Effective and safe treatment requires a concern for precision and consistency of human-human and human-machine interactions throughout the course of therapy. The present study is the first part of a series of human factors evaluations for identifying the root causes that lead to human error in the teletherapy environment. The human factors evaluations included: (1) a function and task analysis of teletherapy activities, (2) an evaluation of the human-system interfaces, (3) an evaluation of procedures used by teletherapy staff, (4) an evaluation of the training and qualifications of treatment staff (excluding the oncologists), (5) an evaluation of organizational practices and policies, and (6) an identification of problems and alternative approaches for NRC and industry attention. The present report addresses the function and task analysis of teletherapy activities and provides the foundation for the conduct of the subsequent evaluations. The report includes sections on background, methodology, a description of the function and task analysis, and use of the task analysis findings for the subsequent tasks. The function and task analysis database is also included.

  9. Space station data system analysis/architecture study. Task 3: Trade studies, DR-5, volume 1

    Science.gov (United States)

    1985-01-01

    The primary objective of Task 3 is to provide the additional analysis and insight necessary to support key design/programmatic decisions for options quantification and selection for system definition. This includes: (1) the identification of key trade study topics; (2) the definition of a trade study procedure for each topic (issues to be resolved, key inputs, criteria/weighting, methodology); (3) the conduct of tradeoff and sensitivity analyses; and (4) the review/verification of results within the context of the evolving system design and definition. The trade study topics addressed in this volume include space autonomy and function automation, software transportability, system network topology, communications standardization, onboard local area networking, distributed operating systems, software configuration management, and the software development environment facility.
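
    The trade study procedure outlined in step (2), with its criteria/weighting and tradeoff analysis, is commonly realized as a weighted-sum comparison of candidate options. The criteria, weights, and option scores below are invented for illustration only; they do not come from the study.

```python
# Weighted-sum trade study sketch: each option is scored against
# weighted criteria, and the highest total indicates the preferred
# option. All weights and scores here are illustrative.
weights = {"performance": 0.4, "cost": 0.35, "risk": 0.25}

options = {
    "centralized network": {"performance": 8, "cost": 6, "risk": 7},
    "distributed network": {"performance": 7, "cost": 8, "risk": 6},
}

def total_score(scores, weights):
    """Combine criterion scores into one figure of merit."""
    return sum(weights[c] * scores[c] for c in weights)

best = max(options, key=lambda o: total_score(options[o], weights))
print(best)  # "distributed network"
```

    A sensitivity analysis, as in step (3), would rerun the comparison while perturbing the weights to see whether the preferred option changes.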

  10. Human factors evaluation of remote afterloading brachytherapy. Volume 2, Function and task analysis

    International Nuclear Information System (INIS)

    A human factors project on the use of nuclear by-product material to treat cancer using remotely operated afterloaders was undertaken by the Nuclear Regulatory Commission. The purpose of the project was to identify factors that contribute to human error in the system for remote afterloading brachytherapy (RAB). This report documents the findings from the first phase of the project, which involved an extensive function and task analysis of RAB. This analysis identified the functions and tasks in RAB, made preliminary estimates of the likelihood of human error in each task, and determined the skills needed to perform each RAB task. The findings of the function and task analysis served as the foundation for the remainder of the project, which evaluated four major aspects of the RAB system linked to human error: human-system interfaces; procedures and practices; training and qualifications of RAB staff; and organizational practices and policies. At its completion, the project identified and prioritized areas for recommended NRC and industry attention based on all of the evaluations and analyses

  11. Human factors evaluation of remote afterloading brachytherapy. Volume 2, Function and task analysis

    Energy Technology Data Exchange (ETDEWEB)

    Callan, J.R.; Gwynne, J.W. III; Kelly, T.T.; Muckler, F.A. [Pacific Science and Engineering Group, San Diego, CA (United States); Saunders, W.M.; Lepage, R.P.; Chin, E. [University of California San Diego Medical Center, CA (United States). Div. of Radiation Oncology; Schoenfeld, I.; Serig, D.I. [Nuclear Regulatory Commission, Washington, DC (United States). Div. of Systems Technology

    1995-05-01

    A human factors project on the use of nuclear by-product material to treat cancer using remotely operated afterloaders was undertaken by the Nuclear Regulatory Commission. The purpose of the project was to identify factors that contribute to human error in the system for remote afterloading brachytherapy (RAB). This report documents the findings from the first phase of the project, which involved an extensive function and task analysis of RAB. This analysis identified the functions and tasks in RAB, made preliminary estimates of the likelihood of human error in each task, and determined the skills needed to perform each RAB task. The findings of the function and task analysis served as the foundation for the remainder of the project, which evaluated four major aspects of the RAB system linked to human error: human-system interfaces; procedures and practices; training and qualifications of RAB staff; and organizational practices and policies. At its completion, the project identified and prioritized areas for recommended NRC and industry attention based on all of the evaluations and analyses.
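
    The prioritization described above, in which preliminary error-likelihood estimates per task feed the later evaluations, can be sketched as a simple ranking. Task names and likelihood values below are hypothetical, not taken from the RAB report.

```python
# Rank RAB tasks by preliminary human-error likelihood estimates so
# the follow-on evaluations can focus on the most error-prone tasks.
# Task names and likelihood values are hypothetical.
tasks = {
    "verify source position": 0.02,
    "enter treatment plan": 0.08,
    "select patient file": 0.05,
}
ranked = sorted(tasks.items(), key=lambda kv: kv[1], reverse=True)
print(ranked[0][0])  # task with the highest estimated error likelihood
```

    Such a ranking is only a screening aid; the project's actual prioritization combined the task analysis with evaluations of interfaces, procedures, training, and organizational practices.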

  12. Optimized training of responsible shift personnel in nuclear power plants. Supplement volume 3 for chapter 5: Task analysis

    International Nuclear Information System (INIS)

    Derivation of learning targets from task analyses requires consideration of all process steps relevant to safety. This supplement volume presents the functions served, the classes of tasks, the list of tasks, an example of a task solution, the task class 'abnormal operation' (heavy water reactor), the definition of learning stages, the list of verbs for relating learning to stages, descriptors for technical and abnormal operation (heavy water reactor), verification of descriptors, and the connection of actions and learning targets by descriptors. (DG)

  13. Space station data system analysis/architecture study. Task 2: Options development, DR-5. Volume 3: Programmatic options

    Science.gov (United States)

    1985-01-01

    Task 2 in the Space Station Data System (SSDS) Analysis/Architecture Study is the development of an information base that will support the conduct of trade studies and provide sufficient data to make design/programmatic decisions. This volume identifies the preferred options in the programmatic category and characterizes these options with respect to performance attributes, constraints, costs, and risks. The programmatic category includes methods used to administrate/manage the development, operation and maintenance of the SSDS. The specific areas discussed include standardization/commonality; systems management; and systems development, including hardware procurement, software development and system integration, test and verification.

  14. Space station data system analysis/architecture study. Task 2: Options development DR-5. Volume 1: Technology options

    Science.gov (United States)

    1985-01-01

    The second task in the Space Station Data System (SSDS) Analysis/Architecture Study is the development of an information base that will support the conduct of trade studies and provide sufficient data to make key design/programmatic decisions. This volume identifies the preferred options in the technology category and characterizes these options with respect to performance attributes, constraints, cost, and risk. The technology category includes advanced materials, processes, and techniques that can be used to enhance the implementation of SSDS design structures. The specific areas discussed are mass storage, including space and ground on-line storage and off-line storage; man/machine interface; data processing hardware, including flight computers and advanced/fault-tolerant computer architectures; and software, including data compression algorithms, on-board high-level languages, and software tools. Also discussed are artificial intelligence applications and hard-wire communications.

  15. Data analysis tasks: BATSE

    Science.gov (United States)

    Paciesas, William S.

    1993-01-01

    Miscellaneous tasks related to the operation of, and analysis of data from, the Burst and Transient Experiment (BATSE) on the Compton Gamma Ray Observatory (CGRO) were performed. The results are summarized and relevant references are included.

  16. Space station data system analysis/architecture study. Task 2: Options development, DR-5. Volume 2: Design options

    Science.gov (United States)

    1985-01-01

    The primary objective of Task 2 is the development of an information base that will support the conduct of trade studies and provide sufficient data to make key design/programmatic decisions. This includes: (1) the establishment of option categories that are most likely to influence Space Station Data System (SSDS) definition; (2) the identification of preferred options in each category; and (3) the characterization of these options with respect to performance attributes, constraints, cost and risk. This volume contains the options development for the design category. This category comprises alternative structures, configurations and techniques that can be used to develop designs that are responsive to the SSDS requirements. The specific areas discussed are software, including data base management and distributed operating systems; system architecture, including fault tolerance and system growth/automation/autonomy and system interfaces; time management; and system security/privacy. Also discussed are space communications and local area networking.

  17. Futures Trading: Task Analysis

    Czech Academy of Sciences Publication Activity Database

    Zeman, Jan

    Praha: UTIA AV ČR, v.v.i. - DAR, 2009 - (Janžura; Ivánek), p. 38. [5th International Workshop on Data – Algorithms – Decision Making, 29.11.2009-01.12.2009, Plzeň] Institutional research plan: CEZ:AV0Z10750506. Keywords: decision making * futures trading * optimization. Subject RIV: IN - Informatics, Computer Science. http://library.utia.cas.cz/separaty/2009/AS/zeman-futures trading task analysis .pdf

  18. NASA TLA workload analysis support. Volume 1: Detailed task scenarios for general aviation and metering and spacing studies

    Science.gov (United States)

    Sundstrom, J. L.

    1980-01-01

    The techniques required to produce and validate six detailed task timeline scenarios for crew workload studies are described. Specific emphasis is given to: general aviation single pilot instrument flight rules operations in a high density traffic area; fixed path metering and spacing operations; and comparative workload operation between the forward and aft-flight decks of the NASA terminal control vehicle. The validation efforts also provide a cursory examination of the resultant demand workload based on the operating procedures depicted in the detailed task scenarios.

  19. Task Analysis Technologies at KSC

    Science.gov (United States)

    Carstens, Deborah S.

    2003-01-01

    Project objectives: (1) form an integrated team of NASA, USA, Boeing, and Dynacs researchers; (2) create a user-friendly software prototype that assists an analyst in performing a human factors process failure modes and effects analysis (HF-PFMEA); (3) perform four task analyses on center: cargo late access task analysis (NASA/Boeing team); payload test and verification system task analysis (NASA/Boeing team); slammer cover installation operations task analysis (NASA/USA team); and ATDC LOX pump acceptance test procedure task analysis (NASA/Dynacs team).

  20. Skill Components of Task Analysis

    Science.gov (United States)

    Adams, Anne E.; Rogers, Wendy A.; Fisk, Arthur D.

    2013-01-01

    Some task analysis methods break down a task into a hierarchy of subgoals. Although an important tool of many fields of study, learning to create such a hierarchy (redescription) is not trivial. To further the understanding of what makes task analysis a skill, the present research examined novices' problems with learning Hierarchical Task…
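
    The redescription skill the abstract refers to, breaking a task into a hierarchy of subgoals, can be sketched as a small goal tree. The goal names below are a standard textbook-style example, not drawn from the study.

```python
# Hierarchical task analysis redescribes a task as a tree of subgoals;
# the bottom-level leaves are the operations actually performed.
# Goal names here are an invented example.
class Goal:
    def __init__(self, name, subgoals=None):
        self.name = name
        self.subgoals = subgoals or []

    def leaves(self):
        """Return the bottom-level operations of the hierarchy."""
        if not self.subgoals:
            return [self.name]
        return [leaf for g in self.subgoals for leaf in g.leaves()]

make_tea = Goal("make tea", [
    Goal("boil water", [Goal("fill kettle"), Goal("switch kettle on")]),
    Goal("brew", [Goal("add tea bag"), Goal("pour water")]),
])
print(make_tea.leaves())
```

    The skill the study examines is deciding where to stop redescribing: novices often produce hierarchies that are too shallow or that mix goals with physical actions at the same level.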

  1. Feasibility study of modern airships, phase 1. Volume 1: Summary and mission analysis (tasks 2 and 4)

    Science.gov (United States)

    Bloetscher, F.

    1975-01-01

    The history, potential mission applications, and designs of lighter-than-air (LTA) vehicles are researched and evaluated. Missions are identified to which airship vehicles are potentially suited. Results of the mission analysis are combined with the findings of a parametric analysis to formulate the mission/vehicle combinations recommended for further study. Current transportation systems are surveyed and potential areas of competition are identified, as well as potential missions resulting from limitations of these systems. Potential areas of military usage are included.

  2. Design study of wind turbines 50 kW to 3000 kW for electric utility applications. Volume 3: Supplementary design and analysis tasks

    Science.gov (United States)

    1976-01-01

    Additional design and analysis data are provided to supplement the results of the two parallel design study efforts. The key results of the three supplemental tasks investigated are: (1) The velocity duration profile has a significant effect in determining the optimum wind turbine design parameters and the energy generation cost. (2) Modest increases in capacity factor can be achieved with small increases in energy generation costs and capital costs. (3) Reinforced concrete towers that are esthetically attractive can be designed and built at a cost comparable to those for steel truss towers. The approach used, method of analysis, assumptions made, design requirements, and the results for each task are discussed in detail.

  3. Task analysis and support for problem solving tasks

    International Nuclear Information System (INIS)

    This paper is concerned with task analysis as the basis for ergonomic design to reduce human error rates, rather than for predicting human error rates. Task analysis techniques usually provide a set of categories for describing subtasks, and a framework describing the relations between subtasks. Both the task-type categories and their organisation have implications for optimum interface and training design. This paper discusses the framework needed for considering the most complex tasks faced by operators in process industries, such as fault management in unexpected situations, and what is likely to minimise human error in these circumstances. (author)

  4. Genetic Inventory Task Final Report. Volume 2

    Science.gov (United States)

    Venkateswaran, Kasthuri; LaDuc, Myron T.; Vaishampayan, Parag

    2012-01-01

    Contaminant terrestrial microbiota could profoundly impact the scientific integrity of extraterrestrial life-detection experiments. It is therefore important to know what organisms persist on spacecraft surfaces so that their presence can be eliminated or discriminated from authentic extraterrestrial biosignatures. Although there is a growing understanding of the biodiversity associated with spacecraft and cleanroom surfaces, it remains challenging to assess the risk of these microbes confounding life-detection or sample-return experiments. A key challenge is to provide a comprehensive inventory of microbes present on spacecraft surfaces. To assess the phylogenetic breadth of microorganisms on spacecraft and associated surfaces, the Genetic Inventory team used three technologies: conventional cloning techniques, PhyloChip DNA microarrays, and 454 tag-encoded pyrosequencing, together with a methodology to systematically collect, process, and archive nucleic acids. These three analysis methods yielded considerably different results: Traditional approaches provided the least comprehensive assessment of microbial diversity, while PhyloChip and pyrosequencing illuminated more diverse microbial populations. The overall results stress the importance of selecting sample collection and processing approaches based on the desired target and required level of detection. The DNA archive generated in this study can be made available to future researchers as genetic-inventory-oriented technologies further mature.

  5. Cognitive task load analysis : Allocating tasks and designing support

    OpenAIRE

    Neerincx, M.A.

    2003-01-01

    We present a method for Cognitive Task Analysis that guides the early stages of software development, aiming at an optimal cognitive load for operators of process control systems. The method is based on a practical theory of cognitive task load and support. In addition to the classical measure 'percentage time occupied', this theory distinguishes two load factors that affect cognitive task performance and mental effort: the level of information processing and the number of task-set switches. Re...
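
    The three load factors named in the abstract (time occupied, level of information processing, and task-set switches) are used in this line of work to place an operator in a load region. A minimal sketch follows; the thresholds and region names are invented for illustration, not taken from the method itself.

```python
# Classify operator cognitive load from the three factors of the
# cognitive task load model: time occupied, level of information
# processing, and task-set switches. Thresholds are illustrative only.
def load_region(time_occupied, info_level, switches):
    """time_occupied and info_level are in [0, 1]; switches is a count
    of task-set switches per observation period."""
    if time_occupied > 0.7 and info_level > 0.7 and switches > 5:
        return "overload"
    if time_occupied < 0.2 and info_level < 0.2:
        return "underload"
    return "normal"

print(load_region(0.9, 0.8, 8))  # "overload"
```

    In the method, such a classification feeds early design decisions: tasks predicted to push operators into the overload region are candidates for reallocation or automated support.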

  6. [Environmental investigation of ground water contamination at Wright-Patterson Air Force Base, Ohio]. Volume 3, Sampling and analysis plan (SAP): Phase 1, Task 4, Field Investigation: Draft

    Energy Technology Data Exchange (ETDEWEB)

    1991-10-01

    In April 1990, Wright-Patterson Air Force Base (WPAFB) initiated an investigation to evaluate a potential Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) removal action to prevent, to the extent practicable, the offsite migration of contaminated ground water from WPAFB. WPAFB retained the services of Environmental Management Operations (EMO) and its principal subcontractor, International Technology Corporation (IT), to complete Phase 1 of the environmental investigation of ground-water contamination at WPAFB. Phase 1 of the investigation involves the short-term evaluation and potential design for a program to remove ground-water contamination that appears to be migrating across the western boundary of Area C, and across the northern boundary of Area B along Springfield Pike. Primarily, Task 4 of Phase 1 focuses on collection of information at the Area C and Springfield Pike boundaries of WPAFB. This Sampling and Analysis Plan (SAP) has been prepared to assist in completion of the Task 4 field investigation and comprises the Quality Assurance Project Plan (QAPP) and the Field Sampling Plan (FSP).

  7. Analysis of the structural parameters that influence gas production from the Devonian shale. Volume 1. Executive Summary and Task Reports. Annual progress report

    Energy Technology Data Exchange (ETDEWEB)

    Shumaker, R.C.; de Wys, J.N.; Dixon, J.M.

    1978-10-01

    The first portion of the report, from the Executive Summary (page 1) through the Schedule of Milestones (page 10), gives a general overview which highlights our progress and problems for the second year. The Task report portion of the text, written by individual task investigators, is designed primarily for scientists interested in technical details of the second year's work. The second portion of the report consists of appendices of data compiled by the principal investigators.

  8. Overview of job and task analysis

    International Nuclear Information System (INIS)

    During the past few years the nuclear industry has become concerned with predicting human performance in nuclear power plants. One of the best means available at the present time to make sure that training, procedures, job performance aids and plant hardware match the capabilities and limitations of personnel is by performing a detailed analysis of the tasks required in each job position. The approved method for this type of analysis is referred to as job or task analysis. Job analysis is a broader type of analysis and is usually thought of in terms of establishing overall performance objectives, and in establishing a basis for position descriptions. Task analysis focuses on the building blocks of task performance, task elements, and places them within the context of specific performance requirements including time to perform, feedback required, special tools used, and required systems knowledge. The use of task analysis in the nuclear industry has included training validation, preliminary risk screening, and procedures development
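
    The building blocks described above, task elements placed within the context of specific performance requirements, might be captured in a record like the following minimal sketch. The field names are a hypothetical rendering of the list in the text (time to perform, feedback required, special tools, systems knowledge).

```python
from dataclasses import dataclass, field

# Hypothetical record for one task element, the building block of task
# performance described in the text.
@dataclass
class TaskElement:
    description: str
    time_to_perform_s: float      # required completion time, seconds
    feedback_required: str        # cue confirming the action succeeded
    special_tools: list = field(default_factory=list)
    systems_knowledge: list = field(default_factory=list)

el = TaskElement(
    description="open auxiliary feedwater valve",
    time_to_perform_s=30.0,
    feedback_required="valve position indicator shows OPEN",
    special_tools=[],
    systems_knowledge=["AFW system lineup"],
)
print(el.description)
```

    Structuring elements this way supports the uses the text lists: the same records can drive training validation, preliminary risk screening, and procedures development.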

  9. A Brief Analysis of Communication Tasks in Task-Based Teaching

    Institute of Scientific and Technical Information of China (English)

    Xu Xiaoying

    2011-01-01

    Task-Based Language Teaching (TBLT) aims at providing opportunities for learners to experiment with and explore both spoken and written language through learning activities. This passage further examines whether the following four communicative tasks can assist classroom learning: jigsaw tasks, role-play tasks, problem-solving tasks, and information-gap tasks.

  10. An ergonomic task analysis of spinal anaesthesia.

    LENUS (Irish Health Repository)

    Ajmal, Muhammad

    2009-12-01

    Ergonomics is the study of physical interaction between humans and their working environment. The objective of this study was to characterize the performance of spinal anaesthesia in an acute hospital setting, applying ergonomic task analysis.

  11. Job and task analysis for technical staff

    International Nuclear Information System (INIS)

    In September of 1989, Cooper Nuclear Station began a project to upgrade the Technical Staff Training Program. This project began by performing a job and task analysis for technical staff. While the industry has long been committed to job and task analysis to target performance-based instruction for single job positions, this approach was unique in that it was not originally considered appropriate for a group as diverse as technical staff. Much to his satisfaction, the author found the job and task analysis far less complicated for technical staff than he had imagined. The benefits of performing the job and task analysis for technical staff have become increasingly obvious as he pursues lesson plan development and course revisions. The outline for this presentation is as follows: philosophy adopted; preparation of the job survey document; performing the job analysis; performing task analysis for technical staff and associated pitfalls; clustering objectives for training and comparison to the existing program; benefits now and in the future; final phase (comparison to INPO guides and meeting the needs of non-degreed engineering professionals); and conclusion. By focusing on performance-based needs for engineers rather than traditional academics for training, the author is confident the future Technical Staff Program will meet the challenges ahead and will exceed requirements for accreditation.

  12. 49 CFR 236.1043 - Task analysis and basic requirements.

    Science.gov (United States)

    2010-10-01

    ... 49 Transportation 4 2010-10-01 2010-10-01 false Task analysis and basic requirements. 236.1043... Train Control Systems § 236.1043 Task analysis and basic requirements. (a) Training structure and... of work, etc.), task(s), and desired success rate; (2) Based on a formal task analysis, identify...

  13. Task Descriptions in Diagnostic Radiology. Research Report No. 7. Volume 1, Medical Tasks: What the Radiologist Does.

    Science.gov (United States)

    Gilpatrick, Eleanor

    The first of four volumes in Research Report No. 7 of the Health Services Mobility Study (HSMS), this book contains 143 task descriptions covering most of the medical activities carried out by diagnostic radiologists. (The work carried out by radiologic technologists, and administrative, machine-related, and nursing-type functions are found in…

  14. A Cognitive Task Analysis for Dental Hygiene.

    Science.gov (United States)

    Cameron, Cheryl A.; Beemsterboer, Phyllis L.; Johnson, Lynn A.; Mislevy, Robert J.; Steinberg, Linda S.; Breyer, F. Jay

    2000-01-01

    As part of the development of a scoring algorithm for a simulation-based dental hygiene initial licensure examination, this effort conducted a task analysis of the dental hygiene domain. Broad classes of behaviors that distinguish along the dental hygiene expert-novice continuum were identified and applied to the design of nine paper-based cases…

  15. Task force on compliance and enforcement. Final report. Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    1978-03-01

    Recommendations for measures to strengthen the FEA enforcement program in the area of petroleum price regulation are presented. Results of task force efforts are presented in report and recommendations sections concerned with pending cases, compliance program organization, enforcement powers, compliance strategy, and audit staffing and techniques. (JRD)

  16. Radiation protection technician job task analysis manual

    International Nuclear Information System (INIS)

    This manual was developed to assist all DOE contractors in the design and conduct of job task analysis (JTA) for the radiation protection technician. Experience throughout the nuclear industry and the DOE system has indicated that the quality and efficiency in conducting a JTA at most sites is greatly enhanced by using a generic task list for the position, and clearly written guidelines on the JTA process. This manual is designed to provide this information for personnel to use in developing and conducting site-specific JTAs. (VC)

  17. Frequency Analysis Of Data On Telerobotic Tasks

    Science.gov (United States)

    Fiorini, Paolo; Giancaspro, Antonio

    1994-01-01

    Data on forces and torques measured in experiments with remote manipulators were processed into spectral signatures via a special frequency-analysis procedure. The spectral signatures complement other measures used to evaluate the performance of telerobotic systems and human operators. The procedure contributes to the verification of some assumptions made in designing manipulator arms and control subsystems, and can be used as feedback by operators engaged in real-time monitoring of telerobotic tasks. It also provides useful indications of the flows of power between manipulators and their environments.
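The record above does not reproduce the procedure itself, but the general idea of a spectral signature can be sketched as follows (a minimal illustration under stated assumptions, not the authors' actual method; the band count, sampling rate, and test signal are all assumed):

```python
import numpy as np

def spectral_signature(samples, n_bands=8):
    """Reduce a force/torque time series to a coarse spectral signature:
    the fraction of total signal power falling in each frequency band."""
    power_spectrum = np.abs(np.fft.rfft(samples)) ** 2
    bands = np.array_split(power_spectrum, n_bands)   # coarse frequency bands
    band_power = np.array([band.sum() for band in bands])
    return band_power / band_power.sum()              # normalized to sum to 1

# Synthetic example: a 5 Hz contact oscillation buried in sensor noise,
# sampled at 200 Hz for 2 seconds.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 2.0, 400, endpoint=False)
force = np.sin(2 * np.pi * 5 * t) + 0.1 * rng.standard_normal(t.size)
signature = spectral_signature(force)
# The lowest-frequency band dominates, since the 5 Hz component carries
# most of the power.
```

Two telerobotic tasks producing different force profiles would yield visibly different signatures, which is the sense in which a frequency signature can serve as a measure of operator performance.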

  18. Final report on the Pathway Analysis Task

    Energy Technology Data Exchange (ETDEWEB)

    Whicker, F.W.; Kirchner, T.B. [Colorado State Univ., Fort Collins, CO (United States)

    1993-04-01

    The Pathway Analysis Task constituted one of several multi-laboratory efforts to estimate radiation doses to people, considering all important pathways of exposure, from the testing of nuclear devices at the Nevada Test Site (NTS). The primary goal of the Pathway Analysis Task was to predict radionuclide ingestion by residents of Utah, Nevada, and portions of seven other adjoining western states following radioactive fallout deposition from individual events at the NTS. This report provides comprehensive documentation of the activities and accomplishments of Colorado State University's Pathway Analysis Task during the entire period of support (1979--91). The history of the project will be summarized, indicating the principal dates and milestones, personnel involved, subcontractors, and budget information. Accomplishments, both primary and auxiliary, will be summarized with general results rather than technical details being emphasized. This will also serve as a guide to the reports and open literature publications produced, where the methodological details and specific results are documented. Selected examples of results on internal dose estimates are provided in this report because the data have not been published elsewhere.

  19. Final report on the Pathway Analysis Task

    International Nuclear Information System (INIS)

    The Pathway Analysis Task constituted one of several multi-laboratory efforts to estimate radiation doses to people, considering all important pathways of exposure, from the testing of nuclear devices at the Nevada Test Site (NTS). The primary goal of the Pathway Analysis Task was to predict radionuclide ingestion by residents of Utah, Nevada, and portions of seven other adjoining western states following radioactive fallout deposition from individual events at the NTS. This report provides comprehensive documentation of the activities and accomplishments of Colorado State University's Pathway Analysis Task during the entire period of support (1979--91). The history of the project will be summarized, indicating the principal dates and milestones, personnel involved, subcontractors, and budget information. Accomplishments, both primary and auxiliary, will be summarized with general results rather than technical details being emphasized. This will also serve as a guide to the reports and open literature publications produced, where the methodological details and specific results are documented. Selected examples of results on internal dose estimates are provided in this report because the data have not been published elsewhere

  20. Hierarchical task analysis: Developments, applications and extensions

    OpenAIRE

    Stanton, Neville A.

    2006-01-01

    Hierarchical task analysis (HTA) is a core ergonomics approach with a pedigree of over 30 years continuous use. At its heart, HTA is based upon a theory of performance and has only three governing principles. Originally developed as a means of determining training requirements, there was no way the initial pioneers of HTA could have foreseen the extent of its success. HTA has endured as a way of representing a system sub-goal hierarchy for extended analysis. It has been used for a range of ap...

  1. Modeling Controller Tasks for Safety Analysis

    Science.gov (United States)

    Brown, Molly; Leveson, Nancy G.

    1998-01-01

    As control systems become more complex, the use of automated control has increased. At the same time, the role of the human operator has changed from primary system controller to supervisor or monitor, making safe design of the human-computer interaction more difficult. In this paper, we present a visual task modeling language that can be used by system designers to model human-computer interactions. The visual models can be translated into SpecTRM-RL, a blackbox specification language for modeling the automated portion of the control system. The SpecTRM-RL suite of analysis tools allows the designer to perform formal and informal safety analyses on the task model in isolation or integrated with the rest of the modeled system.

  2. Task Decomposition in Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald Laurids [Idaho National Laboratory; Joe, Jeffrey Clark [Idaho National Laboratory

    2014-06-01

    In the probabilistic safety assessments (PSAs) used in the nuclear industry, human failure events (HFEs) are determined as a subset of hardware failures, namely those hardware failures that could be triggered by human action or inaction. This approach is top-down, starting with hardware faults and deducing human contributions to those faults. Elsewhere, more traditional human-factors-driven approaches would tend to look for opportunities for human error first in a task analysis and then identify which of those errors is risk significant. The intersection of top-down and bottom-up approaches to defining HFEs has not been carefully studied. Ideally, both approaches should arrive at the same set of HFEs. This question remains central as human reliability analysis (HRA) methods are generalized to new domains like oil and gas. The HFEs used in nuclear PSAs tend to be top-down, defined as a subset of the PSA, whereas the HFEs used in petroleum quantitative risk assessments (QRAs) are more likely to be bottom-up, derived from a task analysis conducted by human factors experts. The marriage of these approaches is necessary in order to ensure that HRA methods developed for top-down HFEs are also sufficient for bottom-up applications.

  3. Data analysis & probability task & drill sheets

    CERN Document Server

    Cook, Tanya

    2011-01-01

    For grades 3-5, our State Standards-based combined resource meets the data analysis & probability concepts addressed by the NCTM standards and encourages your students to review the concepts in unique ways. The task sheets introduce the mathematical concepts to the students around a central problem taken from real-life experiences, while the drill sheets provide warm-up and timed practice questions for the students to strengthen their procedural proficiency skills. Included in our resource are activities to help students learn how to collect, organize, analyze, interpret, and predict data pro

  4. A cognitive task analysis of the SGTR scenario

    International Nuclear Information System (INIS)

    This report constitutes a contribution to the NKS/RAK-1:3 project on Integrated Sequence Analysis. Following the meeting at Ringhals, the work was proposed to be performed by the following three steps: Task 1. Cognitive Task Analysis of the E-3 procedure. Task 2. Evaluation and revision of task analysis with Ringhals/KSU experts. Task 3. Integration with simulator data. The Cognitive Task Analysis (CTA) of Task 1 uses the Goals-Means Task Analysis (GMTA) method to identify the sequence of tasks and task steps necessary to achieve the goals of the procedure. It is based on material supplied by Ringhals, which describes the E-3 procedure, including the relevant ES and ECA procedures. The analysis further outlines the cognitive demands profile associated with individual task steps as well as with the task as a whole, as an indication of the nominal task load. The outcome of the cognitive task analysis provides a basis for proposing an adequate event tree. This report describes the results from Task 1. The work has included a two-day meeting between the three contributors, as well as the exchange of intermediate results and comments throughout the period. After the initial draft of the report was prepared, an opportunity was given to observe the SGTR scenario in a full-scope training simulator, and to discuss the details with the instructors. This led to several improvements from the initial draft. (EG)

  5. JSpOC Cognitive Task Analysis

    Science.gov (United States)

    Aleva, D.; McCracken, J.

    This paper will overview a Cognitive Task Analysis (CTA) of the tasks accomplished by space operators in the Combat Operations Division (COD) of the Joint Space Operations Center (JSpOC). The methodology used to collect data will be presented. The work was performed in support of the AFRL Space Situation Awareness Fusion Intelligent Research Environment (SAFIRE) effort. SAFIRE is a multi-directorate program led by Air Force Research Laboratory (AFRL), Space Vehicles Directorate (AFRL/RV) and supporting Future Long Term Challenge 2.6.5. It is designed to address research areas identified from completion of a Core Process 3 effort for Joint Space Operations Center (JSpOC). The report is intended to be a resource for those developing capability in support of SAFIRE, the Joint Functional Component Command (JFCC) Space Integrated Prototype (JSIP) User-Defined Operating Picture (UDOP), and other related projects. The report is under distribution restriction; our purpose here is to expose its existence to a wider audience so that qualified individuals may access it. The report contains descriptions of the organization, its most salient products, tools, and cognitive tasks. Tasks reported are derived from the data collected and presented at multiple levels of abstraction. Recommendations for leveraging the findings of the report are presented. The report contains a number of appendices that amplify the methodology, provide background or context support, and includes references in support of cognitive task methodology. In a broad sense, the CTA is intended to be the foundation for relevant, usable capability in support of space warfighters. It presents, at an unclassified level, introductory material to familiarize inquirers with the work of the COD; this is embedded in a description of the broader context of the other divisions of the JSpOC. 
It does NOT provide guidance for the development of Tactics, Techniques, and Procedures (TT&Ps) for JSpOC processes.

  6. Research and development of a heat-pump water heater. Volume 2. R and D task reports

    Energy Technology Data Exchange (ETDEWEB)

    Dunning, R.L.; Amthor, F.R.; Doyle, E.J.

    1978-08-01

    The heat pump water heater is a device that works much like a window air conditioner, except that heat from the home is pumped into a water tank rather than to the outdoors. The objective established for the device is to operate with a Coefficient of Performance (COP) of 3; that is, an input of one unit of electric energy would create three units of heat energy in the form of hot water. With such a COP, the device would use only one-third the energy, and at one-third the cost, of a standard resistance water heater. This Volume 2 contains the final reports of the three major tasks performed in Phase I. In Task 2, a market study identifies the future market and selects an initial target market and channel of distribution, all based on an analysis of the parameters affecting the feasibility of the device and the factors that will affect its market acceptance. In the Task 3 report, the results of a design and test program to arrive at final designs of heat pumps for both new water heaters and for retrofitting existing water heaters are presented. In the Task 4 report, a plan for an extensive field demonstration involving use in actual homes is presented. Volume 1 contains a final summary report of the information in Volume 2.
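The COP arithmetic in the abstract can be checked directly (illustrative numbers only; the 9 kWh demand figure is an assumption, not from the report):

```python
def electric_input_kwh(heat_demand_kwh: float, cop: float) -> float:
    """Electric energy needed to deliver a given amount of heat at a given COP."""
    return heat_demand_kwh / cop

heat_demand = 9.0  # kWh of hot-water heating demand (assumed figure)

heat_pump = electric_input_kwh(heat_demand, cop=3.0)   # heat pump at COP = 3
resistance = electric_input_kwh(heat_demand, cop=1.0)  # resistance heater

# At COP = 3 the heat pump draws one-third the electricity (and, at a
# fixed electricity rate, one-third the cost) of a resistance heater.
assert heat_pump == resistance / 3
```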

  7. Using Task Data in Diagnostic Radiology. Research Report No. 8. Volume 1. Job Ladders: Assigning Tasks to Jobs.

    Science.gov (United States)

    Gilpatrick, Eleanor

    This report on the results of the application of the Health Services Mobility Study (HSMS) task analysis method in diagnostic radiology describes several career ladders starting from the aide level in quality assurance or patient care, rising to the technician level, and then on to the radiologic technologist level, with options to continue to…

  8. Task analysis methods applicable to control room design review (CDR)

    International Nuclear Information System (INIS)

    This report presents the results of a research study conducted in support of the human factors engineering program of the Atomic Energy Control Board in Canada. It contains five products which may be used by the Atomic Energy Control Board in relation to Task Analysis of jobs in CANDU nuclear power plants: 1. a detailed method for preparing for a task analysis; 2. a Task Data Form for recording task analysis data; 3. a detailed method for carrying out task analyses; 4. a guide to assessing alternative methods for performing task analyses, if such are proposed by utilities or consultants; and 5. an annotated bibliography on task analysis. In addition, a short explanation of the origins, nature and uses of task analysis is provided, with some examples of its cost effectiveness. 35 refs

  9. Cue Representation and Situational Awareness in Task Analysis

    Science.gov (United States)

    Carl, Diana R.

    2009-01-01

    Task analysis in human performance technology is used to determine how human performance can be well supported with training, job aids, environmental changes, and other interventions. Early work by Miller (1953) and Gilbert (1969, 1974) addressed cue processing in task execution and recommended cue descriptions in task analysis. Modern task…

  10. An informal analysis of flight control tasks

    Science.gov (United States)

    Andersen, George J.

    1991-01-01

    Issues important in rotorcraft flight control are discussed, and a perceptual description is suggested for what are believed to be the major issues in flight control. When the task of a pilot controlling a helicopter in flight is considered, the task is decomposed into several subtasks. These subtasks include: (1) the control of altitude, (2) the control of speed, (3) the control of heading, (4) the control of orientation, (5) the control of flight over obstacles, and (6) the control of flight to specified positions in the world. The first four subtasks can be considered primary control tasks, as they are not dependent on any other subtasks. The latter two subtasks can be considered hierarchical tasks, as they are dependent on other subtasks. For example, the task of flight control over obstacles can be decomposed as a task requiring the control of speed, altitude, and heading. Thus, incorrect control of altitude should result in poor control of flight over an obstacle.
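The distinction between primary and hierarchical subtasks drawn above can be sketched as a small dependency table (a hypothetical illustration; the identifiers and the flight-to-position dependency set are assumptions, not taken from the paper):

```python
# Primary subtasks depend on no other subtasks; hierarchical subtasks are
# expressed in terms of the subtasks they require.
DEPENDS_ON = {
    "altitude": set(),
    "speed": set(),
    "heading": set(),
    "orientation": set(),
    # Flight over obstacles requires speed, altitude, and heading control,
    # as in the example given in the abstract.
    "flight_over_obstacles": {"speed", "altitude", "heading"},
    # Dependencies for flight to a specified position are an assumed example.
    "flight_to_position": {"speed", "altitude", "heading", "orientation"},
}

def is_hierarchical(subtask: str) -> bool:
    """A subtask is hierarchical if it depends on other subtasks."""
    return bool(DEPENDS_ON[subtask])
```

Under this representation, degraded altitude control propagates to every hierarchical subtask whose dependency set contains `altitude`, matching the abstract's obstacle-flight example.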

  11. Analysis of Task-based Syllabus

    Institute of Scientific and Technical Information of China (English)

    马进胜

    2011-01-01

    Task-based language teaching is very popular in modern English teaching. It is based on the task-based syllabus, which focuses on the learners' communicative competence and stresses learning by doing. Starting from the theoretical assumptions and definitions of the task, the paper analyzes the components of the task, then points out the merits and demerits of the syllabus. By this means the paper may give some tips to teachers and students when they use task-based language teaching.

  12. Job and task analysis: a view from the inside

    International Nuclear Information System (INIS)

    This paper is not intended to describe how to perform a Job and Task Analysis. There are a wide variety of approaches to conducting a Job and Task Analysis, many of which have been developed by highly seasoned and skilled professionals in this field. This paper is intended to discuss the internal support, in terms of money, time, and people, required for the Job and Task Analysis Project.

  13. A procedure for the frequency analysis of telerobotic tasks data

    Science.gov (United States)

    Fiorini, Paolo; Giancaspro, Antonio

    1992-01-01

    In the last few years, teleoperated tasks have been the subject of extensive research to determine the best combination of control modalities according to specific criteria. The operators' performances were compared on the basis of task completion time and of force and torque measurements during the tasks. This paper proposes a procedure for the spectral analysis of force and torque signals generated during teleoperation experiments. There are two main reasons for examining teleoperation data in the frequency domain: a spectral analysis of different tasks can validate the assumptions made in the design of the teleoperator, and a task's frequency signature can be a valuable measure of the operator's performance.

  14. Guidelines for job and task analysis for DOE nuclear facilities

    International Nuclear Information System (INIS)

    The guidelines are intended to be responsive to the need for information on methodology, procedures, content, and use of job and task analysis since the establishment of a requirement for position task analysis for Category A reactors in DOE 5480.1A, Chapter VI. The guide describes the general approach and methods currently being utilized in the nuclear industry and by several DOE contractors for the conduct of job and task analysis and applications to the development of training programs or evaluation of existing programs. In addition other applications for job and task analysis are described including: operating procedures development, personnel management, system design, communications, and human performance predictions

  15. IEA Wind Task 24 Integration of Wind and Hydropower Systems; Volume 2: Participant Case Studies

    Energy Technology Data Exchange (ETDEWEB)

    Acker, T.

    2011-12-01

    This report describes the background, concepts, issues and conclusions related to the feasibility of integrating wind and hydropower, as investigated by the members of IEA Wind Task 24. It is the result of a four-year effort involving seven IEA member countries and thirteen participating organizations. The companion report, Volume 2, describes in detail the study methodologies and participant case studies, and exists as a reference for this report.

  16. Task Analysis as a Resource for Strengthening Health Systems.

    Science.gov (United States)

    Hart, Leah J; Carr, Catherine; Fullerton, Judith T

    2016-01-01

    Task analysis is a descriptive study methodology that has wide application in the health professions. Task analysis is particularly useful in assessment and definition of the knowledge, skills, and behaviors that define the scope of practice of a health profession or occupation. Jhpiego, a US-based nongovernmental organization, has adapted traditional task analysis methods in several countries in assessment of workforce education and practice issues. Four case studies are presented to describe the utility and adaptability of the task analysis approach. Traditional task analysis field survey methods were used in assessment of the general and maternal-child health nursing workforce in Mozambique that led to curriculum redesign, reducing the number of education pathways from 4 to 2. The process of health system strengthening in Liberia, following a long history of civil war conflict, included a traditional task analysis study conducted among 119 registered nurses and 46 certified midwives who had graduated in the last 6 months to 2 years to determine gaps in education and preparation. An innovative approach for data collection that involves "playing cards" to document participant opinions (Task Master, Mining for Data) was developed by Jhpiego for application in other countries. Results of a task analysis involving 54 nurses and 100 nurse-midwives conducted in Lesotho were used to verify the newly drafted scope and standards of practice for nurses and to inform planning for a competency-based preservice curriculum for nursing. The Nursing and Midwifery Council developed a 100-question licensing examination for new graduates following a task analysis in Botswana. The task analysis process in each country resulted in recommendations that were action oriented and were implemented by the country governments. For maximal utility and ongoing impact, a task analysis study should be repeated on a periodic basis and more frequently in countries undergoing rapid change in

  17. Initial development of an automated task analysis profiling system

    International Nuclear Information System (INIS)

    A program for automated task analysis is described. Called TAPS (task analysis profiling system), the program accepts normal English prose and outputs skills, knowledges, attitudes, and abilities (SKAAs) along with specific guidance and recommended ability measurement tests for nuclear power plant operators. A new method for defining SKAAs is presented along with a sample program output

  18. AUTHENTICITY IN TASK-BASED INTERACTION: A CONVERSATION ANALYSIS PERSPECTIVE

    OpenAIRE

    HANAN WAER

    2009-01-01

    In recent years, there has been an increasing interest in task-based learning. Authenticity has been characterized as a main aspect in defining a task (Long 1985; Skehan 1996; Ellis 2003). However, far too little attention has been paid to investigating authenticity in task-based interaction (TBI). To the best knowledge of the researcher, no research has been done using conversation analysis (CA) to investigate authenticity in TBI. Therefore, the present paper focuses on the issue of authent...

  19. Task analysis: a detailed example of stepping up from JSA

    International Nuclear Information System (INIS)

    This paper discusses a pilot task analysis of operations in a proposed facility for the cutting and packaging of radioactively contaminated gloveboxes, for long-term storage or burial. The objective was to demonstrate how task analysis may be used as a tool for planning and risk management. Two specific products were generated - preliminary operating procedures and training requirements. The task data base, procedures list and training requirements developed were intended as first order categorizations. The analysis was limited to tasks that will be performed within the boundaries of the operational facility and the associated load-out area. The analysis documents tasks to be performed by D and D (Decontamination and Decommissioning) Workers. However, the analysis included all tasks identified as an integral part of glovebox processing within the facility. Thus tasks involving Radiation Protection Technicians (RPTs) are included. Based on hazard assessments, it is planned that at least two RPTs will be assigned full-time to the facility, so they may be considered part of its crew. Similarly, supervisory/administrative tasks are included where they were determined to be directly part of process sequences, such as obtaining appropriate certification. 11 tables

  20. Task Analysis of Shuttle Entry and Landing Activities

    Science.gov (United States)

    Holland, Albert W.; Vanderark, Stephen T.

    1993-01-01

    The Task Analysis of Shuttle Entry and Landing (E/L) Activities documents all tasks required to land the Orbiter following an STS mission. In addition to analysis of tasks performed, task conditions are described, including estimated time for completion, altitude, relative velocity, normal and lateral acceleration, location of controls operated or monitored, and level of g's experienced. This analysis precedes further investigations into potential effects of zero g on piloting capabilities for landing the Orbiter following long-duration missions. This includes, but is not limited to, researching the effects of extended-duration missions on piloting capabilities. Four primary constraints of the analysis must be clarified: (1) the analysis depicts E/L in a static manner, whereas the actual process is dynamic; (2) the task analysis was limited to a paper analysis, since it was not feasible to conduct research in the actual setting (i.e., observing or filming during an actual E/L); (3) the tasks included are those required for E/L during nominal, daylight conditions; and (4) certain E/L tasks will vary according to the flying style of each commander.

  1. A Task that Elicits Reasoning: A Dual Analysis

    Science.gov (United States)

    Yankelewitz, Dina; Mueller, Mary; Maher, Carolyn A.

    2010-01-01

    This paper reports on the forms of reasoning elicited as fourth grade students in a suburban district and sixth grade students in an urban district worked on similar tasks involving reasoning with the use of Cuisenaire rods. Analysis of the two data sets shows similarities in the reasoning used by both groups of students on specific tasks, and the…

  2. A Task-Content Analysis of an Introductory Entomology Curriculum.

    Science.gov (United States)

    Brandenburg, R.

    Described is an analysis of the content, tasks, and strategies needed by students to enable them to identify insects to order by sight and to family by use of a standard dichotomous taxonomic key. Tasks and strategies are broken down and arranged progressively in the approximate order in which students should progress. Included are listings of…

  3. Toward a cognitive task analysis for biomedical query mediation.

    Science.gov (United States)

    Hruby, Gregory W; Cimino, James J; Patel, Vimla; Weng, Chunhua

    2014-01-01

    In many institutions, data analysts use a Biomedical Query Mediation (BQM) process to facilitate data access for medical researchers. However, understanding of the BQM process is limited in the literature. To bridge this gap, we performed the initial steps of a cognitive task analysis using 31 BQM instances conducted between one analyst and 22 researchers in one academic department. We identified five top-level tasks (clarify research statement, explain clinical process, identify related data elements, locate EHR data element, and end BQM with either a database query or unmet, infeasible information needs) and 10 sub-tasks. We evaluated the BQM task model with seven data analysts from different clinical research institutions. Evaluators found all the tasks completely or semi-valid. This study contributes initial knowledge towards the development of a generalizable cognitive task representation for BQM. PMID:25954589

  4. International data collection and analysis. Task 1

    Energy Technology Data Exchange (ETDEWEB)

    Fox, J.B.; Stobbs, J.J.; Collier, D.M.; Hobbs, J.S.

    1979-04-01

    Commercial nuclear power has grown to the point where 13 nations now operate commercial nuclear power plants. Another four countries should join this list before the end of 1980. In the Nonproliferation Alternative Systems Assessment Program (NASAP), the US DOE is evaluating a series of alternate possible power systems. The objective is to determine practical nuclear systems which could reduce proliferation risk while still maintaining the benefits of nuclear power. Part of that effort is the development of a data base denoting the energy needs, resources, technical capabilities, commitment to nuclear power, and projected future trends for various non-US countries. The data are presented by country for each of 28 non-US countries. Data are compiled in this volume on Canada, Egypt, Federal Republic of Germany, Finland, and France.

  5. International data collection and analysis. Task 1

    Energy Technology Data Exchange (ETDEWEB)

    1979-04-01

    Commercial nuclear power has grown to the point where 13 nations now operate commercial nuclear power plants. Another four countries should join this list before the end of 1980. In the Nonproliferation Alternative Systems Assessment Program (NASAP), the US DOE is evaluating a series of alternate possible power systems. The objective is to determine practical nuclear systems which could reduce proliferation risk while still maintaining the benefits of nuclear power. Part of that effort is the development of a data base denoting the energy needs, resources, technical capabilities, commitment to nuclear power, and projected future trends for various non-US countries. The data are presented by country for each of 28 non-US countries. This volume contains compiled data on Mexico, Netherlands, Pakistan, Philippines, South Africa, South Korea, and Spain.

  6. A framework for cognitive task analysis in systems design

    International Nuclear Information System (INIS)

    The present rapid development of advanced information technology and its use for the support of operators of complex technical systems are changing the content of task analysis towards the analysis of mental activities in decision making. Automation removes the humans from routine tasks, and operators are left with disturbance control and critical diagnostic tasks, for which computers are suitable for support, if it is possible to match the computer strategies and interface formats dynamically to the requirements of the current task by means of an analysis of the cognitive task. Such a cognitive task analysis will not aim at a description of the information processes suited for particular control situations. It will rather aim at an analysis in order to identify the requirements to be considered along various dimensions of the decision tasks, in order to give the user - i.e. a decision maker - the freedom to adapt his performance to system requirements in a way which matches his processing resources and subjective preferences. To serve this purpose, a number of analyses at various levels are needed to relate the control requirements of the system to the information processes and to the processing resources offered by computers and humans. The paper discusses the cognitive task analysis in terms of the following domains: the problem domain, which is a representation of the functional properties of the system, giving a consistent framework for identification of the control requirements of the system; the decision sequences required for typical situations; the mental strategies and heuristics which are effective and acceptable for the different decision functions; and the cognitive control mechanisms used, depending upon the level of skill which can/will be applied. Finally, the end-users' criteria for choice of mental strategies in the actual situation are considered, and the need for development of criteria for judging the ultimate user acceptance of computer support is

  7. Silicon dendritic web growth thermal analysis task

    Science.gov (United States)

    Richter, R.; Bhandari, P.

    1985-01-01

    A thermal analysis model is presented which describes the dendritic ribbon process. The model uses a melt-dendrite interface which projects out of the bulk melt as the basic interpretation of the ribbon production process. This is a marked departure from the interpretations of the interface phenomena which were used previously. The model was extensively illustrated with diagrams and pictures of ribbon samples. This model should have great impact on the analyses of experimental data as well as on future design modifications of ribbon-pulling equipment.

  8. Development of A Human Error Analysis Procedure for Emergency Tasks

    International Nuclear Information System (INIS)

    It has been pointed out that conventional human reliability analysis (HRA) methodologies are mainly focused on quantifying the human error probability (HEP) that an operator fails to perform a required task in a specific situation during the progression of an accident (Dougherty, 1990; Hollnagel, 1998). Generally, in conventional HRA, the situation in which a task is required is identified on the event tree (ET) or fault tree (FT) of the probabilistic safety assessment (PSA). Many HRA practitioners and risk analysts (Kirwan, 1994; Parry, 1995; Julius et al., 1995; Hollnagel, 2000) have raised the need for the inclusion of various error types (or modes), including EOCs, and for the analysis of the specific error context (error causes and performance influencing factors) inducing such error modes. In order to complement this insufficiently treated aspect of HRA, we are developing a human error analysis (HEA) procedure to help analysts identify probable error modes and their possibilities for a given task or situation. The developed HEA procedure is composed of two modules. The first module is a flowchart for the identification of error analysis items by cognitive function and for the selection of the corresponding error analysis procedure. The second module comprises the HEA procedures in which the detailed error analyses are conducted. The HEA procedure is focused on the prediction of probable error modes (including EOOs and EOCs), considering the task context for a situation in which a task is required to be performed. The framework is also being extended to the analysis of the error possibility of EOCs that can deteriorate plant safety through operators' inappropriate interventions. Section 2 describes the framework for the analysis of error possibilities, and Section 3 presents the basic structure of the developed HEA procedure. In Section 4, results of case applications to emergency tasks are introduced. A human error analysis (HEA) procedure was developed in order to help

  9. User's operating procedures. Volume 2: Scout project financial analysis program

    Science.gov (United States)

    Harris, C. G.; Haris, D. K.

    1985-01-01

    A review is presented of the user's operating procedures for the Scout Project Automatic Data System, called SPADS. SPADS is the result of the past seven years of software development on a Prime mini-computer located at the Scout Project Office, NASA Langley Research Center, Hampton, Virginia. SPADS was developed as a single-entry, multiple cross-reference data management and information retrieval system for the automation of Project office tasks, including engineering, financial, managerial, and clerical support. This volume, two (2) of three (3), provides the instructions to operate the Scout Project Financial Analysis program in data retrieval and file maintenance via user-friendly menu drivers.

  10. Unique Systems Analysis Task 7, Advanced Subsonic Technologies Evaluation Analysis

    Science.gov (United States)

    Eisenberg, Joseph D. (Technical Monitor); Bettner, J. L.; Stratton, S.

    2004-01-01

    To retain a preeminent U.S. position in the aircraft industry, aircraft passenger-mile costs must be reduced while, at the same time, meeting anticipated more stringent environmental regulations. A significant portion of these improvements will come from the propulsion system. A technology evaluation and system analysis was accomplished under this task, including areas such as aerodynamics and materials and improved methods for obtaining low noise and emissions. Previous subsonic evaluation analyses have identified key technologies in selected components for propulsion systems for the year 2015 and beyond. Based on the current economic and competitive environment, it is clear that studies with a nearer-term focus, which have a direct impact on the propulsion industry's next-generation product, are required. This study therefore emphasizes the year 2005 entry-into-service time period. The objective of this study was to determine which technologies and materials offer the greatest opportunities for improving propulsion systems. The goals are twofold. The first goal is to determine an acceptable compromise between the thermodynamic operating conditions for (a) best performance and (b) acceptable noise and chemical emissions. The second goal is the evaluation of the performance, weight, and cost of advanced materials and concepts on the direct operating cost of an advanced regional transport of comparable technology level.

  11. A Validated Task Analysis of the Single Pilot Operations Concept

    Science.gov (United States)

    Wolter, Cynthia A.; Gore, Brian F.

    2015-01-01

    The current-day flight deck operational environment consists of a two-person Captain/First Officer crew. A concept of operations (ConOps) that reduces the commercial cockpit from the current two-pilot crew to a single pilot is termed Single Pilot Operations (SPO). This concept has been under study by researchers in the Flight Deck Display Research Laboratory (FDDRL) at the National Aeronautics and Space Administration's (NASA) Ames Research Center (Johnson, Comerford, Lachter, Battiste, Feary, and Mogford, 2012) and by researchers from Langley Research Center (Schutte et al., 2007). Transitioning from a two-pilot crew to a single-pilot crew will undoubtedly require changes in operational procedures, crew coordination, use of automation, and in how the roles and responsibilities of the flight deck and ATC are conceptualized, in order to maintain the high levels of safety expected of the US National Airspace System. These modifications will affect the roles and the subsequent tasks required of the various operators in the NextGen environment. The current report outlines the process taken to identify and document the tasks required of the crew in a number of operational scenarios studied by the FDDRL between 2012 and 2014. A baseline task decomposition has been refined to represent the tasks consistent with a new set of entities, tasks, roles, and responsibilities being explored by the FDDRL as the move is made towards SPO. Information from Subject Matter Expert interviews, participation in FDDRL experimental design meetings, and study observation was used to populate and refine the task sets developed as part of the SPO task analyses. The task analysis is based upon the proposed ConOps for the third FDDRL SPO study. This experiment involved nine different entities operating in six scenarios using a variety of SPO-related automation and procedural activities required to guide safe and efficient aircraft operations. The task analysis presents the roles and

  12. Data analysis & probability task sheets : grades pk-2

    CERN Document Server

    Cook, Tanya

    2009-01-01

    For grades PK-2, our Common Core State Standards-based resource meets the data analysis & probability concepts addressed by the NCTM standards and encourages your students to learn and review the concepts in unique ways. Each task sheet is organized around a central problem taken from real-life experiences of the students.

  13. Task Analysis Exemplified: The Process of Resolving Unfinished Business.

    Science.gov (United States)

    Greenberg, Leslie S.; Foerster, Florence S.

    1996-01-01

    The steps of a task analysis research program designed to identify the in-session performances involved in resolving lingering bad feelings toward a significant other are described. A rational-empirical methodology of repeatedly cycling between rational conjecture and empirical observations is demonstrated as a method of developing an intervention…

  14. Can Distributed Volunteers Accomplish Massive Data Analysis Tasks?

    Science.gov (United States)

    Kanefsky, B.; Barlow, N. G.; Gulick, V. C.

    2001-01-01

    We argue that many image analysis tasks can be performed by distributed amateurs. Our pilot study, with crater surveying and classification, has produced encouraging results in terms of both quantity (100,000 crater entries in 2 months) and quality. Additional information is contained in the original extended abstract.

  15. Analysis and Modeling of Control Tasks in Dynamic Systems

    DEFF Research Database (Denmark)

    Ursem, Rasmus Kjær; Krink, Thiemo; Jensen, Mikkel Thomas;

    2002-01-01

    ...test-case generators (TCGs), which requires a systematic analysis of dynamic optimization tasks. So far, only a few TCGs have been suggested. Our investigation leads to the conclusion that these TCGs are not capable of generating realistic dynamic benchmark tests. The result of our research is the design of a new TCG...

  16. Partial Derivative Games in Thermodynamics: A Cognitive Task Analysis

    Science.gov (United States)

    Kustusch, Mary Bridget; Roundy, David; Dray, Tevian; Manogue, Corinne A.

    2014-01-01

    Several studies in recent years have demonstrated that upper-division students struggle with the mathematics of thermodynamics. This paper presents a task analysis based on several expert attempts to solve a challenging mathematics problem in thermodynamics. The purpose of this paper is twofold. First, we highlight the importance of cognitive task…

  17. An Analysis of Tasks Performed in the Ornamental Horticulture Industry.

    Science.gov (United States)

    Berkey, Arthur L.; Drake, William E.

    This publication is the result of a detailed task analysis study of the ornamental horticulture industry in New York State. Nine types of horticulture businesses identified were: (1) retail florists, (2) farm and garden supply store, (3) landscape services, (4) greenhouse production, (5) nursery production, (6) turf production, (7) arborist…

  18. Finite Volume Methods: Foundation and Analysis

    Science.gov (United States)

    Barth, Timothy; Ohlberger, Mario

    2003-01-01

    Finite volume methods are a class of discretization schemes that have proven highly successful in approximating the solution of a wide variety of conservation law systems. They are extensively used in fluid mechanics, porous media flow, meteorology, electromagnetics, models of biological processes, semiconductor device simulation, and many other engineering areas governed by conservative systems that can be written in integral control volume form. This article reviews elements of the foundation and analysis of modern finite volume methods. The primary advantages of these methods are numerical robustness through the satisfaction of discrete maximum (minimum) principles, applicability on very general unstructured meshes, and the intrinsic local conservation properties of the resulting schemes. Throughout this article, specific attention is given to scalar nonlinear hyperbolic conservation laws and the development of high-order accurate schemes for discretizing them. A key tool in the design and analysis of finite volume schemes suitable for non-oscillatory discontinuity capturing is discrete maximum principle analysis. A number of building blocks used in the development of numerical schemes possessing local discrete maximum principles are reviewed in one and several space dimensions, e.g. monotone fluxes, E-fluxes, TVD discretization, non-oscillatory reconstruction, slope limiters, positive coefficient schemes, etc. When available, theoretical results concerning a priori and a posteriori error estimates are given. Further advanced topics are then considered, such as high-order time integration, discretization of diffusion terms, and the extension to systems of nonlinear conservation laws.
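The building blocks named in this abstract (monotone fluxes, CFL-limited explicit updates, discrete maximum principles) can be illustrated with a minimal sketch. The example below is an illustrative toy, not material from the article: it applies the local Lax-Friedrichs (Rusanov) flux, one of the monotone fluxes mentioned, to the 1D inviscid Burgers equation on a periodic grid.

```python
import numpy as np

# Toy sketch (not from the article): first-order finite volume scheme for
# the 1D inviscid Burgers equation u_t + (u^2/2)_x = 0 on a periodic grid,
# using the local Lax-Friedrichs (Rusanov) flux, which is monotone.

def flux(u):
    return 0.5 * u ** 2

def lax_friedrichs(ul, ur, alpha):
    # Monotone numerical flux: central average plus upwind-type dissipation
    return 0.5 * (flux(ul) + flux(ur)) - 0.5 * alpha * (ur - ul)

def step(u, dx, dt):
    alpha = np.abs(u).max()                    # bound on the wave speed |f'(u)|
    left = np.roll(u, 1)                       # periodic neighbors
    right = np.roll(u, -1)
    f_plus = lax_friedrichs(u, right, alpha)   # flux at i+1/2
    f_minus = lax_friedrichs(left, u, alpha)   # flux at i-1/2
    return u - (dt / dx) * (f_plus - f_minus)

n = 200
dx = 1.0 / n
x = np.linspace(0.0, 1.0, n, endpoint=False)
u = np.sin(2.0 * np.pi * x)
dt = 0.4 * dx / np.abs(u).max()   # CFL number 0.4 keeps the scheme monotone

for _ in range(100):              # integrate to t = 0.2, past shock formation
    u = step(u, dx, dt)

# A monotone scheme obeys a discrete maximum principle: no new extrema are
# created, so u stays within the initial bounds [-1, 1] even after the shock.
```

Because the flux is monotone and the CFL condition holds, the update is a convex combination of neighboring cell averages, which is exactly how the discrete maximum principle mentioned in the abstract is obtained.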

  19. Feasibility of developing a portable driver performance data acquisition system for human factors research: Technical tasks. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Carter, R.J.; Barickman, F.S.; Spelt, P.F.; Schmoyer, R.L.; Kirkpatrick, J.R.

    1998-01-01

    A two-phase, multi-year research program entitled "Development of a Portable Driver Performance Data Acquisition System for Human Factors Research" was recently completed. The primary objective of the project was to develop a portable data acquisition system for crash avoidance research (DASCAR) that will allow driver performance data to be collected using a large variety of vehicle types and that would be capable of being installed on a given vehicle type within a relatively short time frame. During phase 1, a feasibility study for designing and fabricating DASCAR was conducted. In phase 2 of the research, DASCAR was actually developed and validated. This technical memorandum documents the results from the feasibility study. It is subdivided into three volumes. Volume one (this report) addresses the last five items in the phase 1 research and the first issue in the second phase of the project. Volumes two and three present the related appendices and the design specifications developed for DASCAR, respectively. The six tasks were oriented toward: identifying parameters and measures; identifying analysis tools and methods; identifying measurement techniques and state-of-the-art hardware and software; developing design requirements and specifications; determining the cost of one or more copies of the proposed data acquisition system; and designing a development plan and constructing DASCAR. This report also covers: the background to the program; the requirements for the project; micro camera testing; heat load calculations for the DASCAR instrumentation package in automobile trunks; phase 2 of the research; the DASCAR hardware and software delivered to the National Highway Traffic Safety Administration; and crash avoidance problems that can be addressed by DASCAR.

  20. TARGET - TASK ANALYSIS REPORT GENERATION TOOL, VERSION 1.0

    Science.gov (United States)

    Ortiz, C. J.

    1994-01-01

    The Task Analysis Report Generation Tool, TARGET, is a graphical interface tool used to capture procedural knowledge and translate that knowledge into a hierarchical report. TARGET is based on VISTA, a knowledge acquisition tool developed by the Naval Systems Training Center. TARGET assists a programmer and/or task expert in organizing and understanding the steps involved in accomplishing a task. The user can label individual steps in the task through a dialog box and get immediate graphical feedback for analysis. TARGET users can decompose tasks into basic action kernels, or minimal steps, to provide a clear picture of all basic actions needed to accomplish a job. This method allows the user to go back and critically examine the overall flow and makeup of the process. The user can switch between graphics (box flow diagrams) and text (task hierarchy) versions to more easily study the process being documented. As the decomposition continues, tasks and their subtasks can be continually modified to more accurately reflect the user's procedures and rationale. This program is designed to help a programmer document an expert's task, thus allowing the programmer to build an expert system which can help others perform the task. Flexibility is a key element of the system design and of the knowledge acquisition session. If the expert is not able to find time to work on the knowledge acquisition process with the program developer, the developer and subject matter expert may work in iterative sessions. TARGET is easy to use and is tailored to accommodate users ranging from the novice to the experienced expert systems builder. TARGET is written in C for IBM PC series and compatible computers running MS-DOS and Microsoft Windows version 3.0 or 3.1. No source code is supplied. The executable also requires 2 MB of RAM, a Microsoft-compatible mouse, a VGA display, and an 80286, 386, or 486 processor machine. 
The standard distribution medium for TARGET is one 5.25 inch 360K

  1. Inpo/industry job and task analysis efforts

    International Nuclear Information System (INIS)

    One of the goals of INPO is to develop and coordinate industrywide programs to improve the education, training and qualification of nuclear utility personnel. To accomplish this goal, INPO's Training and Education Division: conducts periodic evaluations of industry training programs; provides assistance to the industry in developing training programs; manages the accreditation of utility training programs. These efforts are aimed at satisfying the need for training programs for nuclear utility personnel to be performance-based. Performance-based means that training programs provide an incumbent with the skills and knowledge required to safely perform the job. One of the ways that INPO has provided assistance to the industry is through the industrywide job and task analysis effort. I will discuss the job analysis and task analysis processes, the current status of JTA efforts, JTA products and JTA lessons learned

  2. Evaluation of the Trajectory Operations Applications Software Task (TOAST). Volume 2: Interview transcripts

    Science.gov (United States)

    Perkins, Sharon; Martin, Andrea; Bavinger, Bill

    1990-01-01

    The Trajectory Operations Applications Software Task (TOAST) is a software development project whose purpose is to provide trajectory operations pre-mission and real-time support for the Space Shuttle. The evaluation assessed TOAST as an Application Manager: its current and planned capabilities, how those capabilities compare to commercially available off-the-shelf (COTS) software, and the requirements of the MCC and Flight Analysis Design System (FADS) for TOAST implementation. As a major part of the data gathering for the evaluation, interviews were conducted with NASA and contractor personnel. Real-time and flight design users, orbit navigation users, the TOAST developers, and management were interviewed. Code reviews and demonstrations were also held. Each of these interviews was videotaped and transcribed as appropriate. Transcripts were edited and are presented chronologically.

  3. Using task analysis to understand the Data System Operations Team

    Science.gov (United States)

    Holder, Barbara E.

    1994-01-01

    The Data Systems Operations Team (DSOT) currently monitors the Multimission Ground Data System (MGDS) at JPL. The MGDS currently supports five spacecraft and within the next five years, it will support ten spacecraft simultaneously. The ground processing element of the MGDS consists of a distributed UNIX-based system of over 40 nodes and 100 processes. The MGDS system provides operators with little or no information about the system's end-to-end processing status or end-to-end configuration. The lack of system visibility has become a critical issue in the daily operation of the MGDS. A task analysis was conducted to determine what kinds of tools were needed to provide DSOT with useful status information and to prioritize the tool development. The analysis provided the formality and structure needed to get the right information exchange between development and operations. How even a small task analysis can improve developer-operator communications is described, and the challenges associated with conducting a task analysis in a real-time mission operations environment are examined.

  4. Sandia-Power Surety Task Force Hawaii foam analysis.

    Energy Technology Data Exchange (ETDEWEB)

    McIntyre, Annie

    2010-11-01

    The Office of the Secretary of Defense (OSD) Power Surety Task Force was officially created in early 2008, after nearly two years of work in demand reduction and renewable energy technologies to support the Warfighter in Theater. The OSD Power Surety Task Force is tasked with identifying efficient energy solutions that support mission requirements. Spray foam insulation demonstrations were recently expanded beyond field structures to include military housing at Ft. Belvoir. Initial results of using the foam in both applications are favorable. This project will address the remaining key questions: (1) Can this technology help to reduce utility costs for the Installation Commander? (2) Is the foam cost effective? (3) What application differences in housing affect those key metrics? The critical need for energy solutions in Hawaii and the existing relationships among Sandia, the Department of Defense (DOD), the Department of Energy (DOE), and Forest City make this location a logical choice for a foam demonstration. This project includes the application and analysis of foam on a residential duplex at the Waikulu military community on Oahu, Hawaii, as well as reference to spray foam applied to a PACOM facility and additional foamed units on Maui during this project phase. This report concludes the analysis and describes the utilization of foam insulation at military housing in Hawaii and the subsequent data gathering and analysis.

  5. A predictive cognitive error analysis technique for emergency tasks

    International Nuclear Information System (INIS)

    This paper introduces an analysis framework and procedure to support the cognitive error analysis of emergency tasks in nuclear power plants. The framework provides a new perspective on the utilization of error factors in error prediction and can be characterized by two features. First, error factors that affect the occurrence of human error are classified into three groups - 'task characteristics factors (TCF)', 'situation factors (SF)', and 'performance assisting factors (PAF)' - and are utilized in error prediction. This classification aims to support error prediction from the viewpoint of assessing the adequacy of PAF under given TCF and SF. Second, the assessment of error factors is made from the perspective of the performance of each cognitive function. Through this, error factor assessment is made in an integrative way, not independently. Furthermore, it enables analysts to identify vulnerable cognitive functions and error factors, and to obtain specific error reduction strategies. Finally, the framework and procedure were applied to the error analysis of the 'bleed and feed operation' of emergency tasks.

  6. Metrics and Measures for Intelligence Analysis Task Difficulty

    Energy Technology Data Exchange (ETDEWEB)

    Greitzer, Frank L.; Allwein, Kelcy M.

    2005-05-02

    Recent workshops and conferences supporting the intelligence community (IC) have highlighted the need to characterize the difficulty or complexity of intelligence analysis (IA) tasks in order to facilitate assessments of the impact or effectiveness of IA tools being considered for introduction into the IC. Some fundamental issues are: (a) how to employ rigorous methodologies in evaluating tools, given problems such as controlling for task difficulty, effects of time or learning, and small-sample-size limitations; (b) how to measure the difficulty/complexity of IA tasks in order to establish valid experimental/quasi-experimental designs aimed at supporting the evaluation of tools; and (c) how to develop more rigorous (summative), performance-based measures of human performance during the conduct of IA tasks, beyond the more traditional reliance on formative assessments (e.g., subjective ratings). Invited discussants will be asked to comment on one or more of these issues, with the aim of bringing the most salient issues and research needs into focus.

  7. Analysis of the Discontinuities in Prioritized Tasks-Space Control Under Discrete Task Scheduling Operations

    OpenAIRE

    Keith, François; Wieber, Pierre-Brice; Mansard, Nicolas; Kheddar, Abderrahmane

    2011-01-01

    This paper examines the control continuity in hierarchical task-space controllers. While continuity is ensured for any a priori fixed number of tasks - even in ill-conditioned configurations - the control resulting from a hierarchical stack-of-tasks computation may not be continuous under some discrete events. In particular, we study how the continuity of the stack-of-tasks control computation is affected under discrete scheduling operations such as on-the-fly prior...

  8. Learners Performing Tasks in a Japanese EFL Classroom: A Multimodal and Interpersonal Approach to Analysis

    Science.gov (United States)

    Stone, Paul

    2012-01-01

    In this paper I describe and analyse learner task-based interactions from a multimodal perspective with the aim of better understanding how learners' interpersonal relationships might affect task performance. Task-based pedagogy is focused on classroom interaction between learners, yet analysis of tasks has often neglected the analysis of this…

  9. Pressurized fluidized-bed hydroretorting of eastern oil shales. Volume 4, Task 5, Operation of PFH on beneficiated shale, Task 6, Environmental data and mitigation analyses and Task 7, Sample procurement, preparation, and characterization: Final report, September 1987--May 1991

    Energy Technology Data Exchange (ETDEWEB)

    1992-03-01

    The objective of Task 5 (Operation of Pressurized Fluidized-Bed Hydro-Retorting (PFH) on Beneficiated Shale) was to modify the PFH process to facilitate its use for fine-sized, beneficiated Eastern shales. This task was divided into three subtasks: Non-Reactive Testing, Reactive Testing, and Data Analysis and Correlations. The potential environmental impacts of PFH processing of oil shale must be assessed throughout the development program to ensure that the appropriate technologies are in place to mitigate any adverse effects. The overall objectives of Task 6 (Environmental Data and Mitigation Analyses) were to obtain environmental data relating to PFH and shale beneficiation and to analyze the potential environmental impacts of the integrated PFH process. This task was divided into four subtasks: 6.1, Characterization of Processed Shales (IGT); 6.2, Water Availability and Treatment Studies; 6.3, Heavy Metals Removal; and 6.4, PFH Systems Analysis. The objective of Task 7 (Sample Procurement, Preparation, and Characterization) was to procure, prepare, and characterize raw and beneficiated bulk samples of Eastern oil shale for all of the experimental tasks in the program. Accomplishments for these tasks are presented.

  10. Physical and cognitive task analysis in interventional radiology

    International Nuclear Information System (INIS)

    AIM: To identify, describe, and detail the cognitive thought processes, decision-making, and physical actions involved in the preparation and successful performance of core interventional radiology procedures. MATERIALS AND METHODS: Five commonly performed core interventional radiology procedures were selected for cognitive task analysis. Several examples of each procedure being performed by consultant interventional radiologists were videoed. The videos of those procedures, and the steps required for a successful outcome, were analysed by a psychologist and an interventional radiologist. Once a skeleton algorithm of the procedures was defined, further refinement was achieved using individual interview techniques with consultant interventional radiologists. Additionally, a critique of each iteration of the established algorithm was sought from non-participating independent consultant interventional radiologists. RESULTS: Detailed task descriptions and decision protocols were developed for five interventional radiology procedures (arterial puncture, nephrostomy, venous access, biopsy using both ultrasound and computed tomography, and percutaneous transhepatic cholangiogram). Identical tasks performed within these procedures were identified and standardized within the protocols. CONCLUSIONS: Complex procedures were broken down and their constituent processes identified. This might be suitable for use as a training protocol to provide universally acceptable safe practice at the most fundamental level. It is envisaged that data collected in this way can be used as an educational resource for trainees and could provide the basis for a training curriculum in interventional radiology. It will direct trainees towards safe practice of the highest standard. It will also provide performance objectives for a simulator model.

  11. Job task analysis: lessons learned from application in course development

    International Nuclear Information System (INIS)

    Public Service Electric and Gas Company is committed to a systematic approach to training known as Instructional System Design (ISD). Our performance-based training emphasizes the ISD process by having trainees perform the task, whenever and wherever possible, for the jobs for which they are being trained. Included is a brief description of our process for conducting and validating job analyses. The major thrust of this paper, however, is the lessons we have learned in the design and development of training programs based upon job analysis results.

  12. Using Goal Setting and Task Analysis to Enhance Task-Based Language Learning and Teaching

    Science.gov (United States)

    Rubin, Joan

    2015-01-01

    Task-Based Language Learning and Teaching has received sustained attention from teachers and researchers for over thirty years. It is a well-established pedagogy that includes the following characteristics: major focus on authentic and real-world tasks, choice of linguistic resources by learners, and a clearly defined non-linguistic outcome. This…

  13. Across-Task Priming Revisited: Response and Task Conflicts Disentangled Using Ex-Gaussian Distribution Analysis

    Science.gov (United States)

    Moutsopoulou, Karolina; Waszak, Florian

    2012-01-01

    The differential effects of task and response conflict in priming paradigms where associations are strengthened between a stimulus, a task, and a response have been demonstrated in recent years with neuroimaging methods. However, such effects are not easily disentangled with only measurements of behavior, such as reaction times (RTs). Here, we…

  14. The Indirect Effect of Age Group on Switch Costs via Gray Matter Volume and Task-Related Brain Activity

    Science.gov (United States)

    Steffener, Jason; Gazes, Yunglin; Habeck, Christian; Stern, Yaakov

    2016-01-01

    Healthy aging simultaneously affects brain structure, brain function, and cognition. These effects are often investigated in isolation, ignoring any relationships between them. It is plausible that age-related declines in cognitive performance are the result of age-related structural and functional changes. This straightforward idea is tested within a conceptual research model of cognitive aging. The current study tested whether age-related declines in task performance were explained by age-related differences in brain structure and brain function, using a task-switching paradigm in 175 participants. Sixty-three young and 112 old participants underwent MRI scanning of brain structure and brain activation. The experimental task was an executive context dual task, with switch costs in response time as the behavioral measure. A serial mediation model was applied voxel-wise throughout the brain, testing all pathways between age group, gray matter volume, brain activation, and increased switch costs (worsening performance). There were widespread age group differences in gray matter volume and brain activation. Switch costs also differed significantly by age group. There were brain regions demonstrating significant indirect effects of age group on switch costs via the pathway through gray matter volume and brain activation. These were in the bilateral precuneus, bilateral parietal cortex, the left precentral gyrus, cerebellum, fusiform, and occipital cortices. There were also significant indirect effects via the brain-activation pathway after controlling for gray matter volume. These effects were in the cerebellum, occipital cortex, left precentral gyrus, bilateral supramarginal, bilateral parietal, precuneus, middle cingulate extending to medial superior frontal gyri, and the left middle frontal gyri. There were no significant effects through the gray-matter-volume-alone pathway. 
These results demonstrate that a large proportion of the age group effect on switch costs can be explained by age-related differences in brain structure and brain function.
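The serial mediation pathway described above (age group → gray matter volume → brain activation → switch costs) amounts to a product of regression path coefficients. The sketch below is a minimal illustration on synthetic data, assuming standardized variables and ordinary least squares; the variable names and coefficients are hypothetical, not the study's estimates.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20_000

# Hypothetical path coefficients (illustrative only, not the study's estimates)
a1, d21, b2 = 0.5, 0.6, 0.7

X  = rng.standard_normal(n)                                       # age group (standardized)
M1 = a1 * X + 0.5 * rng.standard_normal(n)                        # gray matter volume
M2 = d21 * M1 + 0.2 * X + 0.5 * rng.standard_normal(n)            # brain activation
Y  = 0.1 * M1 + b2 * M2 + 0.1 * X + 0.5 * rng.standard_normal(n)  # switch cost

def ols(y, *cols):
    """Least-squares slopes of y on the given columns (intercept dropped)."""
    A = np.column_stack([np.ones_like(y)] + list(cols))
    return np.linalg.lstsq(A, y, rcond=None)[0][1:]

a1_hat  = ols(M1, X)[0]           # X -> M1
d21_hat = ols(M2, X, M1)[1]       # M1 -> M2, controlling for X
b2_hat  = ols(Y, X, M1, M2)[2]    # M2 -> Y, controlling for X and M1

# Serial indirect effect through both mediators: X -> M1 -> M2 -> Y
serial_indirect = a1_hat * d21_hat * b2_hat
```

With these synthetic coefficients the serial indirect effect recovers a1 × d21 × b2 = 0.21; in the study itself, voxel-wise significance of such products is typically assessed by bootstrapping.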

  15. Theoretical Analysis of Task-Based Language Teaching Pedagogy

    Institute of Scientific and Technical Information of China (English)

    HUANG Li-na

    2013-01-01

    Since the implementation of the new English curriculum (XinCheng), English teachers have been actively studying the task-based language teaching approach and trying to use task-based language teaching in classroom instruction. This article examines the implementation of task-based language teaching and discusses its application in English teaching.

  16. Considerations for Task Analysis Methods and Rapid E-Learning Development Techniques

    OpenAIRE

    Dr. Ismail Ipek; Dr. Ömer Faruk Sözcü

    2014-01-01

    The purpose of this paper is to provide basic dimensions for rapid training development in e-learning courses in education and business. Principally, it starts with defining task analysis and how to select tasks for analysis and task analysis methods for instructional design. To do this, first, learning and instructional technologies as visions of the future were discussed. Second, the importance of task analysis methods in rapid e-learning was considered, with learning technologi...

  17. Task-positive and task-negative networks and their relation to depression: EEG beamformer analysis.

    Science.gov (United States)

    Knyazev, Gennady G; Savostyanov, Alexander N; Bocharov, Andrey V; Tamozhnikov, Sergey S; Saprigyn, Alexander E

    2016-06-01

    Major Depressive Disorder (MDD) has been associated with predominance of the default-mode network (DMN) over the task-positive network (TPN), which is considered a neurobiological basis for ruminative responding. It is not known whether this predominance is a signature of full-blown MDD or whether it already exists at preclinical stages. Besides, all relevant evidence has been obtained using fMRI, which allows for a precise spatial characterization of resting state networks (RSNs), but their neural correlates remain elusive. Here we show that after leakage correction of beamformer-projected resting EEG time series, seed-based oscillatory-power envelope correlation analysis reveals RSNs with significant similarity to the respective fMRI RSNs. In a non-clinical sample, depressive symptoms, as measured by the Beck Depression Inventory, are associated with predominance of DMN over TPN connectivity in the right insula and the right temporal lobe in the delta frequency band. These findings imply that in individuals with a heightened level of depressive symptoms, emotional circuits are more strongly connected with the DMN than with the TPN and should be more easily engaged in self-referential rumination than in responding to environmental challenges. The study's findings are in agreement with fMRI evidence, thus confirming the neural basis of the effects observed in fMRI research and showing that the neural mechanism implicated in depression may already be at work even at preclinical stages. PMID:27001453

  18. Spatio-temporal analysis reveals active control of both task-relevant and task-irrelevant variables

    Directory of Open Access Journals (Sweden)

    Kornelius Rácz

    2013-11-01

    Full Text Available The Uncontrolled Manifold hypothesis and Minimal Intervention principle propose that the observed differential variability across task-relevant (i.e., task goals) vs. task-irrelevant (i.e., in the null space of those goals) variables is evidence of a separation of task variables for efficient neural control, ranked by their respective variabilities (sometimes referred to as a hierarchy of control). Support for this comes from spatial domain analyses (i.e., the structure of kinematic, kinetic, and EMG variability). While proponents admit the possibility of "preferential" as opposed to strictly "uncontrolled" variables, such distinctions have only begun to be quantified or considered in the temporal domain when inferring control action. Here we extend the study of task variability during static tripod grasp to the temporal domain by applying diffusion analysis. We show that both task-relevant and task-irrelevant parameters show corrective action at some time scales; and conversely, that task-relevant parameters do not show corrective action at other time scales. That is, the spatial fluctuations of fingertip forces show, as expected, greater ranges of variability in task-irrelevant variables (> 98% associated with changes in total grasp force; vs. only
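The task-relevant vs. task-irrelevant (null space) split used by Uncontrolled Manifold analyses can be illustrated numerically. The sketch below is a hypothetical example, not the paper's analysis: the only task variable is total normal force during a tripod grasp, the fingertip-force data are synthetic, and the variance split is computed by projecting force deviations onto the null space of the task Jacobian and onto its orthogonal complement.

```python
import numpy as np

rng = np.random.default_rng(1)

# Task: keep total normal force f1 + f2 + f3 constant (hypothetical tripod grasp).
J = np.array([[1.0, 1.0, 1.0]])           # task Jacobian
_, _, Vt = np.linalg.svd(J)
ort_basis = Vt[:1]                        # task-relevant direction (changes total force)
ucm_basis = Vt[1:]                        # null space: total force is unchanged

# Synthetic trials: scatter deliberately larger inside the UCM (3x) than orthogonal to it.
n = 5000
dev = rng.standard_normal((n, 2)) @ (3.0 * ucm_basis) + \
      rng.standard_normal((n, 1)) @ (1.0 * ort_basis)
forces = np.array([5.0, 5.0, 5.0]) + dev

d = forces - forces.mean(axis=0)
v_ucm = np.mean((d @ ucm_basis.T) ** 2)   # per-dimension variance within the UCM
v_ort = np.mean((d @ ort_basis.T) ** 2)   # per-dimension variance orthogonal to it
```

A ratio v_ucm / v_ort well above 1 is the classic spatial-domain signature of preferential control of the task-relevant variable; the paper's contribution is to add temporal (diffusion) structure on top of this split.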

  19. Structured information analysis for human reliability analysis of emergency tasks in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Won Dea; Kim, Jae Whan; Park, Jin Kyun; Ha, Jae Joo [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2000-02-01

    More than twenty HRA (Human Reliability Analysis) methodologies have been developed and used for safety analysis in the nuclear field during the past two decades. However, no methodology appears to have been universally accepted, as various limitations have been raised for the more widely used ones. One of the most important limitations of conventional HRA is insufficient analysis of the task structure and problem space. To resolve this problem, we suggest SIA (Structured Information Analysis) for HRA. The proposed SIA consists of three parts. The first part is the scenario analysis, which investigates the contextual information related to the given task on the basis of selected scenarios. The second is the goals-means analysis, which defines the relations between the cognitive goal and task steps. The third is the cognitive function analysis module, which identifies the cognitive patterns and information flows involved in the task. Through the three-part analysis, systematic investigation is made possible, from macroscopic information on the tasks to microscopic information on the specific cognitive processes. It is expected that analysts can attain a structured set of information that helps to predict the types and possibility of human error in the given task. 48 refs., 12 figs., 11 tabs. (Author)

  20. Physical Education-in-CLIL tasks. Determining tasks characteristics through the analysis of the diaries

    OpenAIRE

    Josep Coral Mateu; Teresa Lleixà Arribas

    2013-01-01

    This article focuses on the characteristics of Physical Education-in-CLIL (PE-in-CLIL) tasks. CLIL (Content and Language Integrated Learning) is a teaching approach which uses a foreign language as a tool to enhance the subject learning process. We connect PE-in-CLIL with key competences and introduce the CLIL 4Cs framework. We establish the aims of the study, that is: to describe the features of tasks which are most suitable to PE-in-CLIL and to identify integrated tasks which appeal most to l...

  1. Computational stress analysis using finite volume methods

    OpenAIRE

    Fallah, Nosrat Allah

    2000-01-01

    There is a growing interest in applying finite volume methods to model solid mechanics problems and multi-physics phenomena. During the last ten years an increasing amount of activity has taken place in this area. Unlike the finite element formulation, which generally involves volume integrals, the finite volume formulation transfers volume integrals to surface integrals using the divergence theorem. This transformation of the convection and diffusion terms in the governing equations ensures...

  2. Task analysis and computer aid development for human reliability analysis in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, W. C.; Kim, H.; Park, H. S.; Choi, H. H.; Moon, J. M.; Heo, J. Y.; Ham, D. H.; Lee, K. K.; Han, B. T. [Korea Advanced Institute of Science and Technology, Taejeon (Korea)

    2001-04-01

    The importance of human reliability analysis (HRA), which predicts the possibility of error occurrence in quantitative and qualitative manners, has gradually increased because of the effects of human errors on system safety. HRA requires task analysis as a prerequisite step, but extant task analysis techniques have the problem that the collection of information about the situation in which a human error occurs depends entirely on the HRA analyst. This makes the results of task analysis inconsistent and unreliable. To remedy this problem, KAERI developed structured information analysis (SIA), which helps analysts examine task structure and situations systematically. In this study, the SIA method was evaluated by HRA experts, and a prototype computerized supporting system named CASIA (Computer Aid for SIA) was developed to support performing HRA with the SIA method. Additionally, through applying the SIA method to emergency operating procedures, we derived generic task types used in emergencies and accumulated the analysis results in the CASIA database. CASIA is expected to help HRA analysts perform analyses more easily and consistently. As more analyses are performed and more data accumulate in the CASIA database, analysts will be able to share their analysis experience freely, and thereby the quality of HRA will improve. 35 refs., 38 figs., 25 tabs. (Author)

  3. Taste perception analysis using a semantic verbal fluency task

    Directory of Open Access Journals (Sweden)

    Ghemulet M

    2014-09-01

    Full Text Available Maria Ghemulet,1,2 Maria Baskini,3 Lambros Messinis,2,4 Eirini Mouza,1 Hariklia Proios1,5 1Department of Speech Therapy, Anagennisis (Revival) Physical Recovery and Rehabilitation Centre, Nea Raidestos, Filothei, Thessaloniki, Greece; 2Department of Speech and Language Therapy, Technological Institute of Western Greece, Patra, Greece; 3Department of Neurosurgery, Interbalkan European Medical Centre, Thessaloniki, Greece; 4Neuropsychology Section, Department of Neurology, University of Patras, Medical School, Patras, Greece; 5Department of Education and Social Policy, University of Macedonia, Thessaloniki, Greece Abstract: A verbal fluency (VF) task is a test used to examine cognitive perception. The main aim of this study was to explore a possible relationship between taste perception in the basic taste categories (sweet, salty, sour, and bitter) and subjects' taste preferences, using a VF task in healthy and dysphagic subjects. In addition, we correlated the results of the VF task with body mass index (BMI). The hypothesis is that categorical preferences would be consistent with the number of verbal responses. We also hypothesized that higher BMI (>30 kg/m2) would correlate with more responses in either some or all four categories. VF tasks were randomly administered. Analysis criteria included number of verbally produced responses, number of clusters, number of switches, number and type of errors, and VF consistency with taste preferences. Sixty Greek-speaking individuals participated in this study. Forty-three healthy subjects were selected with a wide range of ages, sexes, and education levels. Seventeen dysphagic patients were then matched with 17 healthy subjects according to age, sex, and BMI. Quantitative analyses (one-way analysis of variance between groups as well as repeated measures, post hoc, and chi-square) and qualitative analyses were performed. In the healthy subjects' group, the differences among the mean number of responses for the four

  4. Analysis of Brain Cognitive State for Arithmetic Task and Motor Task Using Electroencephalography Signal

    Directory of Open Access Journals (Sweden)

    R Kalpana

    2013-08-01

    Full Text Available Localizing the brain dynamics of cognitive processes from EEG signatures has been a challenging task for the last two decades. In this paper we explore the spatio-temporal correlations of brain electrical neuronal activity for cognitive tasks, such as an arithmetic task and a motor task, using a 3D cortical distribution method. Ten healthy right-handed volunteers participated in the experiment. EEG signals were acquired during the resting state with eyes open and eyes closed, and while performing the motor task and arithmetic calculations. The signal was then computed for three-dimensional cortical distributions on a realistic head model with the MNI152 template using standardized low-resolution brain electromagnetic tomography (sLORETA). This was followed by an appropriate standardization of the current density, producing images of electric neuronal activity without localization bias. Neuronal generators responsible for the cognitive states of the arithmetic task and the motor task were localized. The results were correlated with previous neuroimaging (fMRI) investigations. Hence our results indicate that neuronal activity from the EEG signal can be demonstrated at the cortical level with good spatial resolution. The 3D cortical distribution method may thus be used to obtain both spatial and temporal information from the EEG signal and may prove to be a significant technique for investigating cognitive functions in mental health and brain dysfunction. It may also be helpful for brain/human-computer interfacing.

  5. Uncertainty analysis in the task of individual monitoring data

    International Nuclear Information System (INIS)

    Assessment of internal doses is an essential component of individual monitoring programmes for workers and consists of two stages: individual monitoring measurements and interpretation of the monitoring data in terms of annual intake and/or annual internal dose. The overall uncertainty in the assessed dose is a combination of the uncertainties in these stages. An algorithm and a computer code were developed for estimating the uncertainty in the assessment of internal dose in the task of individual monitoring data interpretation. Two main influencing factors are analysed in this paper: the unknown time of the exposure and the variability of bioassay measurements. The aim of this analysis is to show that the algorithm is applicable in designing an individual monitoring programme for workers so as to guarantee that the individual dose calculated from individual monitoring measurements does not exceed a required limit with a certain confidence probability. (author)
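The two influencing factors named above (unknown exposure time and bioassay variability) lend themselves to a Monte Carlo treatment. The sketch below is a hypothetical illustration, not the paper's algorithm: it assumes a single intake at an unknown time within a 30-day monitoring interval, a simple exponential retention function, lognormal measurement scatter, and an invented dose coefficient.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100_000

# Illustrative assumptions (not the paper's model)
half_life_d = 50.0        # effective retention half-life (days)
measured_bq = 500.0       # bioassay result (Bq)
sv_per_bq   = 1e-6        # committed dose per unit intake (Sv/Bq), hypothetical

t_exposure = rng.uniform(0.0, 30.0, n)                   # unknown time since intake
meas_err   = rng.lognormal(mean=0.0, sigma=0.2, size=n)  # bioassay variability
retention  = np.exp(-np.log(2.0) * t_exposure / half_life_d)

intake_bq = measured_bq * meas_err / retention           # inferred intake per sample
dose_sv   = intake_bq * sv_per_bq                        # committed dose per sample

best  = float(np.median(dose_sv))           # central dose estimate
upper = float(np.quantile(dose_sv, 0.95))   # 95th-percentile confidence bound
```

Comparing the upper percentile (rather than the central estimate) against the dose limit is what lets a monitoring programme claim compliance "with a certain confidence probability".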

  6. Inclusion of task dependence in human reliability analysis

    International Nuclear Information System (INIS)

    Dependence assessment among human errors in human reliability analysis (HRA) is an important issue, which includes the evaluation of the dependence among human tasks and the effect of that dependence on the final human error probability (HEP). This paper presents a computational model to handle dependence in human reliability analysis. The aim of the study is to automatically provide conclusions on the overall degree of dependence and to calculate the conditional human error probability (CHEP) once the judgments of the input factors are given. The dependence influencing factors are first identified by the experts, and the priorities of these factors are also taken into consideration. Anchors and qualitative labels are provided as guidance for the HRA analyst's judgment of the input factors. The overall degree of dependence between human failure events is calculated based on the input values and the weights of the input factors. Finally, the CHEP is obtained according to a computing formula derived from the technique for human error rate prediction (THERP) method. The proposed method is able to quantify the subjective judgment from the experts and improve the transparency of the HEP evaluation process. Two examples illustrate the effectiveness and the flexibility of the proposed method. - Highlights: • We propose a computational model to handle dependence in human reliability analysis. • The priorities of the dependence influencing factors are taken into consideration. • The overall dependence degree is determined by input judgments and the weights of factors. • The CHEP is obtained according to a computing formula derived from THERP
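The CHEP computing formula mentioned above derives from the standard THERP dependence equations (Swain and Guttmann, NUREG/CR-1278), which map five dependence levels to conditional error probabilities. A minimal sketch:

```python
# THERP dependence equations: conditional HEP of the current task
# given failure of the preceding task, for the five standard levels.
def conditional_hep(hep, level):
    formulas = {
        "ZD": lambda p: p,                  # zero dependence
        "LD": lambda p: (1 + 19 * p) / 20,  # low dependence
        "MD": lambda p: (1 + 6 * p) / 7,    # moderate dependence
        "HD": lambda p: (1 + p) / 2,        # high dependence
        "CD": lambda p: 1.0,                # complete dependence
    }
    return formulas[level](hep)
```

For a nominal HEP of 0.01, high dependence raises the conditional probability to (1 + 0.01)/2 = 0.505, while zero dependence leaves it at 0.01; the paper's contribution is the weighted-factor model that selects the level (and interpolates a degree of dependence) from expert judgments.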

  7. Development of a task analysis tool to facilitate user interface design

    Science.gov (United States)

    Scholtz, Jean C.

    1992-01-01

    A good user interface is one that facilitates the user in carrying out his task. Such interfaces are difficult and costly to produce. The most important aspect in producing a good interface is the ability to communicate to the software designers what the user's task is. The Task Analysis Tool is a system for cooperative task analysis and specification of the user interface requirements. This tool is intended to serve as a guide to development of initial prototypes for user feedback.

  8. Effects of immersion on visual analysis of volume data.

    Science.gov (United States)

    Laha, Bireswar; Sensharma, Kriti; Schiffbauer, James D; Bowman, Doug A

    2012-04-01

    Volume visualization has been widely used for decades for analyzing datasets ranging from 3D medical images to seismic data to paleontological data. Many have proposed using immersive virtual reality (VR) systems to view volume visualizations, and there is anecdotal evidence of the benefits of VR for this purpose. However, there has been very little empirical research exploring the effects of higher levels of immersion for volume visualization, and it is not known how various components of immersion influence the effectiveness of visualization in VR. We conducted a controlled experiment in which we studied the independent and combined effects of three components of immersion (head tracking, field of regard, and stereoscopic rendering) on the effectiveness of visualization tasks with two x-ray microscopic computed tomography datasets. We report significant benefits of analyzing volume data in an environment involving those components of immersion. We find that the benefits do not necessarily require all three components simultaneously, and that the components have variable influence on different task categories. The results of our study improve our understanding of the effects of immersion on perceived and actual task performance, and provide guidance on the choice of display systems to designers seeking to maximize the effectiveness of volume visualization applications. PMID:22402687

  9. Rectal cancer surgery: volume-outcome analysis.

    LENUS (Irish Health Repository)

    Nugent, Emmeline

    2010-12-01

    There is strong evidence supporting the importance of the volume-outcome relationship with respect to lung and pancreatic cancers. This relationship for rectal cancer surgery however remains unclear. We review the currently available literature to assess the evidence base for volume outcome in relation to rectal cancer surgery.

  10. Task analysis for the single-shell Tank Waste Retrieval Manipulator System

    Energy Technology Data Exchange (ETDEWEB)

    Draper, J.V.

    1993-03-01

    This document describes a task analysis for the Tank Waste Retrieval Manipulator System. A task analysis is a formal method of examining work that must be done by the operators of human-machine systems. The starting point for a task analysis is the mission that a human-machine system must perform, and the ending point is a list of requirements for human actions and the displays and controls that must be provided to support them. The task analysis approach began with a top-down definition of the steps in a tank retrieval campaign: a waste retrieval campaign for one single-shell tank was divided into the largest logical components (mission phases), these were subdivided into secondary components (subfunctions), and the secondary components were further subdivided into tertiary units (tasks). Finally, the tertiary units were divided into potentially observable operator behaviors (task elements). In the next stage of the task analysis, the task elements were evaluated by completing an electronic task analysis form patterned after one developed by the Nuclear Regulatory Commission for task analysis of nuclear power plant control rooms. In the final stage, the task analysis database was used in a bottom-up approach to develop clusters of controls and displays called panel groups and to prioritize these groups for each subfunction. Panel groups are clusters of functionally related controls and displays. Actual control panels will be designed from panel groups, and panel groups will be organized within workstations to promote efficient operations during retrieval campaigns.

  11. The effects of bedrest on crew performance during simulated shuttle reentry. Volume 2: Control task performance

    Science.gov (United States)

    Jex, H. R.; Peters, R. A.; Dimarco, R. J.; Allen, R. W.

    1974-01-01

    A simplified space shuttle reentry simulation performed on the NASA Ames Research Center Centrifuge is described. Anticipating potentially deleterious effects of physiological deconditioning from orbital living (simulated here by 10 days of enforced bedrest) upon a shuttle pilot's ability to manually control his aircraft (should that be necessary in an emergency), a comprehensive battery of measurements was made roughly every 1/2 minute on eight military pilot subjects, over two 20-minute reentry Gz vs. time profiles, one peaking at 2 Gz and the other at 3 Gz. Alternate runs were made without and with g-suits to test the help or interference offered by such protective devices to manual control performance. A very demanding two-axis control task was employed, with a subcritical instability in the pitch axis to force a high attentional demand and a severe loss-of-control penalty. The results show that pilots experienced in high-Gz flying can easily handle the shuttle manual control task during 2 Gz or 3 Gz reentry profiles, provided the degree of physiological deconditioning is no more than that induced by these 10 days of enforced bedrest.

  12. Analysis of operators' diagnosis tasks based on cognitive process

    International Nuclear Information System (INIS)

    Diagnosis tasks in nuclear power plants, characterized by high dynamics and uncertainty, are complex reasoning tasks. Diagnosis errors are the main causes of errors of commission. Firstly, based on mental model theory and perception/action cycle theory, a cognitive model for analyzing operators' diagnosis tasks is proposed. Then, the model is used to investigate a trip event which occurred at the Crystal River nuclear power plant. The application demonstrates typical cognitive biases and mistakes which operators may make when performing diagnosis tasks. These mainly include a strong confirmation tendency, difficulty in producing complete hypothesis sets, group mindset, non-systematic errors in hypothesis testing, etc. (authors)

  13. SLSF loop handling system. Volume I. Structural analysis

    International Nuclear Information System (INIS)

    The SLSF loop handling system was analyzed for deadweight and postulated dynamic loading conditions, identified in Chapters II and III in Volume I of this report, using a linear elastic static equivalent method of stress analysis. Stress analysis of the loop handling machine is presented in Volume I of this report. Chapter VII in Volume I of this report is a contribution by EG and G Co., who performed the work under ANL supervision.

  14. Life science payload definition and integration study, task C and D. Volume 3: Appendices

    Science.gov (United States)

    1973-01-01

    Research equipment requirements were based on the Mini-7 and Mini-30 laboratory concepts defined in Tasks A and B of the initial LSPD contract. Modified versions of these laboratories and the research equipment within them were to be used in three missions of Shuttle/Sortie Module. These were designated (1) the shared 7-day laboratory (a mission with the life sciences laboratory sharing the sortie module with another scientific laboratory), (2) the dedicated 7-day laboratory (full use of the sortie module), and (3) the dedicated 30-day laboratory (full sortie module use with a 30-day mission duration). In defining the research equipment requirements of these laboratories, the equipment was grouped according to its function, and equipment unit data packages were prepared.

  15. Response Time Analysis in Cognitive Tasks with Increasing Difficulty

    Science.gov (United States)

    Dodonov, Yury S.; Dodonova, Yulia A.

    2012-01-01

    In the present study, speeded tasks with differing assumed difficulties of the trials are regarded as a special class of simple cognitive tasks. Exploratory latent growth modeling with data-driven shape of a growth curve and nonlinear structured latent curve modeling with predetermined monotonically increasing functions were used to analyze…

  16. Revisiting the Task/Achievement Analysis of Teaching in Neo-Liberal Times

    Science.gov (United States)

    Marshall, James D.

    2009-01-01

    In 1975 I published an article on Gilbert Ryle's task/achievement analysis of teaching (Marshall, 1975), arguing that teaching was in Ryle's sense of the distinction a task verb. Philosophers of education were appealing to a distinction between tasks and achievements in their discussions of teaching, but they were often also appealing to Ryle's…

  17. Task analysis method for procedural training curriculum development.

    Science.gov (United States)

    Riggle, Jakeb D; Wadman, Michael C; McCrory, Bernadette; Lowndes, Bethany R; Heald, Elizabeth A; Carstens, Patricia K; Hallbeck, M Susan

    2014-06-01

    A central venous catheter (CVC) is an important medical tool used in critical care and emergent situations. Integral to proper care in many circumstances, insertion of a CVC introduces the risk of central line-associated blood stream infections and mechanical adverse events; proper training is important for safe CVC insertion. Cognitive task analysis (CTA) methods have been successfully implemented in the medical field to improve the training of postgraduate medical trainees, but can be very time-consuming to complete and require a significant time commitment from many subject matter experts (SMEs). Many medical procedures such as CVC insertion are linear processes with well-documented procedural steps. These linear procedures may not require a traditional CTA to gather the information necessary to create a training curriculum. Accordingly, a novel, streamlined CTA method designed primarily to collect cognitive cues for linear procedures was developed to be used by medical professionals with minimal CTA training. This new CTA methodology required fewer trained personnel, fewer interview sessions, and less time commitment from SMEs than a traditional CTA. Based on this study, a streamlined CTA methodology can be used to efficiently gather cognitive information on linear medical procedures for the creation of resident training curricula and procedural skills assessments. PMID:24366759

  18. Simplex volume analysis for finding endmembers in hyperspectral imagery

    Science.gov (United States)

    Li, Hsiao-Chi; Song, Meiping; Chang, Chein-I.

    2015-05-01

    Using maximal simplex volume as an optimality criterion for finding endmembers is a common approach and has been widely studied in the literature. Interestingly, very little work has been reported on how simplex volume is calculated. It turns out that the issue of calculating simplex volume is much more complicated and involved than we may think. This paper investigates the issue from two different aspects: geometric structure and eigen-analysis. The geometric approach derives the volume from the simplex structure, multiplying its base by its height. Eigen-analysis, on the other hand, takes advantage of the Cayley-Menger determinant to calculate the simplex volume. The major issue with this approach arises when the matrix whose determinant is desired is rank-deficient. To deal with this problem two methods are generally considered. One is to perform data dimensionality reduction to make the matrix full rank. The drawback of this method is that the original volume is shrunk, so the volume found for a dimensionality-reduced simplex is not the true volume of the original simplex. The other is to use singular value decomposition (SVD) to find singular values for calculating the simplex volume. The dilemma of this method is its numerical instability. This paper explores all three of these methods for simplex volume calculation. Experimental results show that the geometric structure-based method yields the most reliable simplex volume.
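The two determinant-based calculations discussed above can be sketched directly. The example below is a minimal illustration, not the paper's implementation: it computes the volume of an n-simplex both from the Cayley-Menger determinant of pairwise squared distances and from the singular values of the edge matrix (volume = product of singular values divided by n!), and checks them against the known volumes of a right triangle and a unit right tetrahedron.

```python
import numpy as np
from math import factorial

def simplex_volume_cm(V):
    """Volume via the Cayley-Menger determinant of pairwise squared distances."""
    n = V.shape[0] - 1
    D2 = np.square(np.linalg.norm(V[:, None, :] - V[None, :, :], axis=-1))
    B = np.ones((n + 2, n + 2))   # bordered matrix: row/column of ones, zero corner
    B[0, 0] = 0.0
    B[1:, 1:] = D2
    coef = (-1) ** (n + 1) / (2 ** n * factorial(n) ** 2)
    return float(np.sqrt(coef * np.linalg.det(B)))

def simplex_volume_svd(V):
    """Volume as the product of the edge matrix's singular values over n!."""
    E = V[1:] - V[0]              # n edge vectors from the first vertex
    return float(np.prod(np.linalg.svd(E, compute_uv=False)) / factorial(E.shape[0]))

tri = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])      # area 1/2
tet = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0],
                [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]])         # volume 1/6
```

The SVD route sidesteps the rank-deficiency problem (zero singular values simply flag a degenerate simplex), which is the trade-off against its numerical instability that the paper examines.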

  19. A survey on the task analysis methods and techniques for nuclear power plant operators

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Yong Heui; Chun, Se Woo; Suh, Sang Moon; Lee, Jung Woon [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1994-04-01

    We have surveyed techniques and methods of task analysis, from very traditional ones to recently developed ones that are being applied to various industrial fields. We compare them with one another and analyse their fundamental characteristics and methodological specifics in order to find one suitable for application to nuclear power plant operators' tasks. Generally, the fundamental process of task analysis is well understood, but its application in practice is not so simple, owing to the wide and varying range of applications in each specific domain. Operators' tasks in NPPs are supposed to be performed strictly according to operational procedures written in text, and operators are well trained, so a method of task analysis for operators' tasks in NPPs can be established with its own unique characteristics, based on the operational procedures. 8 figs., 10 tabs., 18 refs. (Author).

  20. A survey on the task analysis methods and techniques for nuclear power plant operators

    International Nuclear Information System (INIS)

    We have surveyed techniques and methods of task analysis, from very traditional ones to recently developed ones that are being applied to various industrial fields. We compare them with one another and analyse their fundamental characteristics and methodological specifics in order to find one suitable for application to nuclear power plant operators' tasks. Generally, the fundamental process of task analysis is well understood, but its application in practice is not so simple, owing to the wide and varying range of applications in each specific domain. Operators' tasks in NPPs are supposed to be performed strictly according to operational procedures written in text, and operators are well trained, so a method of task analysis for operators' tasks in NPPs can be established with its own unique characteristics, based on the operational procedures. 8 figs., 10 tabs., 18 refs. (Author)

  1. Development of real-time multitask OSS based on cognitive task analysis

    International Nuclear Information System (INIS)

    A Real-time Multi-task Operator Support System (RMOSS) has been developed to support the operator's decision-making process in the control room of an NPP. VxWorks, an embedded real-time operating system, was used for RMOSS software development. According to the SRK modeling analysis of the operator's decision-making process, RMOSS is divided into five system subtasks: the Data Collection and Validation Task (DCVT), the Operation Monitor Task (OMT), the Fault Diagnostic Task (FDT), the Operation Guideline Task (OGT), and the Human Machine Interface Task (HMIT). The task test of RMOSS was carried out in a real-time full-scope simulator. The results showed that each task of RMOSS is capable of accomplishing its functions. (authors)

  2. Structured information analysis for human reliability analysis of emergency tasks in nuclear power plants

    International Nuclear Information System (INIS)

    Because they are supported by only scarce empirical data, most of the performance influencing factors in human reliability analysis (HRA) have to be assessed on the basis of the analyst's knowledge of human performance in the given tasks and their context. The outcome of an HRA can therefore only be warranted by a proper application of that knowledge, based on sufficient information about the tasks and situations. However, most HRA methodologies, including newly developed ones, focus on providing cognitive models, error mechanisms, error types and analysis methods, while leaving information collection mostly in the hands of the analyst. This paper suggests structured information analysis (SIA), which helps HRA analysts collect and structure such information on tasks and contexts. The SIA consists of three parts: scenario analysis, goal-means analysis, and cognitive function analysis. An expert evaluation showed that this three-part information analysis allowed more expressiveness, and hence more confidence in the error prediction, than the ASEP HRA.

  3. Life sciences payload definition and integration study, task C and D. Volume 1: Management summary

    Science.gov (United States)

    1973-01-01

    The findings of a study to define the payloads required for conducting life science experiments in space are presented. The primary objectives of the study were to: (1) identify the research functions to be performed aboard life sciences spacecraft laboratories and the necessary equipment, (2) develop conceptual designs of potential payloads, (3) integrate selected laboratory designs with space shuttle configurations, and (4) establish cost analyses for preliminary program planning.

  4. A comparative critical analysis of modern task-parallel runtimes.

    Energy Technology Data Exchange (ETDEWEB)

    Wheeler, Kyle Bruce; Stark, Dylan; Murphy, Richard C. [Micron Technology, Inc., Boise, ID

    2012-12-01

    The rise in node-level parallelism has increased interest in task-based parallel runtimes for a wide array of application areas. Applications exhibit a wide variety of task-spawning patterns, which frequently change during execution depending on the algorithm or solver kernel in use. Task-scheduling and load-balancing regimes, however, are often highly optimized for specific patterns. This paper uses four basic task-spawning patterns to quantify the impact of specific scheduling-policy decisions on execution time. We compare the behavior of six publicly available tasking runtimes: Intel Cilk, Intel Threading Building Blocks (TBB), Intel OpenMP, GCC OpenMP, Qthreads, and High Performance ParalleX (HPX). With the exception of Qthreads, the runtimes prove to have schedulers that are highly sensitive to application structure. No runtime provides the best performance in all cases, and those that do provide the best performance in some cases, unfortunately, perform extremely poorly when the application structure does not match the scheduler's assumptions.
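    The scheduler sensitivity described here stems from the shape of the task graph that each spawning pattern produces. As a hedged illustration only (not the paper's benchmark suite, which targeted the six runtimes named above), two contrasting patterns can be sketched with Python's standard thread pool: a wide flat spawn versus a serial dependency chain.

```python
# Illustrative sketch of two task-spawning patterns; the paper's
# benchmarks used Cilk, TBB, OpenMP, Qthreads, and HPX instead.
from concurrent.futures import ThreadPoolExecutor

def flat_pattern(pool, n):
    # Wide, shallow graph: one parent spawns all n tasks up front,
    # giving the scheduler maximal parallel slack.
    futures = [pool.submit(lambda i=i: i * i) for i in range(n)]
    return sum(f.result() for f in futures)

def chain_pattern(pool, n):
    # Serial dependency chain: each task is spawned only after the
    # previous one completes, so no parallelism is available at all.
    total = 0
    for i in range(n):
        total += pool.submit(lambda i=i: i * i).result()
    return total

if __name__ == "__main__":
    with ThreadPoolExecutor(max_workers=4) as pool:
        print(flat_pattern(pool, 10), chain_pattern(pool, 10))  # 285 285
```

    Both compute the same sum of squares; only the structure presented to the scheduler differs, which is precisely the variable the paper's four patterns isolate.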

  5. Economic Analysis. Volume V. Course Segments 65-79.

    Science.gov (United States)

    Sterling Inst., Washington, DC. Educational Technology Center.

    The fifth volume of the multimedia, individualized course in economic analysis produced for the United States Naval Academy covers segments 65-79 of the course. Included in the volume are discussions of monopoly markets, monopolistic competition, oligopoly markets, and the theory of factor demand and supply. Other segments of the course, the…

  6. Heliostat manufacturing cost analysis. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Drumheller, K; Schulte, S C; Dilbeck, R A; Long, L W

    1979-10-01

    This study has two primary objectives. The first is to provide a detailed cost evaluation of the second generation of DOE heliostats, from which repowering heliostat designs are likely to be derived. The second is to provide an analytical foundation for the evaluation of future heliostat designs. The approach taken was to produce a cost estimate for production of the McDonnell Douglas prototype design by generating estimates of the materials, labor, overhead, and facilities costs for two production scenarios: 25,000 heliostats per year and 250,000 heliostats per year. The primary conclusion of this study is that the second generation of heliostat designs should cost approximately $100/m^2 at volumes of 25,000 units/year. This price falls to approximately $80/m^2 at volumes of 250,000 units/year. A second conclusion is that cost reduction begins at relatively low production volumes and that many production benefits can be obtained at rates of 5,000 to 15,000 units/year. A third conclusion is that the SAMICS model and the SAMIS III program can be useful tools in heliostat manufacturing, costing, and economic studies.

  7. Task Descriptions in Diagnostic Radiology. Research Report No. 7. Volume 3, Machine-Related, Patient Care and Administrative Tasks: What Radiologists, Technologists, Nurses and Physicists Do to Run Things and Look After Patients and Equipment.

    Science.gov (United States)

    Gilpatrick, Eleanor

    The third of four volumes in Research Report No. 7 of the Health Services Mobility Study (HSMS), this book contains 149 diagnostic radiologist task descriptions that cover activities in the area of nursing (patient care), film processing, quality assurance, radiation protection, machine maintenance, housekeeping, and administration at the…

  8. Designing Preclinical Instruction for Psychomotor Skills (II)--Instructional Engineering: Task Analysis.

    Science.gov (United States)

    Knight, G. William; And Others

    1994-01-01

    The first step in engineering the instruction of dental psychomotor skills, task analysis, is explained. A chart details the procedural, cognitive, desired-criteria, and desired-performance analysis of a single task, occlusal preparation for amalgam restoration with carious lesion. (MSE)

  9. Steam Generator Group Project: Task 9 final report, Nondestructive evaluation round robin: Volume 1, Description and summary data

    International Nuclear Information System (INIS)

    The Steam Generator Group Project (SGGP) is using the retired-from-service Surry 2A pressurized water reactor steam generator as a test bed to investigate the reliability and effectiveness of inservice nondestructive eddy current inspection equipment and procedures. The information developed will provide the technical basis for updating the Regulatory Guides governing inservice inspection and tube plugging criteria of steam generators. This report describes Task 9 of the multi-task project. The objective of Task 9 was to plan, perform, and analyze the results of four round robin nondestructive examinations on a subset of tubes from the Surry generator. A description of the objectives, conduct, and analysis of results for each round robin is presented. Validation of the inspection results will be provided by removal of specimens from the generator for destructive and out-of-generator characterization (in progress). Preliminary comparisons of the inspection results, in terms of agreement among teams on defect detection and sizing, are given for each round robin. The majority of indications were reported at the hot leg top of the tube sheet. The best results on detection agreement ranged between 70 and 90 percent, but reported defect sizes varied significantly between teams for all round robins. The variability in detection and sizing appears to be due to analyst interpretation of the complex eddy current signals rather than to differences in inspection equipment. 5 refs., 20 figs., 33 tabs

  10. Nuclear power plant control room task analysis. Pilot study for pressurized water reactors

    International Nuclear Information System (INIS)

    The purposes of this nuclear plant task analysis pilot study were: to demonstrate the use of task analysis techniques on selected abnormal or emergency operation events in a nuclear power plant; to evaluate the use of simulator data obtained from an automated Performance Measurement System (PMS) to supplement and validate data obtained by traditional task analysis methods; and to demonstrate sample applications of task analysis data to questions pertinent to nuclear power plant operational safety: control room layout, staffing and training requirements, operating procedures, interpersonal communications, and job performance aids. Five data sources were investigated to provide information for a task analysis: (1) written operating procedures (event-based); (2) interviews with subject matter experts (the control room operators); (3) videotapes of the control room operators (senior reactor operators and reactor operators) while responding to each event in a simulator; (4) walk-/talk-throughs conducted by control room operators for each event; and (5) simulator data from the PMS

  11. Corporate Data Network (CDN) data requirements task. Enterprise Model. Volume 1

    International Nuclear Information System (INIS)

    The NRC has initiated a multi-year program to centralize its information processing in a Corporate Data Network (CDN). The new information processing environment will include shared databases, telecommunications, office automation tools, and state-of-the-art software. Touche Ross and Company was contracted to perform a general data requirements analysis for shared databases and to develop a preliminary plan for implementation of the CDN concept. The Enterprise Model (Vol. 1) provided the NRC with agency-wide information requirements in the form of data entities and organizational demand patterns as the basis for clustering the entities into logical groups. The Data Dictionary (Vol. 2) provided the NRC with definitions and example attributes and properties for each entity. The Data Model (Vol. 3) defined logical databases and entity relationships within and between databases. The Preliminary Strategic Data Plan (Vol. 4) prioritized the development of databases and included a workplan and approach for implementation of the shared database component of the Corporate Data Network

  12. New Uses for Sensitivity Analysis: How Different Movement Tasks Effect Limb Model Parameter Sensitivity

    Science.gov (United States)

    Winters, J. M.; Stark, L.

    1984-01-01

    Original results for a newly developed eighth-order nonlinear limb antagonistic muscle model of elbow flexion and extension are presented. A wide variety of sensitivity analysis techniques are used, and a systematic protocol is established that shows how the different methods can be used efficiently to complement one another for maximum insight into model sensitivity. It is explicitly shown that the sensitivity of output behaviors to model parameters is a function of the controller input sequence, i.e., of the movement task. When the task is changed (for instance, from an input sequence that produces the usual fast movement task to a slower movement that may also involve external loading, etc.), the set of parameters with high sensitivity will in general also change. Such task-specific use of sensitivity analysis techniques identifies the set of parameters most important for a given task, and even suggests task-specific model reduction possibilities.
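    The task dependence of parameter sensitivity can be sketched numerically. The toy model below is purely an assumption for illustration (a first-order decay, not the paper's eighth-order muscle model); normalized sensitivities are estimated by central finite differences for a "fast" and a "slow" task.

```python
import math

def model(a, b, t):
    # Hypothetical stand-in output y(t) = a * exp(-b * t).
    return a * math.exp(-b * t)

def normalized_sensitivity(name, params, t, h=1e-6):
    # S = (p / y) * dy/dp, estimated by central finite differences.
    y = model(**params, t=t)
    hi = dict(params); hi[name] = params[name] * (1 + h)
    lo = dict(params); lo[name] = params[name] * (1 - h)
    dydp = (model(**hi, t=t) - model(**lo, t=t)) / (2 * h * params[name])
    return params[name] * dydp / y

params = {"a": 2.0, "b": 0.5}
fast = {n: normalized_sensitivity(n, params, t=0.1) for n in params}
slow = {n: normalized_sensitivity(n, params, t=10.0) for n in params}
# |S_b| = b*t is tiny for the fast task but dominant for the slow one,
# so the high-sensitivity parameter set changes with the task.
print(fast, slow)
```

    Even in this two-parameter toy, the sensitivity ranking flips between tasks, which is the qualitative effect the paper demonstrates for its full model.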

  13. Screening Analysis : Volume 1, Description and Conclusions.

    Energy Technology Data Exchange (ETDEWEB)

    Bonneville Power Administration; Corps of Engineers; Bureau of Reclamation

    1992-08-01

    The SOR consists of three analytical phases leading to a Draft EIS. The first phase, Pilot Analysis, was performed to test the decision analysis methodology used in the SOR and is described later in this chapter. The second phase, Screening Analysis, examines all possible operating alternatives using a simplified analytical approach; it is described in detail in this and the next chapter, and this document also presents the results of screening. The final phase, Full-Scale Analysis, will be documented in the Draft EIS and is intended to comprehensively evaluate the few best alternatives arising from the screening analysis. The purpose of screening is to analyze a wide variety of differing ways of operating the Columbia River system to test the reaction of the system to change. The many alternatives considered reflect the range of needs and requirements of the various river users and interests in the Columbia River Basin. While some of the alternatives might be viewed as extreme, the information gained from the analysis is useful in highlighting issues and conflicts in meeting operating objectives. Screening is also intended to develop a broad technical basis for evaluation, including regional experts, and to begin developing an evaluation capability for each river use that will support full-scale analysis. Finally, screening provides a logical method for examining all possible options and reaching a decision on a few alternatives worthy of full-scale analysis. An organizational structure was developed and staffed to manage and execute the SOR, specifically during the screening phase and the upcoming full-scale analysis phase. The organization involves ten technical work groups, each representing a particular river use. Several other groups exist to oversee or support the efforts of the work groups.

  14. Information architecture. Volume 2, Part 1: Baseline analysis summary

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-12-01

    The Department of Energy (DOE) Information Architecture, Volume 2, Baseline Analysis, is a collaborative and logical next-step effort in the processes required to produce a Departmentwide information architecture. The baseline analysis serves a diverse audience of program management and technical personnel and provides an organized way to examine the Department's existing or de facto information architecture. A companion document to Volume 1, The Foundations, it furnishes the rationale for establishing a Departmentwide information architecture. This volume, consisting of the Baseline Analysis Summary (part 1), Baseline Analysis (part 2), and Reference Data (part 3), is of interest to readers who wish to understand how the Department's current information architecture technologies are employed. The analysis identifies how and where current technologies support business areas, programs, sites, and corporate systems.

  15. Laser power conversion system analysis, volume 2

    Science.gov (United States)

    Jones, W. S.; Morgan, L. L.; Forsyth, J. B.; Skratt, J. P.

    1979-01-01

    The orbit-to-ground laser power conversion system analysis investigated the feasibility and cost effectiveness of converting solar energy into laser energy in space, and transmitting the laser energy to earth for conversion to electrical energy. The analysis included space laser systems with electrical outputs on the ground ranging from 100 to 10,000 MW. The space laser power system was shown to be feasible and a viable alternate to the microwave solar power satellite. The narrow laser beam provides many options and alternatives not attainable with a microwave beam.

  16. Using Heuristic Task Analysis to Create Web-Based Instructional Design Theory

    Science.gov (United States)

    Fiester, Herbert R.

    2010-01-01

    The first purpose of this study was to identify procedural and heuristic knowledge used when creating web-based instruction. The second purpose of this study was to develop suggestions for improving the Heuristic Task Analysis process, a technique for eliciting, analyzing, and representing expertise in cognitively complex tasks. Three expert…

  17. Commentary: A Closer Look at Task Analysis--Reactions to Wang, Schnipke, and Witt

    Science.gov (United States)

    LaDuca, Tony

    2006-01-01

    In the Spring 2005 issue, Wang, Schnipke, and Witt provided an informative description of the task inventory approach that centered on four functions of job analysis. The discussion included persuasive arguments for making systematic connections between tasks and KSAs. But several other facets of the discussion were much less persuasive. This…

  18. Job and Task Analysis project at Brookhaven National Laboratory's high flux beam reactor

    International Nuclear Information System (INIS)

    The presenter discussed the Job and Task Analysis (JTA) project conducted at Brookhaven National Laboratory's High Flux Beam Reactor (HFBR). The project's goal was to provide JTA guidelines for use by DOE contractors and then, using those guidelines, to conduct a JTA for the reactor operator and supervisor positions at the HFBR. Details of the job analysis and job description preparation, as well as of the task selection and task analysis, were given. Post-JTA improvements to the HFBR training programs were covered. The presentation concluded with a listing of the costs and impacts of the project

  19. Task analysis of nuclear-power-plant control-room crews

    International Nuclear Information System (INIS)

    A task analysis of nuclear-power-plant control-room crews was performed by General Physics Corporation and BioTechnology, Inc., for the Office of Nuclear Regulatory Research. The task analysis methodology used in the project is discussed and compared to traditional task-analysis and job-analysis methods. The objective of the project was to conduct a crew task analysis that would provide data for evaluating six areas: (1) human engineering design of control rooms and retrofitting of current control rooms, (2) the numbers and types of control room operators needed with requisite skills and knowledge, (3) operator qualification and training requirements, (4) normal, off-normal, and emergency operating procedures, (5) job performance aids, and (6) communications. The data collection approach focused on a generic structural framework for assembling the multitude of task data that were observed. The results of the data-collection effort were compiled in a computerized task database; the results include a description of the computerized task-analysis data format

  20. Life sciences payload definition and integration study, task C and D. Volume 2: Payload definition, integration, and planning studies

    Science.gov (United States)

    1973-01-01

    The Life Sciences Payload Definition and Integration Study was composed of four major tasks. Tasks A and B, the laboratory definition phase, were the subject of a prior NASA study. The laboratory definition phase included the establishment of research functions, equipment definitions, and conceptual baseline laboratory designs. These baseline laboratories were designated as Maxi-Nom, Mini-30, and Mini-7. The outputs of Tasks A and B were used by the NASA Life Sciences Payload Integration Team to establish guidelines for Tasks C and D, the laboratory integration phase of the study. A brief review of Tasks A and B is presented to provide background continuity. The Tasks C and D effort is the subject of this report. The Task C effort stressed the integration of the NASA-selected laboratory designs with the shuttle sortie module. The Task D effort updated and developed costs that could be used by NASA for preliminary program planning.

  1. Fluid Vessel Quantity Using Non-invasive PZT Technology Flight Volume Measurements Under Zero G Analysis

    Science.gov (United States)

    Garofalo, Anthony A

    2013-01-01

    The purpose of the project was to analyze Systems Engineering Educational Discovery (SEED) program data from 2011 and 2012: Fluid Vessel Quantity using Non-Invasive PZT Technology flight volume measurements under zero-g conditions (parabolic plane flight data). It also covered experimental planning and lab work for future sub-orbital experiments using the NASA PZT technology for fluid volume measurement. Along with the flight-data analysis, I performed a variety of other tasks: I provided the lab with detailed technical drawings, experimented with 3D printers, made changes to the liquid nitrogen skid schematics, and learned how to weld. I also programmed microcontrollers to interact with various sensors and helped with other work around the lab.

  3. Performance Analysis of Software to Hardware Task Migration in Codesign

    CERN Document Server

    Sebai, Dorsaf; Bennour, Imed

    2010-01-01

    The complexity of multimedia applications, in terms of computational intensity and heterogeneity of the data treated, has led designers to deploy them on multiprocessor systems-on-chip. The complexity of these systems on the one hand, and consumer expectations on the other, complicate the designers' job of conceiving and supplying robust, successful systems within the shortest deadlines. They have to explore the different solutions of the design space and estimate their performance in order to identify the solution that respects their design constraints. In this context, we propose a model of one possible design-space solution: software to hardware task migration. This model exploits synchronous dataflow graphs to take the different migration impacts into account and estimate their performance in terms of throughput.
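    The throughput estimate involved can be sketched under strong simplifying assumptions (a fully pipelined homogeneous chain of actors with hypothetical execution times, rather than the general synchronous dataflow graphs the paper uses): steady-state throughput is bounded by the slowest actor, so migrating that actor to hardware raises it.

```python
def pipeline_throughput(stage_times):
    # In a fully pipelined chain, the iteration interval equals the
    # slowest stage, so throughput is its reciprocal.
    return 1.0 / max(stage_times)

software_only = [4.0, 9.0, 3.0]    # hypothetical actor times (ms)
after_migration = [4.0, 2.0, 3.0]  # middle actor migrated to hardware
print(pipeline_throughput(software_only))    # bound by the 9 ms actor
print(pipeline_throughput(after_migration))  # now bound by the 4 ms actor
```

    A full SDF analysis would additionally account for token rates and buffer sizes; this sketch only conveys why migration changes the throughput bound.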

  4. Interpreting Stroop interference: an analysis of differences between task versions.

    Science.gov (United States)

    Salo, R; Henik, A; Robertson, L C

    2001-10-01

    The present study investigated methodological differences between the clinical version of the Stroop Color and Word Test and the computerized single-trial version. Three experiments show that different presentations of the Stroop task can produce different levels of interference. The 1st experiment examined the effect of blocking; the 2nd experiment examined different control conditions. Greater interference in the blocked clinical version appears to result from lower response times (RTs) in the neutral condition, not from greater RTs in the incongruent condition. Experiment 3 examined the impact of shifting attention across locations while responding to Stroop stimuli. The present set of findings sheds light on the inconsistency in the clinical literature and demonstrates that the method and selection of neutral stimuli (that provide the baseline by which interference is measured) are critical because they clearly can change performance. PMID:11761035
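    Because interference is scored against the neutral baseline, the finding can be illustrated with a small sketch (the mean RTs below are hypothetical, not the study's data):

```python
def stroop_interference(rt_incongruent_ms, rt_neutral_ms):
    # Interference = incongruent RT minus the neutral-baseline RT.
    return rt_incongruent_ms - rt_neutral_ms

rt_incongruent = 820        # same incongruent RT in both versions
rt_neutral_blocked = 640    # faster baseline in the blocked clinical version
rt_neutral_single = 710     # single-trial computerized version
print(stroop_interference(rt_incongruent, rt_neutral_blocked))  # 180
print(stroop_interference(rt_incongruent, rt_neutral_single))   # 110
```

    The faster neutral baseline alone yields the larger interference score, even with identical incongruent RTs, which is the mechanism the abstract attributes to the blocked clinical version.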

  5. Using of fuzzy and imitation models and cluster analysis for decision of marketing tasks

    OpenAIRE

    Borisova, Tetyana Myhaylivna; Bryndzya, Zinoviy Fedorovych

    2011-01-01

    The relevance of several economic-mathematical methods to the solution of marketing tasks is considered in the article. Examples of the use of fuzzy evaluation, cluster analysis, and imitation modelling in marketing are presented.

  6. Dashboard Task Monitor for Managing ATLAS User Analysis on the Grid

    Science.gov (United States)

    Sargsyan, L.; Andreeva, J.; Jha, M.; Karavakis, E.; Kokoszkiewicz, L.; Saiz, P.; Schovancova, J.; Tuckett, D.; Atlas Collaboration

    2014-06-01

    The organization of the distributed user analysis on the Worldwide LHC Computing Grid (WLCG) infrastructure is one of the most challenging tasks among the computing activities at the Large Hadron Collider. The Experiment Dashboard offers a solution that not only monitors but also manages (kill, resubmit) user tasks and jobs via a web interface. The ATLAS Dashboard Task Monitor provides analysis users with a tool that is independent of the operating system and Grid environment. This contribution describes the functionality of the application and its implementation details, in particular authentication, authorization and audit of the management operations.

  7. Dashboard Task Monitor for managing ATLAS user analysis on the Grid

    CERN Document Server

    Sargsyan, L; The ATLAS collaboration; Jha, M; Karavakis, E; Kokoszkiewicz, L; Saiz, P; Schovancova, J; Tuckett, D

    2013-01-01

    The organization of the distributed user analysis on the Worldwide LHC Computing Grid (WLCG) infrastructure is one of the most challenging tasks among the computing activities at the Large Hadron Collider. The Experiment Dashboard offers a solution that not only monitors but also manages (kill, resubmit) user tasks and jobs via a web interface. The ATLAS Dashboard Task Monitor provides analysis users with a tool that is independent of the operating system and Grid environment. This contribution describes the functionality of the application and its implementation details, in particular authentication, authorization and audit of the management operations.

  8. Dashboard task monitor for managing ATLAS user analysis on the grid

    International Nuclear Information System (INIS)

    The organization of the distributed user analysis on the Worldwide LHC Computing Grid (WLCG) infrastructure is one of the most challenging tasks among the computing activities at the Large Hadron Collider. The Experiment Dashboard offers a solution that not only monitors but also manages (kill, resubmit) user tasks and jobs via a web interface. The ATLAS Dashboard Task Monitor provides analysis users with a tool that is independent of the operating system and Grid environment. This contribution describes the functionality of the application and its implementation details, in particular authentication, authorization and audit of the management operations.

  9. Thermodynamic analysis of a Stirling engine including regenerator dead volume

    Energy Technology Data Exchange (ETDEWEB)

    Puech, Pascal; Tishkova, Victoria [Universite de Toulouse, UPS, CNRS, CEMES, 29 rue Jeanne Marvig, F-31055 Toulouse (France)

    2011-02-15

    This paper provides a theoretical investigation of the thermodynamics of a Stirling engine with linear and sinusoidal variations of the volume. The regenerator in a Stirling engine is an internal heat exchanger that allows high efficiency to be reached. We used an isothermal model to analyse the net work and the heat stored in the regenerator during a complete cycle. We show that the engine efficiency with perfect regeneration does not depend on the regenerator dead volume, but that this dead volume strongly amplifies the effect of imperfect regeneration. An analytical expression estimating the improvement due to the regenerator is proposed, including the combined effects of dead volume and imperfect regeneration. This could be used at the very preliminary stage of the engine design process. (author)
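    The role of regenerator effectiveness can be sketched with standard textbook isothermal-cycle relations (a simplified illustration, not the paper's dead-volume-corrected expression): net work W = R(Th - Tc) ln r, heat input Qh = R Th ln r plus the regenerator heat (1 - eps) cv (Th - Tc) that imperfect regeneration fails to return.

```python
import math

R = 8.314     # J/(mol K), ideal gas constant
CV = 1.5 * R  # assumed monatomic working gas

def stirling_efficiency(t_hot, t_cold, volume_ratio, effectiveness):
    # Isothermal-model efficiency with regenerator effectiveness
    # (effectiveness = 1.0 recovers the Carnot limit).
    w_net = R * (t_hot - t_cold) * math.log(volume_ratio)
    q_hot = R * t_hot * math.log(volume_ratio)
    q_regen_loss = (1.0 - effectiveness) * CV * (t_hot - t_cold)
    return w_net / (q_hot + q_regen_loss)

print(stirling_efficiency(900.0, 300.0, 2.0, 1.0))  # equals Carnot, ~0.667
print(stirling_efficiency(900.0, 300.0, 2.0, 0.8))  # drops below Carnot
```

    The paper's contribution is to show how dead volume amplifies exactly this imperfect-regeneration penalty, which the textbook expression above does not capture.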

  10. Analysis for Secondary Task in Advanced Main Control Room Using Soft Controls

    International Nuclear Information System (INIS)

    The purpose of this study is to analyze operator tasks using soft controls, based on simulation data from an advanced MCR. The primary and secondary tasks in eighteen simulation runs were analyzed. The results showed that more secondary tasks than primary tasks were required to perform the scenarios. Among the secondary tasks, the 'switch screen' task made up the largest portion, indicating that operator workload increases with the number of 'switch screen' operations. To reduce operator workload, designers placed screen link buttons in the Cps. An analysis of the secondary tasks performed using the screen link buttons showed that they help reduce the number of secondary tasks and the errors associated with 'switch screen' operations. Therefore, although the additional secondary tasks introduced by adopting soft controls in advanced MCRs can increase operator workload, supporting designs such as screen link buttons help to reduce workload and errors

  11. Task analysis of nuclear-power-plant control-room crews: project approach methodology

    International Nuclear Information System (INIS)

    A task analysis of nuclear-power-plant control-room crews was performed by General Physics Corporation and BioTechnology, Inc., for the Office of Nuclear Regulatory Research. The task-analysis methodology used in the project is discussed and compared to traditional task-analysis and job-analysis methods. The objective of the project was to conduct a crew task analysis that would provide data for evaluating six areas: (1) human-engineering design of control rooms and retrofitting of current control rooms; (2) the numbers and types of control-room operators needed with requisite skills and knowledge; (3) operator qualification and training requirements; (4) normal, off-normal, and emergency operating procedures; (5) job-performance aids; and (6) communications. The data-collection approach focused on a generic structural framework for assembling the multitude of task data that were observed. The results of the data-collection effort were compiled in a computerized task database. Six demonstrations for suitability analysis were subsequently conducted in each of the above areas and are described in this report

  12. Considerations for Task Analysis Methods and Rapid E-Learning Development Techniques

    Directory of Open Access Journals (Sweden)

    Dr. Ismail Ipek

    2014-02-01

    The purpose of this paper is to provide basic dimensions for rapid training development of e-learning courses in education and business. It starts by defining task analysis, how to select tasks for analysis, and task analysis methods for instructional design. First, learning and instructional technologies as visions of the future are discussed. Second, the importance of task analysis methods in rapid e-learning is considered, together with asynchronous and synchronous e-learning development. Finally, rapid instructional design concepts and e-learning design strategies are defined and clarified with examples; that is, all the steps for effective task analysis and rapid training development based on learning and instructional design approaches are discussed, including m-learning and other delivery systems. As a result, the concept of task analysis, rapid e-learning development strategies, and the essentials of online course design are discussed, alongside learner interface design features for learners and designers.

  13. Initiating an ergonomic analysis. A process for jobs with highly variable tasks.

    Science.gov (United States)

    Conrad, K M; Lavender, S A; Reichelt, P A; Meyer, F T

    2000-09-01

    Occupational health nurses play a vital role in addressing ergonomic problems in the workplace. Describing and documenting exposure to ergonomic risk factors is a relatively straightforward process in jobs in which the work is repetitive. In other types of work, the analysis becomes much more challenging because tasks may be repeated infrequently, or at irregular time intervals, or under different environmental and temporal conditions, thereby making it difficult to observe a "representative" sample of the work performed. This article describes a process used to identify highly variable job tasks for ergonomic analyses. The identification of tasks for ergonomic analysis was a two step process involving interviews and a survey of firefighters and paramedics from a consortium of 14 suburban fire departments. The interviews were used to generate a list of frequently performed, physically strenuous job tasks and to capture clear descriptions of those tasks and associated roles. The goals of the survey were to confirm the interview findings across the entire target population and to quantify the frequency and degree of strenuousness of each task. In turn, the quantitative results from the survey were used to prioritize job tasks for simulation. Although this process was used to study firefighters and paramedics, the approach is likely to be suitable for many other types of occupations in which the tasks are highly variable in content and irregular in frequency. PMID:11760289

  14. Real-time applications with stochastic task execution times analysis and optimisation

    CERN Document Server

    Manolache, Sorin; Peng, Zebo

    2007-01-01

    Presents three approaches to the analysis of the deadline miss ratio of applications with stochastic task execution times. The first approach is applicable to monoprocessor systems; the second allows a designer-controlled trade-off between analysis accuracy and analysis speed; the third is fast in order to be placed inside optimisation loops.
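The deadline-miss-ratio concept this entry analyses can be illustrated with a toy Monte Carlo sketch. All numbers and the execution-time distribution below are invented for illustration, the book's approaches are analytical, and this toy ignores backlog carried between job instances:

```python
import random

# Monte Carlo estimate of the deadline-miss ratio for a single periodic
# task on one processor, with a stochastic execution time.
random.seed(0)
deadline = 10.0          # relative deadline (assumed)
trials, misses = 100_000, 0
for _ in range(trials):
    # execution time ~ Normal(6, 2), truncated at 0 (assumed distribution)
    c = max(0.0, random.gauss(6.0, 2.0))
    if c > deadline:
        misses += 1
print(misses / trials)   # roughly P(C > 10) for C ~ N(6, 2), about 0.023
```

For real task sets, the analytical methods the book presents replace this simulation with a direct computation over the execution-time distributions.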

  15. Space and Earth Sciences, Computer Systems, and Scientific Data Analysis Support, Volume 1

    Science.gov (United States)

    Estes, Ronald H. (Editor)

    1993-01-01

    This Final Progress Report covers the specific technical activities of Hughes STX Corporation for the last contract triannual period of 1 June through 30 Sep. 1993, in support of assigned task activities at Goddard Space Flight Center (GSFC). It also provides a brief summary of work throughout the contract period of performance on each active task. Technical activity is presented in Volume 1, while financial and level-of-effort data are presented in Volume 2. Technical support was provided to all Divisions and Laboratories of Goddard's Space Sciences and Earth Sciences Directorates. Types of support include: scientific programming, systems programming, computer management, mission planning, scientific investigation, data analysis, data processing, data base creation and maintenance, instrumentation development, and management services. Missions and instruments supported include: ROSAT, Astro-D, BBXRT, XTE, AXAF, GRO, COBE, WIND, UIT, SMM, STIS, HEIDI, DE, URAP, CRRES, Voyagers, ISEE, San Marco, LAGEOS, TOPEX/Poseidon, Pioneer-Venus, Galileo, Cassini, Nimbus-7/TOMS, Meteor-3/TOMS, FIFE, BOREAS, TRMM, AVHRR, and Landsat. Accomplishments include: development of computing programs for mission science and data analysis, supercomputer applications support, computer network support, computational upgrades for data archival and analysis centers, end-to-end management for mission data flow, scientific modeling and results in the fields of space and Earth physics, planning and design of GSFC VO DAAC and VO IMS, fabrication, assembly, and testing of mission instrumentation, and design of mission operations center.

  16. Analysis of Spectral Features of EEG during four different Cognitive Tasks

    Directory of Open Access Journals (Sweden)

    S.BAGYARAJ

    2014-05-01

    Cognition is a group of information-processing activities that involves visual attention, visual awareness, problem solving, and decision making. Finding the cognitive-task-related regional cerebral activations is of great interest among researchers in cognitive neuroscience. In this study, four different cognitive tasks, namely tracking pendulum movement and counting, red-flash counting, sequential subtraction, and spot-the-difference, were performed by 32 subjects, and the EEG signals were acquired using a 24-channel RMS EEG-32 Super Spec machine. The EEG signals were analysed using well-known spectral methods: band powers were calculated in the frequency domain using the Welch method. The task-versus-relax relative band power values and the ratio of theta band power to beta band power were the two variables used to find the regional cerebral activations during the four cognitive tasks. The statistical paired t-test was used to evaluate the significance of the difference between task-related cerebral activation and relaxation, with the significance level set at p < 0.05. During the tracking-pendulum-movement-and-counting task, activations were found in the bilateral prefrontal, frontal, right central, and temporal regions. The red-flash counting task produced activations in the bilateral prefrontal, frontal, right central, right parietal, and right occipital lobes. Bilateral prefrontal regions were activated during the sequential subtraction task, and the spot-the-difference task produced activations in the left and right prefrontal cortex. The activation region common to all four cognitive tasks was thus the left and right prefrontal cortex. The prefrontal electrodes, namely Fp1 and Fp2, can therefore be used as the recording electrodes for detailed cognitive task analysis, where cerebral activations are observed more strongly than in the other cerebral regions.
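The band-power computation the abstract describes (Welch PSD estimate, theta/beta power ratio) can be sketched roughly as follows. The sampling rate, epoch length, and synthetic test signal are assumptions for illustration, not the study's actual recording parameters:

```python
import numpy as np
from scipy.signal import welch

def band_power(freqs, psd, lo, hi):
    """Approximate the power in [lo, hi) Hz by summing PSD bins."""
    mask = (freqs >= lo) & (freqs < hi)
    return psd[mask].sum() * (freqs[1] - freqs[0])

fs = 256  # sampling rate in Hz (assumed)
t = np.arange(0, 10, 1 / fs)
# Synthetic "EEG": a dominant 6 Hz theta rhythm plus broadband noise
rng = np.random.default_rng(0)
x = np.sin(2 * np.pi * 6 * t) + 0.2 * rng.standard_normal(t.size)

freqs, psd = welch(x, fs=fs, nperseg=512)   # Welch PSD estimate
theta = band_power(freqs, psd, 4, 8)        # theta band, 4-8 Hz
beta = band_power(freqs, psd, 13, 30)       # beta band, 13-30 Hz
print(theta / beta)  # theta/beta ratio; large for this theta-dominated signal
```

In the study, such ratios would be computed per channel and compared between task and relaxation epochs with a paired t-test.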

  17. Visual Task Demands and the Auditory Mismatch Negativity: An Empirical Study and a Meta-Analysis

    Science.gov (United States)

    Wiens, Stefan; Szychowska, Malina; Nilsson, Mats E.

    2016-01-01

    Because the auditory system is particularly useful in monitoring the environment, previous research has examined whether task-irrelevant auditory distractors are processed even if subjects focus their attention on visual stimuli. This research suggests that attentionally demanding visual tasks decrease the auditory mismatch negativity (MMN) to simultaneously presented auditory distractors. Because a recent behavioral study found that high visual perceptual load decreased detection sensitivity for simultaneous tones, we used a similar task (n = 28) to determine whether high visual perceptual load would reduce the auditory MMN. Results suggested that perceptual load did not decrease the MMN. At face value, these nonsignificant findings may suggest that the effects of perceptual load on the MMN are smaller than those of other demanding visual tasks. If so, effect sizes should differ systematically between the present and previous studies. We conducted a selective meta-analysis of published studies in which the MMN was derived from the EEG, the visual task demands were continuous and varied between high and low within the same task, and the task-irrelevant tones were presented in a typical oddball paradigm simultaneously with the visual stimuli. Because the meta-analysis suggested that the present (null) findings did not differ systematically from previous findings, the available evidence was combined. Results of this meta-analysis confirmed that demanding visual tasks reduce the MMN to auditory distractors. However, because the meta-analysis was based on small studies and because of the risk of publication bias, future studies should be preregistered with large samples (n > 150) to provide confirmatory evidence for the results of the present meta-analysis. These future studies should also use control conditions that reduce confounding effects of neural adaptation, and use load manipulations that are defined independently of their effects on the MMN. PMID:26741815

  18. Volume conduction effects on wavelet cross-bicoherence analysis

    International Nuclear Information System (INIS)

    Cross-bicoherence analysis is an important nonlinear signal-processing tool used to measure quadratic phase coupling between frequencies of two different time series. It is frequently used in EEG (electroencephalography) analysis for the diagnosis of various cognitive and neurological disorders. Volume conduction from uncorrelated sources in the brain can introduce bias into the estimated values of the cross-bicoherence function. Previous studies have discussed volume conduction effects on the coherence function, which measures the linear relationship between EEG signals in terms of their phase and amplitude; however, to the best of our knowledge, the volume conduction effect on cross-bicoherence analysis, which is quite a different technique, has not been investigated until now. This study is divided into two major parts. The first part investigates the characteristics of VCUS (volume conduction effects due to uncorrelated sources) in EEG cross-bicoherence analysis, using simulated EEG data from uncorrelated sources in the brain. The second part investigates the effects of VCUS on the statistical analysis of the results of EEG-based cross-bicoherence analysis. The study has an important clinical application, because most studies based on EEG cross-bicoherence analysis have avoided the issue of VCUS. The cross-bicoherence analysis was performed on real EEG signals by detecting the change in the MSCB (magnitude-square cross-bicoherence function) between the EEG activities of change-detection and no-change-detection trials. (author)

  19. Operationally efficient propulsion system study (OEPSS) data book. Volume 6; Space Transfer Propulsion Operational Efficiency Study Task of OEPSS

    Science.gov (United States)

    Harmon, Timothy J.

    1992-01-01

    This document is the final report for the Space Transfer Propulsion Operational Efficiency Study Task of the Operationally Efficient Propulsion System Study (OEPSS) conducted by the Rocketdyne Division of Rockwell International. This task studied, evaluated, and identified design concepts and technologies that minimize launch and in-space operations and optimize in-space vehicle propulsion system operability.

  20. IEA Wind Task 24 Integration of Wind and Hydropower Systems; Volume 1: Issues, Impacts, and Economics of Wind and Hydropower Integration

    Energy Technology Data Exchange (ETDEWEB)

    Acker, T.

    2011-12-01

    This report describes the background, concepts, issues and conclusions related to the feasibility of integrating wind and hydropower, as investigated by the members of IEA Wind Task 24. It is the result of a four-year effort involving seven IEA member countries and thirteen participating organizations. The companion report, Volume 2, describes in detail the study methodologies and participant case studies, and exists as a reference for this report.

  1. Analysis of the chemical equilibrium of combustion at constant volume

    Directory of Open Access Journals (Sweden)

    Marius BREBENEL

    2014-04-01

    Determining the composition of a mixture of combustion gases at a given temperature is based on chemical equilibrium, where the equilibrium constants are calculated on the assumption of constant pressure and temperature. In this paper, an analysis of the changes occurring when combustion takes place at constant volume is presented, deriving a specific formula for the equilibrium constant. The simple reaction of carbon combustion in pure oxygen is then considered in both cases (constant pressure and constant volume) as an example of application, observing the changes occurring in the composition of the combustion gases depending on temperature.
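For context, the standard thermodynamic relation connecting the pressure-based and concentration-based equilibrium constants is the following (the paper's own constant-volume formula is not reproduced in the abstract; this is the textbook relation that motivates it):

```latex
% Relation between pressure-based and concentration-based constants:
K_p = K_c \,(R\,T)^{\Delta n}, \qquad
\Delta n = \sum_i \nu_i^{\mathrm{(products)}} - \sum_j \nu_j^{\mathrm{(reactants)}}
```

At constant volume, concentrations rather than partial pressures are the natural variables, so a $K_c$-type constant, with its temperature-dependent correction factor $(RT)^{\Delta n}$, is the relevant quantity when $\Delta n \neq 0$.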

  2. Hydrothermal analysis in engineering using control volume finite element method

    CERN Document Server

    Sheikholeslami, Mohsen

    2015-01-01

    Control volume finite element methods (CVFEM) bridge the gap between finite difference and finite element methods, using the advantages of both methods for simulation of multi-physics problems in complex geometries. In Hydrothermal Analysis in Engineering Using Control Volume Finite Element Method, CVFEM is covered in detail and applied to key areas of thermal engineering. Examples, exercises, and extensive references are used to show the use of the technique to model key engineering problems such as heat transfer in nanofluids (to enhance performance and compactness of energy systems),
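As a minimal illustration of the control-volume idea the book builds on, here is a 1-D steady heat-conduction solver using a plain finite-volume discretisation. CVFEM itself additionally employs finite-element shape functions on unstructured meshes; the grid size, conductivity, and boundary temperatures below are arbitrary:

```python
import numpy as np

# 1-D steady heat conduction, -d/dx(k dT/dx) = 0, on n control volumes.
n = 20                        # number of control volumes
L, k = 1.0, 1.0               # domain length, thermal conductivity (assumed)
T_left, T_right = 100.0, 0.0  # Dirichlet boundary temperatures (assumed)
dx = L / n

A = np.zeros((n, n))
b = np.zeros(n)
for i in range(n):
    if i > 0:                 # diffusive flux across the left face
        A[i, i - 1] -= k / dx
        A[i, i] += k / dx
    else:                     # half-cell distance to the left boundary
        A[i, i] += 2 * k / dx
        b[i] += 2 * k / dx * T_left
    if i < n - 1:             # diffusive flux across the right face
        A[i, i + 1] -= k / dx
        A[i, i] += k / dx
    else:                     # half-cell distance to the right boundary
        A[i, i] += 2 * k / dx
        b[i] += 2 * k / dx * T_right

T = np.linalg.solve(A, b)     # reproduces the exact linear profile here
```

With constant conductivity and no source term, the discrete solution matches the exact linear temperature profile at the cell centres, a standard sanity check for control-volume codes.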

  3. Implementational Aspects of Linear Discriminant Analysis for Classification Tasks

    Czech Academy of Sciences Publication Activity Database

    Duintjer Tebbens, Jurjen; Schlesinger, P.

    Bari : Instituto Nazionale di Alta Matematica, 2006. s. 23-23. [Numerical Linear Algebra in Signals and Systems. International Workshop. 11.09.2006-15.09.2006, Bari] R&D Projects: GA AV ČR 1ET400300415; GA MŠk LC536 Institutional research plan: CEZ:AV0Z10300504 Keywords : linear discriminant analysis * sparsity * small sample size problem * dimension reduction Subject RIV: BA - General Mathematics

  4. Task 11 - systems analysis of environmental management technologies

    Energy Technology Data Exchange (ETDEWEB)

    Musich, M.A.

    1997-06-01

    A review was conducted of three systems analysis (SA) studies performed by Lockheed Idaho Technologies Company (LITCO) on integrated thermal treatment systems (ITTs) and integrated nonthermal treatment systems (INTSs) for the remediation of mixed low-level waste (MLLW) stored throughout the U.S. Department of Energy (DOE) weapons complex. The review was performed by an independent team led by the Energy & Environment Research Center (EERC), including Science Applications International Corporation (SAIC), the Waste Policy Institute (WPI), and Virginia Tech.

  5. Task 11 - systems analysis of environmental management technologies. Topical report

    International Nuclear Information System (INIS)

    A review was conducted of three systems analysis (SA) studies performed by Lockheed Idaho Technologies Company (LITCO) on integrated thermal treatment systems (ITTs) and integrated nonthermal treatment systems (INTSs) for the remediation of mixed low-level waste (MLLW) stored throughout the U.S. Department of Energy (DOE) weapons complex. The review was performed by an independent team led by the Energy & Environment Research Center (EERC), including Science Applications International Corporation (SAIC), the Waste Policy Institute (WPI), and Virginia Tech

  6. Task 7: Endwall treatment inlet flow distortion analysis

    Science.gov (United States)

    Hall, E. J.; Topp, D. A.; Heidegger, N. J.; McNulty, G. S.; Weber, K. F.; Delaney, R. A.

    1996-01-01

    The overall objective of this study was to develop a 3-D numerical analysis for compressor casing treatment flowfields and to perform a series of detailed numerical predictions to assess the effectiveness of various endwall treatments for enhancing the efficiency and stall margin of modern high-speed fan rotors. Particular attention was given to examining the effectiveness of endwall treatments in countering the undesirable effects of inflow distortion. Calculations were performed using three different gridding techniques, based on the type of casing treatment being tested and the level of complexity desired in the analysis. In each case, the casing treatment itself is modeled as a discrete object in the overall analysis, and the flow through the casing treatment is determined as part of the solution. A series of calculations was performed for both treated and untreated modern fan rotors, with and without inflow distortion. The effectiveness of the various treatments was quantified, and several physical mechanisms by which endwall treatments achieve their effectiveness are discussed.

  7. Task Analysis in Action: The Role of Information Systems in Communicable Disease Reporting

    OpenAIRE

    Pina, Jamie; Turner, Anne; Kwan-Gett, Tao; Duchin, Jeff

    2009-01-01

    In order to improve the design of information systems for notifiable conditions reporting, it is essential to understand the role of such systems in public health practice. Using qualitative techniques, we performed a task analysis of the activities associated with notifiable conditions reporting at a large urban health department. We identified seventeen primary tasks associated with the use of the department’s information system. The results of this investigation suggest that communicable d...

  8. AGAPE-ET for human error analysis of emergency tasks and its application

    International Nuclear Information System (INIS)

    The paper presents a proceduralised human reliability analysis (HRA) methodology, AGAPE-ET (A Guidance And Procedure for Human Error Analysis for Emergency Tasks), covering both the qualitative error analysis and the quantification of the human error probability (HEP) of emergency tasks in nuclear power plants. The AGAPE-ET method is based on a simplified cognitive model. For each cognitive function, error causes or error-likely situations have been identified, considering the characteristics of the performance of that cognitive function and the influencing mechanism of the performance influencing factors (PIFs) on it. Error analysis items were then determined from the identified error causes or error-likely situations, and a human error analysis procedure based on these items was organised to cue and guide the analysts through the overall human error analysis. The basic scheme for the quantification of HEP consists of multiplying the BHEP assigned to the error analysis item by the weight from the influencing factors decision tree (IFDT) constructed for each cognitive function. The method is characterised by the structured identification of the weak points of the task to be performed and by an efficient analysis process in which the analysts need only work through the necessary cognitive functions. The paper also presents the application of AGAPE-ET to 31 nuclear emergency tasks and its results

  9. Micro analysis of fringe field formed inside LDA measuring volume

    Science.gov (United States)

    Ghosh, Abhijit; Nirala, A. K.

    2016-05-01

    In the present study we propose a technique for micro analysis of fringe field formed inside laser Doppler anemometry (LDA) measuring volume. Detailed knowledge of the fringe field obtained by this technique allows beam quality, alignment and fringe uniformity to be evaluated with greater precision and may be helpful for selection of an appropriate optical element for LDA system operation. A complete characterization of fringes formed at the measurement volume using conventional, as well as holographic optical elements, is presented. Results indicate the qualitative, as well as quantitative, improvement of fringes formed at the measurement volume by holographic optical elements. Hence, use of holographic optical elements in LDA systems may be advantageous for improving accuracy in the measurement.
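For reference, the textbook expressions for the fringe spacing in a dual-beam LDA measuring volume and the resulting Doppler frequency are as follows (here $\lambda$ is the laser wavelength, $\theta$ the full beam-intersection angle, and $u_x$ the velocity component perpendicular to the fringes; these symbols are supplied here, not taken from the abstract):

```latex
% Fringe spacing and Doppler frequency in a dual-beam LDA probe volume:
d_f = \frac{\lambda}{2\sin(\theta/2)}, \qquad
f_D = \frac{u_x}{d_f} = \frac{2\,u_x \sin(\theta/2)}{\lambda}
```

Non-uniform fringe spacing across the measurement volume, which the proposed micro analysis characterises, translates directly into a position-dependent bias in the velocity inferred from $f_D$.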

  10. Energy use in the marine transportation industry: Task III. Efficiency improvements; Task IV. Industry future. Final report, Volume IV. [Projections for year 2000

    Energy Technology Data Exchange (ETDEWEB)

    None

    1977-12-01

    Tasks III and IV measure the characteristics of potential research and development programs that could be applied to the maritime industry. It was necessary to identify potential operating scenarios for the maritime industry in the year 2000 and determine the energy consumption that would result given those scenarios. After the introductory chapter the operational, regulatory, and vessel-size scenarios for the year 2000 are developed in Chapter II. In Chapter III, future cargo flows and expected levels of energy use for the baseline 2000 projection are determined. In Chapter IV, the research and development programs are introduced into the future US flag fleet and the energy-savings potential associated with each is determined. The first four appendices (A through D) describe each of the generic technologies. The fifth appendix (E) contains the baseline operating and cost parameters against which 15 program areas were evaluated. (MCW)

  11. District heating and cooling systems for communities through power-plant retrofit and distribution network. Volume 2. Tasks 1-3. Final report. [Downtown Toledo steam system

    Energy Technology Data Exchange (ETDEWEB)

    Watt, J.R.; Sommerfield, G.A.

    1979-08-01

    Each of the tasks is described separately: Task 1 - Demonstration Team; Task 2 - Identify Thermal Energy Source(s) and Potential Service Area(s); and Task 3 - Energy Market Analysis. The purpose of the project is to establish and implement measures in the downtown Toledo steam system for conserving scarce fuel supplies through cogeneration, by retrofit of existing base- or intermediate-loaded electric-generating plants to provide for central heating and cooling systems, with the ultimate purpose of applying the results to other communities. For Task 1, Toledo Edison Company has organized a Demonstration Team (Battelle Columbus Laboratories; Stone and Webster; Ohio Dept. of Energy; Public Utilities Commission of Ohio; Toledo Metropolitan Area Council of Governments; and Toledo Edison) that it hopes has the expertise to evaluate the technical, legal, economic, and marketing issues related to the utilization of by-product heat from power generation to supply district heating and cooling services. Task 2 gives a complete technical description of the candidate plant(s), its thermodynamic cycle, role in load dispatch, ownership, and location. It is concluded that the Toledo steam distribution system can be the starting point for developing a new district-heating system to serve an expanding market. Battelle is a member of the team employed as a subcontractor to complete the energy market analysis. The work is summarized in Task 3. (MCW)

  12. The Relative Efficiency of Two Strategies for Conducting Cognitive Task Analysis

    Science.gov (United States)

    Flynn, Catherine L.

    2012-01-01

    Cognitive task analysis (CTA) has evolved over the past half century to capture the mental decisions and analysis that experts have learned to implement when solving complex problems. Since expertise is largely automated and nonconscious, a variety of observation and interview strategies have been developed to identify the most critical cognitive…

  13. Report of a CSNI workshop on uncertainty analysis methods. Volume 1 + 2

    International Nuclear Information System (INIS)

    The OECD NEA CSNI Principal Working Group 2 (PWG2) Task Group on Thermal Hydraulic System Behaviour (TGTHSB) has, in recent years, received presentations of a variety of different methods to analyze the uncertainty in the calculations of advanced unbiased (best estimate) codes. Proposals were also made for an International Standard Problem (ISP) to compare the uncertainty analysis methods. The objectives for the Workshop were to discuss and fully understand the principles of uncertainty analysis relevant to LOCA modelling and like problems, to examine the underlying issues from first principles, in preference to comparing and contrasting the currently proposed methods, to reach consensus on the issues identified as far as possible while not avoiding the controversial aspects, to identify as clearly as possible unreconciled differences, and to issue a Status Report. Eight uncertainty analysis methods were presented. A structured discussion of various aspects of uncertainty analysis followed - the need for uncertainty analysis, identification and ranking of uncertainties, characterisation, quantification and combination of uncertainties and applications, resources and future developments. As a result, the objectives set out above were, to a very large extent, achieved. Plans for the ISP were also discussed. Volume 1 contains a record of the discussions on uncertainty methods. Volume 2 is a compilation of descriptions of the eight uncertainty analysis methods presented at the workshop

  14. Trading Volume and Stock Indices: A Test of Technical Analysis

    Directory of Open Access Journals (Sweden)

    Paul Abbondante

    2010-01-01

    Problem statement: Technical analysis, with its emphasis on trading volume, has been used to analyze movements in individual stock prices and make investment recommendations to either buy or sell that stock. Little attention has been paid to investigating the relationship between trading volume and various stock indices. Approach: Since stock indices track overall stock market movements, trends in trading volume could be used to forecast future stock market trends. Instead of focusing only on individual stocks, this study examined movements in major stock markets as a whole. Regression analysis was used to investigate the relationship between trading volume and five popular stock indices, using daily data from January 2000 to June 2010. A lag of 5 days was used because this represents the prior week of trading volume. The total sample size ranges from 1,534 to 2,638 observations. Smaller samples were used to test which investment horizon explains movements of the indices more completely. Results: The F statistics were significant for samples using 6 and 16 months of data, but not for a sample of 1 month of data. This is surprising given the short-term focus of technical analysis. The results indicate that above-average returns can be achieved using futures, options, and exchange-traded funds which track these indices. Conclusion: Future research efforts will include out-of-sample forecasting to determine whether above-average returns can be achieved. Additional research can be conducted to determine the optimal number of lags for each index.
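A regression of an index level on the prior week's (five lagged days of) trading volume, as described above, can be sketched with synthetic data. The paper's actual data and indices are not reproduced; every number below is invented so the fitted coefficients can be checked against known values:

```python
import numpy as np

# Synthetic stand-ins for daily volume and an index level (illustrative only)
rng = np.random.default_rng(1)
n = 400
volume = rng.lognormal(mean=10, sigma=0.3, size=n)    # daily trading volume
# Index level driven by a known linear combination of the 5 prior volumes
true_beta = np.array([0.5, 0.3, 0.1, 0.05, 0.05])
lags = np.column_stack([volume[4 - k:n - 1 - k] for k in range(5)])  # t-1..t-5
index = 1000 + lags @ true_beta + rng.normal(scale=5, size=lags.shape[0])

# Ordinary least squares with an intercept, mirroring a 5-day-lag regression
X = np.column_stack([np.ones(lags.shape[0]), lags])
beta_hat, *_ = np.linalg.lstsq(X, index, rcond=None)
print(beta_hat[1:])   # slope estimates; should be close to true_beta
```

On real data one would regress returns or index changes rather than levels and report the F statistic over the five lag coefficients, as the study does.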

  15. Co-Constructional Task Analysis: Moving beyond Adult-Based Models to Assess Young Children's Task Performance

    Science.gov (United States)

    Lee, Scott Weng Fai

    2013-01-01

    The assessment of young children's thinking competence in task performances has typically followed the novice-to-expert regimen involving models of strategies that adults use when engaged in cognitive tasks such as problem-solving and decision-making. Socio-constructivists argue for a balanced pedagogical approach between the adult and child…

  16. Analysis of Member State RED implementation. Final Report (Task 2)

    Energy Technology Data Exchange (ETDEWEB)

    Peters, D.; Alberici, S.; Toop, G. [Ecofys, Utrecht (Netherlands); Kretschmer, B. [Institute for European Environmental Policy IEEP, London (United Kingdom)

    2012-12-15

    This report describes the way EU Member States have transposed the sustainability and chain of custody requirements for biofuels as laid down in the Renewable Energy Directive (RED) and Fuel Quality Directive (FQD). In the assessment of Member States' implementation, the report mainly focuses on effectiveness and administrative burden. Have Member States transposed the Directives in such a way that compliance with the sustainability criteria can be ensured as effectively as possible? To what extent does the Member States' implementation lead to unnecessary administrative burden for economic operators in the (bio)fuel supply chain? The report focuses specifically on the transposition of the sustainability and chain of custody requirements, not on the target for renewables on transport. This means that for example the double counting provision is not included as part of the scope of this report. This report starts with an introduction covering the implementation of the Renewable Energy (and Fuel Quality) Directive into national legislation, the methodology by which Member States were assessed against effectiveness and administrative burden and the categorisation of Member State's national systems for RED-implementation (Chapter 1). The report continues with a high level description of each Member State system assessed (Chapter 2). Following this, the report includes analysis of the Member States on the effectiveness and administrative burden of a number of key ('major') measures (Chapter 3). The final chapter presents the conclusions and recommendations (Chapter 4)

  17. Repowering analysis: Hanford Generating Project (HGP), Task Order Number 6

    International Nuclear Information System (INIS)

    The Hanford Generating Project (HGP), owned by the Washington Public Power Supply System, consists of two low-pressure steam turbines, generators, and associated equipment located adjacent to the Department of Energy's (DOE) N-Reactor. HGP has been able to produce approximately 800 MWe with low-pressure steam supplied by N-Reactor. DOE has placed N-Reactor in cold standby status for an undetermined length of time, which idles the HGP since no alternative source of steam is available. Bonneville Power Administration contracted with Fluor Daniel, Inc. to investigate the feasibility and cost of constructing a new source of steam for (repowering) one of the HGP turbines. The steam turbine is currently operated with 135 psia steam. The turbines can be rebuilt to operate with 500 psia steam pressure by adding additional stages, buckets, nozzles, and diaphragms. Because of the low-pressure design, this turbine can never achieve the efficiencies possible in new high-pressure turbines, but the presence of existing equipment reduces the capital cost of a new generating resource. Five repowering options were investigated in this study: three cases utilizing gas turbine combined-cycle steam generation equipment, one case utilizing a gas-fired boiler, and one case utilizing a coal-fired boiler. This report presents Fluor Daniel's analysis of these repowering options

  18. Development of contextual task analysis for NPP control room operators' work

    International Nuclear Information System (INIS)

    The paper introduces a contextual approach to task analysis concerning control room operators' tasks and task conditions in nuclear power plants. The approach is based on the ecological concept of the situational appropriateness of activity. The task demands depend on the operators' ultimate task, which is to maintain the critical safety functions of the process; the context also sets boundary conditions on the fulfilment of these demands. Conceptualising the context makes it possible to comprehend, and make visible, the core demands of the operators' work. Characteristic of the approach is that the conceptualisation is made both from the point of view of the operators, who are making interpretations of the situation, and from the point of view of the process to be controlled. The context is described as a world of the operators' possibilities and constraints and, at the same time, in relation to the demands set by the nature of the process. The method is under development and has been applied in simulator training, in the evaluation of control room information, and in the integrated development of reliability analysis. The method emphasises the role of explicit conceptualisation of task situations: explicitness enhances its role as a conceptual tool and therefore promotes common awareness in these domains. (orig.)

  19. Latent human error analysis and efficient improvement strategies by fuzzy TOPSIS in aviation maintenance tasks.

    Science.gov (United States)

    Chiu, Ming-Chuan; Hsieh, Min-Chih

    2016-05-01

    The purposes of this study were to develop a latent human error analysis process, to explore the factors of latent human error in aviation maintenance tasks, and to provide an efficient improvement strategy for addressing those errors. First, we used HFACS and RCA to define the error factors related to aviation maintenance tasks. Fuzzy TOPSIS with four criteria was then applied to evaluate the error factors. Results show that (1) adverse physiological states, (2) physical/mental limitations, and (3) coordination, communication, and planning are the factors related to airline maintenance tasks that could be addressed most easily and efficiently. This research establishes a new analytic process for investigating latent human error and provides a strategy for analyzing it using fuzzy TOPSIS. The analysis process addresses shortcomings in existing methodologies by incorporating improvement efficiency, and it enhances the depth and breadth of human error analysis methodology. PMID:26851473
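The ranking step of a TOPSIS analysis can be sketched as follows, in crisp (non-fuzzy) form; the fuzzy variant used in the study applies the same ranking logic to fuzzy scores. The decision matrix, criterion weights, and the treatment of all criteria as benefits are illustrative assumptions, not the study's data:

```python
import numpy as np

# Rows = candidate error factors, columns = four evaluation criteria
scores = np.array([
    [7.0, 8.0, 6.0, 7.0],   # factor A
    [5.0, 4.0, 5.0, 6.0],   # factor B
    [8.0, 9.0, 8.0, 8.0],   # factor C
])
weights = np.array([0.3, 0.3, 0.2, 0.2])   # assumed criterion weights

# 1) vector-normalize each criterion column, 2) apply the weights
norm = scores / np.linalg.norm(scores, axis=0)
v = norm * weights

# 3) ideal and anti-ideal solutions (all criteria treated as benefits here)
ideal, anti = v.max(axis=0), v.min(axis=0)

# 4) Euclidean distances and the closeness coefficient in [0, 1]
d_pos = np.linalg.norm(v - ideal, axis=1)
d_neg = np.linalg.norm(v - anti, axis=1)
closeness = d_neg / (d_pos + d_neg)
print(np.argsort(-closeness))   # factors ranked best-first
```

In the fuzzy version, the scores are triangular fuzzy numbers and the distances are computed between fuzzy quantities before ranking.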

  20. Using plant procedures as the basis for conducting a job and task analysis

    International Nuclear Information System (INIS)

    Plant procedures were selected by Northeast Utilities (NU) as the basis for conducting Job and Task Analyses (JTA). The resultant JTA was used to design procedure-based simulator training programs for Millstone 1, 2, and Connecticut Yankee. The task listings were both plant-specific and exhibited excellent correlation to INPO's generic PWR and BWR task analyses. Using the procedure-based method enabled us to perform the JTA using plant and training staff, which proved cost-effective in terms of both time and money. Learning objectives developed from the JTA were easily justified and correlated directly to job performance within the context of the plant procedures. In addition, the analysis generated a comprehensive review of plant procedures and, conversely, the plant's normal procedure revision process generated an automatic trigger for updating the task data

  1. Analysis of the chemical equilibrium of combustion at constant volume

    OpenAIRE

    Marius BREBENEL

    2014-01-01

    Determining the composition of a mixture of combustion gases at a given temperature is based on chemical equilibrium, with the equilibrium constants calculated on the assumption of constant pressure and temperature. In this paper, an analysis of the changes occurring when combustion takes place at constant volume is presented, and a specific formula for the equilibrium constant is derived. The simple reaction of carbon combustion in pure oxygen in both cases (constant pressure and constant ...
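For context, the standard link between the pressure-based and concentration-based equilibrium constants (a textbook relation, not the paper's specific constant-volume derivation) is:

```latex
K_p = K_c\,(RT)^{\Delta n}
```

where $\Delta n$ is the change in the number of moles of gas across the reaction; at constant volume the equilibrium composition follows from $K_c$ directly, which is why the two cases yield different formulas.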

  2. Graphic analysis of flow-volume curves: a pilot study

    OpenAIRE

    Lee, Jungsil; Lee, Choon-Taek; Lee, Jae Ho; Cho, Young-Jae; Park, Jong Sun; Oh, Yeon-Mok; Lee, Sang-Do; Yoon, Ho Il; ,

    2016-01-01

    Background Conventional spirometric parameters have shown poor correlation with the symptoms and health status of chronic obstructive pulmonary disease (COPD). While it is well known that the pattern of the expiratory flow-volume curve (EFVC) reflects ventilatory dysfunction, few attempts have been made to derive quantitative parameters by analyzing the curve. In this study, we aimed to derive useful parameters from the EFVC via graphic analysis and tried to validate them in patients with COPD. ...

  3. [Ocra Method: development of a new procedure for analysis of multiple tasks subject to infrequent rotation].

    Science.gov (United States)

    Occhipinti, E; Colombini, Daniela; Occhipinti, M

    2008-01-01

    In the Ocra methods (Ocra index and Ocra Checklist), when computing the final indices (Ocra index or checklist score) in the case of more than one repetitive task, a "traditional" procedure was already proposed, the results of which can be described as a "time-weighted average". This approach is appropriate when rotations among tasks occur very frequently, for instance almost once every hour (or over shorter periods). However, when rotation among repetitive tasks is less frequent (i.e. once every 1 1/2 hours or more), the "time-weighted average" approach may underestimate the exposure level (as it practically flattens peaks of high exposure). For those scenarios an alternative approach based on the "most stressful task as minimum" may be more realistic. This latter approach has already been included in the NIOSH approach for multiple sequential lifting tasks and, given the recent availability in the Ocra method of more detailed duration multipliers (practically a different Du(M) for each one-hour step of repetitive task duration), it is now possible to define a particular procedure to compute the complex Ocra Multitask Index (cOCRA) and the complex Checklist Score (cCHESCO) for the analysis of two or more repetitive tasks when rotations are infrequent (rotations every 1 1/2 hours or more). The result of this approach will be at least equal to the index of the most stressful task considered for its individual daily duration, and at most equal to the index of the most stressful task when it is (only theoretically) considered as lasting for the overall daily duration of all examined repetitive tasks. The procedure is based on the following formula: Complex Ocra Multitask Index = ocra1(Dum1) + (Δocra1 × K), where 1, 2, 3, ..., N = repetitive tasks ordered by ocra index values (1 = highest; N = lowest), computed considering the respective real duration multipliers (Dum(i)); ocra1 = ocra index of
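The bounding behaviour described in the abstract can be sketched as follows; the exact definition of the weighting factor K is given in the paper and is treated here simply as a hypothetical input in [0, 1]:

```python
def complex_ocra(ocra1_own_duration, ocra1_total_duration, k):
    """Sketch of the bounding behaviour described in the abstract.

    ocra1_own_duration   -- index of the most stressful task, scored with
                            its real duration multiplier Dum1
    ocra1_total_duration -- the same task scored as if it lasted the whole
                            daily duration of all repetitive tasks
    k                    -- weighting factor in [0, 1]; its exact
                            definition is given in the paper and is NOT
                            reproduced here
    """
    delta = ocra1_total_duration - ocra1_own_duration
    return ocra1_own_duration + delta * k
```

At k = 0 the result equals the most stressful task at its own duration (the lower bound); at k = 1 it equals that task stretched over the whole day (the upper bound).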

  4. Using cost – volume – profit analysis by management

    OpenAIRE

    Trifan, A.; Anton, C. E.

    2011-01-01

    Founded on the distinction between variable costs and fixed costs, the analysis of the relationship between the volume of activity, costs and profits is directed at decision-making, in order to guide an entity’s management to obtain optimal results. It is known that the models that individualize the development of expenses at an entity’s level represent the basis of cost analysis. Then, given that foresight requires taking fluctuations in activity into account, the grouping of ex...

  5. Using cost – volume – profit analysis by management

    Directory of Open Access Journals (Sweden)

    Trifan, A.

    2011-01-01

    Full Text Available Founded on the distinction between variable costs and fixed costs, the analysis of the relationship between the volume of activity, costs and profits is directed at decision-making, in order to guide an entity’s management to obtain optimal results. It is known that the models that individualize the development of expenses at an entity’s level represent the basis of cost analysis. Then, given that foresight requires taking fluctuations in activity into account, the grouping of expenses into variable and fixed will be used for forecasting management, for evaluating an entity’s performance and for analyzing decisional alternatives.
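In its simplest form, the cost-volume-profit relationship the abstract describes reduces to the classic break-even computation; the figures below are invented for illustration:

```python
def break_even_units(fixed_costs, price_per_unit, variable_cost_per_unit):
    # The contribution margin per unit must cover fixed costs at break-even.
    margin = price_per_unit - variable_cost_per_unit
    if margin <= 0:
        raise ValueError("price must exceed variable cost per unit")
    return fixed_costs / margin

# Illustrative figures only: 50,000 fixed costs, unit price 25, unit variable cost 15.
units = break_even_units(50_000, 25, 15)  # 5000.0 units to break even
```

Above this volume each extra unit contributes its margin directly to profit, which is why separating variable from fixed expenses matters for the decisional analysis described.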

  6. Parallel runway requirement analysis study. Volume 2: Simulation manual

    Science.gov (United States)

    Ebrahimi, Yaghoob S.; Chun, Ken S.

    1993-01-01

    This document is a user manual for operating the PLAND_BLUNDER (PLB) simulation program. The simulation is based on two aircraft approaching parallel runways independently, using parallel Instrument Landing System (ILS) equipment during Instrument Meteorological Conditions (IMC). If an aircraft deviates from its assigned localizer course toward the opposite runway, this constitutes a blunder which could endanger the aircraft on the adjacent path. The worst-case scenario would be if the blundering aircraft were unable to recover and continued toward the adjacent runway. PLAND_BLUNDER is a Monte Carlo-type simulation which models the events and aircraft positioning during such a blunder situation. The model simulates two aircraft performing parallel ILS approaches using Instrument Flight Rules (IFR) or visual procedures. PLB uses a simple movement model and control law in three dimensions (X, Y, Z). The parameters of the simulation inputs and outputs are defined in this document, along with a sample of the statistical analysis. This document is the second volume of a two-volume set; Volume 1 describes the application of the PLB to the analysis of close parallel runway operations.
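In the spirit of the PLAND_BLUNDER approach, a toy Monte Carlo of a blunder scenario might look like the sketch below; all parameters (runway separation, speed, blunder angle, response-delay distribution, miss threshold) are invented placeholders, not the model's actual values:

```python
import math
import random

def estimate_blunder_risk(n_trials=10_000, separation_m=1035.0,
                          speed_mps=70.0, blunder_angle_deg=30.0,
                          threshold_m=150.0, seed=1):
    """Toy Monte Carlo sketch, not the PLB model: a blundering aircraft
    closes laterally on the adjacent approach path until the evading
    aircraft reacts after a random response delay."""
    rng = random.Random(seed)
    # Lateral closure rate for a constant-angle deviation.
    closure = speed_mps * math.sin(math.radians(blunder_angle_deg))
    hits = 0
    for _ in range(n_trials):
        # Assumed response-delay distribution (mean 8 s, sd 3 s).
        delay_s = max(0.0, rng.gauss(8.0, 3.0))
        lateral_travel = closure * delay_s
        if separation_m - lateral_travel < threshold_m:
            hits += 1
    return hits / n_trials

risk = estimate_blunder_risk()
```

Repeating the trials over distributions of aircraft states is what turns the simple movement model into a statistical estimate of the minimum safe runway separation.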

  7. CancellationTools: All-in-one software for administration and analysis of cancellation tasks.

    Science.gov (United States)

    Dalmaijer, Edwin S; Van der Stigchel, Stefan; Nijboer, Tanja C W; Cornelissen, Tim H W; Husain, Masud

    2015-12-01

    In a cancellation task, a participant is required to search for and cross out ("cancel") targets, which are usually embedded among distractor stimuli. The number of cancelled targets and their location can be used to diagnose the neglect syndrome after stroke. In addition, the organization of search provides a potentially useful way to measure executive control over multitarget search. Although many useful cancellation measures have been introduced, most fail to make their way into research studies and clinical practice due to the practical difficulty of acquiring such parameters from traditional pen-and-paper measures. Here we present new, open-source software that is freely available to all. It allows researchers and clinicians to flexibly administer computerized cancellation tasks using stimuli of their choice, and to directly analyze the data in a convenient manner. The automated analysis suite provides output that includes almost all of the currently existing measures, as well as several new ones introduced here. All tasks can be performed using either a computer mouse or a touchscreen as an input device, and an online version of the task runtime is available for tablet devices. A summary of the results is produced in a single A4-sized PDF document, including high quality data visualizations. For research purposes, batch analysis of large datasets is possible. In sum, CancellationTools allows users to employ a flexible, computerized cancellation task, which provides extensive benefits and ease of use. PMID:25381020
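A minimal example of the kind of analysis such software automates: counting cancelled targets and computing one crude search-organisation score, here the absolute correlation between cancellation order and horizontal position (an invented simplification of the organisation measures the package actually reports):

```python
def cancellation_summary(cancelled_points):
    """cancelled_points: list of (x, y) coordinates in cancellation order.
    Returns the count and a crude organisation score: the absolute Pearson
    correlation between cancellation order and x-coordinate.  A score near
    1 suggests a systematic left-to-right (or right-to-left) search."""
    n = len(cancelled_points)
    if n < 2:
        return n, 0.0
    order = list(range(n))
    xs = [p[0] for p in cancelled_points]
    mo, mx = sum(order) / n, sum(xs) / n
    cov = sum((o - mo) * (x - mx) for o, x in zip(order, xs))
    so = sum((o - mo) ** 2 for o in order) ** 0.5
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    r = cov / (so * sx) if so and sx else 0.0
    return n, abs(r)

# A perfectly left-to-right search over four targets scores 1.0.
count, organisation = cancellation_summary([(0, 0), (1, 0), (2, 0), (3, 0)])
```

The real analysis suite adds spatial measures (e.g. centre of cancellation for neglect) on top of such order-based statistics.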

  8. Cognitive Task Analysis for Instruction in Single-Injection Ultrasound Guided-Regional Anesthesia

    Science.gov (United States)

    Gucev, Gligor V.

    2012-01-01

    Cognitive task analysis (CTA) is methodology for eliciting knowledge from subject matter experts. CTA has been used to capture the cognitive processes, decision-making, and judgments that underlie expert behaviors. A review of the literature revealed that CTA has not yet been used to capture the knowledge required to perform ultrasound guided…

  9. Task Analysis and Profile Construction in the Remediation of LD Students.

    Science.gov (United States)

    Silber, Leon D.

    1982-01-01

    The author presents a task analysis model for interpreting diagnostic test performance and for planning a remedial program for learning disabled students. A survey of evaluation instruments is given for eight skill areas, including oral expression, listening comprehension, and basic reading skills. A case study illustrates the approach. (CL)

  10. Use of Job Task Analysis (JTA) in the development of craft training programs

    International Nuclear Information System (INIS)

    Northern States Power Company is making a major effort to develop performance based training. It is finding the use of JTA data very helpful in the revision of its maintenance craft training programs. The technique being used involves a group of interns from the Training and Development Program of the University of Minnesota. These interns are largely graduate students, but with no nuclear and little mechanical/electrical experience. A Job Analysis for each discipline was used to: guide the following task analysis, determine program content, evaluate existing OJT check lists, and to define the four crafts used for mechanical maintenance. From the Job Analysis, a Training Task List was developed and correlated to training materials. The analysis of the tasks on the Training Task List is proceeding. Taxonomies of systems or subjects are compared to existing lesson plans. These taxonomies are useful when writing new lesson plans. The taxonomies are an excellent start for the development of enabling objectives. A Nine-Step Plan is being followed in the application of JTA data to the development and refinement of performance based training

  11. Analysis of Tasks in Pre-Service Elementary Teacher Education Courses

    Science.gov (United States)

    Sierpinska, Anna; Osana, Helena

    2012-01-01

    This paper presents some results of research aimed at contributing to the development of a professional knowledge base for teachers of elementary mathematics methods courses, called here "teacher educators." We propose that a useful unit of analysis for this knowledge could be the tasks in which teacher-educators engage pre-service…

  12. Teacher Analysis of Student Knowledge (TASK): A Measure of Learning Trajectory-Oriented Formative Assessment

    Science.gov (United States)

    Supovitz, Jonathan; Ebby, Caroline B.; Sirinides, Philip

    2013-01-01

    This interactive electronic report provides an overview of an innovative new instrument developed by researchers at the Consortium for Policy Research in Education (CPRE) to authentically measure teachers' formative assessment practices in mathematics. The Teacher Analysis of Student Knowledge, or TASK, instrument assesses mathematics…

  13. Children's Understanding of Large-Scale Mapping Tasks: An Analysis of Talk, Drawings, and Gesture

    Science.gov (United States)

    Kotsopoulos, Donna; Cordy, Michelle; Langemeyer, Melanie

    2015-01-01

    This research examined how children represent motion in large-scale mapping tasks that we referred to as "motion maps". The underlying mathematical content was transformational geometry. In total, 19 children, 8- to 10-year-old, created motion maps and captured their motion maps with accompanying verbal description digitally. Analysis of…

  14. The Use of Cognitive Task Analysis to Capture Expertise for Tracheal Extubation Training in Anesthesiology

    Science.gov (United States)

    Embrey, Karen K.

    2012-01-01

    Cognitive task analysis (CTA) is a knowledge elicitation technique employed for acquiring expertise from domain specialists to support the effective instruction of novices. CTA guided instruction has proven effective in improving surgical skills training for medical students and surgical residents. The standard, current method of teaching clinical…

  15. Analysis of Operators Comments on the PSF Questionnaire of the Task Complexity Experiment 2003/2004

    International Nuclear Information System (INIS)

    Human Reliability Analysis (HRA) methods usually take into account the effect of Performance Shaping Factors (PSF). Therefore, the adequate treatment of PSFs in the HRA of Probabilistic Safety Assessment (PSA) models is of crucial importance, and there is an important need to collect PSF data based on simulator experiments. During the task complexity experiment 2003-2004, carried out in the BWR simulator of the Halden Man-Machine Laboratory (HAMMLAB), PSF data were collected by means of a PSF Questionnaire. Seven crews (composed of shift supervisor, reactor operator and turbine operator) from Swedish Nuclear Power Plants participated in the experiment. The PSF Questionnaire collected data on the factors: procedures, training and experience, indications, controls, team management, team communication, individual work practice, available time for the tasks, number of tasks or information load, masking and seriousness. The main statistically significant results of the Performance Shaping Factors data collection and analysis of the task complexity experiment 2003/2004 are presented (HWR-810). The analysis of the comments about PSFs, provided by operators on the PSF Questionnaire, is described. The comments provided for each PSF on the scenarios have been summarised using a content analysis technique. (Author)

  16. Analysis of Operators Comments on the PSF Questionnaire of the Task Complexity Experiment 2003/2004

    Energy Technology Data Exchange (ETDEWEB)

    Torralba, B.; Martinez-Arias, R.

    2007-07-01

    Human Reliability Analysis (HRA) methods usually take into account the effect of Performance Shaping Factors (PSF). Therefore, the adequate treatment of PSFs in the HRA of Probabilistic Safety Assessment (PSA) models is of crucial importance, and there is an important need to collect PSF data based on simulator experiments. During the task complexity experiment 2003-2004, carried out in the BWR simulator of the Halden Man-Machine Laboratory (HAMMLAB), PSF data were collected by means of a PSF Questionnaire. Seven crews (composed of shift supervisor, reactor operator and turbine operator) from Swedish Nuclear Power Plants participated in the experiment. The PSF Questionnaire collected data on the factors: procedures, training and experience, indications, controls, team management, team communication, individual work practice, available time for the tasks, number of tasks or information load, masking and seriousness. The main statistically significant results of the Performance Shaping Factors data collection and analysis of the task complexity experiment 2003/2004 are presented (HWR-810). The analysis of the comments about PSFs, provided by operators on the PSF Questionnaire, is described. The comments provided for each PSF on the scenarios have been summarised using a content analysis technique. (Author)

  17. Toward the Development of Cognitive Task Difficulty Metrics to Support Intelligence Analysis Research

    Energy Technology Data Exchange (ETDEWEB)

    Greitzer, Frank L.

    2005-08-08

    Intelligence analysis is a cognitively complex task that is the subject of considerable research aimed at developing methods and tools to aid the analysis process. To support such research, it is necessary to characterize the difficulty or complexity of intelligence analysis tasks in order to facilitate assessments of the impact or effectiveness of tools that are being considered for deployment. A number of informal accounts of "what makes intelligence analysis hard" are available, but there has been no attempt to establish a more rigorous characterization with well-defined difficulty factors or dimensions. This paper takes an initial step in this direction by describing a set of proposed difficulty metrics based on cognitive principles.

  18. Video-task acquisition in rhesus monkeys (Macaca mulatta) and chimpanzees (Pan troglodytes): a comparative analysis

    Science.gov (United States)

    Hopkins, W. D.; Washburn, D. A.; Hyatt, C. W.; Rumbaugh, D. M. (Principal Investigator)

    1996-01-01

    This study describes video-task acquisition in two nonhuman primate species. The subjects were seven rhesus monkeys (Macaca mulatta) and seven chimpanzees (Pan troglodytes). All subjects were trained to manipulate a joystick which controlled a cursor displayed on a computer monitor. Two criterion levels were used: one based on conceptual knowledge of the task and one based on motor performance. Chimpanzees and rhesus monkeys attained criterion in a comparable number of trials using a conceptually based criterion. However, using a criterion based on motor performance, chimpanzees reached criterion significantly faster than rhesus monkeys. Analysis of error patterns and latency indicated that the rhesus monkeys had a larger asymmetry in response bias and were significantly slower in responding than the chimpanzees. The results are discussed in terms of the relation between object manipulation skills and video-task acquisition.

  19. Analysis of volume holographic storage allowing large-angle illumination

    Science.gov (United States)

    Shamir, Joseph

    2005-05-01

    Advanced technological developments have stimulated renewed interest in volume holography for applications such as information storage and wavelength multiplexing for communications and laser beam shaping. In these and many other applications, the information-carrying wave fronts usually possess narrow spatial-frequency bands, although they may propagate at large angles with respect to each other or a preferred optical axis. Conventional analytic methods are not capable of properly analyzing the optical architectures involved. For mitigation of the analytic difficulties, a novel approximation is introduced to treat narrow spatial-frequency band wave fronts propagating at large angles. This approximation is incorporated into the analysis of volume holography based on a plane-wave decomposition and Fourier analysis. As a result of the analysis, the recently introduced generalized Bragg selectivity is rederived for this more general case and is shown to provide enhanced performance for the above indicated applications. The power of the new theoretical description is demonstrated with the help of specific examples and computer simulations. The simulations reveal some interesting effects, such as coherent motion blur, that were predicted in an earlier publication.

  20. Analysis of Cavity Volumes in Proteins Using Percolation Theory

    Science.gov (United States)

    Green, Sheridan; Jacobs, Donald; Farmer, Jenny

    Molecular packing is studied in a diverse set of globular proteins in their native state, ranging in size from 34 to 839 residues. A new algorithm has been developed that builds upon the classic Hoshen-Kopelman algorithm for site percolation, combined with a local connection criterion that classifies empty space within a protein as a cavity when it is large enough to hold a spherical probe of radius R, and as microvoid otherwise. Although microvoid cannot fit an object (e.g. a molecule or ion) the size of the probe or larger, total microvoid volume is a major contribution to protein volume. Importantly, the cavity and microvoid classification depends on probe radius: as probe size decreases, less microvoid forms in favor of more cavities. As probe size is varied from large to small, many disconnected cavities merge to form a percolating path. For fixed probe size, the microvoid, cavity and solvent-accessible boundary volume properties reflect conformational fluctuations. These results are visualized on three-dimensional structures. Analysis of the cluster statistics within the framework of percolation theory suggests that interconversion between microvoid and cavity pathways regulates the dynamics of solvent penetration during partial unfolding events important to protein function.
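The cluster-labelling core of such an approach, a Hoshen-Kopelman-style pass over a 3D occupancy grid with union-find, can be sketched as follows (a minimal reimplementation of the classic algorithm, not the authors' code):

```python
def label_clusters(grid):
    """Hoshen-Kopelman-style site-percolation labelling of a 3D boolean
    grid (True = empty site) with 6-connectivity, using union-find."""
    parent = {}

    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]  # path halving
            a = parent[a]
        return a

    def union(a, b):
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[rb] = ra

    nx, ny, nz = len(grid), len(grid[0]), len(grid[0][0])
    for i in range(nx):
        for j in range(ny):
            for k in range(nz):
                if not grid[i][j][k]:
                    continue
                site = (i, j, k)
                parent.setdefault(site, site)
                # Only look back at already-visited neighbours.
                for ni, nj, nk in ((i - 1, j, k), (i, j - 1, k), (i, j, k - 1)):
                    if ni >= 0 and nj >= 0 and nk >= 0 and grid[ni][nj][nk]:
                        union((ni, nj, nk), site)

    clusters = {}
    for site in parent:
        clusters.setdefault(find(site), []).append(site)
    return list(clusters.values())

# Tiny 2x2x2 demo: two empty sites joined along z, plus one isolated site.
demo = [[[True, True], [False, False]],
        [[False, False], [False, True]]]
clusters = label_clusters(demo)
```

The cavity-versus-microvoid classification described in the abstract would then be a per-cluster test against the probe radius, which this sketch does not attempt.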

  1. Factor analysis for imperfect maintenance planning at nuclear power plants by cognitive task analysis

    International Nuclear Information System (INIS)

    Imperfect maintenance planning has frequently been identified in domestic nuclear power plants. To prevent such events, we analyzed causal factors in the maintenance planning stages and showed the directionality of countermeasures in this study. There is a pragmatic limit to finding causal factors from items based on report descriptions. Therefore, the idea of the systemic accident model, which is used to monitor performance variability in normal circumstances, was adopted as a new concept instead of investigating negative factors. As the actual method for analyzing usual activities, cognitive task analysis (CTA) was applied. Persons who had experienced various maintenance activities at one electric power company were interviewed about the sources related to decision making during maintenance planning, and the usual factors affecting planning were extracted as performance variability factors. The tendency of domestic events was analyzed using the classification of those factors, and the directionality of countermeasures was shown. The following are critical for preventing imperfect maintenance planning: the persons in charge should fully understand the situation of the equipment for which they are responsible in the work planning and maintenance evaluation stages, and they should thoroughly understand, for example, the maintenance bases of that equipment. (author)

  2. Parallel runway requirement analysis study. Volume 1: The analysis

    Science.gov (United States)

    Ebrahimi, Yaghoob S.

    1993-01-01

    The correlation of increased flight delays with the level of aviation activity is well recognized. A main contributor to these flight delays has been the capacity of airports. Though new airport and runway construction would significantly increase airport capacity, few programs of this type are currently underway, let alone planned, because of the high cost associated with such endeavors. Therefore, it is necessary to achieve the most efficient and cost-effective use of existing fixed airport resources through better planning and control of traffic flows. In fact, during the past few years the FAA has initiated such an airport capacity program, designed to provide additional capacity at existing airports. Some of the improvements that the program has generated thus far have been based on new Air Traffic Control procedures, terminal automation, additional Instrument Landing Systems, improved controller display aids, and improved utilization of multiple runways/Instrument Meteorological Conditions (IMC) approach procedures. A useful element in understanding potential operational capacity enhancements at high demand airports has been the development and use of an analysis tool called the PLAND_BLUNDER (PLB) Simulation Model. The objective for building this simulation was to develop a parametric model that could be used for analysis in determining the minimum safety level of parallel runway operations for various parameters representing the airplane, navigation, surveillance, and ATC system performance. This simulation is useful as: a quick and economical evaluation of existing environments that are experiencing IMC delays, an efficient way to study and validate proposed procedure modifications, an aid in evaluating requirements for new airports or new runways in old airports, a simple, parametric investigation of a wide range of issues and approaches, an ability to tradeoff air and ground technology and procedures contributions, and a way of considering probable

  3. Motion analysis of knee joint using dynamic volume images

    Science.gov (United States)

    Haneishi, Hideaki; Kohno, Takahiro; Suzuki, Masahiko; Moriya, Hideshige; Mori, Sin-ichiro; Endo, Masahiro

    2006-03-01

    Acquisition and analysis of the three-dimensional movement of the knee joint is desired in orthopedic surgery. We have developed two methods to obtain dynamic volume images of the knee joint. One is a 2D/3D registration method combining bi-plane dynamic X-ray fluoroscopy with static three-dimensional CT; the other uses so-called 4D-CT, with a cone beam and a wide 2D detector. In this paper, we present two analyses of knee joint movement obtained by these methods: (1) transition of the nearest points between femur and tibia, and (2) principal component analysis (PCA) of six parameters representing the three-dimensional movement of the knee. As preprocessing for the analysis, the femur and tibia regions are first extracted from the volume data at each time frame, and then registration of the tibia between different frames is performed using an affine transformation consisting of rotation and translation. The same transformation is applied to the femur as well. Using those image data, the movement of the femur relative to the tibia can be analyzed. Six movement parameters of the femur, consisting of three translation parameters and three rotation parameters, are obtained from those images. In analysis (1), the axis of each bone is first found and then the flexion angle of the knee joint is calculated. For each flexion angle, the minimum distance between femur and tibia and the location giving the minimum distance are found in both the lateral condyle and the medial condyle. As a result, it was observed that the movement of the lateral condyle is larger than that of the medial condyle. In analysis (2), it was found that the movement of the knee can be represented by the first three principal components with a precision of 99.58%, and those three components seem to relate strongly to three major movements of the femur in the knee bend known in orthopedic surgery.
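The PCA step can be sketched with synthetic data; the six columns below are invented stand-ins for the three translation and three rotation parameters, not the study's measurements:

```python
import numpy as np

# Hypothetical motion data: rows = time frames, cols = six pose parameters
# (three translations, three rotations) of the femur relative to the tibia.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 50)
motion = np.column_stack([np.sin(2 * np.pi * t), 0.5 * np.cos(2 * np.pi * t),
                          0.2 * t, 0.1 * t, 0.05 * np.sin(4 * np.pi * t),
                          0.01 * t])
motion += rng.normal(scale=0.005, size=motion.shape)  # measurement noise

# PCA via SVD of the centred data matrix.
centred = motion - motion.mean(axis=0)
_, s, _ = np.linalg.svd(centred, full_matrices=False)
explained = s**2 / np.sum(s**2)        # variance fraction per component
cum3 = explained[:3].sum()             # fraction captured by the first three
```

With strongly coupled parameters, as in the study, the first few components dominate; the 99.58% figure quoted above is the analogous cumulative fraction for the real knee data.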

  4. Nonparametric estimation receiver operating characteristic analysis for performance evaluation on combined detection and estimation tasks.

    Science.gov (United States)

    Wunderlich, Adam; Goossens, Bart

    2014-10-01

    In an effort to generalize task-based assessment beyond traditional signal detection, there is growing interest in performance evaluation for combined detection and estimation tasks, in which signal parameters such as size, orientation, and contrast are unknown and must be estimated. One motivation for studying such tasks is their rich complexity, which offers potential advantages for imaging system optimization. To evaluate observer performance on combined detection and estimation tasks, Clarkson introduced the estimation receiver operating characteristic (EROC) curve and the area under the EROC curve as a summary figure of merit. This work provides practical tools for EROC analysis of experimental data. In particular, we propose nonparametric estimators for the EROC curve, the area under the EROC curve, and the variance/covariance matrix of a vector of correlated EROC area estimates. In addition, we show that reliable confidence intervals can be obtained for EROC area, and we validate these intervals with Monte Carlo simulation. Application of our methodology is illustrated with an example comparing magnetic resonance imaging k-space sampling trajectories. MATLAB® software implementing the EROC analysis estimators described in this work is publicly available at http://code.google.com/p/iqmodelo/. PMID:26158044
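A nonparametric EROC-area estimator has the same pairwise structure as the Wilcoxon-Mann-Whitney estimator of ROC area, except that each correctly ordered signal/noise pair is credited with the utility of the parameter estimate rather than a constant 1. A sketch of this general form (see the paper for the exact estimators and their variance analysis):

```python
def eroc_area(signal_ratings, signal_utilities, noise_ratings):
    """Average, over all signal/noise trial pairs, of the estimation
    utility credited when the signal trial outscores the noise trial
    (ties receive half credit)."""
    total = 0.0
    for s, u in zip(signal_ratings, signal_utilities):
        for n in noise_ratings:
            if s > n:
                total += u
            elif s == n:
                total += 0.5 * u
    return total / (len(signal_ratings) * len(noise_ratings))

# With all utilities equal to 1 the estimator reduces to ordinary ROC AUC.
area_detection_only = eroc_area([2, 3], [1.0, 1.0], [1, 1])
```

Utilities near 1 reward accurate parameter estimates on detected signals, so the area jointly summarizes detection and estimation performance.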

  5. Articulating training methods using Job Task Analysis (JTA) - determined proficiency levels

    International Nuclear Information System (INIS)

    The INPO task analysis process, as well as that of many utilities, is based on the approach used by the US Navy. This is undoubtedly due to the Navy nuclear background of many of those involved in introducing the systems approach to training to the nuclear power industry. This report outlines an approach, used by a major North-Central utility, which includes a process developed by the Air Force. Air Force task analysis and instructional system development include the use of a proficiency code. The code considers three types of learning - task performance, task knowledge, and subject knowledge - and four levels of competence for each. The use of this classification system facilitates the identification of desired competency levels at the completion of formal training in the classroom and lab, and of informal training on the job. By using the Air Force's proficiency code, the utility's program developers were able to develop generic training for its main training facility and site-specific training at its nuclear plants, using the most efficient and cost-effective training methods

  6. The Use Of Computational Human Performance Modeling As Task Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Jacques Hugo; David Gertman

    2012-07-01

    During a review of the Advanced Test Reactor safety basis at the Idaho National Laboratory, human factors engineers identified ergonomic and human reliability risks involving the inadvertent exposure of a fuel element to the air during manual fuel movement and inspection in the canal. There were clear indications that these risks increased the probability of human error and possible severe physical outcomes to the operator. In response to this concern, a detailed study was conducted to determine the probability of the inadvertent exposure of a fuel element. Due to practical and safety constraints, the task network analysis technique was employed to study the work procedures at the canal. Discrete-event simulation software was used to model the entire procedure as well as the salient physical attributes of the task environment, such as distances walked, the effect of dropped tools, the effect of hazardous body postures, and physical exertion due to strenuous tool handling. The model also allowed analysis of the effect of cognitive processes such as visual perception demands, auditory information and verbal communication. The model made it possible to obtain reliable predictions of operator performance and workload estimates. It was also found that operator workload as well as the probability of human error in the fuel inspection and transfer task were influenced by the concurrent nature of certain phases of the task and the associated demand on cognitive and physical resources. More importantly, it was possible to determine with reasonable accuracy the stages as well as physical locations in the fuel handling task where operators would be most at risk of losing their balance and falling into the canal. The model also provided sufficient information for a human reliability analysis that indicated that the postulated fuel exposure accident was less than credible.
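The task-network style of analysis can be illustrated with a stripped-down Monte Carlo over a sequence of timed, error-prone steps; the step times and error probabilities below are invented placeholders, and the real study modelled far richer physical and cognitive detail:

```python
import random

def simulate_task_network(steps, n_runs=5000, seed=2):
    """Minimal task-network Monte Carlo, illustrative only.  Each step is
    (mean_s, sd_s, p_error); steps run in sequence and an error at any
    step fails the whole task.  Returns (mean completion time in seconds,
    estimated task error probability)."""
    rng = random.Random(seed)
    times, errors = [], 0
    for _ in range(n_runs):
        t, failed = 0.0, False
        for mean_s, sd_s, p_err in steps:
            t += max(0.0, rng.gauss(mean_s, sd_s))  # step duration sample
            if rng.random() < p_err:
                failed = True
        times.append(t)
        errors += failed
    return sum(times) / n_runs, errors / n_runs

# Hypothetical three-step task: approach, inspect, transfer.
mean_time, p_task_error = simulate_task_network(
    [(30, 5, 0.001), (120, 20, 0.005), (60, 10, 0.002)])
```

Layering workload and posture effects onto step durations and error rates, as the study did, is what lets such a model flag the stages where operators are most at risk.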

  7. Multi-task linear programming discriminant analysis for the identification of progressive MCI individuals.

    Directory of Open Access Journals (Sweden)

    Guan Yu

    Full Text Available Accurately identifying mild cognitive impairment (MCI) individuals who will progress to Alzheimer's disease (AD) is very important for making early interventions. Many classification methods focus on integrating multiple imaging modalities such as magnetic resonance imaging (MRI) and fluorodeoxyglucose positron emission tomography (FDG-PET). However, the main challenge for MCI classification using multiple imaging modalities is the existence of a lot of missing data in many subjects. For example, in the Alzheimer's Disease Neuroimaging Initiative (ADNI) study, almost half of the subjects do not have PET images. In this paper, we propose a new and flexible binary classification method, namely Multi-task Linear Programming Discriminant (MLPD) analysis, for incomplete multi-source feature learning. Specifically, we decompose the classification problem into different classification tasks, i.e., one for each combination of available data sources. To solve all the different classification tasks jointly, our proposed MLPD method links them together by constraining them to achieve a similar estimated mean difference between the two classes (under classification) for those shared features. Compared with the state-of-the-art incomplete Multi-Source Feature (iMSF) learning method, instead of constraining different classification tasks to choose a common feature subset for those shared features, MLPD can flexibly and adaptively choose different feature subsets for different classification tasks. Furthermore, our proposed MLPD method can be efficiently implemented by linear programming. To validate our MLPD method, we perform experiments on the ADNI baseline dataset with incomplete MRI and PET images from 167 progressive MCI (pMCI) subjects and 226 stable MCI (sMCI) subjects. We further compared our method with the iMSF method (using incomplete MRI and PET images) and also the single-task classification method (using only MRI or only subjects with both MRI and

  8. BWR stability analysis: methodology of the stability analysis and results of PSI for the NEA/NCR benchmark task

    International Nuclear Information System (INIS)

    The report describes the PSI stability analysis methodology and the validation of this methodology based on the international OECD/NEA BWR stability benchmark task. In the frame of this work, the stability properties of some operation points of the NPP Ringhals 1 have been analysed and compared with the experimental results. (author) figs., tabs., 45 refs

  9. Change Best: Task 2.3. Analysis of policy mix and development of Energy Efficiency Services

    International Nuclear Information System (INIS)

    The aim of the Change Best project is to promote the development of an energy efficiency service (EES) market and to give good practice examples of changes in energy service business, strategies, and supportive policies and measures in the course of the implementation of Directive 2006/32/EC on Energy End-Use Efficiency and Energy Services. This report addresses task 2.3: Analysis of policy mix and development of Energy Efficiency Services.

  10. The Quantitative Overhead Analysis for Effective Task Migration in Biosensor Networks

    OpenAIRE

    Sung-Min Jung; Tae-Kyung Kim; Jung-Ho Eom; Tai-Myoung Chung

    2013-01-01

    We present a quantitative overhead analysis for effective task migration in biosensor networks. A biosensor network is a key technology that can automatically provide accurate and specific parameters of a human in real time. Biosensor nodes are typically very small devices, so their computing resources are restricted. Because of this limitation, a biosensor network is vulnerable to external attacks that aim to exhaust system availability. Since biosensor nodes gener...

  11. The Emotional Stroop Task and Posttraumatic Stress Disorder: a Meta-Analysis

    OpenAIRE

    Cisler, Josh M.; Wolitzky-Taylor, Kate B.; Adams, Thomas G.; Babson, Kimberly A.; Badour, Christal L.; Willems, Jeffrey L.

    2011-01-01

    Posttraumatic stress disorder (PTSD) is associated with significant impairment and lowered quality of life. The emotional Stroop task (EST) has been one means of elucidating some of the core deficits in PTSD, but this literature has remained inconsistent. We conducted a meta-analysis of EST studies in PTSD populations in order to synthesize this body of research. Twenty-six studies were included with 538 PTSD participants, 254 non-trauma exposed control participants (NTC), and 276 trauma expo...

  12. Brief experimental analysis of stimulus prompts for accurate responding on academic tasks in an outpatient clinic.

    OpenAIRE

    McComas, J J; Wacker, D P; Cooper, L J; Asmus, J M; Richman, D; Stoner, B

    1996-01-01

    Brief multielement designs were used to examine the effects of specific instructional strategies on accuracy of academic performance during outpatient evaluations of 4 children with learning disorders. Instructional strategies that improved accuracy on academic tasks were identified for all participants. These results suggest that the application of experimental analysis methodologies to instructional variables may facilitate the identification of stimulus prompts that are associated with enh...

  13. Energy Consumption Analysis Procedure for Robotic Applications in different task motion

    Science.gov (United States)

    Ahmed, Iman; Aris, Ishak b.; Hamiruce Marhaban, Mohammad; Juraiza Ishak, Asnor

    2015-11-01

    This work proposes an energy analysis method for a humanoid robot, covering motion tasks from simple to complex along the energy chain. The research developed a procedure for analysing, modelling, and saving energy consumption, applicable not only to this type of robot but to most robots that use electrical power as their energy source. The method was validated by accurately integrating the power consumption curve in Matlab to calculate the energy of individual and multiple servo motors. This study can therefore be considered a procedure for energy analysis that uses laboratory instruments to measure the energy parameters. We performed various task motions at different angular speeds to find the speed limits with respect to robot stability and control strategy. A battery capacity investigation covered several battery types to extract a power modelling equation and an energy density parameter for each type, and Matlab software was built to implement the algorithm and to evaluate the measured energy, represented by the area under the power curves. This provides a robust estimate of the energy required in different task motions, to be considered in energy saving (i.e., motion planning and real-time scheduling).
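The core computation the abstract describes, energy as the area under a measured power curve, can be sketched as follows; the sampling times and servo power values are illustrative, not data from the paper:

```python
# Energy as the area under a sampled power curve, via trapezoidal integration.
# The power samples below are illustrative stand-ins for servo measurements.

def energy_joules(times_s, power_w):
    """Trapezoidal integral of power over time -> energy in joules."""
    total = 0.0
    for i in range(1, len(times_s)):
        dt = times_s[i] - times_s[i - 1]
        total += 0.5 * (power_w[i] + power_w[i - 1]) * dt
    return total

times = [0.0, 0.5, 1.0, 1.5, 2.0]   # seconds
power = [4.0, 6.0, 8.0, 6.0, 4.0]   # watts, one servo motor

print(energy_joules(times, power))  # -> 12.0
```

Summing per-motor energies over a task motion then gives the total required energy for that motion.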

  14. Proposal of Constraints Analysis Method Based on Network Model for Task Planning

    Science.gov (United States)

    Tomiyama, Tomoe; Sato, Tatsuhiro; Morita, Toyohisa; Sasaki, Toshiro

    Deregulation has been accelerating several activities toward reengineering business processes, such as railway through-service and modal shift in logistics. To make those activities successful, business entities have to define new business rules or know-how (we call them ‘constraints’). According to the new constraints, they need to manage business resources such as instruments, materials, workers, and so on. In this paper, we propose a constraint analysis method to define constraints for task planning of new business processes. To visualize each constraint's influence on planning, we propose a network model which represents allocation relations between tasks and resources. The network can also represent task ordering relations and resource grouping relations. The proposed method formalizes the manual definition of constraints as a process of repeatedly checking the network structure and finding conflicts between constraints. Application to crew scheduling problems shows that the method can adequately represent and define constraints of task planning problems with the following fundamental features: (1) specifying a work pattern for some resources, (2) restricting the number of resources for some works, (3) requiring multiple resources for some works, (4) prior allocation of some resources to some works, and (5) considering the workload balance between resources.
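The allocation network and conflict checking can be sketched minimally as follows; the task, resource, and constraint data are illustrative, not from the paper:

```python
# Tasks, resources, and allocation edges, with a simple conflict check
# against a "number of resources per work" constraint and double-booking.
# All data are made up for illustration.

tasks = {"T1": 2, "T2": 1}                         # task -> resources required
allocation = {"T1": {"R1", "R2"}, "T2": {"R2"}}    # proposed plan

def conflicts(tasks, allocation):
    found = []
    for task, need in tasks.items():
        if len(allocation.get(task, set())) != need:
            found.append(("wrong_count", task))
    seen = {}
    for task, rs in allocation.items():            # a resource serving two
        for r in rs:                               # tasks at once is a conflict
            if r in seen:
                found.append(("double_booked", r))
            seen[r] = task
    return found

print(conflicts(tasks, allocation))  # -> [('double_booked', 'R2')]
```

Iterating "check the network, find a conflict, revise the constraints" mirrors the manual definition process the method formalizes.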

  15. Hawaii Energy Strategy Project 2: Fossil Energy Review. Task IV. Scenario development and analysis

    Energy Technology Data Exchange (ETDEWEB)

    Yamaguchi, N.D.; Breazeale, K. [ed.]

    1993-12-01

    The Hawaii Energy Strategy (HES) Program is a seven-project effort led by the State of Hawaii Department of Business, Economic Development & Tourism (DBEDT) to investigate a wide spectrum of Hawaii energy issues. The East-West Center's Program on Resources: Energy and Minerals, has been assigned HES Project 2, Fossil Energy Review, which focuses on fossil energy use in Hawaii and the greater regional and global markets. HES Project 2 has four parts: Task I (World and Regional Fossil Energy Dynamics) covers petroleum, natural gas, and coal in global and regional contexts, along with a discussion of energy and the environment. Task II (Fossil Energy in Hawaii) focuses more closely on fossil energy use in Hawaii: current utilization and trends, the structure of imports, possible future sources of supply, fuel substitutability, and energy security. Task III's emphasis is Greenfield Options; that is, fossil energy sources not yet used in Hawaii. This task is divided into two sections: first, an in-depth "Assessment of Coal Technology Options and Implications for the State of Hawaii," along with a spreadsheet analysis model, which was subcontracted to the Environmental Assessment and Information Sciences Division of Argonne National Laboratory; and second, a chapter on liquefied natural gas (LNG) in the Asia-Pacific market and the issues surrounding possible introduction of LNG into the Hawaii market.

  16. Analysis of target volumes for gliomas; Volumes-cibles anatomocliniques (GTV et CTV) des tumeurs gliales

    Energy Technology Data Exchange (ETDEWEB)

    Kantor, G. [Centre Regional de Lutte Contre le Cancer, Service de Radiotherapie, Institut Bergonie, 33 - Bordeaux (France); Bordeaux-2 Univ., 33 (France); Loiseau, H. [Hopital Pellegrin-Tripode, Service de Neurochirurgie, 33 - Bordeaux (France); Bordeaux-2 Univ., 33 (France)

    2005-06-15

    Gliomas are the most frequent tumors of the central nervous system in adults. These intra-parenchymal tumors are infiltrative, and the most important criterion for the definition of GTV and CTV is the extent of infiltration. Delineation of GTV and CTV for untreated and resected gliomas remains a controversial and difficult issue because of the discrepancy between real tumor invasion and that estimated by CT or MRI. A joint analysis of four different approaches is particularly helpful: histopathological correlation with CT and MRI, use of new imaging modalities, patterns of relapse after treatment, and interobserver studies. The presence of isolated tumor cells in intact brain, oedema, or adjacent structures requires the definition of two different options for the CTV: i) a geometrical option, with the GTV defined as the tumor mass revealed by the contrast-enhanced zone on CT or MRI and a CTV with an expanded margin of 2 or 3 cm; ii) an anatomic option, including the entire zone of oedema or isolated tumor cell infiltration, extending at least as far as the limits of the hyperintense zone on T2-weighted MRI. Inclusion of adjacent structures (such as white matter, corpus callosum, subarachnoid spaces) in the CTV mainly depends on the site of the tumor, and the resulting volume is generally enlarged. (authors)

  17. Investigation of advanced counterrotation blade configuration concepts for high speed turboprop systems. Task 5: Unsteady counterrotation ducted propfan analysis

    Science.gov (United States)

    Hall, Edward J.; Delaney, Robert A.

    1993-01-01

    The primary objective of this study was the development of a time-marching three-dimensional Euler/Navier-Stokes aerodynamic analysis to predict steady and unsteady compressible transonic flows about ducted and unducted propfan propulsion systems employing multiple blade rows. The computer codes resulting from this study are referred to as ADPAC-AOACR (Advanced Ducted Propfan Analysis Codes-Angle of Attack Coupled Row). This document is the final report describing the theoretical basis and analytical results from the ADPAC-AOACR codes developed under task 5 of NASA Contract NAS3-25270, Unsteady Counterrotating Ducted Propfan Analysis. The ADPAC-AOACR Program is based on a flexible multiple blocked grid discretization scheme permitting coupled 2-D/3-D mesh block solutions with application to a wide variety of geometries. For convenience, several standard mesh block structures are described for turbomachinery applications. Aerodynamic calculations are based on a four-stage Runge-Kutta time-marching finite volume solution technique with added numerical dissipation. Steady flow predictions are accelerated by a multigrid procedure. Numerical calculations are compared with experimental data for several test cases to demonstrate the utility of this approach for predicting the aerodynamics of modern turbomachinery configurations employing multiple blade rows.
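The four-stage Runge-Kutta time marching mentioned above is, at its core, the classic RK4 update; a scalar sketch of that update (not the ADPAC-AOACR implementation, which applies it to finite volume residuals with added dissipation):

```python
# Classic four-stage Runge-Kutta step for a scalar ODE dy/dt = f(t, y).
import math

def rk4_step(f, t, y, dt):
    k1 = f(t, y)
    k2 = f(t + dt / 2, y + dt * k1 / 2)
    k3 = f(t + dt / 2, y + dt * k2 / 2)
    k4 = f(t + dt, y + dt * k3)
    return y + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6

# March dy/dt = -y from y(0) = 1; exact solution is y(t) = exp(-t).
y, t, dt = 1.0, 0.0, 0.1
for _ in range(10):
    y = rk4_step(lambda t, y: -y, t, y, dt)
    t += dt

print(round(y, 4), round(math.exp(-1.0), 4))  # -> 0.3679 0.3679
```

The fourth-order accuracy is what keeps the time-marching error small relative to the spatial discretization error.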

  18. Coal gasification systems engineering and analysis. Volume 1: Executive summary

    Science.gov (United States)

    1980-01-01

    Feasibility analyses and systems engineering studies for a 20,000 tons per day medium Btu (MBG) coal gasification plant to be built by TVA in Northern Alabama were conducted. Major objectives were as follows: (1) provide design and cost data to support the selection of a gasifier technology and other major plant design parameters, (2) provide design and cost data to support alternate product evaluation, (3) prepare a technology development plan to address areas of high technical risk, and (4) develop schedules, PERT charts, and a work breakdown structure to aid in preliminary project planning. Volume one contains a summary of gasification system characterizations. Five gasification technologies were selected for evaluation: Koppers-Totzek, Texaco, Lurgi Dry Ash, Slagging Lurgi, and Babcock and Wilcox. A summary of the trade studies and cost sensitivity analysis is included.

  19. Dependability analysis of a very large volume neutrino telescope

    International Nuclear Information System (INIS)

    This work considers a first order approximation to the dependability analysis of complex large scale installations. The dependability criterion used here is quantitative unavailability, and an appropriate unavailability model is presented. The model assumes that the system is symmetrical, has various levels of hierarchy, and components found in the same level are similar and function independently. The application example comes from very large volume neutrino telescopes installed under water or ice, consisting of several thousands of optical modules. The readout architecture of the detector has several levels of multiplexing including optical detection towers, branches and tower sectors. The paper presents results for various alternative detector layouts and distances of the detector from the onshore facilities. It also develops dependability requirements for major components and/or subsystems consistent with an overall system performance target. The results depict the dependence of the system unavailability on the number of optical modules and the alternative deep sea infrastructure configurations for transferring the measured signals.
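The unavailability model is only described qualitatively above; a minimal sketch of the series/parallel logic such hierarchical models build on, with illustrative numbers rather than the paper's detector parameters:

```python
# First-order unavailability of a level built from independent, similar
# components. Numbers are illustrative, not the telescope's actual values.

def series_unavailability(u, n):
    """n independent components, all required (series logic)."""
    return 1.0 - (1.0 - u) ** n

def parallel_unavailability(u, n):
    """n redundant components, any one sufficient (parallel logic)."""
    return u ** n

# e.g. a readout branch needing all 10 optical modules, each 1% unavailable:
print(round(series_unavailability(0.01, 10), 4))  # -> 0.0956
```

Composing these two rules level by level (modules, branches, sectors, towers) gives the system-level unavailability as a function of the layout.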

  20. Analysis of Partial Volume Effects on Accurate Measurement of the Hippocampus Volume

    Institute of Scientific and Technical Information of China (English)

    Maryam Hajiesmaeili; Jamshid Dehmeshki; Tim Ellis

    2014-01-01

    Hippocampal volume loss is an important biomarker in distinguishing subjects with Alzheimer’s disease (AD), and its measurement in magnetic resonance images (MRI) is influenced by partial volume effects (PVE). This paper describes a post-processing approach to quantify PVE for correction of the hippocampal volume by using a spatial fuzzy C-means (SFCM) method. The algorithm is evaluated on a dataset of 20 T1-weighted MRI scans sampled at two different resolutions. The corrected volumes for the left and right hippocampus (HC) are 23% and 18% lower for the low-resolution dataset, and 6% and 5% lower for the high-resolution dataset, respectively, than the hippocampal volumes obtained from manual segmentation. Results show the importance of applying this technique in AD detection with low-resolution datasets.
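As a rough illustration of the fuzzy C-means machinery underlying SFCM (the spatial regularization term is omitted here), the standard membership update for a single voxel intensity can be sketched as:

```python
# Standard fuzzy C-means membership rule (fuzzifier m = 2) for one 1-D
# intensity value. Voxels with strongly mixed memberships are the ones
# affected by partial volume effects. Intensities are illustrative.

def fcm_memberships(x, centers, m=2.0):
    """Membership of sample x in each cluster center."""
    memberships = []
    for ci in centers:
        di = abs(x - ci) or 1e-12          # guard against zero distance
        inv = sum((di / (abs(x - cj) or 1e-12)) ** (2.0 / (m - 1.0))
                  for cj in centers)
        memberships.append(1.0 / inv)
    return memberships

# Two tissue-class centers (illustrative intensities), one boundary voxel:
u = fcm_memberships(x=75.0, centers=[60.0, 100.0])
print([round(v, 3) for v in u])  # -> [0.735, 0.265]
```

Memberships always sum to one, so a voxel's fractional tissue content can be read off directly and used to correct the volume estimate.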

  1. Synfuel program analysis. Volume 1: Procedures-capabilities

    Science.gov (United States)

    Muddiman, J. B.; Whelan, J. W.

    1980-07-01

    The analytic procedures and capabilities developed by Resource Applications (RA) for examining the economic viability, public costs, and national benefits of alternatives are described. This volume is intended for Department of Energy (DOE) and Synthetic Fuel Corporation (SFC) program management personnel and includes a general description of the costing, venture, and portfolio models, with enough detail for the reader to be able to specify cases and interpret outputs. It contains an explicit description (with examples) of the types of results which can be obtained when applied to the analysis of individual projects; the analysis of input uncertainty, i.e., risk; and the analysis of portfolios of such projects, including varying technology mixes and buildup schedules. The objective is to obtain, on the one hand, comparative measures of private investment requirements and expected returns (under differing public policies) as they affect the private decision to proceed, and, on the other, public costs and national benefits as they affect public decisions to participate (in what form, in what areas, and to what extent).

  2. Task 7a: Dynamic analysis of Paks NPP structures reactor building

    International Nuclear Information System (INIS)

    This report describes the dynamic response calculation of the NPP Paks reactor building to full-scale blast testing. All calculations described in this report were elaborated within the scope of the IAEA co-ordinated research Benchmark Study for seismic analysis/testing of WWER-type NPPs, Task 7a - Dynamic Analysis of Paks NPP structures, i.e., the reactor building. The input, in the form of time histories of velocities or accelerations on the free field caused by the blast testing, was available only to the participants of task No. 7a. The aim of this task is to calculate the dynamic response to the blast load, in the form of floor response spectra in selected nodes of the structure, without knowing the measured data. The data measured in the full-scale blast test are published and the results of the different calculations compared. The following structures were taken into account: turbine hall, intermediate multi-storey building, lateral multi-storey building, reactor building, ventilation center, and condenser towers

  3. Fuzzy logic approach to SWOT analysis for economics tasks and example of its computer realization

    Directory of Open Access Journals (Sweden)

    Vladimir CHERNOV

    2016-07-01

    The article discusses SWOT analysis, a widely used classical method of analysis, forecasting, and decision-making for various economic problems. As is well known, it is a qualitative, multicriteria comparison of degrees of Strength, Weakness, Opportunity, and Threat for different kinds of risks, for forecasting developments in markets, and for assessing the status and development prospects of enterprises, regions, economic sectors, territories, etc. It can also be successfully applied to the evaluation and analysis of various project management tasks: investment, innovation, marketing, development, design, bringing products to market, and so on. In practical competitive market and economic conditions, however, there are various uncertainties, ambiguities, and kinds of vagueness that make the use of SWOT analysis in its classical sense insufficiently justified and ineffective. For this case, the authors propose to use a fuzzy logic approach and the theory of fuzzy sets for a more adequate representation and post-processing of assessments in the SWOT analysis. In particular, the mathematical formulation of the corresponding task and the main approaches to its solution are briefly presented. Examples of suitable computer calculations in the specialized software Fuzicalc, for processing and operating with fuzzy input data, are also given. Finally, considerations for interpreting the results are presented.
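One minimal way to give SWOT ratings a fuzzy flavor, purely as a sketch (not Fuzicalc's actual machinery), is to score factors as triangular fuzzy numbers and compare their centroids:

```python
# SWOT factor ratings as triangular fuzzy numbers (low, mode, high) instead
# of crisp scores, compared by centroid defuzzification. All numbers are
# illustrative expert ratings on a 0-10 scale.

def centroid(tfn):
    """Centroid (defuzzified value) of a triangular fuzzy number."""
    low, mode, high = tfn
    return (low + mode + high) / 3.0

strengths  = [(6, 8, 9), (5, 7, 8)]
weaknesses = [(2, 4, 6)]

s_score = sum(centroid(t) for t in strengths) / len(strengths)
w_score = sum(centroid(t) for t in weaknesses) / len(weaknesses)
print(round(s_score - w_score, 2))  # -> 3.17, net internal position
```

The fuzzy representation carries the experts' uncertainty through the aggregation instead of discarding it at rating time.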

  4. Thermal-Hydraulic Analysis Tasks for ANAV NPPs in Support of Plant Operation and Control

    Directory of Open Access Journals (Sweden)

    L. Batet

    2007-11-01

    Thermal-hydraulic analysis tasks aimed at supporting plant operation and control of nuclear power plants are an important issue for the Asociación Nuclear Ascó-Vandellòs (ANAV). ANAV is the consortium that runs the Ascó power plants (2 units) and the Vandellòs-II power plant. The reactors are Westinghouse-design, 3-loop PWRs with an approximate electrical power of 1000 MW. The Technical University of Catalonia (UPC) thermal-hydraulic analysis team has worked jointly with ANAV engineers at different levels in the analysis and improvement of these reactors. This article is an illustration of the usefulness of computational analysis for operational support. The activities presented were carried out between 1985 and 2001 and subsequently changed slightly following various organizational adjustments. The paper has two parts. The first part describes the specific aspects of thermal-hydraulic analysis tasks related to operation and control; the second part briefly presents the results of three examples of analyses that were performed. All the presented examples relate to actual situations in which the scenarios were studied by analysts using thermal-hydraulic codes and prepared nodalizations. The paper also includes a qualitative evaluation of the benefits obtained by ANAV through thermal-hydraulic analyses aimed at supporting operation and plant control.

  5. Implementation of Hierarchical Task Analysis for User Interface Design in Drawing Application for Early Childhood Education

    Directory of Open Access Journals (Sweden)

    Mira Kania Sabariah

    2016-05-01

    Drawing is an important lesson in early childhood, rich in stimulation of the child's growth and development, and it helps train fine motor skills. Many applications, including interactive learning applications, are available for such learning. Observations showed that the experiences offered by existing applications are very diverse and have not been able to represent the learning model and characteristics of early childhood (4-6 years). Based on these results, the Hierarchical Task Analysis method generated a list of tasks that must be carried out in designing a user interface that represents the user experience in drawing-based learning. Evaluation using the Heuristic Evaluation method showed that the usability of the model reached a very good level of understanding, and the model can be further enhanced to produce a better one.

  6. Aviation and programmatic analyses; Volume 1, Task 1: Aviation data base development and application. [for NASA OAST programs

    Science.gov (United States)

    1977-01-01

    A method was developed for using the NASA aviation data base and computer programs in conjunction with the GE management analysis and projection service to perform simple and complex economic analyses for planning, forecasting, and evaluating OAST programs. Capabilities of the system are discussed, along with procedures for making basic data tabulations, updates, and entries. The system is applied in an agricultural aviation study in order to assess its practical utility in the OAST working environment.

  7. 3-D volume reconstruction of skin lesions for melanin and blood volume estimation and lesion severity analysis.

    Science.gov (United States)

    D'Alessandro, Brian; Dhawan, Atam P

    2012-11-01

    Subsurface information about skin lesions, such as the blood volume beneath the lesion, is important for the analysis of lesion severity towards early detection of skin cancer such as malignant melanoma. Depth information can be obtained from diffuse reflectance based multispectral transillumination images of the skin. An inverse volume reconstruction method is presented which uses a genetic algorithm optimization procedure with a novel population initialization routine and nudge operator based on the multispectral images to reconstruct the melanin and blood layer volume components. Forward model evaluation for fitness calculation is performed using a parallel processing voxel-based Monte Carlo simulation of light in skin. Reconstruction results for simulated lesions show excellent volume accuracy. Preliminary validation is also done using a set of 14 clinical lesions, categorized into lesion severity by an expert dermatologist. Using two features, the average blood layer thickness and the ratio of blood volume to total lesion volume, the lesions can be classified into mild and moderate/severe classes with 100% accuracy. The method therefore has excellent potential for detection and analysis of pre-malignant lesions. PMID:22829392
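A bare-bones genetic algorithm loop of the kind the abstract describes can be sketched as follows; the forward model here is a trivial stand-in for the voxel-based Monte Carlo simulation, and the paper's specific population initialization routine and nudge operator are not reproduced:

```python
# Minimal GA: evolve candidate (blood, melanin) fractions so a forward model
# matches a target "measurement". Everything here is an illustrative toy.
import random

random.seed(1)
TARGET = [0.3, 0.7]               # "true" layer fractions to recover

def forward(candidate):           # stand-in for the Monte Carlo forward model
    return candidate

def fitness(candidate):           # lower = better fit to the measurement
    sim = forward(candidate)
    return sum((s - t) ** 2 for s, t in zip(sim, TARGET))

pop = [[random.random(), random.random()] for _ in range(30)]
for _ in range(200):
    pop.sort(key=fitness)
    parents = pop[:10]            # elitism: keep the 10 best
    children = []
    for _ in range(20):           # blend crossover plus Gaussian mutation
        a, b = random.sample(parents, 2)
        children.append([(x + y) / 2 + random.gauss(0, 0.05)
                         for x, y in zip(a, b)])
    pop = parents + children

best = min(pop, key=fitness)
print([round(v, 2) for v in best])
```

In the real reconstruction, each fitness evaluation is an expensive simulation of light transport in skin, which is why the paper parallelizes it.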

  8. One Size Does Not Fit All: Human Failure Event Decomposition and Task Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ronald Laurids Boring, PhD

    2014-09-01

    In the probabilistic safety assessments (PSAs) used in the nuclear industry, human failure events (HFEs) are determined as a subset of hardware failures, namely those hardware failures that could be triggered or exacerbated by human action or inaction. This approach is top-down, starting with hardware faults and deducing human contributions to those faults. Elsewhere, more traditionally human factors driven approaches would tend to look at opportunities for human errors first in a task analysis and then identify which of those errors is risk significant. The intersection of top-down and bottom-up approaches to defining HFEs has not been carefully studied. Ideally, both approaches should arrive at the same set of HFEs. This question remains central as human reliability analysis (HRA) methods are generalized to new domains like oil and gas. The HFEs used in nuclear PSAs tend to be top-down—defined as a subset of the PSA—whereas the HFEs used in petroleum quantitative risk assessments (QRAs) are more likely to be bottom-up—derived from a task analysis conducted by human factors experts. The marriage of these approaches is necessary in order to ensure that HRA methods developed for top-down HFEs are also sufficient for bottom-up applications. In this paper, I first review top-down and bottom-up approaches for defining HFEs and then present a seven-step guideline to ensure a task analysis completed as part of human error identification decomposes to a level suitable for use as HFEs. This guideline illustrates an effective way to bridge the bottom-up approach with top-down requirements.

  9. Feasibility study of modern airships, phase 1. Volume 2: Parametric analysis (task 3). [lift, weight (mass)

    Science.gov (United States)

    Lancaster, J. W.

    1975-01-01

    Various types of lighter-than-air vehicles from fully buoyant to semibuoyant hybrids were examined. Geometries were optimized for gross lifting capabilities for ellipsoidal airships, modified delta planform lifting bodies, and a short-haul, heavy-lift vehicle concept. It is indicated that: (1) neutrally buoyant airships employing a conservative update of materials and propulsion technology provide significant improvements in productivity; (2) propulsive lift for VTOL and aerodynamic lift for cruise significantly improve the productivity of low to medium gross weight ellipsoidal airships; and (3) the short-haul, heavy-lift vehicle, consisting of a simple combination of an ellipsoidal airship hull and existing helicopter componentry, provides significant potential for low-cost, near-term applications for ultra-heavy lift missions.

  10. Imalytics Preclinical: Interactive Analysis of Biomedical Volume Data.

    Science.gov (United States)

    Gremse, Felix; Stärk, Marius; Ehling, Josef; Menzel, Jan Robert; Lammers, Twan; Kiessling, Fabian

    2016-01-01

    A software tool is presented for interactive segmentation of volumetric medical data sets. To allow interactive processing of large data sets, segmentation operations, and rendering are GPU-accelerated. Special adjustments are provided to overcome GPU-imposed constraints such as limited memory and host-device bandwidth. A general and efficient undo/redo mechanism is implemented using GPU-accelerated compression of the multiclass segmentation state. A broadly applicable set of interactive segmentation operations is provided which can be combined to solve the quantification task of many types of imaging studies. A fully GPU-accelerated ray casting method for multiclass segmentation rendering is implemented which is well-balanced with respect to delay, frame rate, worst-case memory consumption, scalability, and image quality. Performance of segmentation operations and rendering are measured using high-resolution example data sets showing that GPU-acceleration greatly improves the performance. Compared to a reference marching cubes implementation, the rendering was found to be superior with respect to rendering delay and worst-case memory consumption while providing sufficiently high frame rates for interactive visualization and comparable image quality. The fast interactive segmentation operations and the accurate rendering make our tool particularly suitable for efficient analysis of multimodal image data sets which arise in large amounts in preclinical imaging studies. PMID:26909109
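The compressed undo/redo mechanism can be illustrated on the CPU; here zlib stands in for the paper's GPU-accelerated compression, and the segmentation data are illustrative:

```python
# Undo/redo by snapshotting the multiclass segmentation as compressed bytes
# on each edit. zlib is a CPU stand-in for the GPU codec described above.
import zlib

class UndoStack:
    def __init__(self):
        self._undo, self._redo = [], []

    def snapshot(self, labels):   # labels: bytes, one class id per voxel
        self._undo.append(zlib.compress(labels))
        self._redo.clear()        # a new edit invalidates the redo history

    def undo(self, current):
        self._redo.append(zlib.compress(current))
        return zlib.decompress(self._undo.pop())

seg = bytes([0] * 1000)                  # empty segmentation
stack = UndoStack()
stack.snapshot(seg)                      # snapshot before editing
edited = bytes([1] * 500 + [0] * 500)    # paint first half as class 1
restored = stack.undo(edited)
print(restored == seg)                   # -> True
```

Because multiclass label volumes are highly compressible, the memory cost per undo step stays small even for large data sets.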

  11. Analysis of Mexico wind tunnel measurements. Final report of IEA Task 29, Mexnext (Phase 1)

    Energy Technology Data Exchange (ETDEWEB)

    Schepers, J.G.; Boorsma, K. [Energy research Center of the Netherlands ECN, Petten (Netherlands); Cho, T. [Korea Aerospace Research Institute KARI, Daejeon (Korea, Republic of); Gomez-Iradi, S. [National Renewable Energy Center of Spain CENER, Sarriguren (Spain); Schaffarczyk, P. [A. Jeromin University of Applied Sciences, CEWind EG, Kiel (Germany); Shen, W.Z. [The Technical University of Denmark, Kongens Lyngby (Denmark); Lutz, T. [K. Meister University of Stuttgart, Stuttgart (Germany); Stoevesandt, B. [ForWind, Zentrum fuer Windenergieforschung, Oldenburg (Germany); Schreck, S. [National Renewable Energy Laboratory NREL, Golden, CO (United States); Micallef, D.; Pereira, R.; Sant, T. [Delft University of Technology TUD, Delft (Netherlands); Madsen, H.A.; Soerensen, N. [Risoe-DTU, Roskilde (Denmark)

    2012-02-15

    This report describes the work performed within the first phase of IEA Task 29 Mexnext. In this IEA Task 29 a total of 20 organisations from 11 different countries collaborated in analysing the measurements which have been performed in the EU project 'Mexico'. Within this Mexico project 9 European institutes carried out a wind tunnel experiment in the Large Low Speed Facility (LLF) of the German Dutch Wind Facilities DNW on a rotor with a diameter of 4.5 m. Pressure distributions were measured at five locations along the blade along with detailed flow field measurements around the rotor plane using stereo PIV. As a result of the international collaboration within this task a very thorough analysis of the data could be carried out and a large number of codes were validated not only in terms of loads but also in terms of underlying flow field. The detailed pressure measurements along the blade in combination with the detailed flow field measurements gave a unique opportunity to better understand the response of a wind turbine to the incoming flow field. Deficiencies in modelling have been established and directions for model improvement can be given.

  12. Utilizing job/task analysis to establish content validity in the design of training programs

    International Nuclear Information System (INIS)

    The decade of the 1980's has been a turbulent time for the Department of Energy. With concern mounting about the terrorist threat, a wave of congressional inquiries and internal inspections crossed the nation and engulfed many of the nuclear laboratories and facilities operated by DOE contractors. A typical finding was the need to improve, and increase, the training of the protective force. The immediate reaction resulted in a wide variety of responses, with most contractors feeling safer with too much, rather than not enough training. As soon as the initial pressures to upgrade subsided, a task force was established to evaluate the overall training needs. Representatives from the contractor facilities worked together to conduct a job analysis of the protective force. A generic task inventory was established, and validated at the different sites. This list has been invaluable for determining the tasks, conditions, and standards needed to develop well stated learning objectives. The enhanced training programs are being refined to ensure job content validity based on the data collected

  13. Reliability of steam-turbine rotors. Task 1. Lifetime prediction analysis system. Final report

    International Nuclear Information System (INIS)

    Task 1 of RP 502, Reliability of Steam Turbine Rotors, resulted in the development of a computerized lifetime prediction analysis system (STRAP) for the automatic evaluation of rotor integrity based upon the results of a boresonic examination of near-bore defects. Concurrently, an advanced boresonic examination system (TREES), designed to acquire data automatically for lifetime analysis, was developed and delivered to the maintenance shop of a major utility. This system and a semi-automated, state-of-the-art system (BUCS) were evaluated on two retired rotors as part of the Task 2 effort. A modified nonproprietary version of STRAP, called SAFER, is now available for rotor lifetime prediction analysis. STRAP and SAFER share a common fracture analysis postprocessor for rapid evaluation of either conventional boresonic amplitude data or TREES cell data. The final version of this postprocessor contains general stress intensity correlations for elliptical cracks in a radial stress gradient and provision for elastic-plastic instability of the ligament between an embedded crack and the bore surface. Both linear elastic and ligament rupture models were developed for rapid analysis of linkup within three-dimensional clusters of defects. Bore stress-rupture criteria are included, but a creep-fatigue crack growth database is not available. Physical and mechanical properties of air-melt 1CrMoV forgings are built into the program; however, only bounding values of fracture toughness versus temperature are available. Owing to the lack of data regarding the probability of flaw detection for the boresonic systems and the lack of quantitative verification of the flaw linkup analysis, automatic evaluation of boresonic results is not recommended, and the lifetime prediction system is currently restricted to conservative, deterministic analysis of specified flaw geometries

  14. Brain connectivity analysis from EEG signals using stable phase-synchronized states during face perception tasks

    Science.gov (United States)

    Jamal, Wasifa; Das, Saptarshi; Maharatna, Koushik; Pan, Indranil; Kuyucu, Doga

    2015-09-01

    Degree of phase synchronization between different Electroencephalogram (EEG) channels is known to be a manifestation of the underlying mechanism of information coupling between different brain regions. In this paper, we apply a continuous wavelet transform (CWT) based analysis technique to EEG data, captured during face perception tasks, to explore the temporal evolution of phase synchronization from the onset of a stimulus. Our explorations show that there exists a small set (typically 3-5) of unique synchronized patterns or synchrostates, each of which is stable on the order of milliseconds. In particular, in the beta (β) band, which has been reported to be associated with visual processing tasks, the number of such stable states has consistently been found to be three. During processing of the stimulus, the switching between these states occurs abruptly, but the switching characteristic follows a well-behaved and repeatable sequence. This is observed in single-subject analysis as well as in a multiple-subject group analysis in adults during face perception. We also show that although these patterns remain topographically similar for the general category of face perception tasks, the sequence of their occurrence and their temporal stability vary markedly between different face perception scenarios (stimuli), indicating different dynamical characteristics for information processing that are stimulus-specific in nature. Subsequently, we translated these stable states into brain complex networks and derived informative network measures for characterizing the degree of segregated processing and information integration in those synchrostates, leading to a new methodology for characterizing information processing in the human brain. The proposed methodology of modeling functional brain connectivity through synchrostates may be viewed as a new way of quantitatively characterizing the cognitive ability of the subject, the stimuli, and information integration
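The core quantity behind such synchrostate analyses, the phase-locking value (PLV) between channel pairs at a given wavelet frequency, can be sketched as follows. This is a minimal illustration on synthetic signals, not the paper's pipeline; the Morlet parameters, channel setup, and thresholds are our assumptions.

```python
import numpy as np

def morlet_phase(sig, fs, freq, n_cycles=6):
    """Instantaneous phase at one frequency via complex Morlet convolution."""
    t = np.arange(-1, 1, 1 / fs)
    sigma = n_cycles / (2 * np.pi * freq)
    wavelet = np.exp(2j * np.pi * freq * t) * np.exp(-t**2 / (2 * sigma**2))
    return np.angle(np.convolve(sig, wavelet, mode="same"))

def pairwise_plv(phases, win):
    """Pairwise phase-locking values over a sample window.
    phases: (n_channels, n_samples) instantaneous phase."""
    n_ch = phases.shape[0]
    plv = np.zeros((n_ch, n_ch))
    for i in range(n_ch):
        for j in range(n_ch):
            dphi = phases[i, win] - phases[j, win]
            plv[i, j] = np.abs(np.exp(1j * dphi).mean())
    return plv

# Synthetic 4-channel "EEG": two channels share a 20 Hz (beta-band) rhythm.
fs, dur = 256, 2.0
t = np.arange(0, dur, 1 / fs)
rng = np.random.default_rng(0)
common = np.sin(2 * np.pi * 20 * t)
eeg = np.vstack([common + 0.1 * rng.standard_normal(t.size),
                 common + 0.1 * rng.standard_normal(t.size),
                 rng.standard_normal(t.size),
                 rng.standard_normal(t.size)])

phases = np.array([morlet_phase(ch, fs, freq=20.0) for ch in eeg])
plv = pairwise_plv(phases, win=slice(64, 448))  # window avoids edge effects
print(plv[0, 1], plv[0, 2])  # coupled pair locks strongly; noise pair does not
```

In the paper's approach, such PLV (or phase-difference) topographies computed over short windows would then be clustered to find the recurring synchrostates; here we only show the synchronization measure itself.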

  15. Finite-Volume Analysis for the Cahn-Hilliard equation with Dynamic boundary conditions

    OpenAIRE

    Nabet, Flore

    2014-01-01

    This work is devoted to the convergence analysis of a finite-volume approximation of the 2D Cahn-Hilliard equation with dynamic boundary conditions. The method that we propose couples a 2d-finite-volume method in a bounded, smooth domain and a 1d-finite-volume method on its boundary. We prove convergence of the sequence of approximate solutions.
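For reference, one common statement of the problem being discretized is the following (the notation and coefficient names here are ours, not necessarily those of the paper):

```latex
\begin{aligned}
&\partial_t c = \Delta \mu, \qquad \mu = -\Delta c + f'(c)
  && \text{in } \Omega \subset \mathbb{R}^2,\\
&\partial_n \mu = 0
  && \text{on } \Gamma = \partial\Omega,\\
&\partial_t c_{|\Gamma} = \sigma \,\Delta_\Gamma c_{|\Gamma} - g'(c_{|\Gamma}) - \partial_n c
  && \text{on } \Gamma,
\end{aligned}
```

The last equation is the dynamic boundary condition: it is itself an evolution equation on the boundary, which is why the proposed scheme couples a 2D finite-volume method in the domain with a 1D finite-volume method on Γ.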

  16. Performance Analysis of a UPnP/DHCompliant Robotic Adapter for Collaborative Tasks Development

    Directory of Open Access Journals (Sweden)

    Alejandro Alvarez Vazquez

    2012-02-01

    Full Text Available This paper describes the performance analysis of an adapter compliant with the UPnP DHCompliant (Digital Home Compliant) standard for a service robot. The DHCompliant adapter was developed to overcome some limitations of the UPnP protocol and to develop new DHC concepts. Moreover, it showcases with a particular example how the open DHC protocol is useful for the development of collaborative tasks, localization, energy management, and other fields. This interoperability between devices yields a virtual device that holds the control-point logic and the device logic simultaneously.

  17. Task and error analysis balancing benefits over business of electronic medical records.

    Science.gov (United States)

    Carstens, Deborah Sater; Rodriguez, Walter; Wood, Michael B

    2014-01-01

    Task and error analysis research was performed to identify: a) the process by which healthcare organisations manage healthcare for patients with mental illness or substance abuse; b) how the process can be enhanced; and c) whether electronic medical records (EMRs) have a role in this process from a business and safety perspective. The research question is whether EMRs have a role in enhancing healthcare for patients with mental illness or substance abuse. A discussion of the business of EMRs is included to illuminate the balancing act between the safety and business aspects of an EMR. PMID:25161108

  18. Rapid analysis of hay attributes using NIRS. Final report, Task II alfalfa supply system

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-10-24

    This final report provides technical information on the development of a near infrared reflectance spectroscopy (NIRS) system for the analysis of alfalfa hay. The purpose of the system is to provide consistent quality for processing alfalfa stems for fuel and alfalfa leaf meal products for livestock feed. Project tasks were to: (1) develop an NIRS driven analytical system for analysis of alfalfa hay and processed alfalfa products; (2) assist in hiring a qualified NIRS technician and recommend changes in testing equipment necessary to provide accurate analysis; (3) calibrate the NIRS instrument for accurate analyses; and (4) develop prototype equipment and sampling procedures as a first step towards development of a totally automated sampling system that would rapidly sample and record incoming feedstock and outbound product. An accurate hay testing program was developed, along with calibration equations for analyzing alfalfa hay and sun-cured alfalfa pellets. A preliminary leaf steam calibration protocol was also developed. 7 refs., 11 figs., 10 tabs.
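As a sketch of what such a calibration equation amounts to, the following fits a linear model from absorbance readings at a few wavelengths to a lab-measured attribute. The data are synthetic and the wavelength count is our assumption; production NIRS calibrations typically use PLS regression over full spectra.

```python
import numpy as np

# Synthetic calibration set: 40 hay samples, absorbance at 3 wavelengths,
# lab-measured attribute (e.g. crude protein %) with small measurement noise.
rng = np.random.default_rng(2)
true_coef = np.array([1.5, -0.7, 0.3])
spectra = rng.uniform(size=(40, 3))
protein = spectra @ true_coef + 12.0 + 0.01 * rng.standard_normal(40)

# Ordinary least squares with an intercept column.
A = np.column_stack([spectra, np.ones(40)])
coef, *_ = np.linalg.lstsq(A, protein, rcond=None)

pred = A @ coef
r2 = 1 - ((protein - pred)**2).sum() / ((protein - protein.mean())**2).sum()
print(coef, r2)  # recovered coefficients and fit quality
```

The "calibration equation" delivered by such a project is essentially the fitted coefficient vector, validated against independent lab analyses.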

  19. Analysis of brain activity and response to colour stimuli during learning tasks: an EEG study

    Science.gov (United States)

    Folgieri, Raffaella; Lucchiari, Claudio; Marini, Daniele

    2013-02-01

    The research project intends to demonstrate how EEG detection through a BCI device can improve the analysis and interpretation of colour-driven cognitive processes through a combined approach of cognitive science and information technology methods. To this end, an experiment was designed to compare the results of the traditional (qualitative and quantitative) cognitive analysis approach with EEG analysis of the evoked potentials. In our case, the sensory stimulus is represented by colours, while the cognitive task consists in remembering words appearing on the screen, with different combinations of foreground (word) and background colours. In this work we analysed data collected from a sample of students involved in a learning process during which they received visual stimuli based on colour variation. The stimuli concerned both the background of the text to be learned and the colour of the characters. The experiment indicated some interesting results concerning the use of primary (RGB) and complementary (CMY) colours.

  20. Yucca Mountain transportation routes: Preliminary characterization and risk analysis; Volume 2, Figures [and] Volume 3, Technical Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Souleyrette, R.R. II; Sathisan, S.K.; di Bartolo, R. [Nevada Univ., Las Vegas, NV (United States). Transportation Research Center

    1991-05-31

    This report presents appendices related to the preliminary assessment and risk analysis for high-level radioactive waste transportation routes to the proposed Yucca Mountain Project repository. Information includes data on population density, traffic volume, ecologically sensitive areas, and accident history.

  1. Volume-Rendering-Based Interactive 3D Measurement for Quantitative Analysis of 3D Medical Images

    OpenAIRE

    Yakang Dai; Jian Zheng; Yuetao Yang; Duojie Kuai; Xiaodong Yang

    2013-01-01

    3D medical images are widely used to assist diagnosis and surgical planning in clinical applications, where quantitative measurement of interesting objects in the image is of great importance. Volume rendering is widely used for qualitative visualization of 3D medical images. In this paper, we introduce a volume-rendering-based interactive 3D measurement framework for quantitative analysis of 3D medical images. In the framework, 3D widgets and volume clipping are integrated with volume render...

  2. District heating and cooling systems for communities through power plant retrofit and distribution network. Volume 3. Tasks 4-6. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Watt, J.R.; Sommerfield, G.A.

    1979-08-01

    Stone and Webster Engineering Corporation is a member of the Demonstration Team to review and assess the technical aspects of cogeneration for district heating. Task 4 details the most practical retrofit schemes. Of the cogeneration schemes studied, a back-pressure turbine is considered the best source of steam for district heating. Battelle Columbus Laboratories is a member of the Demonstration Team employed to investigate several institutional issues affecting the success of district heating. The Toledo Edison legal staff reviewed the legal aspects of mandate to serve, easement and franchise requirements, and corporate charter requirements. The principal findings of both the Battelle investigations and the legal research are summarized in Task 5. A complete discussion of each issue is included in the two sections labeled Legal Issues and Institutional Issues. In Task 6, Battelle Columbus Laboratories completed a preliminary economic analysis, incorporating accurate input parameters applicable to utility ownership of the proposed district-heating system. The methodology used is summarized, the assumptions are listed, and the results are briefly reviewed.

  3. Government agencies and alternative environmental conflict management: the Michigan Oil and Gas Leasing Task Force as a dispute resolution process. (Volumes I and II)

    Energy Technology Data Exchange (ETDEWEB)

    Lesnick, M.T.

    1986-01-01

    Environmental decision making by government agencies is typically controversial and the focal point for many environmental disputes. New, alternative environmental conflict management processes that emphasize negotiation and problem-solving hold the potential to manage these complex disputes more effectively. However, because these processes are relatively new, little is known about the actual advantages and disadvantages of participating. Using six conceptual dimensions and twelve hypotheses, this dissertation examined the impacts of participation on an agency's organizational structures and processes, its ability to formulate and implement policy, and its decision-making authority. Data were collected from participants and observers of the Oil and Gas Leasing Task Force conducted by the Michigan Department of Natural Resources (DNR). This was a two-year, policy-level process initiated by the agency to resolve internal and external disputes over the state's leasing policy. The seventeen-member Task Force of DNR staff, environmentalists, and oil and gas industry representatives revised the state's oil and gas lease, developed and implemented administrative rules, and revised the DNR's environmental field review process. Data analysis showed that the Task Force resolved internal differences between DNR divisions over the goals of the oil and gas program.

  4. Demonstration project as a procedure for accelerating the application of new technology (Charpie Task Force report). Volume II

    Energy Technology Data Exchange (ETDEWEB)

    None

    1978-02-01

    This report examines the issues associated with government programs proposed for the "commercialization" of new energy technologies; these programs are intended to hasten the pace at which target technologies are adopted by the private sector. The "commercial demonstration" is the principal tool used in these programs. Most previous government interventions in support of technological change have focused on R&D and left to the private sector the decision as to adoption for commercial utilization; thus there is relatively little in the way of analysis or experience that bears direct application. The analysis is divided into four sections. First, the role of RD&D within the structure of the national energy goals and policies is examined. The issue of "prices versus gaps" is described as a crucial difference of viewpoint concerning the role of the government in the future of the energy system. Second, the process of technological change as it occurs with respect to energy technologies is examined for possible sources of misalignment of social and private incentives. The process is described as a series of investments. Third, correction of these sources of misalignment then becomes the goal of commercial demonstration programs, and this goal and the means for attaining it are explored. Government-supported commercialization may be viewed as a subsidy to the introduction stage of the process; the circumstances under which such subsidies are likely to affect the success of the subsequent diffusion stage are addressed. The discussion then turns to the political, legal, and institutional problems. Finally, methods for evaluation and planning of commercial demonstration programs are analyzed. The critical areas of ignorance are highlighted and comprise a research agenda for improved analytical techniques to support decisions in this area.

  5. A Genetic Analysis of Brain Volumes and IQ in Children

    Science.gov (United States)

    van Leeuwen, Marieke; Peper, Jiska S.; van den Berg, Stephanie M.; Brouwer, Rachel M.; Hulshoff Pol, Hilleke E.; Kahn, Rene S.; Boomsma, Dorret I.

    2009-01-01

    In a population-based sample of 112 nine-year old twin pairs, we investigated the association among total brain volume, gray matter and white matter volume, intelligence as assessed by the Raven IQ test, verbal comprehension, perceptual organization and perceptual speed as assessed by the Wechsler Intelligence Scale for Children-III. Phenotypic…

  6. A genetic analysis of brain volumes and IQ in children

    NARCIS (Netherlands)

    Leeuwen, van Marieke; Peper, Jiska S.; Berg, van den Stephanie M.; Brouwer, Rachel M.; Hulshoff Pol, Hilleke E.; Kahn, Rene S.; Boomsma, Dorret I.

    2009-01-01

    In a population-based sample of 112 nine-year old twin pairs, we investigated the association among total brain volume, gray matter and white matter volume, intelligence as assessed by the Raven IQ test, verbal comprehension, perceptual organization and perceptual speed as assessed by the Wechsler I

  7. Volume component analysis for classification of LiDAR data

    Science.gov (United States)

    Varney, Nina M.; Asari, Vijayan K.

    2015-03-01

    One of the most difficult challenges of working with LiDAR data is the large number of data points produced. Analysing these large data sets is an extremely time-consuming process, so automatic perception of LiDAR scenes is a growing area of research. Currently, most LiDAR feature extraction relies on geometrical features specific to the point cloud of interest. These geometrical features are scene-specific and often rely on the scale and orientation of the object for classification. This paper proposes a robust method for reduced-dimensionality feature extraction of 3D objects using a volume component analysis (VCA) approach. The VCA approach is based on principal component analysis (PCA). PCA is a reduced feature extraction method that computes a covariance matrix from the original input vector; the eigenvectors corresponding to the largest eigenvalues of the covariance matrix are used to describe an image. Block-based PCA is an adapted method for feature extraction in facial images because PCA, when performed in local areas of the image, can extract more significant features than when the entire image is considered. The image space is split into several of these blocks, and PCA is computed individually for each block. VCA proposes that a LiDAR point cloud can be represented as a series of voxels whose values correspond to the point density within that relative location. From this voxelized space, block-based PCA is used to analyze sections of the space; the sections, when combined, represent features of the entire 3D object. These features are then used as the input to a support vector machine that is trained to identify four classes of objects (vegetation, vehicles, buildings and barriers) with an overall accuracy of 93.8%
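A minimal sketch of the voxelize-then-block-PCA idea follows. The grid size, block size, and toy objects are our assumptions, and a real run would train the per-block PCA on many objects before feeding the features to an SVM.

```python
import numpy as np

def voxelize(points, grid=(8, 8, 8)):
    """Map an (N, 3) point cloud to a density grid: each voxel's value is
    the fraction of points falling inside it (scale/translation normalized)."""
    mins, maxs = points.min(0), points.max(0)
    idx = ((points - mins) / (maxs - mins + 1e-9) * (np.array(grid) - 1)).astype(int)
    vol = np.zeros(grid)
    np.add.at(vol, tuple(idx.T), 1.0)
    return vol / len(points)

def block_pca_features(volumes, block=4, n_comp=2):
    """Split each voxel grid into blocks, run PCA per block position across
    the set of objects, keep the top components of each block as features."""
    vols = np.asarray(volumes)
    n, g = vols.shape[0], vols.shape[1]
    feats = []
    for x in range(0, g, block):
        for y in range(0, g, block):
            for z in range(0, g, block):
                blocks = vols[:, x:x+block, y:y+block, z:z+block].reshape(n, -1)
                centered = blocks - blocks.mean(0)
                # eigenvectors of the block covariance = principal axes
                _, vecs = np.linalg.eigh(centered.T @ centered)
                feats.append(centered @ vecs[:, -n_comp:])
    return np.concatenate(feats, axis=1)  # one feature row per object

rng = np.random.default_rng(1)
flat = rng.uniform(size=(500, 3)) * [1, 1, 0.05]   # plane-like object
tall = rng.uniform(size=(500, 3)) * [0.2, 0.2, 1]  # pole-like object
vols = [voxelize(flat), voxelize(tall)]
X = block_pca_features(vols, block=4, n_comp=2)
print(X.shape)  # → (2, 16): 8 blocks × 2 components per object
```

The rows of `X` are the reduced-dimensionality descriptors that a classifier such as an SVM would consume; density-based voxels make the representation less sensitive to raw point counts than working on the cloud directly.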

  8. NASA TLA workload analysis support. Volume 3: FFD autopilot scenario validation data

    Science.gov (United States)

    Sundstrom, J. L.

    1980-01-01

    The data used to validate a seven time line analysis of forward flight deck autopilot mode for the pilot and copilot for NASA B737 terminal configured vehicle are presented. Demand workloads are given in two forms: workload histograms and workload summaries (bar graphs). A report showing task length and task interaction is also presented.

  9. A cultural task analysis of implicit independence: comparing North America, Western Europe, and East Asia.

    Science.gov (United States)

    Kitayama, Shinobu; Park, Hyekyung; Sevincer, A Timur; Karasawa, Mayumi; Uskul, Ayse K

    2009-08-01

    Informed by a new theoretical framework that assigns a key role to cultural tasks (culturally prescribed means to achieve cultural mandates such as independence and interdependence) in mediating the mutual influences between culture and psychological processes, the authors predicted and found that North Americans are more likely than Western Europeans (British and Germans) to (a) exhibit focused (vs. holistic) attention, (b) experience emotions associated with independence (vs. interdependence), (c) associate happiness with personal achievement (vs. communal harmony), and (d) show an inflated symbolic self. In no cases were the 2 Western European groups significantly different from one another. All Western groups showed (e) an equally strong dispositional bias in attribution. Across all of the implicit indicators of independence, Japanese were substantially less independent (or more interdependent) than the three Western groups. An explicit self-belief measure of independence and interdependence showed an anomalous pattern. These data were interpreted to suggest that the contemporary American ethos has a significant root in both Western cultural heritage and a history of voluntary settlement. Further analysis offered unique support for the cultural task analysis. PMID:19634973

  10. The flight telerobotic servicer Tinman concept: System design drivers and task analysis

    Science.gov (United States)

    Andary, J. F.; Hewitt, D. R.; Hinkal, S. W.

    1989-01-01

    A study was conducted to develop a preliminary definition of the Flight Telerobotic Servicer (FTS) that could be used to understand the operational concepts and scenarios for the FTS. Called the Tinman, this design concept was also used to begin the process of establishing resources and interfaces for the FTS on Space Station Freedom, the National Space Transportation System shuttle orbiter, and the Orbital Maneuvering vehicle. Starting with an analysis of the requirements and task capabilities as stated in the Phase B study requirements document, the study identified eight major design drivers for the FTS. Each of these design drivers and their impacts on the Tinman design concept are described. Next, the planning that is currently underway for providing resources for the FTS on Space Station Freedom is discussed, including up to 2000 W of peak power, up to four color video channels, and command and data rates up to 500 kbps between the telerobot and the control station. Finally, an example is presented to show how the Tinman design concept was used to analyze task scenarios and explore the operational capabilities of the FTS. A structured methodology using a standard terminology consistent with the NASA/National Bureau of Standards Standard Reference Model for Telerobot Control System Architecture (NASREM) was developed for this analysis.

  11. Performance Task using Video Analysis and Modelling to promote K12 eight practices of science

    CERN Document Server

    Wee, Loo Kang

    2015-01-01

    We share the use of Tracker as a pedagogical tool in the effective learning and teaching of physics performance tasks taking root in some Singapore Grade 9 (Secondary 3) schools. We discuss how the pedagogical use of Tracker helps students to be like scientists in these 6 to 10 weeks, during which all Grade 9 students conduct a personal video analysis applying, where appropriate, the 8 practices of science (1. ask questions, 2. use models, 3. plan and carry out investigations, 4. analyse and interpret data, 5. use mathematical and computational thinking, 6. construct explanations, 7. argue from evidence and 8. communicate information). We situate our sharing on actual students' work and discuss how Tracker can be an effective pedagogical tool. Initial research findings suggest that allowing learners to conduct performance tasks using Tracker, a free open source video analysis and modelling tool, guided by the 8 practices of science and engineering, could be an innovative and effective way to mentor authent...

  12. The human factors and job task analysis in nuclear power plant operation

    International Nuclear Information System (INIS)

    After a long period during the development of NPP technology in which the plant hardware was considered the main factor for safe, reliable and economic operation, the industry is now moving toward an adequate balance of responsibility between plant hardware and operation. Since human factors have so far not been treated methodically, there is still a lack of improved classification systems for human errors as well as a lack of methods for a systematic approach to designing the operator's working system, for instance by using job task analysis (JTA). JTA appears to be an adequate method for studying the human factor in nuclear power plant operation, enabling its results to be converted readily into operational improvements. While the results of human error analysis tell 'what' is to be improved, JTA shows 'how' to improve, increasing the quality of the work and the safety of the operator's working system. The paper analyses the issue of setting the task and presents four criteria used to select aspects of NPP operation which require special consideration, such as personnel training, control room design, the content and layout of the procedure manual, and the organization of operating personnel. The results are given in three tables: 1- Evaluation of Deficiencies in the Working System; 2- Evaluation of the Deficiencies of the Operator's Disposition; 3- Evaluation of the Mental Structure of Operation

  13. ITER safety task NID-5a: ITER tritium environmental source terms - safety analysis basis

    International Nuclear Information System (INIS)

    The Canadian Fusion Fuels Technology Project (CFFTP) work described here is part of the contribution to ITER task NID-5a, Initial Tritium Source Term. This safety analysis basis constitutes the first part of the work for establishing tritium source terms and is intended to solicit comments and obtain agreement. The analysis objective is to provide an early estimate of tritium environmental source terms for the events to be analyzed. Events that would result in the loss of tritium are: a Loss of Coolant Accident (LOCA), a vacuum vessel boundary breach, a torus exhaust line failure, a fuelling machine process boundary failure, a fuel processing system process boundary failure, a water detritiation system process boundary failure and an isotope separation system process boundary failure. 9 figs

  14. Manual-control Analysis Applied to the Money-supply Control Task

    Science.gov (United States)

    Wingrove, R. C.

    1984-01-01

    The recent procedure implemented by the Federal Reserve Board to control the money supply is formulated in the form of a tracking model as used in the study of manual-control tasks. Using this model, an analysis is made to determine the effect of monetary control on the fluctuations in economic output. The results indicate that monetary control can reduce the amplitude of fluctuations at frequencies near the region of historic business cycles. However, with significant time lags in the control loop, monetary control tends to increase the amplitude of the fluctuations at the higher frequencies. How the investigator or student can use the tools developed in the field of manual-control analysis to study the nature of economic fluctuations and to examine different strategies for stabilization is examined.
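The report's qualitative conclusion, damping at business-cycle frequencies but amplification at higher frequencies when the control loop contains a time lag, can be reproduced with a toy frequency-response calculation. The gain and lag values below are illustrative assumptions, not figures from the report.

```python
import numpy as np

# Closed-loop disturbance response |S(w)| = |1 / (1 + K * exp(-i*w*tau))|
# for a proportional controller with gain K and a pure time lag of tau periods.
K, tau = 0.8, 2.0
omega = np.linspace(0.05, np.pi, 500)
S = np.abs(1.0 / (1.0 + K * np.exp(-1j * omega * tau)))

low = S[omega < 0.3].mean()   # slow, business-cycle-like frequencies
high = S[omega > 2.0].max()   # fast fluctuations
print(low, high)  # damped (< 1) at low frequencies, amplified (> 1) at high
```

A value of |S| below 1 means the control attenuates fluctuations at that frequency; above 1, the lagged feedback arrives out of phase and amplifies them, which is the tracking-model effect the paper describes.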

  15. Box truss analysis and technology development. Task 1: Mesh analysis and control

    Science.gov (United States)

    Bachtell, E. E.; Bettadapur, S. S.; Coyner, J. V.

    1985-01-01

    An analytical tool was developed to model, analyze and predict RF performance of box truss antennas with reflective mesh surfaces. The analysis system is unique in that it integrates custom written programs for cord tied mesh surfaces, thereby drastically reducing the cost of analysis. The analysis system is capable of determining the RF performance of antennas under any type of manufacturing or operating environment by integrating together the various disciplines of design, finite element analysis, surface best fit analysis and RF analysis. The Integrated Mesh Analysis System consists of six separate programs: The Mesh Tie System Model Generator, The Loadcase Generator, The Model Optimizer, The Model Solver, The Surface Topography Solver and The RF Performance Solver. Additionally, a study using the mesh analysis system was performed to determine the effect of on orbit calibration, i.e., surface adjustment, on a typical box truss antenna.
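A toy version of the surface best-fit step (not the actual Integrated Mesh Analysis System code; the paraboloid model and noise level are our assumptions) fits an ideal reflector shape to perturbed node positions and reports the residual surface error:

```python
import numpy as np

# Perturbed reflector nodes: ideal paraboloid z = a*(x^2 + y^2) + b
# plus small manufacturing/thermal distortions.
rng = np.random.default_rng(3)
x, y = rng.uniform(-5, 5, 200), rng.uniform(-5, 5, 200)
z = 0.02 * (x**2 + y**2) + 1.0 + 0.001 * rng.standard_normal(200)

# Least-squares best fit of the paraboloid parameters (a, b).
A = np.column_stack([x**2 + y**2, np.ones_like(x)])
(a, b), *_ = np.linalg.lstsq(A, z, rcond=None)
rms = np.sqrt(((z - A @ np.array([a, b]))**2).mean())
print(a, b, rms)  # recovers a ≈ 0.02, b ≈ 1.0; rms is the surface error
```

The RMS deviation from the best-fit surface is the figure of merit that such a system would pass to the RF performance solver, since surface error (in wavelengths) governs antenna gain loss.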

  16. Job/task analysis for I&C [Instrumentation and Controls] instrument technicians at the High Flux Isotope Reactor

    International Nuclear Information System (INIS)

    To comply with Department of Energy Order 5480.XX (Draft), a job/task analysis was initiated by the Maintenance Management Department at Oak Ridge National Laboratory (ORNL). The analysis was applicable to instrument technicians working at the ORNL High Flux Isotope Reactor (HFIR). This document presents the procedures and results of that analysis. 2 refs., 2 figs

  17. Toward mutual support: a task analysis of the relational justice approach to infidelity.

    Science.gov (United States)

    Williams, Kirstee; Galick, Aimee; Knudson-Martin, Carmen; Huenergardt, Douglas

    2013-07-01

    Gender, culture, and power issues are intrinsic to the etiology of infidelity, but the clinical literature offers little guidance on how to work with these concerns. The Relational Justice Approach (RJA) to infidelity (Williams, Family Process, 2011, 50, 516) uniquely places gender and power issues at the heart of clinical change; however, this approach has not been systematically studied. Therefore a qualitative task analysis was utilized to understand how change occurs in RJA. The findings indicated four necessary tasks: (a) creating an equitable foundation for healing, (b) creating space for alternate gender discourse, (c) pursuing relational responsibility of powerful partner, and (d) new experience of mutual support. Therapists' attention to power dynamics that organize couple relationships, leadership in intervening in power processes, and socio-cultural attunement to gender discourses were foundational to this work. These findings help clarify the processes by which mutual healing from the trauma of infidelity may occur and offer empirically based actions that therapists can take to facilitate mutual support. PMID:25059297

  18. Space Station data system analysis/architecture study. Task 1: Functional requirements definition, DR-5

    Science.gov (United States)

    1985-01-01

    The initial task in the Space Station Data System (SSDS) Analysis/Architecture Study is the definition of the functional and key performance requirements for the SSDS. The SSDS is the set of hardware and software, both on the ground and in space, that provides the basic data management services for Space Station customers and systems. The primary purpose of the requirements development activity was to provide a coordinated, documented requirements set as a basis for the system definition of the SSDS and for other subsequent study activities. These requirements should also prove useful to other Space Station activities in that they provide an indication of the scope of the information services and systems that will be needed in the Space Station program. The major results of the requirements development task are as follows: (1) identification of a conceptual topology and architecture for the end-to-end Space Station Information Systems (SSIS); (2) development of a complete set of functional requirements and design drivers for the SSIS; (3) development of functional requirements and key performance requirements for the Space Station Data System (SSDS); and (4) definition of an operating concept for the SSIS. The operating concept was developed both from a Space Station payload customer and operator perspective in order to allow a requirements practicality assessment.

  19. Measuring novices' field mapping abilities using an in-class exercise based on expert task analysis

    Science.gov (United States)

    Caulkins, J. L.

    2010-12-01

    We are interested in developing a model of expert-like behavior for improving the teaching methods of undergraduate field geology. Our aim is to assist students in mastering the process of field mapping more efficiently and effectively and to improve their ability to think creatively in the field. To examine expert-mapping behavior, a cognitive task analysis was conducted with expert geologic mappers in an attempt to define the process of geologic mapping (i.e. to understand how experts carry out geological mapping). The task analysis indicates that expert mappers have a wealth of geologic scenarios at their disposal that they compare against examples seen in the field, experiences that most undergraduate mappers will not have had. While presenting students with many geological examples in class may increase their understanding of geologic processes, novices still struggle when presented with a novel field situation. Based on the task analysis, a short (45-minute) paper-map-based exercise was designed and tested with 14 pairs of 3rd year geology students. The exercise asks students to generate probable geologic models based on a series of four (4) data sets. Each data set represents a day’s worth of data; after the first “day,” new sheets simply include current and previously collected data (e.g. “Day 2” data set includes data from “Day 1” plus the new “Day 2” data). As the geologic complexity increases, students must adapt, reject or generate new geologic models in order to fit the growing data set. Preliminary results of the exercise indicate that students who produced more probable geologic models, and produced higher ratios of probable to improbable models, tended to go on to do better on the mapping exercises at the 3rd year field school. These results suggest that those students with more cognitively available geologic models may be more able to use these models in field settings than those who are unable to draw on these models for whatever

  20. Performance Analysis for Segment Stretch Transformation of Parallel Real-time Tasks

    OpenAIRE

    Qamhieh, Manar; Fauberteau, Frédéric; Midonnet, Serge

    2011-01-01

    The Segment Stretch Transformation (SST) is an algorithm that transforms parallel Fork-Join (FJ) tasks into sequential tasks on multiprocessor systems when possible, in order to increase the schedulability of the tasksets of this model. SST is based on Task Stretch Transformation (TST) which is a transformation for the same model of tasks, but it uses segment migrations while SST eliminates their use. In this paper, we prove that SST transformation has the same performance of TST transformati...
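A highly simplified model of the underlying idea, checking whether a fork-join task fits sequentially within its period so that no parallel scheduling or segment migration is needed, can be sketched as follows. This is our toy feasibility check, not the SST algorithm itself.

```python
# A fork-join task alternates sequential segments and parallel segments of
# equal-length threads. If the whole task's work fits within its period,
# an SST-style transformation can run it as one sequential task.
def total_work(segments):
    """segments: list of (n_threads, wcet) pairs, sequential segments have n_threads=1."""
    return sum(n * c for n, c in segments)

def can_sequentialize(segments, period):
    """True if all work executes sequentially before the implicit deadline."""
    return total_work(segments) <= period

fj = [(1, 2), (4, 3), (1, 1)]  # seq 2, then 4 parallel threads of 3, then seq 1
print(total_work(fj), can_sequentialize(fj, period=20))  # 15 True
```

When the check fails (e.g. with `period=10` here), the task genuinely needs parallelism, and that is where SST and TST differ: TST stretches the master thread and allows segment migrations, while SST achieves the transformation without them.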

  1. Mission analysis of photovoltaic solar energy conversion. Volume I. Executive summary

    Energy Technology Data Exchange (ETDEWEB)

    Leonard, S.L.; Rattin, E.J.; Siegel, B.

    1977-03-01

    An investigation of terrestrial applications for the photovoltaic conversion of solar energy is summarized. The specific objectives of the study were: (a) to survey and evaluate near-term (1976--1985) civilian photovoltaic applications in the United States; (b) to evaluate the most promising major missions for the mid-term period (1986--2000) and to determine the conditions under which photovoltaic technology can compete in those applications at array prices consistent with ERDA goals; (c) to address critical external issues and identify the sensitivity of photovoltaic system technical requirements to such factors; and (d) to quantify the societal costs of alternative energy sources and identify equalizing incentives. The study was divided into six separate but interrelated tasks: Task 1, Analysis of Near-Term Applications; Task 2, Analysis of Major Mid-Term Missions; Task 3, Review and Updating of the ERDA Technology Implementation Plan; Task 4, Critical External Issues; Task 5, The Impact of Incentives; and Task 6, The Societal Costs of Conventional Power Generation. The emphasis of the study was on the first two of these tasks, the other four serving to provide supplementary information.

  2. An analysis of the application of AI to the development of intelligent aids for flight crew tasks

    Science.gov (United States)

    Baron, S.; Feehrer, C.

    1985-01-01

    This report presents the results of a study aimed at developing a basis for applying artificial intelligence to the flight deck environment of commercial transport aircraft. In particular, the study was comprised of four tasks: (1) analysis of flight crew tasks, (2) survey of the state-of-the-art of relevant artificial intelligence areas, (3) identification of human factors issues relevant to intelligent cockpit aids, and (4) identification of artificial intelligence areas requiring further research.

  3. Human factors assessment in PRA using Task Analysis Linked Evaluation Technique (TALENT)

    International Nuclear Information System (INIS)

    Thirty years ago, the US military and the US aviation industry, and more recently, in response to the US Three Mile Island and USSR Chernobyl accidents, the US commercial nuclear power industry acknowledged that human error, both as an immediate precursor and as a latent or indirect influence exerted through training, maintainability, inservice test, and surveillance programs, is a primary contributor to unreliability and risk in complex high-reliability systems. A 1985 Nuclear Regulatory Commission (NRC) study of Licensee Event Reports (LERs) suggests that upwards of 65% of commercial nuclear system failures involve human error. Despite the magnitude and nature of human error cited in that study, limited attention has been paid to personnel-centered issues, especially person-to-person issues involving group processes, management, and organizational environment. The paper discusses NRC integration and applications research with respect to the Task Analysis Linked Evaluation Technique (TALENT) in risk assessment applications

  4. Performance-based training: from job and task analysis to training materials

    International Nuclear Information System (INIS)

    Historically, the "smoke-filled room" approach has been used to revise training programs: instructors would sit down and design a program based on existing training materials and any applicable federal requirements. This failure to reflect a systematic definition of required job functions, responsibilities, and performance standards in training programs has resulted in generic program deficiencies: the programs do not provide complete training of required skills and knowledge. Recognition of this need for change, coupled with a decrease in experienced industry personnel inputs and long training pipelines, has heightened the need for efficient performance-based training programs that are derived from and referenced to job performance criteria. This paper presents the process for developing performance-based training materials based on job and task analysis products

  5. Nonnegative least-correlated component analysis for separation of dependent sources by volume maximization.

    Science.gov (United States)

    Wang, Fa-Yu; Chi, Chong-Yung; Chan, Tsung-Han; Wang, Yue

    2010-05-01

    Although significant efforts have been made in developing nonnegative blind source separation techniques, accurate separation of positive yet dependent sources remains a challenging task. In this paper, a joint correlation function of multiple signals is proposed to reveal and confirm that the observations after nonnegative mixing would have higher joint correlation than the original unknown sources. Accordingly, a new nonnegative least-correlated component analysis (n/LCA) method is proposed to design the unmixing matrix by minimizing the joint correlation function among the estimated nonnegative sources. In addition to a closed-form solution for unmixing two mixtures of two sources, the general algorithm of n/LCA for the multisource case is developed based on an iterative volume maximization (IVM) principle and linear programming. The source identifiability and required conditions are discussed and proven. The proposed n/LCA algorithm, denoted by n/LCA-IVM, is evaluated with both simulation data and real biomedical data to demonstrate its superior performance over several existing benchmark methods. PMID:20299711

  6. Analysis on volume grating induced by femtosecond laser pulses.

    Science.gov (United States)

    Zhou, Keya; Guo, Zhongyi; Ding, Weiqiang; Liu, Shutian

    2010-06-21

    We report on a kind of self-assembled volume grating in silica glass induced by tightly focused femtosecond laser pulses. The formation of the volume grating is attributed to multiple microexplosions in the transparent material induced by the femtosecond pulses. The first-order diffraction efficiency depends strongly on the pulse energy and the laser scanning velocity, and reaches as high as 30%. The diffraction pattern of the fabricated grating is numerically simulated and analyzed by a two-dimensional FDTD method and the Fresnel diffraction integral. The numerical results confirm our prediction of how the volume grating forms and agree well with the experimental results. PMID:20588497

  7. Analysis and asynchronous detection of gradually unfolding errors during monitoring tasks.

    Science.gov (United States)

    Omedes, Jason; Iturrate, Iñaki; Minguez, Javier; Montesano, Luis

    2015-10-01

    Human studies on cognitive control processes rely on tasks involving sudden-onset stimuli, which allow the analysis of these neural imprints to be time-locked and relative to the stimuli onset. Human perceptual decisions, however, comprise continuous processes where evidence accumulates until reaching a boundary. Surpassing the boundary leads to a decision where measured brain responses are associated to an internal, unknown onset. The lack of this onset for gradual stimuli hinders both the analyses of brain activity and the training of detectors. This paper studies electroencephalographic (EEG)-measurable signatures of human processing for sudden and gradual cognitive processes represented as a trajectory mismatch under a monitoring task. Time-locked potentials and brain-source analysis of the EEG of sudden mismatches revealed the typical components of event-related potentials and the involvement of brain structures related to cognitive control processing. For gradual mismatch events, time-locked analyses did not show any discernible EEG scalp pattern, despite related brain areas being, to a lesser extent, activated. However, and thanks to the use of non-linear pattern recognition algorithms, it is possible to train an asynchronous detector on sudden events and use it to detect gradual mismatches, as well as obtaining an estimate of their unknown onset. Post-hoc time-locked scalp and brain-source analyses revealed that the EEG patterns of detected gradual mismatches originated in brain areas related to cognitive control processing. This indicates that gradual events induce latency in the evaluation process but that similar brain mechanisms are present in sudden and gradual mismatch events. Furthermore, the proposed asynchronous detection model widens the scope of applications of brain-machine interfaces to other gradual processes. PMID:26193332

  9. Thermodynamic analysis of a Stirling engine including dead volumes of hot space, cold space and regenerator

    Energy Technology Data Exchange (ETDEWEB)

    Kongtragool, Bancha; Wongwises, Somchai [Fluid Mechanics, Thermal Engineering and Multiphase Flow Research Laboratory (FUTURE), Department of Mechanical Engineering, Faculty of Engineering, King Mongkut' s University of Technology Thonburi, 91 Suksawas 48, Bangmod, Bangkok 10140 (Thailand)

    2006-03-01

    This paper provides a theoretical investigation of the thermodynamics of a Stirling engine. An isothermal model is developed for a Stirling engine with imperfect regeneration and with dead volumes in the hot space, cold space, and regenerator, in which the regenerator effective temperature is taken as the arithmetic mean of the heater and cooler temperatures. Numerical simulation is performed and the effects of regenerator effectiveness and dead volume are studied. Results from this study indicate that the engine net work is affected only by the dead volumes, while the heat input and engine efficiency are affected by both the regenerator effectiveness and the dead volumes. The engine net work decreases with increasing dead volume. The heat input increases with increasing dead volume and decreasing regenerator effectiveness. The engine efficiency decreases with increasing dead volume and decreasing regenerator effectiveness. (author)
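    The dead-volume effect described in this abstract can be illustrated with a minimal isothermal (Schmidt-style) model. This is a generic sketch, not the paper's model: the sinusoidal volume variations, the 90° phase shift, and all parameter values are illustrative assumptions; only the arithmetic-mean regenerator temperature follows the abstract.

```python
import math

def net_work(Th, Tc, Vd_h, Vd_c, Vd_r, Vsw=1.0e-4, mR=1.0, steps=3600):
    """Net indicated work per cycle for an isothermal Stirling model.

    Th, Tc: hot/cold space temperatures (K)
    Vd_*:   dead volumes of hot space, cold space, regenerator (m^3)
    Vsw:    swept volume of each space (m^3) -- illustrative assumption
    mR:     gas mass times specific gas constant (J/K)
    """
    Tr = 0.5 * (Th + Tc)  # regenerator effective temperature: arithmetic mean

    def volumes(theta):
        # sinusoidal volume variation, hot space leading by 90 deg (assumption)
        Vh = 0.5 * Vsw * (1 + math.cos(theta)) + Vd_h
        Vc = 0.5 * Vsw * (1 + math.cos(theta - math.pi / 2)) + Vd_c
        return Vh, Vc

    def pressure(theta):
        # ideal gas with conserved total mass distributed over the spaces
        Vh, Vc = volumes(theta)
        return mR / (Vh / Th + Vc / Tc + Vd_r / Tr)

    # closed-cycle integral of p d(Vh + Vc), trapezoid rule in theta
    W = 0.0
    for i in range(steps):
        t0 = 2 * math.pi * i / steps
        t1 = 2 * math.pi * (i + 1) / steps
        V0, V1 = sum(volumes(t0)), sum(volumes(t1))
        W += 0.5 * (pressure(t0) + pressure(t1)) * (V1 - V0)
    return W
```

    Running the sketch with growing dead volumes reproduces the qualitative finding above: net work per cycle falls as dead volume increases.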

  10. An analysis of a partial task training strategy for profoundly retarded institutionalized clients.

    Science.gov (United States)

    Cipani, E

    1985-03-01

    This study investigated the effects of a partial task training strategy on productivity and on-task behavior in three profoundly retarded institutionalized clients in a pre-skills workshop classroom. Partial task training consisted of the presentation of "mini-tasks," with reinforcement for completion of those tasks. Additionally, behavior monitors were used to provide the clients with further positive comments and prompts. The results indicated that this strategy was effective in decreasing high rates of off-task behavior and in substantially increasing the number of pieces completed during the session. However, the effect on other inappropriate behaviors was minimal. This strategy demonstrated that profoundly retarded clients could be taught to increase on-task behavior and productivity in pre-skills workshop classes. PMID:3998174

  11. Unpacking High and Low Efficacy Teachers' Task Analysis and Competence Assessment in Teaching Low-Achieving Students in Secondary Schools

    Science.gov (United States)

    Wang, Li-Yi; Jen-Yi, Li; Tan, Liang-See; Tan, Irene; Lim, Xue-Fang; Wu, Bing Sheng

    2016-01-01

    This study adopted a pragmatic qualitative research design to unpack high and low efficacy teachers' task analysis and competence assessment in the context of teaching low-achieving students. Nine secondary school English and Science teachers were recruited and interviewed. Results of thematic analysis show that helping students perform well in…

  12. Automated segmentation and dose-volume analysis with DICOMautomaton

    Science.gov (United States)

    Clark, H.; Thomas, S.; Moiseenko, V.; Lee, R.; Gill, B.; Duzenli, C.; Wu, J.

    2014-03-01

    Purpose: Exploration of historical data for regional organ dose sensitivity is limited by the effort needed to (sub-)segment large numbers of contours. A system has been developed which can rapidly perform autonomous contour sub-segmentation and generic dose-volume computations, substantially reducing the effort required for exploratory analyses. Methods: A contour-centric approach is taken which enables lossless, reversible segmentation and dramatically reduces computation time compared with voxel-centric approaches. Segmentation can be specified on a per-contour, per-organ, or per-patient basis, and can be performed along either an embedded plane or in terms of the contour's bounds (e.g., split an organ into fractional-volume/dose pieces along any 3D unit vector). More complex segmentation techniques are available. Anonymized data from 60 head-and-neck cancer patients were used to compare dose-volume computations with Varian's Eclipse (Varian Medical Systems, Inc.). Results: Computed mean doses and dose-volume histograms agree strongly with Varian's Eclipse. Contours which have been segmented can be injected back into patient data permanently and in a Digital Imaging and Communication in Medicine (DICOM)-conforming manner. Lossless segmentation persists across such injection, and remains fully reversible. Conclusions: DICOMautomaton allows researchers to rapidly, accurately, and autonomously segment large amounts of data into intricate structures suitable for analyses of regional organ dose sensitivity.

  13. Job task and functional analysis of the Division of Reactor Projects, office of Nuclear Reactor Regulation. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Morzinski, J.A.; Gilmore, W.; Hahn, H.A.

    1998-07-10

    A job task and functional analysis was recently completed for the positions that make up the regional Divisions of Reactor Projects. Among the conclusions of that analysis was a recommendation to clarify roles and responsibilities among site, regional, and headquarters personnel. As that analysis did not cover headquarters personnel, a similar analysis was undertaken of three headquarters positions within the Division of Reactor Projects: Licensing Assistants, Project Managers, and Project Directors. The goals of this analysis were to systematically evaluate the tasks performed by these headquarters personnel to determine job training requirements, to account for variations due to division/regional assignment or differences in several experience categories, and to determine how, and by which positions, certain functions are best performed. The results of this analysis include recommendations for training and for job design. Data to support this analysis was collected by a survey instrument and through several sets of focus group meetings with representatives from each position.

  14. Analysis of Skeletal Muscle Metrics as Predictors of Functional Task Performance

    Science.gov (United States)

    Ryder, Jeffrey W.; Buxton, Roxanne E.; Redd, Elizabeth; Scott-Pandorf, Melissa; Hackney, Kyle J.; Fiedler, James; Ploutz-Snyder, Robert J.; Bloomberg, Jacob J.; Ploutz-Snyder, Lori L.

    2010-01-01

    PURPOSE: The ability to predict task performance using physiological performance metrics is vital to ensure that astronauts can execute their jobs safely and effectively. This investigation used a weighted suit to evaluate task performance at various ratios of strength, power, and endurance to body weight. METHODS: Twenty subjects completed muscle performance tests and functional tasks representative of those that would be required of astronauts during planetary exploration (see table for specific tests/tasks). Subjects performed functional tasks while wearing a weighted suit with additional loads ranging from 0-120% of initial body weight. Performance metrics were time to completion for all tasks except hatch opening, which consisted of total work. Task performance metrics were plotted against muscle metrics normalized to "body weight" (subject weight + external load; BW) for each trial. Fractional polynomial regression was used to model the relationship between muscle and task performance. CONCLUSION: LPMIF/BW is the best predictor of performance for predominantly lower-body tasks that are ambulatory and of short duration. LPMIF/BW is a very practical predictor of occupational task performance as it is quick and relatively safe to perform. Accordingly, bench press work best predicts hatch-opening work performance.

  15. Study on the utilization of the cognitive architecture EPIC to the task analysis of a nuclear power plant operator

    International Nuclear Information System (INIS)

    This work presents a study of the use of the integrative cognitive architecture EPIC (Executive-Process Interactive-Control), designed to evaluate the performance of a person performing parallel tasks at a man-machine interface, as a methodology for cognitive task analysis of a nuclear power plant operator. The results obtained by simulation with EPIC are compared with those obtained by applying the MHP model to the tasks performed by a shift operator during execution of procedure PO-E-3 (Steam Generator Tube Rupture) of the Angra 1 Nuclear Power Plant. To support that comparison, an experiment was performed in the Angra 2 Nuclear Power Plant full-scope simulator in which three operator tasks were executed and their completion times measured and compared with the results of the MHP and EPIC modeling. (author)

  16. Set-based Tasks within the Singularity-robust Multiple Task-priority Inverse Kinematics Framework: General Formulation, Stability Analysis and Experimental Results

    Directory of Open Access Journals (Sweden)

    Signe Moe

    2016-04-01

    Full Text Available Inverse kinematics algorithms are commonly used in robotic systems to transform tasks to joint references, and several methods exist to ensure the achievement of several tasks simultaneously. The multiple task-priority inverse kinematics framework allows tasks to be considered in a prioritized order by projecting task velocities through the nullspaces of higher-priority tasks. This paper extends this framework to handle set-based tasks, i.e., tasks with a range of valid values, in addition to equality tasks, which have a specific desired value. Examples of set-based tasks are joint limit and obstacle avoidance. The proposed method is proven to ensure asymptotic convergence of the equality task errors and the satisfaction of all high-priority set-based tasks. The practical implementation of the proposed algorithm is discussed, and experimental results are presented in which a number of both set-based and equality tasks were implemented on a UR5, a 6-degree-of-freedom industrial robotic arm from Universal Robots. The experiments validate the theoretical results and confirm the effectiveness of the proposed approach.
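    The nullspace projection underlying the prioritized framework can be sketched for the core two-equality-task case (the paper's set-based activation logic is omitted here). The Jacobians and task velocities in the test are illustrative stand-ins, not the UR5's kinematics.

```python
import numpy as np

def prioritized_joint_velocity(J1, x1_dot, J2, x2_dot):
    """Two-task prioritized inverse kinematics.

    The secondary task velocity is projected through the nullspace of the
    primary task Jacobian J1, so it cannot disturb the primary task.
    """
    J1_pinv = np.linalg.pinv(J1)
    N1 = np.eye(J1.shape[1]) - J1_pinv @ J1  # nullspace projector of task 1
    # standard prioritized solution; the leading N1 is an explicit numerical
    # safeguard (pinv(J2 @ N1) already maps into the range of N1)
    q_dot = J1_pinv @ x1_dot + N1 @ np.linalg.pinv(J2 @ N1) @ (
        x2_dot - J2 @ J1_pinv @ x1_dot
    )
    return q_dot
```

    With enough redundant degrees of freedom, both task velocities are realized exactly; when they conflict, the primary task always wins because the secondary contribution lives entirely in the nullspace of J1.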

  17. Video Analysis and Modeling Performance Task to promote becoming like scientists in classrooms

    CERN Document Server

    Wee, Loo Kang

    2015-01-01

    This paper aims to share the use of Tracker, a free open-source video analysis and modeling tool that is increasingly used as a pedagogical tool for the effective learning and teaching of physics, to make physics relevant to the real world for Grade 9 (Secondary 3) students in Singapore schools. We discuss the pedagogical use of Tracker, guided by the Framework for K-12 Science Education of the National Research Council, USA, to help students become more like scientists. For a period of 6 to 10 weeks, students use video analysis coupled with the 8 practices of science: 1. ask questions, 2. use models, 3. plan and carry out investigations, 4. analyse and interpret data, 5. use mathematical and computational thinking, 6. construct explanations, 7. argue from evidence, and 8. communicate information. This paper focuses on discussing some of the performance task design ideas such as 3.1 flip video, 3.2 starting with simple classroom activities, 3.3 primer science activity, 3.4 integrative dynamics and kinematics l...

  18. Human factors assessment in PRA using task analysis linked evaluation technique (TALENT)

    International Nuclear Information System (INIS)

    Human error is a primary contributor to risk in complex high-reliability systems. A 1985 U.S. Nuclear Regulatory Commission (USNRC) study of licensee event reports (LERs) suggests that upwards of 65% of commercial nuclear system failures involve human error. Since then, the USNRC has initiated research to fully and properly integrate human errors into the probabilistic risk assessment (PRA) process. The resulting implementation procedure is known as the Task Analysis Linked Evaluation Technique (TALENT). As indicated, TALENT is a broad-based method for integrating human factors expertise into the PRA process. This process achieves results which: (1) provide more realistic estimates of the impact of human performance on nuclear power safety, (2) can be fully audited, (3) provide a firm technical base for equipment-centered and personnel-centered retrofit/redesign of plants enabling them to meet internally and externally imposed safety standards, and (4) yield human and hardware data capable of supporting inquiries into human performance issues that transcend the individual plant. The TALENT procedure is being field-tested to verify its effectiveness and utility. The objectives of the field-test are to examine (1) the operability of the process, (2) its acceptability to the users, and (3) its usefulness for achieving measurable improvements in the credibility of the analysis. The field-test will provide the information needed to enhance the TALENT process

  19. Frequency analysis of a task-evoked pupillary response: Luminance-independent measure of mental effort.

    Science.gov (United States)

    Peysakhovich, Vsevolod; Causse, Mickaël; Scannella, Sébastien; Dehais, Frédéric

    2015-07-01

    Pupil diameter is a widely studied cognitive load measure which, despite its convenience for non-intrusive operator state monitoring in complex environments, is still not available for in situ measurement because of numerous methodological limitations. The most important of these limitations is the influence of the pupillary light reflex. Hence, there is a need for a pupil-based cognitive load measure that is independent of light conditions. In this paper, we present a promising technique of pupillary signal analysis resulting in a luminance-independent measure of mental effort that could be used in real time without a priori knowledge of lighting conditions. Twenty-two participants performed a short-term memory task under different screen luminance conditions. Our results showed that the amplitude of pupillary dilation due to load on memory was luminance-dependent, with higher amplitude corresponding to the lower-luminance condition. Furthermore, our experimentation showed that the load-on-memory and luminance factors express themselves differently according to frequency. Therefore, as our statistical analysis revealed, the ratio between the low-frequency (0-1.6 Hz) and high-frequency (1.6-4 Hz) bands (LF/HF ratio) of the power spectral density of the pupillary signal is sensitive to cognitive load but not to luminance. Our results are promising for the measurement of load on memory in ecological settings. PMID:25941013
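    The band-power ratio the authors propose can be sketched with a plain FFT periodogram. The band edges (0-1.6 Hz vs 1.6-4 Hz) follow the abstract; the sampling rate and the synthetic signal in the test are illustrative assumptions, and a Welch estimate would be the more robust spectral choice in practice.

```python
import numpy as np

def lf_hf_ratio(signal, fs, lf_band=(0.0, 1.6), hf_band=(1.6, 4.0)):
    """Ratio of low- to high-frequency power in a pupillary signal.

    signal: 1-D array of pupil diameter samples; fs: sampling rate in Hz.
    """
    x = np.asarray(signal, dtype=float)
    x = x - x.mean()  # remove DC so it does not dominate the LF band
    freqs = np.fft.rfftfreq(x.size, d=1.0 / fs)
    psd = np.abs(np.fft.rfft(x)) ** 2       # simple periodogram
    lf = psd[(freqs > lf_band[0]) & (freqs <= lf_band[1])].sum()
    hf = psd[(freqs > hf_band[0]) & (freqs <= hf_band[1])].sum()
    return lf / hf
```

    A recording dominated by slow dilation (the memory-load signature) yields a ratio well above 1, while luminance-driven fast fluctuations push it down, which is the separation the LF/HF measure exploits.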

  20. Requirements for psychological models to support design: Towards ecological task analysis

    Science.gov (United States)

    Kirlik, Alex

    1991-01-01

    Cognitive engineering is largely concerned with creating environmental designs to support skillful and effective human activity. A set of necessary conditions are proposed for psychological models capable of supporting this enterprise. An analysis of the psychological nature of the design product is used to identify a set of constraints that models must meet if they can usefully guide design. It is concluded that cognitive engineering requires models with resources for describing the integrated human-environment system, and that these models must be capable of describing the activities underlying fluent and effective interaction. These features are required in order to be able to predict the cognitive activity that will be required given various design concepts, and to design systems that promote the acquisition of fluent, skilled behavior. These necessary conditions suggest that an ecological approach can provide valuable resources for psychological modeling to support design. Relying heavily on concepts from Brunswik's and Gibson's ecological theories, ecological task analysis is proposed as a framework in which to predict the types of cognitive activity required to achieve productive behavior, and to suggest how interfaces can be manipulated to alleviate certain types of cognitive demands. The framework is described in terms, and illustrated with an example from the previous research on modeling skilled human-environment interaction.

  1. Analysis of airborne radiometric data. Volume 3. Topical reports

    International Nuclear Information System (INIS)

    This volume consists of four topical reports: a general discussion of the philosophy of unfolding spectra with continuum and discrete components, a mathematical treatment of the effects of various physical parameters on the uncollided gamma-ray spectrum at aircraft elevations, a discussion of the application of the unfolding code MAZNAI to airborne data, and a discussion of the effects of the nonlinear relationship between energy deposited and pulse height in NaI(T1) detectors

  2. Trading Volume: Definitions, Data Analysis, and Implications of Portfolio Theory

    OpenAIRE

    Lo, Andrew W.; Jiang W. Wang

    2000-01-01

    We examine the implications of portfolio theory for the cross-sectional behavior of equity trading volume. Two-fund separation theorems suggest a natural definition for trading activity: share turnover. If two-fund separation holds, share turnover must be identical for all securities. If (K+1)-fund separation holds, we show that turnover satisfies an approximately linear K-factor structure. These implications are examined empirically using individual weekly turnover data for NYSE and AMEX sec...
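    Share turnover, the activity measure motivated by two-fund separation, is simply shares traded divided by shares outstanding. A minimal sketch with made-up weekly figures (the ticker names and numbers are illustrative, not from the paper's NYSE/AMEX data):

```python
def share_turnover(shares_traded, shares_outstanding):
    """Weekly share turnover: the fraction of a firm's shares that changed hands."""
    return shares_traded / shares_outstanding

# Under exact two-fund separation, turnover should be identical across
# securities in a given week, as in this constructed example.
week = {
    "AAA": share_turnover(1_200_000, 60_000_000),
    "BBB": share_turnover(450_000, 22_500_000),
}
```

    The (K+1)-fund case relaxes this: turnover need not be equal across stocks, but should admit an approximately linear K-factor structure.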

  3. Multifamily Quality Control Inspector Job/Task Analysis and Report: September 2013

    Energy Technology Data Exchange (ETDEWEB)

    Owens, C. M.

    2013-09-01

    The development of job/task analyses (JTAs) is one of three components of the Guidelines for Home Energy Professionals project and will allow industry to develop training resources, quality assurance protocols, accredited training programs, and professional certifications. The Multifamily Quality Control Inspector JTA identifies and catalogs all of the tasks performed by multifamily quality control inspectors, as well as the knowledge, skills, and abilities (KSAs) needed to perform the identified tasks.

  4. Multifamily Building Operator Job/Task Analysis and Report: September 2013

    Energy Technology Data Exchange (ETDEWEB)

    Owens, C. M.

    2013-09-01

    The development of job/task analyses (JTAs) is one of three components of the Guidelines for Home Energy Professionals project and will allow industry to develop training resources, quality assurance protocols, accredited training programs, and professional certifications. The Multifamily Building Operator JTA identifies and catalogs all of the tasks performed by multifamily building operators, as well as the knowledge, skills, and abilities (KSAs) needed to perform the identified tasks.

  5. Multifamily Energy Auditor Job/Task Analysis and Report: September 2013

    Energy Technology Data Exchange (ETDEWEB)

    Owens, C. M.

    2013-09-01

    The development of job/task analyses (JTAs) is one of three components of the Guidelines for Home Energy Professionals project and will allow industry to develop training resources, quality assurance protocols, accredited training programs, and professional certifications. The Multifamily Energy Auditor JTA identifies and catalogs all of the tasks performed by multifamily energy auditors, as well as the knowledge, skills, and abilities (KSAs) needed to perform the identified tasks.

  6. Design and Analysis of Self-Adapted Task Scheduling Strategies in Wireless Sensor Networks

    OpenAIRE

    Sajid Hussain; Guolong Chen; Naixue Xiong; Han-Chieh Chao; Wenzhong Guo

    2011-01-01

    In a wireless sensor network (WSN), the usage of resources is usually highly related to the execution of tasks which consume a certain amount of computing and communication bandwidth. Parallel processing among sensors is a promising solution to provide the demanded computation capacity in WSNs. Task allocation and scheduling is a typical problem in the area of high performance computing. Although task allocation and scheduling in wired processor networks has been well studied in the past, th...

  7. Multifamily Retrofit Project Manager Job/Task Analysis and Report: September 2013

    Energy Technology Data Exchange (ETDEWEB)

    Owens, C. M.

    2013-09-01

    The development of job/task analyses (JTAs) is one of three components of the Guidelines for Home Energy Professionals project and will allow industry to develop training resources, quality assurance protocols, accredited training programs, and professional certifications. The Multifamily Retrofit Project Manager JTA identifies and catalogs all of the tasks performed by multifamily retrofit project managers, as well as the knowledge, skills, and abilities (KSAs) needed to perform the identified tasks.

  8. Inferring biological tasks using Pareto analysis of high-dimensional data.

    Science.gov (United States)

    Hart, Yuval; Sheftel, Hila; Hausser, Jean; Szekely, Pablo; Ben-Moshe, Noa Bossel; Korem, Yael; Tendler, Avichai; Mayo, Avraham E; Alon, Uri

    2015-03-01

    We present the Pareto task inference method (ParTI; http://www.weizmann.ac.il/mcb/UriAlon/download/ParTI) for inferring biological tasks from high-dimensional biological data. Data are described as a polytope, and features maximally enriched closest to the vertices (or archetypes) allow identification of the tasks the vertices represent. We demonstrate that human breast tumors and mouse tissues are well described by tetrahedrons in gene expression space, with specific tumor types and biological functions enriched at each of the vertices, suggesting four key tasks. PMID:25622107
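The geometric idea behind ParTI — data shaped by trade-offs between tasks fill a low-dimensional polytope whose vertices (archetypes) represent the tasks — can be illustrated with a minimal sketch. This is not the authors' implementation; the toy data and all names below are illustrative. Points generated as convex mixtures of three "archetypes" form a triangle, and the convex hull recovers the archetypes as its vertices:

```python
import random

def convex_hull(points):
    """Andrew's monotone chain; returns the hull vertices of a 2D point set."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    def cross(o, a, b):
        return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])
    hull = []
    for seq in (pts, pts[::-1]):          # lower chain, then upper chain
        chain = []
        for p in seq:
            while len(chain) >= 2 and cross(chain[-2], chain[-1], p) <= 0:
                chain.pop()
            chain.append(p)
        hull += chain[:-1]
    return hull

# Toy 2D "trait space": points are random mixtures of three archetypes,
# so the data polytope is a triangle whose vertices are the archetypes.
archetypes = [(0.0, 0.0), (1.0, 0.0), (0.5, 1.0)]
random.seed(0)
data = list(archetypes)
for _ in range(200):
    a, b = sorted(random.random() for _ in range(2))
    w = (a, b - a, 1.0 - b)               # random barycentric weights
    data.append(tuple(sum(wi*v[d] for wi, v in zip(w, archetypes))
                      for d in range(2)))

hull = convex_hull(data)
print(sorted(hull))
```

In ParTI proper the data are high-dimensional, the polytope is fit statistically (e.g. a tetrahedron in gene-expression space), and enrichment near each vertex identifies the task; the hull-vertex step above only conveys the geometry.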

  9. Assessment of solar options for small power systems applications. Volume III. Analysis of concepts

    Energy Technology Data Exchange (ETDEWEB)

    Laity, W.W.; Aase, D.T.; Apley, W.J.; Bird, S.P.; Drost, M.K.; Garrett-Price, B.A.; Williams, T.A.

    1980-09-01

    A comparative analysis of solar thermal conversion concepts that are potentially suitable for development as small electric power systems (1 to 10 MWe) is given. Seven generic types of collectors, together with associated subsystems for electric power generation, were considered. The collectors can be classified into three categories: (1) two-axis tracking (with compound-curvature reflecting surfaces); (2) one-axis tracking (with single-curvature reflecting surfaces); and (3) nontracking (with low-concentration reflecting surfaces). All seven collectors were analyzed in conceptual system configurations with Rankine-cycle engines. In addition, two of the collectors (the Point Focus Central Receiver and the Point Focus Distributed Receiver) were analyzed with Brayton-cycle engines, and the latter of the two was also analyzed with Stirling-cycle engines. This volume describes the systems analyses performed on all the alternative configurations of the seven generic collector concepts and the results obtained. The SOLSTEP computer code used to determine each configuration's system cost and performance is briefly described. The collector and receiver performance calculations used are also presented. The capital investment and related costs obtained from the systems studies are presented, and the levelized energy costs are given as a function of the capacity factor obtained from the systems studies. Also included are the values of the other attributes used in the concepts' final ranking. The comments, conclusions, and recommendations developed by the PNL study team during the concept characterization and systems analysis tasks of the study are presented. (WHK)

  10. Comparison of Network Analysis Approaches on EEG Connectivity in Beta during Visual Short-Term Memory Binding Tasks

    OpenAIRE

    Smith, Keith; Azami, Hamed; Parra, Mario A.; Escudero, Javier; Starr, John M.

    2016-01-01

    We analyse the electroencephalogram signals in the beta band of working memory representation recorded from young healthy volunteers performing several different Visual Short-Term Memory (VSTM) tasks which have proven useful in the assessment of clinical and preclinical Alzheimer's disease. We compare network analysis using Maximum Spanning Trees (MSTs) with network analysis obtained using 20% and 25% connection thresholds on the VSTM data. MSTs are a promising method of network analysis nega...
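The Maximum Spanning Tree construction that the abstract compares against fixed-percentage thresholding can be sketched in a few lines (a generic illustration, not the authors' code): sort edges by descending connectivity weight and apply Kruskal's algorithm, which keeps the strongest links without choosing an arbitrary threshold.

```python
def maximum_spanning_tree(n, edges):
    """Kruskal's algorithm, heaviest edges first.
    edges: list of (weight, u, v) for an undirected graph on nodes 0..n-1.
    Returns the n-1 edges of a maximum spanning tree (assumes connectivity)."""
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x
    tree = []
    for w, u, v in sorted(edges, reverse=True):
        ru, rv = find(u), find(v)
        if ru != rv:                       # keep edge only if it joins components
            parent[ru] = rv
            tree.append((w, u, v))
    return tree

# Toy 4-channel "connectivity" example (weights are made up).
edges = [(0.9, 0, 1), (0.1, 0, 2), (0.8, 1, 2), (0.7, 2, 3), (0.2, 1, 3)]
mst = maximum_spanning_tree(4, edges)
print(sorted(mst))
```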

  11. Control Volume Analysis, Entropy Balance and the Entropy Production in Flow Systems

    CERN Document Server

    Niven, Robert K

    2014-01-01

    This chapter concerns "control volume analysis", the standard engineering tool for the analysis of flow systems, and its application to entropy balance calculations. Firstly, the principles of control volume analysis are enunciated and applied to flows of conserved quantities (e.g. mass, momentum, energy) through a control volume, giving integral (Reynolds transport theorem) and differential forms of the conservation equations. Several definitions of steady state are discussed. The concept of "entropy" is then established using Jaynes' maximum entropy method, both in general and in equilibrium thermodynamics. The thermodynamic entropy then gives the "entropy production" concept. Equations for the entropy production are then derived for simple, integral and infinitesimal flow systems. Some technical aspects are examined, including discrete and continuum representations of volume elements, the effect of radiation, and the analysis of systems subdivided into compartments. A Reynolds decomposition of the entropy ...
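The integral conservation form referred to above is the Reynolds transport theorem; together with the entropy balance it is commonly written as follows (a standard statement consistent with the chapter's description, reproduced here for reference):

```latex
% Reynolds transport theorem for a conserved quantity of specific density \varphi
\frac{\mathrm{d}}{\mathrm{d}t}\int_{CV}\rho\,\varphi\,\mathrm{d}V
  + \oint_{CS}\rho\,\varphi\,(\mathbf{u}\cdot\mathbf{n})\,\mathrm{d}A
  = \dot{\Sigma}_{\varphi},
\qquad
% Entropy balance: production is non-negative by the second law
\dot{\sigma}
  = \frac{\mathrm{d}}{\mathrm{d}t}\int_{CV}\rho\,s\,\mathrm{d}V
  + \oint_{CS}\Bigl(\rho\,s\,\mathbf{u}+\frac{\mathbf{j}_Q}{T}\Bigr)\cdot\mathbf{n}\,\mathrm{d}A
  \;\ge\; 0,
```

where ρ is the density, u the velocity, n the outward unit normal on the control surface CS, Σ̇_φ the net source of the quantity inside the control volume CV (zero for mass, the applied force for momentum), s the specific entropy, j_Q the heat flux and T the local temperature. At steady state the time-derivative terms vanish, leaving a pure surface balance.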

  12. Dose volume analysis in brachytherapy and stereotactic radiosurgery

    Energy Technology Data Exchange (ETDEWEB)

    Tozer-Loft, S.M

    2000-12-01

    A brief introduction to three branches of radiotherapy is given: interstitial brachytherapy, external beam megavoltage radiotherapy, and stereotactic radiosurgery. The current interest in issues around conformity, uniformity and optimisation is explained in the light of technical developments in these fields. A novel method of displaying dose-volume information, which mathematically suppresses the inverse-square law, as first suggested by L.L. Anderson for use in brachytherapy, is explained in detail, and some improvements are proposed. These 'natural' histograms are extended to show the effects of real point sources which do not exactly follow the inverse-square law, and to demonstrate the in-target dose-volume distribution, previously unpublished. The histograms are used as a way of mathematically analysing the properties of theoretical mono-energetic radionuclides, and for demonstrating the dosimetric properties of a potential new brachytherapy source (Ytterbium-169). A new modification of the Anderson formalism is then described for producing Anderson Inverse-Square Shifted (AISS) histograms for the Gamma Knife, which are shown to be useful for demonstrating the quality of stereotactic radiosurgery dose distributions. A study is performed analysing the results of Gamma Knife treatments on 44 patients suffering from a benign brain tumour (acoustic neuroma). Follow-up data are used to estimate the volume shrinkage or growth of each tumour, and this measure of outcome is compared with a range of figures of merit which express different aspects of the quality of each dose distribution. The results are analysed in an attempt to answer the question: What are the important features of the dose distribution (conformality, uniformity, etc.) which show a definite relationship with the outcome of the treatment? Initial results show positively that, when Gamma Knife radiosurgery is used to treat acoustic neuroma, some measures of conformality seem to have a surprising, but significant, association with outcome.
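Anderson's "natural" histogram can be reproduced numerically. For an ideal point source obeying the inverse-square law, D(r) = k/r², the volume enclosed by an isodose D scales as D^(-3/2), so binning volume against u = D^(-3/2) makes the histogram exactly flat — the inverse-square behaviour is suppressed, and deviations from flatness expose real-source effects. A minimal sketch (the constants are illustrative, not clinical):

```python
import math

def natural_histogram(k=100.0, r_max=5.0, n_bins=10):
    """Shell volume binned against u = D**(-3/2) for a point source D = k/r**2.
    For a pure inverse-square source every bin holds the same volume."""
    d_min = k / r_max**2                  # lowest dose considered (at r_max)
    u_max = d_min ** -1.5
    bins = []
    for i in range(n_bins):
        u_lo, u_hi = i * u_max / n_bins, (i + 1) * u_max / n_bins
        # invert u = D**(-3/2) = r**3 / k**1.5  =>  r = (u * k**1.5)**(1/3)
        r_lo = (u_lo * k**1.5) ** (1 / 3)
        r_hi = (u_hi * k**1.5) ** (1 / 3)
        bins.append(4 / 3 * math.pi * (r_hi**3 - r_lo**3))  # volume in this u-bin
    return bins

vols = natural_histogram()
print(vols)  # all bins equal: the inverse-square law has been suppressed
```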

  13. Detecting Hidden Encrypted Volume Files via Statistical Analysis

    Directory of Open Access Journals (Sweden)

    Mario Piccinelli

    2015-05-01

    Full Text Available Nowadays various software tools have been developed for the purpose of creating encrypted volume files. Many of those tools are open source and freely available on the internet. Because of that, the probability of finding encrypted files which could contain forensically useful information has dramatically increased. While decoding these files without the key is still a major challenge, the simple fact of being able to recognize their existence is now a top priority for every digital forensics investigation. In this paper we will present a statistical approach to find elements of a seized filesystem which have a reasonable chance of containing encrypted data.
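The statistical intuition here — well-encrypted data is nearly indistinguishable from uniform random bytes — is commonly operationalized with Shannon entropy or chi-square tests over byte frequencies. A minimal entropy-based detector is sketched below; the 7.9 bits/byte threshold and function names are illustrative, not the paper's method:

```python
import math
import random

def byte_entropy(data: bytes) -> float:
    """Shannon entropy in bits per byte (8.0 = indistinguishable from random)."""
    if not data:
        return 0.0
    counts = [0] * 256
    for b in data:
        counts[b] += 1
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in counts if c)

def looks_encrypted(data: bytes, threshold: float = 7.9) -> bool:
    # Encrypted volumes sit very close to the 8 bits/byte maximum.
    return byte_entropy(data) > threshold

random.seed(1)
ciphertext_like = bytes(random.randrange(256) for _ in range(1 << 16))
plaintext_like = b"the quick brown fox jumps over the lazy dog " * 1500
print(looks_encrypted(ciphertext_like), looks_encrypted(plaintext_like))
```

Note that compressed files also score near-maximal entropy, which is why practical detectors (as in the paper) combine several statistics rather than relying on entropy alone.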

  14. Geometrical considerations in dose volume analysis in intracavitary treatment

    Energy Technology Data Exchange (ETDEWEB)

    Deshpande, D.D. [Dept. of Medical Physics, Tata Memorial Hospital, Bombay (India); Shrivastava, S.K. [Dept. of Radiation Oncology, Tata Memorial Hospital, Bombay (India); Pradhan, A.S. [Bhabha Atomic Research Centre, Bombay (India); Viswanathan, P.S. [Dept. of Medical Physics, Tata Memorial Hospital, Bombay (India); Dinshaw, K.A. [Dept. of Radiation Oncology, Tata Memorial Hospital, Bombay (India)

    1996-06-01

    The present work aimed to study the relationship between the volume enclosed by the reference isodose surface and various geometrical parameters of the intracavitary applicator in the treatment of carcinoma of the cervix. The pear-shaped volume of the reference isodose derived from the Total Reference Air Kerma (TRAK), and the product of its dimensions, height H, width W and thickness T, which is dependent on the applicator geometry, were estimated for 100 intracavitary applications treated by a Selectron LDR machine. Orthogonal radiographs taken for each patient were used for measurement of the actual geometric dimensions of the applicator and for carrying out the dosimetry on the TP-11 treatment planning system. The dimensions H, W and T of the reference isodose surface (60 Gy) were also noted. The ratio of the product HWT to the pear-shaped volume was found to be mainly a function of colpostat separation and not of other geometrical parameters such as the maximum vertical and antero-posterior dimensions of the applicator. The ratio remained almost constant for a particular combination of uterine tandem and colpostat. Variations in the ratio were attributed to non-standard geometry. The ratio of the volume of the reference isodose surface to the product of its dimensions in the applicator depends upon the colpostat separation. (orig./MG) [German abstract, translated:] The present work aimed to investigate the relationship between the volume enclosed by the reference isodose and various geometrical parameters of intracavitary application in the treatment of cervical carcinoma. A pear-shaped volume, enclosed by the reference isodose and determined from the Total Reference Air Kerma (TRAK) and the product of the dimensions height, width and thickness (H, W, T) derivable from the applicator geometry, was estimated for 100 applications (Selectron LDR).
For measurement of the geometric arrangement of the applicator, orthogonal radiographs, which were taken for each patient, were

  15. Chemical analysis and volume reduction of radioactive HEPA filter waste

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, In Ho; Choi, Wang Kyu; Lee, Suk Chol; Min, Byung Youn; Yang, Hee Chul; Lee, Kun Woo [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2011-05-15

    With the active operation of nuclide facilities at KAERI, many spent filters used in the ventilation systems of the nuclear facilities have been generated as spent filter waste. This waste generally consists of HEPA filters that have filtered the contaminants in the air streams generated during the operation of the nuclide facilities. This study was therefore conducted to investigate the radionuclides and heavy metals in HEPA filters, and the characteristics of the melting as a decontamination and volume reduction

  16. Turnaround operations analysis for OTV. Volume 2: Detailed technical report

    Science.gov (United States)

    1988-01-01

    The objectives and accomplishments were to adapt and apply the newly created database of Shuttle/Centaur ground operations. Previously defined turnaround operations analyses were updated for ground-based OTVs (GBOTVs) and space-based OTVs (SBOTVs), design requirements were identified for both OTV and Space Station accommodations hardware, turnaround operations costs were estimated, and a technology development plan was generated to develop the required capabilities. Technical and programmatic data were provided to NASA pertinent to OTV ground and space operations requirements: turnaround operations, task descriptions, timelines and manpower requirements; OTV modular design and booster and Space Station interface requirements; the SBOTV accommodations development schedule, cost, and turnaround operations requirements; and a technology development plan for ground and space operations and space-based accommodations facilities and support equipment. Significant conclusions are discussed.

  17. Task-based optimization of flip angle for texture analysis in MRI

    Science.gov (United States)

    Brand, Jonathan F.; Furenlid, Lars R.; Altbach, Maria I.; Galons, Jean-Phillippe; Bhattacharyya, Achyut; Sharma, Puneet; Bhattacharyya, Tulshi; Bilgin, Ali; Martin, Diego R.

    2016-03-01

    Chronic liver disease is a worldwide health problem, and hepatic fibrosis (HF) is one of the hallmarks of the disease. The current reference standard for diagnosing HF is biopsy followed by pathologist examination; however, this is limited by sampling error and carries a risk of complications. Pathology diagnosis of HF is based on textural change in the liver as a lobular collagen network develops within portal triads. The scale of collagen lobules is characteristically on the order of 1-5 mm, which approximates the resolution limit of in vivo gadolinium-enhanced magnetic resonance imaging in the delayed phase. We have shown that MRI of formalin-fixed human ex vivo liver samples mimics the textural contrast of in vivo Gd-MRI and can be used as MRI phantoms. We have developed local texture analysis that is applied to phantom images, and the results are used to train model observers. The performance of the observer is assessed with the area under the receiver-operating-characteristic curve (AUROC) as the figure of merit. To optimize the MRI pulse sequence, phantoms are scanned multiple times over a range of flip angles. The flip angle associated with the highest AUROC is chosen as optimal for the task of detecting HF.
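The AUROC figure of merit used in the flip-angle sweep above is equivalent to the Mann-Whitney statistic: the probability that a randomly chosen diseased-case score exceeds a healthy-case score. A small sketch of scoring candidate flip angles this way follows; the observer scores are invented purely for illustration.

```python
def auroc(pos_scores, neg_scores):
    """AUROC via the Mann-Whitney statistic: P(score_pos > score_neg), ties 1/2."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

# Hypothetical model-observer outputs at three flip angles (degrees):
# (scores on fibrotic phantoms, scores on healthy phantoms)
candidates = {
    10: ([0.4, 0.6, 0.5], [0.5, 0.3, 0.6]),
    20: ([0.7, 0.9, 0.8], [0.2, 0.4, 0.3]),
    30: ([0.6, 0.5, 0.7], [0.4, 0.5, 0.3]),
}
best = max(candidates, key=lambda fa: auroc(*candidates[fa]))
print(best, auroc(*candidates[best]))
```

The quadratic pairwise loop is fine for phantom-sized data; for large studies the same statistic is computed from ranks in O(n log n).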

  18. Cognitive Task Analysis of Business Jet Pilots' Weather Flying Behaviors: Preliminary Results

    Science.gov (United States)

    Latorella, Kara; Pliske, Rebecca; Hutton, Robert; Chrenka, Jason

    2001-01-01

    This report presents preliminary findings from a cognitive task analysis (CTA) of business aviation piloting. Results describe challenging weather-related aviation decisions and the information and cues used to support these decisions. Further, these results demonstrate the role of expertise in business aviation decision-making in weather flying, and how weather information is acquired and assessed for reliability. The challenging weather scenarios and novice errors identified in the results provide the basis for experimental scenarios and dependent measures to be used in future flight simulation evaluations of candidate aviation weather information systems. Finally, we analyzed these preliminary results to recommend design and training interventions to improve business aviation decision-making with weather information. The primary objective of this report is to present these preliminary findings and to document the extended CTA methodology used to elicit and represent expert business aviator decision-making with weather information. These preliminary findings will be augmented with results from additional subjects using this methodology. A summary of the complete results, absent the detailed treatment of methodology provided in this report, will be documented in a separate publication.

  19. Functional connectivity changes during a Working memory task in rat via NMF analysis

    Directory of Open Access Journals (Sweden)

    Jing Wei

    2015-01-01

    Full Text Available Working memory (WM is necessary in higher cognition. The brain as a complex network is formed by interconnections among neurons. Connectivity results in neural dynamics to support cognition. The first aim is to investigate connectivity dynamics in medial prefrontal cortex (mPFC networks during WM. As brain neural activity is sparse, the second aim is to find the intrinsic connectivity property in a feature space. Using multi-channel electrode recording techniques, spikes were simultaneously obtained from mPFC of rats that performed a Y-maze WM task. Continuous time series converted from spikes were embedded in a low-dimensional space by non-negative matrix factorization (NMF. mPFC network in original space was constructed by measuring connections among neurons. And the same network in NMF space was constructed by computing connectivity values between the extracted NMF components. Causal density (Cd and global efficiency (E were estimated to present the network property. The results showed that Cd and E significantly peaked in the interval right before the maze choice point in correct trials. However, the increase did not emerge in error trials. Additionally, Cd and E in two spaces displayed similar trends in correct trials. The difference was that the measures in NMF space were significantly greater than those in original space. Our findings indicated that the anticipatory changes in mPFC networks may have an effect on future WM behavioral choices. Moreover, the NMF analysis achieves a better characterization for a brain network.
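For reference, the non-negative matrix factorization used above for the embedding approximates the non-negative firing-rate matrix V ≈ WH with W, H ≥ 0, typically by minimizing ‖V − WH‖²_F with the Lee-Seung multiplicative updates (a standard formulation; the paper may use a variant):

```latex
H_{kj} \leftarrow H_{kj}\,\frac{(W^{\top}V)_{kj}}{(W^{\top}WH)_{kj}},
\qquad
W_{ik} \leftarrow W_{ik}\,\frac{(VH^{\top})_{ik}}{(WHH^{\top})_{ik}},
```

which keep every entry non-negative and monotonically decrease the reconstruction error. The rows of H then serve as the low-dimensional NMF components between which connectivity values are computed.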

  20. An Asynchronous Many-Task Implementation of In-Situ Statistical Analysis using Legion.

    Energy Technology Data Exchange (ETDEWEB)

    Pebay, Philippe Pierre [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Bennett, Janine Camille [Sandia National Lab. (SNL-CA), Livermore, CA (United States)

    2015-11-01

    In this report, we propose a framework for the design and implementation of in-situ analyses using an asynchronous many-task (AMT) model, using the Legion programming model together with the MiniAero mini-application as a surrogate for full-scale parallel scientific computing applications. The bulk of this work consists of converting the Learn/Derive/Assess model, which we had initially developed for parallel statistical analysis using MPI [PTBM11], from an SPMD to an AMT model. To this end, we propose an original use of the concept of Legion logical regions as a replacement for the parallel communication schemes used for the only operation of the statistics engines that requires explicit communication. We then evaluate this proposed scheme in a shared-memory environment, using the Legion port of MiniAero as a proxy for a full-scale scientific application, as a means to provide input data sets of variable size for the in-situ statistical analyses in an AMT context. We demonstrate in particular that the approach has merit and warrants further investigation, in collaboration with ongoing efforts to improve the overall parallel performance of the Legion system.
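The asynchronous many-task idea — tasks declare the data they read and write, and a runtime launches each one when its inputs become available rather than in SPMD lockstep — can be miniaturized with stdlib coroutines. This is a toy analogy only; Legion's actual logical-region API is far richer, and the Learn/Derive/Assess stage names are borrowed from the abstract while the numbers are invented:

```python
import asyncio

async def task(name, inputs, compute):
    """An AMT-style task: await the futures it reads, then publish its result."""
    values = [await f for f in inputs]   # the runtime defers us until inputs exist
    await asyncio.sleep(0)               # yield, as a stand-in for real work
    return compute(*values)

async def main():
    loop = asyncio.get_running_loop()
    # Stand-ins for logical regions: one partial-statistics result per partition.
    parts = [loop.create_task(task(f"learn{i}", [], lambda i=i: [float(i)] * 4))
             for i in range(4)]
    # Derive: combines partials as they become available, with no global barrier.
    derived = loop.create_task(
        task("derive", parts, lambda *chunks: sum(sum(c) for c in chunks)))
    # Assess: consumes the derived aggregate (here, a simple mean over 16 values).
    assessed = loop.create_task(task("assess", [derived], lambda s: s / 16))
    return await assessed

result = asyncio.run(main())
print(result)
```

The key property mirrored here is that dependencies are expressed through the data (futures standing in for logical regions), so the scheduler, not the programmer, decides execution order.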

  1. Evaluation of the Field Test of Project Information Packages: Volume III--Resource Cost Analysis.

    Science.gov (United States)

    Al-Salam, Nabeel; And Others

    The third of three volumes evaluating the first year field test of the Project Information Packages (PIPs) provides a cost analysis study as a key element in the total evaluation. The resource approach to cost analysis is explained and the specific resource methodology used in the main cost analysis of the 19 PIP field-test projects detailed. The…

  2. Structural integrity of welded bi-metallic components (BIMET) - Task Group 5 'Analysis'. Prediction by EAM and FEA

    International Nuclear Information System (INIS)

    Investigations in the EU BIMET project focused on two pipe segments of ferritic and austenitic steel with a special weld in between which is characterised by a multiphase transition with strongly diverging characteristics (strength mis-matching). The BIMET project is described, and some of the findings of Task Group 5 'Analysis' are presented which are based on EAM and FEA

  3. Obtaining Content Weights for Test Specifications from Job Analysis Task Surveys: An Application of the Many-Facets Rasch Model

    Science.gov (United States)

    Wang, Ning; Stahl, John

    2012-01-01

    This article discusses the use of the Many-Facets Rasch Model, via the FACETS computer program (Linacre, 2006a), to scale job/practice analysis survey data as well as to combine multiple rating scales into single composite weights representing the tasks' relative importance. Results from the Many-Facets Rasch Model are compared with those…
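For context, the Many-Facets Rasch Model referenced here extends the Rasch rating-scale model with a facet for each element of the measurement situation. In one common parameterization, the log-odds of respondent n rating task i in category k rather than k-1 on rating scale j is modeled as

```latex
\log\!\left(\frac{P_{nijk}}{P_{nij(k-1)}}\right)
  = \theta_n - \delta_i - \gamma_j - \tau_k ,
```

where θ_n is the respondent's severity/leniency, δ_i the task's calibrated importance, γ_j the rating-scale facet, and τ_k the category threshold. The calibrated δ_i values are what allow multiple rating scales to be combined into single composite task weights.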

  4. Analysis of Cloud Network Management Using Resource Allocation and Task Scheduling Services

    Directory of Open Access Journals (Sweden)

    K.C. Okafor

    2016-01-01

    Full Text Available Network failure in cloud datacenters can result from inefficient resource allocation, scheduling, and logical segmentation of physical machines (network constraints). This is highly undesirable in Distributed Cloud Computing Networks (DCCNs) running mission-critical services. Such failure has been identified in the University of Nigeria datacenter network situated in the south-eastern part of Nigeria. In this paper, the architectural decomposition of a proposed DCCN was carried out while exploring its functionalities for grid performance. Virtualization services such as resource allocation and task scheduling were employed in heterogeneous server clusters. The validation of the DCCN performance was carried out using trace files from Riverbed Modeller 17.5 in order to ascertain the influence of virtualization on the server resource pool. The QoS metrics considered in the analysis are: service delay time, resource availability, throughput and utilization. From the validation analysis of the DCCN, the following results were obtained: average throughput for DCCN = 40.00%, DCell = 33.33% and BCube = 26.67%; average resource availability response for DCCN = 38.46%, DCell = 33.33%, and BCube = 28.21%; DCCN density on resource utilization = 40% (when logically isolated) and 60% (when not logically isolated). From the results, it was concluded that using virtualization in cloud datacenter servers will result in enhanced server performance, offering lower average wait time even with a higher request rate and longer duration of resource use (service availability). By evaluating these recursive architectural designs for network operations, enterprises ready for the spine-and-leaf model could further develop their network resource management schemes for optimal performance.

  5. Multiple-task real-time PDP-15 operating system for data acquisition and analysis

    International Nuclear Information System (INIS)

    The RAMOS operating system is capable of handling up to 72 simultaneous tasks in an interrupt-driven environment. The minimum viable hardware configuration includes a Digital Equipment Corporation PDP-15 computer with 16384 words of memory, extended arithmetic element, automatic priority interrupt, a 256K-word RS09 DECdisk, two DECtape transports, and an alphanumeric keyboard/typer. The monitor executes major tasks by loading disk-resident modules to memory for execution; modules are written in a format that allows page-relocation by the monitor, and can be loaded into any available page. All requests for monitor service by tasks, including input/output, floating point arithmetic, request for additional memory, task initiation, etc., are implemented by privileged monitor calls (CAL). All IO device handlers are capable of queuing requests for service, allowing several tasks ''simultaneous'' use of all resources. All alphanumeric IO (including the PC05) is completely buffered and handled by a single multiplexing routine. The floating point arithmetic software is re-entrant to all operating modules and includes matrix arithmetic functions. One of the system tasks can be a ''batch'' job, controlled by simulating an alphanumeric command terminal through cooperative functions of the disk handler and alphanumeric device software. An alphanumeric control sequence may be executed, automatically accessing disk-resident tasks in any prescribed order; a library of control sequences is maintained on bulk storage for access by the monitor. (auth)

  6. Analysis of human-in-the-loop tele-operated maintenance inspection tasks using VR

    International Nuclear Information System (INIS)

    Highlights: ► Execution of tele-operated inspection tasks for ITER maintenance was analyzed. ► Human factors experiments using Virtual Reality proved to be a valuable approach. ► A large variation in time performance and number of collisions was found. ► Results indicate significant room for improvement for teleoperated free-space tasks. ► A promising solution is haptic shared control: assist the operator with guiding forces. -- Abstract: One of the challenges in future fusion plants such as ITER is the remote maintenance of the plant. The foreseen human-in-the-loop tele-operation is characterized by limited visual and haptic feedback from the environment, which results in degraded task performance and increased operator workload. For improved tele-operated task performance it is required to get insight into the expected tasks and problems during maintenance at ITER. By means of an exploratory human factors experiment, this paper analyses problems and bottlenecks during the execution of foreseen tele-operated maintenance at ITER, identifying the most promising areas of improvement. The focus of this paper is on free-space (sub)tasks where contact with the environment needs to be avoided. A group of 5 subjects was asked to carry out an ITER-related free-space task (visual inspection), using a six-degree-of-freedom master device connected to a simulated hot cell environment. The results show large variation in time performance between subjects and an increasing number of collisions for more difficult tasks, indicating room for improvement for free-space (sub)tasks. The results will be used in future research on haptic guidance strategies in the ITER Remote Handling framework.
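The haptic shared control named as a promising solution is typically realized by adding a guiding force on the master device that pulls the operator toward a reference trajectory while still letting the operator override it. A minimal one-dimensional sketch follows; the gain, saturation limit, and function names are illustrative, not from this study:

```python
def guidance_force(x, x_ref, k_guide=40.0, f_max=5.0):
    """Spring-like guiding force toward the reference path, saturated so the
    operator can always override it (shared control, not traded control)."""
    f = k_guide * (x_ref - x)
    return max(-f_max, min(f_max, f))

# Operator drifts off the inspection path; guidance nudges the master back.
for x in [0.30, 0.20, 0.12, 0.05]:
    print(f"x={x:.2f}  F_guide={guidance_force(x, x_ref=0.0):+.2f} N")
```

Saturating the force is the design choice that keeps the human in the loop: near the path the assistance is proportional, far from it the system only suggests, never dominates.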

  7. Dose volume analysis in brachytherapy and stereotactic radiosurgery

    CERN Document Server

    Tozer-Loft, S M

    2000-01-01

    A brief introduction to three branches of radiotherapy is given: interstitial brachytherapy, external beam megavoltage radiotherapy, and stereotactic radiosurgery. The current interest in issues around conformity, uniformity and optimisation is explained in the light of technical developments in these fields. A novel method of displaying dose-volume information, which mathematically suppresses the inverse-square law, as first suggested by L.L. Anderson for use in brachytherapy, is explained in detail. The measure of outcome is compared with a range of figures of merit which express different aspects of the quality of each dose distribution. The results are analysed in an attempt to answer the question: What are the important features of the dose distribution (conformality, uniformity, etc.) which show a definite relationship with the outcome of the treatment? Initial results show positively that, when Gamma Knife radiosurgery is used to treat acoustic neuroma, some measures of conformality seem to have a surprising, but significant association with outcome.

  8. Aerodynamic analysis of flapping foils using volume grid deformation code

    Energy Technology Data Exchange (ETDEWEB)

    Ko, Jin Hwan [Seoul National University, Seoul (Korea, Republic of); Kim, Jee Woong; Park, Soo Hyung; Byun, Do Young [Konkuk University, Seoul (Korea, Republic of)

    2009-06-15

    Nature-inspired flapping foils have attracted interest for their high thrust efficiency, but the large motions of their boundaries need to be considered. It is challenging to develop robust, efficient grid deformation algorithms appropriate for the large motions in three dimensions. In this paper, a volume grid deformation code is developed based on finite macro-elements and transfinite interpolation, which successfully interfaces to a structured multi-block Navier-Stokes code. A suitable condition that generates the macro-elements with efficiency and improves the robustness of grid regularity is presented as well. As demonstrated on an airfoil with various motions related to flapping, the numerical results of aerodynamic forces obtained by the developed method are shown to be in good agreement with experimental data and previous numerical solutions.
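Transfinite interpolation, one of the two building blocks named above, blends the four boundary curves of a block into interior grid points: a linear blend of opposite edges, minus the doubly-counted bilinear corner contribution. A minimal 2D sketch follows (the paper couples this with finite macro-elements in 3D; the bulged-edge example is illustrative):

```python
def tfi_2d(bottom, top, left, right, ni, nj):
    """2D transfinite interpolation.
    bottom/top are functions of xi in [0,1]; left/right of eta in [0,1];
    each returns an (x, y) point, and the curves must share corner points.
    Returns grid[j][i] covering boundary and interior."""
    grid = []
    for j in range(nj):
        eta = j / (nj - 1)
        row = []
        for i in range(ni):
            xi = i / (ni - 1)
            pt = []
            for d in range(2):
                # edge blend minus the doubly-counted corner (bilinear) terms
                u = ((1-eta)*bottom(xi)[d] + eta*top(xi)[d]
                     + (1-xi)*left(eta)[d] + xi*right(eta)[d]
                     - ((1-xi)*(1-eta)*bottom(0.0)[d] + xi*(1-eta)*bottom(1.0)[d]
                        + (1-xi)*eta*top(0.0)[d] + xi*eta*top(1.0)[d]))
                pt.append(u)
            row.append(tuple(pt))
        grid.append(row)
    return grid

# Unit square whose top edge bulges, mimicking a deformed (flapped) boundary.
grid = tfi_2d(bottom=lambda s: (s, 0.0),
              top=lambda s: (s, 1.0 + 0.2*s*(1-s)),
              left=lambda t: (0.0, t),
              right=lambda t: (1.0, t),
              ni=5, nj=5)
print(grid[4][2])  # midpoint of the bulged top edge
```

Because the boundaries enter the formula exactly, the deformed surface is reproduced on the grid edge while interior points follow smoothly, which is why TFI pairs naturally with a macro-element decomposition for large boundary motions.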

  9. Economic analysis of the space shuttle system, volume 1

    Science.gov (United States)

    1972-01-01

    An economic analysis of the space shuttle system is presented. The analysis is based on economic benefits, recurring costs, non-recurring costs, and ecomomic tradeoff functions. The most economic space shuttle configuration is determined on the basis of: (1) objectives of reusable space transportation system, (2) various space transportation systems considered and (3) alternative space shuttle systems.

  10. Task analysis revisited: refining the phlebotomy technician scope of practice and assessing longitudinal change in competencies.

    Science.gov (United States)

    Fidler, James R

    2007-06-01

    A random sample of 500 phlebotomy technicians certified by a national organization was queried regarding perceptions of importance of 53 specific practice-related tasks representative of various departmental areas. The sample was surveyed via a mail questionnaire. Role centrality was assessed by considering mean importance ratings and by applying the Rasch measurement model to assigned importance ratings. Approximately 36% of the questionnaires received by respondents were returned. The results revealed which tasks were fundamental to the phlebotomy technician scope of practice. To assess longitudinal change in core duties, task saliency was considered with respect to similar data collected a decade earlier. Task importance may be considered by agencies that educate, credential, or employ phlebotomy technicians in providing current job function descriptions. The longitudinal methodology employed may be applicable to other job roles for which the assessment of change is of interest. PMID:17476028

  11. Photovoltaic venture analysis. Final report. Volume III. Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Costello, D.; Posner, D.; Schiffel, D.; Doane, J.; Bishop, C.

    1978-07-01

    This appendix contains a brief summary of a detailed description of alternative future energy scenarios which provide an overall backdrop for the photovoltaic venture analysis. Also included is a summary of a photovoltaic market/demand workshop, a summary of a photovoltaic supply workshop which used cross-impact analysis, and a report on photovoltaic array and system prices in 1982 and 1986. The results of a sectorial demand analysis for photovoltaic power systems used in the residential sector (single family homes), the service, commercial, and institutional sector (schools), and in the central power sector are presented. An analysis of photovoltaics in the electric utility market is given, and a report on the industrialization of photovoltaic systems is included. A DOE information memorandum regarding ''A Strategy for a Multi-Year Procurement Initiative on Photovoltaics (ACTS No. ET-002)'' is also included. (WHK)

  12. Employee Involvement, Technology, and Evolution in Job Skills: A Task-Based Analysis

    OpenAIRE

    Francis Green

    2012-01-01

    The author investigates the evolution of job skill distribution using task data derived from the U.K. Skills Surveys of 1997, 2001, and 2006, and the 1992 Employment Survey in Britain. He determines the extent to which employee involvement in the workplace and computer technologies promote the use of higher order cognitive and interactive skills. He finds that literacy, other communication tasks, and self-planning skills have grown especially fast. Numerical and problem-solving skills have al...

  13. Uniprocessor Schedulability and Sensitivity Analysis of Multiple Criticality Tasks with Fixed-Priorities

    OpenAIRE

    Dorin, François; Richard, Pascal; Richard, Michael; Goossens, Joël

    2009-01-01

    Safety-critical real-time standards define several criticality levels for the tasks (e.g., DO-178B - Software Considerations in Airborne Systems and Equipment Certification). Classical models do not take into account these levels. Vestal introduced a new multiple criticality model, to model more precisely existing real-time systems, and algorithms to schedule such systems. Such task model represents a potentially very significant advance in the modeling of safety-critical real-time systems. B...

  14. An analysis of the processing requirements of a complex perceptual-motor task

    Science.gov (United States)

    Kramer, A. F.; Wickens, C. D.; Donchin, E.

    1983-01-01

    Current concerns in the assessment of mental workload are discussed, and the event-related brain potential (ERP) is introduced as a promising mental-workload index. Subjects participated in a series of studies in which they were required to perform a target acquisition task while also covertly counting either auditory or visual probes. The effects of several task-difficulty manipulations on the P300 component of the ERP elicited by the counted stimulus probes were investigated. With sufficiently practiced subjects the amplitude of the P300 was found to decrease with increases in task difficulty. The second experiment also provided evidence that the P300 is selectively sensitive to task-relevant attributes. A third experiment demonstrated a convergence in the amplitude of the P300s elicited in the simple and difficult versions of the tracking task. The amplitude of the P300 was also found to covary with the measures of tracking performance. The results of the series of three experiments illustrate the sensitivity of the P300 to the processing requirements of a complex target acquisition task. The findings are discussed in terms of the multidimensional nature of processing resources.
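The ERP logic the abstract relies on (averaging many time-locked trials, then measuring P300 amplitude in a post-stimulus window) can be sketched as follows. All numbers here are simulated stand-ins, not data from the experiments:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated single-trial epochs: 200 trials x 300 samples (hypothetical data;
# a real study would use recorded EEG epochs time-locked to the probes).
n_trials, n_samples = 200, 300
epochs = rng.normal(0.0, 5.0, size=(n_trials, n_samples))
# Add a P300-like positive deflection around sample 150 to every trial.
epochs[:, 140:170] += 8.0

# The ERP is the across-trial average; averaging suppresses activity that is
# not time-locked to the stimulus.
erp = epochs.mean(axis=0)

# A simple peak-amplitude measure of the P300: the maximum of the averaged
# waveform inside a post-stimulus search window.
window = erp[100:250]
p300_amplitude = window.max()
print(round(p300_amplitude, 2))
```

Workload comparisons like those in the abstract would then contrast this amplitude across task-difficulty conditions.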

  15. SLUDGE TREATMENT PROJECT ALTERNATIVES ANALYSIS SUMMARY REPORT (VOLUME 1)

    International Nuclear Information System (INIS)

    Highly radioactive sludge (containing up to 300,000 curies of actinides and fission products) resulting from the storage of degraded spent nuclear fuel is currently stored in temporary containers located in the 105-K West storage basin near the Columbia River. The background, history, and known characteristics of this sludge are discussed in Section 2 of this report. There are many compelling reasons to remove this sludge from the K-Basin. These reasons are discussed in detail in Section 1, and they include the following: (1) Reduce the risk to the public (from a potential release of highly radioactive material as fine respirable particles by airborne or waterborne pathways); (2) Reduce the overall risk to the Hanford worker; and (3) Reduce the risk to the environment (the K-Basin is situated above a hazardous chemical contaminant plume and hinders remediation of the plume until the sludge is removed). The DOE-RL has stated that a key DOE objective is to remove the sludge from the K-West Basin and River Corridor as soon as possible, which will reduce risks to the environment, allow for remediation of contaminated areas underlying the basins, and support closure of the 100-KR-4 operable unit. The environmental and nuclear safety risks associated with this sludge have resulted in multiple legal and regulatory remedial action decisions, plans, and commitments that are summarized in Table ES-1 and discussed in more detail in Volume 2, Section 9.

  16. SLUDGE TREATMENT PROJECT ALTERNATIVES ANALYSIS SUMMARY REPORT [VOLUME 1]

    Energy Technology Data Exchange (ETDEWEB)

    FREDERICKSON JR; ROURK RJ; HONEYMAN JO; JOHNSON ME; RAYMOND RE

    2009-01-19

    Highly radioactive sludge (containing up to 300,000 curies of actinides and fission products) resulting from the storage of degraded spent nuclear fuel is currently stored in temporary containers located in the 105-K West storage basin near the Columbia River. The background, history, and known characteristics of this sludge are discussed in Section 2 of this report. There are many compelling reasons to remove this sludge from the K-Basin. These reasons are discussed in detail in Section 1, and they include the following: (1) Reduce the risk to the public (from a potential release of highly radioactive material as fine respirable particles by airborne or waterborne pathways); (2) Reduce the overall risk to the Hanford worker; and (3) Reduce the risk to the environment (the K-Basin is situated above a hazardous chemical contaminant plume and hinders remediation of the plume until the sludge is removed). The DOE-RL has stated that a key DOE objective is to remove the sludge from the K-West Basin and River Corridor as soon as possible, which will reduce risks to the environment, allow for remediation of contaminated areas underlying the basins, and support closure of the 100-KR-4 operable unit. The environmental and nuclear safety risks associated with this sludge have resulted in multiple legal and regulatory remedial action decisions, plans, and commitments that are summarized in Table ES-1 and discussed in more detail in Volume 2, Section 9.

  17. Lessons from a pilot project in cognitive task analysis: the potential role of intermediates in preclinical teaching in dental education.

    Science.gov (United States)

    Walker, Judith; von Bergmann, HsingChi

    2015-03-01

    The purpose of this study was to explore the use of cognitive task analysis to inform the teaching of psychomotor skills and cognitive strategies in clinical tasks in dental education. Methods used were observing and videotaping an expert at one dental school thinking aloud while performing a specific preclinical task (in a simulated environment), interviewing the expert to probe deeper into his thinking processes, and applying the same procedures to analyze the performance of three second-year dental students who had recently learned the analyzed task and who represented a spectrum of their cohort's ability to undertake the procedure. The investigators sought to understand how experts (clinical educators) and intermediates (trained students) overlapped and differed at points in the procedure that represented the highest cognitive load, known as "critical incidents." Findings from this study and previous research identified possible limitations of current clinical teaching as a result of expert blind spots. These findings coupled with the growing evidence of the effectiveness of peer teaching suggest the potential role of intermediates in helping novices learn preclinical dentistry tasks. PMID:25729022

  18. Connectivity Analysis and Feature Classification in Attention Deficit Hyperactivity Disorder Sub-Types: A Task Functional Magnetic Resonance Imaging Study.

    Science.gov (United States)

    Park, Bo-Yong; Kim, Mansu; Seo, Jongbum; Lee, Jong-Min; Park, Hyunjin

    2016-05-01

    Attention deficit hyperactivity disorder (ADHD) is a pervasive neuropsychiatric disorder. Patients with different ADHD subtypes show different behaviors under different stimuli and thus might require differential approaches to treatment. This study explores connectivity differences between ADHD subtypes and attempts to classify these subtypes based on neuroimaging features. A total of 34 patients (13 ADHD-IA and 21 ADHD-C subtypes) underwent functional magnetic resonance imaging (fMRI) with six task paradigms. Connectivity differences between ADHD subtypes were assessed for the whole brain in each task paradigm. Connectivity measures of the identified regions were used as features for the support vector machine classifier to distinguish between ADHD subtypes. The effectiveness of connectivity measures of the regions was tested by predicting ADHD-related Diagnostic and Statistical Manual of Mental Disorders (DSM) scores. Significant connectivity differences between ADHD subtypes were identified mainly in the frontal, cingulate, and parietal cortices and partially in the temporal and occipital cortices and cerebellum. Classifier accuracy for distinguishing between ADHD subtypes was 91.18% for both the gambling punishment and emotion task paradigms. Linear prediction under the two task paradigms showed significant correlation with the DSM hyperactive/impulsive score. Our study identified important brain regions from connectivity analysis based on an fMRI paradigm using gambling punishment and emotion task paradigms. The regions and associated connectivity measures could serve as features to distinguish between ADHD subtypes. PMID:26602102
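As a rough illustration of the classification step described in the abstract, a linear SVM can be cross-validated on connectivity features. The data below are synthetic stand-ins (the feature count, group separation, and resulting accuracy are invented, not the study's), assuming scikit-learn is available:

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(42)

# Hypothetical connectivity features: 34 subjects x 20 region-pair measures,
# labels 0 = ADHD-IA, 1 = ADHD-C (group sizes mirror the study: 13 vs 21).
X = rng.normal(size=(34, 20))
y = np.array([0] * 13 + [1] * 21)
# Give the two groups a mean difference on a few features so the toy
# problem is learnable.
X[y == 1, :5] += 1.5

# Standardize features, then fit a linear SVM; score with 5-fold CV.
clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
scores = cross_val_score(clf, X, y, cv=5)
print(scores.mean())
```

With so few subjects, cross-validated accuracy (rather than training accuracy) is the honest figure of merit, which is presumably why the study reports it.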

  19. Task analysis of human-in-the-loop tele-operated maintenance: What can be learned from JET?

    International Nuclear Information System (INIS)

    Highlights: •Maintenance task execution at JET was analyzed to guide improvements for ITER. •A large variation in task duration was found for various operator experience levels. •Results indicate significant room for improvement for tele-operated performance. •Improvement of visual feedback and artificial guiding forces was considered promising. -- Abstract: Remote maintenance will determine the available uptime of future fusion plants such as ITER. Experience at predecessor JET showed that a human-in-the-loop tele-operated approach is crucial, although this approach entails drawbacks such as the unavoidable extensive operator training and relatively long execution times. These drawbacks are common knowledge, but little quantitative research is available to guide improvements (such as improved training methods, or active operator support systems). The aim of this paper is to identify the key areas for further improvement of tele-operated maintenance. This is achieved by a detailed task analysis based on recent maintenance at JET, using task logbooks and video data as well as interviews with experienced master–slave operators. The resulting task analysis shows the (sub)tasks that were most time-consuming and shows a large variance in time performance within operators, but also substantial differences between qualified operators with different levels of experience. The operator interviews indicate that intuitive (virtual) visual feedback and artificial (guiding) forces are promising directions for improvement. The results found in this study will be used for future research and development activities focusing on haptic guiding strategies, with the aim to further design and optimize RH maintenance systems for ITER and beyond

  20. Photovoltaic venture analysis. Final report. Volume II. Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Costello, D.; Posner, D.; Schiffel, D.; Doane, J.; Bishop, C.

    1978-07-01

    A description of the integrating model for photovoltaic venture analysis is given; input assumptions for the model are described; and the integrating model program listing is given. The integrating model is an explicit representation of the interactions between photovoltaic markets and supply under alternative sets of assumptions. It provides a consistent way of assembling and integrating the various assumptions, data, and information that have been obtained on photovoltaic systems supply and demand factors. Secondly, it provides a mechanism for understanding the implications of all the interacting assumptions. By representing the assumptions in a common, explicit framework, much more complex interactions can be considered than are possible intuitively. The integrating model therefore provides a way of examining the relative importance of different assumptions, parameters, and inputs through sensitivity analysis. Also, detailed results of model sensitivity analysis and detailed market and systems information are presented. (WHK)

  1. Photovoltaic venture analysis. Final report. Volume I. Executive summary

    Energy Technology Data Exchange (ETDEWEB)

    Costello, D.; Posner, D.; Schiffel, D.; Doane, J.; Bishop, C.

    1978-07-01

    The objective of the study, government programs under investigation, and a brief review of the approach are presented. Potential markets for photovoltaic systems relevant to the study are described. The response of the photovoltaic supply industry is then considered. A model which integrates the supply and demand characteristics of photovoltaics over time was developed. This model also calculates the economic benefits associated with various government subsidy programs. Results are derived under alternative possible supply, demand, and macroeconomic conditions. A probabilistic analysis of the costs and benefits of a $380 million federal photovoltaic procurement initiative, as well as certain alternative strategies, is summarized. Conclusions and recommendations based on the analysis are presented.

  2. Incorporating Language Structure in a Communicative Task: An Analysis of the Language Component of a Communicative Task in the LINC Home Study Program

    Science.gov (United States)

    Lenchuk, Iryna

    2014-01-01

    The purpose of this article is to analyze a task included in the LINC Home Study (LHS) program. LHS is a federally funded distance education program offered to newcomers to Canada who are unable to attend regular LINC classes. A task, in which a language structure (a gerund) is chosen and analyzed, was selected from one instructional module of LHS…

  3. A task analysis-linked approach for integrating the human factor in reliability assessments of nuclear power plants

    International Nuclear Information System (INIS)

    This paper describes an emerging Task Analysis-Linked Evaluation Technique (TALENT) for assessing the contributions of human error to nuclear power plant systems unreliability and risk. Techniques such as TALENT have emerged from the recognition that, although human error is a primary contributor to plant risk, it has been only a peripheral consideration in plant reliability evaluations. TALENT also recognizes that the involvement of persons with behavioral science expertise is required to support plant reliability and risk analyses. A number of state-of-knowledge human reliability analysis tools are also discussed which support the TALENT process. The core of TALENT is comprised of task, timeline and interface analysis data which provide the technology base for event and fault tree development, serve as criteria for selecting and evaluating performance shaping factors, and which provide a basis for auditing TALENT results. Finally, programs and case studies used to refine the TALENT process are described along with future research needs in the area. (author)

  4. Do skeletal cephalometric characteristics correlate with condylar volume, surface and shape? A 3D analysis

    Directory of Open Access Journals (Sweden)

    Saccucci Matteo

    2012-05-01

    Objective: The purpose of this study was to determine the condylar volume in subjects with different mandibular divergence and skeletal class using cone-beam computed tomography (CBCT) and analysis software. Materials and methods: For 94 patients (46 females and 48 males; mean age 24.3 ± 6.5 years), rendering reconstructions of the left and right temporomandibular joints (TMJs) were obtained. Subjects were then classified into three skeletal classes (I, II, III) on the basis of the ANB angle and the GoGn-SN angle. The data of the different classes were compared. Results: No significant difference was observed in the whole sample between the right and the left sides in condylar volume. The analysis of mean volume among low, normal, and high mandibular plane angles revealed a significantly higher volume and surface in low angle subjects. Class III subjects also tended to show a higher condylar volume and surface than class I and class II subjects, although the difference was not significant. Conclusions: Higher condylar volume was a common characteristic of low angle subjects compared to normal and high mandibular plane angle subjects. Skeletal class also appears to be associated with condylar volume and surface.

  5. District heating and cooling systems for communities through power plant retrofit and distribution network. Volume 4. Tasks 7-9. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Watt, J.R.; Sommerfield, G.A.

    1979-08-01

    In Task 7, Proposal for Further Work, it is concluded that the combination of the Acme Station, a coal-fired station, and the downtown Toledo area will serve as a successful Demonstration Project. The existing steam system can serve as an excellent base from which various expansion plans can be derived. Replacing the steam system with a coal-fired cogeneration source provides an excellent opportunity for scarce fuel savings. The engineering consultant recommended the installation of a back-pressure turbine as the best method of obtaining cogeneration steam. Task 7 concludes with the site-specific Scope of Work to address the requirements of Phases 2 and 3. Approval is being sought for both phases. Letters of cooperation and commitment are displayed in Task 8. A detailed work management plan is described in Task 9. Information on the demonstration team members is included.

  6. Oak Ridge Health Studies Phase 1 report, Volume 2: Part D, Dose Reconstruction Feasibility Study. Tasks 6, Hazard summaries for important materials at the Oak Ridge Reservation

    Energy Technology Data Exchange (ETDEWEB)

    Bruce, G.M.; Walker, L.B.; Widner, T.E.

    1993-09-01

    The purpose of Task 6 of the Oak Ridge Phase I Health Studies is to provide summaries of current knowledge of the toxic and hazardous properties of materials that are important for the Oak Ridge Reservation. The information gathered in the course of Task 6 investigations will support the task of focusing any future health studies efforts on those operations and emissions which have likely been most significant in terms of off-site health risk. The information gathered in Task 6 efforts will likely also be of value to individuals evaluating the feasibility of additional health study efforts (such as epidemiological investigations) in the Oak Ridge area and as a resource for citizens seeking information on historical emissions.

  7. Sealed source and device design safety testing. Volume 5: Technical report on the findings of Task 4, Investigation of failed radioactive stainless steel troxler gauges

    Energy Technology Data Exchange (ETDEWEB)

    Benac, D.J.; Schick, W.R. [Southwest Research Inst., San Antonio, TX (United States)

    1995-10-01

    This report covers the Task 4 activities for the Sealed Source and Device Safety testing program. SwRI was contracted to investigate failed radioactive stainless steel troxler gauges. SwRI's task was to determine the cause of failure of the rods and the extent of the problem. SwRI concluded that the broken rod failed in a brittle manner due to a hard zone in the heat affected zone.

  8. Sealed source and device design safety testing. Volume 5: Technical report on the findings of Task 4, Investigation of failed radioactive stainless steel troxler gauges

    International Nuclear Information System (INIS)

    This report covers the Task 4 activities for the Sealed Source and Device Safety testing program. SwRI was contracted to investigate failed radioactive stainless steel troxler gauges. SwRI's task was to determine the cause of failure of the rods and the extent of the problem. SwRI concluded that the broken rod failed in a brittle manner due to a hard zone in the heat affected zone

  9. SCALE-4 analysis of pressurized water reactor critical configurations. Volume 1: Summary

    International Nuclear Information System (INIS)

    The requirements of ANSI/ANS 8.1 specify that calculational methods for away-from-reactor criticality safety analyses be validated against experimental measurements. If credit is to be taken for the reduced reactivity of burned or spent fuel relative to its original fresh composition, it is necessary to benchmark computational methods used in determining such reactivity worth against spent fuel reactivity measurements. This report summarizes a portion of the ongoing effort to benchmark away-from-reactor criticality analysis methods using critical configurations from commercial pressurized water reactors (PWR). The analysis methodology utilized for all calculations in this report is based on the modules and data associated with the SCALE-4 code system. Each of the five volumes comprising this report provides an overview of the methodology applied. Subsequent volumes also describe in detail the approach taken in performing criticality calculations for these PWR configurations: Volume 2 describes criticality calculations for the Tennessee Valley Authority's Sequoyah Unit 2 reactor for Cycle 3; Volume 3 documents the analysis of Virginia Power's Surry Unit 1 reactor for the Cycle 2 core; Volume 4 documents the calculations performed based on GPU Nuclear Corporation's Three Mile Island Unit 1 Cycle 5 core; and, lastly, Volume 5 describes the analysis of Virginia Power's North Anna Unit 1 Cycle 5 core. Each of the reactor-specific volumes provides the details of calculations performed to determine the effective multiplication factor for each reactor core for one or more critical configurations using the SCALE-4 system; these results are summarized in this volume. Differences between the core designs and their possible impact on the criticality calculations are also discussed. Finally, results are presented for additional analyses performed to verify that solutions were sufficiently converged

  10. Engineering characterization of ground motion. Task II. Effects of ground motion characteristics on structural response considering localized structural nonlinearities and soil-structure interaction effects. Volume 2

    International Nuclear Information System (INIS)

    This report presents the results of part of a two-task study on the engineering characterization of earthquake ground motion for nuclear power plant design. Task I of the study, which is presented in NUREG/CR-3805, Vol. 1, developed a basis for selecting design response spectra taking into account the characteristics of free-field ground motion found to be significant in causing structural damage. Task II incorporates additional considerations of effects of spatial variations of ground motions and soil-structure interaction on foundation motions and structural response. The results of Task II are presented in four parts: (1) effects of ground motion characteristics on structural response of a typical PWR reactor building with localized nonlinearities and soil-structure interaction effects; (2) empirical data on spatial variations of earthquake ground motion; (3) soil-structure interaction effects on structural response; and (4) summary of conclusions and recommendations based on Tasks I and II studies. This report presents the results of the first part of Task II. The results of the other parts will be presented in NUREG/CR-3805, Vols. 3 to 5

  11. Evaluating Cognitive Action Control Using Eye-Movement Analysis: An Oculomotor Adaptation of the Simon Task.

    Science.gov (United States)

    Duprez, Joan; Houvenaghel, Jean-François; Naudet, Florian; Dondaine, Thibaut; Auffret, Manon; Robert, Gabriel; Drapier, Dominique; Argaud, Soizic; Vérin, Marc; Sauleau, Paul

    2016-01-01

    Cognitive action control has been extensively studied using conflict tasks such as the Simon task. In most recent studies, this process has been investigated in the light of the dual route hypothesis and more specifically of the activation-suppression model using distributional analyses. Some authors have suggested that cognitive action control assessment is not specific to response modes. In this study we adapted the Simon task, using oculomotor responses instead of manual responses, in order to evaluate whether the resolution of conflict induced by a two-dimensional stimulus yielded similar results to what is usually reported in tasks with manual responses. Results obtained from 43 young healthy participants revealed the typical congruence effect, with longer reaction times (RT) and lesser accuracy in the incongruent condition. Conditional accuracy functions (CAF) also revealed a higher proportion of fast errors in the incongruent condition and delta plots confirmed that conflict resolution was easier, as the time taken to respond increased. These results are very similar to what has been reported in the literature. Furthermore, our observations are in line with the assumptions of the activation-suppression model, in which automatic activation in conflict situations is captured in the fastest responses and selective inhibition of cognitive action control needs time to build up. Altogether, our results suggest that conflict resolution has core mechanisms whatever the response mode, manual or oculomotor. Using oculomotor responses in such tasks could be of interest when investigating cognitive action control in patients with severe motor disorders. PMID:26973499
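The distributional analyses named in the abstract (conditional accuracy functions and delta plots) amount to binning trials by reaction-time quantile. A minimal sketch on simulated Simon-task data follows; every parameter below is a hypothetical illustration, not a value from the study:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated trials: RTs in ms and accuracy for congruent and incongruent
# conditions. Incongruent trials are slower and show more fast errors,
# as the activation-suppression model predicts.
n = 2000
rt_con = rng.gamma(shape=9, scale=40, size=n)   # faster condition
rt_inc = rng.gamma(shape=9, scale=45, size=n)   # slower condition
acc_con = rng.random(n) < 0.97
# On incongruent trials, accuracy improves as RT increases (fast errors).
p_correct_inc = np.clip(0.6 + rt_inc / 1000, 0, 0.98)
acc_inc = rng.random(n) < p_correct_inc

def caf(rt, acc, n_bins=5):
    """Conditional accuracy function: mean accuracy within RT quantile bins."""
    order = np.argsort(rt)
    bins = np.array_split(order, n_bins)
    return np.array([acc[b].mean() for b in bins])

def delta_plot(rt_a, rt_b, n_bins=5):
    """Congruence effect (incongruent minus congruent RT) per quantile."""
    qs = np.linspace(0, 1, n_bins + 2)[1:-1]
    return np.quantile(rt_b, qs) - np.quantile(rt_a, qs)

print(caf(rt_inc, acc_inc))       # accuracy rises from fastest to slowest bin
print(delta_plot(rt_con, rt_inc))
```

The same two functions apply unchanged whether the responses are manual or oculomotor, which is the point of the study's comparison.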

  12. Design and analysis of self-adapted task scheduling strategies in wireless sensor networks.

    Science.gov (United States)

    Guo, Wenzhong; Xiong, Naixue; Chao, Han-Chieh; Hussain, Sajid; Chen, Guolong

    2011-01-01

    In a wireless sensor network (WSN), the usage of resources is usually highly related to the execution of tasks which consume a certain amount of computing and communication bandwidth. Parallel processing among sensors is a promising solution to provide the demanded computation capacity in WSNs. Task allocation and scheduling is a typical problem in the area of high performance computing. Although task allocation and scheduling in wired processor networks has been well studied in the past, their counterparts for WSNs remain largely unexplored. Existing traditional high performance computing solutions cannot be directly implemented in WSNs due to the limitations of WSNs such as limited resource availability and the shared communication medium. In this paper, a self-adapted task scheduling strategy for WSNs is presented. First, a multi-agent-based architecture for WSNs is proposed and a mathematical model of dynamic alliance is constructed for the task allocation problem. Then an effective discrete particle swarm optimization (PSO) algorithm for the dynamic alliance (DPSO-DA) with a well-designed particle position code and fitness function is proposed. A mutation operator which can effectively improve the algorithm's ability of global search and population diversity is also introduced in this algorithm. Finally, the simulation results show that the proposed solution can achieve significantly better performance than other algorithms. PMID:22163971
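A minimal discrete PSO for task allocation, in the spirit of the DPSO-DA approach (assignment-vector positions plus a mutation step), can be sketched as below. The encoding, update probabilities, and cost model are illustrative assumptions, not the paper's actual algorithm:

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy task-allocation problem: assign 8 tasks to 4 sensor nodes so that the
# maximum per-node load (makespan) is minimized. Numbers are illustrative.
n_tasks, n_nodes = 8, 4
cost = rng.uniform(1, 10, size=(n_tasks, n_nodes))  # cost of task t on node j

def fitness(assign):
    """Makespan of an assignment vector; lower is better."""
    loads = np.zeros(n_nodes)
    for t, node in enumerate(assign):
        loads[node] += cost[t, node]
    return loads.max()

# Discrete PSO: each particle is an assignment vector; "velocity" is modeled
# as per-dimension probabilities of copying from the personal or global best,
# with a random-reassignment mutation providing diversity.
n_particles, n_iters = 30, 100
pos = rng.integers(0, n_nodes, size=(n_particles, n_tasks))
pbest = pos.copy()
pbest_fit = np.array([fitness(p) for p in pos])
gbest = pbest[pbest_fit.argmin()].copy()

for _ in range(n_iters):
    for i in range(n_particles):
        r = rng.random(n_tasks)
        # 40% keep personal best, 40% copy global best, 20% mutate.
        pos[i] = np.where(r < 0.4, pbest[i],
                 np.where(r < 0.8, gbest,
                          rng.integers(0, n_nodes, n_tasks)))
        f = fitness(pos[i])
        if f < pbest_fit[i]:
            pbest_fit[i], pbest[i] = f, pos[i].copy()
    gbest = pbest[pbest_fit.argmin()].copy()

print(fitness(gbest))
```

A real WSN scheduler would replace the makespan objective with one reflecting energy, bandwidth, and alliance constraints, as the paper's dynamic-alliance model does.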

  13. Design and Analysis of Self-Adapted Task Scheduling Strategies in Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Sajid Hussain

    2011-06-01

    Full Text Available In a wireless sensor network (WSN, the usage of resources is usually highly related to the execution of tasks which consume a certain amount of computing and communication bandwidth. Parallel processing among sensors is a promising solution to provide the demanded computation capacity in WSNs. Task allocation and  scheduling is a typical problem in the area of high performance computing. Although task allocation and scheduling in wired processor networks has been well studied in the past, their counterparts for WSNs remain largely unexplored. Existing traditional high performance computing solutions cannot be directly implemented in WSNs due to the limitations of WSNs such as limited resource availability and the shared communication medium. In this paper, a self-adapted task scheduling strategy for WSNs is presented. First, a multi-agent-based architecture for WSNs is proposed and a mathematical model of dynamic alliance is constructed for the task allocation problem. Then an effective discrete particle swarm optimization (PSO algorithm for the dynamic alliance (DPSO-DA with a well-designed particle position code and fitness function is proposed. A mutation operator which can effectively improve the algorithm’s ability of global search and population diversity is also introduced in this algorithm. Finally, the simulation results show that the proposed solution can achieve significant better performance than other algorithms.

  14. Procedural learning deficits in specific language impairment (SLI): a meta-analysis of serial reaction time task performance.

    Science.gov (United States)

    Lum, Jarrad A G; Conti-Ramsden, Gina; Morgan, Angela T; Ullman, Michael T

    2014-02-01

    Meta-analysis and meta-regression were used to evaluate whether evidence to date demonstrates deficits in procedural memory in individuals with specific language impairment (SLI), and to examine reasons for inconsistencies of findings across studies. The Procedural Deficit Hypothesis (PDH) proposes that SLI is largely explained by abnormal functioning of the frontal-basal ganglia circuits that support procedural memory. It has also been suggested that declarative memory can compensate for at least some of the problems observed in individuals with SLI. A number of studies have used Serial Reaction Time (SRT) tasks to investigate procedural learning in SLI. In this report, results from eight studies that collectively examined 186 participants with SLI and 203 typically-developing peers were submitted to a meta-analysis. The average mean effect size was .328 (CI95: .071, .584) and was significant. This suggests SLI is associated with impairments of procedural learning as measured by the SRT task. Differences among individual study effect sizes, examined with meta-regression, indicated that smaller effect sizes were found in studies with older participants, and in studies that had a larger number of trials on the SRT task. The contributions of age and SRT task characteristics to learning are discussed with respect to impaired and compensatory neural mechanisms in SLI. PMID:24315731
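A pooled mean effect size with a 95% CI, like the one reported above, comes from standard inverse-variance weighting. The fixed-effect sketch below uses made-up per-study values (the actual meta-analysis pooled eight SRT studies whose individual effect sizes are not listed here):

```python
import math

# Hypothetical per-study effect sizes and variances for eight studies;
# these numbers are invented for illustration only.
effects   = [0.45, 0.20, 0.60, 0.10, 0.35, 0.50, 0.25, 0.15]
variances = [0.04, 0.05, 0.06, 0.03, 0.05, 0.04, 0.06, 0.05]

# Fixed-effect (inverse-variance) pooled estimate and 95% confidence interval.
weights = [1 / v for v in variances]
pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
se = math.sqrt(1 / sum(weights))
ci_low, ci_high = pooled - 1.96 * se, pooled + 1.96 * se
print(round(pooled, 3), round(ci_low, 3), round(ci_high, 3))
```

A random-effects model would additionally estimate between-study variance before weighting; meta-regression, as used in the paper, then regresses study effect sizes on moderators such as participant age or trial count.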

  15. Grid-connected ICES: preliminary feasibility analysis and evaluation. Volume 2. Final report

    Energy Technology Data Exchange (ETDEWEB)

    1977-06-30

    The HEAL Complex in New Orleans will serve as a Demonstration Community for which the ICES Demonstration System will be designed. The complex is a group of hospitals, clinics, research facilities, and medical educational facilities. The five tasks reported on are: preliminary energy analysis; preliminary institutional assessment; conceptual design; firming-up of commitments; and detailed work management plan.

  16. Energy intensive industry for Alaska. Volume II: case analysis.

    Energy Technology Data Exchange (ETDEWEB)

    1978-09-01

    A case analysis of the attractiveness of the primary aluminium metal industry in Alaska is presented. Part 1 provides a discussion on the economics of the industry and provides a conceptual Alaskan smelter including its physical nature, employment, and tax consequences, and its environmental attributes. Part 2 discusses the social and economic impacts. Part 3 discusses the state management options for involvement with the industry. (MCW)

  17. Space tug economic analysis study. Volume 1: Executive summary

    Science.gov (United States)

    1972-01-01

    An economic analysis of space tug operations is presented. The space tug is defined as any liquid propulsion stage under 100,000 pounds propellant loading that is flown from the space shuttle cargo bay. Two classes of vehicles are the orbit injection stages and reusable space tugs. The vehicle configurations, propellant combinations, and operating modes used for the study are reported. The summary contains data on the study approach, results, conclusions, and recommendations.

  18. Distributed Anomaly Detection using Minimum Volume Elliptical Principal Component Analysis

    OpenAIRE

    O'Reilly, CE; Gluhak, A.; Imran, A.

    2016-01-01

    Principal component analysis and the residual error is an effective anomaly detection technique. In an environment where anomalies are present in the training set, the derived principal components can be skewed by the anomalies. A further aspect of anomaly detection is that data might be distributed across different nodes in a network and their communication to a centralized processing unit is prohibited due to communication cost. Current solutions to distributed anomaly detection rely on a h...

  19. Re: Madsen et al. "Unnecessary work tasks and mental health: a prospective analysis of Danish human service workers".

    Science.gov (United States)

    Durand-Moreau, Quentin; Loddé, Brice; Dewitte, Jean-Dominique

    2015-03-01

    Madsen et al (1) recently published a secondary analysis of data from the Project on Burnout, Motivation and Job Satisfaction (PUMA). The aim of their study, published in the Scandinavian Journal of Work, Environment & Health, was to examine the associations between unnecessary work tasks and a decreased level of mental health. Though the topic is quite novel, reading this work proved disturbing and raised issues. Based on the results of this study, the authors stated that there is an association between unnecessary work tasks (assessed by a single question) and a decreased level of mental health [assessed by the Mental Health Inventory (MHI-5)] in the specific population included in the PUMA survey. The authors point out a limitation of the study, namely that unnecessary work tasks were evaluated using one single question: "Do you sometimes have to do things in your job which appear to be unnecessary?". Semmer defines unnecessary work tasks as "tasks that should not be carried out at all because they do not make sense or because they could have been avoided, or could be carried out with less effort if things were organized more efficiently" (2). De facto, qualifying a task as unnecessary requires stating whether the task makes sense, and making sense is not an objective notion. It is very difficult for either a manager or an employee to say whether a task is necessary; what matters most is whether it makes sense from the worker's point of view. Making sense and being necessary are not synonyms. Some tasks do not make sense but are economically necessary (eg, when, as physicians, we report our activity using ICD-10 on computers instead of being at patients' bedsides or reading this journal). Thus, there is a wide gap between Semmer's definition and the question used by the authors to evaluate his concept. A secondary analysis based on a single question is not adequate to evaluate unnecessary tasks. Nowadays, the general trend

  20. Space tug economic analysis study. Volume 2: Tug concepts analysis. Part 2: Economic analysis

    Science.gov (United States)

    1972-01-01

    An economic analysis of space tug operations is presented. The subjects discussed are: (1) cost uncertainties, (2) scenario analysis, (3) economic sensitivities, (4) mixed integer programming formulation of the space tug problem, and (5) critical parameters in the evaluation of a public expenditure.

  1. Analysis of Petri net model and task planning heuristic algorithms for product reconfiguration

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Reconfiguration planning is recognized as an important factor for reducing the cost of manufacturing reconfigurable products, and the associated main task is to generate a set of optimal or near-optimal reconfiguration sequences using effective algorithms. A method is developed to generate a Petri net as the reconfiguration tree to represent the two-state transitions of a product, which solves the representation problem of reconfiguring interface replacement. Building on this method, two heuristic algorithms that take economics into account are proposed to generate task sequences and search reconfiguration paths effectively. Finally, an objective evaluation is applied to compare these two heuristic algorithms with others. The developed reconfiguration task planning heuristic algorithms can generate better strategies and plans for reconfiguration. The research findings are exemplified with strut reconfiguration of a reconfigurable parallel kinematics machine (RPKM).

  2. Empirical Analysis of EEG and ERPs for Psychophysiological Adaptive Task Allocation

    Science.gov (United States)

    Prinzel, Lawrence J., III; Pope, Alan T.; Freeman, Frederick G.; Scerbo, Mark W.; Mikulka, Peter J.

    2001-01-01

    The present study was designed to test the efficacy of using Electroencephalogram (EEG) and Event-Related Potentials (ERPs) for making task allocation decisions. Thirty-six participants were randomly assigned to an experimental, yoked, or control group condition. Under the experimental condition, a tracking task was switched between task modes based upon the participant's EEG. The results showed that the use of adaptive aiding improved performance and lowered subjective workload under negative feedback as predicted. Additionally, participants in the adaptive group had significantly lower RMSE and NASA-TLX ratings than participants in either the yoked or control group conditions. Furthermore, the amplitudes of the N1 and P3 ERP components were significantly larger under the experimental group condition than under either the yoked or control group conditions. These results are discussed in terms of the implications for adaptive automation design.

  3. A task specific uncertainty analysis method for least-squares-based form characterization of ultra-precision freeform surfaces

    International Nuclear Information System (INIS)

    In the measurement of ultra-precision freeform surfaces, least-squares-based form characterization methods are widely used to evaluate the form error of the measured surfaces. Although many methodologies have been proposed in recent years to improve the efficiency of the characterization process, relatively little research has been conducted on the analysis of the associated uncertainty in the characterization results that may arise from the characterization methods being used. This paper therefore presents a task specific uncertainty analysis method with application in the least-squares-based form characterization of ultra-precision freeform surfaces. That is, the associated uncertainty in the form characterization results is estimated when the measured data are extracted from a specific surface with a specific sampling strategy. Three factors are considered in this study: measurement error, surface form error and sample size. The task specific uncertainty analysis method has been evaluated through a series of experiments. The results show that the method can effectively estimate the uncertainty of the form characterization results for a specific freeform surface measurement.
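One common way to estimate such task specific uncertainty is Monte Carlo propagation of measurement noise through the least-squares fit for a fixed sampling strategy. The sketch below does this for a simple plane fit; the paper's surfaces and estimator are more general, and the grid, noise level, and trial count here are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

def pv_form_error(pts, sigma):
    """Least-squares plane fit of one noisy realization of the
    sampled points; return the peak-to-valley residual (form error)."""
    x, y, z = pts
    A = np.column_stack([x, y, np.ones_like(x)])
    z_noisy = z + rng.normal(scale=sigma, size=z.shape)
    coef, *_ = np.linalg.lstsq(A, z_noisy, rcond=None)
    r = z_noisy - A @ coef
    return r.max() - r.min()

# Nominal surface: a tilted plane sampled on a sparse 10x10 grid
# (this fixed grid is the "specific sampling strategy").
x, y = np.meshgrid(np.linspace(0, 1, 10), np.linspace(0, 1, 10))
pts = (x.ravel(), y.ravel(), (0.3 * x + 0.1 * y).ravel())

# Repeat the characterization with fresh measurement noise to estimate
# the spread (uncertainty) of the reported form error.
trials = np.array([pv_form_error(pts, sigma=1e-3) for _ in range(500)])
print(trials.mean(), trials.std())
```

The standard deviation of `trials` is the task specific uncertainty of the reported peak-to-valley value under this noise model.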

  4. User and Task Analysis of the Flight Surgeon Console at the Mission Control Center of the NASA Johnson Space Center

    Science.gov (United States)

    Johnson, Kathy A.; Shek, Molly

    2003-01-01

    Astronauts in a space station are to some extent like patients in an intensive care unit (ICU). Medical support of a mission crew requires acquisition, transmission, distribution, integration, and archiving of significant amounts of data. These data are acquired by disparate systems and require timely, reliable, and secure distribution to different communities for the execution of various space mission tasks. The goal of the Comprehensive Medical Information System (CMIS) Project at the Johnson Space Center Flight Medical Clinic is to integrate data from all Medical Operations sources, including the reference information sources and the electronic medical records of astronauts. A first step toward full CMIS implementation is to integrate and organize the reference information sources and the electronic medical record with the Flight Surgeon's console. In order to investigate this integration, we need to understand the usability problems of the Flight Surgeon's console in particular and of medical information systems in general. One way to achieve this understanding is through user and task analyses, whose general purpose is to ensure that only the necessary and sufficient task features that match users' capacities are included in system implementations. The goal of this summer project was to conduct user and task analyses employing cognitive engineering techniques to analyze the tasks of the Flight Surgeons and Biomedical Engineers (BMEs) while they worked on console. The techniques employed were user interviews, observations, and a questionnaire, from which a hierarchical task analysis and an information resource assessment were performed. They are described in more detail below. Finally, based on our analyses, we make recommendations for improvements to the support structure.

  5. Waste Isolation Pilot Plant Safety Analysis Report. Volume 1

    International Nuclear Information System (INIS)

    This Safety Analysis Report (SAR) has been prepared by the US Department of Energy (DOE) to support the construction and operation of the Waste Isolation Pilot Plant (WIPP) in southeastern New Mexico. The WIPP facility is designed to receive, inspect, emplace, and store unclassified defense-generated transuranic wastes in a retrievable fashion in an underground salt medium and to conduct studies and perform experiments in salt with high-level wastes. Upon the successful completion of these studies and experiments, WIPP is designed to serve as a permanent facility. The first chapter of this report provides a summary of the location and major design features of WIPP. Chapters 2 through 5 describe the site characteristics, design criteria, and design bases used in the design of the plant and the plant operations. Chapter 6 discusses radiation protection; Chapters 7 and 8 present an accident analysis of the plant and an assessment of the long-term waste isolation at WIPP. The conduct of operations and operating controls and limits are discussed in Chapters 9 and 10. The quality assurance programs are described in Chapter 11

  6. Waste Isolation Pilot Plant Safety Analysis Report. Volume 4

    International Nuclear Information System (INIS)

    This Safety Analysis Report (SAR) has been prepared by the US Department of Energy (DOE) to support the construction and operation of the Waste Isolation Pilot Plant (WIPP) in southeastern New Mexico. The WIPP facility is designed to receive, inspect, emplace, and store unclassified defense-generated transuranic wastes in a retrievable fashion in an underground salt medium and to conduct studies and perform experiments in salt with high-level wastes. Upon the successful completion of these studies and experiments, WIPP is designed to serve as a permanent facility. The first chapter of this report provides a summary of the location and major design features of WIPP. Chapters 2 through 5 describe the site characteristics, design criteria, and design bases used in the design of the plant and the plant operations. Chapter 6 discusses radiation protection; Chapters 7 and 8 present an accident analysis of the plant and an assessment of the long-term waste isolation at WIPP. The conduct of operations and operating controls and limits are discussed in Chapters 9 and 10. The quality assurance programs are described in Chapter 11

  7. Waste Isolation Pilot Plant Safety Analysis Report. Volume 5

    International Nuclear Information System (INIS)

    This Safety Analysis Report (SAR) has been prepared by the US Department of Energy (DOE) to support the construction and operation of the Waste Isolation Pilot Plant (WIPP) in southeastern New Mexico. The WIPP facility is designed to receive, inspect, emplace, and store unclassified defense-generated transuranic wastes in a retrievable fashion in an underground salt medium and to conduct studies and perform experiments in salt with high-level wastes. Upon the successful completion of these studies and experiments, WIPP is designed to serve as a permanent facility. The first chapter of this report provides a summary of the location and major design features of WIPP. Chapters 2 through 5 describe the site characteristics, design criteria, and design bases used in the design of the plant and the plant operations. Chapter 6 discusses radiation protection; Chapters 7 and 8 present an accident analysis of the plant and an assessment of the long-term waste isolation at WIPP. The conduct of operations and operating controls and limits are discussed in Chapters 9 and 10. The quality assurance programs are described in Chapter 11

  8. Waste Isolation Pilot Plant Safety Analysis Report. Volume 2

    International Nuclear Information System (INIS)

    This Safety Analysis Report (SAR) has been prepared by the US Department of Energy (DOE) to support the construction and operation of the Waste Isolation Pilot Plant (WIPP) in southeastern New Mexico. The WIPP facility is designed to receive, inspect, emplace, and store unclassified defense-generated transuranic wastes in a retrievable fashion in an underground salt medium and to conduct studies and perform experiments in salt with high-level wastes. Upon the successful completion of these studies and experiments, WIPP is designed to serve as a permanent facility. The first chapter of this report provides a summary of the location and major design features of WIPP. Chapters 2 through 5 describe the site characteristics, design criteria, and design bases used in the design of the plant and the plant operations. Chapter 6 discusses radiation protection; Chapters 7 and 8 present an accident analysis of the plant and an assessment of the long-term waste isolation at WIPP. The conduct of operations and operating controls and limits are discussed in Chapters 9 and 10. The quality assurance programs are described in Chapter 11

  9. Waste Isolation Pilot Plant Safety Analysis Report. Volume 3

    International Nuclear Information System (INIS)

    This Safety Analysis Report (SAR) has been prepared by the US Department of Energy (DOE) to support the construction and operation of the Waste Isolation Pilot Plant (WIPP) in southeastern New Mexico. The WIPP facility is designed to receive, inspect, emplace, and store unclassified defense-generated transuranic wastes in a retrievable fashion in an underground salt medium and to conduct studies and perform experiments in salt with high-level wastes. Upon the successful completion of these studies and experiments, WIPP is designed to serve as a permanent facility. The first chapter of this report provides a summary of the location and major design features of WIPP. Chapters 2 through 5 describe the site characteristics, design criteria, and design bases used in the design of the plant and the plant operations. Chapter 6 discusses radiation protection; Chapters 7 and 8 present an accident analysis of the plant and an assessment of the long-term waste isolation at WIPP. The conduct of operations and operating controls and limits are discussed in Chapters 9 and 10. The quality assurance programs are described in Chapter 11

  10. A 259.6 μW HRV-EEG Processor With Nonlinear Chaotic Analysis During Mental Tasks.

    Science.gov (United States)

    Roh, Taehwan; Hong, Sunjoo; Cho, Hyunwoo; Yoo, Hoi-Jun

    2016-02-01

    A system-on-chip (SoC) with nonlinear chaotic analysis (NCA) is presented for mental task monitoring. The proposed processor treats both heart rate variability (HRV) and electroencephalography (EEG). An independent component analysis (ICA) accelerator decreases the error of HRV extraction from 5.94% to 1.84% in the preprocessing step. Largest Lyapunov exponents (LLE), as well as linear features such as the mean, standard deviation, and sub-band power, are calculated with NCA acceleration. Measurements with mental task protocols result in a confidence level of 95%. Thanks to the hardware acceleration, the chaos processor, fabricated in 0.13 μm CMOS technology, consumes only 259.6 μW. PMID:25616073

  11. Small V/STOL aircraft analysis, volume 1

    Science.gov (United States)

    Smith, K. R., Jr.; Belina, F. W.

    1974-01-01

    A study was made of the economic viability of advanced V/STOL aircraft concepts in performing general aviation missions. A survey of general aviation aircraft users, operators, and manufacturers indicated that personnel transport missions built around business executive needs, commuter air service, and offshore oil supply are the leading potential areas of application for VTOL aircraft. Advanced VTOL concepts potentially available in the late 1970s were evaluated as alternatives to privately owned contemporary aircraft and commercial airline service in satisfying these personnel transport needs. Economic analyses incorporating the traveler's value of time as the principal figure of merit were used to identify the relative merits of alternative VTOL air transportation concepts.

  12. Mining and Minerals Technical Advisory Committee on Curriculum Development. Job Clusters, Competencies and Task Analysis.

    Science.gov (United States)

    Northern Montana Coll., Havre. Montana Center for Vocational Education, Research, Curriculum and Personnel Development.

    This skills inventory for mining occupations was developed by a technical committee in Montana to assist in the development of model curricula and to address state labor market needs. The committee included employers from the mining industry, members of trade and professional associations, and educators. The validated task list and defined job…

  13. A Task Analysis of a Sport Education Physical Education Season for Fourth Grade Students

    Science.gov (United States)

    Layne, Todd; Hastie, Peter

    2015-01-01

    Background: Previous research on Sport Education in which the participants were in the primary grades has focused on perceptions of fun and enjoyment as well as other components of motivation. To date, no study in Sport Education has examined the accomplishment of the various instructional and managerial tasks by upper primary school children,…

  14. The effect of exercise-induced arousal on cognitive task performance: a meta-regression analysis.

    Science.gov (United States)

    Lambourne, Kate; Tomporowski, Phillip

    2010-06-23

    The effects of acute exercise on cognitive performance were examined using meta-analytic techniques. The overall mean effect size was dependent on the timing of cognitive assessment. During exercise, cognitive task performance was impaired by a mean effect of -0.14. However, impairments were only observed during the first 20min of exercise. Otherwise, exercise-induced arousal enhanced performance on tasks that involved rapid decisions and automatized behaviors. Following exercise, cognitive task performance improved by a mean effect of 0.20. Arousal continued to facilitate speeded mental processes and also enhanced memory storage and retrieval. Positive effects were observed following exercise regardless of whether the study protocol was designed to measure the effects of steady-state exercise, fatiguing exercise, or the inverted-U hypothesis. Finally, cognitive performance was affected differentially by exercise mode. Cycling was associated with enhanced performance during and after exercise, whereas treadmill running led to impaired performance during exercise and a small improvement in performance following exercise. These results are indicative of the complex relation between exercise and cognition. Cognitive performance may be enhanced or impaired depending on when it is measured, the type of cognitive task selected, and the type of exercise performed. PMID:20381468
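As background for how mean effect sizes such as the -0.14 and 0.20 reported above are typically pooled, a minimal inverse-variance (fixed-effect) combination can be sketched. The per-study effects and variances below are hypothetical, and the paper itself uses meta-regression rather than this simple pooling:

```python
import math

def fixed_effect_mean(effects, variances):
    """Inverse-variance weighted mean effect size and its
    standard error under a fixed-effect model."""
    weights = [1.0 / v for v in variances]
    mean = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))  # SE of the pooled estimate
    return mean, se

# Hypothetical per-study standardized mean differences and variances.
effects = [0.25, 0.10, 0.32, 0.18]
variances = [0.02, 0.05, 0.03, 0.04]
mean, se = fixed_effect_mean(effects, variances)
print(round(mean, 3), round(se, 3))
```

Precise studies (small variances) dominate the pooled value, which is why a meta-analytic mean can differ noticeably from a simple average of the reported effects.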

  15. Effects of Task Analysis and Self-Monitoring for Children with Autism in Multiple Social Settings

    Science.gov (United States)

    Parker, Daniel; Kamps, Debra

    2011-01-01

    In this study, written task analyses with self-monitoring were used to teach functional skills and verbal interactions to two high-functioning students with autism in social settings with peers. A social script language intervention was included in two of the activities to increase the quantity of verbal interaction between the students and peers.…

  16. Reading during Sentence Composing and Error Correction: A Multilevel Analysis of the Influences of Task Complexity

    Science.gov (United States)

    Van Waes, Luuk; Leijten, Marielle; Quinlan, Thomas

    2010-01-01

    In this study we investigated the role of reading in how writers coordinate editing with other writing processes. In particular, the experiment examines how the cognitive demands of sentence composing and the type of error influence reading and writing performance. We devised an experimental writing task in which participants corrected an…

  17. Task analysis of information technology-mediated medication management in outpatient care

    NARCIS (Netherlands)

    van Stiphout, F.; Zwart-van Rijkom, J. E. F.; Maggio, L. A.; Aarts, J. E. C. M.; Bates, D. W.; van Gelder, T.; Jansen, P. A. F.; Schraagen, J. M. C.; Egberts, A. C. G.; ter Braak, E. W. M. T.

    2015-01-01

    Aims Educating physicians in the procedural as well as cognitive skills of information technology (IT)-mediated medication management could be one of the missing links for the improvement of patient safety. We aimed to compose a framework of tasks that need to be addressed to optimize medication man

  18. Development and Confirmatory Factory Analysis of the Achievement Task Value Scale for University Students

    Science.gov (United States)

    Lou, Yu-Chiung; Lin, Hsiao-Fang; Lin, Chin-Wen

    2013-01-01

    The aims of the study were (a) to develop a scale to measure university students' task value and (b) to use confirmatory factor analytic techniques to investigate the construct validity of the scale. The questionnaire items were developed based on theoretical considerations and the final version contained 38 items divided into 4 subscales.…

  19. Agriculture Technical Advisory Committee on Curriculum Development. Job Clusters, Competencies and Task Analysis.

    Science.gov (United States)

    Northern Montana Coll., Havre. Montana Center for Vocational Education, Research, Curriculum and Personnel Development.

    This skills inventory for agricultural occupations was developed by a technical committee in Montana to assist in the development of model curricula and to address state labor market needs. The committee included employers from the forestry industry, members from trade and professional associations, and educators. The validated task list and…

  20. Forestry Technical Advisory Committee on Curriculum Development. Job Clusters, Competencies and Task Analysis.

    Science.gov (United States)

    Northern Montana Coll., Havre. Montana Center for Vocational Education, Research, Curriculum and Personnel Development.

    This skills inventory for forestry occupations was developed by a technical committee in Montana to assist in the development of model curricula and to address state labor market needs. The committee included employers from the forestry industry, members from trade and professional associations, and educators. The validated task list and defined…

  1. Path Analysis Examining Self-Efficacy and Decision-Making Performance on a Simulated Baseball Task

    Science.gov (United States)

    Hepler, Teri J.; Feltz, Deborah L.

    2012-01-01

    The purpose of this study was to examine the relationship between decision-making self-efficacy and decision-making performance in sport. Undergraduate students (N = 78) performed 10 trials of a decision-making task in baseball. Self-efficacy was measured before performing each trial. Decision-making performance was assessed by decision speed and…

  2. Task Analysis of Medical Technology Administration and Supervision as a Foundation to a Curriculum Ladder.

    Science.gov (United States)

    Becan-McBride, Kathleen Elizabeth

    The administrative and supervisory competencies that a medical technology student should acquire before graduation were investigated. Selected medical technology laboratory supervisors and administrative technologists in the Houston-Galveston, Texas area were surveyed to determine the tasks performed by the medical technology laboratory…

  3. Task and person-focused leadership behaviors and team performance: A meta-analysis.

    NARCIS (Netherlands)

    Ceri-Booms, Meltem; Curseu, P.L.; Oerlemans, L.A.G.

    2017-01-01

    This paper reports the results of a meta-analytic review of the relationship between person and task oriented leader behaviors, on the one hand, and team performance, on the other hand. The results, based on 89 independent samples, show a moderate positive (ρ=.33) association between both types of l

  4. Iowa Gambling Task in patients with early-onset Parkinson’s disease: strategy analysis

    Czech Academy of Sciences Publication Activity Database

    Gescheidt, T.; Czekóová, Kristína; Urbánek, Tomáš; Mareček, R.; Mikl, M.; Kubíková, R.; Telecká, S.; Andrlová, H.; Husárová, I.; Bareš, M.

    2012-01-01

    Roč. 33, č. 6 (2012), s. 1329-1335. ISSN 1590-1874 R&D Projects: GA ČR(CZ) GAP407/12/2432 Institutional support: RVO:68081740 Keywords : Parkinson’s disease * decision making * Iowa gambling task * executive function Subject RIV: FL - Psychiatry, Sexuology Impact factor: 1.412, year: 2012

  5. Determination of bone mineral volume fraction using impedance analysis and Bruggeman model

    Energy Technology Data Exchange (ETDEWEB)

    Ciuchi, Ioana Veronica; Olariu, Cristina Stefania, E-mail: oocristina@yahoo.com; Mitoseriu, Liliana, E-mail: lmtsr@uaic.ro

    2013-11-20

    Highlights: • Mineral volume fraction of a bone sample was determined. • Dielectric properties of the bone sample and of type I collagen were determined by impedance spectroscopy. • The Bruggeman effective medium approximation was applied to evaluate the mineral volume fraction of the sample. • The computed values were compared with those derived from a histogram test performed on SEM micrographs. -- Abstract: Impedance spectroscopy measurements and the Bruggeman effective medium approximation model were employed to determine the mineral volume fraction of dry bone. This approach assumes that two or more phases are present in the composite: a matrix (environment) phase and one or more inclusion phases. A fragment of femur diaphysis dense bone from a young pig was investigated in its dehydrated state. By measuring the dielectric properties of the bone and of its main components (hydroxyapatite and collagen) and applying the Bruggeman approach, the mineral volume filling factor was determined. The computed mineral volume fraction was confirmed by a histogram test analysis based on the SEM microstructures. In spite of its simplicity, the method provides a good approximation of the bone mineral volume fraction. The method, which uses impedance spectroscopy and EMA modeling, can be further developed by considering the conductive components of the bone tissue as a non-invasive in situ impedance technique for bone composition evaluation and monitoring.
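For the symmetric two-phase Bruggeman equation, the inclusion volume fraction is linear in the measured effective permittivity, so the inversion is direct. A sketch under assumed, illustrative permittivity values (not the paper's measured ones; the forward solve is included only to round-trip the inversion):

```python
def bruggeman_fraction(eps_incl, eps_matrix, eps_eff):
    """Invert the symmetric two-phase Bruggeman equation
    f*(ei-ee)/(ei+2ee) + (1-f)*(em-ee)/(em+2ee) = 0  for f."""
    a = (eps_incl - eps_eff) / (eps_incl + 2 * eps_eff)
    b = (eps_matrix - eps_eff) / (eps_matrix + 2 * eps_eff)
    return b / (b - a)

def bruggeman_effective(eps_incl, eps_matrix, f, tol=1e-12):
    """Forward solve for eps_eff by bisection; the residual is
    strictly decreasing in eps_eff between the phase values."""
    lo, hi = min(eps_incl, eps_matrix), max(eps_incl, eps_matrix)
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        g = (f * (eps_incl - mid) / (eps_incl + 2 * mid)
             + (1 - f) * (eps_matrix - mid) / (eps_matrix + 2 * mid))
        if g > 0:          # residual positive: eps_eff guess too low
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Round trip with hypothetical permittivities standing in for
# hydroxyapatite inclusions (15.0) in a collagen matrix (3.0).
ee = bruggeman_effective(15.0, 3.0, 0.4)
print(round(bruggeman_fraction(15.0, 3.0, ee), 6))
```

In practice `eps_eff` comes from the impedance measurement and the two phase permittivities from measurements on the separated components, as the abstract describes.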

  6. Analysis of Load-Carrying Capacity for Redundant Free-Floating Space Manipulators in Trajectory Tracking Task

    OpenAIRE

    2014-01-01

    The aim of this paper is to analyze load-carrying capacity of redundant free-floating space manipulators (FFSM) in trajectory tracking task. Combined with the analysis of influential factors in load-carrying process, evaluation of maximum load-carrying capacity (MLCC) is described as multiconstrained nonlinear programming problem. An efficient algorithm based on repeated line search within discontinuous feasible region is presented to determine MLCC for a given trajectory of the end-effector ...

  7. Open Educational Resources from Performance Task using Video Analysis and Modeling - Tracker and K12 science education framework

    OpenAIRE

    Wee, Loo Kang

    2014-01-01

    This invited paper discusses why the physics performance task undertaken by grade 9 students in Singapore is worth participating in, for two reasons: 1) the video analysis and modeling resources are open access, licensed Creative Commons Attribution, advancing open educational resources in the world; and 2) it allows students to work like physicists, adopting the K12 science education framework. Personal reflections on how physics education can be made more meaningful in particular Practice 1: Ask Questions, Prac...

  8. Drive-Response Analysis of Global Ice Volume, CO2, and Insolation using Information Transfer

    Science.gov (United States)

    Brendryen, J.; Hannisdal, B.

    2014-12-01

    The processes and interactions that drive global ice volume variability and deglaciations are a topic of considerable debate. Here we analyze the drive-response relationships between data sets representing global ice volume, CO2 and insolation over the past 800 000 years using an information theoretic approach. Specifically, we use a non-parametric measure of directional information transfer (IT) based on the construct of transfer entropy to detect the relative strength and directionality of interactions in the potentially chaotic and non-linear glacial-interglacial climate system. Analyses of unfiltered data suggest a tight coupling between CO2 and ice volume, detected as strong, symmetric information flow consistent with a two-way interaction. In contrast, IT from Northern Hemisphere (NH) summer insolation to CO2 is highly asymmetric, suggesting that insolation is an important driver of CO2. Conditional analysis further suggests that CO2 is a dominant influence on ice volume, with the effect of insolation also being significant but limited to smaller-scale variability. However, the strong correlation between CO2 and ice volume renders them information redundant with respect to insolation, confounding further drive-response attribution. We expect this information redundancy to be partly explained by the shared glacial-interglacial "sawtooth" pattern and its overwhelming influence on the transition probability distributions over the target interval. To test this, we filtered out the abrupt glacial terminations from the ice volume and CO2 records to focus on the residual variability. Preliminary results from this analysis confirm insolation as a driver of CO2 and two-way interactions between CO2 and ice volume. However, insolation is reduced to a weak influence on ice volume. Conditional analyses support CO2 as a dominant driver of ice volume, while ice volume and insolation both have a strong influence on CO2. These findings suggest that the effect of orbital
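A minimal histogram estimator of transfer entropy for symbolic series, with history length 1, can be sketched as follows. The paper's non-parametric IT measure on continuous paleoclimate records is more involved; this toy example, with synthetic binary series, only illustrates the directionality property (information flows from the driver to the response, not back):

```python
from collections import Counter
import math
import random

def transfer_entropy(x, y):
    """Transfer entropy TE(X -> Y) in bits, history length 1:
    how much x[t] helps predict y[t+1] beyond y[t] alone."""
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))   # (y_{t+1}, y_t, x_t)
    pairs_yx = Counter(zip(y[:-1], x[:-1]))         # (y_t, x_t)
    pairs_yy = Counter(zip(y[1:], y[:-1]))          # (y_{t+1}, y_t)
    singles = Counter(y[:-1])                       # y_t
    n = len(x) - 1
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p = c / n
        p_cond_joint = c / pairs_yx[(y0, x0)]       # p(y1 | y0, x0)
        p_cond_self = pairs_yy[(y1, y0)] / singles[y0]  # p(y1 | y0)
        te += p * math.log2(p_cond_joint / p_cond_self)
    return te

random.seed(0)
x = [random.randint(0, 1) for _ in range(5000)]
y = [0] + x[:-1]            # y is a one-step delayed copy of x
print(transfer_entropy(x, y) > transfer_entropy(y, x))
```

Because y simply copies x with a lag, TE(X→Y) is close to one bit while TE(Y→X) is near zero, the asymmetry the abstract exploits to attribute drive-response direction.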

  9. Molecular modeling and structural analysis of two-pore domain potassium channels TASK1 interactions with the blocker A1899

    Directory of Open Access Journals (Sweden)

    David Mauricio Ramirez

    2015-03-01

    A1899 is a potent and highly selective blocker of the two-pore domain potassium (K2P) channel TASK-1. It acts as an antagonist, blocking the K+ flux, binds TASK-1 in the inner cavity, and is active in the nanomolar range. The drug travels through the central cavity and finally binds at the bottom of the selectivity filter, where it forms an H-bond network with several threonines and water molecules along with several hydrophobic interactions. Using alanine mutagenesis screens, the binding site was identified, involving residues in the P1 and P2 pore loops, the M2 and M4 transmembrane segments, and the halothane response element; mutations were introduced into human TASK-1 (KCNK3, NM_002246) expressed in oocytes from anesthetized Xenopus laevis frogs. Based on molecular modeling and structural analysis, including molecular docking and binding free energy calculations, a binding pose was suggested using TASK-1 homology models. Recently, various K2P crystal structures have been obtained. We therefore redefined, from a structural point of view, the binding mode of A1899 in TASK-1 homology models built using the K2P crystal structures as templates. By computational structural analysis we describe the molecular basis of the A1899 binding mode and how A1899 travels to its binding site, and we suggest an interacting pose (Figure 1). After 100 ns of molecular dynamics simulation (MDs) we found an intramolecular H-bond (80% of the total MDs), an H-bond with Thr93 (42% of the total MDs), a pi-pi stacking interaction between a ring and Phe125 (88% of the total MDs), and several water bridges. Our experimental and computational results allow a molecular understanding of the structural binding mechanism of the selective blocker A1899 to TASK-1 channels. We identified the common and divergent structural features of the TASK-1 channel through our theoretical and experimental studies of A1899 drug action.

  10. Scaling analysis for mixing in large stratified volumes of passive containment

    International Nuclear Information System (INIS)

    Integral testing plays a key role in assessing the feasibility of the passive containment cooling system (PCCS) and the accuracy of the calculation model. The scaling analysis for mixing in large stratified volumes of the PCCS provides the primary theoretical basis for determining the key dimensions of the integral test facility. Key parameters for mixing in large stratified volumes were obtained by scaling analysis based on the hierarchical two-tiered scaling method. The similarity criteria that ensure the integral test facility can faithfully simulate mixing in the passive containment were obtained. (authors)

  11. [Environmental investigation of ground water contamination at Wright- Patterson Air Force Base, Ohio]. Volume 4, Health and Safety Plan (HSP); Phase 1, Task 4 Field Investigation report: Draft

    Energy Technology Data Exchange (ETDEWEB)

    1991-10-01

    This Health and Safety Plan (HSP) was developed for the Environmental Investigation of Ground-water Contamination Investigation at Wright-Patterson Air Force Base near Dayton, Ohio, based on the projected scope of work for the Phase 1, Task 4 Field Investigation. The HSP describes hazards that may be encountered during the investigation, assesses the hazards, and indicates what type of personal protective equipment is to be used for each task performed. The HSP also addresses the medical monitoring program, decontamination procedures, air monitoring, training, site control, accident prevention, and emergency response.

  12. Investigation of advanced counterrotation blade configuration concepts for high speed turboprop systems. Task 2: Unsteady ducted propfan analysis computer program users manual

    Science.gov (United States)

    Hall, Edward J.; Delaney, Robert A.; Bettner, James L.

    1991-01-01

    The primary objective of this study was the development of a time-dependent three-dimensional Euler/Navier-Stokes aerodynamic analysis to predict unsteady compressible transonic flows about ducted and unducted propfan propulsion systems at angle of attack. The computer codes resulting from this study are referred to as Advanced Ducted Propfan Analysis Codes (ADPAC). This report is intended to serve as a computer program user's manual for the ADPAC developed under Task 2 of NASA Contract NAS3-25270, Unsteady Ducted Propfan Analysis. Aerodynamic calculations were based on a four-stage Runge-Kutta time-marching finite volume solution technique with added numerical dissipation. A time-accurate implicit residual smoothing operator was utilized for unsteady flow predictions. For unducted propfans, a single H-type grid was used to discretize each blade passage of the complete propeller. For ducted propfans, a coupled system of five grid blocks utilizing an embedded C-grid about the cowl leading edge was used to discretize each blade passage. Grid systems were generated by a combined algebraic/elliptic algorithm developed specifically for ducted propfans. Numerical calculations were compared with experimental data for both ducted and unducted propfan flows. The solution scheme demonstrated efficiency and accuracy comparable with other schemes of this class.
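
    The four-stage Runge-Kutta time-marching finite-volume update with added numerical dissipation can be illustrated on a 1-D scalar model problem. This is a schematic sketch of the scheme class, not the ADPAC implementation; the advection speed, dissipation coefficient, and grid are illustrative choices:

```python
import numpy as np

def residual(u, dx, a=1.0, eps=0.02):
    # Central-difference convective term plus a small second-difference
    # artificial-dissipation term (stabilises central schemes).
    flux = a * u
    dudx = (np.roll(flux, -1) - np.roll(flux, 1)) / (2 * dx)
    diss = eps * (np.roll(u, -1) - 2 * u + np.roll(u, 1)) / dx
    return -dudx + diss

def rk4_stage_march(u, dx, dt, steps):
    # Four-stage (Jameson-style) time marching: each stage restarts
    # from the step's initial state with a different stage coefficient.
    alphas = (1 / 4, 1 / 3, 1 / 2, 1.0)
    for _ in range(steps):
        u0 = u.copy()
        for a_k in alphas:
            u = u0 + a_k * dt * residual(u, dx)
    return u

n = 200
x = np.linspace(0, 1, n, endpoint=False)
u = np.exp(-100 * (x - 0.5) ** 2)          # Gaussian pulse on a periodic grid
dx = 1.0 / n
u_new = rk4_stage_march(u, dx, dt=0.4 * dx, steps=50)
```

    On the periodic grid both the convective and dissipation terms are conservative, so the integral of the solution is preserved while the pulse advects.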

  13. Investigation of advanced counterrotation blade configuration concepts for high speed turboprop systems. Task 5: Unsteady counterrotation ducted propfan analysis. Computer program user's manual

    Science.gov (United States)

    Hall, Edward J.; Delaney, Robert A.; Adamczyk, John J.; Miller, Christopher J.; Arnone, Andrea; Swanson, Charles

    1993-01-01

    The primary objective of this study was the development of a time-marching three-dimensional Euler/Navier-Stokes aerodynamic analysis to predict steady and unsteady compressible transonic flows about ducted and unducted propfan propulsion systems employing multiple blade rows. The computer codes resulting from this study are referred to as ADPAC-AOACR (Advanced Ducted Propfan Analysis Codes-Angle of Attack Coupled Row). This report is intended to serve as a computer program user's manual for the ADPAC-AOACR codes developed under Task 5 of NASA Contract NAS3-25270, Unsteady Counterrotating Ducted Propfan Analysis. The ADPAC-AOACR program is based on a flexible multiple blocked grid discretization scheme permitting coupled 2-D/3-D mesh block solutions with application to a wide variety of geometries. For convenience, several standard mesh block structures are described for turbomachinery applications. Aerodynamic calculations are based on a four-stage Runge-Kutta time-marching finite volume solution technique with added numerical dissipation. Steady flow predictions are accelerated by a multigrid procedure. Numerical calculations are compared with experimental data for several test cases to demonstrate the utility of this approach for predicting the aerodynamics of modern turbomachinery configurations employing multiple blade rows.

  14. Numerical analysis of volume holograms with spherical reference wave based on Born approximation

    Science.gov (United States)

    Yoshida, S.; Yamamoto, M.

    2013-05-01

    Holographic Data Storage (HDS) is one of the next-generation storage technologies that can achieve high data capacity and high data transfer rates. Since information is recorded three-dimensionally in a thick medium, the data capacity of HDS is not constrained by the diffraction limit. However, the behavior of a wavefront in an inhomogeneous thick medium is highly complex, and it is hard to treat its propagation analytically. Therefore, we establish a numerical technique for the analysis of volume holograms. The proposed technique is based on scalar diffraction theory, expressed as a volume integral equation. By applying the Born approximation and the angular spectrum method to the volume integral equation, the technique becomes applicable to various problems. We analyze the characteristics of a volume hologram with a spherical reference wave and confirm the effectiveness of the proposed technique. Compared to conventional techniques such as coupled wave analysis, the beam propagation method, and the finite-difference time-domain method, the proposed technique is applicable to a wide range of problems and is easy to implement. In this study, we show its effectiveness by applying it to the analysis of a volume hologram with a spherical reference wave. The proposed technique may become a tool for the design of HDS systems.
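
    The angular spectrum step at the core of the proposed technique propagates a sampled field by filtering its 2-D spectrum with the free-space transfer function. A minimal scalar sketch of a single propagation step (the aperture and sampling values are illustrative, not from the paper):

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, dx, z):
    """Propagate a sampled scalar field a distance z via the angular spectrum method."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=dx)           # spatial frequencies (cycles per unit)
    FX, FY = np.meshgrid(fx, fx)
    k = 2 * np.pi / wavelength
    # Keep propagating components only; evanescent waves are suppressed.
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    kz = k * np.sqrt(np.maximum(arg, 0.0))
    H = np.exp(1j * kz * z) * (arg > 0)
    return np.fft.ifft2(np.fft.fft2(field) * H)

# Example: propagate a circular aperture illuminated by a plane wave.
n, dx, wl = 256, 1e-6, 0.5e-6
x = (np.arange(n) - n // 2) * dx
X, Y = np.meshgrid(x, x)
aperture = (X ** 2 + Y ** 2 < (20e-6) ** 2).astype(complex)
out = angular_spectrum_propagate(aperture, wl, dx, z=100e-6)
```

    With this sampling, every resolvable frequency lies inside the propagating region, so the transfer function has unit modulus and energy is conserved.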

  15. Alternative methods for disposal of low-level radioactive wastes. Volume 3. Task 2b: technical requirements for aboveground vault disposal of low-level radioactive waste

    International Nuclear Information System (INIS)

    The study reported herein contains the results of Task 2b (Technical Requirements for Aboveground Vault Disposal of Low-Level Radioactive Waste) of a four-task study entitled ''Criteria for Evaluating Engineered Facilities.'' The overall objective of this study is to ensure that the criteria needed to evaluate five alternative low-level radioactive waste (LLW) disposal methods are available to potential license applicants. The aboveground vault disposal alternative is one of several methods that may be proposed for disposal of low-level radioactive waste. In this report, the term aboveground vault refers to an engineered structure with roof, walls, and floor enclosing the disposal space. The limited experience and knowledge gained with this method are described and updated in this report. The short-term experience does not conclusively demonstrate the capability of this method to satisfy the Part 61 Performance Objectives. A generic description of the features, components, and operation of an aboveground vault disposal facility is provided. Features and components that could enhance long-term performance are described. The applicability of existing criteria developed for near-surface disposal (10 CFR Part 61 Subpart D) to the aboveground vault disposal method, as assessed in Task 1, is reassessed herein. With few exceptions, these criteria were found to be applicable in the reassessment. These conclusions differ slightly from the Task 1 findings. 22 refs., 5 figs

  16. Space station Simulation Computer System (SCS) study for NASA/MSFC. Volume 5: Study analysis report

    Science.gov (United States)

    1989-01-01

    The Simulation Computer System (SCS) is the computer hardware, software, and workstations that will support the Payload Training Complex (PTC) at the Marshall Space Flight Center (MSFC). The PTC will train the space station payload scientists, station scientists, and ground controllers to operate the wide variety of experiments that will be on board the Freedom Space Station. This volume summarizes the further analysis performed on the SCS study as part of Task 2 (Perform Studies and Parametric Analysis) of the SCS study contract. These analyses were performed to resolve issues remaining open after the completion of Task 1 and the publishing of the SCS study issues report. The results of these studies provide inputs into SCS Task 3 (Develop and Present SCS Requirements) and SCS Task 4 (Develop SCS Conceptual Designs). The purpose of these studies is to resolve the issues into usable requirements given the best information available at the time of the study. A list of all the SCS study issues is given.

  17. EEG Analysis during complex diagnostic tasks in Nuclear Power Plants - Simulator-based Experimental Study

    International Nuclear Information System (INIS)

    In the literature there are many studies based on EEG signals recorded during cognitive activities of human beings, but most of them deal with simple activities such as transforming letters into Morse code, subtraction, reading, semantic memory search, visual search, or memorizing a set of words. In this work, EEG signals were analyzed during complex diagnostic tasks in an NPP simulator-based environment. We investigate the theta, alpha, beta, and gamma band EEG powers during the diagnostic tasks. The experimental design and procedure are presented in section 2 and the results are shown in section 3. Finally, some considerations are discussed and directions for further work are proposed in section 4
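
    Band powers of the kind investigated here are commonly estimated from a periodogram integrated over each frequency band. A minimal sketch, assuming conventional band edges (the paper's exact definitions may differ):

```python
import numpy as np

BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}

def band_powers(signal, fs):
    """Absolute power per EEG band from a simple periodogram."""
    n = len(signal)
    freqs = np.fft.rfftfreq(n, d=1 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / (fs * n)   # one-sided periodogram
    df = freqs[1] - freqs[0]
    return {name: psd[(freqs >= lo) & (freqs < hi)].sum() * df
            for name, (lo, hi) in BANDS.items()}

# Synthetic check: a 10 Hz sinusoid should land in the alpha band.
fs = 256
t = np.arange(fs * 4) / fs
powers = band_powers(np.sin(2 * np.pi * 10 * t), fs)
```

    In practice Welch averaging over epochs would be used rather than a single periodogram, but the band-integration step is the same.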

  18. Analysis of the financial task generated by the construction of a nuclear power plant in Mexico

    International Nuclear Information System (INIS)

    The construction of new nuclear reactors requires a large investment, making them capital-intensive projects that need a minimum of 5 years for construction. The financial burden they represent for the electric utility is of vital importance; in other countries it keeps private companies out of this type of project unless they have the support of their governments. In the case of Mexico, a vertically integrated electric utility can obtain financing to carry out this type of investment. This study analyzes the viability of constructing new nuclear reactors in Mexico based on the financial burden that their construction represents for the electric utility. (Author)

  19. Task-Team-Process: Assessment and Analysis of the Development of Shared Representations in an Engineering Team

    DEFF Research Database (Denmark)

    Petra, Badke-Schaub; Lauche, Kristine; Neumann, Andre;

    2007-01-01

    In this article, an analysis of the development of team mental models in two engineering meetings is described. The authors present a two-stage model of the development of sharedness in teams, which formed the basis for a communication analysis of both meetings. The transcripts of the meetings were categorised referring to underlying cognitive acts and design strategies. The results are largely consistent with the assumptions of the model, indicating a lack of sharedness. This was confirmed by changes of frequencies linked to task-, team-, and process-related cognitive acts within and between the two...

  20. Engineering task plan for development, fabrication, and deployment of nested, fixed depth fluidic sampling and at-tank analysis systems

    Energy Technology Data Exchange (ETDEWEB)

    REICH, F.R.

    1999-05-18

    An engineering task plan was developed that presents the resources, responsibilities, and schedules for the development, test, and deployment of the nested, fixed-depth fluidic sampling and at-tank analysis system. The sampling system, deployed in the privatization contract double-shell tank feed tank, will provide waste samples for assuring the readiness of the tank for shipment to the privatization contractor for vitrification. The at-tank analysis system will provide ''real-time'' assessments of the sampled wastes' chemical and physical properties. These systems support the Hanford Phase 1B Privatization Contract.

  1. Engineering task plan for development, fabrication, and deployment of nested, fixed depth fluidic sampling and at-tank analysis systems

    International Nuclear Information System (INIS)

    An engineering task plan was developed that presents the resources, responsibilities, and schedules for the development, test, and deployment of the nested, fixed-depth fluidic sampling and at-tank analysis system. The sampling system, deployed in the privatization contract double-shell tank feed tank, will provide waste samples for assuring the readiness of the tank for shipment to the privatization contractor for vitrification. The at-tank analysis system will provide ''real-time'' assessments of the sampled wastes' chemical and physical properties. These systems support the Hanford Phase 1B Privatization Contract

  2. Analysis of dual-task elderly gait in fallers and non-fallers using wearable sensors.

    Science.gov (United States)

    Howcroft, Jennifer; Kofman, Jonathan; Lemaire, Edward D; McIlroy, William E

    2016-05-01

    Dual-task (DT) gait involves walking while simultaneously performing an attention-demanding task and can be used to identify impaired gait or executive function in older adults. Advancement is needed in techniques that quantify the influence of dual tasking to improve predictive and diagnostic potential. This study investigated the viability of wearable sensor measures to identify DT gait changes in older adults and distinguish between elderly fallers and non-fallers. A convenience sample of 100 older individuals (75.5±6.7 years; 76 non-fallers, 24 fallers based on 6-month retrospective fall occurrence) walked 7.62 m under single-task (ST) and DT conditions while wearing pressure-sensing insoles and tri-axial accelerometers at the head, pelvis, and left and right shanks. Differences between ST and DT gait were identified for temporal measures, acceleration descriptive statistics, Fast Fourier Transform (FFT) quartiles, ratio of even to odd harmonics, center of pressure (CoP) stance path coefficient of variation, and deviations from the expected CoP stance path. Increased posterior CoP stance path deviations, increased coefficient of variation, decreased FFT quartiles, and a decreased ratio of even to odd harmonics suggested increased DT gait variability. Decreased gait velocity and decreased acceleration standard deviations (SD) at the pelvis and shanks could represent compensatory gait strategies that maintain stability. Differences in acceleration between fallers and non-fallers were identified in head posterior SD and pelvis AP ratio of even to odd harmonics during ST gait, and in pelvis vertical maximum Lyapunov exponent during DT gait. Wearable-sensor-based DT gait assessments could be used in point-of-care environments to identify gait deficits. PMID:26994786
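
    Two of the measures used above, the coefficient of variation and FFT quartiles, can be sketched as follows; the stride-time and signal values are illustrative, not study data:

```python
import numpy as np

def coefficient_of_variation(stride_times):
    """CoV (%) of stride times: a common gait-variability index."""
    s = np.asarray(stride_times, dtype=float)
    return 100.0 * s.std(ddof=1) / s.mean()

def fft_quartile(accel, fs, q=0.25):
    """Frequency below which a fraction q of the signal's spectral power lies."""
    freqs = np.fft.rfftfreq(len(accel), d=1 / fs)
    power = np.abs(np.fft.rfft(accel - np.mean(accel))) ** 2
    cum = np.cumsum(power) / power.sum()
    return freqs[np.searchsorted(cum, q)]

st_times = [1.02, 1.05, 0.98, 1.10, 1.00, 1.04]   # single-task stride times (s)
dt_times = [1.02, 1.15, 0.90, 1.20, 0.95, 1.12]   # dual-task stride times (s)
cv_st = coefficient_of_variation(st_times)
cv_dt = coefficient_of_variation(dt_times)

# A 1 Hz sinusoid concentrates its power at 1 Hz, so every quartile sits there.
fq = fft_quartile(np.sin(2 * np.pi * 1.0 * np.arange(1000) / 100.0), fs=100)
```

    The wider dual-task spread yields the larger CoV, matching the increased DT gait variability described above.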

  3. Analysis of eye and head coordination in a visual peripheral recognition task

    OpenAIRE

    Schwab, Simon; Würmle, Othmar; Altorfer, Andreas

    2012-01-01

    Coordinated eye and head movements simultaneously occur to scan the visual world for relevant targets. However, measuring both eye and head movements in experiments allowing natural head movements may be challenging. This paper provides an approach to study eye-head coordination: First, we demonstrate the capabilities and limits of the eye-head tracking system used, and compare it to other technologies. Second, a behavioral task is introduced to invoke eye-head coordination. Third, a meth...

  4. Brain wave correlates of attentional states: Event related potentials and quantitative EEG analysis during performance of cognitive and perceptual tasks

    Science.gov (United States)

    Freeman, Frederick G.

    1993-01-01

    presented target stimulus. In addition to the task requirements, irrelevant tones were presented in the background. Research has shown that even though these stimuli are not attended, ERP's to them can still be elicited. The amplitude of the ERP waves has been shown to change as a function of a person's level of alertness. ERP's were also collected and analyzed for the target stimuli for each task. Brain maps were produced based on the ERP voltages for the different stimuli. In addition to the ERP's, a quantitative EEG (QEEG) was performed on the data using a fast Fourier technique to produce a power spectral analysis of the EEG. This analysis was conducted on the continuous EEG while the subjects were performing the tasks. Finally, a QEEG was performed on periods during the task when subjects indicated that they were in an altered state of awareness. During the tasks, subjects were asked to indicate by pressing a button when they realized their level of task awareness had changed. EEG epochs were collected for times just before and just after subjects made this response. The purpose of this final analysis was to determine whether or not subjective indices of level of awareness could be correlated with different patterns of EEG.

  5. Analysis of Load-Carrying Capacity for Redundant Free-Floating Space Manipulators in Trajectory Tracking Task

    Directory of Open Access Journals (Sweden)

    Qingxuan Jia

    2014-01-01

    Full Text Available The aim of this paper is to analyze the load-carrying capacity of redundant free-floating space manipulators (FFSM) in a trajectory tracking task. Combined with an analysis of the influential factors in the load-carrying process, evaluation of the maximum load-carrying capacity (MLCC) is formulated as a multiconstrained nonlinear programming problem. An efficient algorithm based on repeated line search within a discontinuous feasible region is presented to determine MLCC for a given trajectory of the end-effector and the corresponding joint path. Then, considering how MLCC is influenced by the initial configuration at the starting point of the given trajectory, a maximum-payload initial configuration planning method is proposed using a PSO algorithm. Simulations are performed for a particular trajectory tracking task of a 7-DOF space manipulator, for which MLCC is evaluated quantitatively. In-depth examination of the simulation results shows a significant gap between the values of MLCC obtained with different initial configurations and illustrates the discontinuity of the allowable load-carrying capacity. The proposed analytical method can serve as a theoretical foundation for feasibility analysis, trajectory optimization, and optimal control of trajectory tracking tasks in on-orbit load-carrying operations.
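
    A particle swarm optimiser of the kind used for the initial-configuration planning can be sketched generically. The payload surrogate below is a hypothetical stand-in for the manipulator's MLCC evaluation, and all parameter values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def pso_maximize(f, bounds, n_particles=30, iters=100, w=0.7, c1=1.5, c2=1.5):
    """Minimal particle swarm optimiser (maximisation over a box)."""
    lo, hi = bounds[:, 0], bounds[:, 1]
    x = rng.uniform(lo, hi, (n_particles, len(lo)))
    v = np.zeros_like(x)
    pbest, pbest_val = x.copy(), np.array([f(p) for p in x])
    g = pbest[pbest_val.argmax()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)   # velocity update
        x = np.clip(x + v, lo, hi)
        vals = np.array([f(p) for p in x])
        better = vals > pbest_val
        pbest[better], pbest_val[better] = x[better], vals[better]
        g = pbest[pbest_val.argmax()].copy()
    return g, float(pbest_val.max())

# Hypothetical surrogate: payload capacity as a smooth function of two joint angles.
payload = lambda q: -(q[0] - 0.5) ** 2 - (q[1] + 0.3) ** 2 + 10.0
bounds = np.array([[-np.pi, np.pi], [-np.pi, np.pi]])
q_best, c_best = pso_maximize(payload, bounds)
```

    In the paper's setting, evaluating a candidate initial configuration involves the full MLCC computation along the trajectory rather than a closed-form surrogate.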

  6. Team situation awareness in nuclear power plant process control: A literature review, task analysis and future research

    International Nuclear Information System (INIS)

    Operator achievement and maintenance of situation awareness (SA) in nuclear power plant (NPP) process control has emerged as an important concept in defining effective relationships between humans and automation in this complex system. A literature review on factors influencing SA revealed several variables to be important to team SA, including the overall task and team goals, individual tasks, team member roles, and the team members themselves. Team SA can also be adversely affected by a range of factors, including stress, mental over- or under-loading, system design (including human-machine interface design), complexity, human error in perception, and automation. Our research focused on the analysis of 'shared' SA and team SA among an assumed three-person, main-control-room team. Shared SA requirements represent the knowledge that is held in common by NPP operators, and team SA represents the collective, unique knowledge of all operators. The paper describes an approach to goal-directed task analysis (GDTA) applied to NPP main control room operations. In general, the GDTA method reveals critical operator decision and information requirements. It identifies operator SA requirements relevant to performing complex systems control. The GDTA can reveal requirements at various levels of cognitive processing, including perception, comprehension and projection, in NPP process control. Based on the literature review and GDTA approach, a number of potential research issues are proposed with an aim toward understanding and facilitating team SA in NPP process control. (authors)

  7. A Work-Demand Analysis Compatible with Preemption-Aware Scheduling for Power-Aware Real-Time Tasks

    Directory of Open Access Journals (Sweden)

    Da-Ren Chen

    2013-01-01

    Full Text Available Due to the importance of slack-time utilization for power-aware scheduling algorithms, we propose a work-demand analysis method called the parareclamation algorithm (PRA) to increase the slack-time utilization of existing real-time DVS algorithms. PRA is an online scheduling method for power-aware real-time tasks under the rate-monotonic (RM) policy. It can be implemented fully compatibly with preemption-aware or transition-aware scheduling algorithms without increasing their computational complexities. The key technique of the heuristic doubles the analysis interval and turns deferrable workload into potential slack time. Theoretical proofs show that PRA guarantees the task deadlines in a feasible RM schedule and takes linear time and space. Experimental results indicate that the proposed method, combined seamlessly with preemption-aware methods, reduces energy consumption by 14% on average relative to the original algorithms.
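
    The slack in a feasible RM schedule, which a DVS scheme like PRA reclaims, can be quantified with standard fixed-priority response-time analysis. A minimal sketch with an illustrative task set (PRA's own interval-doubling analysis is not reproduced here):

```python
from math import ceil

def response_time(tasks, i):
    """Exact response-time analysis for fixed-priority (RM) tasks.
    tasks: list of (C, T) pairs sorted by priority (shortest period first).
    Returns the worst-case response time of task i, or None on a deadline miss."""
    C, T = tasks[i]
    r = C
    while True:
        # Interference from all higher-priority tasks released within [0, r).
        interference = sum(ceil(r / Tj) * Cj for Cj, Tj in tasks[:i])
        r_new = C + interference
        if r_new == r:
            return r
        if r_new > T:            # deadline (= period) exceeded
            return None
        r = r_new

tasks = [(1, 4), (2, 6), (3, 12)]            # (WCET, period), RM priority order
rts = [response_time(tasks, i) for i in range(len(tasks))]
slacks = [T - r for (C, T), r in zip(tasks, rts)]
```

    The per-task slack (deadline minus worst-case response time) bounds how much execution can be slowed or deferred for energy savings without missing deadlines.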

  8. CITY OF TAMPA MANAGEMENT ANALYSIS AND REPORT SYSTEM (MARS). VOLUME 2. OPERATIONS MANUAL

    Science.gov (United States)

    This three-volume report describes the development and implementation of a management analysis and report system (MARS) in the Tampa, Florida, Water and Sanitary Sewer Departments. Original system development was based on research conducted in a smaller water utility in Kenton Co...

  9. CITY OF TAMPA MANAGEMENT ANALYSIS AND REPORT SYSTEM (MARS). VOLUME 1. CASE STUDY

    Science.gov (United States)

    This three-volume report describes the development and implementation of a management analysis and report system (MARS) in the Tampa, Florida, Water and Sanitary Sewer Departments. Original system development was based on research conducted in a smaller water utility in Kenton Co...

  10. CITY OF TAMPA MANAGEMENT ANALYSIS AND REPORT SYSTEM (MARS). VOLUME 3. PROGRAMMING MANUAL

    Science.gov (United States)

    This three-volume report describes the development and implementation of a management analysis and report system (MARS) in the Tampa, Florida, Water and Sanitary Sewer Departments. MARS will help both the Water and Sanitary Sewer Departments control costs and manage expanding ser...

  11. Geotechnical Analysis Report for July 2004 - June 2005, Volume 2, Supporting Data

    International Nuclear Information System (INIS)

    This report is a compilation of geotechnical data presented as plots for each active instrument installed in the underground at the Waste Isolation Pilot Plant (WIPP) through June 30, 2005. A summary of the geotechnical analyses that were performed using the enclosed data is provided in Volume 1 of the Geotechnical Analysis Report (GAR).

  12. Supply-demand analysis. Volume II. of the offshore oil industry support craft market. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Durfee, J.H.

    1977-10-01

    Volume Two of this report presents a description of the market for offshore petroleum industry support craft and an analysis of that market. Financial performance of five major operating companies is described. A forecast of support craft supply and demand for 1977, 1982, and 1987 is included.

  13. Meta-analysis: Effects of glycerol administration on plasma volume, haemoglobin, and haematocrit.

    Science.gov (United States)

    Koehler, Karsten; Thevis, Mario; Schaenzer, Wilhelm

    2013-01-01

    The use of glycerol in combination with excess fluid can be used to increase total body water. Because glycerol hyperhydration may also be misused to mask the effects of blood doping on doping-relevant parameters, namely haemoglobin and haematocrit, glycerol has been prohibited by the World Anti-Doping Agency since 2010. In order to test this rationale, the purpose of this meta-analysis was to quantify the effects of glycerol hyperhydration on plasma volume, haemoglobin, and haematocrit in comparison to administration of fluid only. Following a literature search, a total of seven studies was included and meta-analyses were performed separately for the effects on plasma volume (5 studies, total n = 54) and on haemoglobin (6 studies, n = 52) and haematocrit (6 studies, n = 52). The meta-analysis revealed that the increase in plasma volume was 3.3% larger (95%-CI: 1.1-5.5%) after glycerol administration when compared to fluid only. Reductions in haemoglobin were 0.2 g/dl (95%-CI: -0.3, 0.0) larger and there was no difference in the changes in haematocrit between glycerol and fluid administration (95%-CI: -0.7-0.8%). In comparison with other plasma-volume expanding agents, glycerol hyperhydration has a very limited potential in increasing plasma volume and altering doping-relevant blood parameters. PMID:24353192
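
    The pooling underlying such a meta-analysis is typically an inverse-variance weighted average of per-study effects. A minimal fixed-effect sketch with hypothetical study values (not the paper's data):

```python
import numpy as np

def fixed_effect_pool(effects, ses):
    """Inverse-variance fixed-effect pooled estimate and 95% confidence interval."""
    effects = np.asarray(effects, dtype=float)
    ses = np.asarray(ses, dtype=float)
    w = 1.0 / ses ** 2                        # weight = inverse of variance
    pooled = float((w * effects).sum() / w.sum())
    se = float(np.sqrt(1.0 / w.sum()))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Hypothetical per-study plasma-volume differences (%) and standard errors.
pooled, ci = fixed_effect_pool([2.8, 4.1, 3.0, 3.9, 2.5], [1.2, 1.5, 1.0, 1.8, 1.4])
```

    A random-effects model would additionally estimate between-study variance (e.g. DerSimonian-Laird) and inflate the weights' denominators accordingly.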

  14. Doppler sonography of diabetic feet: Quantitative analysis of blood flow volume

    Energy Technology Data Exchange (ETDEWEB)

    Seo, Young Lan; Kim, Ho Chul; Choi, Chul Soon; Yoon, Dae Young; Han, Dae Hee; Moon, Jeung Hee; Bae, Sang Hoon [Hallym University College of Medicine, Seoul (Korea, Republic of)

    2002-09-15

    To analyze Doppler sonographic findings of diabetic feet by estimating quantitative blood flow volume and by analyzing Doppler waveforms. Doppler sonography was performed in thirty-four patients (10 diabetic patients with foot ulceration, 14 diabetic patients without ulceration, and 10 normal subjects as the control group) to measure the flow volume of the arteries of the lower extremities (posterior and anterior tibial arteries, and distal femoral artery). Analysis of Doppler waveforms was also done to evaluate the nature of the changed blood flow volume of diabetic patients; the waveforms were classified into triphasic, biphasic-1, biphasic-2, and monophasic patterns. Flow volume of arteries in diabetic patients with foot ulceration was increased with statistical significance when compared to that of diabetic patients without foot ulceration or that of the normal control group (P<0.05). Analysis of Doppler waveforms revealed that the frequency of the biphasic-2 pattern was significantly higher in diabetic patients than in the normal control group (p<0.05). Doppler sonography in diabetic feet showed increased flow volume and biphasic Doppler waveforms; these findings suggest neuropathy rather than ischemic change in diabetic feet.
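
    The quantitative flow volume estimated here is conventionally the product of the time-averaged mean velocity and the vessel's cross-sectional area. A minimal sketch with illustrative values (not patient data):

```python
import math

def flow_volume_ml_min(tamv_cm_s, diameter_mm):
    """Volume flow from time-averaged mean velocity (TAMV) and vessel diameter:
    Q = TAMV * cross-sectional area, the usual duplex-Doppler estimate."""
    radius_cm = diameter_mm / 20.0           # mm diameter -> cm radius
    area_cm2 = math.pi * radius_cm ** 2
    return tamv_cm_s * area_cm2 * 60.0       # cm^3/s -> ml/min

# Illustrative values for a small distal artery.
q = flow_volume_ml_min(tamv_cm_s=4.0, diameter_mm=2.5)
```

    The estimate assumes a circular lumen and an angle-corrected velocity measurement, the standard assumptions in duplex Doppler flow quantification.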

  15. Comparison of gray matter volume and thickness for analysis of cortical changes in Alzheimer's disease

    Science.gov (United States)

    Liu, Jiachao; Li, Ziyi; Chen, Kewei; Yao, Li; Wang, Zhiqun; Li, Kunchen; Guo, Xiaojuan

    2011-03-01

    Gray matter volume and cortical thickness are two indices of interest in brain structural magnetic resonance imaging research. Gray matter volume reflects mixed measurement information about the cerebral cortex, while cortical thickness reflects only the distance between the inner and outer cortical surfaces. Using Scaled Subprofile Modeling based on Principal Component Analysis (SSM_PCA) and Pearson's correlation analysis, this study provided quantitative comparisons and depicted both global and local relevance to comprehensively investigate morphometric abnormalities of the cerebral cortex in Alzheimer's disease (AD). Thirteen patients with AD and thirteen age- and gender-matched healthy controls were included in this study. Results showed that factor scores from the first 8 principal components accounted for ~53.38% of the total variance for gray matter volume, and ~50.18% for cortical thickness. Factor scores from the fifth principal component showed significant correlation. In addition, gray matter voxel-based volume was closely related to cortical thickness alterations in most of the cortex, especially in typically abnormal brain regions such as the insula and the parahippocampal gyrus in AD. These findings suggest that the two measurements are effective indices for understanding the neuropathology of AD. Studies using both gray matter volume and cortical thickness can separate the causes of discrepancies, provide complementary information, and give a comprehensive description of the morphological changes of brain structure.
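
    The variance fractions quoted above (about 53% and 50% for the first 8 components) come from a principal component decomposition. The fraction of variance captured by the first k components can be sketched as follows, on synthetic stand-in data rather than the study's morphometric measures:

```python
import numpy as np

rng = np.random.default_rng(1)

def pca_explained_variance(X, k):
    """Fraction of total variance captured by the first k principal components."""
    Xc = X - X.mean(axis=0)
    # Singular values relate to component variances: var_i = s_i^2 / (n - 1).
    s = np.linalg.svd(Xc, full_matrices=False, compute_uv=False)
    var = s ** 2
    return float(var[:k].sum() / var.sum())

# Synthetic stand-in for regional gray-matter volumes (26 subjects x 40 regions).
X = rng.normal(size=(26, 40))
X[:, :5] += 3.0 * rng.normal(size=(26, 1))   # a shared factor inflates the first PC
frac = pca_explained_variance(X, 8)
```

    In SSM_PCA the same decomposition is applied to log-transformed, subject- and region-centered data, and the resulting subject factor scores are then correlated (e.g. with Pearson's r) across the two measurement types.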

  16. Oak Ridge Reservation volume I. Y-12 mercury task force files: A guide to record series of the Department of Energy and its contractors

    International Nuclear Information System (INIS)

    The purpose of this guide is to describe each of the series of records identified in the documents of the Y-12 Mercury Task Force Files that pertain to the use of mercury in the separation and enrichment of lithium isotopes at the Department of Energy's (DOE) Y-12 Plant in Oak Ridge, Tennessee. History Associates Incorporated (HAI) prepared this guide as part of DOE's Epidemiologic Records Inventory Project, which seeks to verify and conduct inventories of epidemiologic and health-related records at various DOE and DOE contractor sites. This introduction briefly describes the Epidemiologic Records Inventory Project and HAI's role in the project. Specific attention will be given to the history of the DOE-Oak Ridge Reservation, the development of the Y-12 Plant, and the use of mercury in the production of nuclear weapons during the 1950s and early 1960s. This introduction provides background information on the Y-12 Mercury Task Force Files, an assembly of documents resulting from the 1983 investigation of the Mercury Task Force into the effects of mercury toxicity upon workplace hygiene and worker health, the unaccountable loss of mercury, and the impact of those losses upon the environment. This introduction also explains the methodology used in the selection and inventory of these record series. Other topics include the methodology used to produce this guide, the arrangement of the detailed record series descriptions, and information concerning access to the collection

  17. Oak Ridge Reservation volume I. Y-12 mercury task force files: A guide to record series of the Department of Energy and its contractors

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-02-17

    The purpose of this guide is to describe each of the series of records identified in the documents of the Y-12 Mercury Task Force Files that pertain to the use of mercury in the separation and enrichment of lithium isotopes at the Department of Energy's (DOE) Y-12 Plant in Oak Ridge, Tennessee. History Associates Incorporated (HAI) prepared this guide as part of DOE's Epidemiologic Records Inventory Project, which seeks to verify and conduct inventories of epidemiologic and health-related records at various DOE and DOE contractor sites. This introduction briefly describes the Epidemiologic Records Inventory Project and HAI's role in the project. Specific attention will be given to the history of the DOE-Oak Ridge Reservation, the development of the Y-12 Plant, and the use of mercury in the production of nuclear weapons during the 1950s and early 1960s. This introduction provides background information on the Y-12 Mercury Task Force Files, an assembly of documents resulting from the 1983 investigation of the Mercury Task Force into the effects of mercury toxicity upon workplace hygiene and worker health, the unaccountable loss of mercury, and the impact of those losses upon the environment. This introduction also explains the methodology used in the selection and inventory of these record series. Other topics include the methodology used to produce this guide, the arrangement of the detailed record series descriptions, and information concerning access to the collection.

  18. Analysis of Occupants’ Visual Perception to Refine Indoor Lighting Environment for Office Tasks

    Directory of Open Access Journals (Sweden)

    Ji-Hyun Lee

    2014-06-01

    The combined effects of color temperature and illuminance in a small office on visual response and mood under various lighting conditions were examined in this study. Visual annoyance tests were conducted using a sample of 20 subjects in a full-scale mock-up test space. Computer and paper-based reading tasks were conducted for 500 lx and 750 lx illuminance levels under 3,000 K, 4,000 K and 6,500 K conditions. Two hypotheses were considered for the test in this study. The primary hypothesis was that visual perception is affected by the color temperatures of light sources. The secondary hypothesis was that better moods, such as relaxed and cozy feelings, are associated with low color temperatures given equal illuminance levels. The visual environment under the 3,000 K condition was characterized by glare and brightness, resulting in visual discomfort when target illuminance was higher than 500 lx. Occupants preferred 500 lx under the 6,500 K condition, and 500 lx and 750 lx under the 4,000 K condition, reporting better visual satisfaction when performing office tasks. Prediction models for visual comfort suggest that the less subjects are visually bothered by light during tasks, the more visual comfort they feel. User satisfaction with light source color is critical for the prediction of visual comfort under different lighting conditions. Visual comfort was the most influential factor on mood. Lower color temperature was associated with better mood at lower illuminance levels, while higher color temperature was preferred at higher illuminance levels.

  19. Human Factors Process Task Analysis Liquid Oxygen Pump Acceptance Test Procedure for the Advanced Technology Development Center

    Science.gov (United States)

    Diorio, Kimberly A.

    2002-01-01

    A process task analysis effort was undertaken by Dynacs Inc. commencing in June 2002 under contract from NASA YA-D6. Funding was provided through NASA's Ames Research Center (ARC), Code M/HQ, and Industrial Engineering and Safety (IES). The John F. Kennedy Space Center (KSC) Engineering Development Contract (EDC) Task Order was 5SMA768. The scope of the effort was to conduct a Human Factors Process Failure Modes and Effects Analysis (HF PFMEA) of a hazardous activity and provide recommendations to eliminate or reduce the effects of errors caused by human factors. The Liquid Oxygen (LOX) Pump Acceptance Test Procedure (ATP) was selected for this analysis. The HF PFMEA table (see appendix A) provides an analysis of six major categories evaluated for this study. These categories include Personnel Certification, Test Procedure Format, Test Procedure Safety Controls, Test Article Data, Instrumentation, and Voice Communication. For each specific requirement listed in appendix A, the following topics were addressed: Requirement, Potential Human Error, Performance-Shaping Factors, Potential Effects of the Error, Barriers and Controls, Risk Priority Numbers, and Recommended Actions. This report summarizes findings and gives recommendations as determined by the data contained in appendix A. It also includes a discussion of technology barriers and challenges to performing task analyses, as well as lessons learned. The HF PFMEA table in appendix A recommends the use of accepted and required safety criteria in order to reduce the risk of human error. The items with the highest risk priority numbers should receive the greatest amount of consideration. Implementation of the recommendations will result in a safer operation for all personnel.
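
The Risk Priority Numbers used above to rank findings are, in conventional FMEA practice, the product of severity, occurrence, and detection ratings. A minimal sketch of that ranking step, assuming the standard 1-10 scales; the items and ratings below are invented illustrations, not values from the report's appendix A:

```python
# Hedged sketch of FMEA-style Risk Priority Number (RPN) ranking:
# RPN = severity x occurrence x detection, each conventionally rated 1-10.

def rpn(severity: int, occurrence: int, detection: int) -> int:
    """Risk Priority Number on conventional 1-10 scales."""
    for v in (severity, occurrence, detection):
        if not 1 <= v <= 10:
            raise ValueError("ratings must be in 1..10")
    return severity * occurrence * detection

# Hypothetical findings, loosely echoing the categories named in the report.
items = [
    ("voice communication drop-out", rpn(8, 4, 6)),
    ("test procedure format ambiguity", rpn(5, 6, 3)),
    ("instrumentation mislabeling", rpn(7, 3, 4)),
]

# Items with the highest RPN receive the greatest consideration.
for name, score in sorted(items, key=lambda kv: kv[1], reverse=True):
    print(f"{score:4d}  {name}")
```

Sorting by descending RPN reproduces the report's prioritization rule: address the highest-risk items first.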

  20. A distributional analysis of the effect of physical exercise on a choice reaction time task.

    Science.gov (United States)

    Davranche, Karen; Audiffren, Michel; Denjean, André

    2006-03-01

    The aim of this study was to examine the facilitating effects of physical exercise on the reaction process. Eleven participants with specific expertise in decision-making sports performed a choice reaction time task during moderate sub-maximal exercise (90% of their ventilatory threshold power). Participants were tested at rest and while cycling. During exercise, the participants were faster, without being more variable. We suggest that the effect of exercise on cognitive performance was due to a generalized improvement across the whole distribution of response times; although the benefit was small, it was consistent throughout the entire range of reaction times. PMID:16368641

  1. On-The-Job Tasks and Performance Pay: A Vacancy-Level Analysis

    OpenAIRE

    2010-01-01

    Drawing on a dataset of job openings posted at an online job board, the authors find that employers are less likely to offer performance-based pay when a job entails multitasking, quality control, or team work than when a job does not entail these tasks. This finding is consistent with the notion that when employers have difficulty measuring a worker’s overall performance at jobs with these dimensions, they offer weaker performance incentives to ensure that workers allocate their efforts to e...

  2. Organizational analysis and safety for utilities with nuclear power plants: an organizational overview. Volume 1

    International Nuclear Information System (INIS)

    This two-volume report presents the results of initial research on the feasibility of applying organizational factors in nuclear power plant (NPP) safety assessment. A model is introduced for the purposes of organizing the literature review and showing key relationships among identified organizational factors and nuclear power plant safety. Volume I of this report contains an overview of the literature, a discussion of available safety indicators, and a series of recommendations for more systematically incorporating organizational analysis into investigations of nuclear power plant safety

  3. Improving the clinical correlation of multiple sclerosis black hole volume change by paired-scan analysis.

    Science.gov (United States)

    Tam, Roger C; Traboulsee, Anthony; Riddehough, Andrew; Li, David K B

    2012-01-01

    The change in T1-hypointense lesion ("black hole") volume is an important marker of pathological progression in multiple sclerosis (MS). Black hole boundaries often have low contrast and are difficult to determine accurately and most (semi-)automated segmentation methods first compute the T2-hyperintense lesions, which are a superset of the black holes and are typically more distinct, to form a search space for the T1w lesions. Two main potential sources of measurement noise in longitudinal black hole volume computation are partial volume and variability in the T2w lesion segmentation. A paired analysis approach is proposed herein that uses registration to equalize partial volume and lesion mask processing to combine T2w lesion segmentations across time. The scans of 247 MS patients are used to compare a selected black hole computation method with an enhanced version incorporating paired analysis, using rank correlation to a clinical variable (MS functional composite) as the primary outcome measure. The comparison is done at nine different levels of intensity as a previous study suggests that darker black holes may yield stronger correlations. The results demonstrate that paired analysis can strongly improve longitudinal correlation (from -0.148 to -0.303 in this sample) and may produce segmentations that are more sensitive to clinically relevant changes. PMID:24179734
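
The core paired-analysis idea, registering the two scans and combining the T2w lesion masks across time so that both time points are measured inside one shared search space, can be sketched with synthetic arrays. This is a toy illustration under stated assumptions (union as the mask-combination rule, a hypothetical hypointensity threshold, 1 mm isotropic voxels), not the authors' pipeline:

```python
import numpy as np

# Toy sketch: after registration, combine T2w lesion masks across time points
# (here by union) so both scans share one search space, then compare T1w
# black-hole volumes inside it. All arrays and thresholds are synthetic.

rng = np.random.default_rng(0)
t2_mask_t0 = rng.random((8, 8, 8)) > 0.7      # binary T2w lesion mask, time 0
t2_mask_t1 = rng.random((8, 8, 8)) > 0.7      # binary T2w lesion mask, time 1
search_space = t2_mask_t0 | t2_mask_t1        # shared mask removes segmentation jitter

t1_t0 = rng.random((8, 8, 8))                 # registered T1w intensities, time 0
t1_t1 = rng.random((8, 8, 8))                 # registered T1w intensities, time 1
black_hole_thresh = 0.3                       # hypothetical hypointensity cutoff

voxel_volume = 1.0  # mm^3 per voxel (1 mm isotropic assumed)
vol_t0 = np.count_nonzero((t1_t0 < black_hole_thresh) & search_space) * voxel_volume
vol_t1 = np.count_nonzero((t1_t1 < black_hole_thresh) & search_space) * voxel_volume
delta = vol_t1 - vol_t0                       # longitudinal black-hole volume change
```

Because both time points are thresholded inside the same mask, variability in the per-scan T2w segmentations no longer leaks into the volume-change measurement.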

  4. Pulsed Direct Current Electrospray: Enabling Systematic Analysis of Small Volume Sample by Boosting Sample Economy.

    Science.gov (United States)

    Wei, Zhenwei; Xiong, Xingchuang; Guo, Chengan; Si, Xingyu; Zhao, Yaoyao; He, Muyi; Yang, Chengdui; Xu, Wei; Tang, Fei; Fang, Xiang; Zhang, Sichun; Zhang, Xinrong

    2015-11-17

    We developed pulsed direct current electrospray ionization mass spectrometry (pulsed-dc-ESI-MS) for systematically profiling and determining components in small-volume samples. Pulsed-dc-ESI uses a constant high voltage to remotely induce the generation of a single-polarity pulsed electrospray. The method significantly boosts sample economy, yielding several minutes of MS signal from a mere picoliter-volume sample. The elongated MS signal duration enabled us to collect abundant MS(2) information on components of interest in a small-volume sample for systematic analysis. The method was successfully applied to single-cell metabolomics, yielding 2-D profiles of metabolites (including exact mass and MS(2) data) from single plant and mammalian cells: 1034 components for Allium cepa and 656 components for HeLa cells, respectively. Further identification found 162 compounds and 28 different modification groups of 141 saccharides in a single Allium cepa cell, indicating that pulsed-dc-ESI is a powerful tool for the systematic analysis of small-volume samples. PMID:26488206

  5. Leukocyte telomere length and hippocampus volume: a meta-analysis [version 1; referees: 2 approved]

    Directory of Open Access Journals (Sweden)

    Gustav Nilsonne

    2015-10-01

    Leukocyte telomere length has been shown to correlate with hippocampus volume, but effect estimates differ in magnitude and are not uniformly positive. This study aimed primarily to investigate the relationship between leukocyte telomere length and hippocampus gray matter volume by meta-analysis, and secondarily to investigate possible effect moderators. Five studies were included with a total of 2107 participants, of which 1960 were contributed by one single influential study. A random-effects meta-analysis estimated the effect at r = 0.12 [95% CI -0.13, 0.37] in the presence of heterogeneity and a subjectively estimated moderate to high risk of bias. There was no evidence that apolipoprotein E (APOE) genotype was an effect moderator, nor that the ratio of leukocyte telomerase activity to telomere length was a better predictor than leukocyte telomere length for hippocampus volume. While this meta-analysis does not establish a positive relationship, neither can it disprove the earlier finding of a positive correlation in the one large study included in the analyses. We propose that a relationship between leukocyte telomere length and hippocampus volume may be mediated by transmigrating monocytes which differentiate into microglia in the brain parenchyma.
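
Random-effects pooling of correlation coefficients, as reported above, is conventionally done on Fisher z-transformed values with a DerSimonian-Laird estimate of between-study variance. A sketch of that standard procedure; the (r, n) pairs below are invented for illustration and are not the five studies actually pooled:

```python
import math

# Standard random-effects meta-analysis of correlations:
# Fisher z transform, DerSimonian-Laird tau^2, inverse-variance weights.
# Study data are hypothetical.

studies = [(0.35, 30), (0.05, 1960), (0.20, 45), (-0.10, 40), (0.15, 32)]  # (r, n)

z = [0.5 * math.log((1 + r) / (1 - r)) for r, n in studies]  # Fisher z
v = [1.0 / (n - 3) for r, n in studies]                      # within-study variance
w = [1.0 / vi for vi in v]                                   # fixed-effect weights

zbar = sum(wi * zi for wi, zi in zip(w, z)) / sum(w)
q = sum(wi * (zi - zbar) ** 2 for wi, zi in zip(w, z))       # Cochran's Q
c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
tau2 = max(0.0, (q - (len(studies) - 1)) / c)                # DL between-study variance

w_re = [1.0 / (vi + tau2) for vi in v]                       # random-effects weights
z_re = sum(wi * zi for wi, zi in zip(w_re, z)) / sum(w_re)
r_pooled = math.tanh(z_re)                                   # back-transform to r
```

Note how tau2 > 0 shrinks the weight of the single very large study, which is exactly why one dominant study can leave a random-effects estimate with a wide confidence interval, as in the abstract.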

  6. The 1999-2000 task analysis of American nurse-midwifery/midwifery practice.

    Science.gov (United States)

    Oshio, Sachiko; Johnson, Peter; Fullerton, Judith

    2002-01-01

    A master list of tasks, which contained 200 task statements, 23 professional issues statements, and 177 clinical conditions, was divided into three equivalent survey forms and distributed to those certified nurse-midwives (CNMs) and certified midwives (CMs) certified by the ACNM Certification Council, Inc. during the 5-year period from 1995 to 1999. Specific efforts were made to encourage the participation of CMs, because they represented a new professional cohort. A total of 627 valid responses were obtained. Reasonably similar numbers of respondents contributed data related to each of the three versions of the survey form. The responsibilities have expanded substantially within the domains of nonreproductive primary health care and gynecologic care of the well woman, including advances in assisted reproductive technology. A diminished emphasis on the CNM/CM role in the provision of newborn care was documented. The ACC Research Committee recommended the revision of the entry-level certification examination blueprint, and this was approved by the ACC Board of Directors. The specific recommendations included the development of a new primary care domain and the reconfiguration of content emphasis with percentage allocations as follows: Primary Care, 5-10%; Well-Woman/Gynecology, 15-20%; Newborn, 5-10%; Postpartum, 5-10%; Antepartum, 25-30%; Intrapartum, 25-35%; Professional Issues, up to 5%. PMID:11874090

  7. Mathematical tasks, study approaches, and course grades in undergraduate mathematics: a year-by-year analysis

    Science.gov (United States)

    Maciejewski, Wes; Merchant, Sandra

    2016-04-01

    Students approach learning in different ways, depending on the experienced learning situation. A deep approach is geared toward long-term retention and conceptual change while a surface approach focuses on quickly acquiring knowledge for immediate use. These approaches ultimately affect the students' academic outcomes. This study takes a cross-sectional look at the approaches to learning used by students from courses across all four years of undergraduate mathematics and analyses how these relate to the students' grades. We find that deep learning correlates with grade in the first year and not in the upper years. Surface learning has no correlation with grades in the first year and a strong negative correlation with grades in the upper years. Using Bloom's taxonomy, we argue that the nature of the tasks given to students is fundamentally different in lower and upper year courses. We find that first-year courses emphasize tasks that require only low-level cognitive processes. Upper year courses require higher level processes but, surprisingly, have a simultaneous greater emphasis on recall and understanding. These observations explain the differences in correlations between approaches to learning and course grades. We conclude with some concerns about the disconnect between first year and upper year mathematics courses and the effect this may have on students.

  8. Correcting Working Postures in Water Pump Assembly Tasks using the OVAKO Work Analysis System (OWAS)

    Directory of Open Access Journals (Sweden)

    Atiya Kadhim Al-Zuheri

    2008-01-01

    The Ovako Working Posture Analysing System (OWAS) is a widely used method for studying awkward working postures in workplaces. This study used OWAS to analyze the working postures involved in the manual material handling of laminations at the stacking workstation of a water pump assembly line in the Electrical Industrial Company (EICO) / Baghdad. A computer program, WinOWAS, was used for the study. In the real-life workstation, more than 26% of the working postures observed were classified as either AC2 (slightly harmful) or AC3 (distinctly harmful). Postures that needed to be corrected soon (AC3) and the corresponding tasks were identified. The most stressful tasks observed were grasping, handling, and positioning of the laminations by workers. The real-life workstation was then modified according to redesign suggestions for the location (positioning) factors of the stacking workstation, and the modified workstation was simulated using parametric CAD software. These modifications led to an improvement in the percentage of harmful postures. The use of supplementary methods is therefore recommended to identify ergonomic risk factors for handling work and other hand-intensive activities at industrial sites.

  9. Cortical networks for rotational uncertainty effect in mental rotation task by partial directed coherence analysis of EEG.

    Science.gov (United States)

    Yan, Jing; Guo, Xiaoli; Sun, Junfeng; Tong, Shanbao

    2011-01-01

    Partial directed coherence (PDC), a frequency-domain representation of Granger causality (GC), can detect both the strength and the direction of cortical interactions through a multivariate autoregressive (MVAR) model of the electroencephalogram (EEG). In the present study, we investigated the neural network mechanisms underlying the "rotational uncertainty effect" during a mental rotation (MR) task by PDC analysis of multichannel EEG signals before and after presentation of the visual stimuli. We found that (i) temporally, the "rotational uncertainty effect" involved a network activated before the visual stimuli were presented, which could also affect the later cognitive process of MR; and (ii) the causal functional connectivity network indicated that bi-directional frontal ↔ parietal networks played critical roles in maintaining readiness during the MR task. These findings suggest that the functional networks of un-cued preparation before visual stimuli are presented deserve more attention, and that these networks provide crucial causality information for understanding the neural mechanism of the "rotational uncertainty effect" in the MR task. PMID:22254583
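
PDC is computed from the fitted MVAR coefficient matrices: with A(f) = I - sum_k A_k exp(-i 2 pi f k), the PDC from channel j to channel i is |A_ij(f)| normalized by the column norm, so the squared entries of each column sum to one. A minimal sketch of that computation; the 2-channel, order-2 coefficients are illustrative, not fitted to EEG:

```python
import numpy as np

# Partial directed coherence from MVAR coefficients (Baccala-Sameshima form):
#   A(f)      = I - sum_k A_k * exp(-i 2 pi f k)
#   PDC_ij(f) = |A_ij(f)| / sqrt(sum_m |A_mj(f)|^2)
# Column j of the result describes outflow from channel j.

def pdc(coeffs: np.ndarray, freq: float) -> np.ndarray:
    """coeffs: (p, n, n) MVAR matrices A_1..A_p; freq in cycles/sample."""
    p, n, _ = coeffs.shape
    a_f = np.eye(n, dtype=complex)
    for k in range(1, p + 1):
        a_f -= coeffs[k - 1] * np.exp(-2j * np.pi * freq * k)
    denom = np.sqrt((np.abs(a_f) ** 2).sum(axis=0))  # per-column norm
    return np.abs(a_f) / denom

# Illustrative order-2, 2-channel MVAR coefficients (not EEG-derived).
A = np.array([[[0.5, 0.3], [0.0, 0.4]],
              [[-0.2, 0.0], [0.1, -0.1]]])
P = pdc(A, freq=0.1)
```

The column normalization is what makes PDC directional: P[i, j] measures the influence of channel j on channel i relative to all of j's outflows at that frequency.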

  10. Organizational analysis and safety for utilities with nuclear power plants: perspectives for organizational assessment. Volume 2

    International Nuclear Information System (INIS)

    This two-volume report presents the results of initial research on the feasibility of applying organizational factors in nuclear power plant (NPP) safety assessment. Volume 1 of this report contains an overview of the literature, a discussion of available safety indicators, and a series of recommendations for more systematically incorporating organizational analysis into investigations of nuclear power plant safety. The six chapters of this volume discuss the major elements in our general approach to safety in the nuclear industry. The chapters include information on organizational design and safety; organizational governance; utility environment and safety related outcomes; assessments by selected federal agencies; review of data sources in the nuclear power industry; and existing safety indicators

  11. Volumes of the hippocampus and amygdala in patients with borderline personality disorder: a meta-analysis.

    Science.gov (United States)

    Nunes, Paulo Menezes; Wenzel, Amy; Borges, Karinne Tavares; Porto, Cristianne Ribeiro; Caminha, Renato Maiato; de Oliveira, Irismar Reis

    2009-08-01

    Individuals with borderline personality disorder (BPD) often exhibit impulsive and aggressive behavior. The hippocampus and amygdala form part of the limbic system, which plays a central role in controlling such expressions of emotional reactivity. There are mixed results in the literature regarding whether patients with BPD have smaller hippocampal and amygdalar volume relative to healthy controls. To clarify the precise nature of these mixed results, we performed a meta-analysis to aggregate data on the size of the hippocampus and amygdala in patients with BPD. Seven publications involving six studies and a total of 104 patients with BPD and 122 healthy controls were included. A significantly smaller volume was found in both the right and left hippocampi and amygdala of patients with BPD compared to healthy controls. These findings raise the possibility that reduced hippocampal and amygdalar volumes are biological substrates of some symptoms of BPD. PMID:19663654

  12. A hybrid neural network analysis of subtle brain volume differences in children surviving brain tumors.

    Science.gov (United States)

    Reddick, W E; Mulhern, R K; Elkin, T D; Glass, J O; Merchant, T E; Langston, J W

    1998-05-01

    In the treatment of children with brain tumors, balancing the efficacy of treatment against commonly observed side effects is difficult because of a lack of quantitative measures of brain damage that can be correlated with the intensity of treatment. We quantitatively assessed volumes of brain parenchyma on magnetic resonance (MR) images using a hybrid combination of the Kohonen self-organizing map for segmentation and a multilayer backpropagation neural network for tissue classification. Initially, we analyzed the relationship between volumetric differences and radiologists' grading of atrophy in 80 subjects. This investigation revealed that brain parenchyma and white matter volumes significantly decreased as atrophy increased, whereas gray matter volumes had no relationship with atrophy. Next, we compared 37 medulloblastoma patients treated with surgery, irradiation, and chemotherapy to 19 patients treated with surgery and irradiation alone. This study demonstrated that, in these patients, chemotherapy had no significant effect on brain parenchyma, white matter, or gray matter volumes. We then investigated volumetric differences due to cranial irradiation in 15 medulloblastoma patients treated with surgery and radiation therapy, and compared these with a group of 15 age-matched patients with low-grade astrocytoma treated with surgery alone. With a minimum follow-up of one year after irradiation, all radiation-treated patients demonstrated significantly reduced white matter volumes, whereas gray matter volumes were relatively unchanged compared with those of age-matched patients treated with surgery alone. These results indicate that reductions in cerebral white matter: 1) are correlated significantly with atrophy; 2) are not related to chemotherapy; and 3) are correlated significantly with irradiation. This hybrid neural network analysis of subtle brain volume differences with magnetic resonance may constitute a direct measure of treatment-induced brain damage

  13. Thermal algorithms analysis. [programming tasks supporting the development of a thermal model of the Earth's surface

    Science.gov (United States)

    Lien, T.

    1981-01-01

    The programming and analysis methods to support the development of a thermal model of the Earth's surface from detailed analysis of day/night registered data sets from the Heat Capacity Mapping Mission satellite are briefly described.

  14. Sealed source and device design safety testing. Volume 4: Technical report on the findings of Task 4, Investigation of sealed source for paper mill digester

    International Nuclear Information System (INIS)

    This report covers the Task 4 activities for the Sealed Source and Device Safety testing program. SwRI was contracted to investigate a suspected leaking radioactive source that was installed in a gauge on a paper mill digester. The actual source that was leaking was not available; therefore, SwRI examined another source. SwRI concluded that the encapsulated source it examined was not leaking. However, the presence of Cs-137 on the interior and exterior of the outer encapsulation and handling tube suggests that contamination probably occurred when the source was first manufactured and then installed in the handling tube

  15. Parametric study of potential early commercial power plants Task 3-A MHD cost analysis

    Science.gov (United States)

    1983-01-01

    The development of costs for an MHD power plant and the comparison of these costs to those of a conventional coal-fired power plant are reported. The program is divided into three activities: (1) code of accounts review; (2) MHD pulverized coal power plant cost comparison; (3) operating and maintenance cost estimates. The scope of each NASA code of account item was defined to assure that the recently completed Task 3 capital cost estimates are consistent with the code of account scope. Confidence in the MHD plant capital cost estimates is improved by establishing comparability with conventional pulverized-coal-fired (PCF) power plant systems. The basis for estimating the MHD plant operating and maintenance costs of electricity is verified.

  16. Multi-Task Learning for Food Identification and Analysis with Deep Convolutional Neural Networks

    Institute of Scientific and Technical Information of China (English)

    Xi-Jin Zhang; Yi-Fan Lu; Song-Hai Zhang

    2016-01-01

    In this paper, we propose a multi-task system that can identify dish types, food ingredients, and cooking methods from food images with deep convolutional neural networks. We built a dataset of 360 classes of different foods with at least 500 images for each class. To reduce noise in the data, which was collected from the Internet, outlier images were detected and eliminated through a one-class SVM trained with deep convolutional features. We simultaneously trained a dish identifier, a cooking method recognizer, and a multi-label ingredient detector. They share a few low-level layers in the deep network architecture. The proposed framework shows higher accuracy than traditional methods with handcrafted features, and the cooking method recognizer and ingredient detector can be applied to dishes not included in the training dataset to provide reference information for users.
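
The shared-trunk, multi-head arrangement described above, with a few shared low-level layers feeding a single-label dish classifier, a single-label cooking-method classifier, and a multi-label ingredient detector, can be illustrated with a toy numpy forward pass. The layer sizes, the 10-method and 50-ingredient counts, and the random weights are invented; the paper's actual model is a deep CNN:

```python
import numpy as np

# Toy forward pass of a shared trunk with three task heads. Single-label
# heads end in a softmax; the multi-label ingredient head uses per-class
# sigmoids. All shapes and weights are illustrative.

rng = np.random.default_rng(1)

def softmax(x: np.ndarray) -> np.ndarray:
    e = np.exp(x - x.max())  # shift for numerical stability
    return e / e.sum()

x = rng.random(128)                                 # stand-in for conv features
w_shared = rng.standard_normal((64, 128)) * 0.1
h = np.maximum(0.0, w_shared @ x)                   # shared ReLU layer ("trunk")

w_dish = rng.standard_normal((360, 64)) * 0.1       # 360 dish classes (from the paper)
w_method = rng.standard_normal((10, 64)) * 0.1      # hypothetical 10 cooking methods
w_ingr = rng.standard_normal((50, 64)) * 0.1        # hypothetical 50 ingredients

dish_probs = softmax(w_dish @ h)                    # single-label: probabilities sum to 1
method_probs = softmax(w_method @ h)                # single-label: probabilities sum to 1
ingr_probs = 1.0 / (1.0 + np.exp(-(w_ingr @ h)))    # multi-label: independent sigmoids
```

Sharing the trunk is the design choice that lets the three tasks regularize one another while keeping per-task output semantics (softmax vs. sigmoid) independent.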

  17. Dynamic sequence analysis of a decision making task of multielement target tracking and its usage as a learning method

    Science.gov (United States)

    Kang, Ziho

    This dissertation is divided into four parts: 1) Development of effective methods for comparing visual scanning paths (or scanpaths) for a dynamic task of multiple moving targets, 2) application of the methods to compare the scanpaths of experts and novices for a conflict detection task of multiple aircraft on radar screen, 3) a post-hoc analysis of other eye movement characteristics of experts and novices, and 4) finding out whether the scanpaths of experts can be used to teach the novices. In order to compare experts' and novices' scanpaths, two methods are developed. The first proposed method is the matrix comparisons using the Mantel test. The second proposed method is the maximum transition-based agglomerative hierarchical clustering (MTAHC) where comparisons of multi-level visual groupings are carried out. The matrix comparison method was useful for a small number of targets during the preliminary experiment, but turned out to be inapplicable to a realistic case when tens of aircraft were presented on screen; however, MTAHC was effective with large number of aircraft on screen. The experiments with experts and novices on the aircraft conflict detection task showed that their scanpaths are different. The MTAHC result was able to explicitly show how experts visually grouped multiple aircraft based on similar altitudes while novices tended to group them based on convergence. Also, the MTAHC results showed that novices paid much attention to the converging aircraft groups even if they are safely separated by altitude; therefore, less attention was given to the actual conflicting pairs resulting in low correct conflict detection rates. Since the analysis showed the scanpath differences, experts' scanpaths were shown to novices in order to assess their effectiveness. The scanpath treatment group showed indications that they changed their visual movements from trajectory-based to altitude-based movements.
Between the treatment and the non-treatment group, there were no

  18. FINITE VOLUME NUMERICAL ANALYSIS FOR PARABOLIC EQUATION WITH ROBIN BOUNDARY CONDITION

    Institute of Scientific and Technical Information of China (English)

    Xia Cui

    2005-01-01

    In this paper, the finite volume method on unstructured meshes is studied for a parabolic convection-diffusion problem on an open bounded set of Rd (d = 2 or 3) with a Robin boundary condition. Upwinding approximations are adapted to treat both the convection term and the Robin boundary condition. Starting directly from the formulation of the finite volume scheme, numerical analysis is carried out. Using several discrete functional analysis techniques such as summation by parts and discrete norm inequalities, the stability of and error estimates on the approximate solution are established, and the existence and uniqueness of the approximate solution as well as first-order temporal norm and L2 and H1 spatial norm convergence properties are obtained.
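
The two ingredients named in the abstract, an upwinded convective flux and a Robin condition folded into the boundary flux, are easiest to see in one dimension. A hedged sketch on a uniform 1-D mesh with explicit Euler time stepping; the paper treats unstructured meshes in 2-D/3-D, and every coefficient here is an arbitrary illustration:

```python
import numpy as np

# 1-D finite volume sketch for u_t + a u_x = nu u_xx with upwinded convection
# and a Robin condition alpha*u + beta*du/dn = g at both ends (du/dn is the
# outward normal derivative). Explicit Euler with a CFL-safe step.

nx, L = 50, 1.0
dx = L / nx
a, nu = 1.0, 0.01                          # convection speed, diffusivity
alpha, beta, g = 1.0, 1.0, 0.0             # Robin coefficients
dt = 0.4 * min(dx / a, dx ** 2 / (2 * nu))

x = (np.arange(nx) + 0.5) * dx             # cell centers
u = np.exp(-100.0 * (x - 0.5) ** 2)        # initial bump

c = 2.0 * beta / dx                        # one-sided normal-derivative stencil
for _ in range(200):
    # boundary face values implied by the Robin condition
    ub_l = (g + c * u[0]) / (alpha + c)
    ub_r = (g + c * u[-1]) / (alpha + c)
    # face fluxes F = a*u_upwind - nu*du/dx (a > 0, so upwind = left state)
    F = np.empty(nx + 1)
    F[1:-1] = a * u[:-1] - nu * (u[1:] - u[:-1]) / dx
    F[0] = a * ub_l - nu * (u[0] - ub_l) / (dx / 2.0)
    F[-1] = a * u[-1] - nu * (ub_r - u[-1]) / (dx / 2.0)
    u = u + dt / dx * (F[:-1] - F[1:])     # conservative cell update
```

Under the CFL restriction the update is monotone, which is the discrete analogue of the stability the paper proves with summation by parts and discrete norm inequalities.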

  19. Crucial role of detailed function, task, timeline, link and human vulnerability analyses in HRA. [Human Reliability Analysis (HRA)

    Energy Technology Data Exchange (ETDEWEB)

    Ryan, T.G.; Haney, L.N.; Ostrom, L.T.

    1992-01-01

    This paper addresses one major cause of large uncertainties in human reliability analysis (HRA) results: the absence of detailed function, task, timeline, link and human vulnerability analyses. All too often this crucial step in the HRA process is done in a cursory fashion using word of mouth or written procedures which may themselves incompletely or inaccurately represent the human action sequences and human error vulnerabilities being analyzed. The paper examines the potential contributions these detailed analyses can make in achieving quantitative and qualitative HRA results which are: (1) credible, that is, minimizing uncertainty; (2) auditable, that is, systematically linking quantitative results to the qualitative information from which they are derived; (3) capable of supporting root cause analyses of human reliability factors determined to be major contributors to risk; and (4) capable of repeated measures and of being combined with similar results from other analyses to examine HRA issues transcending individual systems and facilities. Based on experience analyzing test and commercial nuclear reactors, and medical applications of nuclear technology, an iterative process is suggested for performing detailed function, task, timeline, link and human vulnerability analyses using documentation reviews, open-ended and structured interviews, direct observations, and group techniques. Finally, the paper concludes that detailed analyses done in this manner by knowledgeable human factors practitioners can contribute significantly to the credibility, auditability, causal factor analysis, and combining goals of HRA.

  20. Man-machine Integration Design and Analysis System (MIDAS) Task Loading Model (TLM) experimental and software detailed design report

    Science.gov (United States)

    Staveland, Lowell

    1994-01-01

    This is the experimental and software detailed design report for the prototype task loading model (TLM) developed as part of the man-machine integration design and analysis system (MIDAS), as implemented and tested in phase 6 of the Army-NASA Aircrew/Aircraft Integration (A3I) Program. The A3I program is an exploratory development effort to advance the capabilities and use of computational representations of human performance and behavior in the design, synthesis, and analysis of manned systems. The MIDAS TLM computationally models the demands designs impose on operators to aid engineers in the conceptual design of aircraft crewstations. This report describes the TLM and the results of a series of experiments which were run in this phase to test its capabilities as a predictive task demand modeling tool. Specifically, it includes discussions of: the inputs and outputs of the TLM, the theories underlying it, the results of the test experiments, the use of the TLM as both a stand-alone tool and part of a complete human operator simulation, and a brief introduction to the TLM software design.

  1. Human factors evaluation of remote afterloading brachytherapy: Human error and critical tasks in remote afterloading brachytherapy and approaches for improved system performance. Volume 1

    International Nuclear Information System (INIS)

    Remote Afterloading Brachytherapy (RAB) is a medical process used in the treatment of cancer. RAB uses a computer-controlled device to remotely insert and remove radioactive sources close to a target (or tumor) in the body. Some RAB problems affecting the radiation dose to the patient have been reported and attributed to human error. To determine the root cause of human error in the RAB system, a human factors team visited 23 RAB treatment sites in the US. The team observed RAB treatment planning and delivery, interviewed RAB personnel, and performed walk-throughs, during which staff demonstrated the procedures and practices used in performing RAB tasks. Factors leading to human error in the RAB system were identified. The impact of those factors on the performance of RAB was then evaluated and prioritized in terms of safety significance. Finally, the project identified and evaluated alternative approaches for resolving the safety significant problems related to human error

  2. Human factors evaluation of remote afterloading brachytherapy: Human error and critical tasks in remote afterloading brachytherapy and approaches for improved system performance. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Callan, J.R.; Kelly, R.T.; Quinn, M.L. [Pacific Science and Engineering Group, San Diego, CA (United States)] [and others]

    1995-05-01

    Remote Afterloading Brachytherapy (RAB) is a medical process used in the treatment of cancer. RAB uses a computer-controlled device to remotely insert and remove radioactive sources close to a target (or tumor) in the body. Some RAB problems affecting the radiation dose to the patient have been reported and attributed to human error. To determine the root cause of human error in the RAB system, a human factors team visited 23 RAB treatment sites in the US. The team observed RAB treatment planning and delivery, interviewed RAB personnel, and performed walk-throughs, during which staff demonstrated the procedures and practices used in performing RAB tasks. Factors leading to human error in the RAB system were identified. The impact of those factors on the performance of RAB was then evaluated and prioritized in terms of safety significance. Finally, the project identified and evaluated alternative approaches for resolving the safety significant problems related to human error.

  3. Predicting Nonauditory Adverse Radiation Effects Following Radiosurgery for Vestibular Schwannoma: A Volume and Dosimetric Analysis

    International Nuclear Information System (INIS)

    Purpose: To define clinical and dosimetric predictors of nonauditory adverse radiation effects after radiosurgery for vestibular schwannoma treated with a 12 Gy prescription dose. Methods: We retrospectively reviewed our experience of vestibular schwannoma patients treated between September 2005 and December 2009. Two hundred patients were treated at a 12 Gy prescription dose; 80 had complete clinical and radiological follow-up for at least 24 months (median, 28.5 months). All treatment plans were reviewed for target volume and dosimetry characteristics; gradient index; homogeneity index, defined as the maximum dose in the treatment volume divided by the prescription dose; conformity index; brainstem dose; and trigeminal nerve dose. All adverse radiation effects (ARE) were recorded. Because the intent of our study was to focus on the nonauditory adverse effects, hearing outcome was not evaluated in this study. Results: Twenty-seven (33.8%) patients developed ARE, 5 (6%) developed hydrocephalus, 10 (12.5%) reported new ataxia, 17 (21%) developed trigeminal dysfunction, 3 (3.75%) had facial weakness, and 1 patient developed hemifacial spasm. The development of edema within the pons was significantly associated with ARE (p = 0.001). On multivariate analysis, only target volume is a significant predictor of ARE (p = 0.001). There is a target volume threshold of 5 cm3, above which ARE are more likely. The treatment plan dosimetric characteristics are not associated with ARE, although the maximum dose to the 5th nerve is a significant predictor of trigeminal dysfunction, with a threshold of 9 Gy. The overall 2-year tumor control rate was 96%. Conclusions: Target volume is the most important predictor of adverse radiation effects, and we identified the significant treatment volume threshold to be 5 cm3. We also established through our series that the maximum tolerable dose to the 5th nerve is 9 Gy.
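
    The two thresholds in the conclusions above (a 5 cm3 target volume for ARE risk and a 9 Gy maximum 5th-nerve dose for trigeminal dysfunction risk) can be expressed as a simple screening check. The function below is a hypothetical illustration of those cutoffs, not part of the study's methodology:

```python
# Hypothetical helper encoding the thresholds reported in this study:
# target volume > 5 cm^3 predicts adverse radiation effects (ARE), and a
# maximum 5th-nerve dose > 9 Gy predicts trigeminal dysfunction.
def flag_plan(target_volume_cm3, max_trigeminal_dose_gy):
    """Return warning strings for a radiosurgery plan (illustrative only)."""
    warnings = []
    if target_volume_cm3 > 5.0:
        warnings.append("target volume above 5 cm^3: elevated ARE risk")
    if max_trigeminal_dose_gy > 9.0:
        warnings.append("5th-nerve max dose above 9 Gy: trigeminal dysfunction risk")
    return warnings

flags = flag_plan(6.0, 8.0)  # flags only the volume threshold
```

Such a check captures only the two univariate cutoffs; the study's multivariate analysis is not reproduced here.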

  4. Predicting Nonauditory Adverse Radiation Effects Following Radiosurgery for Vestibular Schwannoma: A Volume and Dosimetric Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Hayhurst, Caroline; Monsalves, Eric; Bernstein, Mark; Gentili, Fred [Gamma Knife Unit, Division of Neurosurgery, University Health Network, Toronto (Canada); Heydarian, Mostafa; Tsao, May [Radiation Medicine Program, Princess Margaret Hospital, Toronto (Canada); Schwartz, Michael [Radiation Oncology Program and Division of Neurosurgery, Sunnybrook Hospital, Toronto (Canada); Prooijen, Monique van [Radiation Medicine Program, Princess Margaret Hospital, Toronto (Canada); Millar, Barbara-Ann; Menard, Cynthia [Radiation Oncology Program, Princess Margaret Hospital, Toronto (Canada); Kulkarni, Abhaya V. [Division of Neurosurgery, Hospital for Sick Children, University of Toronto (Canada); Laperriere, Norm [Radiation Oncology Program, Princess Margaret Hospital, Toronto (Canada); Zadeh, Gelareh, E-mail: Gelareh.Zadeh@uhn.on.ca [Gamma Knife Unit, Division of Neurosurgery, University Health Network, Toronto (Canada)

    2012-04-01

    Purpose: To define clinical and dosimetric predictors of nonauditory adverse radiation effects after radiosurgery for vestibular schwannoma treated with a 12 Gy prescription dose. Methods: We retrospectively reviewed our experience of vestibular schwannoma patients treated between September 2005 and December 2009. Two hundred patients were treated at a 12 Gy prescription dose; 80 had complete clinical and radiological follow-up for at least 24 months (median, 28.5 months). All treatment plans were reviewed for target volume and dosimetry characteristics; gradient index; homogeneity index, defined as the maximum dose in the treatment volume divided by the prescription dose; conformity index; brainstem dose; and trigeminal nerve dose. All adverse radiation effects (ARE) were recorded. Because the intent of our study was to focus on the nonauditory adverse effects, hearing outcome was not evaluated in this study. Results: Twenty-seven (33.8%) patients developed ARE, 5 (6%) developed hydrocephalus, 10 (12.5%) reported new ataxia, 17 (21%) developed trigeminal dysfunction, 3 (3.75%) had facial weakness, and 1 patient developed hemifacial spasm. The development of edema within the pons was significantly associated with ARE (p = 0.001). On multivariate analysis, only target volume is a significant predictor of ARE (p = 0.001). There is a target volume threshold of 5 cm3, above which ARE are more likely. The treatment plan dosimetric characteristics are not associated with ARE, although the maximum dose to the 5th nerve is a significant predictor of trigeminal dysfunction, with a threshold of 9 Gy. The overall 2-year tumor control rate was 96%. Conclusions: Target volume is the most important predictor of adverse radiation effects, and we identified the significant treatment volume threshold to be 5 cm3. We also established through our series that the maximum tolerable dose to the 5th nerve is 9 Gy.

  5. HYDRA-II: A hydrothermal analysis computer code: Volume 3, Verification/validation assessments

    International Nuclear Information System (INIS)

    HYDRA-II is a hydrothermal computer code capable of three-dimensional analysis of coupled conduction, convection, and thermal radiation problems. This code is especially appropriate for simulating the steady-state performance of spent fuel storage systems. The code has been evaluated for this application for the US Department of Energy's Commercial Spent Fuel Management Program. HYDRA-II provides a finite difference solution in Cartesian coordinates to the equations governing the conservation of mass, momentum, and energy. A cylindrical coordinate system may also be used to enclose the Cartesian coordinate system. This exterior coordinate system is useful for modeling cylindrical cask bodies. The difference equations for conservation of momentum are enhanced by the incorporation of directional porosities and permeabilities that aid in modeling solid structures whose dimensions may be smaller than the computational mesh. The equation for conservation of energy permits modeling of orthotropic physical properties and film resistances. Several automated procedures are available to model radiation transfer within enclosures and from fuel rod to fuel rod. The documentation of HYDRA-II is presented in three separate volumes. Volume I - Equations and Numerics describes the basic differential equations, illustrates how the difference equations are formulated, and gives the solution procedures employed. Volume II - User's Manual contains code flow charts, discusses the code structure, provides detailed instructions for preparing an input file, and illustrates the operation of the code by means of a model problem. This volume, Volume III - Verification/Validation Assessments, provides a comparison between the analytical solution and the numerical simulation for problems with a known solution. This volume also documents comparisons between the results of simulations of single- and multiassembly storage systems and actual experimental data. 11 refs., 55 figs., 13 tabs.
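
    The verification approach described for Volume III (comparing a numerical simulation against a problem with a known analytical solution) can be illustrated on a much smaller scale. The sketch below assumes nothing about HYDRA-II's internals: it solves 1-D steady-state heat conduction with a finite-difference scheme and checks the result against the exact linear temperature profile.

```python
# Illustrative verification sketch (not HYDRA-II itself): solve 1-D steady
# heat conduction d2T/dx2 = 0 with fixed end temperatures by Gauss-Seidel
# iteration, then compare against the known linear analytical solution.
def solve_conduction_1d(n, t_left, t_right, sweeps=10000):
    """Return the steady temperature at n equally spaced nodes."""
    t = [0.0] * n
    t[0], t[-1] = t_left, t_right
    for _ in range(sweeps):
        for i in range(1, n - 1):
            # each interior node relaxes to the mean of its neighbors
            t[i] = 0.5 * (t[i - 1] + t[i + 1])
    return t

numerical = solve_conduction_1d(11, 100.0, 0.0)
analytical = [100.0 - 10.0 * i for i in range(11)]  # exact linear profile
max_error = max(abs(a - b) for a, b in zip(numerical, analytical))
```

For this trivially small problem the discrete solution converges to the analytical one to machine precision, which is the kind of agreement a verification assessment looks for before moving on to validation against experimental data.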

  6. HYDRA-II: A hydrothermal analysis computer code: Volume 2, User's manual

    International Nuclear Information System (INIS)

    HYDRA-II is a hydrothermal computer code capable of three-dimensional analysis of coupled conduction, convection, and thermal radiation problems. This code is especially appropriate for simulating the steady-state performance of spent fuel storage systems. The code has been evaluated for this application for the US Department of Energy's Commercial Spent Fuel Management Program. HYDRA-II provides a finite-difference solution in Cartesian coordinates to the equations governing the conservation of mass, momentum, and energy. A cylindrical coordinate system may also be used to enclose the Cartesian coordinate system. This exterior coordinate system is useful for modeling cylindrical cask bodies. The difference equations for conservation of momentum incorporate directional porosities and permeabilities that are available to model solid structures whose dimensions may be smaller than the computational mesh. The equation for conservation of energy permits modeling of orthotropic physical properties and film resistances. Several automated methods are available to model radiation transfer within enclosures and from fuel rod to fuel rod. The documentation of HYDRA-II is presented in three separate volumes. Volume 1 - Equations and Numerics describes the basic differential equations, illustrates how the difference equations are formulated, and gives the solution procedures employed. This volume, Volume 2 - User's Manual, contains code flow charts, discusses the code structure, provides detailed instructions for preparing an input file, and illustrates the operation of the code by means of a sample problem. The final volume, Volume 3 - Verification/Validation Assessments, provides a comparison between the analytical solution and the numerical simulation for problems with a known solution. 6 refs.

  7. HYDRA-II: A hydrothermal analysis computer code: Volume 1, Equations and numerics

    Energy Technology Data Exchange (ETDEWEB)

    McCann, R.A.

    1987-04-01

    HYDRA-II is a hydrothermal computer code capable of three-dimensional analysis of coupled conduction, convection, and thermal radiation problems. This code is especially appropriate for simulating the steady-state performance of spent fuel storage systems. The code has been evaluated for this application for the US Department of Energy's Commercial Spent Fuel Management Program. HYDRA-II provides a finite difference solution in Cartesian coordinates to the equations governing the conservation of mass, momentum, and energy. A cylindrical coordinate system may also be used to enclose the Cartesian coordinate system. This exterior coordinate system is useful for modeling cylindrical cask bodies. The difference equations for conservation of momentum are enhanced by the incorporation of directional porosities and permeabilities that aid in modeling solid structures whose dimensions may be smaller than the computational mesh. The equation for conservation of energy permits modeling of orthotropic physical properties and film resistances. Several automated procedures are available to model radiation transfer within enclosures and from fuel rod to fuel rod. The documentation of HYDRA-II is presented in three separate volumes. This volume, Volume I - Equations and Numerics, describes the basic differential equations, illustrates how the difference equations are formulated, and gives the solution procedures employed. Volume II - User's Manual contains code flow charts, discusses the code structure, provides detailed instructions for preparing an input file, and illustrates the operation of the code by means of a model problem. The final volume, Volume III - Verification/Validation Assessments, presents results of numerical simulations of single- and multiassembly storage systems and comparisons with experimental data. 4 refs.

  8. HYDRA-II: A hydrothermal analysis computer code: Volume 3, Verification/validation assessments

    Energy Technology Data Exchange (ETDEWEB)

    McCann, R.A.; Lowery, P.S.

    1987-10-01

    HYDRA-II is a hydrothermal computer code capable of three-dimensional analysis of coupled conduction, convection, and thermal radiation problems. This code is especially appropriate for simulating the steady-state performance of spent fuel storage systems. The code has been evaluated for this application for the US Department of Energy's Commercial Spent Fuel Management Program. HYDRA-II provides a finite difference solution in Cartesian coordinates to the equations governing the conservation of mass, momentum, and energy. A cylindrical coordinate system may also be used to enclose the Cartesian coordinate system. This exterior coordinate system is useful for modeling cylindrical cask bodies. The difference equations for conservation of momentum are enhanced by the incorporation of directional porosities and permeabilities that aid in modeling solid structures whose dimensions may be smaller than the computational mesh. The equation for conservation of energy permits modeling of orthotropic physical properties and film resistances. Several automated procedures are available to model radiation transfer within enclosures and from fuel rod to fuel rod. The documentation of HYDRA-II is presented in three separate volumes. Volume I - Equations and Numerics describes the basic differential equations, illustrates how the difference equations are formulated, and gives the solution procedures employed. Volume II - User's Manual contains code flow charts, discusses the code structure, provides detailed instructions for preparing an input file, and illustrates the operation of the code by means of a model problem. This volume, Volume III - Verification/Validation Assessments, provides a comparison between the analytical solution and the numerical simulation for problems with a known solution. This volume also documents comparisons between the results of simulations of single- and multiassembly storage systems and actual experimental data. 11 refs., 55 figs., 13 tabs.

  9. HYDRA-II: A hydrothermal analysis computer code: Volume 2, User's manual

    Energy Technology Data Exchange (ETDEWEB)

    McCann, R.A.; Lowery, P.S.; Lessor, D.L.

    1987-09-01

    HYDRA-II is a hydrothermal computer code capable of three-dimensional analysis of coupled conduction, convection, and thermal radiation problems. This code is especially appropriate for simulating the steady-state performance of spent fuel storage systems. The code has been evaluated for this application for the US Department of Energy's Commercial Spent Fuel Management Program. HYDRA-II provides a finite-difference solution in Cartesian coordinates to the equations governing the conservation of mass, momentum, and energy. A cylindrical coordinate system may also be used to enclose the Cartesian coordinate system. This exterior coordinate system is useful for modeling cylindrical cask bodies. The difference equations for conservation of momentum incorporate directional porosities and permeabilities that are available to model solid structures whose dimensions may be smaller than the computational mesh. The equation for conservation of energy permits modeling of orthotropic physical properties and film resistances. Several automated methods are available to model radiation transfer within enclosures and from fuel rod to fuel rod. The documentation of HYDRA-II is presented in three separate volumes. Volume 1 - Equations and Numerics describes the basic differential equations, illustrates how the difference equations are formulated, and gives the solution procedures employed. This volume, Volume 2 - User's Manual, contains code flow charts, discusses the code structure, provides detailed instructions for preparing an input file, and illustrates the operation of the code by means of a sample problem. The final volume, Volume 3 - Verification/Validation Assessments, provides a comparison between the analytical solution and the numerical simulation for problems with a known solution. 6 refs.

  10. 3D photography in the objective analysis of volume augmentation including fat augmentation and dermal fillers.

    Science.gov (United States)

    Meier, Jason D; Glasgold, Robert A; Glasgold, Mark J

    2011-11-01

    The authors present quantitative and objective 3D data from their studies showing long-term results with facial volume augmentation. The first study analyzes fat grafting of the midface, and the second presents augmentation of the tear trough with hyaluronic filler. Using 3D quantitative analysis, surgeons can learn the duration of results and the optimal amount to inject, and can show patients results that are not demonstrable with standard 2D photography. PMID:22004863

  11. Object-based representation and analysis of light and electron microscopic volume data using Blender

    OpenAIRE

    Asadulina, Albina; Conzelmann, Markus; Williams, Elizabeth A; Panzera, Aurora; Jékely, Gáspár

    2015-01-01

    Background Rapid improvements in light and electron microscopy imaging techniques and the development of 3D anatomical atlases necessitate new approaches for the visualization and analysis of image data. Pixel-based representations of raw light microscopy data suffer from limitations in the number of channels that can be visualized simultaneously. Complex electron microscopic reconstructions from large tissue volumes are also challenging to visualize and analyze. Results Here we exploit the a...

  12. Evaluation of Cylinder Volume Estimation Methods for In–Cylinder Pressure Trace Analysis

    OpenAIRE

    Adrian Irimescu

    2012-01-01

    In–cylinder pressure trace analysis is an important investigation tool frequently employed in the study of internal combustion engines. While technical data is usually available for experimental engines, in some cases measurements are performed on automotive engines for which only the most basic geometry features are available. Therefore, several authors aimed to determine the cylinder volume and length of the connecting rod by other methods than direct measurement. This stu...

  13. Operator function modeling: Cognitive task analysis, modeling and intelligent aiding in supervisory control systems

    Science.gov (United States)

    Mitchell, Christine M.

    1990-01-01

    The design, implementation, and empirical evaluation of task-analytic models and intelligent aids for operators in the control of complex dynamic systems, specifically aerospace systems, are studied. Three related activities are included: (1) models of operator decision making in complex and predominantly automated space systems were developed and applied; (2) the Operator Function Model (OFM) was used to represent operator activities; and (3) the Operator Function Model Expert System (OFMspert), a stand-alone knowledge-based system that interacts with a human operator in a manner similar to a human assistant in the control of aerospace systems, was developed. OFMspert is an architecture for an operator's assistant that uses the OFM as its system and operator knowledge base and a blackboard paradigm of problem solving to dynamically generate expectations about upcoming operator activities and interpret actual operator actions. An experiment validated OFMspert's intent inferencing capability and showed that it inferred operators' intentions in ways comparable to both a human expert and the operators themselves. OFMspert was also augmented with control capabilities. An interface allowed the operator to interact with OFMspert, delegating as much or as little control responsibility as the operator chose. Because its design was based on the OFM, OFMspert's control capabilities were available at multiple levels of abstraction and allowed the operator a great deal of discretion over the amount and level of delegated control. An experiment showed that overall system performance was comparable for teams of two human operators and for a human operator paired with OFMspert.

  14. Loads in the hip joint during physically demanding occupational tasks: A motion analysis study.

    Science.gov (United States)

    Varady, Patrick Aljoscha; Glitsch, Ulrich; Augat, Peter

    2015-09-18

    Epidemiologic studies of osteoarthritis of the hip indicate a possible connection between work-related activities and the pathogenesis of the disease. This study investigated hip joint contact forces for physically demanding occupational tasks (lifting, carrying, and transferring a weight of 25 kg, 40 kg, or 50 kg; stair climbing with and without an additional 25 kg load; ladder climbing) and compared these with everyday activities (level gait, sitting down, and getting up). The hip joint contact force was calculated with the human multibody simulation software AnyBody, employing motion capture and ground reaction force measurements from force plates and an instrumented staircase and ladder. Although the results for the 11 male test subjects showed individual variation, a general trend could be observed in the force curves' characteristics and maxima. The largest calculated joint contact force was (637 ± 148)% body weight, for horizontal transfer of a 50 kg weight. For several of the occupational activities, the computed hip joint contact forces were significantly larger than for the investigated activities of daily living. This study provides original data on simulated hip joint contact forces for physically demanding activities. PMID:26187677
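
    Joint loads above are reported as a percentage of body weight (%BW), so converting to absolute force requires a subject mass. The helper below is a trivial illustration of that conversion; the 80 kg mass is an assumed example, not a value from the study.

```python
# Convert a hip contact force reported in percent body weight (%BW) to
# newtons for a subject of given mass. Purely illustrative arithmetic.
G = 9.81  # standard gravitational acceleration, m/s^2

def hip_force_newtons(percent_body_weight, body_mass_kg):
    """Force in newtons corresponding to a %BW figure for one subject."""
    return (percent_body_weight / 100.0) * body_mass_kg * G

# The study's peak mean of 637 %BW, for a hypothetical 80 kg subject:
peak = hip_force_newtons(637.0, 80.0)  # roughly 5000 N
```

This makes the scale of the reported loads concrete: for a typical adult, the peak transfer task corresponds to several kilonewtons across the hip joint.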

  15. Engineering Task Plan for Development, Fabrication, and Deployment of a Mobile, Variable Depth Sampling and At-Tank Analysis Systems

    International Nuclear Information System (INIS)

    This engineering task plan identifies the resources, responsibilities, and schedules for the development and deployment of a mobile, variable depth sampling system and an at-tank analysis system. The mobile, variable depth sampling system concept was developed after a cost assessment indicated a high cost for multiple deployments of the nested, fixed-depth sampling system. The sampling will provide double-shell tank (DST) staging tank waste samples for assuring the readiness of the waste for shipment to the LAW/HLW plant for treatment and immobilization. The at-tank analysis system will provide "real-time" assessments of the samples' chemical and physical properties. These systems support the Hanford Phase 1B vitrification project

  16. Engineering Task Plan for Development and Fabrication and Deployment of Nested Fixed Depth Fluidic Sampling and At Tank Analysis Systems

    Energy Technology Data Exchange (ETDEWEB)

    BOGER, R.M.

    2000-10-30

    This engineering task plan identifies the resources, responsibilities, and schedules for the development and deployment of a mobile, variable depth sampling system and an at-tank analysis system. The mobile, variable depth sampling system concept was developed after a cost assessment indicated a high cost for multiple deployments of the nested, fixed-depth sampling system. The sampling will provide double-shell tank (DST) staging tank waste samples for assuring the readiness of the waste for shipment to the LAW/HLW plant for treatment and immobilization. The at-tank analysis system will provide "real-time" assessments of the samples' chemical and physical properties. These systems support the Hanford Phase 1B vitrification project.

  17. Magnetic resonance velocity imaging derived pressure differential using control volume analysis

    Directory of Open Access Journals (Sweden)

    Cohen Benjamin

    2011-03-01

    Full Text Available. Background: Diagnosis and treatment of hydrocephalus is hindered by a lack of systemic understanding of the interrelationships between pressures and flow of cerebrospinal fluid in the brain. Control volume analysis provides a fluid physics approach to quantify and relate pressure and flow information. The objective of this study was to use control volume analysis and magnetic resonance velocity imaging to non-invasively estimate pressure differentials in vitro. Methods: A flow phantom was constructed, with water as the experimental fluid. The phantom was connected to a high-resolution differential pressure sensor and a computer-controlled pump producing sinusoidal flow. Magnetic resonance velocity measurements were taken and subsequently analyzed to derive pressure differential waveforms using momentum conservation principles. Independent sensor measurements were obtained for comparison. Results: Using the magnetic resonance data, the momentum balance in the phantom was computed. The measured differential pressure force had an amplitude of 14.4 dynes (pressure gradient amplitude 0.30 Pa/cm). A 12.5% normalized root mean square deviation between derived and directly measured pressure differentials was obtained. These experiments demonstrate one example of the potential utility of control volume analysis and the concepts involved in its application. Conclusions: This study validates a non-invasive measurement technique for relating velocity measurements to pressure differentials. These methods may be applied to clinical measurements to estimate pressure differentials in vivo that could not be obtained with current clinical sensors.
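
    The 12.5% figure above is a normalized root mean square deviation between the MRI-derived and directly measured pressure differential waveforms. A minimal sketch of that comparison, using synthetic sinusoidal signals rather than the study's data (the range normalization is one common convention and is an assumption here):

```python
# Normalized root mean square deviation (NRMSD) between two waveforms,
# normalized by the range of the reference (measured) signal.
import math

def nrmsd(derived, measured):
    """RMS deviation between signals, as a fraction of the measured range."""
    rms = math.sqrt(
        sum((d - m) ** 2 for d, m in zip(derived, measured)) / len(measured)
    )
    return rms / (max(measured) - min(measured))

# Synthetic example: a sinusoidal reference and a derived signal with
# a 10% amplitude error.
measured = [math.sin(2 * math.pi * i / 100) for i in range(100)]
derived = [1.1 * m for m in measured]
deviation = nrmsd(derived, measured)  # about 0.035, i.e. 3.5% of full scale
```

In the study's setting the two inputs would be the momentum-balance-derived and sensor-measured pressure differential waveforms sampled over the cardiac-like cycle.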

  18. Lobar analysis of collapsibility indices to assess functional lung volumes in COPD patients

    Directory of Open Access Journals (Sweden)

    Kitano M

    2014-12-01

    Full Text Available. Mariko Kitano,1 Shingo Iwano,1 Naozumi Hashimoto,2 Keiji Matsuo,3 Yoshinori Hasegawa,2 Shinji Naganawa1; 1Department of Radiology, 2Department of Respiratory Medicine, Graduate School of Medicine, Nagoya University, Nagoya, Aichi, Japan; 3Department of Radiology, Ichinomiya Municipal Hospital, Ichinomiya, Aichi, Japan. Background: We investigated correlations between lung volume collapsibility indices and pulmonary function test (PFT) results and assessed lobar differences in chronic obstructive pulmonary disease (COPD) patients, using paired inspiratory and expiratory three-dimensional (3D) computed tomography (CT) images. Methods: We retrospectively assessed 28 COPD patients who underwent paired inspiratory and expiratory CT and PFT exams on the same day. A computer-aided diagnostic system calculated total lobar volume and emphysematous lobar volume (ELV). Normal lobar volume (NLV) was determined by subtracting ELV from total lobar volume, both for the inspiratory phase (NLVI) and for the expiratory phase (NLVE). We also determined lobar collapsibility indices: NLV collapsibility ratio NLVCR (%) = (1 - NLVE/NLVI) × 100%. Associations between lobar volumes and PFT results, and between collapsibility indices and PFT results, were determined by Pearson correlation analysis. Results: NLVCR values were significantly correlated with PFT results. Forced expiratory volume in 1 second, measured as percent of predicted (FEV1%P), was significantly correlated with NLVCR values for the lower lobes (P<0.01), whereas this correlation was not significant for the upper lobes (P=0.05). FEV1%P results were also moderately correlated with inspiratory and expiratory ELV (ELVI,E) for the lower lobes (P<0.05). In contrast, the ratio of the diffusion capacity for carbon monoxide to alveolar gas volume, measured as percent of predicted (DLCO/VA%P), was strongly correlated with ELVI for the upper lobes (P<0.001), whereas this correlation with NLVCR values was weaker for the upper lobes (P<0
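
    The collapsibility index defined above, NLVCR (%) = (1 - NLVE/NLVI) × 100%, is simple to compute from the four lobar volumes. The sketch below uses hypothetical volumes, not values from the study:

```python
# NLV collapsibility ratio as defined in the abstract:
# NLVCR (%) = (1 - NLVE/NLVI) * 100, where normal lobar volume (NLV) is
# total lobar volume minus emphysematous lobar volume (ELV) in each phase.
def nlvcr(total_insp, elv_insp, total_exp, elv_exp):
    """Collapsibility (%) from inspiratory/expiratory total and ELV volumes."""
    nlv_i = total_insp - elv_insp  # normal volume, inspiratory phase
    nlv_e = total_exp - elv_exp    # normal volume, expiratory phase
    return (1.0 - nlv_e / nlv_i) * 100.0

# Hypothetical lobe: 800 mL normal volume at inspiration, 500 mL at
# expiration, giving a collapsibility of 37.5%.
index = nlvcr(900.0, 100.0, 560.0, 60.0)
```

A higher NLVCR indicates that more of the non-emphysematous lung volume empties between inspiration and expiration, which is why the index tracks airflow measures such as FEV1%P.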

  19. Fuzzy logic approach to SWOT analysis for economics tasks and example of its computer realization

    OpenAIRE

    Chernov, Vladimir; Oleksandr DOROKHOV; Liudmyla DOROKHOVA

    2016-01-01

    The article discusses the widely used classic method of analysis, forecasting, and decision-making for various economic problems known as SWOT analysis. As is known, it is a qualitative, multicriteria comparison of the degree of Strength, Weakness, Opportunity, and Threat for different kinds of risks, used in forecasting market development and assessing the status and development prospects of enterprises, regions, economic sectors, territories, etc. It can also be successfully applied to the eva...

  20. Fabrication, testing, and analysis of anisotropic carbon/glass hybrid composites: volume 1: technical report.

    Energy Technology Data Exchange (ETDEWEB)

    Wetzel, Kyle K. (Wetzel Engineering, Inc. Lawrence, Kansas); Hermann, Thomas M. (Wichita state University, Wichita, Kansas); Locke, James (Wichita state University, Wichita, Kansas)

    2005-11-01

    o} from the long axis for approximately two-thirds of the laminate volume (discounting skin layers), with reinforcing carbon fibers oriented axially comprising the remaining one-third of the volume. Finite element analysis of each laminate has been performed to examine first ply failure. Three failure criteria--maximum stress, maximum strain, and Tsai-Wu--have been compared. Failure predicted by all three criteria proves generally conservative, with the stress-based criteria the most conservative. For laminates that respond nonlinearly to loading, large error is observed in the prediction of failure using maximum strain as the criterion. This report documents the methods and results in two volumes. Volume 1 contains descriptions of the laminates, their fabrication and testing, the methods of analysis, the results, and the conclusions and recommendations. Volume 2 contains a comprehensive summary of the individual test results for all laminates.

  1. Scale-4 Analysis of Pressurized Water Reactor Critical Configurations: Volume 1-Summary

    International Nuclear Information System (INIS)

    The requirements of ANSI/ANS 8.1 specify that calculational methods for away-from-reactor criticality safety analyses be validated against experimental measurements. If credit is to be taken for the reduced reactivity of burned or spent fuel relative to its original "fresh" composition, it is necessary to benchmark computational methods used in determining such reactivity worth against spent fuel reactivity measurements. This report summarizes a portion of the ongoing effort to benchmark away-from-reactor criticality analysis methods using critical configurations from commercial pressurized-water reactors (PWR). The analysis methodology utilized for all calculations in this report is based on the modules and data associated with the SCALE-4 code system. Isotopic densities for spent fuel assemblies in the core were calculated using the SAS2H analytical sequence in SCALE-4. The sources of data and the procedures for deriving SAS2H input parameters are described in detail. The SNIKR code sequence was used to extract the necessary isotopic densities from SAS2H results and to provide the data in the format required for SCALE-4 criticality analysis modules. The CSASN analytical sequence in SCALE-4 was used to perform resonance processing of cross sections. The KENO V.a module of SCALE-4 was used to calculate the effective multiplication factor (keff) for the critical configuration. The SCALE-4 27-group burnup library containing ENDF/B-IV (actinides) and ENDF/B-V (fission products) data was used for analysis of each critical configuration. Each of the five volumes comprising this report provides an overview of the methodology applied. Subsequent volumes also describe in detail the approach taken in performing criticality calculations for these PWR configurations: Volume 2 describes criticality calculations for the Tennessee Valley Authority's Sequoyah Unit 2 reactor for Cycle 3; Volume 3 documents the analysis of Virginia Power's Surry Unit 1 reactor for the Cycle 2

  2. JET modeling and control analysis for POET (PFX Operating Early Task)

    International Nuclear Information System (INIS)

    Highlights: ► New POET operational space safely opened and used in the 2011/12 JET experimental campaign. ► Fully non-linear dynamic simulation to reproduce the effects of different iron saturation levels and eddy currents. ► Early achievement of highly shaped plasma. ► Reduced initial limiter phase: lower thermal load for the new metallic ILW and fuel retention studies. -- Abstract: The aim of the PFX Operating Early Task (POET) was to obtain a highly shaped plasma and x-point formation in the early phases of the discharge. The PFX is an amplifier which feeds the central pancakes of the JET primary circuit. In the past it was possible to energize the PFX circuit only when the PFGC current, which feeds all the coils of the primary circuit, was already flowing in the same direction as the PFX current would have flowed. This restriction avoided repulsive vertical forces that would tend to lift the top pancakes of the central solenoid, which are balanced only by the net weight of the upper part of the JET machine. This paper describes the modeling activity performed to provide a more accurate estimate of the ejection forces acting on the upper coils, in order to safely widen the operational space, using two-dimensional finite element electromagnetic models, together with simulations of the actual controller algorithm's performance in tracking the desired current references. Finally, the results of the implemented POET system, routinely used at JET in the 2011/2012 experimental campaigns, are presented in terms of earlier x-point formation and enhanced system flexibility

  3. Effectiveness of the random sequential absorption algorithm in the analysis of volume elements with nanoplatelets

    DEFF Research Database (Denmark)

    Pontefisso, Alessandro; Zappalorto, Michele; Quaresimin, Marino

    2016-01-01

    In this work, a study of the Random Sequential Absorption (RSA) algorithm in the generation of nanoplatelet Volume Elements (VEs) is carried out. The effect of the algorithm input parameters on the reinforcement distribution is studied through the implementation of statistical tools, showing that...... the platelet distribution is systematically affected by these parameters. The consequence is that a parametric analysis of the VE input parameters may be biased by hidden differences in the filler distribution. The same statistical tools used in the analysis are implemented in a modified RSA algorithm...

  4. STATISTICAL ANALYSIS OF DWARF GALAXIES AND THEIR GLOBULAR CLUSTERS IN THE LOCAL VOLUME

    International Nuclear Information System (INIS)

    Although morphological classification of dwarf galaxies into early and late types can account for some of their origin and characteristics, it does not aid the study of their formation mechanism. Thus an objective classification using principal component analysis together with K-means cluster analysis of these dwarf galaxies and their globular clusters (GCs) is carried out to overcome this problem. It is found that the classification of dwarf galaxies in the local volume is irrespective of their morphological indices. The more massive dwarfs (M_V0 > -13.7) are influenced by their environment in the star formation process.

  5. The report of Task Group 100 of the AAPM: Application of risk analysis methods to radiation therapy quality management.

    Science.gov (United States)

    Huq, M Saiful; Fraass, Benedick A; Dunscombe, Peter B; Gibbons, John P; Ibbott, Geoffrey S; Mundt, Arno J; Mutic, Sasa; Palta, Jatinder R; Rath, Frank; Thomadsen, Bruce R; Williamson, Jeffrey F; Yorke, Ellen D

    2016-07-01

    The increasing complexity of modern radiation therapy planning and delivery challenges traditional prescriptive quality management (QM) methods, such as many of those included in guidelines published by organizations such as the AAPM, ASTRO, ACR, ESTRO, and IAEA. These prescriptive guidelines have traditionally focused on monitoring all aspects of the functional performance of radiotherapy (RT) equipment by comparing parameters against tolerances set at strict but achievable values. Many errors that occur in radiation oncology are not due to failures in devices and software; rather they are failures in workflow and process. A systematic understanding of the likelihood and clinical impact of possible failures throughout a course of radiotherapy is needed to direct limited QM resources efficiently to produce maximum safety and quality of patient care. Task Group 100 of the AAPM has taken a broad view of these issues and has developed a framework for designing QM activities, based on estimates of the probability of identified failures and their clinical outcome through the RT planning and delivery process. The Task Group has chosen a specific radiotherapy process required for "intensity modulated radiation therapy (IMRT)" as a case study. The goal of this work is to apply modern risk-based analysis techniques to this complex RT process in order to demonstrate to the RT community that such techniques may help identify more effective and efficient ways to enhance the safety and quality of our treatment processes. The task group generated by consensus an example quality management program strategy for the IMRT process performed at the institution of one of the authors. This report describes the methodology and nomenclature developed, presents the process maps, FMEAs, fault trees, and QM programs developed, and makes suggestions on how this information could be used in the clinic.
The development and implementation of risk-assessment techniques will make radiation therapy

  6. Pressurized fluidized-bed hydroretorting of eastern oil shales. Volume 2, Task 3, Testing of process improvement concepts: Final report, September 1987--May 1991

    Energy Technology Data Exchange (ETDEWEB)

    1992-03-01

    This final report, Volume 2, on "Process Improvement Concepts" presents the results of work conducted by the Institute of Gas Technology (IGT), the Illinois Institute of Technology (IIT), and the Ohio State University (OSU) to develop three novel approaches for desulfurization that have shown good potential with coal and could be cost-effective for oil shales. These are (1) In-Bed Sulfur Capture using different sorbents (IGT), (2) Electrostatic Desulfurization (IIT), and (3) Microbial Desulfurization and Denitrification (OSU and IGT). Results of work on electroseparation of shale oil and fines conducted by IIT are included in this report, as well as work conducted by IGT to evaluate the restricted pipe discharge system. The work was conducted as part of the overall program on "Pressurized Fluidized-Bed Hydroretorting of Eastern Oil Shales."

  7. Multifidus Muscle Volume Estimation Based on Three-Dimensional Wavelet Multi-Resolution Analysis (MRA) with Buttocks Computer Tomography (CT) Images

    OpenAIRE

    Kohei Arai

    2013-01-01

    A Multi-Resolution Analysis (MRA) based edge detection algorithm is proposed for estimating the volume of the multifidus muscle in Computer Tomography (CT) scanned images. The volume of the multifidus muscle would be a better measure for metabolic syndrome than internal fat from the point of view of processing complexity. The proposed measure shows an R-squared of 0.178, which corresponds to the mutual correlation between internal fat and the volume of the multifidus muscle. It is also found that the R-squared betwe...
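
    The R-squared figure quoted above is just a squared correlation; as a rough illustration with made-up paired measurements (not the paper's data):

```python
import numpy as np

def r_squared(x, y):
    """Coefficient of determination of a simple linear fit of y on x,
    i.e. the squared Pearson correlation in the bivariate case."""
    r = np.corrcoef(x, y)[0, 1]
    return r ** 2

# Hypothetical paired measurements (e.g. internal fat vs. muscle volume):
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 2.9, 3.6, 4.8, 5.1])
print(round(r_squared(x, y), 3))  # 0.978
```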

  8. Changes in Olfactory Bulb Volume in Parkinson’s Disease: A Systematic Review and Meta-Analysis

    OpenAIRE

    Li, Jia; Gu, Cheng-zhi; Su, Jian-bin; Zhu, Lian-hai; Zhou, Yong; Huang, Huai-yu; Liu, Chun-feng

    2016-01-01

    Objective The changes in olfactory bulb (OB) volume in Parkinson’s disease (PD) patients have not yet been comprehensively evaluated. The purpose of this meta-analysis was to explore whether the OB volume was significantly different between PD patients and healthy controls. Methods PubMed and Embase were searched up to March 6, 2015 with no language restrictions. Two independent reviewers screened eligible studies and extracted data on study characteristics and OB volume. Additionally, a syst...

  9. [Environmental investigation of ground water contamination at Wright-Patterson Air Force Base, Ohio]. Volume 2, Work plan: Phase 1, Task 4, Field Investigation: Draft

    Energy Technology Data Exchange (ETDEWEB)

    1991-10-01

    In April 1990 Wright-Patterson Air Force Base (WPAFB) initiated an investigation to evaluate a potential CERCLA removal action to prevent, to the extent practicable, the migration of ground-water contamination in the Mad River Valley Aquifer within and across WPAFB boundaries. The action will be based on a Focused Feasibility Study with an Action Memorandum serving as a decision document that is subject to approval by the Ohio Environmental Protection Agency. The first phase (Phase 1) of this effort involves an investigation of ground-water contamination migrating across the southwest boundary of Area C and across Springfield Pike adjacent to Area B. Task 4 of Phase 1 is a field investigation to collect sufficient additional information to evaluate removal alternatives. The field investigation will provide information in the following specific areas of study: water-level data which will be used to permit calibration of the ground-water flow model to a unique time in history; and ground-water quality data which will be used to characterize the current chemical conditions of ground water.

  10. Analysis of action tremor and impaired control of movement velocity in multiple sclerosis during visually guided wrist-tracking tasks.

    Science.gov (United States)

    Liu, X; Miall, C; Aziz, T Z; Palace, J A; Haggard, P N; Stein, J F

    1997-11-01

    We investigated the relationship between action tremor (AT) and impaired control of movement velocity (MV) in visually guided tracking tasks, in normal subjects and in patients with multiple sclerosis (MS) with or without motor deficits. The effects of withdrawing visual feedback of either the target or the cursor were then investigated. Visually cued simple reaction times (SRTs) were also measured. The effects of thalamotomy on motor performance in these tasks were evaluated in seven patients. In the MS patients with tremor, there was no correlation between AT and impairment in control of MV, but the latter was highly correlated with an increased delay in SRT. Withdrawal of visually guiding cues increased the error significantly in MV, but reduced AT by approximately 30% in magnitude. Frequency analysis indicated that the AT had two components: (1) non-visual-dependent, oscillatory movements, mainly at 4 Hz; and (2) visual-dependent, repetitive movements, with significant power at 1-2 Hz. Thalamotomy significantly reduced AT but hardly improved accuracy in MV. These results suggest that visual feedback of a spatial mismatch signal may provoke a visually dependent repetitive movement contributing to AT. Conduction delays along either the cortico-cerebello-cortical or the proprioceptive pathways and impaired working memory caused by MS may be responsible for the movement disorders in these patients. PMID:9399226
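
    The frequency decomposition described (a ~4 Hz oscillatory component and a 1-2 Hz visually dependent component) can be illustrated with a toy power spectrum on a synthetic signal (not patient data):

```python
import numpy as np

fs = 100.0                    # sampling rate in Hz (hypothetical)
t = np.arange(0, 20, 1 / fs)  # 20 s record
rng = np.random.default_rng(1)
# Synthetic "tremor": a 4 Hz oscillation, a weaker 1.5 Hz component,
# and measurement noise.
signal = (np.sin(2 * np.pi * 4.0 * t)
          + 0.6 * np.sin(2 * np.pi * 1.5 * t)
          + 0.3 * rng.standard_normal(t.size))

power = np.abs(np.fft.rfft(signal)) ** 2
freqs = np.fft.rfftfreq(t.size, 1 / fs)
peak = freqs[np.argmax(power[1:]) + 1]  # dominant non-DC frequency
print(round(peak, 2))  # 4.0
```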

  11. Economic analysis of a volume reduction/polyethylene solidification system for low-level radioactive wastes

    International Nuclear Information System (INIS)

    A study was conducted at Brookhaven National Laboratory to determine the economic feasibility of a fluidized bed volume reduction/polyethylene solidification system for low-level radioactive wastes. These results are compared with the "null" alternative of no volume reduction and solidification of aqueous waste streams in hydraulic cement. The economic analysis employed a levelized revenue requirement (LRR) technique conducted over a ten year period. An interactive computer program was written to conduct the LRR calculations. Both of the treatment/solidification options were considered for a number of scenarios including type of plant (BWR or PWR) and transportation distance to the disposal site. If current trends in the escalation rates of cost components continue, the volume reduction/polyethylene solidification option will be cost effective for both BWRs and PWRs. Data indicate that a minimum net annual savings of $0.8 million per year (for a PWR shipping its waste 750 miles) and a maximum net annual savings of $9 million per year (for a BWR shipping its waste 2500 miles) can be achieved. A sensitivity analysis was performed for the burial cost escalation rate, which indicated that variation of this factor will impact the total levelized revenue requirement. The burial cost escalation rate which yields a break-even condition was determined for each scenario considered. 11 refs., 8 figs., 39 tabs
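
    The abstract does not spell out the report's LRR formulation, so the following is only a hedged sketch of the general levelized revenue requirement idea: present-worth a stream of escalating annual costs, then spread that present worth as a uniform annuity over the study period. All inputs are hypothetical, not taken from the study.

```python
def levelized_revenue_requirement(first_year_cost, escalation, discount, years):
    """Levelized annual cost: present-worth the escalating annual costs,
    then spread that present worth as a uniform annuity."""
    pw = sum(first_year_cost * (1 + escalation) ** t / (1 + discount) ** (t + 1)
             for t in range(years))
    annuity_factor = sum(1.0 / (1 + discount) ** (t + 1) for t in range(years))
    return pw / annuity_factor

# Hypothetical inputs: $2M/yr cost escalating 8%/yr, 5% discount rate, 10 years.
lrr = levelized_revenue_requirement(2.0e6, 0.08, 0.05, 10)
print(round(lrr / 1e6, 2))  # 2.81 ($M/yr)
```

    Raising the burial-cost escalation rate raises the levelized requirement, which is the sensitivity the study examines.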

  12. Final safety analysis report for the Galileo Mission: Volume 1, Reference design document

    Energy Technology Data Exchange (ETDEWEB)

    1988-05-01

    The Galileo mission uses nuclear power sources called Radioisotope Thermoelectric Generators (RTGs) to provide the spacecraft's primary electrical power. Because these generators contain nuclear material, a Safety Analysis Report (SAR) is required. A preliminary SAR and an updated SAR were previously issued that provided an evolving status report on the safety analysis. As a result of the Challenger accident, the launch dates for both Galileo and Ulysses missions were later rescheduled for November 1989 and October 1990, respectively. The decision was made by agreement between the DOE and the NASA to have a revised safety evaluation and report (FSAR) prepared on the basis of these revised vehicle accidents and environments. The results of this latest revised safety evaluation are presented in this document (Galileo FSAR). Volume I, this document, provides the background design information required to understand the analyses presented in Volumes II and III. It contains descriptions of the RTGs, the Galileo spacecraft, the Space Shuttle, the Inertial Upper Stage (IUS), the trajectory and flight characteristics including flight contingency modes, and the launch site. There are two appendices in Volume I which provide detailed material properties for the RTG.

  13. Final Report: LSAC Skills Analysis. Law School Task Survey. LSAC Research Report Series.

    Science.gov (United States)

    Luebke, Stephen W.; Swygert, Kimberly A.; McLeod, Lori D.; Dalessandro, Susan P.; Roussos, Louis A.

    The Law School Admission Council (LSAC) Skills Analysis Survey identifies the skills that are important for success in law school. This information provides validity evidence for the current Law School Admission Test (LSAT) and guides the development of new test items and test specifications. The key question of the survey is "what academic tasks…

  14. Review of licensed staff training plans: benchmarking on task analysis and selection of learning environments

    International Nuclear Information System (INIS)

    The purpose of this paper is to present the findings and possible improvement actions taken after a technical exchange with the U.S. Surry nuclear power plant. The visit focused on the study of the methodology for the analysis and design of training programs according to INPO standards.

  15. Cognitive Task Analysis of Experts in Designing Multimedia Learning Object Guideline (M-LOG)

    Science.gov (United States)

    Razak, Rafiza Abdul; Palanisamy, Punithavathy

    2013-01-01

    The purpose of this study was to design and develop a set of guidelines for multimedia learning objects to inform instructional designers (IDs) about the procedures involved in the process of content analysis. This study was motivated by the absence of standardized procedures in the beginning phase of the multimedia learning object design which is…

  16. Systems Studies Department FY 78 activity report. Volume 2. Systems analysis

    International Nuclear Information System (INIS)

    The Systems Studies Department at Sandia Laboratories Livermore (SLL) has two primary responsibilities: to provide computational and mathematical services and to perform systems analysis studies. This document (Volume 2) describes the FY 78 Systems Analysis highlights. The description is an unclassified overview of activities and is not complete or exhaustive. The objective of the systems analysis activities is to evaluate the relative value of alternative concepts and systems. SLL systems analysis activities reflect Sandia Laboratory programs and in 1978 consisted of study efforts in three areas: national security: evaluations of strategic, theater, and navy nuclear weapons issues; energy technology: particularly in support of Sandia's solar thermal programs; and nuclear fuel cycle physical security: a special project conducted for the Nuclear Regulatory Commission. Highlights of these activities are described in the following sections. 7 figures

  17. Systems Studies Department FY 78 activity report. Volume 2. Systems analysis. [Sandia Laboratories, Livermore

    Energy Technology Data Exchange (ETDEWEB)

    Gold, T.S.

    1979-02-01

    The Systems Studies Department at Sandia Laboratories Livermore (SLL) has two primary responsibilities: to provide computational and mathematical services and to perform systems analysis studies. This document (Volume 2) describes the FY 78 Systems Analysis highlights. The description is an unclassified overview of activities and is not complete or exhaustive. The objective of the systems analysis activities is to evaluate the relative value of alternative concepts and systems. SLL systems analysis activities reflect Sandia Laboratory programs and in 1978 consisted of study efforts in three areas: national security: evaluations of strategic, theater, and navy nuclear weapons issues; energy technology: particularly in support of Sandia's solar thermal programs; and nuclear fuel cycle physical security: a special project conducted for the Nuclear Regulatory Commission. Highlights of these activities are described in the following sections. 7 figures. (RWR)

  18. Evaluation of Cylinder Volume Estimation Methods for In–Cylinder Pressure Trace Analysis

    Directory of Open Access Journals (Sweden)

    Adrian Irimescu

    2012-09-01

    In-cylinder pressure trace analysis is an important investigation tool frequently employed in the study of internal combustion engines. While technical data are usually available for experimental engines, in some cases measurements are performed on automotive engines for which only the most basic geometry features are available. Therefore, several authors have aimed to determine the cylinder volume and the length of the connecting rod by methods other than direct measurement. This study evaluates two such methods. Estimating the connecting rod length from the general engine category was found to be more appropriate than using an equation that predicts cylinder volume with good accuracy around top dead centre for most geometries.
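
    The slider-crank relation underlying such cylinder-volume estimates is standard engine kinematics; a minimal sketch with hypothetical geometry (not the engines of the study):

```python
import math

def cylinder_volume(theta, bore, stroke, conrod_length, clearance_volume):
    """Instantaneous cylinder volume from slider-crank geometry.
    theta is the crank angle in radians, 0 = top dead centre (TDC)."""
    a = stroke / 2.0  # crank radius
    # Piston pin distance from the crank axis, projected on the cylinder axis:
    piston_pos = a * math.cos(theta) + math.sqrt(
        conrod_length ** 2 - (a * math.sin(theta)) ** 2)
    x = conrod_length + a - piston_pos  # piston travel from TDC
    return clearance_volume + math.pi * bore ** 2 / 4.0 * x

# Hypothetical engine: 86 mm bore and stroke, 145 mm rod, 50 cm^3 clearance.
v_tdc = cylinder_volume(0.0, 0.086, 0.086, 0.145, 50e-6)
v_bdc = cylinder_volume(math.pi, 0.086, 0.086, 0.145, 50e-6)
print(round((v_bdc - v_tdc) * 1e6, 1))  # 499.6 (displaced volume in cm^3)
```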

  19. Georgetown University Integrated Community Energy System (GU-ICES). Phase III, Stage I. Feasibility analysis. Final report. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    None

    1980-10-01

    This Feasibility Analysis covers a wide range of studies and evaluations. The Report is divided into five parts. Section 1 contains all material relating to the Institutional Assessment including consideration of the requirements and position of the Potomac Electric Co. as they relate to cogeneration at Georgetown in parallel with the utility (Task 1). Sections 2 through 7 contain all technical information relating to the Alternative Subsystems Analysis (Task 4). This includes the energy demand profiles upon which the evaluations were based (Task 3). It further includes the results of the Life-Cycle-Cost Analyses (Task 5) which are developed in detail in the Appendix for evaluation in the Technical Report. Also included is the material relating to Incremental Savings and Optimization (Task 6) and the Conceptual Design for candidate alternate subsystems (Task 7). Section 8 contains all material relating to the Environmental Impact Assessment (Task 2). The Appendix contains supplementary material including the budget cost estimates used in the life-cycle-cost analyses, the basic assumptions upon which the life-cycle analyses were developed, and the detailed life-cycle-cost analysis for each subsystem considered in detail.

  20. JV Task 99-Integrated Risk Analysis and Contaminant Reduction, Watford City, North Dakota

    Energy Technology Data Exchange (ETDEWEB)

    Jaroslav Solc; Barry W. Botnen

    2007-05-31

    The Energy & Environmental Research Center (EERC) conducted a limited site investigation and risk analyses for hydrocarbon-contaminated soils and groundwater at a Construction Services, Inc., site in Watford City, North Dakota. Site investigation confirmed the presence of free product and high concentrations of residual gasoline-based contaminants in several wells, the presence of 1,2-dichloroethane, and extremely high levels of electrical conductivity indicative of brine residuals in the tank area south of the facility. The risk analysis was based on compilation of information from the site-specific geotechnical investigation, including multiphase extraction pilot test, laser induced fluorescence probing, evaluation of contaminant properties, receptor survey, capture zone analysis and evaluation of well head protection area for municipal well field. The project results indicate that the risks associated with contaminant occurrence at the Construction Services, Inc. site are low and, under current conditions, there is no direct or indirect exposure pathway between the contaminated groundwater and soils and potential receptors.

  1. Space station data system analysis/architecture study. Task 4: System definition report

    Science.gov (United States)

    1985-01-01

    Functional/performance requirements for the Space Station Data System (SSDS) are analyzed and architectural design concepts are derived and evaluated in terms of their performance and growth potential, technical feasibility and risk, and cost effectiveness. The design concepts discussed are grouped under five major areas: SSDS top-level architecture overview, end-to-end SSDS design and operations perspective, communications assumptions and traffic analysis, onboard SSDS definition, and ground SSDS definition.

  2. Improved Duct Systems Task Report with StageGate 2 Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Moyer, Neil [Florida Solar Energy Center, Cocoa, FL (United States); Stroer, Dennis [Calcs-Plus, Venice, FL (United States)

    2007-12-31

    This report describes Building America Industrialized Housing Partnership's work with two industry partners, Davalier Homes and Southern Energy Homes, in constructing and evaluating prototype interior duct systems. Issues of energy performance, comfort, DAPIA approval, manufacturability, and cost are addressed. A Stage Gate 2 analysis addresses the current status of the project, showing that refinements are still needed in the process of incorporating all of the ducts within the air and thermal boundaries of the envelope.

  3. Advanced Turbine Systems Program, Conceptual Design and Product Development. Task 6, System definition and analysis

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-04-01

    The strategy of the ATS program is to develop a new baseline for industrial gas turbine systems for the 21st century, meeting the buying criteria of industrial gas turbine end users, and having growth potential. These criteria guided the Solar ATS Team in selecting the system definition described in this Topical Report. The key to selecting the ATS system definition was meeting or exceeding each technical goal without negatively impacting other commercial goals. Among the most crucial goals are the buying criteria of the industrial gas turbine market. Solar started by preliminarily considering several cycles with the potential to meet ATS program goals. These candidates were initially narrowed based on a qualitative assessment of several factors such as the potential for meeting program goals and for future growth; the probability of successful demonstration within the program's schedule and expected level of funding; and the appropriateness of the cycle in light of end users' buying criteria. A first level Quality Function Deployment (QFD) analysis then translated customer needs into functional requirements, and ensured favorable interaction between concept features. Based on this analysis, Solar selected a recuperated cycle as the best approach to fulfilling both D.O.E. and Solar marketing goals. This report details the design and analysis of the selected engine concept, and explains how advanced features of system components achieve program goals. Estimates of cost, performance, emissions and RAMD (reliability, availability, maintainability, durability) are also documented in this report.

  4. Small Volume Flow Probe for Automated Direct-Injection NMR Analysis: Design and Performance

    Science.gov (United States)

    Haner, Ronald L.; Llanos, William; Mueller, Luciano

    2000-03-01

    A detailed characterization of an NMR flow probe for use in direct-injection sample analysis is presented. A 600-MHz, indirect detection NMR flow probe with a 120-μl active volume is evaluated in two configurations: first as a stand-alone small volume probe for the analysis of static, nonflowing solutions, and second as a component in an integrated liquids-handling system used for high-throughput NMR analysis. In the stand-alone mode, 1H lineshape, sensitivity, radiofrequency (RF) homogeneity, and heat transfer characteristics are measured and compared to conventional-format NMR probes of related design. Commonly used descriptive terminology for the hardware, sample regions, and RF coils are reviewed or defined, and test procedures developed for flow probes are described. The flow probe displayed general performance that is competitive with standard probes. Key advantages of the flow probe include high molar sensitivity, ease of use in an automation setup, and superior reproducibility of magnetic field homogeneity which enables the practical implementation of 1D T2-edited analysis of protein-ligand interactions.

  5. Semiautomatic measuring system for bubble chamber film analysis under control of a multi-task operating system

    International Nuclear Information System (INIS)

    We have constructed a semiautomatic measuring system by applying the on-line diagnosis method developed previously for a manual digitizer to a new track follower, SWEEPNIK. The aim of our system is that all tracks of an event are automatically measured once an interaction vertex is given (Vertex Guidance). The on-line diagnosis gives the following two features to this system. One is a diagnosis of the accuracy of measured data. The other is a diagnosis to reject a spurious track. The former is essential to obtain accurate data. The latter makes the Vertex Guidance reliable. With the help of the multi-task operating system, the on-line diagnosis scarcely increases the overall measuring time. An analysis of 1339 two-prong events shows that the passing rate of the measured events through a general reconstruction program amounts to 95 percent and that we do not need to remeasure in practice. (author)

  6. The Effect of a Workload-Preview on Task-Prioritization and Task-Performance

    Science.gov (United States)

    Minotra, Dev

    2012-01-01

    With the increased volume and sophistication of cyber attacks in recent years, maintaining situation awareness and an effective task-prioritization strategy is critical to the work of cybersecurity analysts. However, the high mental workload associated with the cybersecurity analyst's task limits their ability to prioritize tasks.…

  7. Classifications of Motor Imagery Tasks in Brain Computer Interface Using Linear Discriminant Analysis

    Directory of Open Access Journals (Sweden)

    Roxana Aldea

    2014-07-01

    In this paper, we address a method for motor imagery feature extraction for brain computer interfaces (BCI). Wavelet coefficients were used to extract features from the motor imagery EEG, and linear discriminant analysis was utilized to classify the pattern of left- or right-hand imagery movement and rest. The performance of the proposed method was evaluated using EEG data recorded by us with 8 g.tec active electrodes by means of a g.MOBIlab+ module. The maximum classification accuracy is 91%.
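
    As an illustration of the classification step only (not the authors' wavelet pipeline or their recorded EEG), Fisher's linear discriminant on synthetic stand-in features:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic stand-in for wavelet-coefficient features from two classes
# (e.g. left- vs right-hand imagery): 40 trials x 8 features per class.
X0 = rng.normal(0.0, 1.0, size=(40, 8))
X1 = rng.normal(0.8, 1.0, size=(40, 8))

# Fisher's LDA direction: w = Sw^-1 (mu1 - mu0), threshold midway.
mu0, mu1 = X0.mean(axis=0), X1.mean(axis=0)
Sw = np.cov(X0, rowvar=False) + np.cov(X1, rowvar=False)
w = np.linalg.solve(Sw, mu1 - mu0)
c = w @ (mu0 + mu1) / 2

acc = ((X0 @ w <= c).mean() + (X1 @ w > c).mean()) / 2
print(acc > 0.7)  # separates the two classes well above chance
```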

  8. Ocean thermal energy conversion cold water pipe preliminary design project. Task 2. Analysis for concept selection

    Energy Technology Data Exchange (ETDEWEB)

    None

    1979-04-01

    The successful performance of the CWP is of crucial importance to the overall OTEC system; the pipe itself is considered the most critical part of the entire operation. Because of the importance the CWP, a project for the analysis and design of CWP's was begun in the fall of 1978. The goals of this project were to study a variety of concepts for delivering cold water to an OTEC plant, to analyze and rank these concepts based on their relative cost and risk, and to develop preliminary design for those concepts which seemed most promising. Two representative platforms and sites were chosen: a spar buoy of a Gibbs and Cox design to be moored at a site off Punta Tuna, Puerto Rico, and a barge designed by APL/Johns Hopkins University, grazing about a site approximately 200 miles east of the coast of Brazil. The approach was to concentrate on the most promising concepts and on those which were either of general interest or espoused by others (e.g., steel and concrete concepts). Much of the overall attention, therefore, focused on analyzing rigid and compliant wall design, while stockade (except for the special case of the FRP stockade) and bottom-mounted concepts received less attention. A total of 67 CWP concepts were initially generated and subjected to a screening process. Of these, 16 were carried through design analysis, costing, and ranking. Study results are presented in detail. (WHK)

  9. Update of Part 61 Impacts Analysis Methodology. Methodology report. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Oztunali, O.I.; Roles, G.W.

    1986-01-01

    Under contract to the US Nuclear Regulatory Commission, the Envirosphere Company has expanded and updated the impacts analysis methodology used during the development of the 10 CFR Part 61 rule to allow improved consideration of the costs and impacts of treatment and disposal of low-level waste that is close to or exceeds Class C concentrations. The modifications described in this report principally include: (1) an update of the low-level radioactive waste source term, (2) consideration of additional alternative disposal technologies, (3) expansion of the methodology used to calculate disposal costs, (4) consideration of an additional exposure pathway involving direct human contact with disposed waste due to a hypothetical drilling scenario, and (5) use of updated health physics analysis procedures (ICRP-30). Volume 1 of this report describes the calculational algorithms of the updated analysis methodology.

  10. Flight Technical Error Analysis of the SATS Higher Volume Operations Simulation and Flight Experiments

    Science.gov (United States)

    Williams, Daniel M.; Consiglio, Maria C.; Murdoch, Jennifer L.; Adams, Catherine H.

    2005-01-01

    This paper provides an analysis of Flight Technical Error (FTE) from recent SATS experiments, called the Higher Volume Operations (HVO) Simulation and Flight experiments, which NASA conducted to determine pilot acceptability of the HVO concept for normal operating conditions. Reported are FTE results from simulation and flight experiment data indicating the SATS HVO concept is viable and acceptable to low-time instrument rated pilots when compared with today's system (baseline). Described is the comparative FTE analysis of lateral, vertical, and airspeed deviations from the baseline and SATS HVO experimental flight procedures. Based on FTE analysis, all evaluation subjects, low-time instrument-rated pilots, flew the HVO procedures safely and proficiently in comparison to today's system. In all cases, the results of the flight experiment validated the results of the simulation experiment and confirm the utility of the simulation platform for comparative Human in the Loop (HITL) studies of SATS HVO and Baseline operations.

  11. Update of Part 61 Impacts Analysis Methodology. Methodology report. Volume 1

    International Nuclear Information System (INIS)

    Under contract to the US Nuclear Regulatory Commission, the Envirosphere Company has expanded and updated the impacts analysis methodology used during the development of the 10 CFR Part 61 rule to allow improved consideration of the costs and impacts of treatment and disposal of low-level waste that is close to or exceeds Class C concentrations. The modifications described in this report principally include: (1) an update of the low-level radioactive waste source term, (2) consideration of additional alternative disposal technologies, (3) expansion of the methodology used to calculate disposal costs, (4) consideration of an additional exposure pathway involving direct human contact with disposed waste due to a hypothetical drilling scenario, and (5) use of updated health physics analysis procedures (ICRP-30). Volume 1 of this report describes the calculational algorithms of the updated analysis methodology

  12. A Meta-analysis of the Wisconsin Card Sort Task in Autism.

    Science.gov (United States)

    Landry, Oriane; Al-Taie, Shems

    2016-04-01

    We conducted a meta-analysis of 31 studies, spanning 30 years, utilizing the WCST in participants with autism. We calculated Cohen's d effect sizes for four measures of performance: sets completed, perseveration, failure-to-maintain-set, and non-perseverative errors. The average weighted effect size ranged from 0.30 to 0.74 for each measure, all statistically greater than 0. No evidence was found for reduced impairment when WCST is administered by computer. Age and PIQ predicted perseverative error rates, while VIQ predicted non-perseverative error rates, and both perseverative and non-perseverative error rates in turn predicted number of sets completed. No correlates of failure-to-maintain set errors were found; further research is warranted on this aspect of WCST performance in autism. PMID:26614085
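
The average weighted effect sizes described above are conventionally obtained by inverse-variance weighting of per-study Cohen's d values. A sketch under that standard fixed-effect approach (the study values below are made up, not the 31 studies analyzed):

```python
import numpy as np

# Hypothetical per-study Cohen's d and group sizes (illustrative only)
d = np.array([0.45, 0.62, 0.30, 0.74])
n1 = np.array([20, 35, 18, 26])   # autism groups
n2 = np.array([22, 30, 20, 25])   # comparison groups

# Approximate sampling variance of Cohen's d
var_d = (n1 + n2) / (n1 * n2) + d ** 2 / (2 * (n1 + n2))

# Fixed-effect inverse-variance weighting
w = 1.0 / var_d
d_weighted = np.sum(w * d) / np.sum(w)
se = np.sqrt(1.0 / np.sum(w))
z = d_weighted / se   # z > 1.96 indicates the pooled effect differs from 0
```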

  13. Dose-volume complication analysis for visual pathway structures of patients with advanced paranasal sinus tumors

    International Nuclear Information System (INIS)

    Purpose: The purpose of the present work was to relate dose and volume information to complication data for visual pathway structures in patients with advanced paranasal sinus tumors. Methods and Materials: Three-dimensional (3D) dose distributions for chiasm, optic nerve, and retina were calculated and analyzed for 20 patients with advanced paranasal sinus malignant tumors. 3D treatment planning with beam's eye view capability was used to design beam and block arrangements, striving to spare the contralateral orbit (to lessen the chance of unilateral blindness) and frequently the ipsilateral orbit (to help prevent bilateral blindness). Point doses, dose-volume histogram analysis, and normal tissue complication probability (NTCP) calculations were performed. Published tolerance doses that indicate significant risk of complications were used as guidelines for analysis of the 3D dose distributions. Results: Point doses, percent volume exceeding a specified published tolerance dose, and NTCP calculations are given in detail for patients with complications versus patients without complications. Two optic nerves receiving maximum doses below the published tolerance dose sustained damage (mild vision loss). Three patients (of 13) without optic nerve sparing and/or chiasm sparing had moderate or severe vision loss. Complication data, including individual patient analysis to estimate overall risk for loss of vision, are given. Conclusion: 3D treatment planning techniques were used successfully to provide bilateral sparing of the globe for most patients. It was more difficult to spare the optic nerves, especially on the ipsilateral side, when prescription dose exceeded the normal tissue tolerance doses. NTCP calculations may be useful in assessing complication risk better than point dose tolerance criteria for the chiasm, optic nerve, and retina. It is important to assess the overall risk of blindness for the patient in addition to the risk for individual visual pathway structures.
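
NTCP values of the kind referred to above are commonly computed with the Lyman model, in which the complication probability is the standard normal CDF of a dose-based argument. A sketch (the TD50 and m values below are illustrative assumptions, not the tolerance parameters used in the study):

```python
import math

def lyman_ntcp(eud, td50, m):
    """Lyman NTCP: Phi((EUD - TD50) / (m * TD50)), Phi = standard normal CDF."""
    t = (eud - td50) / (m * td50)
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

# Illustrative parameters for an optic-nerve-like structure (assumed values)
td50, m = 65.0, 0.14               # Gy; dose at 50% risk, and slope parameter
ntcp_low = lyman_ntcp(54.0, td50, m)
ntcp_high = lyman_ntcp(72.0, td50, m)
```

By construction the model returns exactly 0.5 when the effective dose equals TD50, and rises monotonically with dose.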

  14. Coupled Structural, Thermal, Phase-change and Electromagnetic Analysis for Superconductors, Volume 2

    Science.gov (United States)

    Felippa, C. A.; Farhat, C.; Park, K. C.; Militello, C.; Schuler, J. J.

    1996-01-01

    Described are the theoretical development and computer implementation of reliable and efficient methods for the analysis of coupled mechanical problems that involve the interaction of mechanical, thermal, phase-change and electromagnetic subproblems. The focus application has been the modeling of superconductivity and associated quantum-state phase change phenomena. In support of this objective the work has addressed the following issues: (1) development of variational principles for finite elements, (2) finite element modeling of the electromagnetic problem, (3) coupling of thermal and mechanical effects, and (4) computer implementation and solution of the superconductivity transition problem. The main accomplishments have been: (1) the development of the theory of parametrized and gauged variational principles, (2) the application of those principles to the construction of electromagnetic, thermal and mechanical finite elements, (3) the coupling of electromagnetic finite elements with thermal and superconducting effects, and (4) the first detailed finite element simulations of bulk superconductors, in particular the Meissner effect and the nature of the normal conducting boundary layer. The theoretical development is described in two volumes. Volume 1 describes mostly formulation-specific problems. Volume 2 describes generalizations of those formulations.

  15. Fatigue monitoring and analysis of orthotropic steel deck considering traffic volume and ambient temperature

    Institute of Scientific and Technical Information of China (English)

    SONG; YongSheng; DING; YouLiang

    2013-01-01

    Fatigue has gradually become a serious issue for orthotropic steel decks used in long-span bridges. Two fatigue effects, namely number of stress cycles and equivalent stress amplitude, were introduced as investigated parameters in this paper. Investigation was focused on their relationships with traffic volume and ambient temperature by using seven months of fatigue monitoring data from an actual bridge. A fatigue analytical model considering temperature-induced changes in material property of asphalt pavement was established for verifying these relationships. The analysis results revealed that the number of stress cycles and equivalent stress amplitude showed a linear correlation with the traffic volume and ambient temperature, respectively, and that the rib-to-deck welded joint was much more sensitive to the traffic volume and ambient temperature than the rib-to-rib welded joint. The applicability of the code-recommended model for fatigue vehicle loading was also discussed, which revealed that the deterministic vehicle loading model requires improvement to account for significant randomness of the actual traffic conditions.
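
Linear correlations of the kind reported can be checked with an ordinary least-squares fit of the monitored fatigue effect against traffic volume. A sketch with made-up daily records (not the bridge's monitoring data):

```python
import numpy as np

# Hypothetical daily traffic volume (vehicles/day) and counted stress cycles
traffic = np.array([8000, 9500, 11000, 12500, 14000], dtype=float)
cycles = np.array([3.1e4, 3.7e4, 4.2e4, 4.9e4, 5.5e4])

slope, intercept = np.polyfit(traffic, cycles, 1)  # least-squares line
r = np.corrcoef(traffic, cycles)[0, 1]             # strength of the linear relation
```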

  16. Volume estimation of low-contrast lesions with CT: a comparison of performances from a phantom study, simulations and theoretical analysis

    International Nuclear Information System (INIS)

    Measurements of lung nodule volume with multi-detector computed tomography (MDCT) have been shown to be more accurate and precise compared to conventional lower dimensional measurements. Quantifying the size of lesions is potentially more difficult when the object-to-background contrast is low as with lesions in the liver. Physical phantom and simulation studies are often utilized to analyze the bias and variance of lesion size estimates because a ground truth or reference standard can be established. In addition, it may also be useful to derive theoretical bounds as another way of characterizing lesion sizing methods. The goal of this work was to study the performance of a MDCT system for a lesion volume estimation task with object-to-background contrast less than 50 HU, and to understand the relation among performances obtained from phantom study, simulation and theoretical analysis. We performed both phantom and simulation studies, and analyzed the bias and variance of volume measurements estimated by a matched-filter-based estimator. We further corroborated results with a theoretical analysis to estimate the achievable performance bound, which was the Cramér–Rao lower bound (CRLB) on the variance of the size estimates. Results showed that estimates of non-attached solid small lesion volumes with object-to-background contrast of 31–46 HU can be accurate and precise, with less than 10.8% in percent bias and 4.8% in standard deviation of percent error (SPE), in standard dose scans. These results are consistent with theoretical (CRLB), computational (simulation) and empirical phantom bounds. The difference between the bounds is rather small (for SPE less than 1.9%) indicating that the theoretical- and simulation-based performance bounds can be good surrogates for physical phantom studies. (paper)
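
The bias and precision figures quoted (percent bias, SPE) come from the distribution of percent error over repeated volume estimates. A sketch of how those two statistics are computed from simulated estimator output (the true volume, bias, and noise level below are invented):

```python
import numpy as np

rng = np.random.default_rng(0)
true_vol = 500.0  # mm^3, hypothetical lesion volume
# Simulated estimator output: ~2% systematic bias, ~4% random error (assumed)
estimates = true_vol * (1.0 + rng.normal(0.02, 0.04, size=200))

percent_error = 100.0 * (estimates - true_vol) / true_vol
percent_bias = percent_error.mean()      # systematic error component
spe = percent_error.std(ddof=1)          # standard deviation of percent error
```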

  17. Vigilance Task-Related Change in Brain Functional Connectivity as Revealed by Wavelet Phase Coherence Analysis of Near-Infrared Spectroscopy Signals.

    Science.gov (United States)

    Wang, Wei; Wang, Bitian; Bu, Lingguo; Xu, Liwei; Li, Zengyong; Fan, Yubo

    2016-01-01

    This study aims to assess the vigilance task-related change in connectivity in healthy adults using wavelet phase coherence (WPCO) analysis of near-infrared spectroscopy signals (NIRS). NIRS is a non-invasive neuroimaging technique for assessing brain activity. Continuous recordings of the NIRS signals were obtained from the prefrontal cortex (PFC) and sensorimotor cortical areas of 20 young healthy adults (24.9 ± 3.3 years) during a 10-min resting state and a 20-min vigilance task state. The vigilance task simulated driving mental load by requiring subjects to judge whether three random numbers were odd. The task was divided into two sessions: the first 10 min (Task t1) and the second 10 min (Task t2). The WPCO of six channel pairs were calculated in five frequency intervals: 0.6-2 Hz (I), 0.145-0.6 Hz (II), 0.052-0.145 Hz (III), 0.021-0.052 Hz (IV), and 0.0095-0.021 Hz (V). The significant WPCO formed global connectivity (GC) maps in intervals I and II and functional connectivity (FC) maps in intervals III to V. Results show that the GC levels in interval I and FC levels in interval III were significantly lower in the Task t2 than in the resting state (p < 0.05), particularly between the left PFC and bilateral sensorimotor regions. Also, reaction time (RT) increased in Task t2 compared with Task t1. However, no significant difference in WPCO was found between Task t1 and resting state. The results showed that the change in FC at the range of 0.6-2 Hz was not attributed to the vigilance task per se, but the interaction effect of vigilance task and time factors. The findings suggest that the decreased attention level might be partly attributed to the reduced GC levels between the left prefrontal region and sensorimotor area. The present results provide a new insight into the vigilance task-related brain activity. PMID:27547182
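
Phase coherence measures how stable the phase difference between two signals is within a frequency band. A rough, self-contained surrogate for the wavelet approach uses band-pass filtering plus the Hilbert transform (the signals, sampling rate, and band below are invented for illustration):

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 10.0                             # Hz, assumed NIRS sampling rate
t = np.arange(0, 600, 1 / fs)         # 10 minutes of data
rng = np.random.default_rng(1)
common = np.sin(2 * np.pi * 0.1 * t)  # shared 0.1 Hz oscillation
x = common + 0.5 * rng.standard_normal(t.size)
y = common + 0.5 * rng.standard_normal(t.size)

# Band-pass to interval III (0.052-0.145 Hz), then extract instantaneous phase
b, a = butter(2, [0.052, 0.145], btype="band", fs=fs)
phx = np.angle(hilbert(filtfilt(b, a, x)))
phy = np.angle(hilbert(filtfilt(b, a, y)))

# Phase coherence: length of the mean phase-difference vector, 0 (random) to 1
coherence = np.abs(np.mean(np.exp(1j * (phx - phy))))
```

Two channels driven by the same in-band oscillation, as here, yield coherence near 1; independent noise yields values near 0.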

  19. Composite materials. Volume 3 - Engineering applications of composites. Volume 4 - Metallic matrix composites. Volume 8 - Structural design and analysis, Part 2

    Science.gov (United States)

    Noton, B. R. (Editor); Kreider, K. G.; Chamis, C. C.

    1974-01-01

    This volume discusses a variety of applications of both low- and high-cost composite materials in a number of selected engineering fields. The text stresses the use of fiber-reinforced composites, along with interesting material systems used in the electrical and nuclear industries. As to technology transfer, a similarity is noted between many of the reasons responsible for the utilization of composites and those problems requiring urgent solution, such as mechanized fabrication processes and design for production. Featured topics include road transportation, rail transportation, civil aircraft, space vehicles, building industry, chemical plants, and appliances and equipment. The laminate orientation code devised by the Air Force Materials Laboratory is included. Individual items are announced in this issue.

  20. Open Educational Resources from Performance Task using Video Analysis and Modeling - Tracker and K12 science education framework

    CERN Document Server

    Wee, Loo Kang

    2014-01-01

    This invited paper discusses why the Physics performance task undertaken by grade 9 students in Singapore is worth participating in, for two reasons: 1) the video analysis and modeling resources are open access, licensed under Creative Commons attribution to advance open educational resources in the world, and 2) the task allows students to work like physicists, as the K12 science education framework advocates. Personal reflections are offered on how physics education can be made more meaningful, in particular through Practice 1: Ask Questions, Practice 2: Use Models, and Practice 5: Mathematical and Computational Thinking, using video modeling supported by evidence-based data from video analysis. This paper hopes to spur fellow colleagues to look into open education initiatives such as our Singapore Tracker community open educational resources curated at http://weelookang.blogspot.sg/p/physics-applets-virtual-lab.html as well as digital libraries http://iwant2study.org/lookangejss/ directly accessible through Tracker 4.86, the EJSS reader app on Android and iOS, and EJS 5....

  1. Beyond the initial 140 ms, lexical decision and reading aloud are different tasks: An ERP study with topographic analysis

    OpenAIRE

    Mahe, Gwendoline; Zesiger, Pascal Eric; Laganaro, Marina

    2015-01-01

    Most of our knowledge on the time-course of the mechanisms involved in reading derived from electrophysiological studies is based on lexical decision tasks. By contrast, very few ERP studies investigated the processes involved in reading aloud. It has been suggested that the lexical decision task provides a good index of the processes occurring during reading aloud, with only late processing differences related to task response modalities. However, some behavioral studies reported different s...

  2. Portable microcomputer for the analysis of plutonium gamma-ray spectra. Volume I. Data analysis methodology and hardware description

    International Nuclear Information System (INIS)

    A portable microcomputer has been developed and programmed for the International Atomic Energy Agency (IAEA) to perform in-field analysis of plutonium gamma-ray spectra. The unit includes a 16-bit LSI-11/2 microprocessor, 32-K words of memory, a 20-character display for user prompting, a numeric keyboard for user responses, and a 20-character thermal printer for hard-copy output of results. The unit weighs 11 kg and has dimensions of 33.5 x 30.5 x 23.0 cm. This compactness allows the unit to be stored under an airline seat. Only the positions of the 148-keV 241Pu and 208-keV 237U peaks are required for spectral analysis that gives plutonium isotopic ratios and weight percent abundances. Volume I of this report provides a detailed description of the data analysis methodology, operation instructions, hardware, and maintenance and troubleshooting. Volume II describes the software and provides software listings

  3. Analysis of volume expansion data for periclase, lime, corundum and spinel at high temperatures

    Indian Academy of Sciences (India)

    B P Singh; H Chandra; R Shyam; A Singh

    2012-08-01

    We have presented an analysis of the volume expansion data for periclase (MgO), lime (CaO), corundum (Al2O3) and spinel (MgAl2O4) determined experimentally by Fiquet et al (1999) from 300K up to 3000K. The thermal equations of state due to Suzuki et al (1979) and Shanker et al (1997) are used to study the relationships between thermal pressure and volume expansion for the entire range of temperatures starting from room temperature up to the melting temperatures of the solids under study. Comparison of the results obtained in the present study with the corresponding experimental data reveals that the thermal pressure changes with temperature almost linearly up to quite high temperatures. At extremely high temperatures close to the melting temperatures thermal pressure deviates significantly from linearity. This prediction is consistent with other recent investigations. A quantitative analysis based on the theory of anharmonic effects has been presented to account for the nonlinear variation of the thermal pressure at high temperatures.
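
In the linear regime described, the thermal pressure follows Delta_P ≈ alpha * K_T * Delta_T. A one-function numerical sketch (the MgO property values below are rough textbook figures, assumed for illustration):

```python
# Linear thermal-pressure approximation: Delta_P = alpha * K_T * (T - T0)
ALPHA = 3.1e-5  # 1/K, volume thermal expansivity of MgO (assumed)
K_T = 160e9     # Pa, isothermal bulk modulus of MgO (assumed)

def thermal_pressure(T, T0=300.0):
    """Thermal pressure in Pa for temperature T in K (linear regime only)."""
    return ALPHA * K_T * (T - T0)
```

Near the melting temperature, the abstract notes, this linear form breaks down and anharmonic corrections are needed.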

  4. Network analysis of returns and volume trading in stock markets: The Euro Stoxx case

    Science.gov (United States)

    Brida, Juan Gabriel; Matesanz, David; Seijas, Maria Nela

    2016-02-01

    This study applies network analysis to analyze the structure of the Euro Stoxx market during the long period from 2002 up to 2014. The paper generalizes previous research on stock market networks by including asset returns and volume trading as the main variables to study the financial market. A multidimensional generalization of the minimal spanning tree (MST) concept is introduced, by adding the role of trading volume to the traditional approach which only includes price returns. Additionally, we use symbolization methods to the raw data to study the behavior of the market structure in different, normal and critical, situations. The hierarchical organization of the network is derived, and the MST for different sub-periods of 2002-2014 is created to illustrate how the structure of the market evolves over time. From the structural topologies of these trees, different clusters of companies are identified and analyzed according to their geographical and economic links. Two important results are achieved. Firstly, as other studies have highlighted, at the time of the financial crisis after 2008 the network becomes more centralized. Secondly, and most importantly, during our second period of analysis, 2008-2014, we observe that the hierarchy becomes more country-specific: sub-clusters of stocks belonging to France, Germany, Spain or Italy form apart from their business sector groups. This result may suggest that during this period of time financial investors seem to be worried most about country specific economic circumstances.
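
A stock-market MST of this kind is typically built from the Mantegna distance d_ij = sqrt(2 * (1 - rho_ij)) derived from return correlations. A self-contained sketch with random stand-in return data (not Euro Stoxx data):

```python
import numpy as np

def mst_prim(dist):
    """Prim's algorithm on a symmetric distance matrix; returns n-1 edges."""
    n = dist.shape[0]
    in_tree, edges = {0}, []
    while len(in_tree) < n:
        best = None
        for i in in_tree:
            for j in range(n):
                if j not in in_tree and (best is None or dist[i, j] < dist[best]):
                    best = (i, j)
        edges.append(best)
        in_tree.add(best[1])
    return edges

rng = np.random.default_rng(2)
returns = rng.standard_normal((250, 6))  # 250 days x 6 hypothetical stocks
corr = np.corrcoef(returns.T)
# Mantegna distance in [0, 2]; clip guards against rounding slightly above rho=1
dist = np.sqrt(np.clip(2.0 * (1.0 - corr), 0.0, None))
tree = mst_prim(dist)                    # list of (i, j) edges
```

The paper's multidimensional generalization would replace this single correlation matrix with one that also incorporates trading-volume information.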

  5. Beyond the initial 140 ms, lexical decision and reading aloud are different tasks: An ERP study with topographic analysis.

    Science.gov (United States)

    Mahé, Gwendoline; Zesiger, Pascal; Laganaro, Marina

    2015-11-15

    Most of our knowledge on the time-course of the mechanisms involved in reading derived from electrophysiological studies is based on lexical decision tasks. By contrast, very few ERP studies investigated the processes involved in reading aloud. It has been suggested that the lexical decision task provides a good index of the processes occurring during reading aloud, with only late processing differences related to task response modalities. However, some behavioral studies reported different sensitivity to psycholinguistic factors between the two tasks, suggesting that print processing could differ at earlier processing stages. The aim of the present study was thus to carry out an ERP comparison between lexical decision and reading aloud in order to determine when print processing differs between these two tasks. Twenty native French speakers performed a lexical decision task and a reading aloud task with the same written stimuli. Results revealed different electrophysiological patterns on both waveform amplitudes and global topography between lexical decision and reading aloud from about 140 ms after stimulus presentation for both words and pseudowords, i.e., as early as the N170 component. These results suggest that only very early, low-level visual processes are common to the two tasks which differ in core processes. Taken together, our main finding questions the use of the lexical decision task as an appropriate paradigm to investigate reading processes and warns against generalizing its results to word reading. PMID:26244274

  6. Human factors evaluation of teletherapy: Training and organizational analysis. Volume 4

    International Nuclear Information System (INIS)

    A series of human factors evaluations were undertaken to better understand the contributing factors to human error in the teletherapy environment. Teletherapy is a multidisciplinary methodology for treating cancerous tissue through selective exposure to an external beam of ionizing radiation. A team of human factors specialists, assisted by a panel of radiation oncologists, medical physicists, and radiation therapists, conducted site visits to radiation oncology departments at community hospitals, university centers, and free-standing clinics. A function and task analysis was initially performed to guide subsequent evaluations in the areas of system-user interfaces, procedures, training and qualifications, and organizational policies and practices. The present work focuses solely on training and qualifications of personnel (e.g., training received before and during employment), and the potential impact of organizational factors on the performance of teletherapy. Organizational factors include such topics as adequacy of staffing, performance evaluations, commonly occurring errors, implementation of quality assurance programs, and organizational climate

  7. Human factors evaluation of teletherapy: Training and organizational analysis. Volume 4

    Energy Technology Data Exchange (ETDEWEB)

    Henriksen, K.; Kaye, R.D.; Jones, R. [Hughes Training, Inc., Falls Church, VA (United States); Morisseau, D.S.; Serig, D.I. [Nuclear Regulatory Commission, Washington, DC (United States). Div. of Systems Technology

    1995-07-01

    A series of human factors evaluations were undertaken to better understand the contributing factors to human error in the teletherapy environment. Teletherapy is a multidisciplinary methodology for treating cancerous tissue through selective exposure to an external beam of ionizing radiation. A team of human factors specialists, assisted by a panel of radiation oncologists, medical physicists, and radiation therapists, conducted site visits to radiation oncology departments at community hospitals, university centers, and free-standing clinics. A function and task analysis was initially performed to guide subsequent evaluations in the areas of system-user interfaces, procedures, training and qualifications, and organizational policies and practices. The present work focuses solely on training and qualifications of personnel (e.g., training received before and during employment), and the potential impact of organizational factors on the performance of teletherapy. Organizational factors include such topics as adequacy of staffing, performance evaluations, commonly occurring errors, implementation of quality assurance programs, and organizational climate.

  8. ITER task title - source term data, modelling, and analysis. ITER subtask no. S81TT05/5 (SEP 1-1). Global tritium source term analysis basis document. Subtask 1: operational tritium effluents and releases. Final report (1995 TASK)

    International Nuclear Information System (INIS)

    This document represents the final report for the global tritium source term analysis task initiated in 1995. The report presents a room-by-room map/table at the subsystem level for the ITER tritium systems, identifying the major equipment, secondary containments, tritium release sources, duration/frequency of tritium releases and the release pathways. The chronic tritium releases during normal operation, as well as tritium releases due to routine maintenance of the Water Distillation Unit, Isotope Separation System and Primary and Secondary Heat Transport Systems, have been estimated for most of the subsystems, based on the IDR design, the Design Description Documents (April - Jun 1995 issues) and the design updates up to December 1995. The report also outlines the methodology and the key assumptions that are adopted in preparing the tritium release estimates. The design parameters for the ITER Basic Performance Phase (BPP) have been used in estimating the tritium releases shown in the room-by-room map/table. The tritium release calculations and the room-by-room map/table have been prepared in EXCEL, so that the estimates can be refined easily as the design evolves and more detailed information becomes available. (author). 23 refs., tabs

  9. Parameter uncertainty analysis in the task of internal dose reconstruction based on 241Am organ activity measurements

    International Nuclear Information System (INIS)

    Retrospective individual dose assessment of workers chronically exposed to plutonium is an important task in investigation of possible health effects from internal plutonium depositions. In most cases inhalation is the primary mode of the plutonium exposures, though an additional route of plutonium intake through wounds (pinpricks) also occurs in some cases. Estimation of systemic and total body deposition of plutonium from urinalysis is usually used to carry out the task of individual dose assessment. But this technique used alone gives no information about the routes of intake and solubility of aerosol particles, which could result in wrong lung dose calculations. Direct in vivo measurements of 241Am content in lungs, skeleton and liver by whole-body counting technique allow one to improve the estimation of the organ deposition of plutonium and the estimation of the individual lung doses. Our method of plutonium dose calculating uses the fact that plutonium of reactor origin is accompanied by 241Am that has grown in from decay of the 241Pu parent. The algorithm applies the new ICRP Publication 66/67 models and takes into account such parameters as 241Pu/239Pu ratio, the activity ratio of 239Pu to the sum of alpha emitting plutonium isotopes, effective aging time of mixed fission products and actinides, particle size distribution of plutonium aerosols (AMAD) and a rate of intake within the period of employment. The purpose of this paper is to quantify the reliability of the model's prediction and retrospective dose calculations by the parameter uncertainty analysis. It takes into account the uncertainty of all parameters describing the components of the plutonium aerosol, AMAD and the rate of intake. The parameter uncertainty analysis involved assigning probability distributions to each parameter and the use of Monte Carlo simulation technique to produce a quantitative statement of confidence in the model's prediction. It is shown that cumulative distribution
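
Monte Carlo parameter uncertainty analysis of this kind samples each uncertain input from its assigned distribution and propagates the samples through the dose model. A schematic sketch (the distributions and the stand-in dose function below are invented, not the paper's model):

```python
import numpy as np

rng = np.random.default_rng(3)
N = 10_000
# Hypothetical input distributions (illustrative only)
amad = rng.lognormal(mean=np.log(5.0), sigma=0.3, size=N)  # um, particle size
ratio = rng.normal(loc=0.08, scale=0.01, size=N)           # 241Pu/239Pu activity ratio
intake = rng.lognormal(mean=0.0, sigma=0.5, size=N)        # relative intake rate

# Stand-in dose model: any deterministic function of the sampled inputs
dose = intake * (1.0 + ratio) / np.sqrt(amad)

# Confidence statement read off the propagated distribution
lo, med, hi = np.percentile(dose, [2.5, 50.0, 97.5])
```

The interval (lo, hi) is the kind of quantitative confidence statement the abstract refers to.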

  10. A simple algorithm for subregional striatal uptake analysis with partial volume correction in dopaminergic PET imaging

    International Nuclear Information System (INIS)

    In positron emission tomography (PET) of the dopaminergic system, quantitative measurements of nigrostriatal dopamine function are useful for differential diagnosis. A subregional analysis of striatal uptake enables the diagnostic performance to be more powerful. However, the partial volume effect (PVE) induces an underestimation of the true radioactivity concentration in small structures. This work proposes a simple algorithm for subregional analysis of striatal uptake with partial volume correction (PVC) in dopaminergic PET imaging. The PVC algorithm analyzes the separate striatal subregions and takes into account the PVE based on the recovery coefficient (RC). The RC is defined as the ratio of the PVE-uncorrected to PVE-corrected radioactivity concentration, and is derived from a combination of the traditional volume of interest (VOI) analysis and the large VOI technique. The clinical studies, comprising 11 patients with Parkinson's disease (PD) and 6 healthy subjects, were used to assess the impact of PVC on the quantitative measurements. Simulations on a numerical phantom that mimicked realistic healthy and neurodegenerative situations were used to evaluate the performance of the proposed PVC algorithm. In both the clinical and the simulation studies, the striatal-to-occipital ratio (SOR) values for the entire striatum and its subregions were calculated with and without PVC. In the clinical studies, the SOR values in each structure (caudate, anterior putamen, posterior putamen, putamen, and striatum) were significantly higher by using PVC in contrast to those without. Among the PD patients, the SOR values in each structure and quantitative disease severity ratings were shown to be significantly related only when PVC was used. For the simulation studies, the average absolute percentage errors of the SOR estimates before and after PVC were 22.74% and 1.54% in the healthy situation, respectively; those in the neurodegenerative situation were 20.69% and 2
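
An RC-based correction divides each measured regional concentration by its recovery coefficient (the abstract defines RC as uncorrected/corrected concentration), after which the SOR is computed. A sketch with invented concentrations and RCs:

```python
# Hypothetical measured activity concentrations (arbitrary units)
measured = {"caudate": 2.1, "putamen": 2.4, "occipital": 1.0}
# Assumed recovery coefficients (structure-size dependent, RC <= 1)
rc = {"caudate": 0.65, "putamen": 0.75, "occipital": 1.0}

# RC = uncorrected / corrected, so dividing undoes the partial volume effect
corrected = {k: v / rc[k] for k, v in measured.items()}

def sor(values, region, ref="occipital"):
    """Striatal-to-occipital ratio: (region - reference) / reference."""
    return (values[region] - values[ref]) / values[ref]

sor_raw = sor(measured, "caudate")   # PVE-uncorrected
sor_pvc = sor(corrected, "caudate")  # PVE-corrected (higher, as in the study)
```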

  11. The history of NATO TNF policy: The role of studies, analysis and exercises conference proceedings. Volume 1, Introduction and summary

    Energy Technology Data Exchange (ETDEWEB)

    Rinne, R.L. [ed.

    1994-02-01

    This conference was organized to study and analyze the role of simulation, analysis, modeling, and exercises in the history of NATO policy. The premise was not that the results of past studies will apply to future policy, but rather that understanding what influenced the decision process -- and how -- would be of value. The structure of the conference was built around discussion panels. The panels were augmented by a series of papers and presentations focusing on particular TNF events, issues, studies or exercise. The conference proceedings consist of three volumes. This volume, Volume 1, contains the conference introduction, agenda, biographical sketches of principal participants, and analytical summary of the presentations and discussion panels. Volume 2 contains a short introduction and the papers and presentations from the conference. Volume 3 contains selected papers by Brig. Gen. Robert C. Richardson III (Ret.).

  12. The history of NATO TNF policy: The role of studies, analysis and exercises conference proceedings. Volume 2: Papers and presentations

    Energy Technology Data Exchange (ETDEWEB)

    Rinne, R.L.

    1994-02-01

    This conference was organized to study and analyze the role of simulation, analysis, modeling, and exercises in the history of NATO policy. The premise was not that the results of past studies will apply to future policy, but rather that understanding what influenced the decision process -- and how -- would be of value. The structure of the conference was built around discussion panels. The panels were augmented by a series of papers and presentations focusing on particular TNF events, issues, studies, or exercises. The conference proceedings consist of three volumes. Volume 1 contains the conference introduction, agenda, biographical sketches of principal participants, and analytical summary of the presentations and panels. This volume contains a short introduction and the papers and presentations from the conference. Volume 3 contains selected papers by Brig. Gen. Robert C. Richardson III (Ret.). Individual papers in this volume were abstracted and indexed for the database.

  13. A Typology of Tasks for Mobile-Assisted Language Learning: Recommendations from a Small-Scale Needs Analysis

    Science.gov (United States)

    Park, Moonyoung; Slater, Tammy

    2014-01-01

    In response to the research priorities of members of TESOL (Teachers of English to Speakers of Other Languages), this study investigated language learners' real-world tasks in mobile-assisted language learning (MALL) to inform the future development of pedagogic tasks for academic English as a second language (ESL) courses. The data included…

  14. Reflective Analysis as a Tool for Task Redesign: The Case of Prospective Elementary Teachers Solving and Posing Fraction Comparison Problems

    Science.gov (United States)

    Thanheiser, Eva; Olanoff, Dana; Hillen, Amy; Feldman, Ziv; Tobias, Jennifer M.; Welder, Rachael M.

    2016-01-01

    Mathematical task design has been a central focus of the mathematics education research community over the last few years. In this study, six university teacher educators from six different US institutions formed a community of practice to explore key aspects of task design (planning, implementing, reflecting, and modifying) in the context of…

  15. Evaluation of an inertial sensor system for analysis of timed-up-and-go under dual-task demands.

    Science.gov (United States)

    Coulthard, Jason T; Treen, Tanner T; Oates, Alison R; Lanovaz, Joel L

    2015-05-01

    Functional tests, such as the timed-up-and-go (TUG), are routinely used to screen for mobility issues and fall risk. While the TUG is easy to administer and evaluate, its single time-to-completion outcome may not discriminate between different mobility challenges. Wearable sensors provide an opportunity to collect a variety of additional variables during clinical tests. The purpose of this study was to assess a new wearable inertial sensor system (iTUG) by investigating the effects of cognitive tasks in a dual-task paradigm on spatiotemporal and kinematic variables during the TUG. No previous studies have looked at both spatiotemporal variables and kinematics during dual-task TUG tests. Twenty healthy young participants (10 males) performed a total of 15 TUG trials with two different cognitive tasks and a normal control condition. Total time, along with spatiotemporal gait parameters and kinematics for all TUG subtasks (sit-to-stand, walking, turn, turn-to-sit), was measured using the inertial sensors. Time-to-completion from the iTUG was highly correlated with concurrent manual timing. Spatiotemporal variables during walking showed expected differences between control and cognitive dual-tasks, while trunk kinematics appeared more sensitive to dual-tasks than previously reported in straight-line walking. Non-walking TUG subtasks showed only minor changes during dual-task conditions, indicating a possible attentional shift away from the cognitive task. Stride length and some variability measures were significantly different between the two cognitive tasks, suggesting an ability to discriminate between tasks. Overall, the iTUG system allows the collection of both traditional and potentially more discriminatory variables with a protocol that is easily used in a clinical setting. PMID:25827680
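
    A common way to quantify the dual-task effects discussed above is the dual-task cost: the relative change in a gait variable from the single-task to the dual-task condition. This is a standard convention rather than a metric reported in the abstract, and the numbers below are illustrative.

```python
# Dual-task cost (DTC): percent change in a gait variable between the
# single-task (control) and dual-task conditions.

def dual_task_cost(single_value, dual_value):
    """Negative values indicate a decrement under the dual task."""
    return 100.0 * (dual_value - single_value) / single_value

# e.g. stride length dropping from 1.40 m (control TUG) to 1.26 m under a
# cognitive task: roughly a 10% decrement.
dtc = dual_task_cost(1.40, 1.26)
```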

  16. Investigation of Advanced Counterrotation Blade Configuration Concepts for High Speed Turboprop Systems. Task 2: Unsteady Ducted Propfan Analysis

    Science.gov (United States)

    Hall, Edward J.; Delaney, Robert A.; Bettner, James L.

    1991-01-01

    The primary objective was the development of a time-dependent 3-D Euler/Navier-Stokes aerodynamic analysis to predict unsteady compressible transonic flows about ducted and unducted propfan propulsion systems at angle of attack. The resulting computer codes are referred to as the Advanced Ducted Propfan Analysis Codes (ADPAC), and a user's manual for them is presented. Aerodynamic calculations were based on a four-stage Runge-Kutta time-marching finite-volume solution technique with added numerical dissipation. A time-accurate implicit residual smoothing operator was used for unsteady flow predictions. For unducted propfans, a single H-type grid was used to discretize each blade passage of the complete propeller. For ducted propfans, a coupled system of five grid blocks utilizing an embedded C-grid about the cowl leading edge was used to discretize each blade passage. Grid systems were generated by a combined algebraic/elliptic algorithm developed specifically for ducted propfans. Numerical calculations were compared with experimental data for both ducted and unducted flows.
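
    The multi-stage Runge-Kutta time-marching idea can be illustrated with a minimal sketch. The Jameson-style stage coefficients and the toy one-dimensional residual below are assumptions for illustration only, not ADPAC's actual Euler/Navier-Stokes residual or coefficients.

```python
# Minimal sketch of a four-stage Runge-Kutta time-marching step of the kind
# used in finite-volume flow solvers: each stage re-evaluates the residual
# from the initial state u0, i.e. u_k = u0 + alpha_k * dt * R(u_{k-1}).
import numpy as np

def residual(u):
    # Placeholder residual: first-order upwind difference on a periodic
    # 1-D grid (stands in for the flux balance of a real solver).
    return -(u - np.roll(u, 1))

def rk4_stage_march(u, dt, alphas=(0.25, 1.0 / 3.0, 0.5, 1.0)):
    """Advance one time step with a Jameson-style four-stage scheme."""
    u0 = u.copy()
    for a in alphas:
        u = u0 + a * dt * residual(u)
    return u

u = np.sin(np.linspace(0.0, 2.0 * np.pi, 32, endpoint=False))
u_next = rk4_stage_march(u, dt=0.5)
```

    With a periodic upwind residual the scheme is conservative: the sum of the cell values is unchanged after the step, which is one quick sanity check used for finite-volume updates.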

  17. Multi-task flow system for potentiometric analysis: its application to the determination of vitamin B6 in pharmaceuticals.

    Science.gov (United States)

    Fernandes, R N; Sales, M G; Reis, B F; Zagatto, E A; Araújo, A N; Montenegro, M C

    2001-07-01

    A flow set-up based on the sequential injection analysis concept was designed to increase the automation and robustness of procedures involving potentiometric detection in pharmaceutical control. Programmable set-up calibration, ion-selective electrode characterization, standard-addition techniques, and titration procedures could thus be carried out without any conventional handling of stock solutions or modification of the physical structure of the flow system. Evaluation of a flow-through vitamin B6-selective electrode and its application to the routine analysis of pharmaceuticals were selected as models to demonstrate the system's potential. The system's ability to generate calibrating solutions in-line was verified by comparing the results with those obtained with solutions prepared manually. Vitamin B6 was determined in pharmaceutical products, and recoveries performed in-line gave values within 97.4-103.5%. The system's ability to perform titrations was ascertained using the precipitation reaction of vitamin B6 with tetraphenylborate. Other advantageous features, such as lower reagent consumption and a low effluent volume, were also achieved. PMID:11377053
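
    The standard-addition technique mentioned above has a simple closed form for a single addition with an ion-selective electrode; the sketch below uses the textbook Nernstian relation, with illustrative slope, volumes, and concentrations not taken from the paper.

```python
# Single standard addition for an ion-selective electrode (ISE).
# From E = E0 + S*log10(C), the potential shift after adding v_std of
# standard c_std to v_sample of sample with unknown concentration c_x is
#   delta_E = S * log10((c_x*v_sample + c_std*v_std) / ((v_sample+v_std)*c_x)),
# which inverts to the expression below.
import math

def standard_addition(delta_e_mv, slope_mv, c_std, v_std, v_sample):
    """Recover the sample concentration from the observed potential shift."""
    r = 10.0 ** (delta_e_mv / slope_mv)
    return c_std * v_std / ((v_sample + v_std) * r - v_sample)

# Round-trip check: a 1.0e-3 M sample (10 mL) spiked with 1 mL of 1.0e-2 M
# standard; a monovalent ion gives a slope near 59.16 mV/decade at 25 C.
c_true, v_x, c_s, v_s, slope = 1.0e-3, 10.0, 1.0e-2, 1.0, 59.16
delta_e = slope * math.log10((c_true * v_x + c_s * v_s) / ((v_x + v_s) * c_true))
c_est = standard_addition(delta_e, slope, c_s, v_s, v_x)
```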

  18. RETRAN code analysis of Tsuruga-2 plant chemical volume control system (CVCS) reactor coolant leakage incident

    International Nuclear Information System (INIS)

    In the Chemical Volume Control System (CVCS) reactor primary coolant leakage incident that occurred in Tsuruga-2 (4-loop PWR, 3,423 MWt, 1,160 MWe) on July 12, 1999, it took about 14 hours to isolate the leakage. The delayed isolation and the large amount of leakage became a social concern, so effective procedure modifications were studied. Three betterments were proposed, based on a qualitative analysis, to reduce the pressure and temperature of the primary loop as fast as possible using the existing plant facilities while maintaining sufficient subcooling of the primary loop. I analyzed the incident with the RETRAN code in order to quantitatively evaluate the leakage reduction achieved when these betterments are adopted. This analysis is novel in that it establishes a method for analyzing PWR plant behavior during shutdown procedures, which conventional RETRAN transient analyses have rarely addressed; the event duration is also very long. To carry out the analysis successfully, I devised new models, such as a Residual Heat Removal System (RHR) model, and simplified parts of the conventional model. Based on the analysis results, I confirmed that the leakage can be reduced by about 30% by adopting these betterments. The Japan Atomic Power Company (JAPC) has since modified its operational procedure for reactor primary coolant leakage events to adopt these betterments. (author)

  19. Ocean Optics Protocols for Satellite Ocean Color Sensor Validation. Volume 4; Inherent Optical Properties: Instruments, Characterizations, Field Measurements and Data Analysis Protocols; Revised

    Science.gov (United States)

    Mueller, J. L. (Editor); Fargion, Giuletta S. (Editor); McClain, Charles R. (Editor); Pegau, Scott; Zaneveld, J. Ronald V.; Mitchell, B. Gregg; Kahru, Mati; Wieland, John; Stramska, Malgorzat

    2003-01-01

    This document stipulates protocols for measuring bio-optical and radiometric data for the Sensor Intercomparison and Merger for Biological and Interdisciplinary Oceanic Studies (SIMBIOS) Project activities and algorithm development. The document is organized into 6 separate volumes as Ocean Optics Protocols for Satellite Ocean Color Sensor Validation, Revision 4. Volume I: Introduction, Background and Conventions; Volume II: Instrument Specifications, Characterization and Calibration; Volume III: Radiometric Measurements and Data Analysis Methods; Volume IV: Inherent Optical Properties: Instruments, Characterization, Field Measurements and Data Analysis Protocols; Volume V: Biogeochemical and Bio-Optical Measurements and Data Analysis Methods; Volume VI: Special Topics in Ocean Optics Protocols and Appendices. The earlier version of Ocean Optics Protocols for Satellite Ocean Color Sensor Validation, Revision 3 (Mueller and Fargion 2002, Volumes 1 and 2) is entirely superseded by the six volumes of Revision 4 listed above.

  20. Approach, avoidance, and affect: A meta-analysis of approach-avoidance tendencies in manual reaction time tasks

    Directory of Open Access Journals (Sweden)

    Hans ePhaf

    2014-05-01

    Approach action tendencies towards positive stimuli and avoidance tendencies away from negative stimuli are widely seen to foster survival. Many studies have shown that approach and avoidance arm movements are facilitated by positive and negative affect, respectively. There is considerable debate about whether positively and negatively valenced stimuli prime approach and avoidance movements directly (i.e., immediately, unintentionally, implicitly, automatically, and stimulus-based) or indirectly (i.e., after conscious or nonconscious interpretation of the situation). The direction and size of these effects were often found to depend on whether the instructions referred to the stimulus object or the self, and on explicit vs. implicit stimulus evaluation. We present a meta-analysis of 29 studies, included for their use of strongly positive and negative stimuli, with 81 effect sizes derived solely from the means and standard deviations (combined N = 1538), to examine the automaticity of the link between affective information processing and approach and avoidance, and to test whether it depends on instruction, type of approach-avoidance task, and stimulus type. Results show a significant small to medium-sized effect after correction for publication bias. The strongest arguments for an indirect link between affect and approach-avoidance were the absence of evidence for an effect with implicit evaluation, and the opposite directions of the effect with self- and object-related interpretations. The link appears to be influenced by conscious or nonconscious intentions to deal with affective stimuli.
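
    Effect sizes "derived solely from the means and standard deviations" are standardized mean differences; a minimal sketch of the standard Cohen's d computation, with illustrative reaction-time numbers rather than the paper's data:

```python
# Standardized mean difference (Cohen's d) computed from group means and
# standard deviations, using a pooled standard deviation.
import math

def cohens_d(m1, sd1, n1, m2, sd2, n2):
    """Cohen's d for two independent groups."""
    pooled_var = ((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2)
    return (m1 - m2) / math.sqrt(pooled_var)

# e.g. mean RTs of 620 ms vs 580 ms, both SD 80 ms, n = 30 per condition
d = cohens_d(620.0, 80.0, 30, 580.0, 80.0, 30)   # -> 0.5
```

    An effect of d = 0.5 sits at the conventional boundary between "small" and "medium", comparable to the overall effect the meta-analysis characterizes as small to medium-sized.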