WorldWideScience

Sample records for analysis task volume

  1. Underground Test Area Subproject Phase I Data Analysis Task. Volume VIII - Risk Assessment Documentation Package

    Energy Technology Data Exchange (ETDEWEB)

    None

    1996-12-01

    Volume VIII of the documentation for the Phase I Data Analysis Task performed in support of the current Regional Flow Model, Transport Model, and Risk Assessment for the Nevada Test Site Underground Test Area Subproject contains the risk assessment documentation. Because of the size and complexity of the model area, a considerable quantity of data was collected and analyzed in support of the modeling efforts. The data analysis task was consequently broken into eight subtasks, and descriptions of each subtask's activities are contained in one of the eight volumes that comprise the Phase I Data Analysis Documentation.

  2. Underground Test Area Subproject Phase I Data Analysis Task. Volume VII - Tritium Transport Model Documentation Package

    Energy Technology Data Exchange (ETDEWEB)

    None

    1996-12-01

    Volume VII of the documentation for the Phase I Data Analysis Task performed in support of the current Regional Flow Model, Transport Model, and Risk Assessment for the Nevada Test Site Underground Test Area Subproject contains the tritium transport model documentation. Because of the size and complexity of the model area, a considerable quantity of data was collected and analyzed in support of the modeling efforts. The data analysis task was consequently broken into eight subtasks, and descriptions of each subtask's activities are contained in one of the eight volumes that comprise the Phase I Data Analysis Documentation.

  3. Human factors evaluation of teletherapy: Function and task analysis. Volume 2

    International Nuclear Information System (INIS)

    Kaye, R.D.; Henriksen, K.; Jones, R.; Morisseau, D.S.; Serig, D.I.

    1995-07-01

    As a treatment methodology, teletherapy selectively destroys cancerous and other tissue by exposure to an external beam of ionizing radiation. Sources of radiation are either a radioactive isotope, typically Cobalt-60 (Co-60), or a linear accelerator. Records maintained by the NRC have identified instances of teletherapy misadministration where the delivered radiation dose has differed from the radiation prescription (e.g., instances where fractions were delivered to the wrong patient, to the wrong body part, or were too great or too little with respect to the defined treatment volume). Both human error and machine malfunction have led to misadministrations. Effective and safe treatment requires a concern for precision and consistency of human-human and human-machine interactions throughout the course of therapy. The present study is the first part of a series of human factors evaluations for identifying the root causes that lead to human error in the teletherapy environment. The human factors evaluations included: (1) a function and task analysis of teletherapy activities, (2) an evaluation of the human-system interfaces, (3) an evaluation of procedures used by teletherapy staff, (4) an evaluation of the training and qualifications of treatment staff (excluding the oncologists), (5) an evaluation of organizational practices and policies, and (6) an identification of problems and alternative approaches for NRC and industry attention. The present report addresses the function and task analysis of teletherapy activities and provides the foundation for the conduct of the subsequent evaluations. The report includes sections on background, methodology, a description of the function and task analysis, and use of the task analysis findings for the subsequent tasks. The function and task analysis data base also is included

  4. Human factors evaluation of teletherapy: Function and task analysis. Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    Kaye, R.D.; Henriksen, K.; Jones, R. [Hughes Training, Inc., Falls Church, VA (United States); Morisseau, D.S.; Serig, D.I. [Nuclear Regulatory Commission, Washington, DC (United States). Div. of Systems Technology

    1995-07-01

    As a treatment methodology, teletherapy selectively destroys cancerous and other tissue by exposure to an external beam of ionizing radiation. Sources of radiation are either a radioactive isotope, typically Cobalt-60 (Co-60), or a linear accelerator. Records maintained by the NRC have identified instances of teletherapy misadministration where the delivered radiation dose has differed from the radiation prescription (e.g., instances where fractions were delivered to the wrong patient, to the wrong body part, or were too great or too little with respect to the defined treatment volume). Both human error and machine malfunction have led to misadministrations. Effective and safe treatment requires a concern for precision and consistency of human-human and human-machine interactions throughout the course of therapy. The present study is the first part of a series of human factors evaluations for identifying the root causes that lead to human error in the teletherapy environment. The human factors evaluations included: (1) a function and task analysis of teletherapy activities, (2) an evaluation of the human-system interfaces, (3) an evaluation of procedures used by teletherapy staff, (4) an evaluation of the training and qualifications of treatment staff (excluding the oncologists), (5) an evaluation of organizational practices and policies, and (6) an identification of problems and alternative approaches for NRC and industry attention. The present report addresses the function and task analysis of teletherapy activities and provides the foundation for the conduct of the subsequent evaluations. The report includes sections on background, methodology, a description of the function and task analysis, and use of the task analysis findings for the subsequent tasks. The function and task analysis data base also is included.

  5. Nuclear power plant personnel qualifications and training: TAPS: the task analysis profiling system. Volume 2

    International Nuclear Information System (INIS)

    Jorgensen, C.C.

    1985-06-01

    This report discusses an automated task analysis profiling system (TAPS) designed to provide a linking tool between the behaviors of nuclear power plant operators in performing their tasks and the measurement tools necessary to evaluate their in-plant performance. TAPS assists in the identification of the entry-level skill, knowledge, ability and attitude (SKAA) requirements for the various tasks and rapidly associates them with measurement tests and human factors principles. This report describes the development of TAPS and presents its first demonstration. It begins with characteristics of skilled human performance and proceeds to postulate a cognitive model to formally describe these characteristics. A method is derived for linking SKAA characteristics to measurement tests. The entire process is then automated in the form of a task analysis computer program. The development of the program is detailed and a user guide with annotated code listings and supporting test information is provided
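The linking step this abstract describes — from skill, knowledge, ability, and attitude (SKAA) requirements to candidate measurement tests — can be pictured as a simple lookup table. The sketch below is a toy illustration only; the requirement and test names are invented, not taken from TAPS:

```python
# Toy sketch of a TAPS-style linking step: map SKAA requirements
# onto candidate measurement tests. All requirement and test names
# below are invented for illustration and do not come from TAPS.

SKAA_TO_TESTS = {
    "short-term memory": ["digit-span test"],
    "vigilance": ["continuous performance test"],
    "procedural knowledge": ["written procedure walkthrough"],
}

def profile_task(requirements):
    """Return candidate measurement tests for each SKAA requirement."""
    profile = {}
    for req in requirements:
        # Requirements with no known test are flagged for analyst review.
        profile[req] = SKAA_TO_TESTS.get(req, ["<needs analyst review>"])
    return profile

print(profile_task(["vigilance", "manual dexterity"]))
```

The point of the sketch is the shape of the association, not its content: the actual TAPS system automated this mapping and tied it back to human factors principles.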

  6. Human factors evaluation of remote afterloading brachytherapy. Volume 2, Function and task analysis

    International Nuclear Information System (INIS)

    Callan, J.R.; Gwynne, J.W. III; Kelly, T.T.; Muckler, F.A.; Saunders, W.M.; Lepage, R.P.; Chin, E.; Schoenfeld, I.; Serig, D.I.

    1995-05-01

    A human factors project on the use of nuclear by-product material to treat cancer using remotely operated afterloaders was undertaken by the Nuclear Regulatory Commission. The purpose of the project was to identify factors that contribute to human error in the system for remote afterloading brachytherapy (RAB). This report documents the findings from the first phase of the project, which involved an extensive function and task analysis of RAB. This analysis identified the functions and tasks in RAB, made preliminary estimates of the likelihood of human error in each task, and determined the skills needed to perform each RAB task. The findings of the function and task analysis served as the foundation for the remainder of the project, which evaluated four major aspects of the RAB system linked to human error: human-system interfaces; procedures and practices; training and qualifications of RAB staff; and organizational practices and policies. At its completion, the project identified and prioritized areas for recommended NRC and industry attention based on all of the evaluations and analyses

  7. Human factors evaluation of remote afterloading brachytherapy. Volume 2, Function and task analysis

    Energy Technology Data Exchange (ETDEWEB)

    Callan, J.R.; Gwynne, J.W. III; Kelly, T.T.; Muckler, F.A. [Pacific Science and Engineering Group, San Diego, CA (United States); Saunders, W.M.; Lepage, R.P.; Chin, E. [University of California San Diego Medical Center, CA (United States). Div. of Radiation Oncology; Schoenfeld, I.; Serig, D.I. [Nuclear Regulatory Commission, Washington, DC (United States). Div. of Systems Technology

    1995-05-01

    A human factors project on the use of nuclear by-product material to treat cancer using remotely operated afterloaders was undertaken by the Nuclear Regulatory Commission. The purpose of the project was to identify factors that contribute to human error in the system for remote afterloading brachytherapy (RAB). This report documents the findings from the first phase of the project, which involved an extensive function and task analysis of RAB. This analysis identified the functions and tasks in RAB, made preliminary estimates of the likelihood of human error in each task, and determined the skills needed to perform each RAB task. The findings of the function and task analysis served as the foundation for the remainder of the project, which evaluated four major aspects of the RAB system linked to human error: human-system interfaces; procedures and practices; training and qualifications of RAB staff; and organizational practices and policies. At its completion, the project identified and prioritized areas for recommended NRC and industry attention based on all of the evaluations and analyses.

  8. A Study of Job Demands and Curriculum Development in Agricultural Training Related to the Muskegon County Wastewater Management System. Final Report. Volume II. Task Analysis Results.

    Science.gov (United States)

    Fisher, Harold S.; And Others

    This is the second volume of a four-volume report of a research project designed to (1) identify job needs for agricultural occupations which will result from the Muskegon County Wastewater Management System and perform a task analysis on each occupation, (2) develop instructional modules and determine their place in either high school or 2-year…

  9. Behavioral Task Analysis

    Science.gov (United States)

    2010-01-01

methods included task analysis as a critical phase in developing instruction and training. Montemerlo and Tennyson (1976) noted that from 1951 to 1976...designed. The trend in the U.S. Department of Defense toward extensive procedural documentation noted by Montemerlo and Tennyson (1976) has not...M. Gagné (Ed.), Psychological principles in system development (pp. 187-228). New York: Holt. Montemerlo, M. D., & Tennyson, M. E. (1975

  10. Feasibility study of modern airships, phase 1. Volume 1: Summary and mission analysis (tasks 2 and 4)

    Science.gov (United States)

    Bloetscher, F.

    1975-01-01

The history, potential mission applications, and designs of lighter-than-air (LTA) vehicles are researched and evaluated. Missions are identified to which airship vehicles are potentially suited. Results of the mission analysis are combined with the findings of a parametric analysis to formulate the mission/vehicle combinations recommended for further study. Current transportation systems are surveyed, and potential areas of competition are identified, as well as potential missions resulting from limitations of these systems. Potential areas of military usage are included.

  11. System analysis task group

    International Nuclear Information System (INIS)

    Anon.

    1981-01-01

    At this meeting, the main tasks of the study group were to discuss their task report with other task groups and to formulate the five-year research program, including next year's plans. A summary of the discussion with other task groups is presented. The general objective of the five-year program is to gather all elements necessary for a decision on the technical feasibility of the subseabed option. In addition, site selection criteria consistent with both radiological assessment and engineering capability will be produced. The task group report discussed radiological assessments, normal or base-case assessments, operational failures, low-probability postdisposal events, engineering studies, radiological criteria, legal aspects, social aspects, institutional aspects, generic comparison with other disposal options, and research priorities. The text of the report is presented along with supporting documents

  12. Data Systems Task Analysis.

    Science.gov (United States)

    1979-08-01

QUALITY CONTROL SUPERVISOR/NCOIC 369. PROGRAMMER 07?. PROGRAMMER ANALYST C7l, PROGRAMMING/ANALYSIS SUPERVISOR. QUALITY CONTROL PETTY OFFICER/CLERK...CLASSIFICATION OF THE FACILITY OR SITE THAT YOU ARE PRESENTLY WORKING IN? 01. CDPA (CENTRAL DESIGN PROGRAMMING ACTIVITY) 02. RASC (REGIONAL AUTOMATED...CARDS MANUALLY I)9. COORDINATE WITH OFFICES OF PRIMARY RESPONSIBILITY (OPR) ON NEW OR REVISED REPORTING REQUIREMENTS 115. DETERMINE ALTERNATE METHODS

  13. Genetic Inventory Task Final Report. Volume 2

    Science.gov (United States)

    Venkateswaran, Kasthuri; LaDuc, Myron T.; Vaishampayan, Parag

    2012-01-01

    Contaminant terrestrial microbiota could profoundly impact the scientific integrity of extraterrestrial life-detection experiments. It is therefore important to know what organisms persist on spacecraft surfaces so that their presence can be eliminated or discriminated from authentic extraterrestrial biosignatures. Although there is a growing understanding of the biodiversity associated with spacecraft and cleanroom surfaces, it remains challenging to assess the risk of these microbes confounding life-detection or sample-return experiments. A key challenge is to provide a comprehensive inventory of microbes present on spacecraft surfaces. To assess the phylogenetic breadth of microorganisms on spacecraft and associated surfaces, the Genetic Inventory team used three technologies: conventional cloning techniques, PhyloChip DNA microarrays, and 454 tag-encoded pyrosequencing, together with a methodology to systematically collect, process, and archive nucleic acids. These three analysis methods yielded considerably different results: Traditional approaches provided the least comprehensive assessment of microbial diversity, while PhyloChip and pyrosequencing illuminated more diverse microbial populations. The overall results stress the importance of selecting sample collection and processing approaches based on the desired target and required level of detection. The DNA archive generated in this study can be made available to future researchers as genetic-inventory-oriented technologies further mature.

  14. Survey of Task Analysis Methods

    Science.gov (United States)

    1978-02-14

Taylor, for example, referred to task analysis in his work on scientific management (65). In the same time frame, the Gilbreths developed the first...ciation, Washington, D. C., 1965. 21. Gilbreth, F. B. Bricklaying System, M. C. Clark, New York, 1909. 22. Gilbreth, F

  15. Development of advanced MCR task analysis methods

    International Nuclear Information System (INIS)

    Na, J. C.; Park, J. H.; Lee, S. K.; Kim, J. K.; Kim, E. S.; Cho, S. B.; Kang, J. S.

    2008-07-01

This report describes a task analysis methodology for advanced HSI designs. Task analysis was performed using procedure-based hierarchical task analysis and task decomposition methods, and the results were recorded in a database. Using the task analysis results, we developed a static prototype of an advanced HSI, along with human factors engineering verification and validation methods for evaluating the prototype. In addition to the procedure-based task analysis methods, workload estimation based on analysis of task performance time, as well as analyses for the design of information structures and interaction structures, will be necessary

  16. [Environmental investigation of ground water contamination at Wright-Patterson Air Force Base, Ohio]. Volume 3, Sampling and analysis plan (SAP): Phase 1, Task 4, Field Investigation: Draft

    Energy Technology Data Exchange (ETDEWEB)

    1991-10-01

In April 1990, Wright-Patterson Air Force Base (WPAFB) initiated an investigation to evaluate a potential Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) removal action to prevent, to the extent practicable, the offsite migration of contaminated ground water from WPAFB. WPAFB retained the services of Environmental Management Operations (EMO) and its principal subcontractor, International Technology Corporation (IT), to complete Phase 1 of the environmental investigation of ground-water contamination at WPAFB. Phase 1 of the investigation involves the short-term evaluation and potential design for a program to remove ground-water contamination that appears to be migrating across the western boundary of Area C, and across the northern boundary of Area B along Springfield Pike. Primarily, Task 4 of Phase 1 focuses on collection of information at the Area C and Springfield Pike boundaries of WPAFB. This Sampling and Analysis Plan (SAP) has been prepared to assist in completion of the Task 4 field investigation and comprises the Quality Assurance Project Plan (QAPP) and the Field Sampling Plan (FSP).

  17. Cognitive task load analysis : Allocating tasks and designing support

    NARCIS (Netherlands)

    Neerincx, M.A.

    2003-01-01

    We present a method for Cognitive Task Analysis that guides the early stages of software development, aiming at an optimal cognitive load for operators of process control systems. The method is based on a practical theory of cognitive task load and support. In addition to the classical measure

  18. Bare-Hand Volume Cracker for Raw Volume Data Analysis

    Directory of Open Access Journals (Sweden)

    Bireswar Laha

    2016-09-01

Analysis of raw volume data generated from different scanning technologies faces a variety of challenges, related to search, pattern recognition, spatial understanding, quantitative estimation, and shape description. In a previous study, we found that the Volume Cracker (VC) 3D interaction (3DI) technique mitigated some of these problems, but this result was from a tethered glove-based system with users analyzing simulated data. Here, we redesigned the VC by using untethered bare-hand interaction with real volume datasets, with the broader aim of adoption of this technique in research labs. We developed symmetric and asymmetric interfaces for the Bare-Hand Volume Cracker (BHVC) through design iterations with a biomechanics scientist. We evaluated our asymmetric BHVC technique against standard 2D and widely used 3D interaction techniques, with experts analyzing scanned beetle datasets. We found that our BHVC design significantly outperformed the other two techniques. This study contributes a practical 3DI design for scientists, documents lessons learned while redesigning for bare-hand trackers, and provides evidence suggesting that 3D interaction could improve volume data analysis for a variety of visual analysis tasks. Our contribution is in the realm of 3D user interfaces tightly integrated with visualization, for improving the effectiveness of visual analysis of volume datasets. Based on our experience, we also provide some insights into hardware-agnostic principles for the design of effective interaction techniques.

  19. Overview of job and task analysis

    International Nuclear Information System (INIS)

    Gertman, D.I.

    1984-01-01

    During the past few years the nuclear industry has become concerned with predicting human performance in nuclear power plants. One of the best means available at the present time to make sure that training, procedures, job performance aids and plant hardware match the capabilities and limitations of personnel is by performing a detailed analysis of the tasks required in each job position. The approved method for this type of analysis is referred to as job or task analysis. Job analysis is a broader type of analysis and is usually thought of in terms of establishing overall performance objectives, and in establishing a basis for position descriptions. Task analysis focuses on the building blocks of task performance, task elements, and places them within the context of specific performance requirements including time to perform, feedback required, special tools used, and required systems knowledge. The use of task analysis in the nuclear industry has included training validation, preliminary risk screening, and procedures development
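The abstract above describes a task element as a building block carried with its performance context: time to perform, feedback required, special tools, and required systems knowledge. A minimal data-structure sketch of such a record might look like the following; the field names and example values are illustrative assumptions, not taken from any NRC or INIS scheme:

```python
# Minimal sketch of a task-analysis record capturing the context listed
# in the abstract: time to perform, feedback required, special tools,
# and required systems knowledge. Field names are illustrative only.
from dataclasses import dataclass, field

@dataclass
class TaskElement:
    description: str
    time_to_perform_s: float              # expected performance time, seconds
    feedback_required: str                # e.g. indicator lamp, audible alarm
    special_tools: list = field(default_factory=list)
    systems_knowledge: list = field(default_factory=list)

# Hypothetical example element for a plant procedure step.
elem = TaskElement(
    description="Verify valve lineup per procedure step 4.2",
    time_to_perform_s=90.0,
    feedback_required="valve position indicator",
    special_tools=["valve wrench"],
    systems_knowledge=["auxiliary feedwater system"],
)
print(elem.description)
```

A collection of such records is what supports the uses the abstract names: training validation, preliminary risk screening, and procedures development.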

  20. Reduced lateral prefrontal cortical volume is associated with performance on the modified Iowa Gambling Task: A surface based morphometric analysis of previously deployed veterans.

    Science.gov (United States)

    Fogleman, Nicholas D; Naaz, Farah; Knight, Lindsay K; Stoica, Teodora; Patton, Samantha C; Olson-Madden, Jennifer H; Barnhart, Meghan C; Hostetter, Trisha A; Forster, Jeri; Brenner, Lisa A; Banich, Marie T; Depue, Brendan E

    2017-09-30

    Post-traumatic stress disorder (PTSD) and mild traumatic brain injury (mTBI) are two of the most common consequences of combat deployment. Estimates of comorbidity of PTSD and mTBI are as high as 42% in combat exposed Operation Enduring Freedom, Operation Iraqi Freedom and Operation New Dawn (OEF/OIF/OND) Veterans. Combat deployed Veterans with PTSD and/or mTBI exhibit deficits in classic executive function (EF) tasks. Similarly, the extant neuroimaging literature consistently indicates abnormalities of the ventromedial prefrontal cortex (vmPFC) and amygdala/hippocampal complex in these individuals. While studies examining deficits in classical EF constructs and aberrant neural circuitry have been widely replicated, it is surprising that little research examining reward processing and decision-making has been conducted in these individuals, specifically, because the vmPFC has long been implicated in underlying such processes. Therefore, the current study employed the modified Iowa Gambling Task (mIGT) and structural neuroimaging to assess whether behavioral measures related to reward processing and decision-making were compromised and related to cortical morphometric features of OEF/OIF/OND Veterans with PTSD, mTBI, or co-occurring PTSD/mTBI. Results indicated that gray matter morphometry in the lateral prefrontal cortex (lPFC) predicted performance on the mIGT among all three groups and was significantly reduced, as compared to the control group. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.

  1. An ergonomic task analysis of spinal anaesthesia.

    LENUS (Irish Health Repository)

    Ajmal, Muhammad

    2009-12-01

    Ergonomics is the study of physical interaction between humans and their working environment. The objective of this study was to characterize the performance of spinal anaesthesia in an acute hospital setting, applying ergonomic task analysis.

  2. Development of flight experiment task requirements. Volume 2: Technical Report. Part 2: Appendix H: Tasks-skills data series

    Science.gov (United States)

    Hatterick, G. R.

    1972-01-01

    The data sheets presented contain the results of the task analysis portion of the study to identify skill requirements of space shuttle crew personnel. A comprehensive data base is provided of crew functions, operating environments, task dependencies, and task-skills applicable to a representative cross section of earth orbital research experiments.

  3. Job and task analysis for technical staff

    International Nuclear Information System (INIS)

    Toline, B.C.

    1991-01-01

In September of 1989, Cooper Nuclear Station began a project to upgrade the Technical Staff Training Program. This project's roots began with performing Job and Task Analysis for Technical Staff. While the industry has long been committed to Job and Task Analysis to target performance-based instruction for single job positions, this approach was unique in that it was not originally considered appropriate for a group as diverse as Tech Staff. Much to the author's satisfaction, the Job and Task Analysis Project was much less complicated for Technical Staff than he had imagined. The benefits of performing the Job and Task Analysis for Technical Staff have become increasingly obvious as he pursues lesson plan development and course revisions. The outline for this presentation is as follows: philosophy adopted; preparation of the job survey document; performing the job analysis; performing task analysis for technical staff and associated pitfalls; clustering objectives for training and comparison to the existing program; benefits now and in the future; final phase (comparison to INPO guides and meeting the needs of non-degreed engineering professionals); and conclusion. By focusing on performance-based needs for engineers rather than traditional academics for training, the author is confident the future Technical Staff Program will meet the challenges ahead and will exceed requirements for accreditation

  4. From Cognitive Task Analysis to Simulation: Developing a Synthetic Team Task for AWACS Weapons Directors

    National Research Council Canada - National Science Library

    Hess, Stephen M; MacMillan, Jean; Serfaty, Daniel; Elliott, Linda

    2005-01-01

    ... while maintaining others. This paper reports the results of a successful effort to create a synthetic task environment that captures key elements of a team task based on Cognitive Task Analysis of the important features...

  5. Task analysis in neurosciences programme design - neurological ...

    African Journals Online (AJOL)

    Defining educational objectives is the key to achieving the goal of professional competence in students. The technique of task analysis was selected to determine components of competence in clinical neurology appropriate to the needs of primary care. A survey of neurological problems in general practice revealed that ...

  6. Radiation protection technician job task analysis manual

    International Nuclear Information System (INIS)

    1990-03-01

    This manual was developed to assist all DOE contractors in the design and conduct of job task analysis (JTA) for the radiation protection technician. Experience throughout the nuclear industry and the DOE system has indicated that the quality and efficiency in conducting a JTA at most sites is greatly enhanced by using a generic task list for the position, and clearly written guidelines on the JTA process. This manual is designed to provide this information for personnel to use in developing and conducting site-specific JTAs. (VC)

  7. Task force on compliance and enforcement. Final report. Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    1978-03-01

    Recommendations for measures to strengthen the FEA enforcement program in the area of petroleum price regulation are presented. Results of task force efforts are presented in report and recommendations sections concerned with pending cases, compliance program organization, enforcement powers, compliance strategy, and audit staffing and techniques. (JRD)

  8. Final report on the Pathway Analysis Task

    Energy Technology Data Exchange (ETDEWEB)

    Whicker, F.W.; Kirchner, T.B. [Colorado State Univ., Fort Collins, CO (United States)

    1993-04-01

The Pathway Analysis Task constituted one of several multi-laboratory efforts to estimate radiation doses to people, considering all important pathways of exposure, from the testing of nuclear devices at the Nevada Test Site (NTS). The primary goal of the Pathway Analysis Task was to predict radionuclide ingestion by residents of Utah, Nevada, and portions of seven other adjoining western states following radioactive fallout deposition from individual events at the NTS. This report provides comprehensive documentation of the activities and accomplishments of Colorado State University's Pathway Analysis Task during the entire period of support (1979--91). The history of the project will be summarized, indicating the principal dates and milestones, personnel involved, subcontractors, and budget information. Accomplishments, both primary and auxiliary, will be summarized with general results rather than technical details being emphasized. This will also serve as a guide to the reports and open literature publications produced, where the methodological details and specific results are documented. Selected examples of results on internal dose estimates are provided in this report because the data have not been published elsewhere.

  9. Final report on the Pathway Analysis Task

    International Nuclear Information System (INIS)

    Whicker, F.W.; Kirchner, T.B.

    1993-04-01

    The Pathway Analysis Task constituted one of several multi-laboratory efforts to estimate radiation doses to people, considering all important pathways of exposure, from the testing of nuclear devices at the Nevada Test Site (NTS). The primary goal of the Pathway Analysis Task was to predict radionuclide ingestion by residents of Utah, Nevada, and portions of seven other adjoining western states following radioactive fallout deposition from individual events at the NTS. This report provides comprehensive documentation of the activities and accomplishments of Colorado State University's Pathway Analysis Task during the entire period of support (1979--91). The history of the project will be summarized, indicating the principal dates and milestones, personnel involved, subcontractors, and budget information. Accomplishments, both primary and auxiliary, will be summarized with general results rather than technical details being emphasized. This will also serve as a guide to the reports and open literature publications produced, where the methodological details and specific results are documented. Selected examples of results on internal dose estimates are provided in this report because the data have not been published elsewhere

  10. Lung volume reduction of pulmonary emphysema: the radiologist task.

    Science.gov (United States)

    Milanese, Gianluca; Silva, Mario; Sverzellati, Nicola

    2016-03-01

Several lung volume reduction (LVR) techniques have been increasingly evaluated in patients with advanced pulmonary emphysema, especially in the last decade. The radiologist plays a pivotal role in the characterization of parenchymal damage and, thus, in the assessment of eligibility criteria. This review aims to discuss the most common LVR techniques, namely LVR surgery, endobronchial valves, and coil LVR, with emphasis on the role of computed tomography (CT). Several trials have recently highlighted the importance of regional quantification of emphysema by computerized CT-based segmentation of hyperlucent parenchyma, which is strongly recommended for candidates for any LVR treatment. In particular, emphysema distribution pattern and fissure integrity are evaluated to tailor the choice of the most appropriate LVR technique. Furthermore, a number of CT measures have been tested for the personalization of treatment, according to imaging-detected heterogeneity of parenchymal disease. CT characterization of heterogeneous parenchymal abnormalities provides criteria for selection of the preferable treatment in each patient and improves the outcome of LVR, as reflected by better quality of life, higher exercise tolerance, and lower mortality.

  11. A Comparative Analysis of Task Modeling Notations

    Directory of Open Access Journals (Sweden)

    Jean Vanderdonckt

    2012-03-01

    In this paper a comparative analysis of selected models involving multiple users in an interaction is provided in order to identify concepts which are underexplored in today's multi-user interaction task modeling. This comparative analysis is based on three families of criteria: information criteria, conceptual coverage, and expressiveness. Merging the meta-models of the selected models makes it possible to derive a broader meta-model that could be instantiated in most situations involving multi-user interaction, such as workflow information systems and CSCW.

  12. Task Decomposition in Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald Laurids [Idaho National Laboratory; Joe, Jeffrey Clark [Idaho National Laboratory

    2014-06-01

    In the probabilistic safety assessments (PSAs) used in the nuclear industry, human failure events (HFEs) are determined as a subset of hardware failures, namely those hardware failures that could be triggered by human action or inaction. This approach is top-down, starting with hardware faults and deducing human contributions to those faults. Elsewhere, more traditionally human factors driven approaches would tend to look at opportunities for human errors first in a task analysis and then identify which of those errors is risk significant. The intersection of top-down and bottom-up approaches to defining HFEs has not been carefully studied. Ideally, both approaches should arrive at the same set of HFEs. This question remains central as human reliability analysis (HRA) methods are generalized to new domains like oil and gas. The HFEs used in nuclear PSAs tend to be top-down, defined as a subset of the PSA, whereas the HFEs used in petroleum quantitative risk assessments (QRAs) are more likely to be bottom-up, derived from a task analysis conducted by human factors experts. The marriage of these approaches is necessary in order to ensure that HRA methods developed for top-down HFEs are also sufficient for bottom-up applications.

  13. A cognitive task analysis of the SGTR scenario

    International Nuclear Information System (INIS)

    Hollnagel, E.; Edland, A.; Svenson, O.

    1996-04-01

    This report constitutes a contribution to the NKS/RAK-1:3 project on Integrated Sequence Analysis. Following the meeting at Ringhals, it was proposed that the work be performed in three steps: Task 1, a Cognitive Task Analysis of the E-3 procedure; Task 2, evaluation and revision of the task analysis with Ringhals/KSU experts; and Task 3, integration with simulator data. The Cognitive Task Analysis (CTA) of Task 1 uses the Goals-Means Task Analysis (GMTA) method to identify the sequence of tasks and task steps necessary to achieve the goals of the procedure. It is based on material supplied by Ringhals, which describes the E-3 procedure, including the relevant ES and ECA procedures. The analysis further outlines the cognitive demands profile associated with individual task steps as well as with the task as a whole, as an indication of the nominal task load. The outcome of the cognitive task analysis provides a basis for proposing an adequate event tree. This report describes the results of Task 1. The work has included a two-day meeting between the three contributors, as well as the exchange of intermediate results and comments throughout the period. After the initial draft of the report was prepared, an opportunity arose to observe the SGTR scenario in a full-scope training simulator and to discuss the details with the instructors. This led to several improvements over the initial draft. (EG)

  14. 5D Task Analysis Visualization Tool Phase II, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The creation of a five-dimensional task analysis visualization (5D-TAV) software tool for Task Analysis and Workload Planning using multi-dimensional visualization...

  15. 5D Task Analysis Visualization Tool, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The creation of a five-dimensional task analysis visualization (5D-TAV) software tool for Task Analysis and Workload Planning using multi-dimensional visualization...

  16. Research and development of a heat-pump water heater. Volume 2. R and D task reports

    Energy Technology Data Exchange (ETDEWEB)

    Dunning, R.L.; Amthor, F.R.; Doyle, E.J.

    1978-08-01

    The heat pump water heater is a device that works much like a window air conditioner, except that heat from the home is pumped into a water tank rather than to the outdoors. The objective established for the device is to operate with a Coefficient of Performance (COP) of 3; that is, one unit of electric energy input would create three units of heat energy in the form of hot water. With such a COP, the device would use only one-third the energy, and incur one-third the cost, of a standard resistance water heater. This Volume 2 contains the final reports of the three major tasks performed in Phase I. In Task 2, a market study identifies the future market and selects an initial target market and channel of distribution, all based on an analysis of the parameters affecting the feasibility of the device and the factors that will affect its market acceptance. In the Task 3 report, the results of a design and test program to arrive at final designs of heat pumps, both for new water heaters and for retrofitting existing water heaters, are presented. In the Task 4 report, a plan for an extensive field demonstration involving use in actual homes is presented. Volume 1 contains a final summary report of the information in Volume 2.
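    The COP arithmetic described in the abstract can be made concrete with a short sketch. The heat demand figure below is hypothetical, chosen only for illustration; the COP of 3 is the report's stated objective:

```python
def resistance_input_kwh(heat_kwh):
    """Electric input for a resistance water heater (COP = 1)."""
    return heat_kwh

def heat_pump_input_kwh(heat_kwh, cop=3.0):
    """Electric input for a heat-pump water heater: heat delivered / COP."""
    return heat_kwh / cop

demand = 12.0  # kWh of hot-water heat per day (hypothetical figure)
print(heat_pump_input_kwh(demand))  # 4.0
print(heat_pump_input_kwh(demand) / resistance_input_kwh(demand))  # one-third
```

    At the target COP, the heat pump draws one-third the electricity of a resistance heater for the same hot-water output, which is the basis of the cost claim above.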

  17. Gray matter volume and dual-task gait performance in mild cognitive impairment.

    Science.gov (United States)

    Doi, Takehiko; Blumen, Helena M; Verghese, Joe; Shimada, Hiroyuki; Makizako, Hyuma; Tsutsumimoto, Kota; Hotta, Ryo; Nakakubo, Sho; Suzuki, Takao

    2017-06-01

    Dual-task gait performance is impaired in older adults with mild cognitive impairment, but the brain substrates associated with dual-task gait performance are not well-established. The relationship between gray matter and gait speed under single-task and dual-task conditions (walking while counting backward) was examined in 560 seniors with mild cognitive impairment (non-amnestic mild cognitive impairment: n = 270; mean age = 72.4 yrs., 63.6 % women; amnestic mild cognitive impairment: n = 290; mean age = 73.4 yrs., 45.4 % women). Multivariate covariance-based analyses of magnetic resonance imaging data, adjusted for potential confounders including single-task gait speed, were performed to identify gray matter patterns associated with dual-task gait speed. There were no differences in gait speed or cognitive performance during dual-task gait between individuals with non-amnestic mild cognitive impairment and amnestic mild cognitive impairment. Overall, increased dual-task gait speed was associated with a gray matter pattern of increased volume in medial frontal gyrus, superior frontal gyrus, anterior cingulate, cingulate, precuneus, fusiform gyrus, middle occipital gyrus, inferior temporal gyrus and middle temporal gyrus. The relationship between dual-task gait speed and brain substrates also differed by mild cognitive impairment subtype. Our study revealed a pattern of gray matter regions associated with dual-task performance. Although dual-task gait performance was similar in amnestic and non-amnestic mild cognitive impairment, the gray matter patterns associated with dual-task gait performance differed by mild cognitive impairment subtype. These findings suggest that the brain substrates supporting dual-task gait performance in amnestic and non-amnestic subtypes are different, and consequently may respond differently to interventions, or require different interventions.

  18. Electronic toll collection interoperability study in Brazil. Task 2 : economic and financial analysis and Task 3 : environmental/societal analysis

    Science.gov (United States)

    1998-05-01

    This report, conducted by Parsons Bricknerhoff International, was funded by the U.S. Trade and Development Agency. The report examines the potential for developing electronic toll collection systems in Brazil. This is Volume II and it contains "Task ...

  19. Task analysis methods applicable to control room design review (CDR)

    International Nuclear Information System (INIS)

    Moray, N.P.; Senders, J.W.; Rhodes, W.

    1985-06-01

    This report presents the results of a research study conducted in support of the human factors engineering program of the Atomic Energy Control Board in Canada. It contains five products which may be used by the Atomic Energy Control Board in relation to Task Analysis of jobs in CANDU nuclear power plants: 1. a detailed method for preparing for a task analysis; 2. a Task Data Form for recording task analysis data; 3. a detailed method for carrying out task analyses; 4. a guide to assessing alternative methods for performing task analyses, if such are proposed by utilities or consultants; and 5. an annotated bibliography on task analysis. In addition, a short explanation of the origins, nature and uses of task analysis is provided, with some examples of its cost effectiveness. 35 refs

  20. Task Analysis data Processing and Enhanced Representations (TAPER), Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Task Analysis (TA) is a fundamental part of NASA system design and validation. TAs are used to produce Master Task Lists that support engineering teams and...

  1. Workplace for analysis of task performance

    NARCIS (Netherlands)

    Bos, J; Mulder, LJM; van Ouwerkerk, RJ; Maarse, FJ; Akkerman, AE; Brand, AN; Mulder, LJM

    2003-01-01

    In current research on mental workload and task performance a large gap exists between laboratory based studies and research projects in real life working practice. Tasks conducted within a laboratory environment often lack a strong resemblance with real life working situations. This paper presents

  2. A Framework for the Cognitive Task Analysis in Systems Design

    DEFF Research Database (Denmark)

    Rasmussen, Jens

    The present rapid development of advanced information technology and its use for support of operators of complex technical systems are changing the content of task analysis towards the analysis of mental activities in decision making. Automation removes the humans from routine tasks, and operators are left with disturbance control and critical diagnostic tasks, for which computers are suitable for support, if it is possible to match the computer strategies and interface formats dynamically to the requirements of the current task by means of an analysis of the cognitive task.

  3. Job and task analysis: a view from the inside

    International Nuclear Information System (INIS)

    Allison, C.E.

    1981-01-01

    This paper is not intended to describe how to perform a Job and Task Analysis. There are a wide variety of approaches to conducting a Job and Task Analysis, many of which have been developed by highly seasoned and skilled professionals in this field. This paper is intended to discuss the internal support, in terms of money, time, and people, required for the Job and Task Analysis Project

  4. Project FARE Task IV Report: Urban Mass Transportation Industry Financial and Operating Data Reporting System. Volume I. Task and Project Summary.

    Science.gov (United States)

    1973-11-01

    The report contains a description of the uniform reporting system for the urban mass transit industry designed and tested in Project FARE. It is presented in five volumes. Volume I contains a description of how Task IV was accomplished and the conclu...

  5. Guidelines for job and task analysis for DOE nuclear facilities

    International Nuclear Information System (INIS)

    1983-06-01

    The guidelines are intended to be responsive to the need for information on methodology, procedures, content, and use of job and task analysis since the establishment of a requirement for position task analysis for Category A reactors in DOE 5480.1A, Chapter VI. The guide describes the general approach and methods currently being utilized in the nuclear industry and by several DOE contractors for the conduct of job and task analysis and applications to the development of training programs or evaluation of existing programs. In addition other applications for job and task analysis are described including: operating procedures development, personnel management, system design, communications, and human performance predictions

  6. IEA Wind Task 24 Integration of Wind and Hydropower Systems; Volume 2: Participant Case Studies

    Energy Technology Data Exchange (ETDEWEB)

    Acker, T.

    2011-12-01

    This report describes the background, concepts, issues and conclusions related to the feasibility of integrating wind and hydropower, as investigated by the members of IEA Wind Task 24. It is the result of a four-year effort involving seven IEA member countries and thirteen participating organizations. The companion report, Volume 2, describes in detail the study methodologies and participant case studies, and exists as a reference for this report.

  7. Task Analysis as a Resource for Strengthening Health Systems.

    Science.gov (United States)

    Hart, Leah J; Carr, Catherine; Fullerton, Judith T

    2016-01-01

    Task analysis is a descriptive study methodology that has wide application in the health professions. Task analysis is particularly useful in assessment and definition of the knowledge, skills, and behaviors that define the scope of practice of a health profession or occupation. Jhpiego, a US-based nongovernmental organization, has adapted traditional task analysis methods in several countries in assessment of workforce education and practice issues. Four case studies are presented to describe the utility and adaptability of the task analysis approach. Traditional task analysis field survey methods were used in assessment of the general and maternal-child health nursing workforce in Mozambique that led to curriculum redesign, reducing the number of education pathways from 4 to 2. The process of health system strengthening in Liberia, following a long history of civil war conflict, included a traditional task analysis study conducted among 119 registered nurses and 46 certified midwives who had graduated in the last 6 months to 2 years to determine gaps in education and preparation. An innovative approach for data collection that involves "playing cards" to document participant opinions (Task Master, Mining for Data) was developed by Jhpiego for application in other countries. Results of a task analysis involving 54 nurses and 100 nurse-midwives conducted in Lesotho were used to verify the newly drafted scope and standards of practice for nurses and to inform planning for a competency-based preservice curriculum for nursing. The Nursing and Midwifery Council developed a 100-question licensing examination for new graduates following a task analysis in Botswana. The task analysis process in each country resulted in recommendations that were action oriented and were implemented by the country governments. For maximal utility and ongoing impact, a task analysis study should be repeated on a periodic basis and more frequently in countries undergoing rapid change in

  8. Task Analysis Assessment on Intrastate Bus Traffic Controllers

    Science.gov (United States)

    Yen Bin, Teo; Azlis-Sani, Jalil; Nur Annuar Mohd Yunos, Muhammad; Ismail, S. M. Sabri S. M.; Tajedi, Noor Aqilah Ahmad

    2016-11-01

    Public transportation acts as social mobility and caters to the daily needs of society, allowing passengers to travel from one place to another. This is true for a country like Malaysia, where international trade has been growing significantly over the past few decades. Task analysis assessment was conducted with a cognitive-ergonomic view of problems related to human factors. Research on the task analysis of bus traffic controllers allowed a better understanding of the nature of the work and the overall monitoring activities of the bus services. This paper studies task analysis assessment of intrastate bus traffic controllers. The assessment was developed via Hierarchical Task Analysis (HTA). The analysis identified five subsidiary tasks at level one, only two of which could be further decomposed at level two. Development of the HTA allowed a better understanding of the work and can further ease the evaluation of the tasks conducted by the bus traffic controllers. Thus, human error could be reduced for the safety of all passengers, and the overall efficiency of the system increased. It could also assist in improving the operation of the bus traffic controllers by modelling or synthesizing the existing tasks where necessary.
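    An HTA of the shape described can be represented as a nested mapping of tasks to subtasks. The sketch below is hypothetical: the task names are invented placeholders that mirror only the structure reported (five level-one tasks, two of which decompose at level two), not the study's actual task list:

```python
# Hypothetical two-level HTA; task names are invented placeholders.
hta = {
    "0. Monitor intrastate bus services": {
        "1. Log on to monitoring console": {},
        "2. Track bus positions": {
            "2.1 Poll vehicle location feed": {},
            "2.2 Flag schedule deviations": {},
        },
        "3. Communicate with drivers": {
            "3.1 Receive incident reports": {},
            "3.2 Issue rerouting instructions": {},
        },
        "4. Update service status board": {},
        "5. Compile end-of-shift report": {},
    }
}

def count_subtasks(node):
    """Count all subtasks below a node, at any depth."""
    return sum(1 + count_subtasks(child) for child in node.values())
```

    Representing the hierarchy explicitly makes it straightforward to enumerate, audit, or redecompose tasks when the operation changes.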

  9. Initial development of an automated task analysis profiling system

    International Nuclear Information System (INIS)

    Jorgensen, C.C.

    1984-01-01

    A program for automated task analysis is described. Called TAPS (task analysis profiling system), the program accepts normal English prose and outputs skills, knowledges, attitudes, and abilities (SKAAs) along with specific guidance and recommended ability measurement tests for nuclear power plant operators. A new method for defining SKAAs is presented along with a sample program output

  10. Task analysis: a detailed example of stepping up from JSA

    International Nuclear Information System (INIS)

    Banks, W.W.; Paramore, B.A.; Buys, J.R.

    1984-10-01

    This paper discusses a pilot task analysis of operations in a proposed facility for the cutting and packaging of radioactively contaminated gloveboxes, for long-term storage or burial. The objective was to demonstrate how task analysis may be used as a tool for planning and risk management. Two specific products were generated - preliminary operating procedures and training requirements. The task data base, procedures list and training requirements developed were intended as first order categorizations. The analysis was limited to tasks that will be performed within the boundaries of the operational facility and the associated load-out area. The analysis documents tasks to be performed by D and D (Decontamination and Decommissioning) Workers. However, the analysis included all tasks identified as an integral part of glovebox processing within the facility. Thus tasks involving Radiation Protection Technicians (RPTs) are included. Based on hazard assessments, it is planned that at least two RPTs will be assigned full-time to the facility, so they may be considered part of its crew. Similarly, supervisory/administrative tasks are included where they were determined to be directly part of process sequences, such as obtaining appropriate certification. 11 tables

  11. Sentiment Analysis of Suicide Notes: A Shared Task.

    Science.gov (United States)

    Pestian, John P; Matykiewicz, Pawel; Linn-Gust, Michelle; South, Brett; Uzuner, Ozlem; Wiebe, Jan; Cohen, K Bretonnel; Hurdle, John; Brew, Christopher

    2012-01-30

    This paper reports on a shared task involving the assignment of emotions to suicide notes. Two features distinguished this task from previous shared tasks in the biomedical domain. One is that it resulted in the corpus of fully anonymized clinical text and annotated suicide notes. This resource is permanently available and will (we hope) facilitate future research. The other key feature of the task is that it required categorization with respect to a large set of labels. The number of participants was larger than in any previous biomedical challenge task. We describe the data production process and the evaluation measures, and give a preliminary analysis of the results. Many systems performed at levels approaching the inter-coder agreement, suggesting that human-like performance on this task is within the reach of currently available technologies.

  12. International data collection and analysis. Task 1

    Energy Technology Data Exchange (ETDEWEB)

    1979-04-01

    Commercial nuclear power has grown to the point where 13 nations now operate commercial nuclear power plants. Another four countries should join this list before the end of 1980. In the Nonproliferation Alternative Systems Assessment Program (NASAP), the US DOE is evaluating a series of alternate possible power systems. The objective is to determine practical nuclear systems which could reduce proliferation risk while still maintaining the benefits of nuclear power. Part of that effort is the development of a data base denoting the energy needs, resources, technical capabilities, commitment to nuclear power, and projected future trends for various non-US countries. The data are presented by country for each of 28 non-US countries. This volume contains compiled data on Mexico, Netherlands, Pakistan, Philippines, South Africa, South Korea, and Spain.

  13. International data collection and analysis. Task 1

    Energy Technology Data Exchange (ETDEWEB)

    Fox, J.B.; Stobbs, J.J.; Collier, D.M.; Hobbs, J.S.

    1979-04-01

    Commercial nuclear power has grown to the point where 13 nations now operate commercial nuclear power plants. Another four countries should join this list before the end of 1980. In the Nonproliferation Alternative Systems Assessment Program (NASAP), the US DOE is evaluating a series of alternate possible power systems. The objective is to determine practical nuclear systems which could reduce proliferation risk while still maintaining the benefits of nuclear power. Part of that effort is the development of a data base denoting the energy needs, resources, technical capabilities, commitment to nuclear power, and projected future trends for various non-US countries. The data are presented by country for each of 28 non-US countries. Data are compiled in this volume on Canada, Egypt, Federal Republic of Germany, Finland, and France.

  14. Experiment with expert system guidance of an engineering analysis task

    International Nuclear Information System (INIS)

    Ransom, V.H.; Fink, R.K.; Callow, R.A.

    1986-01-01

    An experiment is being conducted in which expert systems are used to guide the performance of an engineering analysis task. The task chosen for experimentation is the application of a large thermal-hydraulic transient simulation computer code. The expectation from this work is that the expert system will yield an improved analytical result with a reduction in the amount of human labor and expertise required. The code-associated functions of model formulation, data input, code execution, and analysis of the computed output have all been identified as candidate tasks that could benefit from the use of expert systems. Expert system modules have been built for the model building and data input task. Initial results include the observation that human expectations of an intelligent environment rapidly escalate, and that structured or stylized tasks tolerated in the unaided system become frustrating within the intelligent environment

  15. Cognitive Task Analysis of the Battalion Level Visualization Process

    National Research Council Canada - National Science Library

    Leedom, Dennis K; McElroy, William; Shadrick, Scott B; Lickteig, Carl; Pokorny, Robet A; Haynes, Jacqueline A; Bell, James

    2007-01-01

    ... position or as a battalion Operations Officer or Executive Officer. Based on findings from the cognitive task analysis, 11 skill areas were identified as potential focal points for future training development...

  16. Recurrence interval analysis of trading volumes.

    Science.gov (United States)

    Ren, Fei; Zhou, Wei-Xing

    2010-06-01

    We study the statistical properties of the recurrence intervals τ between successive trading volumes exceeding a certain threshold q. The recurrence interval analysis is carried out for the 20 liquid Chinese stocks covering a period from January 2000 to May 2009, and two Chinese indices from January 2003 to April 2009. Similar to the recurrence interval distribution of the price returns, the tail of the recurrence interval distribution of the trading volumes follows a power-law scaling, and the results are verified by the goodness-of-fit tests using the Kolmogorov-Smirnov (KS) statistic, the weighted KS statistic and the Cramér-von Mises criterion. The measurements of the conditional probability distribution and the detrended fluctuation function show that both short-term and long-term memory effects exist in the recurrence intervals between trading volumes. We further study the relationship between trading volumes and price returns based on the recurrence interval analysis method. It is found that large trading volumes are more likely to occur following large price returns, and the comovement between trading volumes and price returns is more pronounced for large trading volumes.
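    The recurrence-interval construction described above reduces to a few lines of array code: find the indices where the series exceeds the threshold q and take the gaps between them. This is an illustrative sketch on synthetic heavy-tailed data, not the authors' implementation, and the 95th-percentile threshold is an assumption:

```python
import numpy as np

def recurrence_intervals(volumes, q):
    """Gaps (in samples) between successive values exceeding threshold q."""
    idx = np.flatnonzero(np.asarray(volumes) > q)
    return np.diff(idx)

# Synthetic stand-in for trading volumes: a heavy-tailed series.
rng = np.random.default_rng(0)
vols = rng.pareto(2.0, size=10_000)
q = np.quantile(vols, 0.95)  # threshold at the 95th percentile (assumption)
tau = recurrence_intervals(vols, q)
```

    The empirical distribution of `tau` is what the study fits and tests for power-law scaling; conditioning `tau` on the size of the preceding interval probes the memory effects the abstract reports.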

  17. Imaging gait analysis: An fMRI dual task study.

    Science.gov (United States)

    Bürki, Céline N; Bridenbaugh, Stephanie A; Reinhardt, Julia; Stippich, Christoph; Kressig, Reto W; Blatow, Maria

    2017-08-01

    In geriatric clinical diagnostics, gait analysis with cognitive-motor dual tasking is used to predict fall risk and cognitive decline. To date, the neural correlates of cognitive-motor dual tasking processes are not fully understood. To investigate these underlying neural mechanisms, we designed an fMRI paradigm to reproduce the gait analysis. We tested the fMRI paradigm's feasibility in a substudy with fifteen young adults and assessed 31 healthy older adults in the main study. First, gait speed and variability were quantified using the GAITRite© electronic walkway. Then, participants lying in the MRI scanner were stepping on pedals of an MRI-compatible stepping device used to imitate gait during functional imaging. In each session, participants performed cognitive and motor single tasks as well as cognitive-motor dual tasks. Behavioral results showed that the parameters of both gait analyses, GAITRite© and fMRI, were significantly positively correlated. FMRI results revealed significantly reduced brain activation during dual task compared to single task conditions. Functional ROI analysis showed that activation in the superior parietal lobe (SPL) decreased less from single to dual task condition than activation in primary motor cortex and in supplementary motor areas. Moreover, SPL activation was increased during dual tasks in subjects exhibiting lower stepping speed and lower executive control. We were able to simulate walking during functional imaging with valid results that reproduce those from the GAITRite© gait analysis. On the neural level, SPL seems to play a crucial role in cognitive-motor dual tasking and to be linked to divided attention processes, particularly when motor activity is involved.

  18. A framework for cognitive task analysis in systems design

    International Nuclear Information System (INIS)

    Rasmussen, J.

    1985-08-01

    The present rapid development of advanced information technology and its use for support of operators of complex technical systems are changing the content of task analysis towards the analysis of mental activities in decision making. Automation removes the humans from routine tasks, and operators are left with disturbance control and critical diagnostic tasks, for which computers are suitable for support, if it is possible to match the computer strategies and interface formats dynamically to the requirements of the current task by means of an analysis of the cognitive task. Such a cognitive task analysis will not aim at a description of the information processes suited for particular control situations. It will rather aim at an analysis in order to identify the requirements to be considered along various dimensions of the decision tasks, in order to give the user - i.e. a decision maker - the freedom to adapt his performance to system requirements in a way which matches his process resources and subjective preferences. To serve this purpose, a number of analyses at various levels are needed to relate the control requirements of the system to the information processes and to the processing resources offered by computers and humans. The paper discusses the cognitive task analysis in terms of the following domains: the problem domain, which is a representation of the functional properties of the system giving a consistent framework for identification of the control requirements of the system; the decision sequences required for typical situations; the mental strategies and heuristics which are effective and acceptable for the different decision functions; and the cognitive control mechanisms used, depending upon the level of skill which can/will be applied. Finally, the end-users' criteria for choice of mental strategies in the actual situation are considered, and the need for development of criteria for judging the ultimate user acceptance of computer support is

  19. Consensus statement of the ESICM task force on colloid volume therapy in critically ill patients

    DEFF Research Database (Denmark)

    Reinhart, Konrad; Perner, Anders; Sprung, Charles L

    2012-01-01

    PURPOSE: Colloids are administered to more patients than crystalloids, although recent evidence suggests that colloids may possibly be harmful in some patients. The European Society of Intensive Care Medicine therefore assembled a task force to compile consensus recommendations based on the current best evidence for the safety and efficacy of the currently most frequently used colloids--hydroxyethyl starches (HES), gelatins and human albumin. METHODS: Meta-analyses, systematic reviews and clinical studies of colloid use were evaluated for the treatment of volume depletion in mixed intensive care unit (ICU), cardiac surgery, head injury, sepsis and organ donor patients. Clinical endpoints included mortality, kidney function and bleeding. The relevance of concentration and dosage was also assessed. Publications from 1960 until May 2011 were included. The quality of available evidence

  20. Grey matter volume correlates with virtual water maze task performance in boys with androgen excess.

    Science.gov (United States)

    Mueller, S C; Merke, D P; Leschek, E W; Fromm, S; Grillon, C; Cornwell, B R; Vanryzin, C; Ernst, M

    2011-12-01

    Major questions remain about the specific role of testosterone in human spatial navigation. We tested 10 boys (mean age 11.65 years) with an extremely rare disorder of androgen excess (Familial Male Precocious Puberty, FMPP) and 40 healthy boys (mean age 12.81 years) on a virtual version of the Morris Water Maze task. In addition, anatomical magnetic resonance images were collected for all patients and a subsample of the controls (n=21) after task completion. Behaviourally, no significant differences were found between both groups. However, in the MRI analyses, grey matter volume (GMV) was correlated with performance using voxel-based morphometry (VBM). Group differences in correlations of performance with GMV were apparent in medial regions of the prefrontal cortex as well as the middle occipital gyrus and the cuneus. By comparison, similar correlations for both groups were found in the inferior parietal lobule. These data provide novel insight into the relation between testosterone and brain development and suggest that morphological differences in a spatial navigation network covary with performance in spatial ability. Published by Elsevier Ltd.

  1. Feasibility study of modern airships, phase 1. Volume 3: Historical overview (task 1)

    Science.gov (United States)

    Faurote, G. L.

    1975-01-01

    The history of lighter-than-air vehicles is reviewed in terms of providing a background for the mission analysis and parametric analysis tasks. Data from past airships and airship operations are presented in the following areas: (1) parameterization of design characteristics; (2) markets, missions, costs, and operating procedures, (3) indices of efficiency for comparison; (4) identification of critical design and operational characteristics; and (5) definition of the 1930 state-of-the-art and the 1974 state-of-the-art from a technical and economic standpoint.

  2. Memory systems, processes, and tasks: taxonomic clarification via factor analysis.

    Science.gov (United States)

    Bruss, Peter J; Mitchell, David B

    2009-01-01

    The nature of various memory systems was examined using factor analysis. We reanalyzed data from 11 memory tasks previously reported in Mitchell and Bruss (2003). Four well-defined factors emerged, closely resembling episodic and semantic memory and conceptual and perceptual implicit memory, in line with both memory systems and transfer-appropriate processing accounts. To explore taxonomic issues, we ran separate analyses on the implicit tasks. Using a cross-format manipulation (pictures vs. words), we identified 3 prototypical tasks. Word fragment completion and picture fragment identification tasks were "factor pure," tapping perceptual processes uniquely. Category exemplar generation revealed its conceptual nature, yielding both cross-format priming and a picture superiority effect. In contrast, word stem completion and picture naming were more complex, revealing attributes of both processes.

  3. Identifying radiotherapy target volumes in brain cancer by image analysis.

    Science.gov (United States)

    Cheng, Kun; Montgomery, Dean; Feng, Yang; Steel, Robin; Liao, Hanqing; McLaren, Duncan B; Erridge, Sara C; McLaughlin, Stephen; Nailon, William H

    2015-10-01

    To establish the optimal radiotherapy fields for treating brain cancer patients, the tumour volume is often outlined on magnetic resonance (MR) images, where the tumour is clearly visible, and mapped onto computerised tomography images used for radiotherapy planning. This process requires considerable clinical experience and is time consuming, which will continue to increase as more complex image sequences are used in this process. Here, the potential of image analysis techniques for automatically identifying the radiation target volume on MR images, and thereby assisting clinicians with this difficult task, was investigated. A gradient-based level set approach was applied on the MR images of five patients with grades II, III and IV malignant cerebral glioma. The relationship between the target volumes produced by image analysis and those produced by a radiation oncologist was also investigated. The contours produced by image analysis were compared with the contours produced by an oncologist and used for treatment. In 93% of cases, the Dice similarity coefficient was found to be between 60 and 80%. This feasibility study demonstrates that image analysis has the potential for automatic outlining in the management of brain cancer patients, however, more testing and validation on a much larger patient cohort is required.
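The record evaluates automatic contours against an oncologist's with the Dice similarity coefficient. A minimal sketch of how Dice overlap is computed on two voxel-index sets (the masks here are hypothetical, not patient data):

```python
def dice_coefficient(mask_a, mask_b):
    """Dice similarity: 2|A intersect B| / (|A| + |B|) for two sets of voxel indices."""
    if not mask_a and not mask_b:
        return 1.0  # two empty masks agree perfectly by convention
    intersection = len(mask_a & mask_b)
    return 2.0 * intersection / (len(mask_a) + len(mask_b))

# Hypothetical voxel index sets for an automatic and a manual contour
auto_contour = {(x, y) for x in range(10) for y in range(10)}       # 100 voxels
manual_contour = {(x, y) for x in range(3, 13) for y in range(10)}  # 100 voxels, shifted
print(dice_coefficient(auto_contour, manual_contour))  # 70 shared voxels -> 0.7
```

A Dice value of 0.6 to 0.8, as reported for most cases in the record, indicates substantial but imperfect overlap.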

  4. A Validated Task Analysis of the Single Pilot Operations Concept

    Science.gov (United States)

    Wolter, Cynthia A.; Gore, Brian F.

    2015-01-01

    The current day flight deck operational environment consists of a two-person Captain/First Officer crew. A concept of operations (ConOps) to reduce the commercial cockpit to a single pilot from the current two pilot crew is termed Single Pilot Operations (SPO). This concept has been under study by researchers in the Flight Deck Display Research Laboratory (FDDRL) at the National Aeronautics and Space Administration's (NASA) Ames Research Center (Johnson, Comerford, Lachter, Battiste, Feary, and Mogford, 2012) and by researchers at Langley Research Center (Schutte et al., 2007). Transitioning from a two pilot crew to a single pilot crew will undoubtedly require changes in operational procedures, crew coordination, use of automation, and in how the roles and responsibilities of the flight deck and ATC are conceptualized in order to maintain the high levels of safety expected of the US National Airspace System. These modifications will affect the roles and the subsequent tasks that are required of the various operators in the NextGen environment. The current report outlines the process taken to identify and document the tasks required by the crew according to a number of operational scenarios studied by the FDDRL between the years 2012-2014. A baseline task decomposition has been refined to represent the tasks consistent with a new set of entities, tasks, roles, and responsibilities being explored by the FDDRL as the move is made towards SPO. Information from Subject Matter Expert interviews, participation in FDDRL experimental design meetings, and study observation was used to populate and refine task sets that were developed as part of the SPO task analyses. The task analysis is based upon the proposed ConOps for the third FDDRL SPO study. This experiment involved nine different entities operating in six scenarios using a variety of SPO-related automation and procedural activities required to guide safe and efficient aircraft operations. The task analysis presents the roles and

  5. Task Analysis in Instructional Design: Some Cases from Mathematics.

    Science.gov (United States)

    Resnick, Lauren B.

    Task analysis as a tool in the design of instruction is the subject of this paper. Some of the major historical approaches (associationist/behaviorist, gestalt, and Piagetian) are described using examples from mathematics. The usefulness of these approaches to instructional design is evaluated on the basis of four criteria: instructional…

  6. An Analysis of Tasks Performed in the Ornamental Horticulture Industry.

    Science.gov (United States)

    Berkey, Arthur L.; Drake, William E.

    This publication is the result of a detailed task analysis study of the ornamental horticulture industry in New York State. Nine types of horticulture businesses identified were: (1) retail florists, (2) farm and garden supply store, (3) landscape services, (4) greenhouse production, (5) nursery production, (6) turf production, (7) arborist…

  7. Data analysis & probability task sheets : grades pk-2

    CERN Document Server

    Cook, Tanya

    2009-01-01

    For grades PK-2, our Common Core State Standards-based resource meets the data analysis & probability concepts addressed by the NCTM standards and encourages your students to learn and review the concepts in unique ways. Each task sheet is organized around a central problem taken from real-life experiences of the students.

  8. Use of Job and Task Analysis in Training.

    Science.gov (United States)

    George Washington Univ., Alexandria, VA. Human Resources Research Office.

    At a briefing sponsored by the Office of the Deputy Chief of Staff for Individual Training, members of the Human Resources Research Office reported on four projects using job and task analysis in different training situations. Work Unit STOCK was a training program designed to develop training management procedures for heterogeneous ability…

  9. Partial Derivative Games in Thermodynamics: A Cognitive Task Analysis

    Science.gov (United States)

    Kustusch, Mary Bridget; Roundy, David; Dray, Tevian; Manogue, Corinne A.

    2014-01-01

    Several studies in recent years have demonstrated that upper-division students struggle with the mathematics of thermodynamics. This paper presents a task analysis based on several expert attempts to solve a challenging mathematics problem in thermodynamics. The purpose of this paper is twofold. First, we highlight the importance of cognitive task…

  10. Feasibility of developing a portable driver performance data acquisition system for human factors research: Technical tasks. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Carter, R.J.; Barickman, F.S.; Spelt, P.F.; Schmoyer, R.L.; Kirkpatrick, J.R.

    1998-01-01

    A two-phase, multi-year research program entitled "Development of a Portable Driver Performance Data Acquisition System for Human Factors Research" was recently completed. The primary objective of the project was to develop a portable data acquisition system for crash avoidance research (DASCAR) that will allow driver performance data to be collected using a large variety of vehicle types and that would be capable of being installed on a given vehicle type within a relatively short time frame. During phase 1 a feasibility study for designing and fabricating DASCAR was conducted. In phase 2 of the research DASCAR was actually developed and validated. This technical memorandum documents the results from the feasibility study. It is subdivided into three volumes. Volume one (this report) addresses the last five items in the phase 1 research and the first issue in the second phase of the project. Volumes two and three present the related appendices and the design specifications developed for DASCAR, respectively. The six tasks were oriented toward: identifying parameters and measures; identifying analysis tools and methods; identifying measurement techniques and state-of-the-art hardware and software; developing design requirements and specifications; determining the cost of one or more copies of the proposed data acquisition system; and designing a development plan and constructing DASCAR. This report also covers: the background to the program; the requirements for the project; micro camera testing; heat load calculations for the DASCAR instrumentation package in automobile trunks; phase 2 of the research; the DASCAR hardware and software delivered to the National Highway Traffic Safety Administration; and crash avoidance problems that can be addressed by DASCAR.

  11. TARGET - TASK ANALYSIS REPORT GENERATION TOOL, VERSION 1.0

    Science.gov (United States)

    Ortiz, C. J.

    1994-01-01

    The Task Analysis Report Generation Tool, TARGET, is a graphical interface tool used to capture procedural knowledge and translate that knowledge into a hierarchical report. TARGET is based on VISTA, a knowledge acquisition tool developed by the Naval Systems Training Center. TARGET assists a programmer and/or task expert organize and understand the steps involved in accomplishing a task. The user can label individual steps in the task through a dialogue-box and get immediate graphical feedback for analysis. TARGET users can decompose tasks into basic action kernels or minimal steps to provide a clear picture of all basic actions needed to accomplish a job. This method allows the user to go back and critically examine the overall flow and makeup of the process. The user can switch between graphics (box flow diagrams) and text (task hierarchy) versions to more easily study the process being documented. As the practice of decomposition continues, tasks and their subtasks can be continually modified to more accurately reflect the user's procedures and rationale. This program is designed to help a programmer document an expert's task thus allowing the programmer to build an expert system which can help others perform the task. Flexibility is a key element of the system design and of the knowledge acquisition session. If the expert is not able to find time to work on the knowledge acquisition process with the program developer, the developer and subject matter expert may work in iterative sessions. TARGET is easy to use and is tailored to accommodate users ranging from the novice to the experienced expert systems builder. TARGET is written in C-language for IBM PC series and compatible computers running MS-DOS and Microsoft Windows version 3.0 or 3.1. No source code is supplied. The executable also requires 2Mb of RAM, a Microsoft compatible mouse, a VGA display and an 80286, 386 or 486 processor machine. 
The standard distribution medium for TARGET is one 5.25 inch 360K
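TARGET renders captured procedural knowledge as a hierarchical report of tasks decomposed into subtasks. A toy sketch of that underlying idea, a task tree rendered as an indented outline (the structure and names are hypothetical, not TARGET's actual data format):

```python
def render_task_hierarchy(task, depth=0, lines=None):
    """Recursively render a task tree as an indented outline, one step per line."""
    if lines is None:
        lines = []
    lines.append("  " * depth + task["name"])
    for sub in task.get("subtasks", []):
        render_task_hierarchy(sub, depth + 1, lines)
    return lines

# Hypothetical decomposition of a maintenance job into action kernels
job = {
    "name": "Replace pump seal",
    "subtasks": [
        {"name": "Isolate pump", "subtasks": [{"name": "Close inlet valve"}]},
        {"name": "Remove housing"},
    ],
}
for line in render_task_hierarchy(job):
    print(line)
```

Iteratively refining such a tree, as TARGET supports, amounts to editing nodes and re-rendering the outline.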

  12. Inpo/industry job and task analysis efforts

    International Nuclear Information System (INIS)

    Wigley, W.W.

    1985-01-01

    One of the goals of INPO is to develop and coordinate industrywide programs to improve the education, training and qualification of nuclear utility personnel. To accomplish this goal, INPO's Training and Education Division: conducts periodic evaluations of industry training programs; provides assistance to the industry in developing training programs; manages the accreditation of utility training programs. These efforts are aimed at satisfying the need for training programs for nuclear utility personnel to be performance-based. Performance-based means that training programs provide an incumbent with the skills and knowledge required to safely perform the job. One of the ways that INPO has provided assistance to the industry is through the industrywide job and task analysis effort. I will discuss the job analysis and task analysis processes, the current status of JTA efforts, JTA products and JTA lessons learned

  13. Sandia-Power Surety Task Force Hawaii foam analysis.

    Energy Technology Data Exchange (ETDEWEB)

    McIntyre, Annie

    2010-11-01

    The Office of the Secretary of Defense (OSD) Power Surety Task Force was officially created in early 2008, after nearly two years of work in demand reduction and renewable energy technologies to support the Warfighter in Theater. The OSD Power Surety Task Force is tasked with identifying efficient energy solutions that support mission requirements. Spray foam insulation demonstrations were recently expanded beyond field structures to include military housing at Ft. Belvoir. Initial results from using the foam in both applications are favorable. This project will address the remaining key questions: (1) Can this technology help to reduce utility costs for the Installation Commander? (2) Is the foam cost effective? (3) What application differences in housing affect those key metrics? The critical need for energy solutions in Hawaii and the existing relationships among Sandia, the Department of Defense (DOD), the Department of Energy (DOE), and Forest City make this location a logical choice for a foam demonstration. This project includes application and analysis of foam to a residential duplex at the Waikulu military community on Oahu, Hawaii, as well as reference to spray foam applied to a PACOM facility and additional foamed units on Maui, conducted during this project phase. This report concludes the analysis and describes the utilization of foam insulation at military housing in Hawaii and the subsequent data gathering and analysis.

  14. Task analysis: How far are we from usable PRA input

    International Nuclear Information System (INIS)

    Gertman, D.I.; Blackman, H.S.; Hinton, M.F.

    1984-01-01

    This chapter reviews data collected at the Idaho National Engineering Laboratory for three DOE-owned reactors (the Advanced Test Reactor, the Power Burst Facility, and the Loss of Fluids Test Reactor) in order to identify usable Probabilistic Risk Assessment (PRA) input. Task analytic procedures involve determining manning and skill levels as a means of determining communication requirements, assessing job performance aids, and assessing the accuracy and completeness of emergency and maintenance procedures. The least understood aspect in PRA and plant reliability models is the human factor. A number of examples from the data base are discussed and offered as a means of providing more meaningful data than has been available to PRA analysts in the past. It is concluded that the plant hardware-procedures-personnel interfaces are essential to safe and efficient plant operations and that task analysis is a reasonably sound way of achieving a qualitative method for identifying those tasks most strongly associated with task difficulty, severity of consequence, and error probability

  15. NCO Leadership: Tasks, Skills and Functions. Volume 1. Appendixes A and D

    Science.gov (United States)

    1984-06-01

    leader requirements. In determining the parameters of a leadership position, changes in requirements based on the situation, the leader, and...asked to review the tentative lists and make any additional changes. The two groups consisted of sixteen instructors from the NCO academies located...Maintenance (7 tasks). Job dimensions which contained relatively few important tasks included Group Management (1 task), General Unit Administration (no

  16. Physical and cognitive task analysis in interventional radiology

    International Nuclear Information System (INIS)

    Johnson, S.; Healey, A.; Evans, J.; Murphy, M.; Crawshaw, M.; Gould, D.

    2006-01-01

    AIM: To identify, describe and detail the cognitive thought processes, decision-making, and physical actions involved in the preparation and successful performance of core interventional radiology procedures. MATERIALS AND METHODS: Five commonly performed core interventional radiology procedures were selected for cognitive task analysis. Several examples of each procedure being performed by consultant interventional radiologists were videoed. The videos of those procedures, and the steps required for successful outcome, were analysed by a psychologist and an interventional radiologist. Once a skeleton algorithm of the procedures was defined, further refinement was achieved using individual interview techniques with consultant interventional radiologists. Additionally, a critique of each iteration of the established algorithm was sought from non-participating independent consultant interventional radiologists. RESULTS: Detailed task descriptions and decision protocols were developed for five interventional radiology procedures (arterial puncture, nephrostomy, venous access, biopsy using both ultrasound and computed tomography, and percutaneous transhepatic cholangiogram). Identical tasks performed within these procedures were identified and standardized within the protocols. CONCLUSIONS: Complex procedures were broken down and their constituent processes identified. This might be suitable for use as a training protocol to provide a universally acceptable safe practice at the most fundamental level. It is envisaged that data collected in this way can be used as an educational resource for trainees and could provide the basis for a training curriculum in interventional radiology. It will direct trainees towards safe practice of the highest standard. It will also provide performance objectives of a simulator model.

  17. Job task analysis: lessons learned from application in course development

    International Nuclear Information System (INIS)

    Meredith, J.B.

    1985-01-01

    Those at Public Service Electric and Gas Company are committed to a systematic approach to training known as Instructional System Design. Our performance-based training emphasizes the ISD process to have trainees do or perform the task whenever and wherever it is possible for the jobs for which they are being trained. Included is a brief description of our process for conducting and validating job analyses. The major thrust of this paper is primarily on the lessons that we have learned in the design and development of training programs based upon job analysis results

  18. Analysis and Modeling of Control Tasks in Dynamic Systems

    DEFF Research Database (Denmark)

    Ursem, Rasmus Kjær; Krink, Thiemo; Jensen, Mikkel Thomas

    2002-01-01

    Most applications of evolutionary algorithms deal with static optimization problems. However, in recent years, there has been a growing interest in time-varying (dynamic) problems, which are typically found in real-world scenarios. One major challenge in this field is the design of realistic test-case generators (TCGs), which requires a systematic analysis of dynamic optimization tasks. So far, only a few TCGs have been suggested. Our investigation leads to the conclusion that these TCGs are not capable of generating realistic dynamic benchmark tests. The result of our research is the design of a new TCG

  19. Pressurized fluidized-bed hydroretorting of eastern oil shales. Volume 4, Task 5, Operation of PFH on beneficiated shale, Task 6, Environmental data and mitigation analyses and Task 7, Sample procurement, preparation, and characterization: Final report, September 1987--May 1991

    Energy Technology Data Exchange (ETDEWEB)

    1992-03-01

    The objective of Task 5 (Operation of Pressurized Fluidized-Bed Hydro-Retorting (PFH) on Beneficiated Shale) was to modify the PFH process to facilitate its use for fine-sized, beneficiated Eastern shales. This task was divided into three subtasks: Non-Reactive Testing, Reactive Testing, and Data Analysis and Correlations. The potential environmental impacts of PFH processing of oil shale must be assessed throughout the development program to ensure that the appropriate technologies are in place to mitigate any adverse effects. The overall objectives of Task 6 (Environmental Data and Mitigation Analyses) were to obtain environmental data relating to PFH and shale beneficiation and to analyze the potential environmental impacts of the integrated PFH process. The task was divided into the following four subtasks: 6.1. Characterization of Processed Shales (IGT), 6.2. Water Availability and Treatment Studies, 6.3. Heavy Metals Removal, and 6.4. PFH Systems Analysis. The objective of Task 7 (Sample Procurement, Preparation, and Characterization) was to procure, prepare, and characterize raw and beneficiated bulk samples of Eastern oil shale for all of the experimental tasks in the program. Accomplishments for these tasks are presented.

  20. Using Goal Setting and Task Analysis to Enhance Task-Based Language Learning and Teaching

    Science.gov (United States)

    Rubin, Joan

    2015-01-01

    Task-Based Language Learning and Teaching has received sustained attention from teachers and researchers for over thirty years. It is a well-established pedagogy that includes the following characteristics: major focus on authentic and real-world tasks, choice of linguistic resources by learners, and a clearly defined non-linguistic outcome. This…

  1. Evaluation of stereoscopic 3D displays for image analysis tasks

    Science.gov (United States)

    Peinsipp-Byma, E.; Rehfeld, N.; Eck, R.

    2009-02-01

    In many application domains the analysis of aerial or satellite images plays an important role. The use of stereoscopic display technologies can enhance the image analyst's ability to detect or to identify certain objects of interest, which results in higher performance. Changing image acquisition from analog to digital techniques entailed the change of stereoscopic visualisation techniques. Recently, different kinds of digital stereoscopic display techniques with affordable prices have appeared on the market. At Fraunhofer IITB, usability tests were carried out to find out (1) with which kind of these commercially available stereoscopic display techniques image analysts achieve the best performance and (2) which of these techniques achieve high acceptance. First, image analysts were interviewed to define typical image analysis tasks which were expected to be solved with a higher performance using stereoscopic display techniques. Next, observer experiments were carried out whereby image analysts had to solve defined tasks with different visualization techniques. Based on the experimental results (performance parameters and qualitative subjective evaluations of the used display techniques), two of the examined stereoscopic display technologies were found to be very good and appropriate.

  2. Observer analysis and its impact on task performance modeling

    Science.gov (United States)

    Jacobs, Eddie L.; Brown, Jeremy B.

    2014-05-01

    Fire fighters use relatively low cost thermal imaging cameras to locate hot spots and fire hazards in buildings. This research describes the analyses performed to study the impact of thermal image quality on fire fighter fire hazard detection task performance. Using human perception data collected by the National Institute of Standards and Technology (NIST) for fire fighters detecting hazards in a thermal image, an observer analysis was performed to quantify the sensitivity and bias of each observer. Using this analysis, the subjects were divided into three groups representing three different levels of performance. The top-performing group was used for the remainder of the modeling. Models were developed which related image quality factors such as contrast, brightness, spatial resolution, and noise to task performance probabilities. The models were fitted to the human perception data using logistic regression, as well as probit regression. Probit regression was found to yield superior fits and showed that models with not only 2nd order parameter interactions, but also 3rd order parameter interactions performed the best.
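The record fits hazard-detection probability to image-quality factors with logistic and probit regression. A minimal single-predictor logistic fit by gradient descent, on invented data, illustrates the logistic half of that approach (the probit variant would swap the logistic link for the normal CDF):

```python
import math

def fit_logistic(xs, ys, lr=0.1, epochs=5000):
    """One-predictor logistic regression fit by batch gradient descent."""
    b0, b1 = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))  # predicted probability
            g0 += p - y
            g1 += (p - y) * x
        b0 -= lr * g0 / n
        b1 -= lr * g1 / n
    return b0, b1

# Hypothetical data: hazard detected (1) or missed (0) vs. image contrast
contrast = [0.1, 0.2, 0.3, 0.5, 0.6, 0.8, 0.9]
detected = [0, 0, 0, 1, 1, 1, 1]
b0, b1 = fit_logistic(contrast, detected)
p_low = 1 / (1 + math.exp(-(b0 + b1 * 0.2)))   # predicted P(detect) at low contrast
p_high = 1 / (1 + math.exp(-(b0 + b1 * 0.8)))  # predicted P(detect) at high contrast
print(round(p_low, 2), round(p_high, 2))
```

Higher-order parameter interactions, as in the record's best-fitting models, would add product terms of the image-quality factors as extra predictors.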

  3. Validity of the alcohol purchase task: a meta-analysis.

    Science.gov (United States)

    Kiselica, Andrew M; Webber, Troy A; Bornovalova, Marina A

    2016-05-01

    Behavioral economists assess alcohol consumption as a function of unit price. This method allows construction of demand curves and demand indices, which are thought to provide precise numerical estimates of risk for alcohol problems. One of the more commonly used behavioral economic measures is the Alcohol Purchase Task (APT). Although the APT has shown promise as a measure of risk for alcohol problems, the construct validity and incremental utility of the APT remain unclear. This paper presents a meta-analysis of the APT literature. Sixteen studies were included in the meta-analysis. Studies were gathered via searches of the PsycInfo, PubMed, Web of Science and EconLit research databases. Random-effects meta-analyses with inverse variance weighting were used to calculate summary effect sizes for each demand index-drinking outcome relationship. Moderation of these effects by drinking status (regular versus heavy drinkers) was examined. Additionally, tests of the incremental utility of the APT indices in predicting drinking problems above and beyond measuring alcohol consumption were performed. The APT indices were correlated in the expected directions with drinking outcomes, although many effects were small in size. These effects were typically not moderated by the drinking status of the samples. Additionally, the intensity metric demonstrated incremental utility in predicting alcohol use disorder symptoms beyond measuring drinking. The Alcohol Purchase Task appears to have good construct validity, but limited incremental utility in estimating risk for alcohol problems. © 2015 Society for the Study of Addiction.
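The meta-analysis pools effect sizes with inverse-variance weighting. The sketch below shows the fixed-effect core of that computation on invented effect sizes; a random-effects version, as used in the paper, would add a between-study variance component (tau-squared) to each study's variance before weighting:

```python
def inverse_variance_summary(effects, variances):
    """Fixed-effect inverse-variance weighted summary effect and its variance."""
    weights = [1.0 / v for v in variances]
    total_w = sum(weights)
    summary = sum(w * e for w, e in zip(weights, effects)) / total_w
    return summary, 1.0 / total_w

# Hypothetical effect sizes and sampling variances from three APT studies
effects = [0.30, 0.20, 0.25]
variances = [0.010, 0.020, 0.005]
summary, var = inverse_variance_summary(effects, variances)
print(round(summary, 3), round(var, 4))  # more precise studies pull the summary harder
```

Note how the third study, with the smallest variance, receives the largest weight in the pooled estimate.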

  4. Spatio-temporal analysis reveals active control of both task-relevant and task-irrelevant variables

    Directory of Open Access Journals (Sweden)

    Kornelius eRácz

    2013-11-01

    Full Text Available The Uncontrolled Manifold hypothesis and Minimal Intervention principle propose that the observed differential variability across task-relevant (i.e., task goals) vs. task-irrelevant (i.e., in the null space of those goals) variables is evidence of a separation of task variables for efficient neural control, ranked by their respective variabilities (sometimes referred to as a hierarchy of control). Support for this comes from spatial domain analyses (i.e., structure of kinematic, kinetic and EMG variability). While proponents admit the possibility of preferential as opposed to strictly uncontrolled variables, such distinctions have only begun to be quantified or considered in the temporal domain when inferring control action. Here we extend the study of task variability during tripod static grasp to the temporal domain by applying diffusion analysis. We show that both task-relevant and task-irrelevant parameters show corrective action at some time scales; and conversely, that task-relevant parameters do not show corrective action at other time scales. That is, the spatial fluctuations of fingertip forces show, as expected, greater ranges of variability in task-irrelevant variables (> 98% associated with changes in total grasp force; vs. only
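Diffusion analysis of the kind described infers corrective action from how mean squared displacement (MSD) grows with time scale: uncorrected drift grows roughly linearly with lag, while corrective control makes MSD flatten at longer lags. A minimal sketch on a synthetic random walk (not the study's force data):

```python
import random

def mean_squared_displacement(series, lag):
    """MSD of a 1-D fluctuation series at a given lag (time scale)."""
    diffs = [(series[i + lag] - series[i]) ** 2 for i in range(len(series) - lag)]
    return sum(diffs) / len(diffs)

# Synthetic fluctuation trace: a pure random walk, i.e. no corrective control
random.seed(0)
walk = [0.0]
for _ in range(2000):
    walk.append(walk[-1] + random.gauss(0, 1))

# For an uncorrected random walk, MSD keeps growing with lag;
# corrective action would show up as MSD saturating at longer lags.
print(mean_squared_displacement(walk, 1) < mean_squared_displacement(walk, 50))
```

Comparing MSD curves of task-relevant and task-irrelevant force components across lags is the essence of the temporal-domain test the record describes.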

  5. Heavy vehicle driver workload assessment. Task 3, task analysis data collection

    Science.gov (United States)

    This technical report consists of a collection of task analytic data to support heavy vehicle driver workload assessment and protocol development. Data were collected from professional drivers to provide insights into the following issues: the meanin...

  6. Heavy vehicle driver workload assessment. Task 1, task analysis data and protocols review

    Science.gov (United States)

    This report contains a review of available task analytic data and protocols pertinent to heavy vehicle operation and determination of the availability and relevance of such data to heavy vehicle driver workload assessment. Additionally, a preliminary...

  7. U. S. Army Land Warfare Laboratory. Volume II Appendix B. Task Sheets

    Science.gov (United States)

    1974-06-01

    Aerosols B-561 05-B-73 Entomological Use of Lights B-562 06-B-73 Pesticide Pyrolysis Device B-563 07-B-73 Drug Detection by an Enzyme Method B-564 08-8...experiment showed that addition of small amounts of benzyl diethyl ammonium benzoate (Bitrex or BEDAB) to the soil in which tomato plants were growing...which hindered its evaluation. B-563 TASK NUMBER: 06-B-73 TITLE: Pesticide Pyrolysis Device AUTHORIZED FUNDING: $41,384 TASK DURATION: 3 August 1972

  8. Cognitive Modeling and Task Analysis: Basic Processes and Individual Differences

    National Research Council Canada - National Science Library

    Ackerman, Phillip

    1999-01-01

    ... in a complex-skill environment. The subset of task conditions selected were those that involve basic processes of working memory, task monitoring, and differential loads on spatial reasoning and speed of perceiving...

  9. Control volume based hydrocephalus research; analysis of human data

    Science.gov (United States)

    Cohen, Benjamin; Wei, Timothy; Voorhees, Abram; Madsen, Joseph; Anor, Tomer

    2010-11-01

    Hydrocephalus is a neuropathophysiological disorder primarily diagnosed by increased cerebrospinal fluid volume and pressure within the brain. To date, utilization of clinical measurements has been limited to understanding the relative amplitude and timing of flow, volume and pressure waveforms; qualitative approaches without a clear framework for meaningful quantitative comparison. Pressure volume models and electric circuit analogs enforce volume conservation principles in terms of pressure. Control volume analysis, through the integral mass and momentum conservation equations, ensures that pressure and volume are accounted for using first principles fluid physics. This approach is able to directly incorporate the diverse measurements obtained by clinicians into a simple, direct and robust mechanics based framework. Clinical data obtained for analysis are discussed along with data processing techniques used to extract terms in the conservation equation. Control volume analysis provides a non-invasive, physics-based approach to extracting pressure information from magnetic resonance velocity data that cannot be measured directly by pressure instrumentation.
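The control volume approach rests on integral mass conservation: for an incompressible fluid, dV/dt = Q_in - Q_out, so net volume change over a window is the time integral of the flow imbalance. A minimal numerical sketch with invented flow-waveform samples (rectangle-rule integration, not clinical data):

```python
def volume_change(inflow, outflow, dt):
    """Integrate dV/dt = Q_in - Q_out over sampled flow waveforms (rectangle rule)."""
    return sum((qi - qo) * dt for qi, qo in zip(inflow, outflow))

# Hypothetical flow waveforms (mL/s) sampled every 0.05 s
dt = 0.05
inflow = [1.0, 1.2, 1.1, 0.9, 0.8]
outflow = [0.9, 1.0, 1.0, 0.9, 0.8]
net = volume_change(inflow, outflow, dt)
print(round(net, 4))  # net volume accumulated over the window, in mL
```

In practice the flow terms would come from magnetic resonance velocity measurements integrated over the surfaces bounding the control volume.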

  10. Rectal cancer surgery: volume-outcome analysis.

    LENUS (Irish Health Repository)

    Nugent, Emmeline

    2010-12-01

    There is strong evidence supporting the importance of the volume-outcome relationship with respect to lung and pancreatic cancers. This relationship for rectal cancer surgery, however, remains unclear. We review the currently available literature to assess the evidence base for volume outcome in relation to rectal cancer surgery.

  11. Task analysis and computer aid development for human reliability analysis in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, W. C.; Kim, H.; Park, H. S.; Choi, H. H.; Moon, J. M.; Heo, J. Y.; Ham, D. H.; Lee, K. K.; Han, B. T. [Korea Advanced Institute of Science and Technology, Taejeon (Korea)

    2001-04-01

    The importance of human reliability analysis (HRA), which predicts the possibility of error occurrence in both quantitative and qualitative terms, is steadily increasing because of the effects of human error on system safety. HRA requires task analysis as a preliminary step, but extant task analysis techniques have the problem that the collection of information about the situation in which a human error occurs depends entirely on the HRA analyst. This makes the results of task analysis inconsistent and unreliable. To address this problem, KAERI developed structural information analysis (SIA), which helps to analyze task structures and situations systematically. In this study, the SIA method was evaluated by HRA experts, and a prototype computerized supporting system named CASIA (Computer Aid for SIA) was developed to support performing HRA with the SIA method. Additionally, by applying the SIA method to emergency operating procedures, we derived generic task types used in emergencies and accumulated the analysis results in the CASIA database. The CASIA is expected to help HRA analysts perform the analysis more easily and consistently. As more analyses are performed and more data are accumulated in the CASIA database, HRA analysts will be able to share their analysis experience freely, and thereby the quality of HRA will improve. 35 refs., 38 figs., 25 tabs. (Author)

  12. Taste perception analysis using a semantic verbal fluency task

    Directory of Open Access Journals (Sweden)

    Ghemulet M

    2014-09-01

    Full Text Available Maria Ghemulet,1,2 Maria Baskini,3 Lambros Messinis,2,4 Eirini Mouza,1 Hariklia Proios1,5 1Department of Speech Therapy, Anagennisis (Revival) Physical Recovery and Rehabilitation Centre, Nea Raidestos, Filothei, Thessaloniki, Greece; 2Department of Speech and Language Therapy, Technological Institute of Western Greece, Patra, Greece; 3Department of Neurosurgery, Interbalkan European Medical Centre, Thessaloniki, Greece; 4Neuropsychology Section, Department of Neurology, University of Patras, Medical School, Patras, Greece; 5Department of Education and Social Policy, University of Macedonia, Thessaloniki, Greece Abstract: A verbal fluency (VF) task is a test used to examine cognitive perception. The main aim of this study was to explore a possible relationship between taste perception in the basic taste categories (sweet, salty, sour, and bitter) and subjects’ taste preferences, using a VF task in healthy and dysphagic subjects. In addition, we correlated the results of the VF task with body mass index (BMI). The hypothesis is that categorical preferences would be consistent with the number of verbal responses. We also hypothesized that higher BMI (≥30 kg/m2) would correlate with more responses in either some or all four categories. VF tasks were randomly administered. Analysis criteria included number of verbally produced responses, number of clusters, number of switches, number and type of errors, and VF consistency with taste preferences. Sixty Greek-speaking individuals participated in this study. Forty-three healthy subjects were selected with a wide range of ages, sex, and education levels. Seventeen dysphagic patients were then matched with 17 healthy subjects according to age, sex, and BMI. Quantitative one-way analysis of variance (between groups as well as repeated measures), post hoc, and chi-square analyses, as well as qualitative analyses, were performed. In the healthy subjects’ group, the differences among the mean number of responses for the four

  13. Uncertainty analysis in the task of individual monitoring data

    International Nuclear Information System (INIS)

    Molokanov, A.; Badjin, V.; Gasteva, G.; Antipin, E.

    2003-01-01

    Assessment of internal doses is an essential component of individual monitoring programmes for workers and consists of two stages: individual monitoring measurements and interpretation of the monitoring data in terms of annual intake and/or annual internal dose. The overall uncertainty in the assessed dose is a combination of the uncertainties in these two stages. An algorithm and a computer code were developed for estimating the uncertainty in the assessment of internal dose in the task of individual monitoring data interpretation. Two main influencing factors are analysed in this paper: the unknown time of exposure and the variability of bioassay measurements. The aim of this analysis is to show that the algorithm is applicable in designing an individual monitoring programme for workers so as to guarantee that the individual dose calculated from individual monitoring measurements does not exceed a required limit with a certain confidence probability. (author)
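As an illustration of how stage uncertainties combine: if the two stages contribute independent lognormal uncertainties, their geometric standard deviations (GSDs) combine in quadrature on the log scale. The GSD values below are hypothetical, not taken from the paper.

```python
import math

def combined_gsd(gsd_measurement, gsd_interpretation):
    """Combine two independent lognormal uncertainties:
    the log-standard-deviations add in quadrature."""
    s1 = math.log(gsd_measurement)
    s2 = math.log(gsd_interpretation)
    return math.exp(math.sqrt(s1**2 + s2**2))

overall = combined_gsd(1.5, 2.0)  # hypothetical stage GSDs
```

The combined GSD is dominated by the larger of the two stage uncertainties, which is why the paper's two influencing factors (exposure time, bioassay variability) are analysed separately.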

  14. Inclusion of task dependence in human reliability analysis

    International Nuclear Information System (INIS)

    Su, Xiaoyan; Mahadevan, Sankaran; Xu, Peida; Deng, Yong

    2014-01-01

    Dependence assessment among human errors in human reliability analysis (HRA) is an important issue, which includes the evaluation of the dependence among human tasks and the effect of the dependence on the final human error probability (HEP). This paper presents a computational model to handle dependence in human reliability analysis. The aim of the study is to automatically provide conclusions on the overall degree of dependence and calculate the conditional human error probability (CHEP) once the judgments of the input factors are given. The dependence influencing factors are first identified by the experts, and the priorities of these factors are also taken into consideration. Anchors and qualitative labels are provided as guidance for the HRA analyst's judgment of the input factors. The overall degree of dependence between human failure events is calculated based on the input values and the weights of the input factors. Finally, the CHEP is obtained according to a computing formula derived from the technique for human error rate prediction (THERP) method. The proposed method is able to quantify the subjective judgment from the experts and improve the transparency in the HEP evaluation process. Two examples are illustrated to show the effectiveness and the flexibility of the proposed method. - Highlights: • We propose a computational model to handle dependence in human reliability analysis. • The priorities of the dependence influencing factors are taken into consideration. • The overall dependence degree is determined by input judgments and the weights of factors. • The CHEP is obtained according to a computing formula derived from THERP
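The THERP dependence model referenced above assigns a standard conditional-HEP formula to each of its five dependence levels (zero, low, moderate, high, complete). A minimal sketch of those formulas, with the paper's mapping from judged input factors to a level deliberately left out:

```python
# Standard THERP conditional HEP formulas per dependence level.
THERP_CHEP = {
    "zero":     lambda hep: hep,
    "low":      lambda hep: (1 + 19 * hep) / 20,
    "moderate": lambda hep: (1 + 6 * hep) / 7,
    "high":     lambda hep: (1 + hep) / 2,
    "complete": lambda hep: 1.0,
}

def conditional_hep(hep, level):
    """CHEP for a follow-on task, given the nominal HEP and a dependence level."""
    return THERP_CHEP[level](hep)

chep_high = conditional_hep(0.01, "high")  # (1 + 0.01) / 2
```

Note how even a small nominal HEP is pulled toward 0.5 under high dependence, which is the effect the paper's computational model quantifies.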

  15. SLSF loop handling system. Volume I. Structural analysis. [LMFBR

    Energy Technology Data Exchange (ETDEWEB)

    Ahmed, H.; Cowie, A.; Ma, D.

    1978-10-01

    SLSF loop handling system was analyzed for deadweight and postulated dynamic loading conditions, identified in Chapters II and III in Volume I of this report, using a linear elastic static equivalent method of stress analysis. Stress analysis of the loop handling machine is presented in Volume I of this report. Chapter VII in Volume I of this report is a contribution by EG and G Co., who performed the work under ANL supervision.

  16. SLSF loop handling system. Volume I. Structural analysis

    International Nuclear Information System (INIS)

    Ahmed, H.; Cowie, A.; Ma, D.

    1978-10-01

    SLSF loop handling system was analyzed for deadweight and postulated dynamic loading conditions, identified in Chapters II and III in Volume I of this report, using a linear elastic static equivalent method of stress analysis. Stress analysis of the loop handling machine is presented in Volume I of this report. Chapter VII in Volume I of this report is a contribution by EG and G Co., who performed the work under ANL supervision

  17. RF model of the distribution system as a communication channel, phase 2. Volume 2: Task reports

    Science.gov (United States)

    Rustay, R. C.; Gajjar, J. T.; Rankin, R. W.; Wentz, R. C.; Wooding, R.

    1982-01-01

    Based on the established feasibility of predicting, via a model, the propagation of Power Line Frequency on radial-type distribution feeders, verification studies comparing model predictions against measurements were undertaken using more complicated feeder circuits and situations. Detailed accounts of the major tasks are presented. These include: (1) verification of the model; (2) extension, implementation, and verification of perturbation theory; (3) parameter sensitivity; (4) transformer modeling; and (5) compensation of power distribution systems for enhancement of power line carrier communication reliability.

  18. Comparative analysis of cognitive tasks for modeling mental workload with electroencephalogram.

    Science.gov (United States)

    Hwang, Taeho; Kim, Miyoung; Hwangbo, Minsu; Oh, Eunmi

    2014-01-01

    Previous electroencephalogram (EEG) studies have shown that cognitive workload can be estimated by using several types of cognitive tasks. In this study, we attempted to characterize cognitive tasks that have been used to manipulate workload for generating classification models. We carried out a comparative analysis between two representative types of working memory tasks: the n-back task and the mental arithmetic task. Based on experiments with 7 healthy subjects using Emotiv EPOC, we compared the consistency, robustness, and efficiency of each task in determining cognitive workload in a short training session. The mental arithmetic task seems consistent and robust in manipulating clearly separable high and low levels of cognitive workload with less training. In addition, the mental arithmetic task shows consistency despite repeated usage over time and without notable task adaptation in users. The current study successfully quantifies the quality and efficiency of cognitive workload modeling depending on the type and configuration of training tasks.

  19. Analysis of operators' diagnosis tasks based on cognitive process

    International Nuclear Information System (INIS)

    Zhou Yong; Zhang Li

    2012-01-01

    Diagnosis tasks in nuclear power plants, characterized by highly dynamic uncertainties, are complex reasoning tasks. Diagnosis errors are the main causes of errors of commission. Firstly, based on mental model theory and perception/action cycle theory, a cognitive model for analyzing operators' diagnosis tasks is proposed. Then, the model is used to investigate a trip event that occurred at the Crystal River nuclear power plant. The application demonstrates typical cognitive biases and mistakes that operators may make when performing diagnosis tasks. These mainly include a strong confirmation tendency, difficulty in producing complete hypothesis sets, group mindset, non-systematic errors in hypothesis testing, etc. (authors)

  20. The effects of bedrest on crew performance during simulated shuttle reentry. Volume 2: Control task performance

    Science.gov (United States)

    Jex, H. R.; Peters, R. A.; Dimarco, R. J.; Allen, R. W.

    1974-01-01

    A simplified space shuttle reentry simulation performed on the NASA Ames Research Center Centrifuge is described. Anticipating potentially deleterious effects of physiological deconditioning from orbital living (simulated here by 10 days of enforced bedrest) upon a shuttle pilot's ability to manually control his aircraft (should that be necessary in an emergency), a comprehensive battery of measurements was made roughly every 1/2 minute on eight military pilot subjects, over two 20-minute reentry Gz vs. time profiles, one peaking at 2 Gz and the other at 3 Gz. Alternate runs were made without and with g-suits to test the help or interference offered by such protective devices to manual control performance. A very demanding two-axis control task was employed, with a subcritical instability in the pitch axis to force a high attentional demand and a severe loss-of-control penalty. The results show that pilots experienced in high-Gz flying can easily handle the shuttle manual control task during 2 Gz or 3 Gz reentry profiles, provided the degree of physiological deconditioning is no more than that induced by these 10 days of enforced bedrest.

  1. Applying Hierarchical Task Analysis Method to Discovery Layer Evaluation

    Directory of Open Access Journals (Sweden)

    Marlen Promann

    2015-03-01

    Full Text Available Libraries are implementing discovery layers to offer better user experiences. While usability tests have been helpful in evaluating the success or failure of implementing discovery layers in the library context, the focus has remained on their relative interface benefits over traditional federated search. Informal, site- and context-specific usability tests have done little to test the rigor of discovery layers against the user goals, motivations and workflows they have been designed to support. This study proposes hierarchical task analysis (HTA) as an important complementary evaluation method to usability testing of discovery layers. Relevant literature is reviewed for discovery layers and the HTA method. As no previous application of HTA to the evaluation of discovery layers was found, this paper presents the application of HTA as an expert-based and workflow-centered (e.g. retrieving a relevant book or a journal article) method for evaluating discovery layers. Purdue University’s Primo by Ex Libris was used to map eleven use cases as HTA charts. Nielsen’s Goal Composition theory was used as an analytical framework to evaluate the goal charts from two perspectives: (a) users’ physical interactions (i.e. clicks), and (b) users’ cognitive steps (i.e. decision points for what to do next). A brief comparison of HTA and usability test findings is offered by way of conclusion.
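An HTA chart of the kind described can be represented as a nested structure and scored on the study's two perspectives: physical actions (clicks) and cognitive decision points. The chart below is a hypothetical toy example, not one of the eleven Purdue use cases:

```python
# Hypothetical HTA chart: a goal decomposed into subtasks,
# each tagged as a physical "action" or a cognitive "decision".
hta = {
    "goal": "retrieve a known journal article",
    "kind": "plan",
    "subtasks": [
        {"goal": "enter title in search box", "kind": "action"},
        {"goal": "choose a result vs. refine the query", "kind": "decision"},
        {"goal": "click the full-text link", "kind": "action"},
    ],
}

def count(node, kind):
    """Recursively count nodes of a given kind in the HTA tree."""
    hits = 1 if node.get("kind") == kind else 0
    return hits + sum(count(c, kind) for c in node.get("subtasks", []))

clicks = count(hta, "action")
decisions = count(hta, "decision")
```

Comparing these two counts across competing interface designs is the kind of expert-based, workflow-centered evaluation the study performs with its HTA charts.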

  2. Revisiting corpus creation and analysis tools for translation tasks

    Directory of Open Access Journals (Sweden)

    Claudio Fantinuoli

    2016-06-01

    Full Text Available Many translation scholars have proposed the use of corpora to allow professional translators to produce high-quality texts which read like originals. Yet the diffusion of this methodology has been modest, one reason being that software for corpus analysis has been developed with the linguist in mind: such tools are generally complex and cumbersome, offering many advanced features but lacking the level of usability and the specific features that meet translators’ needs. To overcome this shortcoming, we have developed TranslatorBank, a free corpus creation and analysis tool designed for translation tasks. TranslatorBank supports the creation of specialized monolingual corpora from the web; it includes a concordancer with a query system similar to a search engine; it uses basic statistical measures to indicate the reliability of results; it accesses the original documents directly for more contextual information; and it includes a statistical and linguistic terminology extraction utility to extract the relevant terminology of the domain and the typical collocations of a given term. Designed to be easy and intuitive to use, the tool may help translation students as well as professionals to increase their translation quality by adhering to the specific linguistic variety of the target text corpus.

  3. Life science payload definition and integration study, task C and D. Volume 3: Appendices

    Science.gov (United States)

    1973-01-01

    Research equipment requirements were based on the Mini-7 and Mini-30 laboratory concepts defined in Tasks A and B of the initial LSPD contract. Modified versions of these laboratories and the research equipment within them were to be used in three missions of Shuttle/Sortie Module. These were designated (1) the shared 7-day laboratory (a mission with the life sciences laboratory sharing the sortie module with another scientific laboratory), (2) the dedicated 7-day laboratory (full use of the sortie module), and (3) the dedicated 30-day laboratory (full sortie module use with a 30-day mission duration). In defining the research equipment requirements of these laboratories, the equipment was grouped according to its function, and equipment unit data packages were prepared.

  4. Adverse Effects of the Apolipoprotein E ε4 Allele on Episodic Memory, Task Switching and Gray Matter Volume in Healthy Young Adults

    Directory of Open Access Journals (Sweden)

    Jianfei Nao

    2017-06-01

    Full Text Available Many studies have shown that healthy elderly subjects and patients with Alzheimer’s disease (AD) who carry the apolipoprotein E (ApoE) ε4 allele have worse cognitive function and more severe brain atrophy than non-carriers. However, it remains unclear whether this ApoE polymorphism leads to changes in cognition and brain morphology in healthy young adults. In this study, we used an established model to measure verbal episodic memory and core executive function (EF) components (response inhibition, working memory and task switching) in 32 ApoE ε4 carriers and 40 non-carriers between 20 and 40 years of age. To do this, we carried out an adapted auditory verbal learning test and three computerized EF tasks. High-resolution head magnetic resonance scans were performed in all participants, and voxel-based morphometry (VBM) was used for image processing and analysis. Multivariate analysis of variance (MANOVA) performed on memory measures showed that the overall verbal episodic memory of ApoE ε4 carriers was significantly worse than that of non-carriers (Wilks’ λ = 4.884, P = 0.004). No significant differences were detected in overall EF between the two groups. Post hoc analyses revealed group differences in terms of immediate recall, recognition and task switching, which favored non-carriers. VBM analysis showed bilateral gray matter (GM) reductions in the medial and dorsolateral frontal, parietal and left temporal cortices in the carrier group relative to the non-carrier group, which were most significant in the bilateral anterior and middle cingulate gyri. However, these changes in GM volume were not directly associated with changes in cognitive function. Our data show that the ApoE ε4 allele is associated with poorer performance in verbal episodic memory and task switching, and a reduction in GM volume, in healthy young adults, suggesting that the effects of ApoE ε4 upon cognition and brain morphology exist long before the possible occurrence of AD.

  5. The Effects of Describing Antecedent Stimuli and Performance Criteria in Task Analysis Instruction for Graphing

    Science.gov (United States)

    Tyner, Bryan C.; Fienup, Daniel M.

    2016-01-01

    Task analyses are ubiquitous to applied behavior analysis interventions, yet little is known about the factors that make them effective. Numerous task analyses have been published in behavior analytic journals for constructing single-subject design graphs; however, learner outcomes using these task analyses may fall short of what could be…

  6. A survey on the task analysis methods and techniques for nuclear power plant operators

    International Nuclear Information System (INIS)

    Lee, Yong Heui; Chun, Se Woo; Suh, Sang Moon; Lee, Jung Woon

    1994-04-01

    We have surveyed techniques and methods of task analysis, from very traditional ones to recently developed ones, that are being applied in various industrial fields. We compare them and analyse their fundamental characteristics and methodological specifications in order to find one suitable for application to the tasks of nuclear power plant operators. Generally, the fundamental process of task analysis is well understood, but its application in practice is not so simple, owing to the wide and varying range of applications in specific domains. Operators' tasks in NPPs are performed strictly according to written operational procedures and are well trained, so a method of task analysis for operators' tasks in NPPs can be established with its own unique characteristics, based on the operational procedures. 8 figs., 10 tabs., 18 refs. (Author)

  7. A survey on the task analysis methods and techniques for nuclear power plant operators

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Yong Heui; Chun, Se Woo; Suh, Sang Moon; Lee, Jung Woon [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1994-04-01

    We have surveyed techniques and methods of task analysis, from very traditional ones to recently developed ones, that are being applied in various industrial fields. We compare them and analyse their fundamental characteristics and methodological specifications in order to find one suitable for application to the tasks of nuclear power plant operators. Generally, the fundamental process of task analysis is well understood, but its application in practice is not so simple, owing to the wide and varying range of applications in specific domains. Operators' tasks in NPPs are performed strictly according to written operational procedures and are well trained, so a method of task analysis for operators' tasks in NPPs can be established with its own unique characteristics, based on the operational procedures. 8 figs., 10 tabs., 18 refs. (Author).

  8. CADDIS Volume 4. Data Analysis: Getting Started

    Science.gov (United States)

    Assembling data for an ecological causal analysis, matching biological and environmental samples in time and space, organizing data along conceptual causal pathways, data quality and quantity requirements, Data Analysis references.

  9. Longitudinal analysis of mouse SDOCT volumes

    Science.gov (United States)

    Antony, Bhavna J.; Carass, Aaron; Lang, Andrew; Kim, Byung-Jin; Zack, Donald J.; Prince, Jerry L.

    2017-03-01

    Spectral-domain optical coherence tomography (SDOCT), in addition to its routine clinical use in the diagnosis of ocular diseases, has begun to find increasing use in animal studies. Animal models are frequently used to study disease mechanisms as well as to test drug efficacy. In particular, SDOCT provides the ability to study animals longitudinally and non-invasively over long periods of time. However, the lack of anatomical landmarks makes longitudinal scan acquisition prone to inconsistencies in orientation. Here, we propose a method for the automated registration of mouse SDOCT volumes. The method begins by accurately segmenting the blood vessels and the optic nerve head region in the scans using a pixel classification approach. The segmented vessel maps from follow-up scans were registered to the baseline scan using an iterative closest point (ICP) algorithm to allow for accurate longitudinal tracking of thickness changes. Eighteen SDOCT volumes from a light damage model study were used to train the random forest utilized in the pixel classification step. The area under the curve (AUC) in a leave-one-out study for the retinal blood vessels and the optic nerve head (ONH) was found to be 0.93 and 0.98, respectively. The complete proposed framework, the retinal vasculature segmentation and the ICP registration, was applied to a secondary set of scans obtained from a light damage model. A qualitative assessment of the registration showed no registration failures.
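The ICP registration step can be sketched in a few lines. The following is a minimal 2-D rigid ICP on synthetic point sets, not the authors' implementation: follow-up points are matched to baseline points by nearest neighbour, and a least-squares (SVD/Procrustes) rigid fit is applied repeatedly.

```python
import numpy as np

def best_rigid_transform(src, dst):
    """Least-squares rotation R and translation t such that dst ~ src @ R.T + t."""
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    H = (src - mu_s).T @ (dst - mu_d)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:  # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, mu_d - R @ mu_s

def icp(src, dst, iters=25):
    """Iterative closest point: alternate nearest-neighbour matching and rigid fits."""
    cur = src.copy()
    for _ in range(iters):
        # brute-force nearest neighbour in dst for every point in cur
        d2 = ((cur[:, None, :] - dst[None, :, :]) ** 2).sum(axis=-1)
        R, t = best_rigid_transform(cur, dst[d2.argmin(axis=1)])
        cur = cur @ R.T + t
    # recover the single transform mapping the original src onto cur
    return best_rigid_transform(src, cur)

# Synthetic check: recover a small known rotation + translation.
rng = np.random.default_rng(0)
base = rng.uniform(-5, 5, size=(200, 2))     # stand-in for a baseline vessel map
theta = 0.01
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
moved = base @ R_true.T + np.array([0.05, -0.02])
R_est, t_est = icp(base, moved)
```

ICP of this kind converges only from a reasonably close initial pose, which is why the paper segments stable structures (vessels, ONH) to provide the point sets before registering.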

  10. A comparative critical analysis of modern task-parallel runtimes.

    Energy Technology Data Exchange (ETDEWEB)

    Wheeler, Kyle Bruce; Stark, Dylan; Murphy, Richard C.

    2012-12-01

    The rise in node-level parallelism has increased interest in task-based parallel runtimes for a wide array of application areas. Applications have a wide variety of task spawning patterns which frequently change during the course of application execution, based on the algorithm or solver kernel in use. Task scheduling and load balance regimes, however, are often highly optimized for specific patterns. This paper uses four basic task spawning patterns to quantify the impact of specific scheduling policy decisions on execution time. We compare the behavior of six publicly available tasking runtimes: Intel Cilk, Intel Threading Building Blocks (TBB), Intel OpenMP, GCC OpenMP, Qthreads, and High Performance ParalleX (HPX). With the exception of Qthreads, the runtimes prove to have schedulers that are highly sensitive to application structure. No runtime is able to provide the best performance in all cases, and those that do provide the best performance in some cases, unfortunately, provide extremely poor performance when application structure does not match the scheduler's assumptions.
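Two contrasting spawn shapes of the kind at issue (a flat parallel-for spawn versus a recursive divide-and-conquer tree) can be sketched with Python's standard-library executor; this toy is not drawn from the paper, whose runtimes express the same patterns in C/C++.

```python
from concurrent.futures import ThreadPoolExecutor

def flat_spawn(pool, work, n):
    """One parent spawns n independent tasks (parallel-for style)."""
    futures = [pool.submit(work, i) for i in range(n)]
    return [f.result() for f in futures]

def tree_spawn(pool, work, n):
    """Each task spawns one child and recurses (divide-and-conquer style)."""
    if n <= 1:
        return [work(0)]
    left = pool.submit(tree_spawn, pool, work, n // 2)
    right = tree_spawn(pool, work, n - n // 2)  # compute the other half inline
    return left.result() + right

with ThreadPoolExecutor(max_workers=4) as pool:
    a = flat_spawn(pool, lambda i: i * i, 8)   # flat: 8 sibling tasks
    b = tree_spawn(pool, lambda i: 1, 8)       # tree: log-depth nested tasks
```

A work-stealing scheduler tuned for the flat pattern sees one deep queue; the tree pattern instead produces many small queues at different depths, which is exactly the structural mismatch the paper measures.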

  11. Life sciences payload definition and integration study, task C and D. Volume 1: Management summary

    Science.gov (United States)

    1973-01-01

    The findings of a study to define the required payloads for conducting life science experiments in space are presented. The primary objectives of the study are: (1) identify research functions to be performed aboard life sciences spacecraft laboratories and necessary equipment, (2) develop conceptual designs of potential payloads, (3) integrate selected laboratory designs with space shuttle configurations, and (4) establish cost analysis of preliminary program planning.

  12. 49 CFR 236.1043 - Task analysis and basic requirements.

    Science.gov (United States)

    2010-10-01

    ... performance necessary to perform each task; (5) Develop a training and evaluation curriculum that includes... refresher training and evaluation at intervals specified in the PTCDP and PTCSP that includes classroom..., INSPECTION, MAINTENANCE, AND REPAIR OF SIGNAL AND TRAIN CONTROL SYSTEMS, DEVICES, AND APPLIANCES Positive...

  13. Screening Analysis : Volume 1, Description and Conclusions.

    Energy Technology Data Exchange (ETDEWEB)

    Bonneville Power Administration; Corps of Engineers; Bureau of Reclamation

    1992-08-01

    The SOR consists of three analytical phases leading to a Draft EIS. The first phase, Pilot Analysis, was performed for the purpose of testing the decision analysis methodology being used in the SOR. The Pilot Analysis is described later in this chapter. The second phase, Screening Analysis, examines all possible operating alternatives using a simplified analytical approach. It is described in detail in this and the next chapter. This document also presents the results of screening. The final phase, Full-Scale Analysis, will be documented in the Draft EIS and is intended to evaluate comprehensively the few best alternatives arising from the screening analysis. The purpose of screening is to analyze a wide variety of differing ways of operating the Columbia River system to test the reaction of the system to change. The many alternatives considered reflect the range of needs and requirements of the various river users and interests in the Columbia River Basin. While some of the alternatives might be viewed as extreme, the information gained from the analysis is useful in highlighting issues and conflicts in meeting operating objectives. Screening is also intended to develop a broad technical basis for evaluation, including regional experts, and to begin developing an evaluation capability for each river use that will support full-scale analysis. Finally, screening provides a logical method for examining all possible options and reaching a decision on a few alternatives worthy of full-scale analysis. An organizational structure was developed and staffed to manage and execute the SOR, specifically during the screening phase and the upcoming full-scale analysis phase. The organization involves ten technical work groups, each representing a particular river use. Several other groups exist to oversee or support the efforts of the work groups.

  14. Information architecture. Volume 2, Part 1: Baseline analysis summary

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-12-01

    The Department of Energy (DOE) Information Architecture, Volume 2, Baseline Analysis, is a collaborative and logical next-step effort in the processes required to produce a Departmentwide information architecture. The baseline analysis serves a diverse audience of program management and technical personnel and provides an organized way to examine the Department's existing or de facto information architecture. A companion document to Volume 1, The Foundations, it furnishes the rationale for establishing a Departmentwide information architecture. This volume, consisting of the Baseline Analysis Summary (part 1), Baseline Analysis (part 2), and Reference Data (part 3), is of interest to readers who wish to understand how the Department's current information architecture technologies are employed. The analysis identifies how and where current technologies support business areas, programs, sites, and corporate systems.

  15. Designing Preclinical Instruction for Psychomotor Skills (II)--Instructional Engineering: Task Analysis.

    Science.gov (United States)

    Knight, G. William; And Others

    1994-01-01

    The first step in engineering the instruction of dental psychomotor skills, task analysis, is explained. A chart details the procedural, cognitive, desired-criteria, and desired-performance analysis of a single task, occlusal preparation for amalgam restoration with carious lesion. (MSE)

  16. CADDIS Volume 4. Data Analysis: Download Software

    Science.gov (United States)

    Overview of the data analysis tools available for download on CADDIS. Provides instructions for downloading and installing CADStat, access to a Microsoft Excel macro for computing SSDs, and a brief overview of command-line use of R, a statistical software package.

  17. Diversion Path Analysis Handbook. Volume 1. Methodology

    International Nuclear Information System (INIS)

    Goodwin, K.E.; Schleter, J.C.; Maltese, M.D.K.

    1978-11-01

    Diversion Path Analysis (DPA) is a safeguards evaluation tool which is used to determine the vulnerability of the Material Control and Material Accounting (MC and MA) Subsystems to the threat of theft of Special Nuclear Material (SNM) by a knowledgeable Insider. The DPA team should consist of two individuals who have technical backgrounds. The implementation of DPA is divided into five basic steps: Information and Data Gathering, Process Characterization, Analysis of Diversion Paths, Results and Findings, and Documentation

  18. Corporate Data Network (CDN) data requirements task. Enterprise Model. Volume 1

    International Nuclear Information System (INIS)

    1985-11-01

    The NRC has initiated a multi-year program to centralize its information processing in a Corporate Data Network (CDN). The new information processing environment will include shared databases, telecommunications, office automation tools, and state-of-the-art software. Touche Ross and Company was contracted to perform a general data requirements analysis for shared databases and to develop a preliminary plan for implementation of the CDN concept. The Enterprise Model (Vol. 1) provided the NRC with agency-wide information requirements in the form of data entities and organizational demand patterns as the basis for clustering the entities into logical groups. The Data Dictionary (Vol. 2) provided the NRC with definitions and example attributes and properties for each entity. The Data Model (Vol. 3) defined logical databases and entity relationships within and between databases. The Preliminary Strategic Data Plan (Vol. 4) prioritized the development of databases and included a workplan and approach for implementation of the shared database component of the Corporate Data Network

  19. Corporate Data Network (CDN). Data Requirements Task. Preliminary Strategic Data Plan. Volume 4

    International Nuclear Information System (INIS)

    1985-11-01

    The NRC has initiated a multi-year program to centralize its information processing in a Corporate Data Network (CDN). The new information processing environment will include shared databases, telecommunications, office automation tools, and state-of-the-art software. Touche Ross and Company was contracted to perform a general data requirements analysis for shared databases and to develop a preliminary plan for implementation of the CDN concept. The Enterprise Model (Vol. 1) provided the NRC with agency-wide information requirements in the form of data entities and organizational demand patterns as the basis for clustering the entities into logical groups. The Data Dictionary (Vol. 2) provided the NRC with definitions and example attributes and properties for each entity. The Data Model (Vol. 3) defined logical databases and entity relationships within and between databases. The Preliminary Strategic Data Plan (Vol. 4) prioritized the development of databases and included a workplan and approach for implementation of the shared database component of the Corporate Data Network

  20. Research program for seismic qualification of nuclear plant electrical and mechanical equipment. Task 3. Recommendations for improvement of equipment qualification methodology and criteria. Volume 3

    International Nuclear Information System (INIS)

    Kana, D.D.; Pomerening, D.J.

    1984-08-01

    The Research Program for Seismic Qualification of Nuclear Plant Electrical and Mechanical Equipment spanned a period of three years and resulted in seven technical summary reports, each of which covered in detail the findings of different tasks and subtasks; these have been combined into five NUREG/CR volumes. Volume 3 presents recommendations for improvement of equipment qualification methodology and for procedural clarification/modification. A fifth category of recommendations identifies issues where adequate information does not exist to allow a recommendation to be made

  1. Research program for seismic qualification of nuclear plant electrical and mechanical equipment. Task 4. Use of fragility in seismic design of nuclear plant equipment. Volume 4

    International Nuclear Information System (INIS)

    Kana, D.D.; Pomerening, D.J.

    1984-08-01

    The Research Program for Seismic Qualification of Nuclear Plant Electrical and Mechanical Equipment spanned a period of three years and resulted in seven technical summary reports, each of which covered in detail the findings of different tasks and subtasks; these have been combined into five NUREG/CR volumes. Volume 4 presents a study of the use of fragility concepts in the design of nuclear plant equipment and compares the results of state-of-the-art proof testing with fragility testing

  2. Analysis of urea distribution volume in hemodialysis.

    Science.gov (United States)

    Maduell, F; Sigüenza, F; Caridad, A; Miralles, F; Serrato, F

    1994-01-01

    According to the urea kinetic model, the urea distribution volume (V) is considered to be that of body water and to be distributed in a single compartment. Since the V value is difficult to measure, it is common to use 58% of body weight, even though the true value may range from 35 to 75%. In this study, we calculated the value of V using an accurate method based on the total elimination of urea in the dialysate. We studied V, and also whether different dialysis characteristics modify it. Thirty-five patients were included in this study, 19 men and 16 women, on a chronic hemodialysis programme. The dialysate was collected in a graduated tank, and the concentrations of urea in plasma and in dialysate were determined every hour. Every patient received six dialysis sessions, varying the blood flow (250 or 350 ml/min), the ultrafiltration rate (0.5 or 1.5 l/h), the membrane (cuprophane or polyacrylonitrile) and/or the buffer (bicarbonate or acetate). At the end of the hemodialysis session, the V value ranged from 43 to 72% of body weight; nevertheless, this value was practically constant within each patient. The V value gradually increased throughout the dialysis session: 42.1 +/- 6.9% of body weight in the first hour, 50.7 +/- 7.5% in the second hour and 55.7 +/- 7.9% at the end of the session. Changing the blood flow, ultrafiltration, membrane or buffer did not alter the results. The V value was significantly higher in men than in women, 60.0 +/- 6.6% vs. 50.5 +/- 5.9% of body weight (p < 0.001).
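    The single-pool arithmetic behind such a calculation can be sketched as follows. This is a hedged illustration of the general principle only (total urea recovered in the dialysate divided by the fall in plasma urea concentration), not the authors' exact method; it ignores urea generation and post-dialysis rebound, and the function name and all numbers are hypothetical.

```python
# Single-pool sketch (illustrative, not the study's protocol): the urea
# distribution volume V is the mass of urea recovered in the dialysate
# divided by the drop in plasma urea concentration.

def urea_distribution_volume(dialysate_urea_mmol, c_plasma_pre, c_plasma_post):
    """V (litres) = urea mass removed / plasma concentration drop,
    under a single-compartment assumption with no urea generation."""
    if c_plasma_pre <= c_plasma_post:
        raise ValueError("plasma urea must fall during dialysis")
    return dialysate_urea_mmol / (c_plasma_pre - c_plasma_post)

# Hypothetical numbers: 800 mmol recovered, plasma urea 30 -> 10 mmol/l.
v_litres = urea_distribution_volume(800.0, 30.0, 10.0)
body_weight_kg = 70.0
v_percent = 100.0 * v_litres / body_weight_kg
```

    With these illustrative numbers, V = 40 l, i.e. about 57% of a 70 kg body weight, which sits inside the 43-72% range reported above.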

  3. Geometric nonlinear functional analysis volume 1

    CERN Document Server

    Benyamini, Yoav

    1999-01-01

    The book presents a systematic and unified study of geometric nonlinear functional analysis. This area has its classical roots in the beginning of the twentieth century and is now a very active research area, having close connections to geometric measure theory, probability, classical analysis, combinatorics, and Banach space theory. The main theme of the book is the study of uniformly continuous and Lipschitz functions between Banach spaces (e.g., differentiability, stability, approximation, existence of extensions, fixed points, etc.). This study leads naturally also to the classification of

  4. Statistical analysis of rockfall volume distributions: Implications for rockfall dynamics

    Science.gov (United States)

    Dussauge, Carine; Grasso, Jean-Robert; Helmstetter, Agnès

    2003-06-01

    We analyze the volume distribution of natural rockfalls in different geological settings (i.e., calcareous cliffs in the French Alps, Grenoble area, and granite Yosemite cliffs, California Sierra) and over different volume ranges (i.e., regional and worldwide catalogs). Contrary to previous studies that included several types of landslides, we restrict our analysis to rockfall sources that originated on subvertical cliffs. For the three data sets, we find that the rockfall volumes follow a power law distribution with a similar exponent value, within error bars. This power law distribution was also proposed for rockfall volumes that occurred along road cuts. All these results argue for a recurrent power law distribution of rockfall volumes on subvertical cliffs over a large range of rockfall sizes (10^2 to 10^10 m^3), regardless of the geological settings and of the preexisting geometry of fracture patterns, which are drastically different in the three studied areas. The power law distribution of rockfall volumes could emerge from two types of processes. First, the observed power law distribution of rockfall volumes is similar to the one reported for both fragmentation experiments and fragmentation models. This argues for the geometry of rock mass fragment sizes possibly controlling the rockfall volumes; in that case neither cascade nor avalanche processes would influence the rockfall volume distribution. Second, without any requirement of scale-invariant quenched heterogeneity patterns, the rock mass dynamics can arise from avalanche processes driven by fluctuations of the rock mass properties, e.g., cohesion or friction angle. This model may also explain the power law distribution reported for landslides involving unconsolidated materials. We find that the exponent value for rockfall volumes on subvertical cliffs, 0.5 ± 0.2, is significantly smaller than the 1.2 ± 0.3 value reported for mixed landslide types. This change of exponents can be driven by the material strength, which
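    The exponent b of a cumulative volume distribution N(V >= v) ~ v^(-b) is commonly estimated by maximum likelihood. The sketch below is a generic illustration of that standard estimator on synthetic data, not the authors' processing chain; the catalog values are simulated.

```python
# Maximum-likelihood (Hill/Aki-type) estimate of the power-law exponent b
# for N(V >= v) ~ v**(-b), applied to a synthetic rockfall catalog.
import math
import random

def power_law_exponent_mle(volumes, v_min):
    """b_hat = n / sum(ln(v_i / v_min)) over all volumes v_i >= v_min."""
    logs = [math.log(v / v_min) for v in volumes if v >= v_min]
    return len(logs) / sum(logs)

# Synthetic catalog drawn from a b = 0.5 power law by inverse-transform sampling.
random.seed(0)
b_true, v_min = 0.5, 1.0
sample = [v_min * random.random() ** (-1.0 / b_true) for _ in range(5000)]
b_hat = power_law_exponent_mle(sample, v_min)
```

    With 5000 synthetic volumes drawn from a b = 0.5 power law, the estimate falls close to 0.5, the value reported above for subvertical cliffs.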

  5. Financial Mathematical Tasks in a Middle School Mathematics Textbook Series: A Content Analysis

    Science.gov (United States)

    Hamburg, Maryanna P.

    2009-01-01

    This content analysis examined the distribution of financial mathematical tasks (FMTs), mathematical tasks that contain financial terminology and require financially related solutions, across the National Standards in K-12 Personal Finance Education categories (JumpStart Coalition, 2007), the thinking skills as identified by "A Taxonomy for…

  6. Task Assignment for Multi-UAV under Severe Uncertainty by Using Stochastic Multicriteria Acceptability Analysis

    Directory of Open Access Journals (Sweden)

    Xiaoxuan Hu

    2015-01-01

    This paper considers a task assignment problem for multiple unmanned aerial vehicles (UAVs). The UAVs are set to perform attack tasks on a collection of ground targets in a severely uncertain environment. The UAVs have different attack capabilities and are located at different positions. Each UAV should be assigned an attack task before the mission starts. Due to uncertain information, many criteria values essential to task assignment were random or fuzzy, and the weights of the criteria were not precisely known. In this study, a novel task assignment approach based on stochastic multicriteria acceptability analysis (SMAA) was proposed to address this problem. The uncertainties in the criteria were analyzed, and a task assignment procedure was designed. The results of simulation experiments show that the proposed approach is useful for finding a satisfactory assignment under severely uncertain circumstances.
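    The core SMAA output, the rank-1 acceptability index, can be approximated by Monte Carlo: sample criterion values from their uncertainty distributions and weights uniformly from the simplex, then count how often each alternative scores highest. The sketch below is a generic illustration of that idea, not the paper's algorithm; the two-UAV setup and all distributions are hypothetical.

```python
# Monte Carlo rank-1 acceptability indices (generic SMAA sketch).
import random

def smaa_rank1_acceptability(criteria_samplers, n_draws=10000, seed=1):
    """criteria_samplers[i][j]() draws a value of criterion j for alternative i.
    Weights are sampled uniformly from the simplex; returns, per alternative,
    the share of draws in which it has the highest weighted score."""
    rng = random.Random(seed)
    n_crit = len(criteria_samplers[0])
    wins = [0] * len(criteria_samplers)
    for _ in range(n_draws):
        cuts = sorted(rng.random() for _ in range(n_crit - 1))
        w = [b - a for a, b in zip([0.0] + cuts, cuts + [1.0])]  # uniform simplex
        scores = [sum(wj * draw() for wj, draw in zip(w, samplers))
                  for samplers in criteria_samplers]
        wins[scores.index(max(scores))] += 1
    return [c / n_draws for c in wins]

# Two hypothetical UAVs, two uncertain criteria (e.g. capability, survivability).
random.seed(7)
uav_a = [lambda: random.gauss(0.8, 0.05), lambda: random.gauss(0.6, 0.05)]
uav_b = [lambda: random.gauss(0.5, 0.05), lambda: random.gauss(0.5, 0.05)]
acc = smaa_rank1_acceptability([uav_a, uav_b])
```

    Because UAV A stochastically dominates UAV B on both criteria here, its rank-1 acceptability comes out close to 1; under overlapping distributions, the indices quantify how robustly each assignment is preferred.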

  7. Structural Design of HRA Database using generic task for Quantitative Analysis of Human Performance

    International Nuclear Information System (INIS)

    Kim, Seung Hwan; Kim, Yo Chan; Choi, Sun Yeong; Park, Jin Kyun; Jung Won Dea

    2016-01-01

    This paper describes the design of a generic-task-based HRA database for quantitative analysis of human performance, intended to estimate the number of task conductions. Estimating the total number of task conductions by direct counting is difficult to realize, and its data collection framework is hard to maintain. To resolve this problem, this paper suggests an indirect method and a database structure using generic tasks that enable the total number of conductions to be estimated from the instructions in the operating procedures of nuclear power plants. To reduce human errors, all information on the errors made by operators in the power plant should be systematically collected and examined. The Korea Atomic Energy Research Institute (KAERI) is carrying out research to develop a data collection framework for a Human Reliability Analysis (HRA) database that could be employed as a technical basis for generating human error probabilities (HEPs) and performance shaping factors (PSFs). As a result of the study, the essential table schema was designed for the generic task database, which stores generic tasks, procedure lists, task tree structures, and other supporting tables. Estimating the number of task conductions from the operating procedures for HEP estimation was enabled through the generic task database and framework. To verify the applicability of the framework, a case study of simulated experiments was performed and analyzed using graphical user interfaces developed in this study.

  8. Structural Design of HRA Database using generic task for Quantitative Analysis of Human Performance

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Seung Hwan; Kim, Yo Chan; Choi, Sun Yeong; Park, Jin Kyun; Jung Won Dea [KAERI, Daejeon (Korea, Republic of)

    2016-05-15

    This paper describes the design of a generic-task-based HRA database for quantitative analysis of human performance, intended to estimate the number of task conductions. Estimating the total number of task conductions by direct counting is difficult to realize, and its data collection framework is hard to maintain. To resolve this problem, this paper suggests an indirect method and a database structure using generic tasks that enable the total number of conductions to be estimated from the instructions in the operating procedures of nuclear power plants. To reduce human errors, all information on the errors made by operators in the power plant should be systematically collected and examined. The Korea Atomic Energy Research Institute (KAERI) is carrying out research to develop a data collection framework for a Human Reliability Analysis (HRA) database that could be employed as a technical basis for generating human error probabilities (HEPs) and performance shaping factors (PSFs). As a result of the study, the essential table schema was designed for the generic task database, which stores generic tasks, procedure lists, task tree structures, and other supporting tables. Estimating the number of task conductions from the operating procedures for HEP estimation was enabled through the generic task database and framework. To verify the applicability of the framework, a case study of simulated experiments was performed and analyzed using graphical user interfaces developed in this study.

  9. Life sciences payload definition and integration study, task C and D. Volume 2: Payload definition, integration, and planning studies

    Science.gov (United States)

    1973-01-01

    The Life Sciences Payload Definition and Integration Study was composed of four major tasks. Tasks A and B, the laboratory definition phase, were the subject of prior NASA study. The laboratory definition phase included the establishment of research functions, equipment definitions, and conceptual baseline laboratory designs. These baseline laboratories were designated as Maxi-Nom, Mini-30, and Mini-7. The outputs of Tasks A and B were used by the NASA Life Sciences Payload Integration Team to establish guidelines for Tasks C and D, the laboratory integration phase of the study. A brief review of Tasks A and B is presented to provide background continuity. The Tasks C and D effort is the subject of this report. The Task C effort stressed the integration of the NASA-selected laboratory designs with the shuttle sortie module. The Task D effort updated and developed costs that could be used by NASA for preliminary program planning.

  10. Task modulated brain connectivity of the amygdala: a meta-analysis of psychophysiological interactions.

    Science.gov (United States)

    Di, Xin; Huang, Jia; Biswal, Bharat B

    2017-01-01

    Understanding the functional connectivity of the amygdala with other brain regions, especially task modulated connectivity, is a critical step toward understanding the role of the amygdala in emotional processes and the interactions between emotion and cognition. The present study performed coordinate-based meta-analysis on studies of task modulated connectivity of the amygdala that used psychophysiological interaction (PPI) analysis. We first analyzed 49 PPI studies on different types of tasks using activation likelihood estimation (ALE) meta-analysis. Widespread cortical and subcortical regions showed consistent task modulated connectivity with the amygdala, including the medial frontal cortex, bilateral insula, anterior cingulate, fusiform gyrus, parahippocampal gyrus, thalamus, and basal ganglia. These regions in general overlapped with those that showed coactivation with the amygdala, suggesting that these regions and the amygdala are not only activated together, but also show different levels of interaction during tasks. Further analyses with subsets of PPI studies revealed task-specific functional connectivity with the amygdala modulated by fear processing, face processing, and emotion regulation. These results suggest a dynamic modulation of connectivity upon task demands, and provide new insights on the functions of the amygdala in different affective and cognitive processes. The meta-analytic approach on PPI studies may offer a framework toward systematic examination of task modulated connectivity.

  11. Frequency Analysis of Gradient Estimators in Volume Rendering

    NARCIS (Netherlands)

    Bentum, Marinus Jan; Lichtenbelt, Barthold B.A.; Malzbender, Tom

    1996-01-01

    Gradient information is used in volume rendering to classify and color samples along a ray. In this paper, we present an analysis of the theoretically ideal gradient estimator and compare it to some commonly used gradient estimators. A new method is presented to calculate the gradient at arbitrary
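    The baseline that frequency-domain analyses of gradient filters are usually compared against is the central difference estimator. A minimal sketch on a synthetic volume (illustrative only, not the paper's code):

```python
# Central-difference gradient estimation at an interior voxel of a
# regularly sampled scalar volume (unit grid spacing assumed).

def central_difference_gradient(volume, x, y, z):
    """volume is a 3D nested list indexed volume[z][y][x]; returns (gx, gy, gz)."""
    gx = (volume[z][y][x + 1] - volume[z][y][x - 1]) / 2.0
    gy = (volume[z][y + 1][x] - volume[z][y - 1][x]) / 2.0
    gz = (volume[z + 1][y][x] - volume[z - 1][y][x]) / 2.0
    return gx, gy, gz

# Synthetic volume f(x, y, z) = x + 2*y + 3*z has constant gradient (1, 2, 3).
n = 5
vol = [[[x + 2 * y + 3 * z for x in range(n)]
        for y in range(n)] for z in range(n)]
g = central_difference_gradient(vol, 2, 2, 2)
```

    For a linear field the central difference is exact; the frequency analysis in papers like this one quantifies how far such filters fall short of the ideal estimator for band-limited data.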

  12. Simplifying the spectral analysis of the volume operator

    NARCIS (Netherlands)

    Loll, R.

    1997-01-01

    The volume operator plays a central role in both the kinematics and dynamics of canonical approaches to quantum gravity which are based on algebras of generalized Wilson loops. We introduce a method for simplifying its spectral analysis, for quantum states that can be realized on a cubic

  13. Linkage Technologies Which Enhance the Utility of Task-Based Occupational Analysis

    National Research Council Canada - National Science Library

    Phalen, William

    1999-01-01

    .... It is alleged that traditional task-based occupational analysis is too labor intensive, too costly, too cumbersome, and too static to meet the emerging and rapidly changing needs of a business...

  14. Cognitive and collaborative demands of freight conductor activities: results and implications of a cognitive task analysis

    Science.gov (United States)

    2012-07-31

    This report presents the results of a cognitive task analysis (CTA) that examined the cognitive and collaborative demands placed on conductors, as well as the knowledge and skills that experienced conductors have developed that enable them to operate...

  15. Task versus relationship conflict, team performance and team member satisfaction: a meta-analysis

    NARCIS (Netherlands)

    de Dreu, C.K.W.; Weingart, L.R.

    2003-01-01

    This study provides a meta-analysis of research on the associations between relationship conflict, task conflict, team performance, and team member satisfaction. Consistent with past theorizing, results revealed strong and negative correlations between relationship conflict, team performance, and

  16. A Task Analysis of Underway Replenishment for Virtual Environment Ship-Handling Simulator Scenario Development

    National Research Council Canada - National Science Library

    Norris, Steven

    1998-01-01

    ...) in Newport, RI, researchers at the Naval Air Warfare Center Training Systems Division (NAWCTSD) in Orlando, FL discovered a need for a task analysis of a Conning Officer during an Underway Replenishment...

  17. Using link analysis to explore the impact of the physical environment on pharmacist tasks.

    Science.gov (United States)

    Lester, Corey A; Chui, Michelle A

    2016-01-01

    National community pharmacy organizations have been redesigning pharmacies to better facilitate direct patient care. However, evidence suggests that changing the physical layout of a pharmacy prior to understanding how the environment impacts pharmacists' work may not achieve the desired benefits. This study describes an objective method for understanding how the physical layout of the pharmacy may affect how pharmacists perform tasks. Link analysis is a systems engineering method used to describe the influence of the physical environment on task completion. This study used a secondary data set of field notes collected during 9 h of direct observation in one mass-merchandise community pharmacy in the U.S. state of Wisconsin. A node is an individual location in the environment. A link is the movement between two nodes. Tasks were inventoried and task themes identified. The mean, minimum, and maximum number of links needed to complete each task were then determined and used to construct a link table. A link diagram is a graphical display showing the links in conjunction with the physical layout of the pharmacy. A total of 92 unique tasks were identified, resulting in 221 links. Tasks were sorted into five themes: patient care activities, insurance issues, verifying prescriptions, filling prescriptions, and other. Insurance issues required the greatest number of links, with a mean of 4.75. Verifying prescriptions and performing patient care were the most commonly performed tasks, with 36 and 30 unique task occurrences, respectively. Link analysis provides an objective method for identifying how a pharmacist interacts with the physical environment to complete tasks. This method provides designers with useful information to target interventions to improve the effectiveness of pharmacist work. Analysis beyond link analysis should be considered for large-scale system redesign. Copyright © 2015 Elsevier Inc. All rights reserved.
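    The counting step of a link analysis can be sketched directly: record the sequence of nodes visited for each task, then tally movements between consecutive nodes into a link table. The node names and task sequences below are hypothetical, not data from the observed pharmacy.

```python
# Building a link table (counts of movements between node pairs) from
# observed per-task node sequences; links are treated as undirected.
from collections import Counter

def link_table(task_sequences):
    """Count links across tasks; (A, B) and (B, A) are the same link."""
    links = Counter()
    for nodes in task_sequences:
        for a, b in zip(nodes, nodes[1:]):
            links[tuple(sorted((a, b)))] += 1
    return links

# Hypothetical node visits for three observed tasks:
tasks = [
    ["drop-off", "computer", "fill-station", "verify-station"],  # fill Rx
    ["computer", "phone", "computer"],                           # insurance issue
    ["verify-station", "pick-up", "counseling-area"],            # patient care
]
links = link_table(tasks)
```

    Summing a task's link counts gives the number of links needed to complete it, from which the per-theme mean, minimum, and maximum reported above can be derived.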

  18. Pull Incentives for Antibacterial Drug Development: An Analysis by the Transatlantic Task Force on Antimicrobial Resistance.

    Science.gov (United States)

    Årdal, Christine; Røttingen, John-Arne; Opalska, Aleksandra; Van Hengel, Arjon J; Larsen, Joseph

    2017-10-15

    New alternative market models are needed to incentivize companies to invest in developing new antibacterial drugs. In a previous publication, the Transatlantic Task Force on Antimicrobial Resistance (TATFAR) summarized the key areas of consensus for economic incentives for antibacterial drug development. That work determined that there was substantial agreement on the need for a mixture of push and pull incentives and particularly those that served to delink the revenues from the volumes sold. Pull incentives reward successful development by increasing or ensuring future revenue. Several pull incentives have been proposed that could substantially reward the development of new antibacterial drugs. In this second article authored by representatives of TATFAR, we examine the advantages and disadvantages of different pull incentives for antibacterial drug development. It is TATFAR's hope that this analysis, combined with other related analyses, will provide actionable information that will shape policy makers' thinking on this important issue. Published by Oxford University Press for the Infectious Diseases Society of America 2017. This work is written by (a) US Government employee(s) and is in the public domain in the US.

  19. Sectional analysis for volume determination and selection of volume equations for the Tapajós National Forest

    Directory of Open Access Journals (Sweden)

    Renato Bezerra da Silva Ribeiro

    2014-12-01

    The aim of this study was to analyze different section lengths for volume determination and to fit volumetric models for estimating timber production in a forest management area of the Tapajós National Forest (FNT). Six sectioning treatments were tested on 152 logs of 12 commercial species. The obtained volumes were statistically compared by analysis of variance (ANOVA) to choose the best method of sectioning and to calculate the actual volume of 2,094 sample trees in different commercial diameter classes. Ten mathematical models were fitted to the whole data set and to the species Manilkara huberi (Ducke) Chevalier (maçaranduba), Lecythis lurida (Miers) S.A.Mori (jarana) and Hymenaea courbaril L. (jatobá). The criteria used to choose the best model were the adjusted coefficient of determination as a percentage (R²adj%), the standard error of the estimate as a percentage (Syx%), significance of the parameters, normality of residuals, the variance inflation factor (VIF) and the graphic distribution of residuals. There was no statistical difference between the sectioning methods, and thus using the total length of the logs was more operational in the field. The logarithmic forms of the Schumacher-Hall and Spurr models were the best for estimating volume, both for the named species and for the whole sample set.
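    As an illustration of the model-fitting step, the logarithmic Spurr model ln(V) = b0 + b1·ln(d²h) reduces to simple linear regression in the log-transformed variables. The sketch below fits it to synthetic tree data (not the FNT sample); variable names are illustrative.

```python
# Fitting the logarithmic Spurr volume model ln(V) = b0 + b1*ln(d^2 * h)
# by ordinary least squares (simple linear regression in log space).
import math

def fit_spurr_log(diameters_cm, heights_m, volumes_m3):
    xs = [math.log(d * d * h) for d, h in zip(diameters_cm, heights_m)]
    ys = [math.log(v) for v in volumes_m3]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b1 = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
          / sum((x - mx) ** 2 for x in xs))
    b0 = my - b1 * mx
    return b0, b1

# Synthetic trees generated exactly from V = 0.0001 * (d^2 * h)**0.95:
d = [30.0, 45.0, 60.0, 80.0]
h = [18.0, 22.0, 26.0, 30.0]
v = [0.0001 * (di * di * hi) ** 0.95 for di, hi in zip(d, h)]
b0, b1 = fit_spurr_log(d, h, v)
```

    Because the synthetic data follow the model exactly, the regression recovers b1 = 0.95 and b0 = ln(0.0001); with real inventory data, the fit statistics named above (R²adj%, Syx%, VIF) discriminate between candidate models.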

  20. Advanced Integrated Multi-Sensor Surveillance (AIMS): Mission, Function, Task Analysis

    Science.gov (United States)

    2007-06-01

    Advanced Integrated Multi-sensor Surveillance (AIMS): Mission, Function, Task Analysis. Kevin Baker and Gord Youngson, CAE, Canada. The surviving fragment of the French abstract, translated: "...the principal operators. It is recommended that this be taken into account during the integration of the AIMS system into the FWSAR and CP-140 (PIMPA) projects."

  1. Advanced Integrated Multi-sensor Surveillance (AIMS). Mission, Function, Task Analysis

    Science.gov (United States)

    2007-06-01

    Defence R&D Canada – Atlantic. Advanced Integrated Multi-sensor Surveillance (AIMS): Mission, Function, Task Analysis. Kevin Baker and Gord Youngson, CAE Professional Services. The surviving fragment of the French abstract, translated: "...it is recommended that this be taken into account during the integration of the AIMS system into the FWSAR and CP-140 (PIMPA) projects." Baker, K. and Youngson, G. 2007.

  2. Space and Earth Sciences, Computer Systems, and Scientific Data Analysis Support, Volume 1

    Science.gov (United States)

    Estes, Ronald H. (Editor)

    1993-01-01

    This Final Progress Report covers the specific technical activities of Hughes STX Corporation for the last contract triannual period of 1 June through 30 Sep. 1993, in support of assigned task activities at Goddard Space Flight Center (GSFC). It also provides a brief summary of work throughout the contract period of performance on each active task. Technical activity is presented in Volume 1, while financial and level-of-effort data is presented in Volume 2. Technical support was provided to all Division and Laboratories of Goddard's Space Sciences and Earth Sciences Directorates. Types of support include: scientific programming, systems programming, computer management, mission planning, scientific investigation, data analysis, data processing, data base creation and maintenance, instrumentation development, and management services. Mission and instruments supported include: ROSAT, Astro-D, BBXRT, XTE, AXAF, GRO, COBE, WIND, UIT, SMM, STIS, HEIDI, DE, URAP, CRRES, Voyagers, ISEE, San Marco, LAGEOS, TOPEX/Poseidon, Pioneer-Venus, Galileo, Cassini, Nimbus-7/TOMS, Meteor-3/TOMS, FIFE, BOREAS, TRMM, AVHRR, and Landsat. Accomplishments include: development of computing programs for mission science and data analysis, supercomputer applications support, computer network support, computational upgrades for data archival and analysis centers, end-to-end management for mission data flow, scientific modeling and results in the fields of space and Earth physics, planning and design of GSFC VO DAAC and VO IMS, fabrication, assembly, and testing of mission instrumentation, and design of mission operations center.

  3. Dashboard Task Monitor for Managing ATLAS User Analysis on the Grid

    Science.gov (United States)

    Sargsyan, L.; Andreeva, J.; Jha, M.; Karavakis, E.; Kokoszkiewicz, L.; Saiz, P.; Schovancova, J.; Tuckett, D.; Atlas Collaboration

    2014-06-01

    The organization of the distributed user analysis on the Worldwide LHC Computing Grid (WLCG) infrastructure is one of the most challenging tasks among the computing activities at the Large Hadron Collider. The Experiment Dashboard offers a solution that not only monitors but also manages (kill, resubmit) user tasks and jobs via a web interface. The ATLAS Dashboard Task Monitor provides analysis users with a tool that is independent of the operating system and Grid environment. This contribution describes the functionality of the application and its implementation details, in particular authentication, authorization and audit of the management operations.

  4. Dashboard task monitor for managing ATLAS user analysis on the grid

    International Nuclear Information System (INIS)

    Sargsyan, L; Andreeva, J; Karavakis, E; Saiz, P; Tuckett, D; Jha, M; Kokoszkiewicz, L; Schovancova, J

    2014-01-01

    The organization of the distributed user analysis on the Worldwide LHC Computing Grid (WLCG) infrastructure is one of the most challenging tasks among the computing activities at the Large Hadron Collider. The Experiment Dashboard offers a solution that not only monitors but also manages (kill, resubmit) user tasks and jobs via a web interface. The ATLAS Dashboard Task Monitor provides analysis users with a tool that is independent of the operating system and Grid environment. This contribution describes the functionality of the application and its implementation details, in particular authentication, authorization and audit of the management operations.

  5. Volume conduction effects on wavelet cross-bicoherence analysis

    International Nuclear Information System (INIS)

    Memon, I.A.; Channa, C.

    2013-01-01

    Cross-bicoherence analysis is one of the important nonlinear signal processing tools used to measure quadratic phase coupling between the frequencies of two different time series. It is frequently used in EEG (electroencephalography) analysis for the diagnosis of various cognitive and neurological disorders. Volume conduction effects of various uncorrelated sources present in the brain can introduce bias into the estimated values of the cross-bicoherence function. Previous studies have discussed volume conduction effects on the coherence function, which measures the linear relationship between EEG signals in terms of their phase and amplitude. However, to the best of our knowledge, the volume conduction effect on cross-bicoherence analysis, which is quite a different technique, has not been investigated until now. This study is divided into two major parts. The first part investigates the characteristics of VCUS (volume conduction effects due to uncorrelated sources) in EEG cross-bicoherence analysis, using simulated EEG data due to uncorrelated sources present in the brain. The second part investigates the effects of VCUS on the statistical analysis of the results of EEG-based cross-bicoherence analysis. The study provides an important clinical application because most studies based on EEG cross-bicoherence analysis have avoided the issue of VCUS. The cross-bicoherence analysis was performed by detecting the change in the MSCB (magnitude square cross-bicoherence function) between the EEG activities of change-detection and no-change-detection trials, using real EEG signals. (author)

  6. Mission Task Analysis for the NATO Defence Requirements Review

    National Research Council Canada - National Science Library

    Armstrong, Stuart

    2005-01-01

    This paper gives a general outline of the NATO Defense Requirements Review (DRR) and how mission analysis has been used to provide a consistent and detailed approach to the decomposition of complex military missions...

  7. Analysis of the chemical equilibrium of combustion at constant volume

    Directory of Open Access Journals (Sweden)

    Marius BREBENEL

    2014-04-01

    Determining the composition of a mixture of combustion gases at a given temperature is based on chemical equilibrium, where the equilibrium constants are usually calculated under the assumption of constant pressure and temperature. In this paper, an analysis of the changes occurring when combustion takes place at constant volume is presented, deriving a specific formula for the equilibrium constant. The simple reaction of carbon combustion in pure oxygen in both cases (constant pressure and constant volume) is then considered as an example of application, observing the changes occurring in the composition of the combustion gases depending on temperature.
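    The paper derives its own constant-volume formula, but the standard textbook link between the pressure-based and concentration-based equilibrium constants is Kp = Kc·(RT)^Δn, where Δn is the change in moles of gas across the reaction. A minimal sketch of that relation (the reaction and numbers are illustrative, not taken from the paper):

```python
# Standard relation Kp = Kc * (R*T)**delta_n between the pressure-based and
# concentration-based equilibrium constants (SI units: Kc in mol/m^3 terms,
# Kp in Pa terms). Example reaction and values are illustrative only.
R = 8.314  # J/(mol*K), molar gas constant

def kp_from_kc(kc, temperature_k, delta_n_gas):
    """delta_n_gas = moles of gaseous products minus gaseous reactants."""
    return kc * (R * temperature_k) ** delta_n_gas

# For 2 CO + O2 <=> 2 CO2, delta_n = 2 - 3 = -1:
kp = kp_from_kc(1.0e6, 2000.0, -1)
```

    For reactions with Δn = 0 (such as C + O2 -> CO2 counting only gas-phase species), the two constants coincide, which is why the constant-volume correction matters most when combustion changes the mole count of the gas.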

  8. Hydrothermal analysis in engineering using control volume finite element method

    CERN Document Server

    Sheikholeslami, Mohsen

    2015-01-01

    Control volume finite element methods (CVFEM) bridge the gap between finite difference and finite element methods, using the advantages of both methods for simulation of multi-physics problems in complex geometries. In Hydrothermal Analysis in Engineering Using Control Volume Finite Element Method, CVFEM is covered in detail and applied to key areas of thermal engineering. Examples, exercises, and extensive references are used to show the use of the technique to model key engineering problems such as heat transfer in nanofluids (to enhance performance and compactness of energy systems),

  9. Considerations for Task Analysis Methods and Rapid E-Learning Development Techniques

    Directory of Open Access Journals (Sweden)

    Dr. Ismail Ipek

    2014-02-01

    The purpose of this paper is to provide basic dimensions for rapid training development in e-learning courses in education and business. It starts by defining task analysis, how to select tasks for analysis, and task analysis methods for instructional design. First, learning and instructional technologies are discussed as visions of the future. Second, the importance of task analysis methods in rapid e-learning is considered, along with learning technologies for asynchronous and synchronous e-learning development. Finally, rapid instructional design concepts and e-learning design strategies are defined and clarified with examples; that is, all steps for effective task analysis and rapid training development based on learning and instructional design approaches are discussed, including m-learning and other delivery systems. The paper closes with the concept of task analysis, rapid e-learning development strategies, and the essentials of online course design, alongside learner interface design features for learners and designers.

  10. Task-related component analysis for functional neuroimaging and application to near-infrared spectroscopy data.

    Science.gov (United States)

    Tanaka, Hirokazu; Katura, Takusige; Sato, Hiroki

    2013-01-01

    Reproducibility of experimental results lies at the heart of scientific disciplines. Here we propose a signal processing method that extracts task-related components by maximizing their reproducibility during task periods in neuroimaging data. Unlike hypothesis-driven methods such as general linear models, no specific time courses are presumed, and unlike data-driven approaches such as independent component analysis, no arbitrary interpretation of components is needed. Task-related components are constructed as a linear, weighted sum of multiple time courses, and the weights are optimized so as to maximize inter-block correlations (CorrMax) or covariances (CovMax). Our analysis method is referred to as task-related component analysis (TRCA). The covariance maximization is formulated as a Rayleigh-Ritz eigenvalue problem, and the corresponding eigenvectors give candidates for task-related components. In addition, a systematic statistical test based on eigenvalues is proposed, so that task-related and task-unrelated components are classified objectively and automatically. The proposed test of statistical significance is found to be independent of the degree of autocorrelation in the data if the task duration is sufficiently longer than the temporal scale of the autocorrelation, so TRCA can be applied to data with autocorrelation without any modification. We demonstrate that simple extensions of TRCA can provide the most distinctive signals for two tasks and can integrate multiple modalities of information to remove task-unrelated artifacts. TRCA was successfully applied to synthetic data as well as near-infrared spectroscopy (NIRS) data of finger tapping. There were two statistically significant task-related components: one was a hemodynamic response, and the other was a piece-wise linear time course. In summary, we conclude that TRCA has a wide range of applications in multi-channel biophysical and behavioral measurements. Copyright © 2012 Elsevier Inc. All rights reserved.
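The CovMax variant described above can be sketched as a generalized eigenvalue problem. This is our reading of the abstract, not the authors' reference code; equal-length blocks and the `trca_weights` name are assumptions.

```python
import numpy as np

def trca_weights(blocks):
    """CovMax TRCA sketch: blocks is a list of equal-shaped
    (channels, samples) arrays, one per task block. Returns weight
    vectors (columns) ordered by decreasing inter-block covariance."""
    centered = [b - b.mean(axis=1, keepdims=True) for b in blocks]
    X = np.concatenate(centered, axis=1)
    Q = X @ X.T                          # total covariance
    S = np.zeros_like(Q)
    for i, Xi in enumerate(centered):
        for j, Xj in enumerate(centered):
            if i != j:
                S += Xi @ Xj.T           # inter-block cross-covariance
    # Rayleigh-Ritz: eigenvectors of Q^{-1} S maximize w'Sw / w'Qw
    vals, vecs = np.linalg.eig(np.linalg.solve(Q, S))
    order = np.argsort(vals.real)[::-1]
    return vecs[:, order].real
```

Projecting each block onto the leading weight vector should yield time courses that are highly correlated across blocks, which is exactly the reproducibility criterion being maximized.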

  11. Micro analysis of fringe field formed inside LDA measuring volume

    International Nuclear Information System (INIS)

    Ghosh, Abhijit; Nirala, A K

    2016-01-01

    In the present study we propose a technique for micro analysis of fringe field formed inside laser Doppler anemometry (LDA) measuring volume. Detailed knowledge of the fringe field obtained by this technique allows beam quality, alignment and fringe uniformity to be evaluated with greater precision and may be helpful for selection of an appropriate optical element for LDA system operation. A complete characterization of fringes formed at the measurement volume using conventional, as well as holographic optical elements, is presented. Results indicate the qualitative, as well as quantitative, improvement of fringes formed at the measurement volume by holographic optical elements. Hence, use of holographic optical elements in LDA systems may be advantageous for improving accuracy in the measurement. (paper)
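For context, the fringe spacing in a dual-beam LDA measurement volume follows the standard relation d = λ / (2 sin(θ/2)), with θ the full beam intersection angle; the helper below is a generic sketch of that relation, not code from the paper.

```python
import math

def fringe_spacing(wavelength, beam_angle):
    """Fringe spacing in a dual-beam LDA measurement volume:
    d = lambda / (2 * sin(theta / 2)), theta in radians."""
    return wavelength / (2.0 * math.sin(beam_angle / 2.0))
```

For a He-Ne laser (632.8 nm) and a 10 degree intersection angle, the spacing comes out to about 3.63 micrometers.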

  12. Task versus relationship conflict, team performance, and team member satisfaction: a meta-analysis.

    Science.gov (United States)

    De Dreu, Carsten K W; Weingart, Laurie R

    2003-08-01

    This study provides a meta-analysis of research on the associations between relationship conflict, task conflict, team performance, and team member satisfaction. Consistent with past theorizing, results revealed strong and negative correlations between relationship conflict, team performance, and team member satisfaction. In contrast to what has been suggested in both academic research and introductory textbooks, however, results also revealed strong and negative (instead of the predicted positive) correlations between task conflict, team performance, and team member satisfaction. As predicted, conflict had stronger negative relations with team performance in highly complex (decision making, project, mixed) than in less complex (production) tasks. Finally, task conflict was less negatively related to team performance when task conflict and relationship conflict were weakly, rather than strongly, correlated.

  13. Task 11 - systems analysis of environmental management technologies. Topical report

    International Nuclear Information System (INIS)

    Musich, M.A.

    1997-06-01

    A review was conducted of three systems analysis (SA) studies performed by Lockheed Idaho Technologies Company (LITCO) on integrated thermal treatment systems (ITTs) and integrated nonthermal treatment systems (INTSs) for the remediation of mixed low-level waste (MLLW) stored throughout the U.S. Department of Energy (DOE) weapons complex. The review was performed by an independent team led by the Energy & Environment Research Center (EERC), including Science Applications International Corporation (SAIC), the Waste Policy Institute (WPI), and Virginia Tech.

  14. Task 11 - systems analysis of environmental management technologies

    Energy Technology Data Exchange (ETDEWEB)

    Musich, M.A.

    1997-06-01

    A review was conducted of three systems analysis (SA) studies performed by Lockheed Idaho Technologies Company (LITCO) on integrated thermal treatment systems (ITTs) and integrated nonthermal treatment systems (INTSs) for the remediation of mixed low-level waste (MLLW) stored throughout the U.S. Department of Energy (DOE) weapons complex. The review was performed by an independent team led by the Energy & Environment Research Center (EERC), including Science Applications International Corporation (SAIC), the Waste Policy Institute (WPI), and Virginia Tech.

  15. National Trends in Prostate Biopsy and Radical Prostatectomy Volumes Following the US Preventive Services Task Force Guidelines Against Prostate-Specific Antigen Screening.

    Science.gov (United States)

    Halpern, Joshua A; Shoag, Jonathan E; Artis, Amanda S; Ballman, Karla V; Sedrakyan, Art; Hershman, Dawn L; Wright, Jason D; Shih, Ya Chen Tina; Hu, Jim C

    2017-02-01

    Studies demonstrate that use of prostate-specific antigen screening decreased significantly following the US Preventive Services Task Force (USPSTF) recommendation against prostate-specific antigen screening in 2012. To determine downstream effects on practice patterns in prostate cancer diagnosis and treatment following the 2012 USPSTF recommendation. Procedural volumes of certifying and recertifying urologists from 2009 through 2016 were evaluated for variation in prostate biopsy and radical prostatectomy (RP) volume. Trends were confirmed using the New York Statewide Planning and Research Cooperative System and Nationwide Inpatient Sample. The study included a representative sample of urologists across practice settings and a nationally representative sample of all RP discharges. We obtained operative case logs from the American Board of Urology and identified urologists performing at least 1 prostate biopsy (n = 5173) or RP (n = 3748), respectively. The 2012 USPSTF recommendation against routine population-wide prostate-specific antigen screening. Change in median biopsy and RP volume per urologist and national procedural volume. Following the USPSTF recommendation, median biopsy volume per urologist decreased from 29 to 21 (interquartile range [IQR], 12-34; P following 2012 (parameter estimate, -0.25; SE, 0.03; P following the USPSTF recommendation, median RP volume per urologist decreased from 7 (IQR, 3-15) to 6 (IQR, 2-12) (P Following the 2012 USPSTF recommendation, prostate biopsy and RP volumes decreased significantly. A panoramic vantage point is needed to evaluate the long-term consequences of the 2012 USPSTF recommendation.

  16. Price-volume multifractal analysis of the Moroccan stock market

    Science.gov (United States)

    El Alaoui, Marwane

    2017-11-01

    In this paper, we analyzed the price-volume multifractal cross-correlations of the Moroccan Stock Exchange. We chose the period from January 1st, 2000 to January 20th, 2017 to investigate the multifractal behavior of the price change and volume change series. We used the multifractal detrended cross-correlation analysis method (MF-DCCA) and multifractal detrended fluctuation analysis (MF-DFA) to analyze the series. We computed the bivariate generalized Hurst exponent, Rényi exponent and singularity spectrum for each pair of indices to measure cross-correlations quantitatively. Furthermore, we used the detrended cross-correlation coefficient (DCCA) and the cross-correlation test (Q(m)) to analyze cross-correlation quantitatively and qualitatively. The results show the existence of price-volume multifractal cross-correlations, and the spectrum width indicates a strong multifractal cross-correlation. We observed that the volume change series is anti-persistent when analyzing the generalized Hurst exponent for all moments q. The cross-correlation test showed the presence of a significant cross-correlation. However, the DCCA coefficient had a small positive value, which means that the level of correlation is not very significant. Finally, we analyzed the sources of multifractality and their degree of contribution to the series.
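A minimal MF-DFA sketch shows how the generalized Hurst exponent h(q) is obtained as a log-log slope of the fluctuation function; the first-order detrending, scale grid, and function name are our choices (q must be nonzero in this form), and the full MF-DCCA variant additionally cross-multiplies detrended residuals of two series.

```python
import numpy as np

def mfdfa_hurst(x, scales, q):
    """Minimal MF-DFA sketch: generalized Hurst exponent h(q) as the
    slope of log F_q(s) versus log s, with linear detrending (q != 0)."""
    y = np.cumsum(x - np.mean(x))              # profile
    Fq = []
    for s in scales:
        n = len(y) // s
        t = np.arange(s)
        f2 = []
        for v in range(n):
            seg = y[v * s:(v + 1) * s]
            trend = np.polyval(np.polyfit(t, seg, 1), t)
            f2.append(np.mean((seg - trend) ** 2))  # segment variance
        Fq.append(np.mean(np.array(f2) ** (q / 2)) ** (1 / q))
    slope, _ = np.polyfit(np.log(scales), np.log(Fq), 1)
    return slope
```

For uncorrelated noise h(2) is near 0.5; persistent series give h(2) > 0.5 and anti-persistent series (as reported above for volume changes) give h(2) < 0.5.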

  17. SDSS Log Viewer: visual exploratory analysis of large-volume SQL log data

    Science.gov (United States)

    Zhang, Jian; Chen, Chaomei; Vogeley, Michael S.; Pan, Danny; Thakar, Ani; Raddick, Jordan

    2012-01-01

    User-generated Structured Query Language (SQL) queries are a rich source of information for database analysts, information scientists, and the end users of databases. In this study, a group of astronomers and computer and information scientists worked together to analyze a large volume of SQL log data generated by users of the Sloan Digital Sky Survey (SDSS) data archive in order to better understand users' data-seeking behavior. While statistical analysis of such logs is useful at aggregated levels, efficiently exploring specific patterns of queries is often a challenging task due to the typically large volume of the data, multivariate features, and data requirements specified in SQL queries. To enable and facilitate effective and efficient exploration of the SDSS log data, we designed an interactive visualization tool, called the SDSS Log Viewer, which integrates time series visualization, text visualization, and dynamic query techniques. We describe two analysis scenarios of visual exploration of SDSS log data, including understanding unusually high daily query traffic and modeling the types of data-seeking behaviors of massive query generators. The two scenarios demonstrate that the SDSS Log Viewer provides a novel and potentially valuable approach to support these targeted tasks.

  18. Visual Task Demands and the Auditory Mismatch Negativity: An Empirical Study and a Meta-Analysis.

    Science.gov (United States)

    Wiens, Stefan; Szychowska, Malina; Nilsson, Mats E

    2016-01-01

    Because the auditory system is particularly useful in monitoring the environment, previous research has examined whether task-irrelevant, auditory distractors are processed even if subjects focus their attention on visual stimuli. This research suggests that attentionally demanding visual tasks decrease the auditory mismatch negativity (MMN) to simultaneously presented auditory distractors. Because a recent behavioral study found that high visual perceptual load decreased detection sensitivity of simultaneous tones, we used a similar task (n = 28) to determine if high visual perceptual load would reduce the auditory MMN. Results suggested that perceptual load did not decrease the MMN. At face value, these nonsignificant findings may suggest that effects of perceptual load on the MMN are smaller than those of other demanding visual tasks. If so, effect sizes should differ systematically between the present and previous studies. We conducted a selective meta-analysis of published studies in which the MMN was derived from the EEG, the visual task demands were continuous and varied between high and low within the same task, and the task-irrelevant tones were presented in a typical oddball paradigm simultaneously with the visual stimuli. Because the meta-analysis suggested that the present (null) findings did not differ systematically from previous findings, the available evidence was combined. Results of this meta-analysis confirmed that demanding visual tasks reduce the MMN to auditory distractors. However, because the meta-analysis was based on small studies and because of the risk of publication biases, future studies should be preregistered with large samples (n > 150) to provide confirmatory evidence for the results of the present meta-analysis. These future studies should also use control conditions that reduce confounding effects of neural adaptation, and use load manipulations that are defined independently of their effects on the MMN.

  19. CADDIS Volume 4. Data Analysis: Exploratory Data Analysis

    Science.gov (United States)

    Intro to exploratory data analysis. Overview of variable distributions, scatter plots, correlation analysis, GIS datasets. Use of conditional probability to examine stressor levels and impairment. Exploring correlations among multiple stressors.
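The conditional-probability idea mentioned above, the probability of impairment given that a stressor exceeds some level, can be sketched with plain counting; the function name, data, and threshold below are hypothetical, not from CADDIS.

```python
def p_impaired_given_stressor(stressor, impaired, threshold):
    """Conditional probability P(impaired | stressor > threshold)
    estimated by counting paired site observations (sketch).

    stressor: numeric stressor measurements per site
    impaired: 0/1 impairment flags per site
    """
    hits = [flag for s, flag in zip(stressor, impaired) if s > threshold]
    return sum(hits) / len(hits) if hits else float("nan")
```

Sweeping the threshold and plotting the resulting probabilities is one way to explore how impairment risk changes with stressor level.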

  20. IEA Wind Task 24 Integration of Wind and Hydropower Systems; Volume 1: Issues, Impacts, and Economics of Wind and Hydropower Integration

    Energy Technology Data Exchange (ETDEWEB)

    Acker, T.

    2011-12-01

    This report describes the background, concepts, issues and conclusions related to the feasibility of integrating wind and hydropower, as investigated by the members of IEA Wind Task 24. It is the result of a four-year effort involving seven IEA member countries and thirteen participating organizations. The companion report, Volume 2, describes in detail the study methodologies and participant case studies, and exists as a reference for this report.

  1. Report of a CSNI workshop on uncertainty analysis methods. Volume 1 + 2

    International Nuclear Information System (INIS)

    Wickett, A.J.; Yadigaroglu, G.

    1994-08-01

    The OECD NEA CSNI Principal Working Group 2 (PWG2) Task Group on Thermal Hydraulic System Behaviour (TGTHSB) has, in recent years, received presentations of a variety of different methods to analyze the uncertainty in the calculations of advanced, unbiased (best estimate) codes. Proposals were also made for an International Standard Problem (ISP) to compare the uncertainty analysis methods. The objectives of the Workshop were to discuss and fully understand the principles of uncertainty analysis relevant to LOCA modelling and similar problems, to examine the underlying issues from first principles in preference to comparing and contrasting the currently proposed methods, to reach consensus on the identified issues as far as possible while not avoiding the controversial aspects, to identify unreconciled differences as clearly as possible, and to issue a Status Report. Eight uncertainty analysis methods were presented. A structured discussion of various aspects of uncertainty analysis followed - the need for uncertainty analysis, identification and ranking of uncertainties, characterisation, quantification and combination of uncertainties, and applications, resources and future developments. As a result, the objectives set out above were, to a very large extent, achieved. Plans for the ISP were also discussed. Volume 1 contains a record of the discussions on uncertainty methods. Volume 2 is a compilation of descriptions of the eight uncertainty analysis methods presented at the workshop.

  2. Pressure fluctuation analysis for charging pump of chemical and volume control system of nuclear power plant

    Directory of Open Access Journals (Sweden)

    Chen Qiang

    2016-01-01

    Equipment Failure Root Cause Analysis (ERCA) methodology is employed in this paper to investigate the root cause of the charging pump's pressure fluctuation in the chemical and volume control system (RCV) of a pressurized water reactor (PWR) nuclear power plant. An ERCA project task group was set up at the beginning of the analysis process. The possible failure modes are listed according to the characteristics of the charging pump's actual pressure fluctuation and maintenance experience, and the failure modes are analysed in sequence as evidence is collected. The analysis suggests that gradual loosening of the shaft nut in service is the root cause, and corresponding corrective actions are put forward in detail.

  3. Tool development in threat assessment: syntax regularization and correlative analysis. Final report Task I and Task II, November 21, 1977-May 21, 1978. [Linguistic analysis

    Energy Technology Data Exchange (ETDEWEB)

    Miron, M.S.; Christopher, C.; Hirshfield, S.; Su, B.

    1978-05-01

    Psycholinguistics provides crisis managers in nuclear threat incidents with a quantitative methodology which can aid in the determination of threat credibility, authorship identification and perpetrator apprehension. The objective of this contract is to improve and enhance present psycholinguistic software systems by means of newly-developed, computer-automated techniques which significantly extend the technology of automated content and stylistic analysis of nuclear threat. In accordance with this overall objective, the first two contract Tasks have been completed and are reported on in this document. The first Task specifies the development of software support for the purpose of syntax regularization of vocabulary to root form. The second calls for the exploration and development of alternative approaches to correlative analysis of vocabulary usage.
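Syntax regularization of vocabulary to root form is essentially stemming; the toy suffix stripper below only illustrates the idea, since the report does not specify its actual rules, and the suffix list and minimum-stem length are our assumptions.

```python
def regularize(token):
    """Toy sketch of regularizing a vocabulary token to root form:
    strip a common inflectional suffix if a plausible stem remains."""
    for suffix in ("ing", "ed", "es", "s"):
        if token.endswith(suffix) and len(token) - len(suffix) >= 3:
            return token[: -len(suffix)]
    return token
```

Collapsing inflected variants onto one root lets frequency and correlative analyses count "threat", "threats", and "threatening" as uses of the same vocabulary item.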

  4. Diversion Path Analysis handbook. Volume 4 (of 4 volumes). Computer Program 2

    International Nuclear Information System (INIS)

    Schleter, J.C.

    1978-11-01

    The FORTRAN IV computer program, DPA Computer Program 2 (DPACP-2) is used to produce tables and statistics on modifications identified when performing a Diversion Path Analysis (DPA) in accord with the methodology given in Volume 1. The program requires 259088 bytes exclusive of the operating system. The data assembled and tabulated by DPACP-2 assist the DPA team in analyzing and evaluating modifications to the plant's safeguards system that would eliminate, or reduce the severity of, vulnerabilities identified by means of the DPA. These vulnerabilities relate to the capability of the plant's material control and material accounting subsystems to indicate diversion of special nuclear material (SNM) by a knowledgeable insider

  5. Analysis of volume expansion data for periclase, lime, corundum ...

    Indian Academy of Sciences (India)

    We have presented an analysis of the volume expansion data for periclase (MgO), lime (CaO), corundum (Al2O3) and spinel (MgAl2O4) determined experimentally by Fiquet et al (1999) from 300 K up to 3000 K. The thermal equations of state due to Suzuki et al (1979) and Shanker et al (1997) are used to study the ...

  6. AGAPE-ET for human error analysis of emergency tasks and its application

    International Nuclear Information System (INIS)

    Kim, J. H.; Jeong, W. D.

    2002-01-01

    The paper presents a proceduralised human reliability analysis (HRA) methodology, AGAPE-ET (A Guidance And Procedure for Human Error Analysis for Emergency Tasks), covering both qualitative error analysis and quantification of human error probability (HEP) for emergency tasks in nuclear power plants. The AGAPE-ET method is based on a simplified cognitive model. For each cognitive function, error causes or error-likely situations have been identified considering the characteristics of the performance of that cognitive function and the influencing mechanism of the performance influencing factors (PIFs) on it. Error analysis items have then been determined from the identified error causes or error-likely situations, and a human error analysis procedure based on these items is organised to cue and guide the analysts through the overall human error analysis. The basic scheme for the quantification of HEP consists of multiplying the BHEP (basic human error probability) assigned to the error analysis item by the weight from the influencing factors decision tree (IFDT) constructed for each cognitive function. The method is characterised by structured identification of the weak points of the task to be performed and by an efficient analysis process in which the analysts need only address the relevant cognitive functions. The paper also presents the application of AGAPE-ET to 31 nuclear emergency tasks and its results.
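The quantification scheme described above (a BHEP scaled by an IFDT weight) can be sketched as follows; treating several PIF weights as a product and capping the result at 1 are our assumptions, not details from the paper.

```python
def quantify_hep(bhep, weights):
    """Sketch of AGAPE-ET-style quantification: a basic human error
    probability (BHEP) scaled by weights from the influencing factors
    decision tree (IFDT). Multiplying several weights together and
    capping at 1.0 are illustrative assumptions."""
    hep = bhep
    for w in weights:
        hep *= w
    return min(hep, 1.0)  # a probability cannot exceed 1
```

For example, a BHEP of 0.003 with two adverse-condition weights of 2 and 5 yields an HEP of 0.03.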

  7. Repowering analysis: Hanford Generating Project (HGP), Task Order Number 6

    International Nuclear Information System (INIS)

    1988-12-01

    The Hanford Generating Project (HGP), owned by the Washington Public Power Supply System, consists of two low pressure steam turbines, generators, and associated equipment located adjacent to the Department of Energy's (DOE) N-Reactor. HGP has been able to produce approximately 800 MWe with low pressure steam supplied by N-Reactor. DOE has placed N-Reactor in cold standby status for an undetermined length of time, which results in the idling of the HGP since no alternative source of steam is available. Bonneville Power Administration contracted with Fluor Daniel, Inc. to investigate the feasibility and cost of constructing a new source of steam for (repowering) one of the HGP turbines. The steam turbine is currently operated with 135 psia steam. The turbines can be rebuilt to operate with 500 psia steam pressure by adding additional stages, buckets, nozzles, and diaphragms. Because of the low pressure design, this turbine can never achieve the efficiencies possible in new high pressure turbines, but the presence of existing equipment reduces the capital cost of a new generating resource. Five repowering options were investigated in this study: three cases utilizing gas turbine combined cycle steam generation equipment, one case utilizing a gas-fired boiler, and one case utilizing a coal-fired boiler. This report presents Fluor Daniel's analysis of these repowering options.

  8. Analysis of Member State RED implementation. Final Report (Task 2)

    Energy Technology Data Exchange (ETDEWEB)

    Peters, D.; Alberici, S.; Toop, G. [Ecofys, Utrecht (Netherlands); Kretschmer, B. [Institute for European Environmental Policy IEEP, London (United Kingdom)

    2012-12-15

    This report describes the way EU Member States have transposed the sustainability and chain of custody requirements for biofuels as laid down in the Renewable Energy Directive (RED) and Fuel Quality Directive (FQD). In the assessment of Member States' implementation, the report mainly focuses on effectiveness and administrative burden. Have Member States transposed the Directives in such a way that compliance with the sustainability criteria can be ensured as effectively as possible? To what extent does the Member States' implementation lead to unnecessary administrative burden for economic operators in the (bio)fuel supply chain? The report focuses specifically on the transposition of the sustainability and chain of custody requirements, not on the target for renewables in transport. This means that, for example, the double counting provision is outside the scope of this report. The report starts with an introduction covering the implementation of the Renewable Energy (and Fuel Quality) Directive in national legislation, the methodology by which Member States were assessed on effectiveness and administrative burden, and the categorisation of Member States' national systems for RED implementation (Chapter 1). It continues with a high-level description of each Member State system assessed (Chapter 2). Following this, the report analyses the Member States on the effectiveness and administrative burden of a number of key ('major') measures (Chapter 3). The final chapter presents the conclusions and recommendations (Chapter 4).

  9. Degradation mode analysis: An approach to establish effective predictive maintenance tasks

    International Nuclear Information System (INIS)

    Sonnett, D.E.; Douglass, P.T.; Barnard, D.D.

    1991-01-01

    A significant number of nuclear generating stations have been employing Reliability Centered Maintenance methodology to arrive at applicable and effective maintenance tasks for their plant equipment. The endpoint of most programs has been an increased emphasis on predictive maintenance as the task of choice for monitoring and trending plant equipment condition to address the failure mechanisms identified by the analyses. Many of these plants have spent several years conducting reliability centered analysis before seriously beginning to implement predictive program improvements. In this paper we present another methodology, entitled Degradation Mode Analysis, which provides a more direct method to quickly and economically achieve the major benefit of reliability centered analysis, namely predictive maintenance. (author)

  10. Energy use in the marine transportation industry: Task III. Efficiency improvements; Task IV. Industry future. Final report, Volume IV. [Projections for year 2000

    Energy Technology Data Exchange (ETDEWEB)

    None

    1977-12-01

    Tasks III and IV measure the characteristics of potential research and development programs that could be applied to the maritime industry. It was necessary to identify potential operating scenarios for the maritime industry in the year 2000 and determine the energy consumption that would result given those scenarios. After the introductory chapter the operational, regulatory, and vessel-size scenarios for the year 2000 are developed in Chapter II. In Chapter III, future cargo flows and expected levels of energy use for the baseline 2000 projection are determined. In Chapter IV, the research and development programs are introduced into the future US flag fleet and the energy-savings potential associated with each is determined. The first four appendices (A through D) describe each of the generic technologies. The fifth appendix (E) contains the baseline operating and cost parameters against which 15 program areas were evaluated. (MCW)

  11. Do tasks make a difference? Accounting for heterogeneity of performance of children with reading difficulties on tasks of executive function: findings from a meta-analysis.

    Science.gov (United States)

    Booth, Josephine N; Boyle, James M E; Kelly, Steve W

    2010-03-01

    Research studies have implicated executive functions in reading difficulties (RD). But while some studies have found children with RD to be impaired on tasks of executive function, other studies report unimpaired performance. A meta-analysis was carried out to determine whether these discrepant findings can be accounted for by differences in the tasks of executive function that are utilized. A total of 48 studies comparing the performance on tasks of executive function of children with RD with that of their typically developing peers were included in the meta-analysis, yielding 180 effect sizes. An overall effect size of 0.57 (SE .03) was obtained, indicating that children with RD have impairments on tasks of executive function. However, effect sizes varied considerably, suggesting that the impairment is not uniform. Moderator analysis revealed that task modality and IQ-achievement discrepancy definitions of RD influenced the magnitude of effect; however, the age and gender of participants and the nature of the RD did not have an influence. While the children's RD were associated with executive function impairments, variation in effect size is a product of the assessment task employed, underlying task demands, and definitional criteria.
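For readers unfamiliar with how an overall effect size and its standard error are obtained from many individual effects, a standard inverse-variance (fixed-effect) summary can be sketched as follows; this is the generic textbook estimator, not necessarily the exact model used in the study above.

```python
import math

def fixed_effect_summary(effects, standard_errors):
    """Fixed-effect meta-analytic summary (sketch): inverse-variance
    weighted mean effect size and its standard error."""
    weights = [1.0 / se ** 2 for se in standard_errors]
    mean = sum(w * d for w, d in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return mean, se
```

Precise studies (small standard errors) get large weights, and the pooled standard error shrinks as studies accumulate, which is how a summary like 0.57 (SE .03) arises from 180 individual effects.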

  12. District heating and cooling systems for communities through power-plant retrofit and distribution network. Volume 2. Tasks 1-3. Final report. [Downtown Toledo steam system

    Energy Technology Data Exchange (ETDEWEB)

    Watt, J.R.; Sommerfield, G.A.

    1979-08-01

    Each of the tasks is described separately: Task 1 - Demonstration Team; Task 2 - Identify Thermal Energy Source(s) and Potential Service Area(s); and Task 3 - Energy Market Analysis. The purpose of the project is to establish and implement measures in the downtown Toledo steam system for conserving scarce fuel supplies through cogeneration, by retrofit of existing base- or intermediate-loaded electric-generating plants to provide for central heating and cooling systems, with the ultimate purpose of applying the results to other communities. For Task 1, Toledo Edison Company has organized a Demonstration Team (Battelle Columbus Laboratories; Stone and Webster; Ohio Dept. of Energy; Public Utilities Commission of Ohio; Toledo Metropolitan Area Council of Governments; and Toledo Edison) that it hopes has the expertise to evaluate the technical, legal, economic, and marketing issues related to the utilization of by-product heat from power generation to supply district heating and cooling services. Task 2 gives a complete technical description of the candidate plant(s), its thermodynamic cycle, role in load dispatch, ownership, and location. It is concluded that the Toledo steam distribution system can be the starting point for developing a new district-heating system to serve an expanding market. Battelle, a member of the team, is employed as a subcontractor to complete the energy market analysis; the work is summarized in Task 3. (MCW)

  13. Diversion Path Analysis handbook. Volume 3 (of 4 volumes). Computer Program 1

    International Nuclear Information System (INIS)

    Schleter, J.C.

    1978-11-01

    The FORTRAN IV computer program, DPA Computer Program 1 (DPACP-1), is used to assemble and tabulate the data for Specific Diversion Paths (SDPs) identified when performing a Diversion Path Analysis (DPA) in accord with the methodology given in Volume 1. The program requires 255498 bytes exclusive of the operating system. The data assembled and tabulated by DPACP-1 are used by the DPA team to assist in analyzing vulnerabilities, in a plant's material control and material accounting subsystems, to diversion of special nuclear material (SNM) by a knowledgeable insider. Based on this analysis, the DPA team can identify, and propose to plant management, modifications to the plant's safeguards system that would eliminate, or reduce the severity of, the identified vulnerabilities. The data are also used by plant supervision when investigating a potential diversion.

  14. CADDIS Volume 4. Data Analysis: Selecting an Analysis Approach

    Science.gov (United States)

    An approach for selecting statistical analyses to inform causal analysis. Describes methods for determining whether test site conditions differ from reference expectations. Describes an approach for estimating stressor-response relationships.

  15. Closed-loop, pilot/vehicle analysis of the approach and landing task

    Science.gov (United States)

    Anderson, M. R.; Schmidt, D. K.

    1986-01-01

    In the case of approach and landing, it is universally accepted that the pilot uses more than one vehicle response, or output, to close his control loops. Therefore, to model this task, a multi-loop analysis technique is required. The analysis problem has been in obtaining reasonable analytic estimates of the describing functions representing the pilot's loop compensation. Once these pilot describing functions are obtained, appropriate performance and workload metrics must then be developed for the landing task. The optimal control approach provides a powerful technique for obtaining the necessary describing functions, once the appropriate task objective is defined in terms of a quadratic objective function. An approach is presented through the use of a simple, reasonable objective function and model-based metrics to evaluate loop performance and pilot workload. The results of an analysis of the LAHOS (Landing and Approach of Higher Order Systems) study performed by R.E. Smith are also presented.

  16. Latent human error analysis and efficient improvement strategies by fuzzy TOPSIS in aviation maintenance tasks.

    Science.gov (United States)

    Chiu, Ming-Chuan; Hsieh, Min-Chih

    2016-05-01

    The purposes of this study were to develop a latent human error analysis process, to explore the factors of latent human error in aviation maintenance tasks, and to provide an efficient improvement strategy for addressing those errors. First, we used HFACS and RCA to define the error factors related to aviation maintenance tasks. Fuzzy TOPSIS with four criteria was applied to evaluate the error factors. Results show that 1) adverse physiological states, 2) physical/mental limitations, and 3) coordination, communication, and planning are the factors related to airline maintenance tasks that could be addressed easily and efficiently. This research establishes a new analytic process for investigating latent human error and provides a strategy for analyzing human error using fuzzy TOPSIS. Our analysis process addresses shortcomings of existing methodologies by incorporating improvement efficiency, and it extends the depth and breadth of human error analysis methodology. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.
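
    The ranking step described in this abstract can be sketched with a plain (crisp) TOPSIS computation; the study itself uses fuzzy ratings, and the factor scores, weights, and criterion roles below are invented for illustration only.

```python
import math

# Hypothetical ratings: rows = latent error factors, columns = four criteria.
# All numbers and the cost/benefit designations are illustrative, not from the study.
factors = ["adverse physiological states",
           "physical/mental limitations",
           "coordination, communication, and planning"]
X = [[7, 8, 6, 3],
     [6, 7, 7, 4],
     [8, 6, 8, 2]]
weights = [0.3, 0.3, 0.2, 0.2]
benefit = [True, True, True, False]   # last criterion (e.g. cost) is minimized

def topsis(X, weights, benefit):
    k = len(weights)
    # vector-normalize each column, then apply criterion weights
    norms = [math.sqrt(sum(row[j] ** 2 for row in X)) for j in range(k)]
    V = [[weights[j] * row[j] / norms[j] for j in range(k)] for row in X]
    cols = list(zip(*V))
    ideal = [max(c) if benefit[j] else min(c) for j, c in enumerate(cols)]
    anti  = [min(c) if benefit[j] else max(c) for j, c in enumerate(cols)]
    scores = []
    for row in V:
        d_pos = math.dist(row, ideal)   # distance to the ideal solution
        d_neg = math.dist(row, anti)    # distance to the anti-ideal solution
        scores.append(d_neg / (d_pos + d_neg))
    return scores

scores = topsis(X, weights, benefit)
for name, s in sorted(zip(factors, scores), key=lambda t: -t[1]):
    print(f"{s:.3f}  {name}")
```

    Alternatives with a closeness coefficient nearer to 1 sit closer to the ideal solution; the fuzzy variant replaces the crisp ratings with triangular fuzzy numbers but keeps the same distance-to-ideal logic.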

  17. Using plant procedures as the basis for conducting a job and task analysis

    International Nuclear Information System (INIS)

    Haynes, F.H.; Ruth, B.W.

    1985-01-01

    Plant procedures were selected by Northeast Utilities (NU) as the basis for conducting Job and Task Analyses (JTA). The resultant JTA was used to design procedure-based simulator training programs for Millstone 1, 2, and Connecticut Yankee. The task listings were both plant specific and exhibited excellent correlation to INPO's generic PWR and BWR task analyses. Using the procedures-based method enabled us to perform the JTA using plant and training staff. This proved cost effective in terms of both time and money. Learning objectives developed from the JTA were easily justified and correlated directly to job performance within the context of the plant procedures. In addition, the analysis generated a comprehensive review of plant procedures and, conversely, the plant's normal procedure revision process generated an automatic trigger for updating the task data.

  18. Application of Control Volume Analysis to Cerebrospinal Fluid Dynamics

    Science.gov (United States)

    Wei, Timothy; Cohen, Benjamin; Anor, Tomer; Madsen, Joseph

    2011-11-01

    Hydrocephalus is among the most common birth defects and can currently be neither prevented nor cured. Afflicted individuals face serious issues, which at present are too complicated and not well enough understood to treat via systematic therapies. This talk outlines the framework and application of a control volume methodology to clinical Phase Contrast MRI data. Specifically, integral control volume analysis utilizes a fundamental fluid dynamics methodology to quantify intracranial dynamics within a precise, direct, and physically meaningful framework. A chronically shunted, hydrocephalic patient in need of a revision procedure was used as an in vivo case study. Magnetic resonance velocity measurements within the patient's aqueduct were obtained in four biomedical states and were analyzed using the methods presented in this dissertation. Pressure force estimates were obtained, showing distinct differences in amplitude, phase, and waveform shape for different intracranial states within the same individual. Thoughts on the physiological and diagnostic research and development implications/opportunities will be presented.

  19. [ELECTROPHYSIOLOGIC ANALYSIS OF MENTAL ARITHMETIC TASK BY THE "MINIMUM SPANNING TREE" METHOD].

    Science.gov (United States)

    Boha, Roland; Tóth Brigitta; Kardos, Zsófia; Bálint, File; Gaál, Zsófia Anna; Molnár, Márk

    2016-03-30

    In the present study, basic-arithmetic-induced rearrangements in the functional connections of the brain were investigated using graph theoretical analysis, which is becoming increasingly important both in theoretical neuroscience and in clinical investigations. During mental arithmetic operations, working memory plays an important role, but there are only a few studies in which an attempt was made to separate this effect from the arithmetic operations themselves. The goal of our study was to separate the neural networks involved in these cognitive functions. As an attempt to clarify this issue, the graph-theoretical "minimal spanning tree" method was used for the analysis of EEG recorded during task performance. The effects of passive viewing, number recognition, and mental arithmetic on PLI-based minimal spanning trees (MST) were investigated in the EEG of young adults (adding task: 17 subjects; passive viewing and number recognition: 16 subjects) in the θ (4-8 Hz) frequency band. Occipital task-relevant synchronization was found using the different methods, probably related to the effect of visual stimulation. With respect to diameter, eccentricity, and fraction of leaves, different task-related changes were found. It was shown that the task-related changes of various graph indices are capable of identifying the networks behind the relevant dominant functions. Thus the "minimal spanning tree" method is suitable for the analysis of the reorganization of the brain with respect to cognitive functions.
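
    A minimal sketch of the pipeline this abstract describes (phase-lag-index connectivity → minimum spanning tree → graph indices such as diameter, eccentricity, and leaf fraction), using an invented 4-channel PLI matrix rather than real EEG data:

```python
# Invented phase lag index (PLI) matrix for 4 EEG channels; values in [0, 1],
# higher = stronger phase synchronization.
pli = [[0.0, 0.6, 0.2, 0.1],
       [0.6, 0.0, 0.5, 0.3],
       [0.2, 0.5, 0.0, 0.7],
       [0.1, 0.3, 0.7, 0.0]]
n = len(pli)

# Prim's algorithm; edge weight = 1 - PLI, so strong coupling = short edge.
in_tree, edges = {0}, []
while len(in_tree) < n:
    u, v = min(((i, j) for i in in_tree for j in range(n) if j not in in_tree),
               key=lambda e: 1.0 - pli[e[0]][e[1]])
    edges.append((u, v))
    in_tree.add(v)

adj = {i: [] for i in range(n)}
for u, v in edges:
    adj[u].append(v)
    adj[v].append(u)

def eccentricity(src):
    """Longest shortest path (in hops) from src within the tree (BFS)."""
    dist, queue = {src: 0}, [src]
    while queue:
        v = queue.pop(0)
        for w in adj[v]:
            if w not in dist:
                dist[w] = dist[v] + 1
                queue.append(w)
    return max(dist.values())

eccs = [eccentricity(i) for i in range(n)]
diameter = max(eccs)
leaf_fraction = sum(1 for i in range(n) if len(adj[i]) == 1) / n
print("MST edges:", edges)
print("diameter:", diameter, "leaf fraction:", leaf_fraction)
```

    On this toy matrix the MST comes out as the chain 0-1-2-3, so the diameter is 3 hops and half of the nodes are leaves; task-related reorganization would show up as shifts in exactly these indices.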

  20. Performance monitoring and analysis of task-based OpenMP.

    Directory of Open Access Journals (Sweden)

    Yi Ding

    Full Text Available OpenMP, a typical shared memory programming paradigm, has been extensively applied in high performance computing community due to the popularity of multicore architectures in recent years. The most significant feature of the OpenMP 3.0 specification is the introduction of the task constructs to express parallelism at a much finer level of detail. This feature, however, has posed new challenges for performance monitoring and analysis. In particular, task creation is separated from its execution, causing the traditional monitoring methods to be ineffective. This paper presents a mechanism to monitor task-based OpenMP programs with interposition and proposes two demonstration graphs for performance analysis as well. The results of two experiments are discussed to evaluate the overhead of monitoring mechanism and to verify the effects of demonstration graphs using the BOTS benchmarks.

  1. Performance monitoring and analysis of task-based OpenMP.

    Science.gov (United States)

    Ding, Yi; Hu, Kai; Wu, Kai; Zhao, Zhenlong

    2013-01-01

    OpenMP, a typical shared memory programming paradigm, has been extensively applied in high performance computing community due to the popularity of multicore architectures in recent years. The most significant feature of the OpenMP 3.0 specification is the introduction of the task constructs to express parallelism at a much finer level of detail. This feature, however, has posed new challenges for performance monitoring and analysis. In particular, task creation is separated from its execution, causing the traditional monitoring methods to be ineffective. This paper presents a mechanism to monitor task-based OpenMP programs with interposition and proposes two demonstration graphs for performance analysis as well. The results of two experiments are discussed to evaluate the overhead of monitoring mechanism and to verify the effects of demonstration graphs using the BOTS benchmarks.

  2. Analysis of automated highway system risks and uncertainties. Volume 5

    Energy Technology Data Exchange (ETDEWEB)

    Sicherman, A.

    1994-10-01

    This volume describes a risk analysis performed to help identify important Automated Highway System (AHS) deployment uncertainties and quantify their effect on costs and benefits for a range of AHS deployment scenarios. The analysis identified a suite of key factors affecting vehicle and roadway costs, capacities and market penetrations for alternative AHS deployment scenarios. A systematic protocol was utilized for obtaining expert judgments of key factor uncertainties in the form of subjective probability percentile assessments. Based on these assessments, probability distributions on vehicle and roadway costs, capacity and market penetration were developed for the different scenarios. The cost/benefit risk methodology and analysis provide insights by showing how uncertainties in key factors translate into uncertainties in summary cost/benefit indices.
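
    The propagation step described in this abstract (expert percentile judgments → probability distributions on key factors → uncertainty on a summary cost/benefit index) can be sketched with a small Monte Carlo simulation. The factor names, triangular approximations, and benefit formula below are invented placeholders, not values from the AHS study.

```python
import random

random.seed(42)

# (low, high, mode) triangular approximations to expert 10th/90th percentile
# and median judgments -- all numbers are illustrative.
vehicle_cost  = (1500.0, 6000.0, 3000.0)   # $ of AHS equipment per vehicle
lane_capacity = (4000.0, 8000.0, 6000.0)   # vehicles per hour per lane
penetration   = (0.05, 0.40, 0.20)         # market-penetration fraction

def sample_index():
    cost = random.triangular(*vehicle_cost)
    cap = random.triangular(*lane_capacity)
    pen = random.triangular(*penetration)
    benefit = 0.5 * cap * pen     # hypothetical benefit model
    return benefit / cost         # summary benefit/cost index

draws = sorted(sample_index() for _ in range(10_000))
p10, p50, p90 = (draws[int(q * len(draws))] for q in (0.10, 0.50, 0.90))
print(f"benefit/cost index: p10={p10:.3f}  median={p50:.3f}  p90={p90:.3f}")
```

    The spread between the 10th and 90th percentiles of the resulting index is the kind of "uncertainty in key factors translates into uncertainty in summary cost/benefit indices" insight the volume reports.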

  3. Synfuel program analysis. Volume I. Procedures-capabilities

    Energy Technology Data Exchange (ETDEWEB)

    Muddiman, J. B.; Whelan, J. W.

    1980-07-01

    This is the first of the two volumes describing the analytic procedures and resulting capabilities developed by Resource Applications (RA) for examining the economic viability, public costs, and national benefits of alternative synfuel projects and integrated programs. This volume is intended for Department of Energy (DOE) and Synthetic Fuel Corporation (SFC) program management personnel and includes a general description of the costing, venture, and portfolio models with enough detail for the reader to be able to specify cases and interpret outputs. It also contains an explicit description (with examples) of the types of results which can be obtained when applied to: the analysis of individual projects; the analysis of input uncertainty, i.e., risk; and the analysis of portfolios of such projects, including varying technology mixes and buildup schedules. In all cases, the objective is to obtain, on the one hand, comparative measures of private investment requirements and expected returns (under differing public policies) as they affect the private decision to proceed, and, on the other, public costs and national benefits as they affect public decisions to participate (in what form, in what areas, and to what extent).

  4. Sensitivity analysis of project appraisal variables. Volume I. Key variables

    Energy Technology Data Exchange (ETDEWEB)

    1979-07-01

    The Division of Fossil Fuel Utilization within the US Department of Energy (DOE) uses a project appraisal methodology for annual assessment of its research and development projects. Exercise of the methodology provides input to the budget preparation and planning process. Consequently, it is essential that all appraisal inputs and outputs are as accurate and credible as possible. The purpose of this task is to examine the accuracy and credibility of 1979 appraisal results by conducting a sensitivity analysis of several appraisal inputs. This analysis is designed to: examine the sensitivity of the results to adjustments in the values of selected parameters; explain the differences between computed ranks and professional judgment ranks; and revise the final results of 1979 project appraisal and provide the first inputs to refinement of the appraisal methodology for future applications.

  5. Task Analysis of Emergency Operating Procedures for Generating Quantitative HRA Data

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Yochan; Park, Jinkyun; Kim, Seunghwan; Choi, Sun Yeong; Jung, Wondea; Jang, Inseok [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-10-15

    In this paper, the analysis results for the emergency tasks in emergency operating procedures (EOPs) that can be observed from simulator data are introduced. The task type, component type, system type, and additional information related to the performance of the operators are described. In addition, a prospective application of the analyzed information to the HEP quantification process is discussed. In the probabilistic safety analysis (PSA) field, various human reliability analyses (HRAs) have been performed to produce estimates of human error probabilities (HEPs) for significant tasks in complex socio-technical systems. To this end, many HRA methods have provided basic or nominal HEPs for typical tasks and the quantitative relations describing how a certain performance context or performance shaping factors (PSFs) affect the HEPs. In the HRA community, however, the necessity of appropriate and sufficient human performance data has recently been indicated. This is because a wide range of quantitative estimates in the previous HRA methods are not supported by solid empirical bases. Hence, there have been attempts to collect HRA supporting data. For example, KAERI has started to collect information on both unsafe acts of operators and the relevant PSFs. A characteristic of the database being developed at KAERI is that human errors and related PSF surrogates that are objectively observable are collected from full-scope simulator experiences. In this environment, to produce concretely grounded bases for the HEPs, the traits or attributes of tasks where significant human errors can be observed should be clearly determined. The determined traits should make it possible to compare the HEPs on those traits with the data in previous HRA methods or databases. In this study, task characteristics in a Westinghouse type of EOPs were analyzed by defining task, component, and system taxonomies.

  6. Benchmarking of Document Image Analysis Tasks for Palm Leaf Manuscripts from Southeast Asia

    Directory of Open Access Journals (Sweden)

    Made Windu Antara Kesiman

    2018-02-01

    Full Text Available This paper presents a comprehensive test of the principal tasks in document image analysis (DIA), starting with binarization, text line segmentation, and isolated character/glyph recognition, and continuing on to word recognition and transliteration, for a new and challenging collection of palm leaf manuscripts from Southeast Asia. This research presents and is performed on a complete dataset collection of Southeast Asian palm leaf manuscripts. It contains three different scripts: Khmer script from Cambodia, and Balinese script and Sundanese script from Indonesia. The binarization task is evaluated on many methods, up to the most recent ones from binarization competitions. The seam carving method is evaluated for the text line segmentation task, compared to a recently proposed text line segmentation method for palm leaf manuscripts. For the isolated character/glyph recognition task, the evaluation covers a handcrafted feature extraction method, a neural network with unsupervised feature learning, and a Convolutional Neural Network (CNN) based method. Finally, a Recurrent Neural Network-Long Short-Term Memory (RNN-LSTM) based method is used to analyze the word recognition and transliteration task for the palm leaf manuscripts. The results from all experiments provide the latest findings and a quantitative benchmark for palm leaf manuscript analysis for researchers in the DIA community.

  7. Task Inhibition and Response Inhibition in Older versus Younger Adults: A Diffusion Model Analysis

    Directory of Open Access Journals (Sweden)

    Stefanie Schuch

    2016-11-01

    Full Text Available Differences in inhibitory ability between older (64-79 years, N = 24) and younger adults (18-26 years, N = 24) were investigated using a diffusion model analysis. Participants performed a task-switching paradigm that allows assessing n-2 task repetition costs, reflecting inhibitory control at the level of tasks, as well as n-1 response-repetition costs, reflecting inhibitory control at the level of responses. N-2 task repetition costs were of similar size in both age groups. Diffusion model analysis revealed that for both younger and older adults, drift rate parameters were smaller in the inhibition condition relative to the control condition, consistent with the idea that persisting task inhibition slows down response selection. Moreover, there was preliminary evidence for task inhibition effects in threshold separation and non-decision time in the older, but not the younger, adults, suggesting that older adults might apply different strategies when dealing with persisting task inhibition. N-1 response-repetition costs in mean RT tended to be larger in older than in younger adults, but in mean error rates they were larger in younger than in older adults. Diffusion model analysis revealed longer non-decision times in response repetitions than in response switches in both age groups, consistent with the idea that motor processes take longer in response repetitions than in response switches due to persisting response inhibition of a previously executed response. The data also revealed age-related differences in overall performance: older adults responded more slowly and more accurately than young adults, which was reflected by a higher threshold separation parameter in diffusion model analysis. Moreover, older adults showed larger non-decision times and higher variability in non-decision time than young adults, possibly reflecting slower and more variable motor processes. In contrast, overall drift rate did not differ between older and younger adults. Taken together
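
    The parameters interpreted in this abstract (drift rate, threshold separation, non-decision time) come from the standard drift-diffusion model. A toy simulation with invented parameter values reproduces the signature the authors report, namely that a lower drift rate produces slower and less accurate responses:

```python
import random

random.seed(1)

def ddm_trial(v, a, ter, dt=0.001, noise=1.0):
    """One trial: evidence starts at a/2 and diffuses to bound 0 or a.
    v = drift rate, a = threshold separation, ter = non-decision time."""
    x, t = a / 2.0, 0.0
    while 0.0 < x < a:
        x += v * dt + noise * random.gauss(0.0, dt ** 0.5)
        t += dt
    return ter + t, x >= a          # (reaction time, correct response?)

def summarize(v, a, ter, n=1500):
    rts, correct = zip(*(ddm_trial(v, a, ter) for _ in range(n)))
    return sum(rts) / n, sum(correct) / n

rt_ctrl, acc_ctrl = summarize(v=2.0, a=1.0, ter=0.3)  # control condition
rt_inh,  acc_inh  = summarize(v=1.2, a=1.0, ter=0.3)  # lower drift: inhibition
print(f"control:    mean RT = {rt_ctrl:.3f} s, accuracy = {acc_ctrl:.2f}")
print(f"inhibition: mean RT = {rt_inh:.3f} s, accuracy = {acc_inh:.2f}")
```

    Raising `a` instead of lowering `v` would model the older adults' speed-accuracy trade-off (slower but more accurate), while `ter` captures the motor/encoding component that differed in response repetitions.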

  8. Use of Job Task Analysis (JTA) in the development of craft training programs

    International Nuclear Information System (INIS)

    Gonyeau, J.A.; Long, R.E.

    1985-01-01

    Northern States Power Company is making a major effort to develop performance based training. It is finding the use of JTA data very helpful in the revision of its maintenance craft training programs. The technique being used involves a group of interns from the Training and Development Program of the University of Minnesota. These interns are largely graduate students, but with no nuclear and little mechanical/electrical experience. A Job Analysis for each discipline was used to: guide the following task analysis, determine program content, evaluate existing OJT check lists, and to define the four crafts used for mechanical maintenance. From the Job Analysis, a Training Task List was developed and correlated to training materials. The analysis of the tasks on the Training Task List is proceeding. Taxonomies of systems or subjects are compared to existing lesson plans. These taxonomies are useful when writing new lesson plans. The taxonomies are an excellent start for the development of enabling objectives. A Nine-Step Plan is being followed in the application of JTA data to the development and refinement of performance based training

  9. Formative Research on the Simplifying Conditions Method (SCM) for Task Analysis and Sequencing.

    Science.gov (United States)

    Kim, YoungHwan; Reigluth, Charles M.

    The Simplifying Conditions Method (SCM) is a set of guidelines for task analysis and sequencing of instructional content under the Elaboration Theory (ET). This article introduces the fundamentals of SCM and presents the findings from a formative research study on SCM. It was conducted in two distinct phases: design and instruction. In the first…

  10. Low Probability Tail Event Analysis and Mitigation in BPA Control Area: Task 2 Report

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Shuai; Makarov, Yuri V.; McKinstry, Craig A.; Brothers, Alan J.; Jin, Shuangshuang

    2009-09-18

    Task report detailing low probability tail event analysis and mitigation in BPA control area. Tail event refers to the situation in a power system when unfavorable forecast errors of load and wind are superposed onto fast load and wind ramps, or non-wind generators falling short of scheduled output, causing the imbalance between generation and load to become very significant.

  11. Cognitive Task Analysis for Instruction in Single-Injection Ultrasound Guided-Regional Anesthesia

    Science.gov (United States)

    Gucev, Gligor V.

    2012-01-01

    Cognitive task analysis (CTA) is methodology for eliciting knowledge from subject matter experts. CTA has been used to capture the cognitive processes, decision-making, and judgments that underlie expert behaviors. A review of the literature revealed that CTA has not yet been used to capture the knowledge required to perform ultrasound guided…

  12. Analysis of Operators Comments on the PSF Questionnaire of the Task Complexity Experiment 2003/2004

    International Nuclear Information System (INIS)

    Torralba, B.; Martinez-Arias, R.

    2007-01-01

    Human Reliability Analysis (HRA) methods usually take into account the effect of Performance Shaping Factors (PSF). Therefore, the adequate treatment of PSFs in the HRA of Probabilistic Safety Assessment (PSA) models is of crucial importance. There is an important need for collecting PSF data based on simulator experiments. During the task complexity experiment 2003-2004, carried out in the BWR simulator of the Halden Man-Machine Laboratory (HAMMLAB), data on PSFs were collected by means of a PSF Questionnaire. Seven crews (composed of shift supervisor, reactor operator, and turbine operator) from Swedish Nuclear Power Plants participated in the experiment. The PSF Questionnaire collected data on the factors: procedures, training and experience, indications, controls, team management, team communication, individual work practice, available time for the tasks, number of tasks or information load, masking, and seriousness. The main statistically significant results are presented in Performance Shaping Factors data collection and analysis of the task complexity experiment 2003/2004 (HWR-810). The analysis of the comments about PSFs, which were provided by operators on the PSF Questionnaire, is described. The comments provided for each PSF on the scenarios have been summarised using a content analysis technique. (Author)

  13. Cognitive Task Analysis (CTA) in the Continuing/ Higher Education Methods Using Games (CHERMUG) Project

    NARCIS (Netherlands)

    Boyle, Elizabeth; Van Rosmalen, Peter; MacArthur, Ewan; Connolly, Thomas; Hainey, Thomas

    2012-01-01

    Boyle, E., Van Rosmalen, P., MacArthur, E., Connolly, T., Hainey, T., et al. (2012). Cognitive Task Analysis (CTA) in the Continuing/ Higher Education Methods Using Games (CHERMUG) Project. In P. Felicia (Ed.), Proceedings of the 6th European Conference on Games Based Learning (pp. 63-71). October,

  14. An Analysis of Problem-Posing Tasks in Chinese and US Elementary Mathematics Textbooks

    Science.gov (United States)

    Cai, Jinfa; Jiang, Chunlian

    2017-01-01

    This paper reports on 2 studies that examine how mathematical problem posing is integrated in Chinese and US elementary mathematics textbooks. Study 1 involved a historical analysis of the problem-posing (PP) tasks in 3 editions of the most widely used elementary mathematics textbook series published by People's Education Press in China over 3…

  15. Analysis of Operators Comments on the PSF Questionnaire of the Task Complexity Experiment 2003/2004

    Energy Technology Data Exchange (ETDEWEB)

    Torralba, B.; Martinez-Arias, R.

    2007-07-01

    Human Reliability Analysis (HRA) methods usually take into account the effect of Performance Shaping Factors (PSF). Therefore, the adequate treatment of PSFs in the HRA of Probabilistic Safety Assessment (PSA) models is of crucial importance. There is an important need for collecting PSF data based on simulator experiments. During the task complexity experiment 2003-2004, carried out in the BWR simulator of the Halden Man-Machine Laboratory (HAMMLAB), data on PSFs were collected by means of a PSF Questionnaire. Seven crews (composed of shift supervisor, reactor operator, and turbine operator) from Swedish Nuclear Power Plants participated in the experiment. The PSF Questionnaire collected data on the factors: procedures, training and experience, indications, controls, team management, team communication, individual work practice, available time for the tasks, number of tasks or information load, masking, and seriousness. The main statistically significant results are presented in Performance Shaping Factors data collection and analysis of the task complexity experiment 2003/2004 (HWR-810). The analysis of the comments about PSFs, which were provided by operators on the PSF Questionnaire, is described. The comments provided for each PSF on the scenarios have been summarised using a content analysis technique. (Author)

  16. Using task analysis to generate evidence for strengthening midwifery education, practice, and regulation in Ethiopia

    NARCIS (Netherlands)

    Yigzaw, Tegbar; Carr, Catherine; Stekelenburg, Jelle; van Roosmalen, Jos; Gibson, Hannah; Gelagay, Mintwab; Admassu, Azeb

    2016-01-01

    PURPOSE: Realizing aspirations for meeting the global reproductive, maternal, newborn, and child health goals depends not only on increasing the numbers but also on improving the capability of midwifery workforce. We conducted a task analysis study to identify the needs for strengthening the

  17. Boundary error analysis and categorization in the TRECVID news story segmentation task

    NARCIS (Netherlands)

    Arlandis, J.; Over, P.; Kraaij, W.

    2005-01-01

    In this paper, an error analysis based on boundary error popularity (frequency), including semantic boundary categorization, is applied in the context of the news story segmentation task from TRECVID. Clusters of systems were defined based on the input resources they used, including video, audio and

  18. Using task analysis to generate evidence for strengthening midwifery education, practice, and regulation in Ethiopia

    Directory of Open Access Journals (Sweden)

    Yigzaw T

    2016-05-01

    Full Text Available Tegbar Yigzaw,1 Catherine Carr,2 Jelle Stekelenburg,3,4 Jos van Roosmalen,5 Hannah Gibson,1 Mintwab Gelagay,1 Azeb Admassu6 1Jhpiego, Addis Ababa, Ethiopia; 2Jhpiego, Washington DC, USA; 3Department of Obstetrics and Gynecology, Leeuwarden Medical Centre, Leeuwarden, 4Department of Health Sciences, Global Health, University Medical Centre Groningen, University of Groningen, Groningen, 5Faculty of Earth and Life Sciences, Vrije Universiteit, Amsterdam, the Netherlands; 6Federal Ministry of Health, Addis Ababa, Ethiopia Purpose: Realizing aspirations for meeting the global reproductive, maternal, newborn, and child health goals depends not only on increasing the numbers but also on improving the capability of midwifery workforce. We conducted a task analysis study to identify the needs for strengthening the midwifery workforce in Ethiopia. Methods: We conducted a cross-sectional study of recently qualified midwives in Ethiopia. Purposively selected participants from representative geographic and practice settings completed a self-administered questionnaire, making judgments about the frequency of performance, criticality, competence, and location of training for a list of validated midwifery tasks. Using Statistical Package for the Social Sciences, Version 20, we computed the percentages and averages to describe participant and practice characteristics. We identified priority preservice education gaps by considering the tasks least frequently learned in preservice, most frequently mentioned for not being trained, and had the highest not capable response. Identification of top priorities for in-service training considered tasks with highest “not capable” and “never” done responses. We determined the licensing exam blueprint by weighing the composite mean scores for frequency and criticality variables and expert rating across practice categories. Results: One hundred and thirty-eight midwives participated in the study. The majority of

  19. Analysis of increasing trend of mortgage volume in the Czech Republic

    Directory of Open Access Journals (Sweden)

    Petra Střelcová

    2009-01-01

    Full Text Available The aim of this paper is an empirical analysis of mortgage volume in the Czech Republic and identification of the factors behind the increasing trend of mortgage volume in the period from 2001 to 2007. Firstly, analyses of the quarterly time series of mortgage volume and average mortgage rate are performed. Consequently, the causality between mortgage volume and average mortgage rate is analysed. The mortgage rate is the most important factor in economic subjects' decisions on residential investment. Afterwards, the causality between mortgage volume and selected factors is analysed via multiple regression analysis. Based on this analysis, influencing factors for a multiple regression analysis describing mortgage volume are selected. Our empirical analysis validates the causality between mortgage volume and the mortgage rate, the unemployment rate, and the price level of real estate. The paper also includes an economic interpretation of the causality and an estimate of the expected development of mortgage volume, especially in connection with the present economic and business recession.
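
    The multiple-regression step can be sketched in pure Python via the normal equations; the quarterly figures below are invented placeholders, not the Czech data used in the paper.

```python
def ols(X, y):
    """Ordinary least squares via the normal equations (X'X) b = X'y."""
    k = len(X[0])
    XtX = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    Xty = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(k)]
    # Gaussian elimination with partial pivoting on the augmented system
    A = [row[:] + [b] for row, b in zip(XtX, Xty)]
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k + 1):
                A[r][c] -= f * A[col][c]
    beta = [0.0] * k
    for r in range(k - 1, -1, -1):
        beta[r] = (A[r][k] - sum(A[r][c] * beta[c]
                                 for c in range(r + 1, k))) / A[r][r]
    return beta

# volume ~ b0 + b1*rate + b2*unemployment  (illustrative quarterly data)
rate  = [6.9, 6.5, 6.1, 5.8, 5.4, 5.0, 4.7, 4.4]          # avg mortgage rate, %
unemp = [7.2, 7.0, 7.4, 7.1, 6.9, 7.3, 7.0, 6.8]          # unemployment rate, %
vol   = [10.2, 12.5, 14.1, 16.8, 18.9, 22.4, 26.0, 30.5]  # mortgage volume
X = [[1.0, r, u] for r, u in zip(rate, unemp)]
b0, b_rate, b_unemp = ols(X, vol)
print(f"volume = {b0:.1f} + ({b_rate:.2f})*rate + ({b_unemp:.2f})*unemployment")
```

    On this illustrative data the coefficient on the mortgage rate comes out negative, matching the paper's finding that volume grows as the rate falls.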

  20. Factor analysis for imperfect maintenance planning at nuclear power plants by cognitive task analysis

    International Nuclear Information System (INIS)

    Takagawa, Kenichi; Iida, Hiroyasu

    2011-01-01

    Imperfect maintenance planning was frequently identified in domestic nuclear power plants. To prevent such an event, we analyzed causal factors in maintenance planning stages and showed the directionality of countermeasures in this study. There is a pragmatic limit in finding the causal factors from the items based on report descriptions. Therefore, the idea of the systemic accident model, which is used to monitor the performance variability in normal circumstances, is taken as a new concept instead of investigating negative factors. As an actual method for analyzing usual activities, cognitive task analysis (CTA) was applied. Persons who experienced various maintenance activities at one electric power company were interviewed about sources related to decision making during maintenance planning, and then usual factors affecting planning were extracted as performance variability factors. The tendency of domestic events was analyzed using the classification item of those factors, and the directionality of countermeasures was shown. The following are critical for preventing imperfect maintenance planning: the persons in charge should fully understand the situation of the equipment for which they are responsible in the work planning and maintenance evaluation stages, and they should definitely understand, for example, the maintenance bases of that equipment. (author)

  1. Directionality analysis on functional magnetic resonance imaging during motor task using Granger causality.

    Science.gov (United States)

    Anwar, A R; Muthalib, M; Perrey, S; Galka, A; Granert, O; Wolff, S; Deuschl, G; Raethjen, J; Heute, U; Muthuraman, M

    2012-01-01

Directionality analysis of signals originating from different parts of the brain during motor tasks has gained a lot of interest. Since brain activity can be recorded over time, methods of time series analysis can be applied to medical time series as well. Granger Causality is a method to find a causal relationship between time series. Such causality can be referred to as a directional connection and is not necessarily bidirectional. The aim of this study is to differentiate between different motor tasks on the basis of activation maps and also to understand the nature of connections present between different parts of the brain. In this paper, three different motor tasks (finger tapping, simple finger sequencing, and complex finger sequencing) are analyzed. Time series for each task were extracted from functional magnetic resonance imaging (fMRI) data, which have very good spatial resolution and can look into the sub-cortical regions of the brain. Activation maps based on fMRI images show that, in the case of complex finger sequencing, most parts of the brain are active, unlike finger tapping, during which only limited regions show activity. Directionality analysis on time series extracted from the contralateral motor cortex (CMC), supplementary motor area (SMA), and cerebellum (CER) shows bidirectional connections between these parts of the brain. In the case of simple finger sequencing and complex finger sequencing, the strongest connections originate from SMA and CMC, while connections originating from CER in either direction are the weakest in magnitude during all paradigms.
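Granger causality, as used above, asks whether past values of one signal improve prediction of another. A minimal numpy-only sketch of a lag-1 test on synthetic data (a simplification for illustration, not the study's fMRI pipeline):

```python
import numpy as np

def granger_1lag(x, y):
    """F-statistic for lag-1 Granger causality y -> x.

    Compares a restricted AR(1) model of x against a full model that adds
    y[t-1]; a large F means past y helps predict x.
    """
    x, y = np.asarray(x, float), np.asarray(y, float)
    xt, x1, y1 = x[1:], x[:-1], y[:-1]
    n = len(xt)
    Xr = np.column_stack([np.ones(n), x1])      # restricted: const + x[t-1]
    Xf = np.column_stack([np.ones(n), x1, y1])  # full model adds y[t-1]
    rss_r = np.sum((xt - Xr @ np.linalg.lstsq(Xr, xt, rcond=None)[0]) ** 2)
    rss_f = np.sum((xt - Xf @ np.linalg.lstsq(Xf, xt, rcond=None)[0]) ** 2)
    return (rss_r - rss_f) / (rss_f / (n - 3))  # F-test, one extra regressor

rng = np.random.default_rng(0)
y = rng.normal(size=500)
x = 0.8 * np.roll(y, 1) + 0.3 * rng.normal(size=500)  # y drives x with lag 1
print(granger_1lag(x, y) > granger_1lag(y, x))        # y→x dominates x→y
```

Comparing the two F-statistics gives the directionality: a connection from y to x that is much stronger than the reverse is exactly the asymmetry the abstract reports between SMA/CMC and CER.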

  2. Does complexity matter? Meta-analysis of learner performance in artificial grammar tasks.

    Science.gov (United States)

    Schiff, Rachel; Katan, Pesia

    2014-01-01

    Complexity has been shown to affect performance on artificial grammar learning (AGL) tasks (categorization of test items as grammatical/ungrammatical according to the implicitly trained grammar rules). However, previously published AGL experiments did not utilize consistent measures to investigate the comprehensive effect of grammar complexity on task performance. The present study focused on computerizing Bollt and Jones's (2000) technique of calculating topological entropy (TE), a quantitative measure of AGL charts' complexity, with the aim of examining associations between grammar systems' TE and learners' AGL task performance. We surveyed the literature and identified 56 previous AGL experiments based on 10 different grammars that met the sampling criteria. Using the automated matrix-lift-action method, we assigned a TE value for each of these 10 previously used AGL systems and examined its correlation with learners' task performance. The meta-regression analysis showed a significant correlation, demonstrating that the complexity effect transcended the different settings and conditions in which the categorization task was performed. The results reinforced the importance of using this new automated tool to uniformly measure grammar systems' complexity when experimenting with and evaluating the findings of AGL studies.
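Topological entropy for a finite-state grammar is commonly computed as the logarithm of the spectral radius of the grammar's transition matrix. A small numpy sketch on a hypothetical 3-state grammar (not one of the 10 systems surveyed in the study, whose automated matrix-lift-action method is not reproduced here):

```python
import numpy as np

# Adjacency matrix of a toy finite-state grammar: entry (i, j) = 1 if the
# grammar allows a transition from state i to state j. (Hypothetical.)
A = np.array([
    [0, 1, 1],
    [1, 0, 1],
    [1, 1, 0],
], float)

# Topological entropy of the associated subshift: log of the spectral
# radius (largest-magnitude eigenvalue) of the transition matrix.
te = np.log(max(abs(np.linalg.eigvals(A))))
print(round(te, 4))  # → 0.6931, i.e. log(2) for this symmetric 3-state graph
```

A denser transition matrix (more allowed continuations per state) has a larger spectral radius and hence a higher TE, which is why TE serves as a single quantitative complexity score for comparing grammars across experiments.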

  3. Using task analysis to generate evidence for strengthening midwifery education, practice, and regulation in Ethiopia

    Science.gov (United States)

    Yigzaw, Tegbar; Carr, Catherine; Stekelenburg, Jelle; van Roosmalen, Jos; Gibson, Hannah; Gelagay, Mintwab; Admassu, Azeb

    2016-01-01

Purpose Realizing aspirations for meeting the global reproductive, maternal, newborn, and child health goals depends not only on increasing the numbers but also on improving the capability of the midwifery workforce. We conducted a task analysis study to identify the needs for strengthening the midwifery workforce in Ethiopia. Methods We conducted a cross-sectional study of recently qualified midwives in Ethiopia. Purposively selected participants from representative geographic and practice settings completed a self-administered questionnaire, making judgments about the frequency of performance, criticality, competence, and location of training for a list of validated midwifery tasks. Using Statistical Package for the Social Sciences, Version 20, we computed percentages and averages to describe participant and practice characteristics. We identified priority preservice education gaps by considering the tasks least frequently learned in preservice education, most frequently mentioned as not trained, and with the highest “not capable” response. Identification of top priorities for in-service training considered tasks with the highest “not capable” and “never” done responses. We determined the licensing exam blueprint by weighing the composite mean scores for frequency and criticality variables and expert rating across practice categories. Results One hundred and thirty-eight midwives participated in the study. The majority of respondents recognized the importance of midwifery tasks (89%), felt they were capable (91.8%), reported doing them frequently (63.9%), and had learned them during preservice education (56.3%). We identified competence gaps in tasks related to obstetric complications, gynecology, public health, professional duties, and prevention of mother-to-child transmission of HIV. Moreover, our study helped to determine the composition of the licensing exam for university graduates. Conclusion The task analysis indicates that midwives provide critical reproductive

  4. NEQ and task in dual-energy imaging: from cascaded systems analysis to human observer performance

    Science.gov (United States)

    Richard, Samuel; Siewerdsen, Jeffrey H.; Tward, Daniel J.

    2008-03-01

The relationship between theoretical descriptions of imaging performance (Fourier-based cascaded systems analysis) and the performance of real human observers was investigated for various detection and discrimination tasks. Dual-energy (DE) imaging provided a useful basis for investigating this relationship, because it presents a host of acquisition and processing parameters that can significantly affect signal and noise transfer characteristics and, correspondingly, human observer performance. The detectability index was computed theoretically using: 1) cascaded systems analysis of the modulation transfer function (MTF) and noise-power spectrum (NPS) for DE imaging; 2) a Fourier description of the imaging task; and 3) integration of MTF, NPS, and task function according to various observer models, including Fisher-Hotelling and non-prewhitening with and without an eye filter and internal noise. Three idealized tasks were considered: sphere detection, shape discrimination (sphere vs. disk), and texture discrimination (uniform vs. textured disk). Using images of phantoms acquired on a prototype DE imaging system, human observer performance was assessed in multiple-alternative forced choice (MAFC) tests, giving an estimate of the area under the ROC curve (AZ). The degree to which the theoretical detectability index correlated with human observer performance was investigated, and results agreed well over a broad range of imaging conditions, depending on the choice of observer model. Results demonstrated that optimal DE image acquisition and decomposition parameters depend significantly on the imaging task. These studies provide important initial validation that the detectability index derived theoretically by Fourier-based cascaded systems analysis correlates well with actual human observer performance and represents a meaningful metric for system optimization.

  5. The Use Of Computational Human Performance Modeling As Task Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

Jacques Hugo; David Gertman

    2012-07-01

    During a review of the Advanced Test Reactor safety basis at the Idaho National Laboratory, human factors engineers identified ergonomic and human reliability risks involving the inadvertent exposure of a fuel element to the air during manual fuel movement and inspection in the canal. There were clear indications that these risks increased the probability of human error and possible severe physical outcomes to the operator. In response to this concern, a detailed study was conducted to determine the probability of the inadvertent exposure of a fuel element. Due to practical and safety constraints, the task network analysis technique was employed to study the work procedures at the canal. Discrete-event simulation software was used to model the entire procedure as well as the salient physical attributes of the task environment, such as distances walked, the effect of dropped tools, the effect of hazardous body postures, and physical exertion due to strenuous tool handling. The model also allowed analysis of the effect of cognitive processes such as visual perception demands, auditory information and verbal communication. The model made it possible to obtain reliable predictions of operator performance and workload estimates. It was also found that operator workload as well as the probability of human error in the fuel inspection and transfer task were influenced by the concurrent nature of certain phases of the task and the associated demand on cognitive and physical resources. More importantly, it was possible to determine with reasonable accuracy the stages as well as physical locations in the fuel handling task where operators would be most at risk of losing their balance and falling into the canal. The model also provided sufficient information for a human reliability analysis that indicated that the postulated fuel exposure accident was less than credible.
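The task-network approach described above can be illustrated with a toy Monte Carlo simulation: sequential procedure steps with stochastic durations and per-step human error probabilities. All task names and numbers below are hypothetical, not values from the INL study:

```python
import random

# Hypothetical task network for a manual fuel-handling procedure: each step
# has a (min, mode, max) duration in minutes and a per-execution human
# error probability. Values are illustrative only.
TASKS = [
    ("walk_to_canal",  (1.0, 2.0, 4.0), 0.000),
    ("attach_tool",    (0.5, 1.0, 3.0), 0.002),
    ("lift_element",   (2.0, 3.0, 6.0), 0.005),
    ("visual_inspect", (3.0, 5.0, 9.0), 0.010),
    ("return_element", (2.0, 3.0, 6.0), 0.005),
]

def simulate(n=20000, seed=1):
    """Monte Carlo over the task network: mean procedure time and the
    probability that at least one human error occurs per execution."""
    rng = random.Random(seed)
    total_time, errors = 0.0, 0
    for _ in range(n):
        t, failed = 0.0, False
        for _name, (lo, mode, hi), p_err in TASKS:
            t += rng.triangular(lo, hi, mode)  # stochastic step duration
            failed = failed or (rng.random() < p_err)
        total_time += t
        errors += failed
    return total_time / n, errors / n

mean_t, p_error = simulate()
print(f"mean procedure time ≈ {mean_t:.1f} min, P(any error) ≈ {p_error:.4f}")
```

A discrete-event model like the one in the study additionally tracks resource contention, concurrent phases, and workload, but the same principle applies: replaying the network thousands of times yields distributions of completion time and error likelihood rather than single-point estimates.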

  6. Control system of the inspection robots group applying auctions and multi-criteria analysis for task allocation

    Science.gov (United States)

    Panfil, Wawrzyniec; Moczulski, Wojciech

    2017-10-01

The paper presents a control system for a group of mobile robots intended to carry out inspection missions. The main research problem was to define a control system that facilitates cooperation among the robots in realising the committed inspection tasks. Many well-known control systems use auctions for task allocation, where the subject of an auction is the task to be allocated. In missions characterised by a much larger number of tasks than robots, however, it seems better for the robots (instead of the tasks) to be the subjects of the auctions. The second identified problem concerns one-sided robot-to-task fitness evaluation: simultaneously assessing both the robot-to-task fitness and the task's attractiveness for the robot should improve the overall effectiveness of the multi-robot system. The elaborated system allows tasks to be assigned to robots using various methods for evaluating fitness between robots and tasks, and using several task-allocation methods. A multi-criteria analysis method is proposed, composed of two assessments: the robot's competitive position for a task among the other robots, and the task's attractiveness for a robot among the other tasks. Furthermore, task-allocation methods applying this multi-criteria analysis are proposed. Both the elaborated system and the proposed task-allocation methods were verified in simulated experiments. The object under test was a group of inspection mobile robots, a virtual counterpart of a real mobile-robot group.

  7. Frequency Domain Computer Programs for Prediction and Analysis of Rail Vehicle Dynamics : Volume 2. Appendixes

    Science.gov (United States)

    1975-12-01

    Frequency domain computer programs developed or acquired by TSC for the analysis of rail vehicle dynamics are described in two volumes. Volume 2 contains program listings including subroutines for the four TSC frequency domain programs described in V...

  8. Review of Department of Defense Education Activity (DODEA) Schools. Volume II: Quantitative Analysis of Educational Quality

    National Research Council Canada - National Science Library

    Anderson, Lowell

    2000-01-01

This volume compiles, and presents in integrated form, IDA's quantitative analysis of educational quality provided by DoD's dependent schools. It covers the quantitative aspects of volume I in greater...

  9. Frequency Domain Computer Programs for Prediction and Analysis of Rail Vehicle Dynamics : Volume 1. Technical Report

    Science.gov (United States)

    1975-12-01

    Frequency domain computer programs developed or acquired by TSC for the analysis of rail vehicle dynamics are described in two volumes. Volume I defines the general analytical capabilities required for computer programs applicable to single rail vehi...

  10. Characterisation of radiotherapy planning volumes using textural analysis.

    Science.gov (United States)

    Nailon, William H; Redpath, Anthony T; McLaren, Duncan B

    2008-01-01

    Computer-based artificial intelligence methods for classification and delineation of the gross tumour volume (GTV) on computerised tomography (CT) and magnetic resonance (MR) images do not, at present, provide the accuracy required for radiotherapy applications. This paper describes an image analysis method for classification of distinct regions within the GTV, and other clinically relevant regions, on CT images acquired on eight bladder cancer patients at the radiotherapy planning stage and thereafter at regular intervals during treatment. Statistical and fractal textural features (N=27) were calculated on the bladder, rectum and a control region identified on axial, coronal and sagittal CT images. Unsupervised classification results demonstrate that with a reduced feature set (N=3) the approach offers significant classification accuracy on axial, coronal and sagittal CT image planes and has the potential to be developed further for radiotherapy applications, particularly towards an automatic outlining approach.
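First-order statistical texture features of the kind used in such feature sets can be sketched with numpy. This is a simplified stand-in for the study's 27 statistical and fractal features; the intensity range and the synthetic "regions" are assumptions for illustration:

```python
import numpy as np

def first_order_texture(region, intensity_range=(0, 200)):
    """First-order statistical texture features of an image region.

    A simplified stand-in for a fuller radiotherapy feature set; the
    intensity range is an assumed display scale.
    """
    r = np.asarray(region, float).ravel()
    hist, _ = np.histogram(r, bins=32, range=intensity_range)
    p = hist[hist > 0] / hist.sum()            # bin probabilities
    return {
        "mean": r.mean(),
        "variance": r.var(),
        "skewness": ((r - r.mean()) ** 3).mean() / (r.std() ** 3 + 1e-12),
        "entropy": -(p * np.log2(p)).sum(),    # histogram entropy in bits
    }

rng = np.random.default_rng(42)
smooth = rng.normal(100, 2, (32, 32))     # low-contrast "control" region
textured = rng.normal(100, 25, (32, 32))  # high-contrast, tumour-like region
print(first_order_texture(textured)["entropy"] > first_order_texture(smooth)["entropy"])
```

Feeding such per-region feature vectors into an unsupervised classifier (e.g. k-means) is the basic pattern behind distinguishing bladder, rectum, and control regions; the paper's reduced three-feature set suggests most of the discriminative power sits in a few such measures.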

  11. Characterisation of radiotherapy planning volumes using textural analysis

    Energy Technology Data Exchange (ETDEWEB)

    Nailon, William H.; Redpath, Anthony T.; McLaren, Duncan B. (Dept. of Oncology Physics, Edinburgh Cancer Centre, Western General Hospital, Edinburgh (United Kingdom))

    2008-08-15

Computer-based artificial intelligence methods for classification and delineation of the gross tumour volume (GTV) on computerised tomography (CT) and magnetic resonance (MR) images do not, at present, provide the accuracy required for radiotherapy applications. This paper describes an image analysis method for classification of distinct regions within the GTV, and other clinically relevant regions, on CT images acquired on eight bladder cancer patients at the radiotherapy planning stage and thereafter at regular intervals during treatment. Statistical and fractal textural features (N=27) were calculated on the bladder, rectum and a control region identified on axial, coronal and sagittal CT images. Unsupervised classification results demonstrate that with a reduced feature set (N=3) the approach offers significant classification accuracy on axial, coronal and sagittal CT image planes and has the potential to be developed further for radiotherapy applications, particularly towards an automatic outlining approach.

  12. Task-evoked brain functional magnetic susceptibility mapping by independent component analysis (χICA).

    Science.gov (United States)

    Chen, Zikuan; Calhoun, Vince D

    2016-03-01

Conventionally, independent component analysis (ICA) is performed on an fMRI magnitude dataset to analyze brain functional mapping (AICA). By solving the inverse problem of fMRI, we can reconstruct the brain magnetic susceptibility (χ) functional states. Upon the reconstructed χ dataspace, we propose an ICA-based brain functional χ mapping method (χICA) to extract the task-evoked brain functional map. A complex division algorithm is applied to a time series of fMRI phase images to extract temporal phase changes (relative to an OFF-state snapshot). A computed inverse MRI (CIMRI) model is used to reconstruct a 4D brain χ response dataset. χICA is implemented by applying a spatial InfoMax ICA algorithm to the reconstructed 4D χ dataspace. With finger-tapping experiments on a 7T system, the χICA-extracted χ-depicted functional map is similar to the SPM-inferred functional χ map, with a spatial correlation of 0.67 ± 0.05. In comparison, the AICA-extracted magnitude-depicted map is correlated with the SPM magnitude map by 0.81 ± 0.05. Why χICA underperforms AICA for task-evoked functional mapping remains an open research question. For task-evoked brain functional mapping, we compare the data-driven ICA method with the task-correlated SPM method. In particular, we compare χICA with AICA for extracting task-correlated timecourses and functional maps. χICA can extract a χ-depicted task-evoked brain functional map from a reconstructed χ dataspace without knowledge of the brain hemodynamic responses. The χICA-extracted brain functional χ map reveals a bidirectional BOLD response pattern that is unavailable from (or different in) AICA.
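The spatial-ICA step can be sketched on synthetic data. The paper uses spatial InfoMax ICA on the reconstructed χ dataspace; here scikit-learn's FastICA is substituted as a readily available alternative, and all dimensions, the block design, and the "active" voxel set are hypothetical:

```python
import numpy as np
from sklearn.decomposition import FastICA

# Synthetic "functional dataset": 60 time points x 500 voxels, built from a
# task-locked spatial map plus noise. A stand-in for a 4D dataspace
# flattened over voxels.
rng = np.random.default_rng(0)
n_t, n_v = 60, 500
task_tc = np.tile([0.0, 0.0, 0.0, 1.0, 1.0, 1.0], 10)  # ON/OFF block design
task_map = np.zeros(n_v)
task_map[50:80] = 1.0                                   # "active" voxels
data = np.outer(task_tc, task_map) + 0.2 * rng.normal(size=(n_t, n_v))

# Spatial ICA: voxels are treated as observations and time points as mixed
# channels, so the unmixed sources are spatial maps with timecourses.
ica = FastICA(n_components=5, random_state=0, max_iter=1000)
maps = ica.fit_transform(data.T).T   # (components, voxels) spatial maps
timecourses = ica.mixing_            # (time, components)

# The task-related component is the one whose timecourse best matches the
# block design (sign is arbitrary, hence the absolute correlation).
corrs = [abs(np.corrcoef(timecourses[:, k], task_tc)[0, 1]) for k in range(5)]
best = int(np.argmax(corrs))
print(f"component {best} matches the task design (|r| = {corrs[best]:.2f})")
```

This is the sense in which ICA is "data-driven": the task-related map is identified after unmixing, without building a hemodynamic response model into the decomposition as SPM does.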

  13. Multi-task linear programming discriminant analysis for the identification of progressive MCI individuals.

    Directory of Open Access Journals (Sweden)

    Guan Yu

Full Text Available Accurately identifying mild cognitive impairment (MCI) individuals who will progress to Alzheimer's disease (AD) is very important for making early interventions. Many classification methods focus on integrating multiple imaging modalities such as magnetic resonance imaging (MRI) and fluorodeoxyglucose positron emission tomography (FDG-PET). However, the main challenge for MCI classification using multiple imaging modalities is the existence of a lot of missing data in many subjects. For example, in the Alzheimer's Disease Neuroimaging Initiative (ADNI) study, almost half of the subjects do not have PET images. In this paper, we propose a new and flexible binary classification method, namely Multi-task Linear Programming Discriminant (MLPD) analysis, for the incomplete multi-source feature learning. Specifically, we decompose the classification problem into different classification tasks, i.e., one for each combination of available data sources. To solve all different classification tasks jointly, our proposed MLPD method links them together by constraining them to achieve the similar estimated mean difference between the two classes (under classification) for those shared features. Compared with the state-of-the-art incomplete Multi-Source Feature (iMSF) learning method, instead of constraining different classification tasks to choose a common feature subset for those shared features, MLPD can flexibly and adaptively choose different feature subsets for different classification tasks. Furthermore, our proposed MLPD method can be efficiently implemented by linear programming. To validate our MLPD method, we perform experiments on the ADNI baseline dataset with the incomplete MRI and PET images from 167 progressive MCI (pMCI) subjects and 226 stable MCI (sMCI) subjects. We further compared our method with the iMSF method (using incomplete MRI and PET images) and also the single-task classification method (using only MRI or only subjects with both MRI and

  14. Stereological analysis of nuclear volume in recurrent meningiomas

    DEFF Research Database (Denmark)

    Madsen, C; Schrøder, H D

    1994-01-01

    A stereological estimation of nuclear volume in recurrent and non-recurrent meningiomas was made. The aim was to investigate whether this method could discriminate between these two groups. We found that the mean nuclear volumes in recurrent meningiomas were all larger at debut than in any...... nuclear volume in meningiomas might help identify a group at risk of recurrence....

  15. Brain activity across the development of automatic categorization: A comparison of categorization tasks using multi-voxel pattern analysis

    Science.gov (United States)

    Soto, Fabian A.; Waldschmidt, Jennifer G.; Helie, Sebastien; Ashby, F. Gregory

    2013-01-01

    Previous evidence suggests that relatively separate neural networks underlie initial learning of rule-based and information-integration categorization tasks. With the development of automaticity, categorization behavior in both tasks becomes increasingly similar and exclusively related to activity in cortical regions. The present study uses multi-voxel pattern analysis to directly compare the development of automaticity in different categorization tasks. Each of three groups of participants received extensive training in a different categorization task: either an information-integration task, or one of two rule-based tasks. Four training sessions were performed inside an MRI scanner. Three different analyses were performed on the imaging data from a number of regions of interest (ROIs). The common patterns analysis had the goal of revealing ROIs with similar patterns of activation across tasks. The unique patterns analysis had the goal of revealing ROIs with dissimilar patterns of activation across tasks. The representational similarity analysis aimed at exploring (1) the similarity of category representations across ROIs and (2) how those patterns of similarities compared across tasks. The results showed that common patterns of activation were present in motor areas and basal ganglia early in training, but only in the former later on. Unique patterns were found in a variety of cortical and subcortical areas early in training, but they were dramatically reduced with training. Finally, patterns of representational similarity between brain regions became increasingly similar across tasks with the development of automaticity. PMID:23333700
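Representational similarity analysis, as used in the study above, compares regions (or tasks) by correlating their condition-by-condition dissimilarity matrices. A small numpy sketch on hypothetical data (the condition counts, voxel counts, and noise levels are all assumptions for illustration):

```python
import numpy as np

def rdm(patterns):
    """Representational dissimilarity matrix: 1 - Pearson correlation
    between the multi-voxel patterns of each pair of conditions."""
    return 1.0 - np.corrcoef(patterns)

def rsa_similarity(rdm_a, rdm_b):
    """Second-order similarity: correlate the upper triangles of two RDMs."""
    iu = np.triu_indices_from(rdm_a, k=1)
    return np.corrcoef(rdm_a[iu], rdm_b[iu])[0, 1]

rng = np.random.default_rng(3)
# Hypothetical data: 8 task conditions x 100 voxels per region. Region B
# carries a noisy copy of region A's representation; region C is unrelated.
roi_a = rng.normal(size=(8, 100))
roi_b = roi_a + 0.5 * rng.normal(size=(8, 100))
roi_c = rng.normal(size=(8, 100))

print(rsa_similarity(rdm(roi_a), rdm(roi_b)) > rsa_similarity(rdm(roi_a), rdm(roi_c)))
```

Because RDMs abstract away from which particular voxels are active, they can be compared across regions, tasks, and even training sessions, which is what lets a study track how representational geometry converges as automaticity develops.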

  16. USAF Advanced Terrestrial Energy Study. Volume 4. Analysis, Data, and Bibliography.

    Science.gov (United States)

    1983-04-01


  17. Hippocampal volume and auditory attention on a verbal memory task with adult survivors of pediatric brain tumor.

    Science.gov (United States)

    Jayakar, Reema; King, Tricia Z; Morris, Robin; Na, Sabrina

    2015-03-01

We examined the nature of verbal memory deficits and their possible hippocampal underpinnings in long-term adult survivors of childhood brain tumor. 35 survivors (M = 24.10 ± 4.93 years at testing; 54% female), on average 15 years post-diagnosis, and 59 typically developing adults (M = 22.40 ± 4.35 years, 54% female) participated. Automated FMRIB Software Library (FSL) tools were used to measure hippocampal, putamen, and whole brain volumes. The California Verbal Learning Test-Second Edition (CVLT-II) was used to assess verbal memory. Hippocampal, F(1, 91) = 4.06, ηp² = .04; putamen, F(1, 91) = 11.18, ηp² = .11; and whole brain, F(1, 92) = 18.51, ηp² = .17, volumes were significantly lower for survivors than controls. Verbal memory indices of auditory attention list span (Trial 1: F(1, 92) = 12.70, η² = .12) and final list learning (Trial 5: F(1, 92) = 6.01, η² = .06) were also significantly lower for survivors. Hippocampal volume was associated with auditory attention, but with none of the other CVLT-II indices. Secondary analyses for the effect of treatment factors are presented. Volumetric differences between survivors and controls exist for the whole brain and for subcortical structures on average 15 years post-diagnosis. Treatment factors seem to have a unique effect on subcortical structures. Memory differences between survivors and controls are largely contingent upon auditory attention list span. Only hippocampal volume is associated with the auditory attention list span component of verbal memory. These findings are particularly robust for survivors treated with radiation.

  18. Differential recruitment of theory of mind brain network across three tasks: An independent component analysis.

    Science.gov (United States)

    Thye, Melissa D; Ammons, Carla J; Murdaugh, Donna L; Kana, Rajesh K

    2018-03-28

Social neuroscience research has focused on an identified network of brain regions primarily associated with processing Theory of Mind (ToM). However, ToM is a broad cognitive process, which encompasses several sub-processes, such as mental state detection and intentional attribution, and the connectivity of brain regions underlying the broader ToM network in response to paradigms assessing these sub-processes requires further characterization. Standard fMRI analyses which focus only on brain activity cannot capture information about ToM processing at a network level. An alternative method, independent component analysis (ICA), is a data-driven technique used to isolate intrinsic connectivity networks, and this approach provides insight into network-level regional recruitment. In this fMRI study, three complementary, but distinct ToM tasks assessing mental state detection (e.g. RMIE: Reading the Mind in the Eyes; RMIV: Reading the Mind in the Voice) and intentional attribution (Causality task) were each analyzed using ICA in order to separately characterize the recruitment and functional connectivity of core nodes in the ToM network in response to the sub-processes of ToM. Based on visual comparison of the derived networks for each task, the spatiotemporal network patterns were similar between the RMIE and RMIV tasks, which elicited mentalizing about the mental states of others, and these networks differed from the network derived for the Causality task, which elicited mentalizing about goal-directed actions. The medial prefrontal cortex, precuneus, and right inferior frontal gyrus were seen in the components with the highest correlation with the task condition for each of the tasks, highlighting the role of these regions in general ToM processing. Using a data-driven approach, the current study captured the differences in task-related brain response to ToM in three distinct ToM paradigms. The findings of this study further elucidate the neural mechanisms associated

  19. Modification of the Ladder Rung Walking Task: New Options for Analysis of Skilled Movements

    OpenAIRE

    Antonow-Schlorke, Iwa; Ehrhardt, Julia; Knieling, Marcel

    2013-01-01

    Method sensitivity is critical for evaluation of poststroke motor function. Skilled walking was assessed in horizontal, upward, and downward rung ladder walking to compare the demands of the tasks and test sensitivity. The complete step sequence of a walk was subjected to analysis aimed at demonstrating the walking pattern, step sequence, step cycle, limb coordination, and limb interaction to complement the foot fault scoring system. Rats (males, n = 10) underwent unilateral photothrombotic l...

  20. Grammar Teaching in the EFL Classroom: An Analysis of Grammar Tasks in Three Textbooks

    OpenAIRE

    Askeland, Eilén

    2013-01-01

    The aim of this thesis is to examine the grammar tasks in three EFL textbooks. Despite other tools available, the textbook remains an important instructional medium in the classroom. There are several reasons for analysing textbooks. First, it is important that the textbooks are good in order for the teaching to be effective. Second, it is important to investigate whether the textbook is in accordance with the current trends in language teaching. Third, a textbook analysi...

  1. Development of calibration training and procedures using job-task analysis

    Energy Technology Data Exchange (ETDEWEB)

    Smith, R.A.

    1993-12-01

Efforts to handle an increased workload with dwindling manpower in the Physical and Electrical Standards Laboratory (Standards Lab) at the Oak Ridge Y-12 Plant are described. Empowerment of workers via Total Quality Management (TQM) is the basis for these efforts. A survey and follow-up teamwork formed the course of action. The job-task analysis received honors from their peers at the Y-12 Plant.

  2. Change Best: Task 2.3. Analysis of policy mix and development of Energy Efficiency Services

    International Nuclear Information System (INIS)

    Boonekamp, P.; Vethman, P.

    2010-04-01

    The aim of the Change Best project is to promote the development of an energy efficiency service (EES) market and to give good practice examples of changes in energy service business, strategies, and supportive policies and measures in the course of the implementation of Directive 2006/32/EC on Energy End-Use Efficiency and Energy Services. This report addresses task 2.3: Analysis of policy mix and development of Energy Efficiency Services.

  3. Concurrent multidisciplinary mechanical design based on design task analysis and knowledge sharing; Sekkei task bunseki to joho kyoyu ni yoru mechatronics kyocho sekkei

    Energy Technology Data Exchange (ETDEWEB)

    Kondo, K.; Ozawa, M.; Mori, T. [Toshiba Corp., Tokyo (Japan)

    1999-09-01

We have developed a systematic design task planning method based on a design structure matrix (DSM) and a lumped-model-based framework for knowledge sharing in a concurrent design environment as key techniques for developing higher-quality products in a shorter design time. The DSM facilitates systematic analysis of dependencies among design tasks and optimization of the design process. The framework, based on a lumped-model description of mechanical systems, enables concurrent and cooperative work among multidisciplinary designers at an early stage of the design process. In this paper, we also discuss the relationships between these techniques and the product development flow from product definition to detailed design. (author)
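A design structure matrix records which tasks consume outputs of which others; partitioning it yields a task ordering and exposes coupled blocks that must be iterated concurrently. A minimal sketch with hypothetical tasks and dependencies (not the paper's actual DSM):

```python
# Minimal design-structure-matrix (DSM) partitioning sketch. dsm[i][j] = 1
# means task i consumes an output of task j. All entries are hypothetical.
tasks = ["spec", "mech_layout", "control_design", "thermal", "integration"]
dsm = [
    [0, 0, 0, 0, 0],  # spec: no inputs
    [1, 0, 1, 0, 0],  # mech_layout needs spec and control_design
    [1, 1, 0, 0, 0],  # control_design needs spec and mech_layout (coupled)
    [0, 1, 0, 0, 0],  # thermal needs mech_layout
    [0, 1, 1, 1, 0],  # integration needs all upstream results
]

def reaches(src, dst, dsm, allowed):
    """DFS along dependency edges (consumer -> provider) within `allowed`."""
    stack, seen = [src], set()
    while stack:
        u = stack.pop()
        for v in range(len(dsm)):
            if dsm[u][v] and v in allowed and v not in seen:
                if v == dst:
                    return True
                seen.add(v)
                stack.append(v)
    return False

def partition(dsm, names):
    """Schedule tasks whose inputs are satisfied; group the rest into
    coupled blocks (dependency cycles) that must be solved iteratively."""
    remaining, schedule = set(range(len(names))), []
    while remaining:
        ready = [i for i in remaining
                 if not any(dsm[i][j] and j in remaining for j in range(len(names)))]
        if ready:
            for i in sorted(ready):
                schedule.append(names[i])
            remaining -= set(ready)
        else:
            block = {i for i in remaining if reaches(i, i, dsm, remaining)}
            schedule.append(sorted(names[i] for i in block))
            remaining -= block
    return schedule

print(partition(dsm, tasks))
# → ['spec', ['control_design', 'mech_layout'], 'thermal', 'integration']
```

The grouped sublist is exactly the kind of mutually dependent task pair that signals where multidisciplinary designers need to work concurrently rather than sequentially, which is the point of DSM-driven process optimization.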

  4. Hawaii Energy Strategy Project 2: Fossil Energy Review. Task IV. Scenario development and analysis

    Energy Technology Data Exchange (ETDEWEB)

Yamaguchi, N.D.; Breazeale, K. [ed.]

    1993-12-01

The Hawaii Energy Strategy (HES) Program is a seven-project effort led by the State of Hawaii Department of Business, Economic Development & Tourism (DBEDT) to investigate a wide spectrum of Hawaii energy issues. The East-West Center's Program on Resources: Energy and Minerals, has been assigned HES Project 2, Fossil Energy Review, which focuses on fossil energy use in Hawaii and the greater regional and global markets. HES Project 2 has four parts: Task I (World and Regional Fossil Energy Dynamics) covers petroleum, natural gas, and coal in global and regional contexts, along with a discussion of energy and the environment. Task II (Fossil Energy in Hawaii) focuses more closely on fossil energy use in Hawaii: current utilization and trends, the structure of imports, possible future sources of supply, fuel substitutability, and energy security. Task III's emphasis is Greenfield Options; that is, fossil energy sources not yet used in Hawaii. This task is divided into two sections: first, an in-depth "Assessment of Coal Technology Options and Implications for the State of Hawaii," along with a spreadsheet analysis model, which was subcontracted to the Environmental Assessment and Information Sciences Division of Argonne National Laboratory; and second, a chapter on liquefied natural gas (LNG) in the Asia-Pacific market and the issues surrounding possible introduction of LNG into the Hawaii market.

  5. Proposal of Constraints Analysis Method Based on Network Model for Task Planning

    Science.gov (United States)

    Tomiyama, Tomoe; Sato, Tatsuhiro; Morita, Toyohisa; Sasaki, Toshiro

    Deregulation has been accelerating several activities aimed at reengineering business processes, such as railway through service and modal shift in logistics. To make those activities successful, business entities have to define new business rules or know-how (we call them 'constraints'). According to the new constraints, they need to manage business resources such as instruments, materials, and workers. In this paper, we propose a constraint analysis method to define constraints for task planning of the new business processes. To visualize each constraint's influence on planning, we propose a network model which represents allocation relations between tasks and resources. The network can also represent task ordering relations and resource grouping relations. The proposed method formalizes the manual process of defining constraints as one of repeatedly checking the network structure and finding conflicts between constraints. Application to crew scheduling problems shows that the method can adequately represent and define constraints of task planning problems with the following fundamental features: (1) specifying a work pattern for some resources, (2) restricting the number of resources for some works, (3) requiring multiple resources for some works, (4) prior allocation of some resources to some works, and (5) considering the workload balance between resources.
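The conflict-finding step described above can be sketched as checks over a task-resource allocation table. The tasks, resources, and two constraint rules below (a multiple-resource requirement and a per-resource workload limit, features (3) and (2) in the list) are hypothetical examples, not the paper's actual model.

```python
# Minimal sketch of conflict detection on a task–resource allocation
# network. All names and limits are invented for illustration.
tasks = {"drive_AM": {"requires": 1},
         "drive_PM": {"requires": 1},
         "maintenance": {"requires": 2}}        # needs multiple resources
allocation = {"drive_AM": ["crew1"],
              "drive_PM": ["crew1"],
              "maintenance": ["crew2"]}
MAX_TASKS_PER_RESOURCE = 1                      # workload constraint

def find_conflicts(tasks, allocation):
    conflicts = []
    # requirement check: each task must receive the resources it needs
    for task, spec in tasks.items():
        if len(allocation.get(task, [])) < spec["requires"]:
            conflicts.append(f"{task}: needs {spec['requires']} resource(s)")
    # workload check: count tasks allocated to each resource
    load = {}
    for task, crews in allocation.items():
        for c in crews:
            load[c] = load.get(c, 0) + 1
    for c, n in load.items():
        if n > MAX_TASKS_PER_RESOURCE:
            conflicts.append(f"{c}: allocated to {n} tasks "
                             f"(max {MAX_TASKS_PER_RESOURCE})")
    return conflicts

print(find_conflicts(tasks, allocation))
```

Repeatedly running such checks while editing the network is the "define constraints manually by finding conflicts" loop the method formalizes.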

  6. Pertinent anatomy and analysis for midface volumizing procedures.

    Science.gov (United States)

    Surek, Christopher C; Beut, Javier; Stephens, Robert; Jelks, Glenn; Lamb, Jerome

    2015-05-01

    The study was conducted to construct an anatomically inspired midfacial analysis facilitating safe, accurate, and dynamic nonsurgical rejuvenation. Emphasis is placed on determining injection target areas and adverse event zones. Twelve hemifacial fresh cadavers were dissected in a layered fashion. Dimensional measurements between the midfacial fat compartments, prezygomatic space, mimetic muscles, and neurovascular bundles were used to develop a topographic analysis for clinical injections. A longitudinal line from the base of the alar crease to the medial edge of the levator anguli oris muscle (1.9 cm), lateral edge of the levator anguli oris muscle (2.6 cm), and zygomaticus major muscle (4.6 cm) partitions the cheek into two aesthetic regions. A six-step facial analysis outlines three target zones and two adverse event zones and triangulates the point of maximum cheek projection. The lower adverse event zone yields an anatomical explanation for inadvertent jowling during anterior cheek injection. The upper adverse event zone localizes the palpebral branch of the infraorbital artery. The medial malar target area isolates quadrants for anterior cheek projection and tear trough effacement. The middle malar target area addresses lid-cheek blending and superficial compartment turgor. The lateral malar target area highlights lateral cheek projection and locates the prezygomatic space. This stepwise analysis illustrates target areas and adverse event zones to achieve midfacial support, contour, and profile in the repose position and simultaneous molding of a natural shape during animation. This reproducible method can be used both procedurally and in record-keeping for midface volumizing procedures.

  7. Construction of mammographic examination process ontology using bottom-up hierarchical task analysis.

    Science.gov (United States)

    Yagahara, Ayako; Yokooka, Yuki; Jiang, Guoqian; Tsuji, Shintarou; Fukuda, Akihisa; Nishimoto, Naoki; Kurowarabi, Kunio; Ogasawara, Katsuhiko

    2018-03-01

    Describing complex mammography examination processes is important for improving the quality of mammograms. It is often difficult for experienced radiologic technologists to explain the process because their techniques depend on their experience and intuition. In our previous study, we analyzed the process using a new bottom-up hierarchical task analysis and identified key components of the process. Leveraging the results of the previous study, the purpose of this study was to construct a mammographic examination process ontology to formally describe the relationships between the process and image evaluation criteria to improve the quality of mammograms. First, we identified and created root classes: task, plan, and clinical image evaluation (CIE). Second, we described an "is-a" relation referring to the result of the previous study and the structure of the CIE. Third, the procedural steps in the ontology were described using the new properties: "isPerformedBefore," "isPerformedAfter," and "isPerformedAfterIfNecessary." Finally, the relationships between tasks and CIEs were described using the "isAffectedBy" property to represent the influence of the process on image quality. In total, there were 219 classes in the ontology. By introducing new properties related to the process flow, a sophisticated mammography examination process could be visualized. In relationships between tasks and CIEs, it became clear that the tasks affecting the evaluation criteria related to positioning were greater in number than those for image quality. We developed a mammographic examination process ontology that makes knowledge explicit for a comprehensive mammography process. Our research will support education and help promote knowledge sharing about mammography examination expertise.
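The process-flow and influence properties named in the abstract ("isPerformedBefore," "isPerformedAfter," "isPerformedAfterIfNecessary," "isAffectedBy") can be rendered as plain subject-property-object triples. The property names follow the paper; the task and criterion instances below are invented for illustration.

```python
# Hedged sketch: a few ontology relations as plain triples. Property
# names are from the paper; the instances are hypothetical examples.
triples = [
    ("Positioning", "isPerformedBefore", "Compression"),
    ("Compression", "isPerformedAfter", "Positioning"),
    ("Repositioning", "isPerformedAfterIfNecessary", "ImageCheck"),
    ("PositioningCriterion", "isAffectedBy", "Positioning"),
]

def objects(subject, prop):
    """All objects related to `subject` by property `prop`."""
    return [o for s, p, o in triples if s == subject and p == prop]

print(objects("Positioning", "isPerformedBefore"))
```

A real implementation would use an ontology framework (e.g., OWL with object properties), but the query pattern, asking which tasks precede a step or which criteria a task affects, is the same.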

  8. State space analysis of timing: exploiting task redundancy to reduce sensitivity to timing.

    Science.gov (United States)

    Cohen, Rajal G; Sternad, Dagmar

    2012-01-01

    Timing is central to many coordinated actions, and the temporal accuracy of central nervous system commands presents an important limit to skilled performance. Using target-oriented throwing in a virtual environment as an example task, this study presents a novel analysis that quantifies contributions of timing accuracy and shaping of hand trajectories to performance. Task analysis reveals that the result of a throw is fully determined by the projectile position and velocity at release; zero error can be achieved by a manifold of position and velocity combinations (solution manifold). Four predictions were tested. 1) Performers learn to release the projectile closer to the optimal moment for a given arm trajectory, achieving timing accuracy levels similar to those reported in other timing tasks (~10 ms). 2) Performers develop a hand trajectory that follows the solution manifold such that zero error can be achieved without perfect timing. 3) Skilled performers exploit both routes to improvement more than unskilled performers. 4) Long-term improvement in skilled performance relies on continued optimization of the arm trajectory as timing limits are reached. Average and skilled subjects practiced for 6 and 15 days, respectively. In 6 days, both timing and trajectory alignment improved for all subjects, and skilled subjects showed an advantage in timing. With extended practice, performance continued to improve due to continued shaping of the trajectory, whereas timing accuracy reached an asymptote at 9 ms. We conclude that skilled subjects first maximize timing accuracy and then optimize trajectory shaping to compensate for intrinsic limitations of timing accuracy.
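The "solution manifold" idea, a whole family of release states that all produce zero error, can be illustrated with a point-mass projectile: for each release angle there is a speed that lands the projectile exactly on target. The gravity, release height, and target distance below are assumed values, not the study's virtual-task parameters.

```python
# Sketch of a solution manifold for a simple throw: all (angle, speed)
# pairs that hit a target at distance D from release height H.
# Parameter values are assumptions for illustration.
import math

G, H, D = 9.81, 1.0, 5.0   # gravity (m/s^2), release height (m), range (m)

def required_speed(theta):
    """Speed that lands the projectile exactly at distance D for angle
    theta (radians), from the ballistic range equation with height."""
    denom = 2 * math.cos(theta) ** 2 * (H + D * math.tan(theta))
    if denom <= 0:
        return None                      # no solution at this angle
    return math.sqrt(G * D ** 2 / denom)

manifold = [(round(math.degrees(t), 1), round(required_speed(t), 2))
            for t in (math.radians(a) for a in (10, 25, 40, 55))]
print(manifold)
```

Because every point on this curve yields zero error, a performer whose hand trajectory runs along the manifold near release is insensitive to small timing errors, which is exactly the second route to improvement the study tests.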

  9. Conversation analysis of the two-chair self-soothing task in emotion-focused therapy.

    Science.gov (United States)

    Sutherland, Olga; Peräkylä, Anssi; Elliott, Robert

    2014-01-01

    Despite an increasing recognition of the relevance and significance of self-compassion processes, little research has explored interventions that seek to enhance these in therapy. In this study, we examined the compassionate self-soothing task of emotion-focused therapy involving two-chair work, with seven clients. Conversation analysis was used to examine client-therapist interaction. The analysis yielded a detailed description of interactional practices and processes involved in the accomplishment of self-soothing, drawing on Goffman's concept of the participation frame. We show how therapists and clients collaborate to move from the ordinary frame of therapeutic conversation to a self-soothing frame and back again by using various interactional practices: therapists' instructions to clients, specific ways of sequencing actions in interaction, explanations and justification of the importance of the self-soothing task, pronouns as a way to distinguish among addressees (e.g., clients versus soothing agents), corrections of clients' talk, and response tokens (hm mm, yeah, good). These practices are used to help clients accomplish self-soothing in the form of self-praise, disclosing caring, and offering of helpful advice. This study offers therapists a specific account of how to respond to clients at specific junctures in self-soothing dialogues and how to structure and accomplish the self-soothing task.

  10. Fuzzy logic approach to SWOT analysis for economics tasks and example of its computer realization

    Directory of Open Access Journals (Sweden)

    Vladimir CHERNOV

    2016-07-01

    The article discusses the widely used classical method of analysis, forecasting, and decision-making for various economic problems known as SWOT analysis. It is a qualitative, multicriteria comparison of the degrees of Strength, Weakness, Opportunity, and Threat for different kinds of risks, for forecasting market developments, and for assessing the status and development prospects of enterprises, regions, and economic sectors. It can also be successfully applied to the evaluation and analysis of various project management tasks: investment, innovation, marketing, development, design, bringing products to market, and so on. In practical competitive market and economic conditions, however, there are various uncertainties, ambiguities, and forms of vagueness that make the use of SWOT analysis in its classical sense insufficiently justified and ineffective. For this case, the authors propose using a fuzzy logic approach and the theory of fuzzy sets for a more adequate representation and post-processing of assessments in the SWOT analysis. In particular, the mathematical formulation of the corresponding task and the main approaches to its solution are briefly presented. Examples are also given of suitable computer calculations in the specialized software Fuzicalc for processing and operating on fuzzy input data. Finally, considerations for interpreting the results are presented.
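The fuzzy-set idea behind such an analysis can be sketched with triangular fuzzy numbers: each SWOT factor is rated as a (low, mode, high) triple rather than a crisp score, ratings are averaged with fuzzy arithmetic, and the aggregates are compared after defuzzification. The ratings and the centroid defuzzifier below are illustrative assumptions, not the article's actual model or Fuzicalc's method.

```python
# Sketch of fuzzy SWOT scoring with triangular fuzzy numbers (TFNs),
# represented as (low, mode, high). Ratings are invented examples.
def tfn_add(a, b):
    return tuple(x + y for x, y in zip(a, b))

def tfn_scale(a, k):
    return tuple(x * k for x in a)

def centroid(a):
    """Defuzzify a TFN by its centroid (mean of the three vertices)."""
    return sum(a) / 3.0

def aggregate(ratings):
    total = (0.0, 0.0, 0.0)
    for r in ratings:
        total = tfn_add(total, r)
    return tfn_scale(total, 1.0 / len(ratings))

strengths = [(6, 7, 9), (5, 6, 8)]     # expert ratings, hypothetical
weaknesses = [(3, 4, 6), (2, 3, 5)]

s, w = aggregate(strengths), aggregate(weaknesses)
print(centroid(s) - centroid(w))       # positive -> strengths dominate
```

Keeping the full (low, mode, high) triple until the final comparison is what lets the method carry the experts' uncertainty through the analysis instead of discarding it at the rating stage.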

  11. Thermal-Hydraulic Analysis Tasks for ANAV NPPs in Support of Plant Operation and Control

    Directory of Open Access Journals (Sweden)

    L. Batet

    2007-11-01

    Thermal-hydraulic analysis tasks aimed at supporting plant operation and control of nuclear power plants are an important issue for the Asociación Nuclear Ascó-Vandellòs (ANAV). ANAV is the consortium that runs the Ascó power plants (2 units) and the Vandellòs-II power plant. The reactors are Westinghouse-design, 3-loop PWRs with an approximate electrical power of 1000 MW. The Technical University of Catalonia (UPC) thermal-hydraulic analysis team has worked jointly with ANAV engineers at different levels in the analysis and improvement of these reactors. This article is an illustration of the usefulness of computational analysis for operational support. The contents presented were operational between 1985 and 2001 and subsequently changed slightly following various organizational adjustments. The paper has two different parts. In the first part, it describes the specific aspects of thermal-hydraulic analysis tasks related to operation and control and, in the second part, it briefly presents the results of three examples of analyses that were performed. All the presented examples are related to actual situations in which the scenarios were studied by analysts using thermal-hydraulic codes and prepared nodalizations. The paper also includes a qualitative evaluation of the benefits obtained by ANAV through thermal-hydraulic analyses aimed at supporting operation and plant control.

  12. Benefit-Cost Analysis of Integrated Paratransit Systems : Volume 6. Technical Appendices.

    Science.gov (United States)

    1979-09-01

    This last volume, includes five technical appendices which document the methodologies used in the benefit-cost analysis. They are the following: Scenario analysis methodology; Impact estimation; Example of impact estimation; Sensitivity analysis; Agg...

  13. Efficacy of bronchoscopic lung volume reduction: a meta-analysis

    Directory of Open Access Journals (Sweden)

    Iftikhar IH

    2014-05-01

    Imran H Iftikhar,1 Franklin R McGuire,1 Ali I Musani2; 1Department of Medicine, Division of Pulmonary, Critical Care and Sleep Medicine, University of South Carolina, Columbia, SC, USA; 2Department of Medicine, Division of Pulmonary, Critical Care and Sleep Medicine, National Jewish Health, Denver, CO, USA. Background: Over the last several years, the morbidity, mortality, and high costs associated with lung volume reduction (LVR) surgery have fuelled the development of different methods for bronchoscopic LVR (BLVR) in patients with emphysema. In this meta-analysis, we sought to study and compare the efficacy of most of these methods. Methods: Eligible studies were retrieved from PubMed and Embase for the following BLVR methods: one-way valves, sealants (BioLVR), LVR coils, airway bypass stents, and bronchial thermal vapor ablation. Primary study outcomes included the mean post-intervention change in lung function tests, the 6-minute walk distance, and the St George's Respiratory Questionnaire. Secondary outcomes included treatment-related complications. Results: Except for the airway bypass stents, all other methods of BLVR showed efficacy in primary outcomes. In comparison, however, the BioLVR method showed the most significant findings and was the least associated with major treatment-related complications. For the BioLVR method, the mean change in forced expiratory volume in the first second was 0.18 L (95% confidence interval [CI]: 0.09 to 0.26; P<0.001); in 6-minute walk distance it was 23.98 m (95% CI: 12.08 to 35.88; P<0.01); and in St George's Respiratory Questionnaire it was −8.88 points (95% CI: −12.12 to −5.64; P<0.001). Conclusion: The preliminary findings of our meta-analysis signify the importance of most methods of BLVR. The magnitude of the effect on selected primary outcomes shows noninferiority, if not equivalence, when compared to what is known for surgical LVR. Keywords: emphysema, endobronchial valves, sealants, stents, coils
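Summary estimates like the mean changes quoted above are typically produced by inverse-variance pooling of per-study effects. The sketch below shows the standard fixed-effect computation on invented per-study numbers; it is not the BLVR data and omits the heterogeneity handling a full meta-analysis would include.

```python
# Illustrative fixed-effect meta-analysis: inverse-variance pooling of
# per-study mean changes. Study data are hypothetical.
import math

studies = [   # (mean change, standard error), invented values
    (0.20, 0.05),
    (0.15, 0.04),
    (0.22, 0.08),
]

weights = [1 / se ** 2 for _, se in studies]          # inverse variance
pooled = sum(w * m for (m, _), w in zip(studies, weights)) / sum(weights)
se_pooled = math.sqrt(1 / sum(weights))
ci = (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)
print(round(pooled, 3), [round(x, 3) for x in ci])
```

Studies with smaller standard errors dominate the pooled estimate, which is why a single large trial can outweigh several small ones in the summary figures.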

  14. Implementation of Hierarchical Task Analysis for User Interface Design in Drawing Application for Early Childhood Education

    Directory of Open Access Journals (Sweden)

    Mira Kania Sabariah

    2016-05-01

    Learning to draw in early childhood is an important lesson, rich in stimulation of children's growth and development, and helps train fine motor skills. Many applications, including interactive learning applications, can be used for such learning. Observations we conducted showed that the experiences offered by existing applications are very diverse and have not been able to represent the learning model and characteristics of early childhood (4-6 years). Based on these results, the Hierarchical Task Analysis method generated a list of tasks that must be performed in designing a user interface that represents the user experience in drawing instruction. Evaluation with the Heuristic Evaluation method then showed that the usability of the model reached a very good level of understanding and that the model can be enhanced to produce a better one.
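A hierarchical task analysis decomposes a goal into numbered subtasks read depth-first, and that structure maps directly onto a tree. The drawing-app tasks below are hypothetical examples, not the paper's actual task list.

```python
# Minimal sketch of an HTA tree for a drawing app; tasks are invented.
hta = {
    "0. draw a picture": {
        "1. choose a tool": {"1.1 open palette": {}, "1.2 tap tool": {}},
        "2. draw on canvas": {},
        "3. save drawing": {"3.1 tap save": {}, "3.2 confirm": {}},
    }
}

def flatten(tree, depth=0, out=None):
    """Depth-first listing of tasks, the order in which an HTA is read."""
    out = [] if out is None else out
    for name, subtasks in tree.items():
        out.append("  " * depth + name)
        flatten(subtasks, depth + 1, out)
    return out

print("\n".join(flatten(hta)))
```

Each leaf of such a tree is a candidate UI interaction, which is how a task list like this feeds directly into interface design and later heuristic evaluation.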

  15. F-16 Task Analysis Criterion-Referenced Objective and Objectives Hierarchy Report. Volume I

    Science.gov (United States)

    1981-03-01


  16. F-16 Task Analysis Criterion-Referenced Objective and Objectives Hierarchy Report. Volume 3

    Science.gov (United States)

    1981-03-01


  17. Feasibility study of modern airships, phase 1. Volume 2: Parametric analysis (task 3). [lift, weight (mass)

    Science.gov (United States)

    Lancaster, J. W.

    1975-01-01

    Various types of lighter-than-air vehicles from fully buoyant to semibuoyant hybrids were examined. Geometries were optimized for gross lifting capabilities for ellipsoidal airships, modified delta planform lifting bodies, and a short-haul, heavy-lift vehicle concept. It is indicated that: (1) neutrally buoyant airships employing a conservative update of materials and propulsion technology provide significant improvements in productivity; (2) propulsive lift for VTOL and aerodynamic lift for cruise significantly improve the productivity of low to medium gross weight ellipsoidal airships; and (3) the short-haul, heavy-lift vehicle, consisting of a simple combination of an ellipsoidal airship hull and existing helicopter componentry, provides significant potential for low-cost, near-term applications for ultra-heavy lift missions.

  18. EMG amplitude, fatigue threshold, and time to task failure: A meta-analysis.

    Science.gov (United States)

    McCrary, J Matt; Ackermann, Bronwen J; Halaki, Mark

    2017-11-11

    Electromyographic (EMG) fatigue threshold (EMGFT) is utilised as a correlate of critical power, torque, and force thresholds that establishes a theoretical exercise intensity (the power, torque, or force at which the rate of change of EMG amplitude, ΔEMG, is zero) below which neuromuscular fatigue is negligible and unpredictable. Recent studies demonstrating neuromuscular fatigue below critical thresholds raise questions about the construct validity of EMGFT. The purpose of this analysis is to evaluate the construct validity of EMGFT by aggregating ΔEMG and time to task failure (Tlim) data. Design: Meta-analysis. Methods: A database search of MEDLINE, SPORTDiscus, Web of Science, and Cochrane (inception to September 2016) was conducted using terms relevant to EMG and muscle fatigue. Inclusion criteria were studies reporting agonist muscle EMG amplitude data during constant-force voluntary isometric contractions taken to task failure. Linear and nonlinear regression models were used to relate ΔEMG and Tlim data extracted from included studies. Results: Regression analyses included data from 837 healthy adults from 43 studies. Relationships between ΔEMG and Tlim were strong in both nonlinear (R² = 0.65) and linear (R² = 0.82) models. ΔEMG at EMGFT was significantly nonzero overall and in 3 of 5 cohorts in the nonlinear model. Conclusions: EMGFT lacks face validity as currently calculated; models for more precise EMGFT calculation are proposed. A new framework for prediction of task failure using EMG amplitude data alone is presented. The ΔEMG vs. Tlim relationship remains consistent across sexes and force vs. position tasks. Copyright © 2017 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.
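The study's core computation is a regression relating the EMG amplitude rate of change to time to task failure. The sketch below fits a simple linear model of ΔEMG against 1/Tlim and reports R²; both the data points and the 1/Tlim functional form are assumptions for illustration, not the paper's fitted model.

```python
# Sketch: ordinary least squares relating EMG amplitude rate of change
# to time to task failure. Data and model form are synthetic assumptions.
def linear_fit(xs, ys):
    """Return intercept, slope, and R^2 of a simple OLS fit."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    ss_res = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return a, b, 1 - ss_res / ss_tot

t_lim = [30, 60, 120, 300, 600]          # seconds to failure, synthetic
d_emg = [2.0, 1.1, 0.55, 0.21, 0.12]     # %MVC/s, synthetic

a, b, r2 = linear_fit([1 / t for t in t_lim], d_emg)
print(round(r2, 2))
```

Under this model the intercept a is the predicted ΔEMG as Tlim grows without bound; the construct-validity question above is essentially whether that limiting value is zero.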

  19. Analysis of respiratory pressure-volume curves in intensive care medicine using inductive machine learning.

    Science.gov (United States)

    Ganzert, Steven; Guttmann, Josef; Kersting, Kristian; Kuhlen, Ralf; Putensen, Christian; Sydow, Michael; Kramer, Stefan

    2002-01-01

    We present a case study of machine learning and data mining in intensive care medicine. In the study, we compared different methods of measuring pressure-volume curves in artificially ventilated patients suffering from the adult respiratory distress syndrome (ARDS). Our aim was to show that inductive machine learning can be used to gain insights into differences and similarities among these methods. We defined two tasks: the first one was to recognize the measurement method producing a given pressure-volume curve. This was defined as the task of classifying pressure-volume curves (the classes being the measurement methods). The second was to model the curves themselves, that is, to predict the volume given the pressure, the measurement method and the patient data. Clearly, this can be defined as a regression task. For these two tasks, we applied C5.0 and CUBIST, two inductive machine learning tools, respectively. Apart from medical findings regarding the characteristics of the measurement methods, we found some evidence showing the value of an abstract representation for classifying curves: normalization and high-level descriptors from curve fitting played a crucial role in obtaining reasonably accurate models. Another useful feature of algorithms for inductive machine learning is the possibility of incorporating background knowledge. In our study, the incorporation of patient data helped to improve regression results dramatically, which might open the door for the individual respiratory treatment of patients in the future.
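The two tasks can be illustrated with stand-in methods: each synthetic pressure-volume curve is reduced to simple shape descriptors (a crude version of the "high-level descriptors from curve fitting" mentioned above) and classified by a nearest-centroid rule in place of C5.0. Everything here, the curve shapes, the two "measurement methods," and the classifier, is an invented sketch, not the study's data or algorithms.

```python
# Hedged sketch of curve classification via fitted descriptors.
# Curve shapes, methods, and classifier are illustrative assumptions.
import math
import random

random.seed(0)
pressure = [5 + 1.5 * i for i in range(21)]            # cmH2O, 5..35

def make_curve(method):
    # two hypothetical measurement methods with different curve shapes
    scale, tau = (40.0, 12.0) if method == 0 else (55.0, 8.0)
    return [scale * (1 - math.exp(-p / tau)) + random.gauss(0, 0.5)
            for p in pressure]

def descriptors(volume):
    """Normalized mid-curve height and overall slope of the curve."""
    span = volume[-1] - volume[0]
    return (volume[10] / volume[-1], span / (pressure[-1] - pressure[0]))

data = [(descriptors(make_curve(m)), m) for m in (0, 1) for _ in range(10)]
cents = {m: tuple(sum(d[i] for d, lab in data if lab == m) / 10
                  for i in (0, 1)) for m in (0, 1)}

def classify(d):
    return min(cents, key=lambda m: math.dist(d, cents[m]))

accuracy = sum(classify(d) == lab for d, lab in data) / len(data)
print(accuracy)
```

The point mirrors the paper's finding: classification works on abstract curve descriptors rather than raw samples, because normalization and fitted shape parameters carry the method-specific information.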

  20. Analysis of individual tree volume equations for Cupressus ...

    African Journals Online (AJOL)

    Three different volume equations were fitted to individual tree volume (V) data collected on 260 Cupressus lusitanica trees from 49 plantations in Munessa Shashemene Forest, Ethiopia. The data were first split randomly into equation development and equation testing data sets of equal size. Diameter at breast height (D) ...
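One common form for individual-tree volume equations is the log-linear allometric model ln(V) = a + b·ln(D), fitted by least squares and back-transformed for prediction. The sketch below uses invented diameter-volume pairs, not the Cupressus lusitanica data, and omits the bias correction a careful back-transformation would apply.

```python
# Hedged sketch: fitting ln(V) = a + b*ln(D) by least squares.
# Tree measurements are invented examples.
import math

# (diameter at breast height in cm, stem volume in m^3), synthetic
trees = [(10, 0.04), (15, 0.11), (20, 0.25), (25, 0.45), (30, 0.75)]

xs = [math.log(d) for d, _ in trees]
ys = [math.log(v) for _, v in trees]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
     / sum((x - mx) ** 2 for x in xs))
a = my - b * mx

def predict(d):
    """Back-transformed volume estimate for diameter d (cm)."""
    return math.exp(a + b * math.log(d))

print(round(predict(22), 3))
```

Splitting the data into development and testing halves, as the abstract describes, would mean fitting a and b on one half and scoring predict() on the other.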

  1. A Genetic Analysis of Brain Volumes and IQ in Children

    Science.gov (United States)

    van Leeuwen, Marieke; Peper, Jiska S.; van den Berg, Stephanie M.; Brouwer, Rachel M.; Hulshoff Pol, Hilleke E.; Kahn, Rene S.; Boomsma, Dorret I.

    2009-01-01

    In a population-based sample of 112 nine-year old twin pairs, we investigated the association among total brain volume, gray matter and white matter volume, intelligence as assessed by the Raven IQ test, verbal comprehension, perceptual organization and perceptual speed as assessed by the Wechsler Intelligence Scale for Children-III. Phenotypic…

  2. A genetic analysis of brain volumes and IQ in children

    NARCIS (Netherlands)

    van Leeuwen, Marieke; Peper, Jiska S.; van den Berg, Stéphanie Martine; Brouwer, Rachel M.; Hulshoff Pol, Hilleke E.; Kahn, Rene S.; Boomsma, Dorret I.

    2009-01-01

    In a population-based sample of 112 nine-year old twin pairs, we investigated the association among total brain volume, gray matter and white matter volume, intelligence as assessed by the Raven IQ test, verbal comprehension, perceptual organization and perceptual speed as assessed by the Wechsler

  3. One Size Does Not Fit All: Human Failure Event Decomposition and Task Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ronald Laurids Boring, PhD

    2014-09-01

    In the probabilistic safety assessments (PSAs) used in the nuclear industry, human failure events (HFEs) are determined as a subset of hardware failures, namely those hardware failures that could be triggered or exacerbated by human action or inaction. This approach is top-down, starting with hardware faults and deducing human contributions to those faults. Elsewhere, more traditionally human factors driven approaches would tend to look at opportunities for human errors first in a task analysis and then identify which of those errors is risk significant. The intersection of top-down and bottom-up approaches to defining HFEs has not been carefully studied. Ideally, both approaches should arrive at the same set of HFEs. This question remains central as human reliability analysis (HRA) methods are generalized to new domains like oil and gas. The HFEs used in nuclear PSAs tend to be top-down—defined as a subset of the PSA—whereas the HFEs used in petroleum quantitative risk assessments (QRAs) are more likely to be bottom-up—derived from a task analysis conducted by human factors experts. The marriage of these approaches is necessary in order to ensure that HRA methods developed for top-down HFEs are also sufficient for bottom-up applications. In this paper, I first review top-down and bottom-up approaches for defining HFEs and then present a seven-step guideline to ensure a task analysis completed as part of human error identification decomposes to a level suitable for use as HFEs. This guideline illustrates an effective way to bridge the bottom-up approach with top-down requirements.

  4. Yucca Mountain transportation routes: Preliminary characterization and risk analysis; Volume 2, Figures [and] Volume 3, Technical Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Souleyrette, R.R. II; Sathisan, S.K.; di Bartolo, R. [Nevada Univ., Las Vegas, NV (United States). Transportation Research Center

    1991-05-31

    This report presents appendices related to the preliminary assessment and risk analysis for high-level radioactive waste transportation routes to the proposed Yucca Mountain Project repository. Information includes data on population density, traffic volume, ecologically sensitive areas, and accident history.

  5. A Qualitative Force Structure Analysis of the Global Mobility Task Force

    Science.gov (United States)

    2003-06-01

    Acronyms used include GSTF (Global Strike Task Force), HLSTF (Homeland Security Task Force), HUMRO (Humanitarian Relief Operation), and ISR (Intelligence, Surveillance, and Reconnaissance). The task forces considered include the S&C4ISR Task Force (S&C4ISRTF), the Global Strike Task Force (GSTF), the Global Response Task Force (GRTF), and the Homeland Security Task Force (HLSTF). These are intended to enable the "GSTF and GRTF to deploy and employ rapidly anywhere in the world at any time" (DAF, 2002:16). Therefore, the GMTF has three key

  6. Self-narrative reconstruction in emotion-focused therapy: A preliminary task analysis.

    Science.gov (United States)

    Cunha, Carla; Mendes, Inês; Ribeiro, António P; Angus, Lynne; Greenberg, Leslie S; Gonçalves, Miguel M

    2017-11-01

    This research explored the consolidation phase of emotion-focused therapy (EFT) for depression and studied, through a task-analysis method, how client-therapist dyads evolved from the exploration of the problem to self-narrative reconstruction. Innovative moments (IMs) were used to situate the process of self-narrative reconstruction within sessions, particularly through reconceptualization and performing-change IMs. We contrasted the observation of these occurrences with a previously built rational model of self-narrative reconstruction. This study presents the rational model and the revised rational-empirical model of the self-narrative reconstruction task in three EFT dyads, suggesting nine steps necessary for task resolution: (1) Explicit recognition of differences in the present and steps in the path of change; (2) Development of a meta-perspective contrast between present self and past self; (3) Amplification of contrast in the self; (4) A positive appreciation of changes is conveyed; (5) Occurrence of feelings of empowerment, competence, and mastery; (6) Reference to difficulties still present; (7) Emphasis on the loss of centrality of the problem; (8) Perception of change as a gradual, developing process; and (9) Reference to projects, experiences of change, or elaboration of new plans. Central aspects of therapist activity in facilitating the client's progression along these nine steps are also elaborated.

  7. Electroencephalogram complexity analysis in children with attention-deficit/hyperactivity disorder during a visual cognitive task.

    Science.gov (United States)

    Zarafshan, Hadi; Khaleghi, Ali; Mohammadi, Mohammad Reza; Moeini, Mahdi; Malmir, Nastaran

    2016-01-01

    The aim of this study was to investigate electroencephalogram (EEG) dynamics using complexity analysis in children with attention-deficit/hyperactivity disorder (ADHD), compared with healthy control children, when performing a cognitive task. Thirty 7-12-year-old children meeting Diagnostic and Statistical Manual of Mental Disorders-Fifth Edition (DSM-5) criteria for ADHD and 30 healthy control children underwent an EEG evaluation during a cognitive task, and Lempel-Ziv complexity (LZC) values were computed. There were no significant differences between the ADHD and control groups in age and gender. The mean LZC of the children with ADHD was significantly larger than that of the healthy children over the right anterior and right posterior regions during the cognitive performance. In the ADHD group, complexity of the right hemisphere was higher than that of the left hemisphere, whereas in the normal group complexity of the left hemisphere was higher than that of the right. Although fronto-striatal dysfunction is considered conclusive evidence for the pathophysiology of ADHD, our mental arithmetic task provided evidence of structural and functional changes in the posterior regions, and probably the cerebellum, in ADHD.
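LZC is computed by binarizing the signal around a reference level and counting the distinct phrases produced by a Lempel-Ziv parsing of the resulting bit string. The sketch below uses a simplified dictionary-based (LZ78-style) parsing with mean-level binarization; published EEG LZC values typically use the LZ76 scheme and median binarization, so treat this as an illustration of the principle, not a drop-in replica.

```python
# Sketch of a Lempel–Ziv complexity measure: binarize the signal, then
# count distinct phrases in a simplified LZ78-style parsing.
import random

def lzc(signal):
    mean = sum(signal) / len(signal)
    s = "".join("1" if x > mean else "0" for x in signal)
    phrases = set()
    i, k, count = 0, 1, 0
    while i + k <= len(s):
        word = s[i:i + k]
        if word in phrases:
            k += 1                   # extend until the word is new
        else:
            phrases.add(word)
            count += 1
            i += k
            k = 1
    if i < len(s):
        count += 1                   # trailing already-seen fragment
    return count

regular = [0, 1] * 256                         # periodic, low complexity
random.seed(1)
noisy = [random.random() for _ in range(512)]  # irregular, higher LZC
print(lzc(regular), lzc(noisy))
```

A periodic signal yields few, rapidly lengthening phrases while an irregular one keeps producing short new phrases, which is why higher LZC is read as more irregular cortical activity in studies like the one above.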

  8. Utilizing job/task analysis to establish content validity in the design of training programs

    Energy Technology Data Exchange (ETDEWEB)

    Nay, W.E.

    1988-01-01

    The decade of the 1980s has been a turbulent time for the Department of Energy. With concern mounting about the terrorist threat, a wave of congressional inquiries and internal inspections crossed the nation and engulfed many of the nuclear laboratories and facilities operated by DOE contractors. A typical finding was the need to improve, and increase, the training of the protective force. The immediate reaction resulted in a wide variety of responses, with most contractors feeling safer with too much, rather than not enough, training. As soon as the initial pressures to upgrade subsided, a task force was established to evaluate the overall training needs. Representatives from the contractor facilities worked together to conduct a job analysis of the protective force. A generic task inventory was established and validated at the different sites. This list has been invaluable for determining the tasks, conditions, and standards needed to develop well-stated learning objectives. The enhanced training programs are being refined to ensure job content validity based on the data collected.

  9. Analysis of Mexico wind tunnel measurements. Final report of IEA Task 29, Mexnext (Phase 1)

    Energy Technology Data Exchange (ETDEWEB)

    Schepers, J.G.; Boorsma, K. [Energy research Center of the Netherlands ECN, Petten (Netherlands); Cho, T. [Korea Aerospace Research Institute KARI, Daejeon (Korea, Republic of); Gomez-Iradi, S. [National Renewable Energy Center of Spain CENER, Sarriguren (Spain); Schaffarczyk, P. [A. Jeromin University of Applied Sciences, CEWind EG, Kiel (Germany); Shen, W.Z. [The Technical University of Denmark, Kongens Lyngby (Denmark); Lutz, T. [K. Meister University of Stuttgart, Stuttgart (Germany); Stoevesandt, B. [ForWind, Zentrum fuer Windenergieforschung, Oldenburg (Germany); Schreck, S. [National Renewable Energy Laboratory NREL, Golden, CO (United States); Micallef, D.; Pereira, R.; Sant, T. [Delft University of Technology TUD, Delft (Netherlands); Madsen, H.A.; Soerensen, N. [Risoe-DTU, Roskilde (Denmark)

    2012-02-15

    This report describes the work performed within the first phase of IEA Task 29 Mexnext. In this IEA Task 29 a total of 20 organisations from 11 different countries collaborated in analysing the measurements which have been performed in the EU project 'Mexico'. Within this Mexico project 9 European institutes carried out a wind tunnel experiment in the Large Low Speed Facility (LLF) of the German Dutch Wind Facilities DNW on a rotor with a diameter of 4.5 m. Pressure distributions were measured at five locations along the blade along with detailed flow field measurements around the rotor plane using stereo PIV. As a result of the international collaboration within this task a very thorough analysis of the data could be carried out and a large number of codes were validated not only in terms of loads but also in terms of underlying flow field. The detailed pressure measurements along the blade in combination with the detailed flow field measurements gave a unique opportunity to better understand the response of a wind turbine to the incoming flow field. Deficiencies in modelling have been established and directions for model improvement can be given.

  10. Reliability of steam-turbine rotors. Task 1. Lifetime prediction analysis system. Final report

    International Nuclear Information System (INIS)

    Nair, P.K.; Pennick, H.G.; Peters, J.E.; Wells, C.H.

    1982-12-01

    Task 1 of RP 502, Reliability of Steam Turbine Rotors, resulted in the development of a computerized lifetime prediction analysis system (STRAP) for the automatic evaluation of rotor integrity based upon the results of a boresonic examination of near-bore defects. Concurrently, an advanced boresonic examination system (TREES), designed to acquire data automatically for lifetime analysis, was developed and delivered to the maintenance shop of a major utility. This system and a semi-automated, state-of-the-art system (BUCS) were evaluated on two retired rotors as part of the Task 2 effort. A modified nonproprietary version of STRAP, called SAFER, is now available for rotor lifetime prediction analysis. STRAP and SAFER share a common fracture analysis postprocessor for rapid evaluation of either conventional boresonic amplitude data or TREES cell data. The final version of this postprocessor contains general stress intensity correlations for elliptical cracks in a radial stress gradient and provision for elastic-plastic instability of the ligament between an embedded crack and the bore surface. Both linear elastic and ligament rupture models were developed for rapid analysis of linkup within three-dimensional clusters of defects. Bore stress-rupture criteria are included, but a creep-fatigue crack growth database is not available. Physical and mechanical properties of air-melt 1CrMoV forgings are built into the program; however, only bounding values of fracture toughness versus temperature are available. Owing to the lack of data regarding the probability of flaw detection for the boresonic systems and of quantitative verification of the flaw linkup analysis, automatic evaluation of boresonic results is not recommended, and the lifetime prediction system is currently restricted to conservative, deterministic analysis of specified flaw geometries.

  11. Muscle Fatigue Analysis of the Deltoid during Three Head-Related Static Isometric Contraction Tasks

    Directory of Open Access Journals (Sweden)

    Wenxiang Cui

    2017-05-01

    This study aimed to investigate the fatiguing characteristics of muscle-tendon units (MTUs) within skeletal muscles during static isometric contraction tasks. The deltoid was selected as the target muscle, and three head-related static isometric contraction tasks were designed to activate the three heads of the deltoid in different modes. Nine male subjects participated in this study. Surface electromyography (SEMG) signals were collected synchronously from the three heads of the deltoid. The performance of five SEMG parameters in quantifying fatigue, namely root mean square (RMS), mean power frequency (MPF), the first coefficient of an autoregressive model (ARC1), sample entropy (SE), and Higuchi's fractal dimension (HFD), was first evaluated in terms of sensitivity to variability ratio (SVR) and consistency. The HFD parameter was then selected as the fatigue index for further muscle fatigue analysis. The experimental results demonstrated that the three deltoid heads presented different activation modes during the three head-related fatiguing contractions. The fatiguing characteristics of the three heads were found to be task-dependent, and the heads kept at a relatively high activation level were more prone to fatigue. In addition, the differences in fatiguing rate between heads increased with the increase in load. The findings of this study can be helpful in better understanding the underlying neuromuscular control strategies of the central nervous system (CNS). Based on the results of this study, the CNS is thought to control the contraction of the deltoid by taking the three heads as functional units, but a certain synergy among heads might also exist to accomplish a contraction task.
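
    Higuchi's fractal dimension (HFD), selected above as the fatigue index, can be computed directly from raw SEMG samples. A minimal NumPy sketch of the standard Higuchi algorithm (the function name and the `k_max` choice are illustrative, not taken from the study):

```python
import numpy as np

def higuchi_fd(x, k_max=8):
    """Estimate Higuchi's fractal dimension of a 1-D signal."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    log_inv_k, log_L = [], []
    for k in range(1, k_max + 1):
        Lk = []
        for m in range(k):
            idx = np.arange(m, n, k)          # subsampled curve x[m], x[m+k], ...
            if len(idx) < 2:
                continue
            dist = np.abs(np.diff(x[idx])).sum()
            # Higuchi's normalisation for unequal subseries lengths
            norm = (n - 1) / ((len(idx) - 1) * k)
            Lk.append(dist * norm / k)
        log_inv_k.append(np.log(1.0 / k))
        log_L.append(np.log(np.mean(Lk)))
    # the fractal dimension is the slope of log L(k) against log(1/k)
    slope, _ = np.polyfit(log_inv_k, log_L, 1)
    return slope
```

    In practice such an index would be tracked over successive SEMG windows for each deltoid head; a smooth signal yields values near 1, while noise-like signals approach 2.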

  12. Task and error analysis balancing benefits over business of electronic medical records.

    Science.gov (United States)

    Carstens, Deborah Sater; Rodriguez, Walter; Wood, Michael B

    2014-01-01

    Task and error analysis research was performed to identify: a) the process by which healthcare organisations manage healthcare for patients with mental illness or substance abuse; b) how the process can be enhanced; and c) whether electronic medical records (EMRs) have a role in this process from a business and safety perspective. The research question is whether EMRs have a role in enhancing the healthcare of patients with mental illness or substance abuse. A discussion on the business of EMRs is included to clarify the balancing act between the safety and business aspects of an EMR.

  13. Task analysis and structure scheme for center manager station in large container inspection system

    International Nuclear Information System (INIS)

    Li Zheng; Gao Wenhuan; Wang Jingjin; Kang Kejun; Chen Zhiqiang

    1997-01-01

    LCIS works as follows: the accelerator generates beam pulses, which are formed into a fan shape; the scanning system drags a lorry with a container through the beam at a constant speed; the detector array detects the beam penetrating the lorry; and the projection data acquisition system reads the projections and assembles an inspection image of the lorry. All of these operations are controlled and synchronized by the center manager station. The author describes the process of projection data acquisition in scanning mode and the methods of real-time projection data processing. The task analysis and structure scheme of the center manager station are presented.

  14. Wristbands as aids to reduce misidentification: an ethnographically guided task analysis.

    Science.gov (United States)

    Smith, Andrew F; Casey, Kate; Wilson, James; Fischbacher-Smith, Denis

    2011-10-01

    Wristbands are recommended in the UK as a means of verifying patient identity but have been little studied. We aimed to document how wristbands are used in practice through a task analysis of wristband application and use, drawing on qualitative analysis of workplace observation of, and interviews with, clinical and non-clinical staff at two acute district general hospitals in northern England. Our findings indicate high levels of awareness amongst clinical staff of local and national policies on wristband use, but some ambiguity about the details therein. In contrast, non-clinical staff such as ward clerks and porters were less aware of policy, although their actions also expose patients to risks resulting from misidentification. Of seven subtasks identified by the task analysis of wristband application and use, three appeared to offer particular opportunity for error. Making the decision to apply a wristband, especially for emergency patients, is important because delay in application can delay correct identification. Advance preparation of wristbands for elective admission without the patient being present risks erroneous data or misapplication. Lastly, utilization of wristbands to verify patient identity was greater in some clinical circumstances (blood transfusion and medication administration) than in others (before transferring patients around the hospital and during handovers of care). Wristbands for patient identification are not being used to their full potential. Attention to detail in application and use, especially during handover and transfer, and an appreciation of the role played by 'non-clinical' staff, may offer further gains in patient safety.

  15. Nonlinear analysis of electroencephalogram at rest and during cognitive tasks in patients with schizophrenia

    Science.gov (United States)

    Carlino, Elisa; Sigaudo, Monica; Pollo, Antonella; Benedetti, Fabrizio; Mongini, Tullia; Castagna, Filomena; Vighetti, Sergio; Rocca, Paola

    2012-01-01

    Background: In spite of the large number of studies on schizophrenia, a full understanding of its core pathology still eludes us. The application of the nonlinear theory of electroencephalography (EEG) analysis provides an interesting tool to differentiate between physiologic conditions (e.g., resting state and mathematical task) and normal and pathologic brain activities. The aim of the present study was to investigate nonlinear EEG activity in patients with schizophrenia. Methods: We recorded 19-lead EEGs in patients with stable schizophrenia and healthy controls under 4 different conditions: eyes closed, eyes open, forward counting and backward counting. A nonlinear measure of complexity was calculated by means of the correlation dimension (D2). Results: We included 17 patients and 17 controls in our analysis. Comparing the 2 populations, we observed greater D2 values in the patient group. In controls, increased D2 values were observed during active states (eyes open and the 2 cognitive tasks) compared with baseline conditions. This increase of brain complexity, which can be interpreted as an increase of information processing and integration, was not preserved in the patient population. Limitations: Patients with schizophrenia were taking antipsychotic medications, so the presence of medication effects cannot be excluded. Conclusion: Our results suggest that patients with schizophrenia present changes in brain activity compared with healthy controls, and this pathologic alteration can be successfully studied with nonlinear EEG analysis. PMID:22353633
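
    The correlation dimension (D2) used above as the complexity measure is commonly estimated with the Grassberger-Procaccia algorithm: compute the correlation sum C(r), the fraction of state-space point pairs closer than a radius r, and take the slope of log C(r) versus log r in the scaling region. A brute-force sketch (illustrative only; real EEG analysis would first delay-embed the signal and choose the scaling region carefully):

```python
import numpy as np

def correlation_dimension(points, r_vals):
    """Grassberger-Procaccia estimate of the correlation dimension D2.

    points: (n, d) array of state-space points (e.g. delay-embedded EEG).
    r_vals: radii at which the correlation sum C(r) is evaluated.
    """
    pts = np.asarray(points, dtype=float)
    n = len(pts)
    # all pairwise Euclidean distances, keeping each pair (i < j) once
    d = np.sqrt(((pts[:, None, :] - pts[None, :, :]) ** 2).sum(-1))
    dists = d[np.triu_indices(n, k=1)]
    # correlation sum: fraction of pairs closer than r
    C = np.array([(dists < r).mean() for r in r_vals])
    # D2 is the slope of log C(r) against log r
    slope, _ = np.polyfit(np.log(r_vals), np.log(C), 1)
    return slope
```

    As a sanity check, points sampled on a circle give D2 near 1 and points filling a square give D2 near 2.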

  16. Modeling and strength analysis of the prototype of the multi-tasking car trailer

    Directory of Open Access Journals (Sweden)

    Posiadała Bogdan

    2018-01-01

    The subject of the paper is modeling and strength analysis of a prototype multi-tasking car trailer. The innovativeness of the solution applied consists in the combination of several single-purpose trailers into one, allowing for multitasking. Using the author's solution, the prototype of a multi-purpose trailer has been developed, which has the characteristics of several specialized trailers, such as those for the transportation of general cargo, kayaks, motorcycles, quads, etc. The model has been created using the SolidWorks package, and a full strength analysis has been carried out with its simulation module. The sample analysis results cover various load configurations with respect to selected trailer construction components.

  17. Rapid analysis of hay attributes using NIRS. Final report, Task II alfalfa supply system

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-10-24

    This final report provides technical information on the development of a near infrared reflectance spectroscopy (NIRS) system for the analysis of alfalfa hay. The purpose of the system is to provide consistent quality for processing alfalfa stems for fuel and alfalfa leaf meal products for livestock feed. Project tasks were to: (1) develop an NIRS driven analytical system for analysis of alfalfa hay and processed alfalfa products; (2) assist in hiring a qualified NIRS technician and recommend changes in testing equipment necessary to provide accurate analysis; (3) calibrate the NIRS instrument for accurate analyses; and (4) develop prototype equipment and sampling procedures as a first step towards development of a totally automated sampling system that would rapidly sample and record incoming feedstock and outbound product. An accurate hay testing program was developed, along with calibration equations for analyzing alfalfa hay and sun-cured alfalfa pellets. A preliminary leaf steam calibration protocol was also developed. 7 refs., 11 figs., 10 tabs.
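
    The "calibration equations" referred to in NIRS hay testing are multivariate models relating absorbance spectra to laboratory reference values; industrial practice typically uses PLS regression, but the idea can be sketched with ordinary least squares (all names and array shapes here are illustrative assumptions, not details from the report):

```python
import numpy as np

def fit_nirs_calibration(spectra, reference):
    """Fit reference ~ spectra @ coef + intercept by least squares.

    spectra: (n_samples, n_wavelengths) absorbance matrix.
    reference: (n_samples,) lab-measured values (e.g. % crude protein).
    """
    A = np.column_stack([spectra, np.ones(len(spectra))])
    sol, *_ = np.linalg.lstsq(A, reference, rcond=None)
    return sol[:-1], sol[-1]  # (wavelength coefficients, intercept)

def predict_reference(spectra, coef, intercept):
    """Apply a fitted calibration equation to new spectra."""
    return np.asarray(spectra) @ coef + intercept
```

    A real calibration would be validated against held-out wet-chemistry samples, which is essentially what "calibrate the NIRS instrument for accurate analyses" entails.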

  18. Analysis of brain activity and response to colour stimuli during learning tasks: an EEG study

    Science.gov (United States)

    Folgieri, Raffaella; Lucchiari, Claudio; Marini, Daniele

    2013-02-01

    The research project intends to demonstrate how EEG detection through a BCI device can improve the analysis and interpretation of colour-driven cognitive processes through the combined approach of cognitive science and information technology methods. To this end, an experiment was designed to compare the results of the traditional (qualitative and quantitative) cognitive analysis approach with the EEG signal analysis of the evoked potentials. In our case, the sensory stimulus is represented by colours, while the cognitive task consists in remembering the words appearing on the screen, under different combinations of foreground (word) and background colours. In this work we analysed data collected from a sample of students involved in a learning process during which they received visual stimuli based on colour variation. The stimuli concerned both the background of the text to learn and the colour of the characters. The experiment indicated some interesting results concerning the use of primary (RGB) and complementary (CMY) colours.
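
    The primary (RGB) and complementary (CMY) colours used in such stimuli are related by a simple complement in normalised colour space; a hypothetical helper for pairing a character colour with its complementary background might look like:

```python
def rgb_to_cmy(rgb):
    """Complement of an RGB colour with components in [0, 1]."""
    r, g, b = rgb
    return (1.0 - r, 1.0 - g, 1.0 - b)
```

    Applying the complement twice returns the original colour, so the same helper maps CMY back to RGB.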

  19. Study of Automobile Market Dynamics : Volume 2. Analysis.

    Science.gov (United States)

    1977-08-01

    Volume II describes the work in providing statistical inputs to a computer model by examining the effects of various options on the number of automobiles sold; the distribution of sales among small, medium and large cars; the distribution between aut...

  20. An Empirical Analysis of Interspersal Research Evidence, Implications, and Applications of the Discrete Task Completion Hypothesis.

    Science.gov (United States)

    Skinner, Christopher H.

    2002-01-01

    Researchers have posited that when students work on assignments with many discrete tasks, each completed discrete task may be a conditioned reinforcer. If the discrete task completion hypothesis is accurate, then relative task completion rates should influence choice behavior in the same manner as relative rates of reinforcement. Results of a…

  1. Psychometric evaluation of the revised Task-Related Worry Scale (TRWS-R): A Mokken model analysis

    Directory of Open Access Journals (Sweden)

    Martin Marko

    2016-01-01

    Task-related worries can be understood as an inherent component of an anxious state and stress response. Under evaluative conditions (e.g., cognitive testing), these worries, owing to the cognitive interference they create, may have undesirable effects on the cognitive performance at hand. Since cognitive interference has been documented to affect a broad spectrum of cognitive performance (Hembree, 1988), development of a method for its assessment is required. For this purpose we modified a part of the original Cognitive Interference Questionnaire (Sarason et al., 1986) in order to create the revised Task-Related Worry Scale (TRWS-R) and investigated its psychometric properties. Data from 211 participants (72 male, 139 female; age ranging from 18 to 24) were obtained to inspect the modified scale's properties on a Slovak sample. After the scale was reformulated and shortened, the resulting set of eight items was examined for internal consistency (Cronbach's alpha, Revelle's beta, Armor's theta, and McDonald's omega coefficients), the expected unidimensionality (confirmatory factor analysis), and scalability (nonparametric item response model: Mokken scale analysis). The results indicate that the scale has reasonable consistency. Both the mean inter-item correlation and the corrected mean item-score correlation were relatively high (r = .469 and r = .636, respectively). Additionally, all estimated consistency coefficients reached the required thresholds (namely α = .88, β = .79, θ = .86, ω = .88). Robust confirmatory factor analysis and the Cronbach-Mesbah curve convergently supported the hypothesized unidimensional factor solution (CFA fit indexes: χ2(28) = 26.73, p = .143, CFI = .994, TLI = .992, RMSEA = .041, SRMR = .055). Moreover, Mokken scale analysis indicated that the scale is scalable (scale's H = .496) and satisfies the criteria of both the monotone homogeneity model and the double monotonicity model (no significant violations were present).
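
    Of the internal-consistency coefficients reported above (alpha, beta, theta, omega), Cronbach's alpha is the most widely used and follows directly from the item variances and the total-score variance. A small sketch of the standard formula (illustrative, not the authors' code):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix.

    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))
    """
    X = np.asarray(items, dtype=float)
    k = X.shape[1]
    item_vars = X.var(axis=0, ddof=1)          # per-item sample variances
    total_var = X.sum(axis=1).var(ddof=1)      # variance of the sum score
    return k / (k - 1) * (1.0 - item_vars.sum() / total_var)
```

    Perfectly correlated items yield alpha = 1, while mutually independent items yield alpha near 0.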

  2. Communications data delivery system analysis task 2 report : high-level options for secure communications data delivery systems.

    Science.gov (United States)

    2012-05-16

    This Communications Data Delivery System Analysis Task 2 report describes and analyzes options for Vehicle to Vehicle (V2V) and Vehicle to Infrastructure (V2I) communications data delivery systems using various communication media (Dedicated Short Ra...

  3. Demonstration project as a procedure for accelerating the application of new technology (Charpie Task Force report). Volume II

    Energy Technology Data Exchange (ETDEWEB)

    None

    1978-02-01

    This report examines the issues associated with government programs proposed for the "commercialization" of new energy technologies; these programs are intended to hasten the pace at which target technologies are adopted by the private sector. The "commercial demonstration" is the principal tool used in these programs. Most previous government interventions in support of technological change have focused on R and D and left to the private sector the decision as to adoption for commercial utilization; thus there is relatively little in the way of analysis or experience which bears direct application. The analysis is divided into four sections. First, the role of R, D, and D within the structure of the national energy goals and policies is examined. The issue of "prices versus gaps" is described as a crucial difference of viewpoint concerning the role of the government in the future of the energy system. Second, the process of technological change as it occurs with respect to energy technologies is then examined for possible sources of misalignment of social and private incentives. The process is described as a series of investments. Third, correction of these sources of misalignment then becomes the goal of commercial demonstration programs as this goal and the means for attaining it are explored. Government-supported commercialization may be viewed as a subsidy to the introduction stage of the process; the circumstances under which such subsidies are likely to affect the success of the subsequent diffusion stage are addressed. The discussion then turns to the political, legal, and institutional problems. Finally, methods for evaluation and planning of commercial demonstration programs are analyzed. The critical areas of ignorance are highlighted and comprise a research agenda for improved analytical techniques to support decisions in this area.

  4. The human factors and job task analysis in nuclear power plant operation

    International Nuclear Information System (INIS)

    Stefanescu, Petre; Mihailescu, Nicolae; Dragusin, Octavian

    1999-01-01

    After a long period during the development of NPP technology in which plant hardware was considered the main factor for safe, reliable and economic operation, the industry is now moving toward an appropriate shared responsibility between plant hardware and operation. Since the human factor has not been addressed methodically so far, there is still a lack of improved classification systems for human errors, as well as a lack of methods for a systematic approach to designing the operator's working system, for instance by using job task analysis (J.T.A.). The J.T.A. appears to be an adequate method to study the human factor in nuclear power plant operation, enabling an easy conversion to operational improvements. While the results of the analysis of human errors tell 'what' is to be improved, the J.T.A. shows 'how' to improve, increasing the quality of the work and the safety of the operator's working system. The paper analyses the issue of setting the task and presents four criteria used to select aspects of NPP operation which require special consideration, such as personnel training, design of the control room, content and layout of the procedure manual, and organization of the operating personnel. The results are given in three tables: (1) evaluation of deficiencies in the working system; (2) evaluation of the deficiencies of the operator's disposition; (3) evaluation of the mental structure of operation.

  5. Analysis of the spelling patterns of 4th grade students based on a word dictation task.

    Science.gov (United States)

    Santos, Maria Thereza Mazorra dos; Befi-Lopes, Debora Maria

    2013-01-01

    The purpose of this study was to establish a profile of the spelling patterns of students from public and private schools and to describe a word spelling task for clinical and educational settings. Eighty-two fourth grade students from elementary public and private schools in São Paulo, ranging in age from nine to ten years, took part in this study. The spelling task consisted of a list of ten high frequency words (HFW), ten low frequency words (LFW), and ten pseudowords (PW), for which the typology and number of spelling errors were described. To compare the average number of mistakes on the HFWs, LFWs, and PWs, we used an analysis of variance and Tukey's multiple comparisons (p < 0.05). Spelling errors are a part of the process of learning to write, and students can show some variance in spelling performance. Furthermore, students need to be stimulated to analyze words and their aspects of phonology, morphology, and semantics. An analysis of the types of errors is not enough to plan intervention programs; it is also necessary to understand the strategies that the child uses to write.

  6. Adapting Cognitive Task Analysis to Investigate Clinical Decision Making and Medication Safety Incidents.

    Science.gov (United States)

    Russ, Alissa L; Militello, Laura G; Glassman, Peter A; Arthur, Karen J; Zillich, Alan J; Weiner, Michael

    2017-05-03

    Cognitive task analysis (CTA) can yield valuable insights into healthcare professionals' cognition and inform system design to promote safe, quality care. Our objective was to adapt CTA (specifically, the critical decision method) to investigate patient safety incidents, overcome barriers to implementing this method, and facilitate more widespread use of cognitive task analysis in healthcare. We adapted CTA to facilitate recruitment of healthcare professionals and developed a data collection tool to capture incidents as they occurred. We also leveraged the electronic health record (EHR) to expand data capture and used EHR-stimulated recall to aid reconstruction of safety incidents. We investigated 3 categories of medication-related incidents: adverse drug reactions, drug-drug interactions, and drug-disease interactions. Healthcare professionals submitted incidents, and a subset of incidents was selected for CTA. We analyzed several outcomes to characterize incident capture and completed CTA interviews. We captured 101 incidents. Eighty incidents (79%) met eligibility criteria. We completed 60 CTA interviews, 20 for each incident category. Capturing incidents before interviews allowed us to shorten the interview duration and reduced reliance on healthcare professionals' recall. Incorporating the EHR into CTA enriched data collection. The adapted CTA technique was successful in capturing specific categories of safety incidents. Our approach may be especially useful for investigating safety incidents that healthcare professionals "fix and forget." Our innovations to CTA are expected to expand the application of this method in healthcare and inform a wide range of studies on clinical decision making and patient safety.

  7. District heating and cooling systems for communities through power plant retrofit and distribution network. Volume 3. Tasks 4-6. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Watt, J.R.; Sommerfield, G.A.

    1979-08-01

    Stone and Webster Engineering Corporation is the member of the Demonstration Team responsible for reviewing and assessing the technical aspects of cogeneration for district heating. Task 4 details the most practical retrofit schemes. Of the cogeneration schemes studied, a back-pressure turbine is considered the best source of steam for district heating. Battelle Columbus Laboratories is the member of the Demonstration Team employed to investigate several institutional issues affecting the success of district heating. The Toledo Edison legal staff reviewed the legal aspects of the mandate to serve, easement and franchise requirements, and corporate charter requirements. The principal findings of both the Battelle investigations and the legal research are summarized in Task 5. A complete discussion of each issue is included in the two sections labeled Legal Issues and Institutional Issues. In Task 6, Battelle Columbus Laboratories completed a preliminary economic analysis, incorporating accurate input parameters applicable to utility ownership of the proposed district-heating system. The methodology used is summarized, the assumptions are listed, and the results are briefly reviewed.

  8. An analysis of cerebral blood flow from middle cerebral arteries during cognitive tasks via functional transcranial Doppler recordings.

    Science.gov (United States)

    Li, Meng; Huang, Hanrui; Boninger, Michael L; Sejdić, Ervin

    2014-07-01

    Functional transcranial Doppler (fTCD) is a useful medical imaging technique to monitor cerebral blood flow velocity (CBFV) in major cerebral arteries. In this paper, CBFV changes in the right and left middle cerebral arteries (MCA) caused by cognitive tasks, such as word generation tasks and mental rotation tasks, were examined using fTCD. CBFV recordings were collected from 20 healthy subjects (10 females, 10 males). We obtained both the raw CBFV signal and the envelope CBFV signal (the maximal velocity) to gain more information about the changes and hemisphere lateralization during cognitive tasks compared to the resting state. Time, frequency, time-frequency, and information-theoretic features were calculated and compared. Sex effects were also taken into consideration. The results of our analysis demonstrated that the raw CBFV signal contained more descriptive information than the envelope signals. Furthermore, both types of cognitive tasks produced higher values in most signal features. Geometric tasks were more clearly distinguished from the resting state than verbal tasks, and lateralization was exhibited in the right MCA during geometric tasks. Our results show that the raw CBFV signals provided valuable information when studying the effects of cognitive tasks and lateralization in the MCA. Copyright © 2014 Elsevier Ireland Ltd and the Japan Neuroscience Society. All rights reserved.
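
    The feature families compared in the study (time, frequency, and information-theoretic) can each be illustrated with one simple representative computed from a CBFV recording; the specific features below are illustrative stand-ins, not the authors' exact feature set:

```python
import numpy as np

def cbfv_features(x, fs):
    """One illustrative feature per family for a CBFV signal sampled at fs Hz."""
    x = np.asarray(x, dtype=float)
    # time domain: mean and standard deviation of the velocity signal
    mean_v, std_v = x.mean(), x.std()
    # frequency domain: spectral centroid of the one-sided power spectrum
    spec = np.abs(np.fft.rfft(x - mean_v)) ** 2
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    centroid = (freqs * spec).sum() / spec.sum()
    # information-theoretic: Shannon entropy of the amplitude histogram
    p, _ = np.histogram(x, bins=32)
    p = p[p > 0] / p.sum()
    entropy = -(p * np.log2(p)).sum()
    return {"mean": mean_v, "std": std_v, "centroid": centroid, "entropy": entropy}
```

    Comparing such feature vectors between task and rest epochs, separately for the left and right MCA channels, is the general pattern the study follows.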

  9. ITER safety task NID-5a: ITER tritium environmental source terms - safety analysis basis

    International Nuclear Information System (INIS)

    Natalizio, A.; Kalyanam, K.M.

    1994-09-01

    The Canadian Fusion Fuels Technology Project's (CFFTP) work described here is part of the contribution to ITER task NID-5a, Initial Tritium Source Term. This safety analysis basis constitutes the first part of the work for establishing tritium source terms and is intended to solicit comments and obtain agreement. The analysis objective is to provide an early estimate of tritium environmental source terms for the events to be analyzed. Events that would result in the loss of tritium are: a Loss of Coolant Accident (LOCA), a vacuum vessel boundary breach, a torus exhaust line failure, a fuelling machine process boundary failure, a fuel processing system process boundary failure, a water detritiation system process boundary failure, and an isotope separation system process boundary failure. 9 figs

  10. Comparative evaluation of three cognitive error analysis methods through an application to accident management tasks in NPPs

    International Nuclear Information System (INIS)

    Jung, Won Dea; Kim, Jae Whan; Ha, Jae Joo; Yoon, Wan C.

    1999-01-01

    This study was performed to comparatively evaluate selected Human Reliability Analysis (HRA) methods that mainly focus on cognitive error analysis, and to derive the requirements for a new human error analysis (HEA) framework for Accident Management (AM) in nuclear power plants (NPPs). In order to achieve this goal, we carried out a case study of human error analysis on an AM task in NPPs. In the study we evaluated three cognitive HEA methods, HRMS, CREAM and PHECA, which were selected through a review of the seven currently available cognitive HEA methods. The task of reactor cavity flooding was chosen for the application study as one of the typical AM tasks in NPPs. From the study, we derived seven requirement items for a new HEA method for AM in NPPs. We could also evaluate the applicability of the three cognitive HEA methods to AM tasks. CREAM is considered more appropriate than the others for the analysis of AM tasks, whereas PHECA is regarded as less appropriate both as a predictive HEA technique and for the analysis of AM tasks. In addition, the advantages and disadvantages of each method are described. (author)

  11. CT volumetric analysis of pleural effusions: a comparison with thoracentesis volumes.

    Science.gov (United States)

    Chiao, David; Hanley, Michael; Olazagasti, Juan M

    2015-09-01

    The primary objective of this study was to compare computed tomography (CT) volumetric analysis of pleural effusions with thoracentesis volumes. The secondary objective of this study was to compare subjective grading of pleural effusion size with thoracentesis volumes. This was a retrospective study of 67 patients with free-flowing pleural effusions who underwent therapeutic thoracentesis. CT volumetric analysis was performed on all patients; the CT volumes were compared with the thoracentesis volumes. In addition, the subjective grading of pleural effusion size was compared with the thoracentesis volumes. The average difference between CT volume and thoracentesis volume was 9.4 mL (1.3%) ± 290 mL (30%); these volumes were not statistically different (P = .79, paired two-tailed Student's t-test). The thoracentesis volume of a "small," "moderate," and "large" pleural effusion, as graded on chest CT, was found to be approximately 410 ± 260 mL, 770 ± 270 mL, and 1370 ± 650 mL, respectively; the thoracentesis volume of a "small," "moderate," and "large" pleural effusion, as graded on chest radiograph, was found to be approximately 610 ± 320 mL, 1040 ± 460 mL, and 1530 ± 830 mL, respectively. CT volumetric analysis is an accessible tool that can be used to accurately quantify the size of pleural effusions. Copyright © 2015 AUR. Published by Elsevier Inc. All rights reserved.
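
    CT volumetric analysis of an effusion ultimately reduces to counting the voxels in a segmented region and multiplying by the physical voxel volume. A minimal sketch (the segmentation step itself is assumed to be done elsewhere; names are illustrative):

```python
import numpy as np

def effusion_volume_ml(mask, spacing_mm):
    """Volume of a segmented region from a binary CT mask.

    mask: 3-D boolean array marking effusion voxels.
    spacing_mm: (dz, dy, dx) voxel spacing in millimetres.
    """
    voxel_mm3 = float(np.prod(spacing_mm))   # volume of one voxel in mm^3
    return mask.sum() * voxel_mm3 / 1000.0   # 1 mL = 1000 mm^3
```

    With typical CT spacing of a few millimetres per voxel, a moderate effusion of roughly 770 mL corresponds to hundreds of thousands of voxels.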

  12. Full correlation matrix analysis (FCMA): An unbiased method for task-related functional connectivity.

    Science.gov (United States)

    Wang, Yida; Cohen, Jonathan D; Li, Kai; Turk-Browne, Nicholas B

    2015-08-15

    The analysis of brain imaging data often requires simplifying assumptions because exhaustive analyses are computationally intractable. Standard univariate and multivariate analyses of brain activity ignore interactions between regions and analyses of interactions (functional connectivity) reduce the computational challenge by using seed regions of interest or brain parcellations. To meet this challenge, we developed full correlation matrix analysis (FCMA), which leverages and optimizes algorithms from parallel computing and machine learning to efficiently analyze the pairwise correlations of all voxels in the brain during different cognitive tasks, with the goal of identifying task-related interactions in an unbiased manner. When applied to a localizer dataset on a small compute cluster, FCMA accelerated a naive, serial approach by four orders of magnitude, reducing running time from two years to one hour. In addition to this performance gain, FCMA emphasized different brain areas than existing methods. In particular, beyond replicating known category selectivity in visual cortex, FCMA also revealed a region of medial prefrontal cortex whose selectivity derived from differential patterns of functional connectivity across categories. For benchmarking, we started with a naive approach and progressively built up to the complete FCMA procedure by adding optimized classifier algorithms, multi-threaded parallelism, and multi-node parallelism. To evaluate what can be learned with FCMA, we compared it against multivariate pattern analysis of activity and seed-based analysis of functional connectivity. FCMA demonstrates how advances in computer science can alleviate computational bottlenecks in neuroscience. We have released a software toolbox to help others evaluate FCMA. Copyright © 2015 Elsevier B.V. All rights reserved.
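
    The computational core of FCMA, correlating every voxel time course with every other, reduces to a single matrix multiplication once each column is mean-centred and unit-normalised, which is part of why optimized linear-algebra and parallel kernels give such large speedups. A NumPy sketch of that core idea (not the released toolbox's API):

```python
import numpy as np

def full_correlation_matrix(ts):
    """All pairwise Pearson correlations of voxel time courses.

    ts: (n_timepoints, n_voxels) array; assumes no constant
    (zero-variance) voxel time courses.
    Returns the (n_voxels, n_voxels) correlation matrix.
    """
    X = np.asarray(ts, dtype=float)
    X = X - X.mean(axis=0)               # centre each voxel's time course
    X = X / np.linalg.norm(X, axis=0)    # scale each column to unit norm
    return X.T @ X                       # dot products are now correlations
```

    In FCMA, one such matrix is computed per task epoch, and the matrices are then fed to a classifier to find voxel pairs whose correlation differs between conditions.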

  13. Job/task analysis for I ampersand C [Instrumentation and Controls] instrument technicians at the High Flux Isotope Reactor

    International Nuclear Information System (INIS)

    Duke, L.L.

    1989-09-01

    To comply with Department of Energy Order 5480.XX (Draft), a job/task analysis was initiated by the Maintenance Management Department at Oak Ridge National Laboratory (ORNL). The analysis was applicable to instrument technicians working at the ORNL High Flux Isotope Reactor (HFIR). This document presents the procedures and results of that analysis. 2 refs., 2 figs

  14. STATE-OF-THE-ART TASKS AND ACHIEVEMENTS OF PARALINGUISTIC SPEECH ANALYSIS SYSTEMS

    Directory of Open Access Journals (Sweden)

    A. A. Karpov

    2016-07-01

    We present an analytical survey of state-of-the-art tasks in the area of computational paralinguistics, as well as recent achievements of automatic systems for paralinguistic analysis of conversational speech. Paralinguistics studies non-verbal aspects of human communication and speech, such as natural emotions, accents, psycho-physiological states, pronunciation features, and speaker's voice parameters. We describe the architecture of a baseline computer system for acoustic paralinguistic analysis, its main components, and useful speech processing methods. We present some information on the international Computational Paralinguistics Challenge (ComParE), which has been held each year since 2009 in the framework of the INTERSPEECH conference organized by the International Speech Communication Association. We present the sub-challenges (tasks) proposed at the ComParE Challenges in 2009-2016, and analyze the winning computer systems and the results obtained for each sub-challenge. The last completed ComParE-2015 Challenge, organized in September 2015 in Germany, proposed three sub-challenges: (1) the Degree of Nativeness (DN) sub-challenge, determination of speakers' degree of nativeness based on acoustics; (2) the Parkinson's Condition (PC) sub-challenge, recognition of the degree of Parkinson's condition based on speech analysis; and (3) the Eating Condition (EC) sub-challenge, determination of the eating condition during speaking or a dialogue, and classification of the type of food consumed by the speaker (one of seven classes). In the last sub-challenge (EC), the winner was a joint Turkish-Russian team consisting of the authors of the given paper, who developed the most efficient computer-based system for detection and classification of the corresponding (EC) acoustic paralinguistic events. The paper deals with the architecture of this system, its main modules and methods, as well as the description of the used training and evaluation

  15. Geometrical considerations in dose volume analysis in intracavitary treatment

    International Nuclear Information System (INIS)

    Deshpande, D.D.; Shrivastava, S.K.; Pradhan, A.S.; Viswanathan, P.S.; Dinshaw, K.A.

    1996-01-01

    The present work aimed to study the relationship between the volume enclosed by the reference isodose surface and various geometrical parameters of the intracavitary applicator in the treatment of carcinoma of the cervix. The pear-shaped volume of the reference isodose, derived from the Total Reference Air Kerma (TRAK), and the product of its dimensions (height H, width W, and thickness T), which depends on the applicator geometry, were estimated for 100 intracavitary applications treated on a Selectron LDR machine. Orthogonal radiographs taken for each patient were used to measure the actual geometric dimensions of the applicator and to carry out dosimetry on a TP-11 treatment planning system. The dimensions H, W, and T of the reference isodose surface (60 Gy) were also noted. The ratio of the product HWT to the pear-shaped volume was found to be mainly a function of colpostat separation and not of other geometrical parameters such as the maximum vertical and anterio-posterior dimensions of the applicator. The ratio remained almost constant for a particular combination of uterine tandem and colpostat; variations in the ratio were attributed to non-standard geometry. Thus the ratio of the volume of the reference isodose surface to the product of its dimensions depends upon the colpostat separation. (orig./MG)
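The quantity studied is a simple ratio of the bounding-box product to the enclosed volume. A minimal sketch, with made-up dimensions rather than values from the paper:

```python
def hwt_to_volume_ratio(height_cm, width_cm, thickness_cm, pear_volume_cm3):
    """Ratio of the product H*W*T to the pear-shaped reference-isodose
    volume; per the study, this ratio is roughly constant for a fixed
    uterine tandem / colpostat combination."""
    return (height_cm * width_cm * thickness_cm) / pear_volume_cm3

# Illustrative numbers only (not taken from the paper):
print(round(hwt_to_volume_ratio(6.0, 5.0, 4.0, 60.0), 2))  # 2.0
```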

  16. Motor expertise and performance in spatial tasks: A meta-analysis.

    Science.gov (United States)

    Voyer, Daniel; Jansen, Petra

    2017-08-01

    The present study aimed to provide a summary of findings relevant to the influence of motor expertise on performance in spatial tasks and to examine potential moderators of this effect. Studies of relevance were those in which individuals involved in activities presumed to require motor expertise were compared to non-experts in such activities. A final set of 62 effect sizes from 33 samples was included in a multilevel meta-analysis. The results showed an overall advantage in favor of motor experts in spatial tasks (d=0.38). However, the magnitude of that effect was moderated by expert type (athlete, open skills/ball sports, runner/cyclist, gymnast/dancers, musicians), stimulus type (2D, blocks, bodies, others), test category (mental rotation, spatial perception, spatial visualization), specific test (Mental Rotations Test, generic mental rotation, disembedding, rod-and-frame test, other), and publication status. These findings are discussed in the context of embodied cognition and the potential role of activities requiring motor expertise in promoting good spatial performance. Copyright © 2017 Elsevier B.V. All rights reserved.
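The effect-size metric summarized here (d = 0.38) is a standardized mean difference. A minimal sketch of Cohen's d with a pooled standard deviation, the usual definition, using illustrative numbers rather than data from any primary study:

```python
import math

def cohens_d(mean_expert, mean_novice, sd_expert, sd_novice, n_expert, n_novice):
    """Standardized mean difference between two independent groups,
    using the pooled standard deviation."""
    pooled_sd = math.sqrt(((n_expert - 1) * sd_expert**2 +
                           (n_novice - 1) * sd_novice**2) /
                          (n_expert + n_novice - 2))
    return (mean_expert - mean_novice) / pooled_sd

# Illustrative: experts average 22 correct items vs. 18 for novices, SD 10 each.
print(round(cohens_d(22.0, 18.0, 10.0, 10.0, 30, 30), 2))  # 0.4
```

A multilevel meta-analysis then combines many such d values while modeling the dependence of effect sizes drawn from the same sample.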

  17. Toward mutual support: a task analysis of the relational justice approach to infidelity.

    Science.gov (United States)

    Williams, Kirstee; Galick, Aimee; Knudson-Martin, Carmen; Huenergardt, Douglas

    2013-07-01

    Gender, culture, and power issues are intrinsic to the etiology of infidelity, but the clinical literature offers little guidance on how to work with these concerns. The Relational Justice Approach (RJA) to infidelity (Williams, Family Process, 2011, 50, 516) uniquely places gender and power issues at the heart of clinical change; however, this approach has not been systematically studied. Therefore a qualitative task analysis was utilized to understand how change occurs in RJA. The findings indicated four necessary tasks: (a) creating an equitable foundation for healing, (b) creating space for alternate gender discourse, (c) pursuing relational responsibility of powerful partner, and (d) new experience of mutual support. Therapists' attention to power dynamics that organize couple relationships, leadership in intervening in power processes, and socio-cultural attunement to gender discourses were foundational to this work. These findings help clarify the processes by which mutual healing from the trauma of infidelity may occur and offer empirically based actions that therapists can take to facilitate mutual support. © 2012 American Association for Marriage and Family Therapy.

  18. Measuring novices' field mapping abilities using an in-class exercise based on expert task analysis

    Science.gov (United States)

    Caulkins, J. L.

    2010-12-01

    We are interested in developing a model of expert-like behavior for improving the teaching methods of undergraduate field geology. Our aim is to assist students in mastering the process of field mapping more efficiently and effectively and to improve their ability to think creatively in the field. To examine expert-mapping behavior, a cognitive task analysis was conducted with expert geologic mappers in an attempt to define the process of geologic mapping (i.e. to understand how experts carry out geological mapping). The task analysis indicates that expert mappers have a wealth of geologic scenarios at their disposal that they compare against examples seen in the field, experiences that most undergraduate mappers will not have had. While presenting students with many geological examples in class may increase their understanding of geologic processes, novices still struggle when presented with a novel field situation. Based on the task analysis, a short (45-minute) paper-map-based exercise was designed and tested with 14 pairs of 3rd year geology students. The exercise asks students to generate probable geologic models based on a series of four (4) data sets. Each data set represents a day’s worth of data; after the first “day,” new sheets simply include current and previously collected data (e.g. “Day 2” data set includes data from “Day 1” plus the new “Day 2” data). As the geologic complexity increases, students must adapt, reject or generate new geologic models in order to fit the growing data set. Preliminary results of the exercise indicate that students who produced more probable geologic models, and produced higher ratios of probable to improbable models, tended to go on to do better on the mapping exercises at the 3rd year field school. These results suggest that those students with more cognitively available geologic models may be more able to use these models in field settings than those who are unable to draw on these models for whatever

  19. Hydrogen Safety Project chemical analysis support task: Window "C" volatile organic analysis

    Energy Technology Data Exchange (ETDEWEB)

    Gillespie, B.M.; Stromatt, R.W.; Ross, G.A.; Hoope, E.A.

    1992-01-01

    This data package contains the results obtained by Pacific Northwest Laboratory (PNL) staff in the characterization of samples for the 101-SY Hydrogen Safety Project. The samples were submitted for analysis by Westinghouse Hanford Company (WHC) under the Technical Project Plan (TPP) 17667 and the Quality Assurance Plan MCS-027. They came from a core taken during Window "C" after the May 1991 gas release event. The analytical procedures required for analysis were defined in the Test Instructions (TI) prepared by the PNL 101-SY Analytical Chemistry Laboratory (ACL) Project Management Office in accordance with the TPP and the QA Plan. The requested analysis for these samples was volatile organic analysis. The quality control (QC) requirements for each sample are defined in the Test Instructions for each sample. The QC requirements outlined in the procedures and requested in the WHC statement of work were followed.

  1. Modelling the hare and the tortoise: predicting the range of in-vehicle task times using critical path analysis.

    Science.gov (United States)

    Harvey, Catherine; Stanton, Neville A

    2013-01-01

    Analytic models can enable predictions about important aspects of the usability of in-vehicle information systems (IVIS) to be made at an early stage of the product development process. Task times provide a quantitative measure of user performance and are therefore important in the evaluation of IVIS usability. In this study, critical path analysis (CPA) was used to model IVIS task times in a stationary vehicle, and the technique was extended to produce predictions for slowperson and fastperson performance, as well as average user (middleperson) performance. The CPA-predicted task times were compared to task times recorded in an empirical simulator study of IVIS interaction, and the predicted times were, on average, within acceptable precision limits. This work forms the foundation for extension of the CPA model to predict IVIS task times in a moving vehicle, to reflect the demands of the dual-task driving scenario. The CPA method was extended for the prediction of slowperson and fastperson IVIS task times. Comparison of the model predictions with empirical data demonstrated acceptable precision. The CPA model can be used in early IVIS evaluation; however, there is a need to extend it to represent the dual-task driving scenario.
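The CPA prediction reduces to finding the longest path through a network of subtasks, where parallel activities (e.g. visual and manual operations) overlap and the slowest branch gates progress. A minimal sketch; the subtask names and durations are hypothetical, not taken from the study:

```python
from functools import lru_cache

def critical_path_time(tasks):
    """tasks: {name: (duration, [predecessor names])}.
    Returns the longest-path completion time through the network,
    which is the quantity CPA uses to predict total task time."""
    @lru_cache(maxsize=None)
    def finish(name):
        dur, preds = tasks[name]
        return dur + max((finish(p) for p in preds), default=0.0)
    return max(finish(t) for t in tasks)

# Hypothetical IVIS subtask network (durations in seconds, illustrative only):
ivis = {
    "reach":   (0.4, []),
    "look":    (0.5, []),
    "press":   (0.3, ["reach", "look"]),   # waits for the slower of the two
    "confirm": (0.6, ["press"]),
}
print(round(critical_path_time(ivis), 2))  # 1.4: look -> press -> confirm
```

Slowperson and fastperson predictions follow by substituting slow or fast operator times for each subtask duration before recomputing the critical path.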

  2. Stimulating historical thinking in a collaborative learning task: An analysis of student talk and written answers

    NARCIS (Netherlands)

    Havekes, H.G.F.; Boxtel, C.A.M. van; Coppen, P.A.J.M.; Luttenberg, J.M.

    2016-01-01

    This study investigates how a collaborative learning task in history, designed to trigger domain-specific thinking, can stimulate high-quality student talk and answers, and how the task, student talk, and student answers are related. We developed a collaborative task that was tested in two cycles.

  3. Hierarchical task analysis for ergonomics research. An application of the method to the design and evaluation of sound mixing consoles.

    Science.gov (United States)

    Hodgkinson, G P; Crawshaw, C M

    1985-12-01

    Hierarchical task analysis, the procedure originally devised by Annett and his colleagues for determining training needs, was applied to the task of mixing sound in order to identify the human factors requirements that need to be taken into consideration in the design and evaluation of sound mixing consoles. A number of ergonomics problems were identified and potential solutions tentatively suggested. Following the task analysis a comparative simulation study was devised in order to test the hypothesis that the functional grouping of control knobs, with increased spacing between functional groups relative to the spacing within functional groups, is superior to functional grouping per se. Reaction time data strongly support the hypothesis. This suggests that the present practice in mixing console design of arranging control panels so that the components are spaced equidistant or quasi-equidistant, irrespective of their functions, is detrimental to operator performance. The role and importance of task analysis in human factors research is discussed. Hierarchical task analysis is advocated on the grounds that the resulting task description facilitates the systematic identification of ergonomics problems.

  4. Assessing the teaching of procedural skills: can cognitive task analysis add to our traditional teaching methods?

    Science.gov (United States)

    Sullivan, Maura E; Ortega, Adrian; Wasserberg, Nir; Kaufman, Howard; Nyquist, Julie; Clark, Richard

    2008-01-01

    The purpose of this study was to determine whether a cognitive task analysis (CTA) could capture steps and decision points that were not articulated during traditional teaching of colonoscopy. Three expert colorectal surgeons were videotaped performing a colonoscopy. After the videotapes were transcribed, the experts participated in a CTA. A 26-step procedural checklist and a 16-step cognitive demands table were created using information obtained in the CTA. The videotape transcriptions were transposed onto the procedural checklist and cognitive demands table to identify steps and decision points that were omitted during traditional teaching. Surgeon A described 50% of "how-to" steps and 43% of decision points. Surgeon B described 30% of steps and 25% of decisions. Surgeon C described 26% of steps and 38% of cognitive decisions. By using CTA, we were able to identify relevant steps and decision points that were omitted during traditional teaching by all 3 experts.

  5. Human factors assessment in PRA using Task Analysis Linked Evaluation Technique (TALENT)

    International Nuclear Information System (INIS)

    Wells, J.E.; Banks, W.W.

    1991-01-01

    Thirty years ago, the US military and US aviation industry acknowledged, and more recently, in response to the US Three Mile Island and USSR Chernobyl accidents, the US commercial nuclear power industry has acknowledged, that human error, both as an immediate precursor and as a latent or indirect influence in the form of training, maintainability, inservice test, and surveillance programs, is a primary contributor to unreliability and risk in complex high-reliability systems. A 1985 Nuclear Regulatory Commission (NRC) study of Licensee Event Reports (LERs) suggests that upwards of 65% of commercial nuclear system failures involve human error. Despite the magnitude and nature of the human error cited in that study, there has been limited attention to personnel-centered issues, especially person-to-person issues involving group processes, management, and organizational environment. The paper discusses NRC integration and applications research with respect to the Task Analysis Linked Evaluation Technique (TALENT) in risk assessment applications

  6. Structural analysis (Siemens) of the Euratom coil for the large coil task

    International Nuclear Information System (INIS)

    Maurer, A.

    1981-01-01

    The structural analysis of coil and casing of large superconducting magnets is essential to ensure the safety in the design and is important for the concept of even larger magnet units in future projects. For the Large Coil Task calculations are performed by the finite element computer code NASTRAN to obtain the stress on the various structural parts under thermal and magnetic loads. The mechanical behavior of the coil and casing under normal as well as alternative load conditions is discussed. Plots demonstrate the state of deformation belonging to the single structure parts. The results for the components of normal and shear stresses in the coil as well as for the equivalent stresses in the casing are summarized. The finite element model used is presented. The assumptions relating to the material properties, the force transmitted between coil and casing, the loading conditions, and the boundary conditions are discussed. 2 refs

  7. Analysis and asynchronous detection of gradually unfolding errors during monitoring tasks

    Science.gov (United States)

    Omedes, Jason; Iturrate, Iñaki; Minguez, Javier; Montesano, Luis

    2015-10-01

    Human studies on cognitive control processes rely on tasks involving sudden-onset stimuli, which allow the analysis of these neural imprints to be time-locked and relative to the stimuli onset. Human perceptual decisions, however, comprise continuous processes where evidence accumulates until reaching a boundary. Surpassing the boundary leads to a decision where measured brain responses are associated to an internal, unknown onset. The lack of this onset for gradual stimuli hinders both the analyses of brain activity and the training of detectors. This paper studies electroencephalographic (EEG)-measurable signatures of human processing for sudden and gradual cognitive processes represented as a trajectory mismatch under a monitoring task. Time-locked potentials and brain-source analysis of the EEG of sudden mismatches revealed the typical components of event-related potentials and the involvement of brain structures related to cognitive control processing. For gradual mismatch events, time-locked analyses did not show any discernible EEG scalp pattern, despite related brain areas being, to a lesser extent, activated. However, and thanks to the use of non-linear pattern recognition algorithms, it is possible to train an asynchronous detector on sudden events and use it to detect gradual mismatches, as well as obtaining an estimate of their unknown onset. Post-hoc time-locked scalp and brain-source analyses revealed that the EEG patterns of detected gradual mismatches originated in brain areas related to cognitive control processing. This indicates that gradual events induce latency in the evaluation process but that similar brain mechanisms are present in sudden and gradual mismatch events. Furthermore, the proposed asynchronous detection model widens the scope of applications of brain-machine interfaces to other gradual processes.

  8. Graph theoretical analysis of EEG effective connectivity in vascular dementia patients during a visual oddball task.

    Science.gov (United States)

    Wang, Chao; Xu, Jin; Zhao, Songzhen; Lou, Wutao

    2016-01-01

    The study was dedicated to investigating the change in information processing in brain networks of vascular dementia (VaD) patients during the process of decision making. EEG was recorded from 18 VaD patients and 19 healthy controls when subjects were performing a visual oddball task. The whole task was divided into several stages by using global field power analysis. In the stage related to the decision-making process, graph theoretical analysis was applied to the binary directed network derived from EEG signals at nine electrodes in the frontal, central, and parietal regions in δ (0.5-3.5Hz), θ (4-7Hz), α1 (8-10Hz), α2 (11-13Hz), and β (14-30Hz) frequency bands based on directed transfer function. A weakened outgoing information flow, a decrease in out-degree, and an increase in in-degree were found in the parietal region in VaD patients, compared to healthy controls. In VaD patients, the parietal region may also lose its hub status in brain networks. In addition, the clustering coefficient was significantly lower in VaD patients. Impairment might be present in the parietal region or its connections with other regions, and it may serve as one of the causes for cognitive decline in VaD patients. The brain networks of VaD patients were significantly altered toward random networks. The present study extended our understanding of VaD from the perspective of brain functional networks, and it provided possible interpretations for cognitive deficits in VaD patients. Copyright © 2015 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
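The network measures reported (out-degree, in-degree, clustering coefficient) can be computed directly from a binary directed adjacency matrix such as a thresholded directed-transfer-function matrix. A minimal sketch; for simplicity the clustering coefficient below is computed on the symmetrized network, which may differ from the exact directed variant the paper used:

```python
import numpy as np

def out_in_degree(A):
    """Out- and in-degree per node; A[i, j] = 1 means node i sends
    an information flow to node j."""
    return A.sum(axis=1), A.sum(axis=0)

def clustering_coefficient(A):
    """Mean local clustering of the symmetrized network: fraction of a
    node's neighbor pairs that are themselves connected."""
    U = ((A + A.T) > 0).astype(int)
    np.fill_diagonal(U, 0)
    coeffs = []
    for i in range(U.shape[0]):
        nbrs = np.flatnonzero(U[i])
        k = len(nbrs)
        if k < 2:
            coeffs.append(0.0)
            continue
        links = U[np.ix_(nbrs, nbrs)].sum() / 2  # edges among neighbors
        coeffs.append(2 * links / (k * (k - 1)))
    return np.mean(coeffs)

# Toy 3-node example: directed cycle 0 -> 1 -> 2 -> 0.
A = np.array([[0, 1, 0],
              [0, 0, 1],
              [1, 0, 0]])
out_deg, in_deg = out_in_degree(A)
print(out_deg.tolist(), in_deg.tolist())  # [1, 1, 1] [1, 1, 1]
print(clustering_coefficient(A))          # 1.0: symmetrized cycle is a triangle
```

The reported VaD findings (lower parietal out-degree, higher in-degree, lower clustering) correspond to group differences in exactly these per-node and network-level quantities.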

  9. Southern forest inventory and analysis volume equation user’s guide

    Science.gov (United States)

    Christopher M. Oswalt; Roger C. Conner

    2011-01-01

    Reliable volume estimation procedures are fundamental to the mission of the Forest Inventory and Analysis (FIA) program. Moreover, public access to FIA program procedures is imperative. Here we present the volume estimation procedures used by the southern FIA program of the U.S. Department of Agriculture Forest Service Southern Research Station. The guide presented...

  10. Analysis of some nuclear waste management options. Volume II. Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Berman, L.E.; Ensminger, D.A.; Giuffre, M.S.; Koplik, C.M.; Oston, S.G.; Pollak, G.D.; Ross, B.I.

    1978-10-10

    This report describes risk analyses performed on that portion of a nuclear fuel cycle which begins following solidification of high-level waste. Risks associated with handling, interim storage and transportation of the waste are assessed, as well as the long term implications of disposal in deep mined cavities. The risk is expressed in terms of expected dose to the general population and peak dose to individuals in the population. This volume consists of appendices which provide technical details of the work performed.

  11. Analysis of airborne radiometric data. Volume 3. Topical reports

    Energy Technology Data Exchange (ETDEWEB)

    Reed, J.H.; Shreve, D.C.; Sperling, M.; Woolson, W.A.

    1978-05-01

    This volume consists of four topical reports: a general discussion of the philosophy of unfolding spectra with continuum and discrete components, a mathematical treatment of the effects of various physical parameters on the uncollided gamma-ray spectrum at aircraft elevations, a discussion of the application of the unfolding code MAZNAI to airborne data, and a discussion of the effects of the nonlinear relationship between energy deposited and pulse height in NaI(Tl) detectors.

  12. Predicted stand volume for Eucalyptus plantations by spatial analysis

    Science.gov (United States)

    Latifah, Siti; Teodoro, RV; Myrna, GC; Nathaniel, CB; Leonardo, M. F.

    2018-03-01

    The main objective of the present study was to assess nonlinear models, generated by integrating the stand volume growth rate, for estimating the growth and yield of Eucalyptus. Primary data were collected for points of interest (POI) of permanent sample plots (PSPs) and inventory sample plots in the Aek Nauli sector, Simalungun regency, North Sumatera Province, Indonesia, from December 2008 to March 2009. The demand for forestry information has continued to grow over recent years: because many forest managers and decision makers face complex decisions, reliable information has become a necessity, and geospatial technology has been widely used in the assessment of natural resources, including plantation forests. The yield of Eucalyptus plantations, represented by merchantable volume, served as the dependent variable, while the factors affecting yield, namely stand variables and geographic variables, served as the independent variables. The majority of the study site, 16.59 ha or 65.85% of the total area, falls in the stand volume class of 0-50 m3/ha.

  13. An analysis of the application of AI to the development of intelligent aids for flight crew tasks

    Science.gov (United States)

    Baron, S.; Feehrer, C.

    1985-01-01

    This report presents the results of a study aimed at developing a basis for applying artificial intelligence to the flight deck environment of commercial transport aircraft. In particular, the study was comprised of four tasks: (1) analysis of flight crew tasks, (2) survey of the state-of-the-art of relevant artificial intelligence areas, (3) identification of human factors issues relevant to intelligent cockpit aids, and (4) identification of artificial intelligence areas requiring further research.

  14. Gait disorders in the elderly and dual task gait analysis: a new approach for identifying motor phenotypes.

    Science.gov (United States)

    Auvinet, Bernard; Touzard, Claude; Montestruc, François; Delafond, Arnaud; Goeb, Vincent

    2017-01-31

    Gait disorders and gait analysis under single- and dual-task conditions are topics of great interest, but very few studies have examined the relevance of gait analysis under dual-task conditions in elderly people on the basis of a clinical approach. An observational study including 103 patients (mean age 76.3 ± 7.2, women 56%) suffering from gait disorders or memory impairment was conducted. Gait analysis under dual-task conditions was carried out for all patients, and brain MRI was performed in the absence of contra-indications. Three main gait variables were measured: walking speed, stride frequency, and stride regularity. For each gait variable, the dual task cost was computed and a quartile analysis was obtained. Nonparametric tests were used for all comparisons (Wilcoxon, Kruskal-Wallis, Fisher, or Chi 2 tests). Four clinical subgroups were identified: gait instability (45%), recurrent falls (29%), memory impairment (18%), and cautious gait (8%). The biomechanical severity of these subgroups was ordered according to walking speed and stride regularity under both conditions, from least to most serious, as follows: memory impairment, gait instability, recurrent falls, cautious gait (p < …). Among the patients with gait disorders, 5 main pathological subgroups were identified: musculoskeletal diseases (n = 11), vestibular diseases (n = 6), mild cognitive impairment (n = 24), central nervous system pathologies (n = 51), and without diagnosis (n = 8). The dual task cost for walking speed, stride frequency, and stride regularity differed among these subgroups (p < …), and analysis of the dual task cost for stride frequency and stride regularity allowed the identification of 3 motor phenotypes (p < …). Gait analysis under dual-task conditions in elderly people suffering from gait disorders or memory impairment is of great value in assessing the severity of gait disorders, differentiating between peripheral pathologies and central nervous system pathologies, and identifying motor phenotypes.
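The dual task cost for a gait variable is conventionally defined as the percent decline from single- to dual-task walking. The abstract does not give the exact formula used, so the sketch below assumes this standard definition with illustrative numbers:

```python
def dual_task_cost(single_task_value, dual_task_value):
    """Percent decline of a gait variable (e.g. walking speed in m/s)
    from single-task to dual-task walking."""
    return 100.0 * (single_task_value - dual_task_value) / single_task_value

# Illustrative: walking speed drops from 1.20 m/s to 0.90 m/s under dual task.
print(round(dual_task_cost(1.20, 0.90), 1))  # 25.0
```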

  15. A human error analysis methodology, AGAPE-ET, for emergency tasks in nuclear power plants and its application

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jae Whan; Jung, Won Dea [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2002-03-01

    This report presents a procedurised human reliability analysis (HRA) methodology, AGAPE-ET (A Guidance And Procedure for Human Error Analysis for Emergency Tasks), for both qualitative error analysis and quantification of the human error probability (HEP) of emergency tasks in nuclear power plants. AGAPE-ET is based on a simplified cognitive model. For each cognitive function, error causes or error-likely situations have been identified, considering the characteristics of the performance of each cognitive function and the influencing mechanism of PIFs on that function. Error analysis items were then determined from the identified error causes or error-likely situations to cue and guide analysts through the overall human error analysis, and a human error analysis procedure based on these items is organised. The basic scheme for the quantification of HEP consists in multiplying the BHEP assigned to the error analysis item by a weight from the influencing factors decision tree (IFDT) constituted for each cognitive function. The method is characterised by structured identification of the weak points of the task to be performed and an efficient analysis process in which analysts need only address the necessary cognitive functions. The report also presents the application of AGAPE-ET to 31 nuclear emergency tasks and its results. 42 refs., 7 figs., 36 tabs. (Author)
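The quantification scheme described, a basic HEP scaled by a decision-tree weight, can be expressed as a one-line calculation. A minimal sketch with illustrative numbers that are not taken from the report:

```python
def human_error_probability(bhep, ifdt_weight):
    """AGAPE-ET quantification scheme as described in the abstract: the
    basic human error probability (BHEP) assigned to an error analysis
    item is multiplied by a weight read off the influencing factors
    decision tree (IFDT) for the relevant cognitive function."""
    return bhep * ifdt_weight

# Illustrative only: a BHEP of 1e-3 under adverse influencing factors
# weighted 5x yields an HEP of 5e-3.
print(human_error_probability(1e-3, 5.0))  # 0.005
```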

  16. Cognitive task analysis of nuclear power plant operators for man-machine interface design

    International Nuclear Information System (INIS)

    Itoh, J.I.; Yoshimura, S.; Ohtsuka, T.

    1990-01-01

    This paper aims to ascertain and further develop design guidelines for a man-machine interface compatible with plant operators' problem solving strategies. As the framework for this study, operator's information processing activities were modeled, based on J. Rasmussen's framework for cognitive task analysis. Two experiments were carried out. One was an experiment aimed at gaining an understanding of internal mechanisms involved in mistakes and slips which occurred in operators' responses to incidents and accidents. As a result of fifteen cases of operator performance analysis, sixty one human errors were identified. Further analysis of the errors showed that frequently occurring error mechanisms were absent-mindedness, lack of recognition of patterns in diagnosis and failed procedure formulation due to memory lapses. The other kind of experiment was carried out to identify the envelope of trajectories for the operator's search in the problem space consisting of the two dimensions of means-ends and whole-part relations while dealing with transients. Two cases of experimental sessions were conducted with the thinking-aloud method. From analyses based on verbal protocols, trajectories of operator's search were derived, covering from the whole plant level through the component level in the whole-part dimension and covering from the functional purpose level through the physical form level in the means-ends dimension. The findings obtained from these analyses serve as a basis for developing design guidelines for man-machine interfaces in control rooms of nuclear power plants

  17. Modification of the ladder rung walking task: new options for analysis of skilled movements.

    Science.gov (United States)

    Antonow-Schlorke, Iwa; Ehrhardt, Julia; Knieling, Marcel

    2013-01-01

    Method sensitivity is critical for the evaluation of poststroke motor function. Skilled walking was assessed on a horizontal, an upward, and a downward rung ladder to compare the demands of the tasks and to test sensitivity. The complete step sequence of each walk was analyzed to characterize the walking pattern, step sequence, step cycle, limb coordination, and limb interaction, complementing the foot-fault scoring system. Rats (males, n = 10) underwent unilateral photothrombotic lesion of the motor cortex in the forelimb and hind limb areas. Locomotion was video recorded before the insult and at postischemic days 7 and 28, and walking was analyzed frame by frame. Walking along the rung ladder yielded different results depending on ladder inclination: horizontal walking discriminated lesion-related motor deficits of the forelimb, whereas downward walking demonstrated hind limb use most sensitively. A more frequent use of the impaired forelimb, which possibly supported poststroke motor learning in rats, was also shown. The present study provides a novel system for detailed analysis of the complete walking sequence and will help provide a better understanding of how rats deal with motor impairments.

  18. Precursor systems analyses of automated highway systems. Commercial and transit AHS analysis. Volume 7. Final report, 9 September 1993-30 October 1994

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-05-01

    The program described by this eight-volume report, a resource materials document, identified the issues and risks associated with the potential design, development, and operation of an Automated Highway System (AHS), a highway system that utilizes limited-access roadways and provides `hands off` driving. The AHS effort was conducted by a team formed and directed by the Calspan Advanced Technology Center. Primary team members included Calspan, Parsons Brinckerhoff, Dunn Engineering Associates, and Princeton University. Supporting members of the team were BMW, New York Thruway Authority, New York State Department of Transportation, Massachusetts Department of Transportation, the New Jersey Department of Transportation, Boston Research, Vitro Corporation, and Michael P. Walsh of Walsh Associates. The 17 task reports (A through P, plus Representative Systems Configurations) are organized into 8 volumes. This volume describes the Commercial and Transit AHS Analysis (Task F). This task was performed as two independent and parallel efforts. Parsons Brinckerhoff performed the work reported in the main body of the report and Appendix A; that work was supervised by Marvin Gersten and supported by Jeanine Jankowski, both of Parsons Brinckerhoff. A separate and parallel analysis, performed by Princeton University, appears as Appendix B (with its own Executive Summary). That work was developed by Alain Kornhauser.

  19. COMMIT at SemEval-2017 Task 5: Ontology-based Method for Sentiment Analysis of Financial Headlines

    NARCIS (Netherlands)

    Schouten, Kim; Frasincar, Flavius; de Jong, F.M.G.

    2017-01-01

    This paper describes our submission to Task 5 of SemEval 2017, Fine-Grained Sentiment Analysis on Financial Microblogs and News, where we limit ourselves to performing sentiment analysis on news headlines only (track 2). The approach presented in this paper uses a Support Vector Machine to do the

  20. Assessment of solar options for small power systems applications. Volume III. Analysis of concepts

    Energy Technology Data Exchange (ETDEWEB)

    Laity, W.W.; Aase, D.T.; Apley, W.J.; Bird, S.P.; Drost, M.K.; Garrett-Price, B.A.; Williams, T.A.

    1980-09-01

    A comparative analysis of solar thermal conversion concepts that are potentially suitable for development as small electric power systems (1 to 10 MWe) is given. Seven generic types of collectors, together with associated subsystems for electric power generation, were considered. The collectors can be classified into three categories: (1) two-axis tracking (with compound-curvature reflecting surfaces); (2) one-axis tracking (with single-curvature reflecting surfaces); and (3) nontracking (with low-concentration reflecting surfaces). All seven collectors were analyzed in conceptual system configurations with Rankine-cycle engines. In addition, two of the collectors (the Point Focus Central Receiver and the Point Focus Distributed Receiver) were analyzed with Brayton-cycle engines, and the latter of the two was also analyzed with Stirling-cycle engines. This volume describes the systems analyses performed on all the alternative configurations of the seven generic collector concepts and the results obtained. The SOLSTEP computer code used to determine each configuration's system cost and performance is briefly described, and the collector and receiver performance calculations are also presented. The capital investment and related costs obtained from the systems studies are presented, and the levelized energy costs are given as a function of capacity factor. Also included are the values of the other attributes used in the concepts' final ranking. The comments, conclusions, and recommendations developed by the PNL study team during the concept characterization and systems analysis tasks of the study are presented. (WHK)

  1. Computer Assisted Data Analysis in the Dye Dilution Technique for Plasma Volume Measurement.

    Science.gov (United States)

    Bishop, Marvin; Robinson, Gerald D.

    1981-01-01

    Describes a method for undergraduate physiology students to measure plasma volume by the dye dilution technique, in which a computer is used to interpret data. Includes the computer program for the data analysis. (CS)

  2. Civic Improvement Program. Volume 2. Fallout Protection Factor Analysis Capability

    Science.gov (United States)

    1987-08-15

    be comprised of a 0.5 inch plasterboard layer on each side 7 supported by 1.5-in x 3.5-in two-by-four studding on 16-inch centers. The volume fractions...for these components would be 0.222 (1/4.5) for the plasterboard and 0.073 (1.5 x 3.5/ 4.5 x 16) for the wood. When density information is available...thick drywall plasterboard below. The foundation thickness is 10 inches of poured concrete. Figure 12 shows front and rear views of the baseline two
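    The volume fractions quoted in this excerpt can be checked directly from the stated wall geometry; a quick sketch, assuming the 4.5-inch total wall thickness implied by the 1/4.5 figure:

```python
# Check the wall-component volume fractions quoted in the excerpt.
# Geometry (from the text): 0.5-inch plasterboard on each face of a
# 4.5-inch-thick wall section, 1.5 x 3.5 inch studs on 16-inch centers.

wall_thickness = 4.5               # inches (two 0.5" boards + 3.5" stud depth)
board_thickness = 0.5              # inches, per face
stud_width, stud_depth = 1.5, 3.5  # inches
stud_spacing = 16.0                # inches on center

# Plasterboard: two 0.5" layers out of the 4.5" total thickness -> 1/4.5
board_fraction = 2 * board_thickness / wall_thickness

# Studs: cross-section area per repeating 16" x 4.5" slice of wall
stud_fraction = (stud_width * stud_depth) / (wall_thickness * stud_spacing)

print(round(board_fraction, 3))  # 0.222
print(round(stud_fraction, 3))   # 0.073
```

    Both values match the fractions quoted in the record (0.222 for the plasterboard, 0.073 for the wood).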

  3. Ensemble lymph node detection from CT volumes combining local intensity structure analysis approach and appearance learning approach

    Science.gov (United States)

    Nakamura, Yoshihiko; Nimura, Yukitaka; Oda, Masahiro; Kitasaka, Takayuki; Furukawa, Kazuhiro; Goto, Hidemi; Fujiwara, Michitaka; Misawa, Kazunari; Mori, Kensaku

    2016-03-01

    This paper presents an ensemble lymph node detection method combining two automated lymph node detection methods for CT volumes. Detecting enlarged abdominal lymph nodes from CT volumes is an important task in pre-operative diagnosis and planning for cancer surgery. Although several research efforts have been directed toward automated abdominal lymph node detection, existing methods still lack sufficient accuracy for detecting lymph nodes of 5 mm or larger. This paper proposes an ensemble lymph node detection method that integrates two different detection schemes: (1) a local intensity structure analysis approach and (2) an appearance learning approach. This ensemble approach is introduced with the aim of achieving both high sensitivity and high specificity. Each component detection method is independently designed to detect candidate regions of enlarged abdominal lymph nodes whose diameters exceed 5 mm. We applied the proposed ensemble method to 22 cases of abdominal CT volumes. Experimental results showed detection of about 90.4% (47/52) of the abdominal lymph nodes of 5 mm or more in diameter, with about 15.2 false positives per case.

  4. Job task and functional analysis of the Division of Reactor Projects, Office of Nuclear Reactor Regulation. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Morzinski, J.A.; Gilmore, W.; Hahn, H.A.

    1998-07-10

    A job task and functional analysis was recently completed for the positions that make up the regional Divisions of Reactor Projects. Among the conclusions of that analysis was a recommendation to clarify roles and responsibilities among site, regional, and headquarters personnel. As that analysis did not cover headquarters personnel, a similar analysis was undertaken of three headquarters positions within the Division of Reactor Projects: Licensing Assistants, Project Managers, and Project Directors. The goals of this analysis were to systematically evaluate the tasks performed by these headquarters personnel to determine job training requirements, to account for variations due to division/regional assignment or differences in several experience categories, and to determine how, and by which positions, certain functions are best performed. The results of this analysis include recommendations for training and for job design. Data to support this analysis were collected by a survey instrument and through several sets of focus group meetings with representatives from each position.

  5. Set-based Tasks within the Singularity-robust Multiple Task-priority Inverse Kinematics Framework: General Formulation, Stability Analysis and Experimental Results

    Directory of Open Access Journals (Sweden)

    Signe eMoe

    2016-04-01

    Inverse kinematics algorithms are commonly used in robotic systems to transform tasks to joint references, and several methods exist to ensure the achievement of several tasks simultaneously. The multiple task-priority inverse kinematics framework allows tasks to be considered in a prioritized order by projecting task velocities through the nullspaces of higher-priority tasks. This paper extends this framework to handle set-based tasks, i.e. tasks with a range of valid values, in addition to equality tasks, which have a specific desired value. Examples of set-based tasks are joint limit and obstacle avoidance. The proposed method is proven to ensure asymptotic convergence of the equality task errors and the satisfaction of all high-priority set-based tasks. The practical implementation of the proposed algorithm is discussed, and experimental results are presented in which a number of both set-based and equality tasks were implemented on a UR5, a 6-degree-of-freedom industrial robotic arm from Universal Robots. The experiments validate the theoretical results and confirm the effectiveness of the proposed approach.
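    The nullspace-projection idea behind the framework can be sketched for two equality tasks. This is a generic illustration with made-up Jacobians, not the paper's set-based extension: the secondary task velocity is resolved only in directions that leave the primary task untouched.

```python
import numpy as np

# Minimal two-task priority inverse kinematics via nullspace projection.
# Task i has Jacobian J_i and desired task velocity v_i; task 1 has priority.

def prioritized_velocities(J1, v1, J2, v2):
    J1_pinv = np.linalg.pinv(J1)
    # Nullspace projector of the higher-priority task
    N1 = np.eye(J1.shape[1]) - J1_pinv @ J1
    qdot = J1_pinv @ v1                              # satisfy task 1 exactly
    # Add the task-2 contribution only within task 1's nullspace
    qdot = qdot + np.linalg.pinv(J2 @ N1) @ (v2 - J2 @ qdot)
    return qdot

J1 = np.array([[1.0, 0.0, 0.0]])   # hypothetical: task 1 uses joint 1 only
J2 = np.array([[0.0, 1.0, 1.0]])   # hypothetical: task 2 mixes joints 2 and 3
qdot = prioritized_velocities(J1, np.array([0.5]), J2, np.array([1.0]))
print(np.allclose(J1 @ qdot, 0.5), np.allclose(J2 @ qdot, 1.0))  # True True
```

    Because J2 here happens to act entirely inside task 1's nullspace, both tasks are achieved; in general only the higher-priority task is guaranteed.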

  6. Bayesian change-point analysis reveals developmental change in a classic theory of mind task.

    Science.gov (United States)

    Baker, Sara T; Leslie, Alan M; Gallistel, C R; Hood, Bruce M

    2016-12-01

    Although learning and development reflect changes situated in an individual brain, most discussions of behavioral change are based on the evidence of group averages. Our reliance on group-averaged data creates a dilemma. On the one hand, we need to use traditional inferential statistics. On the other hand, group averages are highly ambiguous when we need to understand change in the individual; the average pattern of change may characterize all, some, or none of the individuals in the group. Here we present a new method for statistically characterizing developmental change in each individual child we study. Using false-belief tasks, fifty-two children in two cohorts were repeatedly tested for varying lengths of time between 3 and 5 years of age. Using a novel Bayesian change point analysis, we determined both the presence and-just as importantly-the absence of change in individual longitudinal cumulative records. Whenever the analysis supports a change conclusion, it identifies in that child's record the most likely point at which change occurred. Results show striking variability in patterns of change and stability across individual children. We then group the individuals by their various patterns of change or no change. The resulting patterns provide scarce support for sudden changes in competence and shed new light on the concepts of "passing" and "failing" in developmental studies. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
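    The individual-record logic described above can be illustrated with a toy Beta-Bernoulli change-point model. This is only a hedged sketch of the general idea, not the authors' actual analysis: the binary pass/fail coding, the Beta(1,1) priors, and the uniform prior over candidate change points are all assumptions made here for illustration.

```python
import math

# Toy Bayesian change-point analysis of a binary trial record.
# Marginal likelihood of a segment with s successes and f failures under a
# Beta(1,1) prior is B(s+1, f+1) = s! * f! / (s+f+1)!.

def seg_loglik(trials):
    s, f = sum(trials), len(trials) - sum(trials)
    return math.lgamma(s + 1) + math.lgamma(f + 1) - math.lgamma(s + f + 2)

def change_point_posterior(trials):
    # Candidate 0 is the no-change model; candidates 1..n-1 split the record.
    logs = [seg_loglik(trials)]
    for c in range(1, len(trials)):
        logs.append(seg_loglik(trials[:c]) + seg_loglik(trials[c:]))
    m = max(logs)
    w = [math.exp(x - m) for x in logs]
    z = sum(w)
    return [x / z for x in w]

record = [0, 0, 0, 0, 0, 1, 1, 1, 1, 1]   # hypothetical: fails, then passes
post = change_point_posterior(record)
print(max(range(len(post)), key=post.__getitem__))  # most probable split: 5
```

    A record with no clear transition would instead put most posterior mass on candidate 0, the no-change model, which is how "absence of change" can be supported rather than merely not rejected.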

  7. Scalability of Several Asynchronous Many-Task Models for In Situ Statistical Analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Pebay, Philippe Pierre [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Bennett, Janine Camille [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Kolla, Hemanth [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Borghesi, Giulio [Sandia National Lab. (SNL-CA), Livermore, CA (United States)

    2017-05-01

    This report is a sequel to [PB16], in which we provided a first progress report on research and development towards a scalable, asynchronous many-task, in situ statistical analysis engine using the Legion runtime system. This earlier work included a prototype implementation of a proposed solution, using a proxy mini-application as a surrogate for a full-scale scientific simulation code. The first scalability studies were conducted with the above on modestly-sized experimental clusters. In contrast, in the current work we have integrated our in situ analysis engines with a full-size scientific application (S3D, using the Legion-SPMD model), and have conducted numerical tests on the largest computational platform currently available for DOE science applications. We also provide details regarding the design and development of a light-weight asynchronous collectives library. We describe how this library is utilized within our SPMD-Legion S3D workflow, and compare the data aggregation technique deployed herein to the approach taken within our previous work.

  8. Comparison of causality analysis on simultaneously measured fMRI and NIRS signals during motor tasks.

    Science.gov (United States)

    Anwar, Abdul Rauf; Muthalib, Makii; Perrey, Stephane; Galka, Andreas; Granert, Oliver; Wolff, Stephan; Deuschl, Guenther; Raethjen, Jan; Heute, Ulrich; Muthuraman, Muthuraman

    2013-01-01

    Brain activity can be measured using different modalities. Since most of the modalities tend to complement each other, it seems promising to measure them simultaneously. In the research presented here, data recorded simultaneously from Functional Magnetic Resonance Imaging (fMRI) and Near Infrared Spectroscopy (NIRS) are subjected to causality analysis using time-resolved partial directed coherence (tPDC). Time-resolved partial directed coherence uses the principle of state-space modelling to estimate Multivariate Autoregressive (MVAR) coefficients. This method is useful for visualizing both the frequency and time dynamics of causality between the time series. Afterwards, causality results from the different modalities are compared by estimating the Spearman correlation. In the present study, we used directionality vectors to analyze correlation, rather than the actual signal vectors. Results show that, for a finger-sequencing task, the causality analysis of fMRI correlates more closely with the causality results of oxy-NIRS than with those of deoxy-NIRS. However, in the case of simple finger tapping, no clear difference between the fMRI-oxy-NIRS and fMRI-deoxy-NIRS correlations is identified.

  9. Iowa Gambling Task in patients with early-onset Parkinson's disease: strategy analysis.

    Science.gov (United States)

    Gescheidt, Tomáš; Czekóová, Kristína; Urbánek, Tomáš; Mareček, Radek; Mikl, Michal; Kubíková, Radka; Telecká, Sabina; Andrlová, Hana; Husárová, Ivica; Bareš, Martin

    2012-12-01

    The aim of our study was to analyse decision making in early-onset Parkinson's disease (PD) patients performing the Iowa Gambling Task (IGT). We compared 19 patients with early-onset PD (≤ 45 years) on dopaminergic medication (no evidence of depression, dementia, executive dysfunction according to the Tower of London test and the Stroop test, or pathological gambling) with 20 age-matched controls. A computer version of the IGT was employed. The PD patients achieved slightly lower IGT scores than the control group. A detailed analysis based on 'shift frequencies' between the individual decks showed that the patients tended to change their preferences for the decks more frequently, with a higher preference for the 'disadvantageous' deck B. Control subjects seemed to develop a more effective strategy. These differences could be caused by the poorer ability of the patients to develop any strategy at all. We observed changes in decision making during IGT performance in patients with early-onset PD, although they had no executive dysfunction as measured by established neuropsychological tests. The more detailed analysis employed in the present study could lead to a more accurate study of IGT performance and application of IGT in clinical practice.

  10. Requirements for psychological models to support design: Towards ecological task analysis

    Science.gov (United States)

    Kirlik, Alex

    1991-01-01

    Cognitive engineering is largely concerned with creating environmental designs to support skillful and effective human activity. A set of necessary conditions is proposed for psychological models capable of supporting this enterprise. An analysis of the psychological nature of the design product is used to identify a set of constraints that models must meet if they are to usefully guide design. It is concluded that cognitive engineering requires models with resources for describing the integrated human-environment system, and that these models must be capable of describing the activities underlying fluent and effective interaction. These features are required in order to predict the cognitive activity that will be required under various design concepts, and to design systems that promote the acquisition of fluent, skilled behavior. These necessary conditions suggest that an ecological approach can provide valuable resources for psychological modeling to support design. Relying heavily on concepts from Brunswik's and Gibson's ecological theories, ecological task analysis is proposed as a framework in which to predict the types of cognitive activity required to achieve productive behavior, and to suggest how interfaces can be manipulated to alleviate certain types of cognitive demands. The framework is described in general terms and illustrated with an example from previous research on modeling skilled human-environment interaction.

  11. Human factors assessment in PRA using task analysis linked evaluation technique (TALENT)

    International Nuclear Information System (INIS)

    Wells, J.E.; Banks, W.W.

    1990-01-01

    Human error is a primary contributor to risk in complex high-reliability systems. A 1985 U.S. Nuclear Regulatory Commission (USNRC) study of licensee event reports (LERs) suggests that upwards of 65% of commercial nuclear system failures involve human error. Since then, the USNRC has initiated research to fully and properly integrate human errors into the probabilistic risk assessment (PRA) process. The resulting implementation procedure is known as the Task Analysis Linked Evaluation Technique (TALENT). As indicated, TALENT is a broad-based method for integrating human factors expertise into the PRA process. This process achieves results which: (1) provide more realistic estimates of the impact of human performance on nuclear power safety, (2) can be fully audited, (3) provide a firm technical base for equipment-centered and personnel-centered retrofit/redesign of plants enabling them to meet internally and externally imposed safety standards, and (4) yield human and hardware data capable of supporting inquiries into human performance issues that transcend the individual plant. The TALENT procedure is being field-tested to verify its effectiveness and utility. The objectives of the field-test are to examine (1) the operability of the process, (2) its acceptability to the users, and (3) its usefulness for achieving measurable improvements in the credibility of the analysis. The field-test will provide the information needed to enhance the TALENT process

  12. Effectiveness of part-task training and increasing-difficulty training strategies: a meta-analysis approach.

    Science.gov (United States)

    Wickens, Christopher D; Hutchins, Shaun; Carolan, Thomas; Cumming, John

    2013-04-01

    The objective was to conduct meta-analyses investigating the effects of two training strategies, increasing difficulty (ID) and part-task training (PTT), on the transfer of skills, and the variables that moderate the effectiveness of the strategies. Cognitive load theory (CLT) provides a basis for predicting that training strategies that reduce the intrinsic load of a task during training leave more resources available for learning. Two strategies that accomplish this goal, by dividing tasks into parts or by simplifying tasks in early training trials, have shown only mixed success. A pair of complementary effect size measures was used in the meta-analyses conducted on 37 transfer studies employing the two training strategies: (a) a transfer ratio analysis on the ratio of treatment transfer performance to control transfer performance and (b) a Hedges' g analysis on the standardized difference between treatment and control group means. PTT generally produced negative transfer when the parts were performed concurrently in the whole transfer task, but not when the parts were performed in sequence. Variable-priority training of the whole task was a successful technique. ID training was successful when the increases were implemented adaptively, but not when difficulty was increased in fixed steps. Both strategies provided evidence that experienced learners benefited less, or suffered more, from the strategy, consistent with CLT. PTT can be successful if the integrated parts are varied in the priority they are given to the learner. ID training is successful if the increases are adaptive. The fundamental elements of CLT are confirmed.
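    The Hedges' g measure used in the second analysis is the standardized mean difference between treatment and control groups with a small-sample bias correction; a minimal sketch with made-up numbers (not data from the reviewed studies):

```python
import math

# Hedges' g: Cohen's d scaled by the small-sample correction
# J = 1 - 3 / (4*df - 1), where df = n1 + n2 - 2.

def hedges_g(m1, sd1, n1, m2, sd2, n2):
    df = n1 + n2 - 2
    pooled_sd = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / df)
    d = (m1 - m2) / pooled_sd          # Cohen's d
    correction = 1 - 3 / (4 * df - 1)  # small-sample bias correction
    return d * correction

# Hypothetical: treatment transfer score 80 (sd 10, n 20)
# vs control 72 (sd 10, n 20)
print(round(hedges_g(80, 10, 20, 72, 10, 20), 3))  # 0.784
```

    The correction matters most for the small samples typical of transfer studies; with n = 20 per group it shrinks d by about 2%.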

  13. A Technology for BigData Analysis Task Description using Domain-Specific Languages

    OpenAIRE

    Kovalchuk, Sergey V.; Zakharchuk, Artem V.; Liao, Jiaqi; Ivanov, Sergey V.; Boukhanovsky, Alexander V.

    2014-01-01

    The article presents a technology for dynamic knowledge-based building of Domain-Specific Languages (DSL) to describe data-intensive scientific discovery tasks using BigData technology. The proposed technology supports high level abstract definition of analytic and simulation parts of the task as well as integration into the composite scientific solutions. Automatic translation of the abstract task definition enables seamless integration of various data sources within single solution.

  14. Multifamily Building Operator Job/Task Analysis and Report: September 2013

    Energy Technology Data Exchange (ETDEWEB)

    Owens, C. M.

    2013-09-01

    The development of job/task analyses (JTAs) is one of three components of the Guidelines for Home Energy Professionals project and will allow industry to develop training resources, quality assurance protocols, accredited training programs, and professional certifications. The Multifamily Building Operator JTA identifies and catalogs all of the tasks performed by multifamily building operators, as well as the knowledge, skills, and abilities (KSAs) needed to perform the identified tasks.

  15. Multifamily Quality Control Inspector Job/Task Analysis and Report: September 2013

    Energy Technology Data Exchange (ETDEWEB)

    Owens, C. M.

    2013-09-01

    The development of job/task analyses (JTAs) is one of three components of the Guidelines for Home Energy Professionals project and will allow industry to develop training resources, quality assurance protocols, accredited training programs, and professional certifications. The Multifamily Quality Control Inspector JTA identifies and catalogs all of the tasks performed by multifamily quality control inspectors, as well as the knowledge, skills, and abilities (KSAs) needed to perform the identified tasks.

  16. Multifamily Retrofit Project Manager Job/Task Analysis and Report: September 2013

    Energy Technology Data Exchange (ETDEWEB)

    Owens, C. M.

    2013-09-01

    The development of job/task analyses (JTAs) is one of three components of the Guidelines for Home Energy Professionals project and will allow industry to develop training resources, quality assurance protocols, accredited training programs, and professional certifications. The Multifamily Retrofit Project Manager JTA identifies and catalogs all of the tasks performed by multifamily retrofit project managers, as well as the knowledge, skills, and abilities (KSAs) needed to perform the identified tasks.

  17. Multifamily Energy Auditor Job/Task Analysis and Report: September 2013

    Energy Technology Data Exchange (ETDEWEB)

    Owens, C. M.

    2013-09-01

    The development of job/task analyses (JTAs) is one of three components of the Guidelines for Home Energy Professionals project and will allow industry to develop training resources, quality assurance protocols, accredited training programs, and professional certifications. The Multifamily Energy Auditor JTA identifies and catalogs all of the tasks performed by multifamily energy auditors, as well as the knowledge, skills, and abilities (KSAs) needed to perform the identified tasks.

  18. Self-presentation processes in job analysis: a field experiment investigating inflation in abilities, tasks, and competencies.

    Science.gov (United States)

    Morgeson, Frederick P; Delaney-Klinger, Kelly; Mayfield, Melinda S; Ferrara, Philip; Campion, Michael A

    2004-08-01

    Although job analysis is a widely used organizational data collection technique, little research has investigated the extent to which job analysis information is affected by self-presentation processes. This study represents the first direct test of the propositions offered by F. P. Morgeson and M. A. Campion (1997) concerning self-presentation in job analysis measurement. Using an experimental design, the authors examined job incumbent response differences across ability, task, and competency statements. Results indicated that ability statements were more subject to inflation than were task statements across all rating scales. Greater endorsement of nonessential ability statements was responsible for the differences. This produced higher endorsement of ability items but lower mean ratings. Finally, frequency and importance ratings of global competency statements were generally higher than decomposed ability and task scales, but required-at-entry judgments demonstrated the opposite relationship. (c) 2004 APA

  19. Turnaround operations analysis for OTV. Volume 2: Detailed technical report

    Science.gov (United States)

    1988-01-01

    The objectives and accomplishments were to adapt and apply the newly created database of Shuttle/Centaur ground operations. Previously defined turnaround operations analyses were to be updated for ground-based OTVs (GBOTVs) and space-based OTVs (SBOTVs), design requirements identified for both OTV and Space Station accommodations hardware, turnaround operations costs estimated, and a technology development plan generated to develop the required capabilities. Technical and programmatic data were provided to NASA pertinent to OTV ground and space operations requirements, including turnaround operations, task descriptions, timelines and manpower requirements, OTV modular design and booster and Space Station interface requirements, the SBOTV accommodations development schedule, cost, and turnaround operations requirements, and a technology development plan for ground and space operations and space-based accommodations facilities and support equipment. Significant conclusions are discussed.

  20. Study on the utilization of the cognitive architecture EPIC to the task analysis of a nuclear power plant operator

    International Nuclear Information System (INIS)

    Soares, Herculano Vieira

    2003-02-01

    This work presents a study of the use of the integrative cognitive architecture EPIC (Executive-Process Interactive-Control), designed to evaluate the performance of a person executing tasks in parallel at a man-machine interface, as a methodology for the cognitive task analysis of a nuclear power plant operator. The results obtained from simulation with EPIC are compared with those obtained by applying the MHP model to the tasks performed by a shift operator during the execution of procedure PO-E-3 (Steam Generator Tube Rupture) of the Angra 1 Nuclear Power Plant. To support that comparison, an experiment was performed on the Angra 2 Nuclear Power Plant full-scope simulator in which three operator tasks were executed and their completion times were measured and compared with the results of the MHP and EPIC modeling. (author)

  1. Inferring biological tasks using Pareto analysis of high-dimensional data.

    Science.gov (United States)

    Hart, Yuval; Sheftel, Hila; Hausser, Jean; Szekely, Pablo; Ben-Moshe, Noa Bossel; Korem, Yael; Tendler, Avichai; Mayo, Avraham E; Alon, Uri

    2015-03-01

    We present the Pareto task inference method (ParTI; http://www.weizmann.ac.il/mcb/UriAlon/download/ParTI) for inferring biological tasks from high-dimensional biological data. Data are described as a polytope, and features maximally enriched closest to the vertices (or archetypes) allow identification of the tasks the vertices represent. We demonstrate that human breast tumors and mouse tissues are well described by tetrahedrons in gene expression space, with specific tumor types and biological functions enriched at each of the vertices, suggesting four key tasks.
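    The Pareto intuition underlying ParTI, that phenotypes trading off several tasks lie on a front (and, in the polytope picture, near archetypes specialized for single tasks), can be illustrated with a minimal dominance check in two dimensions. The data are made up; ParTI itself fits polytopes to high-dimensional expression data rather than computing fronts this way.

```python
# Toy Pareto-front extraction for two tasks to be maximized: a point is on
# the front if no other point is at least as good in both tasks.

def pareto_front(points):
    front = []
    for p in points:
        dominated = any(q[0] >= p[0] and q[1] >= p[1] and q != p
                        for q in points)
        if not dominated:
            front.append(p)
    return front

# (task-1 performance, task-2 performance) for hypothetical phenotypes
pts = [(1.0, 0.2), (0.8, 0.8), (0.2, 1.0), (0.5, 0.5), (0.3, 0.3)]
print(sorted(pareto_front(pts)))  # [(0.2, 1.0), (0.8, 0.8), (1.0, 0.2)]
```

    The two extreme front points play the role of archetypes (specialists in one task each), while interior front points represent generalist trade-offs, which is the geometry ParTI exploits in higher dimensions.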

  2. The Impact of Sleep Disruption on Complex Cognitive Tasks: A Meta-Analysis.

    Science.gov (United States)

    Wickens, Christopher D; Hutchins, Shaun D; Laux, Lila; Sebok, Angelia

    2015-09-01

    We aimed to extend the state of knowledge about the impacts of sleep disruption to the domain of complex cognitive task performance, for three types of sleep disruption: total sleep deprivation, sleep restriction, and circadian cycle effects. Sleep disruption affects human performance by increasing the likelihood of errors or the time it takes to complete tasks, such as the Psychomotor Vigilance Task. It is not clear whether complex tasks are affected in the same way. Understanding the impact of sleep disruption on complex cognitive tasks is important for, and in some instances more relevant to, professional workers confronted with unexpected, catastrophic failures following a period of disrupted sleep. Meta-analytic review methods were applied to each of the three different areas of sleep disruption research. Complex cognitive task performance declines over consecutive hours of continuous wakefulness as well as consecutive days of restricted sleep, is worse for severely restricted sleep (4 or fewer hours in bed), is worse during the circadian nadir than apex, and appears less degraded than simple task performance. The reviews suggest that complex cognitive task performance may not be impacted by disrupted sleep as severely as simple cognitive task performance. Findings apply to predicting effects of sleep disruption on workers in safety-critical environments, such as health care, aviation, the military, process control, and in particular, safety-critical environments involving shiftwork or long-duration missions. © 2015, Human Factors and Ergonomics Society.

  3. Physical activity interventions differentially affect exercise task and barrier self-efficacy: A meta-analysis

    Science.gov (United States)

    Higgins, Torrance J.; Middleton, Kathryn R.; Winner, Larry; Janelle, Christopher M.

    2014-01-01

    Objective Researchers have yet to establish how interventions to increase physical activity influence specific self-efficacy beliefs. The current study sought to quantify the effect of interventions to increase physical activity among healthy adults on exercise task (EXSE) and barrier self-efficacy (BSE) via meta-analysis. Intervention characteristics associated with self-efficacy and physical activity changes were also identified. Methods A systematic database search and manual searches through reference lists of related publications were conducted for articles on randomized, controlled physical activity interventions. Published intervention studies reporting changes in physical activity behavior and either EXSE or BSE in healthy adults were eligible for inclusion. Results Of the 1,080 studies identified, 20 were included in the meta-analyses. Interventions had a significant effect of g = 0.208, 95% confidence interval (CI) [0.027, 0.388], p < .05, on physical activity. Moderator analyses indicated shorter interventions that did not include structured exercise sessions effectively increased EXSE and physical activity, whereas long interventions improved BSE. Interventions that did not provide support increased BSE and physical activity levels. Further, interventions that did not require the use of daily exercise logs improved EXSE and physical activity behavior. Conclusion Interventions designed to increase physical activity differentially influenced EXSE and BSE. EXSE appeared to play a more significant role during exercise adoption, whereas BSE was involved in the maintenance of exercise behavior. Recommendations are offered for the design of future interventions. PMID:23957904
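
    The pooled estimate above is a bias-corrected standardized mean difference. As a hedged illustration of the arithmetic behind such estimates (generic Hedges' g with the usual small-sample correction J; the group summary numbers below are invented, not the study's data):

```python
import math

def hedges_g(m1, sd1, n1, m2, sd2, n2):
    """Hedges' g: Cohen's d with the small-sample bias correction J."""
    # Pooled standard deviation across the two groups
    sp = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp                       # Cohen's d
    j = 1 - 3 / (4 * (n1 + n2) - 9)          # small-sample correction factor
    g = j * d
    # Large-sample variance of g -> normal-approximation 95% CI
    var_g = j**2 * ((n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2)))
    se = math.sqrt(var_g)
    return g, (g - 1.96 * se, g + 1.96 * se)

# Hypothetical intervention vs. control summary statistics
g, ci = hedges_g(m1=12.0, sd1=4.0, n1=30, m2=10.0, sd2=4.0, n2=30)
```

    A 95% CI that excludes zero, as in the reported [0.027, 0.388], corresponds to significance at the .05 level.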

  4. An Asynchronous Many-Task Implementation of In-Situ Statistical Analysis using Legion.

    Energy Technology Data Exchange (ETDEWEB)

    Pebay, Philippe Pierre [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Bennett, Janine Camille [Sandia National Lab. (SNL-CA), Livermore, CA (United States)

    2015-11-01

    In this report, we propose a framework for the design and implementation of in-situ analyses using an asynchronous many-task (AMT) model, using the Legion programming model together with the MiniAero mini-application as a surrogate for full-scale parallel scientific computing applications. The bulk of this work consists of converting the Learn/Derive/Assess model, which we had initially developed for parallel statistical analysis using MPI [PTBM11], from an SPMD to an AMT model. To this end, we propose an original use of the concept of Legion logical regions as a replacement for the parallel communication schemes used for the only operations of the statistics engines that require explicit communication. We then evaluate this proposed scheme in a shared-memory environment, using the Legion port of MiniAero as a proxy for a full-scale scientific application and as a means to provide input data sets of variable size for the in-situ statistical analyses in an AMT context. We demonstrate in particular that the approach has merit and warrants further investigation, in collaboration with ongoing efforts to improve the overall parallel performance of the Legion system.

  5. Functional connectivity changes during a Working memory task in rat via NMF analysis

    Directory of Open Access Journals (Sweden)

    Jing eWei

    2015-01-01

    Working memory (WM) is necessary in higher cognition. The brain as a complex network is formed by interconnections among neurons. Connectivity results in neural dynamics that support cognition. The first aim is to investigate connectivity dynamics in medial prefrontal cortex (mPFC) networks during WM. As brain neural activity is sparse, the second aim is to find the intrinsic connectivity property in a feature space. Using multi-channel electrode recording techniques, spikes were simultaneously obtained from the mPFC of rats that performed a Y-maze WM task. Continuous time series converted from spikes were embedded in a low-dimensional space by non-negative matrix factorization (NMF). The mPFC network in the original space was constructed by measuring connections among neurons, and the same network in NMF space was constructed by computing connectivity values between the extracted NMF components. Causal density (Cd) and global efficiency (E) were estimated to characterize network properties. The results showed that Cd and E significantly peaked in the interval right before the maze choice point in correct trials. However, the increase did not emerge in error trials. Additionally, Cd and E in the two spaces displayed similar trends in correct trials; the difference was that the measures in NMF space were significantly greater than those in the original space. Our findings indicate that the anticipatory changes in mPFC networks may have an effect on future WM behavioral choices. Moreover, the NMF analysis achieves a better characterization of a brain network.
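
    The embedding step can be sketched with a minimal NumPy-only NMF (Lee-Seung multiplicative updates standing in for whatever solver the authors used; the firing-rate matrix below is simulated, not the recorded data):

```python
import numpy as np

def nmf(V, k, iters=200, seed=0):
    """Lee-Seung multiplicative updates: factor non-negative V (m x n) as W @ H."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, k)) + 1e-3   # neuron loadings, kept strictly positive
    H = rng.random((k, n)) + 1e-3   # component time courses
    eps = 1e-9
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

rng = np.random.default_rng(1)
# Hypothetical non-negative firing-rate matrix: 16 neurons x 400 time bins
rates = rng.gamma(shape=2.0, scale=1.0, size=(16, 400))
W, H = nmf(rates, k=4)

# Connectivity in the reduced (NMF) space: correlations among component time courses
conn_nmf = np.corrcoef(H)
```

    In the study proper, connectivity measures such as causal density were then computed between the extracted components rather than between raw neurons.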

  6. Photovoltaic venture analysis. Final report. Volume III. Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Costello, D.; Posner, D.; Schiffel, D.; Doane, J.; Bishop, C.

    1978-07-01

    This appendix contains a brief summary of a detailed description of alternative future energy scenarios which provide an overall backdrop for the photovoltaic venture analysis. Also included is a summary of a photovoltaic market/demand workshop, a summary of a photovoltaic supply workshop which used cross-impact analysis, and a report on photovoltaic array and system prices in 1982 and 1986. The results of a sectorial demand analysis for photovoltaic power systems used in the residential sector (single-family homes), the service, commercial, and institutional sector (schools), and in the central power sector are presented. An analysis of photovoltaics in the electric utility market is given, and a report on the industrialization of photovoltaic systems is included. A DOE information memorandum regarding "A Strategy for a Multi-Year Procurement Initiative on Photovoltaics (ACTS No. ET-002)" is also included. (WHK)

  7. Multiple-task real-time PDP-15 operating system for data acquisition and analysis

    International Nuclear Information System (INIS)

    Myers, W.R.

    1974-01-01

    The RAMOS operating system is capable of handling up to 72 simultaneous tasks in an interrupt-driven environment. The minimum viable hardware configuration includes a Digital Equipment Corporation PDP-15 computer with 16384 words of memory, extended arithmetic element, automatic priority interrupt, a 256K-word RS09 DECdisk, two DECtape transports, and an alphanumeric keyboard/typer. The monitor executes major tasks by loading disk-resident modules to memory for execution; modules are written in a format that allows page-relocation by the monitor and can be loaded into any available page. All requests for monitor service by tasks, including input/output, floating point arithmetic, requests for additional memory, task initiation, etc., are implemented by privileged monitor calls (CAL). All IO device handlers are capable of queuing requests for service, allowing several tasks "simultaneous" use of all resources. All alphanumeric IO (including the PC05) is completely buffered and handled by a single multiplexing routine. The floating point arithmetic software is re-entrant to all operating modules and includes matrix arithmetic functions. One of the system tasks can be a "batch" job, controlled by simulating an alphanumeric command terminal through cooperative functions of the disk handler and alphanumeric device software. An alphanumeric control sequence may be executed, automatically accessing disk-resident tasks in any prescribed order; a library of control sequences is maintained on bulk storage for access by the monitor. (auth)

  8. Analysis of dual-task elderly gait using wearable plantar-pressure insoles and accelerometer.

    Science.gov (United States)

    Howcroft, Jennifer D; Lemaire, Edward D; Kofman, Jonathan; McIlroy, William E

    2014-01-01

    Dual-task gait allows assessment of impaired executive function and mobility control in older individuals, which are risk factors for falls. This study investigated gait changes in older individuals due to the addition of a cognitive load, using wearable pressure-sensing insole and tri-axial accelerometer measures. These wearable sensors can be applied at the point of care. Eleven elderly (65 years or older) individuals walked 7.62 m with and without a verbal-fluency cognitive load task while wearing FScan 3000E pressure-sensing insoles in both shoes and a Gulf Coast X16-1C tri-axial accelerometer at the pelvis. Plantar-pressure-derived parameters included center of force (CoF) path and temporal measures. Acceleration-derived measures were descriptive statistics, Fast Fourier Transform quartiles, the ratio of even-to-odd harmonics, and the maximum Lyapunov exponent. Stride time, stance time, and swing time all significantly increased during dual-task compared to single-task walking. Minimum, mean, and median CoF stance velocity; cadence; and vertical, anterior-posterior, and medial-lateral harmonic ratios all significantly decreased during dual-task walking. Wearable plantar-pressure-sensing insole and lower back accelerometer derived measures can identify gait differences between single-task and dual-task walking in older individuals and could be used in point-of-care environments to assess for deficits in executive function and mobility impairments.
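
    One of the accelerometer measures, the harmonic ratio, can be sketched from its standard definition (even-to-odd stride-harmonic amplitude ratio, as used for the vertical and anterior-posterior axes); the signal and stride frequency below are synthetic, not study data:

```python
import numpy as np

def harmonic_ratio(acc, fs, stride_freq, n_harmonics=10):
    """Even-to-odd stride-harmonic amplitude ratio of a trunk-acceleration signal.

    Even harmonics dominate smooth, symmetric gait; lower values indicate
    less rhythmic walking, as reported under dual-task load."""
    n = len(acc)
    spectrum = np.abs(np.fft.rfft(acc - acc.mean()))
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    # Amplitude at the k-th stride harmonic = nearest FFT bin to k * stride_freq
    amps = [spectrum[np.argmin(np.abs(freqs - k * stride_freq))]
            for k in range(1, 2 * n_harmonics + 1)]
    even = sum(amps[1::2])   # harmonics 2, 4, ...
    odd = sum(amps[0::2])    # harmonics 1, 3, ...
    return even / odd

fs, stride = 100.0, 1.0      # Hz; hypothetical sampling rate and stride frequency
t = np.arange(0.0, 10.0, 1.0 / fs)
# Symmetric gait: energy mostly at the even (step) harmonic of the stride frequency
acc = np.sin(2 * np.pi * 2 * stride * t) + 0.2 * np.sin(2 * np.pi * stride * t)
hr = harmonic_ratio(acc, fs, stride)
```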

  9. When working memory updating requires updating: analysis of serial position in a running memory task.

    Science.gov (United States)

    Botto, Marta; Basso, Demis; Ferrari, Marcella; Palladino, Paola

    2014-05-01

    This study aimed to investigate updating in working memory (WM), analyzing the effects of task demand and memory resources on the serial position curve (SPC) in a running memory task with slow-paced presentation and a probed recognition procedure. These task conditions were expected to produce an easier WM updating task, which may reveal whether the task is performed through active or passive updating. Serial position curves were compared in conditions of high or low memory load, and with or without interference from a secondary (prospective memory, PM) task. With either a high WM load or a high PM load, results showed an SPC with both primacy and recency effects, indicating the use of an active strategy. When resources were taken up by both the PM task and a high WM demand, the usual pattern with only a recency effect was obtained. Taken together, these findings support the ideas that (1) people can effectively update WM, and (2) performance depends on both memory and executive resource availability. Copyright © 2014 Elsevier B.V. All rights reserved.

  10. Analysis of modeling cumulative noise from simultaneous flights volume 1 : analysis at four national parks

    Science.gov (United States)

    2012-12-31

    This is the first of two volumes of the report on modeling cumulative noise from simultaneous flights. This volume includes: an overview of the time compression algorithms used to model simultaneous aircraft; revised summary of a preliminary study (w...

  11. Do skeletal cephalometric characteristics correlate with condylar volume, surface and shape? A 3D analysis

    Directory of Open Access Journals (Sweden)

    Saccucci Matteo

    2012-05-01

    Abstract Objective The purpose of this study was to determine condylar volume in subjects with different mandibular divergence and skeletal class, using cone-beam computed tomography (CBCT) and analysis software. Materials and methods For 94 patients (46 females and 48 males; mean age 24.3 ± 6.5 years), resultant rendering reconstructions of the left and right temporomandibular joints (TMJs) were obtained. Subjects were then classified on the basis of the ANB angle and the GoGn-SN angle into three classes (I, II, III). The data of the different classes were compared. Results No significant difference in condylar volume was observed in the whole sample between the right and the left sides. The analysis of mean volume among low, normal and high mandibular plane angles revealed significantly higher volume and surface in low angle subjects. Class III subjects also tended to show a higher condylar volume and surface than class I and class II subjects, although the difference was not significant. Conclusions Higher condylar volume was a common characteristic of low angle subjects compared to normal and high mandibular plane angle subjects. Skeletal class also appears to be associated with condylar volume and surface.

  12. Left ventricular pressure and volume data acquisition and analysis using LabVIEW.

    Science.gov (United States)

    Cassidy, S C; Teitel, D F

    1997-03-01

    To automate analysis of left ventricular pressure-volume data, we used LabVIEW to create applications that digitize and display data recorded from conductance and manometric catheters. Applications separate data into cardiac cycles, calculate parallel conductance, and calculate indices of left ventricular function, including end-systolic elastance, preload-recruitable stroke work, stroke volume, ejection fraction, stroke work, maximum and minimum derivative of ventricular pressure, heart rate, indices of relaxation, peak filling rate, and ventricular chamber stiffness. Pressure-volume loops can be graphically displayed. These analyses are exported to a text-file. These applications have simplified and automated the process of evaluating ventricular function.
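
    The kind of beat-wise index computation such applications automate can be sketched on an idealized synthetic pressure-volume loop (the elliptical loop and all numbers below are illustrative, not catheter data):

```python
import numpy as np

# Hypothetical left-ventricular pressure-volume loop, one beat sampled over 0.8 s
theta = np.linspace(0.0, 2.0 * np.pi, 400, endpoint=False)
t = 0.8 * theta / (2.0 * np.pi)            # time within the beat, seconds
volume = 70.0 + 20.0 * np.cos(theta)       # mL, ranges 50-90
pressure = 64.0 + 56.0 * np.sin(theta)     # mmHg, ranges 8-120

edv, esv = volume.max(), volume.min()      # end-diastolic / end-systolic volume
stroke_volume = edv - esv                  # mL ejected per beat
ejection_fraction = stroke_volume / edv
dpdt = np.gradient(pressure, t)            # mmHg/s; max and min give +/- dP/dt
# Stroke work = area enclosed by the PV loop (shoelace formula, mmHg*mL)
stroke_work = 0.5 * abs(np.sum(volume * np.roll(pressure, -1)
                               - np.roll(volume, -1) * pressure))
```

    Load-dependent indices such as end-systolic elastance additionally require loops recorded across a range of preloads.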

  13. Obtaining Content Weights for Test Specifications from Job Analysis Task Surveys: An Application of the Many-Facets Rasch Model

    Science.gov (United States)

    Wang, Ning; Stahl, John

    2012-01-01

    This article discusses the use of the Many-Facets Rasch Model, via the FACETS computer program (Linacre, 2006a), to scale job/practice analysis survey data as well as to combine multiple rating scales into single composite weights representing the tasks' relative importance. Results from the Many-Facets Rasch Model are compared with those…
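
    As a hedged sketch (the generic rating-scale form of a Facets model, not the article's exact specification, which may include additional facets such as one per rating scale):

```latex
\log\!\left(\frac{P_{nik}}{P_{ni(k-1)}}\right) = B_n - D_i - F_k
```

    Here \(P_{nik}\) is the probability that respondent \(n\) assigns task \(i\) rating category \(k\), \(B_n\) is the respondent's severity/leniency, \(D_i\) is the scaled location of the task (its relative importance), and \(F_k\) is the threshold between categories \(k-1\) and \(k\). The estimated \(D_i\) values are what get combined into composite content weights.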

  14. Using a task analysis approach within a guided problem-solving model to design mathematical learning activities

    Directory of Open Access Journals (Sweden)

    Aneshkumar Maharaj

    2007-10-01

    The FET Curriculum Statements for Mathematics advocate that knowledge integrates theory, skills and values. This paper focuses on a guided problem-solving teaching model that provides a framework to do this. A task analysis approach is used within this framework to illustrate how educators could frame mathematical questions based on the relevant mathematical content.

  15. Using a task analysis approach within a guided problem-solving model to design mathematical learning activities

    OpenAIRE

    Aneshkumar Maharaj

    2007-01-01

    The FET Curriculum Statements for Mathematics advocate that knowledge integrates theory, skills and values. This paper focuses on a guided problem-solving teaching model that provides a framework to do this. A task analysis approach is used within this framework to illustrate how educators could frame mathematical questions based on the relevant mathematical content.

  16. Photovoltaic venture analysis. Final report. Volume II. Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Costello, D.; Posner, D.; Schiffel, D.; Doane, J.; Bishop, C.

    1978-07-01

    A description of the integrating model for photovoltaic venture analysis is given; input assumptions for the model are described; and the integrating model program listing is given. The integrating model is an explicit representation of the interactions between photovoltaic markets and supply under alternative sets of assumptions. First, it provides a consistent way of assembling and integrating the various assumptions, data, and information that have been obtained on photovoltaic systems supply and demand factors. Second, it provides a mechanism for understanding the implications of all the interacting assumptions. By representing the assumptions in a common, explicit framework, much more complex interactions can be considered than are possible intuitively. The integrating model therefore provides a way of examining the relative importance of different assumptions, parameters, and inputs through sensitivity analysis. Also, detailed results of model sensitivity analyses and detailed market and systems information are presented. (WHK)

  17. SCALE-4 analysis of pressurized water reactor critical configurations. Volume 1: Summary

    International Nuclear Information System (INIS)

    DeHart, M.D.

    1995-03-01

    The requirements of ANSI/ANS 8.1 specify that calculational methods for away-from-reactor criticality safety analyses be validated against experimental measurements. If credit is to be taken for the reduced reactivity of burned or spent fuel relative to its original fresh composition, it is necessary to benchmark computational methods used in determining such reactivity worth against spent fuel reactivity measurements. This report summarizes a portion of the ongoing effort to benchmark away-from-reactor criticality analysis methods using critical configurations from commercial pressurized water reactors (PWR). The analysis methodology utilized for all calculations in this report is based on the modules and data associated with the SCALE-4 code system. Each of the five volumes comprising this report provides an overview of the methodology applied. Subsequent volumes also describe in detail the approach taken in performing criticality calculations for these PWR configurations: Volume 2 describes criticality calculations for the Tennessee Valley Authority's Sequoyah Unit 2 reactor for Cycle 3; Volume 3 documents the analysis of Virginia Power's Surry Unit 1 reactor for the Cycle 2 core; Volume 4 documents the calculations performed based on GPU Nuclear Corporation's Three Mile Island Unit 1 Cycle 5 core; and, lastly, Volume 5 describes the analysis of Virginia Power's North Anna Unit 1 Cycle 5 core. Each of the reactor-specific volumes provides the details of calculations performed to determine the effective multiplication factor for each reactor core for one or more critical configurations using the SCALE-4 system; these results are summarized in this volume. Differences between the core designs and their possible impact on the criticality calculations are also discussed. Finally, results are presented for additional analyses performed to verify that solutions were sufficiently converged

  18. Cost-volume-profit and net present value analysis of health information systems.

    Science.gov (United States)

    McLean, R A

    1998-08-01

    The adoption of any information system should be justified by an economic analysis demonstrating that its projected benefits outweigh its projected costs. Analysts differ, however, on which methods to employ for such a justification. Accountants prefer cost-volume-profit analysis, and economists prefer net present value analysis. The article explains the strengths and weaknesses of each method and shows how they can be used together so that well-informed investments in information systems can be made.
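
    The two methods can be sketched in a few lines; the costs, price, discount rate, and cash flows below are invented for illustration:

```python
def breakeven_volume(fixed_cost, price, variable_cost):
    """Cost-volume-profit: volume at which revenue covers total cost."""
    return fixed_cost / (price - variable_cost)   # contribution margin per unit

def npv(rate, cash_flows):
    """Net present value; cash_flows[0] occurs at time 0."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# Hypothetical system: $200,000 up front, then $60,000/yr net benefit for 5 years
project_npv = npv(0.08, [-200_000] + [60_000] * 5)

# Hypothetical billable service enabled by the system: $25/unit, $15 variable cost
units = breakeven_volume(fixed_cost=200_000, price=25.0, variable_cost=15.0)
```

    CVP answers "how much activity covers the cost?", while NPV answers "is the discounted benefit stream worth the outlay?"; the article's point is that the two views complement each other.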

  19. The Effect of Corrective Feedback on Performance in Basic Cognitive Tasks: An Analysis of RT Components

    Directory of Open Access Journals (Sweden)

    Carmen Moret-Tatay

    2016-12-01

    The current work examines the effect of trial-by-trial feedback about correct and error responding on performance in two basic cognitive tasks: a classic Stroop task (n = 40) and a color-word matching task (n = 30). Standard measures of both RT and accuracy were examined, in addition to measures obtained from fitting the ex-Gaussian distributional model to the correct RTs. For both tasks, RTs were faster in blocks of trials with feedback than in blocks without feedback, but this difference was not significant. On the other hand, with respect to the distributional analyses, providing feedback served to significantly reduce the size of the tails of the RT distributions. These results suggest that, for conditions in which accuracy is fairly high, the effect of corrective feedback might be either to reduce the tendency to double-check before responding or to decrease the amount of attentional lapsing.
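
    A dependency-free sketch of an ex-Gaussian fit using the classical method-of-moments estimators (the article does not state its fitting procedure; maximum likelihood is also common), applied to simulated RTs:

```python
import numpy as np

def exgauss_moments(rt):
    """Moment-based ex-Gaussian fit: returns (mu, sigma, tau).

    mu and sigma describe the Gaussian component, tau the exponential tail;
    a feedback effect on the distribution tail shows up mainly as smaller tau."""
    m = rt.mean()
    s = rt.std(ddof=1)
    skew = max(np.mean(((rt - m) / s) ** 3), 1e-9)   # guard against zero skew
    tau = s * (skew / 2.0) ** (1.0 / 3.0)
    sigma2 = s**2 * (1.0 - (skew / 2.0) ** (2.0 / 3.0))
    return m - tau, np.sqrt(max(sigma2, 0.0)), tau

rng = np.random.default_rng(2)
# Simulated correct-RT sample (ms): Gaussian(450, 40) plus exponential tail (tau = 120)
rt = rng.normal(450, 40, 5000) + rng.exponential(120, 5000)
mu, sigma, tau = exgauss_moments(rt)
```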

  20. Performance Measure Analysis of Command and Control Organizational and Task Structures

    National Research Council Canada - National Science Library

    Smith, Neil

    1996-01-01

    .... The purpose of the initial A2C2 experiment was to examine the relationships between organizational structures and task structures involving competition for scarce assets, to serve as an integration...

  1. Function spaces and partial differential equations volume 2 : contemporary analysis

    CERN Document Server

    Taheri, Ali

    2015-01-01

    This is a book written primarily for graduate students and early researchers in the fields of Analysis and Partial Differential Equations (PDEs). Coverage of the material is essentially self-contained, extensive and novel with great attention to details and rigour.

  2. Licensing support system preliminary needs analysis: Volume 1

    International Nuclear Information System (INIS)

    1989-01-01

    This Preliminary Needs Analysis, together with the Preliminary Data Scope Analysis (next in this series of reports), is a first effort under the LSS Design and Implementation Contract toward developing a sound requirements foundation for subsequent design work. Further refinements must be made before requirements can be specified in sufficient detail to provide a basis for suitably specific system specifications. This preliminary analysis of the LSS requirements has been divided into a "needs" and a "data scope" portion only for project management and scheduling reasons. The Preliminary Data Scope Analysis will address all issues concerning the content and size of the LSS data base, providing the requirements basis for data acquisition, cataloging, and storage sizing specifications. This report addresses all other requirements for the LSS. The LSS consists of both computer subsystems and non-computer archives. This study addresses only the computer subsystems, focusing on the Access Subsystems. After providing background on previous LSS-related work, this report summarizes the findings from previous examinations of needs and describes a number of other requirements that have an impact on the LSS. The results of interviews conducted for this report are then described and analyzed. The final section of the report brings all of the key findings together and describes how these needs analyses will continue to be refined and utilized in ongoing design activities. 14 refs., 2 figs., 1 tab

  3. An analysis of the processing requirements of a complex perceptual-motor task

    Science.gov (United States)

    Kramer, A. F.; Wickens, C. D.; Donchin, E.

    1983-01-01

    Current concerns in the assessment of mental workload are discussed, and the event-related brain potential (ERP) is introduced as a promising mental-workload index. Subjects participated in a series of studies in which they were required to perform a target acquisition task while also covertly counting either auditory or visual probes. The effects of several task-difficulty manipulations on the P300 component of the ERP elicited by the counted stimulus probes were investigated. With sufficiently practiced subjects the amplitude of the P300 was found to decrease with increases in task difficulty. The second experiment also provided evidence that the P300 is selectively sensitive to task-relevant attributes. A third experiment demonstrated a convergence in the amplitude of the P300s elicited in the simple and difficult versions of the tracking task. The amplitude of the P300 was also found to covary with the measures of tracking performance. The results of the series of three experiments illustrate the sensitivity of the P300 to the processing requirements of a complex target acquisition task. The findings are discussed in terms of the multidimensional nature of processing resources.

  4. Knee Arthroscopy Simulation: A Randomized Controlled Trial Evaluating the Effectiveness of the Imperial Knee Arthroscopy Cognitive Task Analysis (IKACTA) Tool.

    Science.gov (United States)

    Bhattacharyya, Rahul; Davidson, Donald J; Sugand, Kapil; Bartlett, Matthew J; Bhattacharya, Rajarshi; Gupte, Chinmay M

    2017-10-04

    Virtual-reality and cadaveric simulations are expensive and not readily accessible. Innovative and accessible training adjuncts are required to help to meet training needs. Cognitive task analysis has been used extensively to train pilots and in other surgical specialties. However, the use of cognitive task analyses within orthopaedics is in its infancy. The purpose of this study was to evaluate the effectiveness of a novel cognitive task analysis tool to train novice surgeons in diagnostic knee arthroscopy in high-fidelity, phantom-limb simulation. Three expert knee surgeons were interviewed independently to generate a list of technical steps, decision points, and errors for diagnostic knee arthroscopy. A modified Delphi technique was used to generate the final cognitive task analysis. A video and a voiceover were recorded for each phase of this procedure. These were combined to produce the Imperial Knee Arthroscopy Cognitive Task Analysis (IKACTA) tool that utilizes written and audiovisual stimuli to describe each phase of a diagnostic knee arthroscopy. In this double-blinded, randomized controlled trial, a power calculation was performed prior to recruitment. Sixteen novice orthopaedic trainees who performed ≤10 diagnostic knee arthroscopies were randomized into 2 equal groups. The intervention group (IKACTA group) was given the IKACTA tool and the control group had no additional learning material. They were assessed objectively (validated Arthroscopic Surgical Skill Evaluation Tool [ASSET] global rating scale) on a high-fidelity, phantom-knee simulator. All participants, using the Likert rating scale, subjectively rated the tool. The mean ASSET score (and standard deviation) was 19.5 ± 3.7 points in the IKACTA group and 10.6 ± 2.3 points in the control group, resulting in an improvement of 8.9 points (95% confidence interval, 7.6 to 10.1 points; p = 0.002); the score was determined as 51.3% (19.5 of 38) for the IKACTA group, 27.9% (10.6 of 38) for the

  5. Final report : Occupational gap analysis, volume 2 Wood Buffalo region

    International Nuclear Information System (INIS)

    2002-06-01

    The general economy of the Wood Buffalo region of Alberta has improved in recent years due to the expansion of the oil sand industry. Many of the oil sands projects are situated in the southern part of the region and are serviced out of Lac LaBiche. This paper examines the implications of this economic growth for the regional supply and demand of workers in key occupational categories. In particular, it projects employment opportunities that are directly attributed to the regional economic expansion, and assesses whether there will be an adequate supply of workers to meet the demand for new and replacement employees. The report includes: an overview of the regional economy; a population forecast for the region and the demand for workers in selected occupations; an analysis of whether or not the new labour force entrants will choose the selected occupations; and, a supply and demand analysis. 7 tabs., 2 figs., 5 appendices

  6. A Finite-Volume "Shaving" Method for Interfacing NASA/DAO's Physical Space Statistical Analysis System to the Finite-Volume GCM with a Lagrangian Control-Volume Vertical Coordinate

    Science.gov (United States)

    Lin, Shian-Jiann; DaSilva, Arlindo; Atlas, Robert (Technical Monitor)

    2001-01-01

    Toward the development of a finite-volume Data Assimilation System (fvDAS), a consistent finite-volume methodology is developed for interfacing the NASA/DAO's Physical Space Statistical Analysis System (PSAS) to the joint NASA/NCAR finite volume CCM3 (fvCCM3). To take advantage of the Lagrangian control-volume vertical coordinate of the fvCCM3, a novel "shaving" method is applied to the lowest few model layers to reflect the surface pressure changes as implied by the final analysis. Analysis increments (from PSAS) to the upper air variables are then consistently put onto the Lagrangian layers as adjustments to the volume-mean quantities during the analysis cycle. This approach is demonstrated to be superior to the conventional method of using independently computed "tendency terms" for surface pressure and upper air prognostic variables.

  7. An analysis of confidence limit calculations used in AAPM Task Group No. 119

    International Nuclear Information System (INIS)

    Knill, Cory; Snyder, Michael

    2011-01-01

    Purpose: The report issued by AAPM Task Group No. 119 outlined a procedure for evaluating the effectiveness of IMRT commissioning. The procedure involves measuring gamma pass-rate indices for IMRT plans of standard phantoms and determining if the results fall within a confidence limit set by assuming normally distributed data. As stated in the TG report, the assumption of normally distributed gamma pass rates is a convenient approximation for commissioning purposes, but may not accurately describe the data. Here the authors attempt to better describe gamma pass-rate data by fitting it to different distributions. The authors then calculate updated confidence limits using those distributions and compare them to those derived using TG No. 119 method. Methods: Gamma pass-rate data from 111 head and neck patients are fitted using the TG No. 119 normal distribution, a truncated normal distribution, and a Weibull distribution. Confidence limits to 95% are calculated for each and compared. A more general analysis of the expected differences between the TG No. 119 method of determining confidence limits and a more time-consuming curve fitting method is performed. Results: The TG No. 119 standard normal distribution does not fit the measured data. However, due to the small range of measured data points, the inaccuracy of the fit has only a small effect on the final value of the confidence limits. The confidence limits for the 111 patient plans are within 0.1% of each other for all distributions. The maximum expected difference in confidence limits, calculated using TG No. 119's approximation and a truncated distribution, is 1.2%. Conclusions: A three-parameter Weibull probability distribution more accurately fits the clinical gamma index pass-rate data than the normal distribution adopted by TG No. 119. However, the sensitivity of the confidence limit on distribution fit is low outside of exceptional circumstances.
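
    The contrast the authors examine can be sketched on simulated pass-rate data. Here an empirical percentile stands in for a fitted Weibull or truncated-normal percentile to stay dependency-free, and the data are simulated, not the 111 clinical plans:

```python
import numpy as np

rng = np.random.default_rng(3)
# Hypothetical gamma pass rates (%): high, left-skewed, bounded above by 100
rates = 100.0 - 4.0 * rng.weibull(1.5, 111)

# TG-119-style lower confidence limit: assume normality, take mean - 1.96 * sd
normal_cl = rates.mean() - 1.96 * rates.std(ddof=1)

# Distribution-aware alternative: the empirical 2.5th percentile of the data
empirical_cl = np.percentile(rates, 2.5)
```

    Because clinical pass rates cluster in a narrow range near 100%, the two limits usually land close together, which is the paper's conclusion about the practical impact of the normality assumption.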

  8. Lessons from a pilot project in cognitive task analysis: the potential role of intermediates in preclinical teaching in dental education.

    Science.gov (United States)

    Walker, Judith; von Bergmann, HsingChi

    2015-03-01

    The purpose of this study was to explore the use of cognitive task analysis to inform the teaching of psychomotor skills and cognitive strategies in clinical tasks in dental education. Methods used were observing and videotaping an expert at one dental school thinking aloud while performing a specific preclinical task (in a simulated environment), interviewing the expert to probe deeper into his thinking processes, and applying the same procedures to analyze the performance of three second-year dental students who had recently learned the analyzed task and who represented a spectrum of their cohort's ability to undertake the procedure. The investigators sought to understand how experts (clinical educators) and intermediates (trained students) overlapped and differed at points in the procedure that represented the highest cognitive load, known as "critical incidents." Findings from this study and previous research identified possible limitations of current clinical teaching as a result of expert blind spots. These findings coupled with the growing evidence of the effectiveness of peer teaching suggest the potential role of intermediates in helping novices learn preclinical dentistry tasks.

  9. Wind tunnel test IA300 analysis and results, volume 1

    Science.gov (United States)

    Kelley, P. B.; Beaufait, W. B.; Kitchens, L. L.; Pace, J. P.

    1987-01-01

    The analysis and interpretation of wind tunnel pressure data from the Space Shuttle wind tunnel test IA300 are presented. The primary objective of the test was to determine the effects of the Space Shuttle Main Engine (SSME) and the Solid Rocket Booster (SRB) plumes on the integrated vehicle forebody pressure distributions, the elevon hinge moments, and wing loads. The results of this test will be combined with flight test results to form a new data base to be employed in the IVBC-3 airloads analysis. A secondary objective was to obtain solid plume data for correlation with the results of gaseous plume tests. Data from the power-level portion were used in conjunction with flight base pressures to evaluate nominal power levels to be used during the investigation of changes in model attitude, elevon deflection, and nozzle gimbal angle. The plume-induced aerodynamic loads were developed for the Space Shuttle bases and forebody areas. A computer code was developed to integrate the pressure data. Using simplified geometrical models of the Space Shuttle elements and components, the pressure data were integrated to develop plume-induced force and moment coefficients that can be combined with a power-off data base to develop a power-on data base.
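    The pressure-integration step described in the abstract can be illustrated with a toy panel model. All geometry, reference area, and Cp values below are invented for illustration; this is not the test's computer code.

```python
import numpy as np

# Toy panel model: per-panel area, outward unit normal, and measured Cp
areas   = np.array([1.2, 0.8, 1.0, 1.5])                 # m^2
normals = np.array([[0., 0., 1.], [0., 0., 1.],
                    [1., 0., 0.], [0., 1., 0.]])         # unit vectors
cp      = np.array([-0.35, -0.20, 0.10, -0.05])          # pressure coefficients

S_ref = 4.5  # assumed reference area, m^2

# Pressure acts opposite the outward normal, so the integrated force
# coefficient is C_F = -sum_i(Cp_i * A_i * n_i) / S_ref
C_F = (-(cp * areas)[:, None] * normals).sum(axis=0) / S_ref
print(C_F)
```

    Summing per-panel contributions like this (with moments taken about a reference point) is the standard way pressure data are reduced to force and moment coefficients that can then be merged with a power-off data base.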

  10. Waste Isolation Pilot Plant Safety Analysis Report. Volume 5

    International Nuclear Information System (INIS)

    1986-01-01

    This Safety Analysis Report (SAR) has been prepared by the US Department of Energy (DOE) to support the construction and operation of the Waste Isolation Pilot Plant (WIPP) in southeastern New Mexico. The WIPP facility is designed to receive, inspect, emplace, and store unclassified defense-generated transuranic wastes in a retrievable fashion in an underground salt medium and to conduct studies and perform experiments in salt with high-level wastes. Upon the successful completion of these studies and experiments, WIPP is designed to serve as a permanent facility. The first chapter of this report provides a summary of the location and major design features of WIPP. Chapters 2 through 5 describe the site characteristics, design criteria, and design bases used in the design of the plant and the plant operations. Chapter 6 discusses radiation protection; Chapters 7 and 8 present an accident analysis of the plant and an assessment of the long-term waste isolation at WIPP. The conduct of operations and operating controls and limits are discussed in Chapters 9 and 10. The quality assurance programs are described in Chapter 11

  11. Waste Isolation Pilot Plant Safety Analysis Report. Volume 2

    International Nuclear Information System (INIS)

    1986-01-01

    This Safety Analysis Report (SAR) has been prepared by the US Department of Energy (DOE) to support the construction and operation of the Waste Isolation Pilot Plant (WIPP) in southeastern New Mexico. The WIPP facility is designed to receive, inspect, emplace, and store unclassified defense-generated transuranic wastes in a retrievable fashion in an underground salt medium and to conduct studies and perform experiments in salt with high-level wastes. Upon the successful completion of these studies and experiments, WIPP is designed to serve as a permanent facility. The first chapter of this report provides a summary of the location and major design features of WIPP. Chapters 2 through 5 describe the site characteristics, design criteria, and design bases used in the design of the plant and the plant operations. Chapter 6 discusses radiation protection; Chapters 7 and 8 present an accident analysis of the plant and an assessment of the long-term waste isolation at WIPP. The conduct of operations and operating controls and limits are discussed in Chapters 9 and 10. The quality assurance programs are described in Chapter 11

  12. Waste Isolation Pilot Plant Safety Analysis Report. Volume 4

    International Nuclear Information System (INIS)

    1986-01-01

    This Safety Analysis Report (SAR) has been prepared by the US Department of Energy (DOE) to support the construction and operation of the Waste Isolation Pilot Plant (WIPP) in southeastern New Mexico. The WIPP facility is designed to receive, inspect, emplace, and store unclassified defense-generated transuranic wastes in a retrievable fashion in an underground salt medium and to conduct studies and perform experiments in salt with high-level wastes. Upon the successful completion of these studies and experiments, WIPP is designed to serve as a permanent facility. The first chapter of this report provides a summary of the location and major design features of WIPP. Chapters 2 through 5 describe the site characteristics, design criteria, and design bases used in the design of the plant and the plant operations. Chapter 6 discusses radiation protection; Chapters 7 and 8 present an accident analysis of the plant and an assessment of the long-term waste isolation at WIPP. The conduct of operations and operating controls and limits are discussed in Chapters 9 and 10. The quality assurance programs are described in Chapter 11

  13. Waste Isolation Pilot Plant Safety Analysis Report. Volume 1

    International Nuclear Information System (INIS)

    1986-01-01

    This Safety Analysis Report (SAR) has been prepared by the US Department of Energy (DOE) to support the construction and operation of the Waste Isolation Pilot Plant (WIPP) in southeastern New Mexico. The WIPP facility is designed to receive, inspect, emplace, and store unclassified defense-generated transuranic wastes in a retrievable fashion in an underground salt medium and to conduct studies and perform experiments in salt with high-level wastes. Upon the successful completion of these studies and experiments, WIPP is designed to serve as a permanent facility. The first chapter of this report provides a summary of the location and major design features of WIPP. Chapters 2 through 5 describe the site characteristics, design criteria, and design bases used in the design of the plant and the plant operations. Chapter 6 discusses radiation protection; Chapters 7 and 8 present an accident analysis of the plant and an assessment of the long-term waste isolation at WIPP. The conduct of operations and operating controls and limits are discussed in Chapters 9 and 10. The quality assurance programs are described in Chapter 11

  14. Does Flywheel Paradigm Training Improve Muscle Volume and Force? A Meta-Analysis.

    Science.gov (United States)

    Nuñez Sanchez, Francisco J; Sáez de Villarreal, Eduardo

    2017-11-01

    Núñez Sanchez, FJ and Sáez de Villarreal, E. Does flywheel paradigm training improve muscle volume and force? A meta-analysis. J Strength Cond Res 31(11): 3177-3186, 2017-Several studies have confirmed the efficacy of flywheel paradigm training for improving muscle volume and force. A meta-analysis of 13 studies with a total of 18 effect sizes was performed to analyse the role of various factors in the effectiveness of flywheel paradigm training. The following inclusion criteria were employed for the analysis: (a) randomized studies; (b) high validity and reliability instruments; (c) published in a high-quality peer-reviewed journal; (d) healthy participants; (e) studies where the eccentric programme was described; and (f) studies where increases in muscle volume and force were measured before and after training. Increases in muscle volume and force were noted through the use of flywheel systems during short periods of training. The increase in muscle mass appears not to have been influenced by the existence of eccentric overload during the exercise. The increase in force was significantly higher with the existence of eccentric overload during the exercise. The responses identified in this analysis are essential and should be considered by strength and conditioning professionals when selecting the most appropriate dose-response trends for flywheel paradigm systems to optimize the increase in muscle volume and force.
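    The per-study effect sizes aggregated in a meta-analysis like this one are typically standardized mean differences. A minimal sketch of the pooled-SD computation follows; the means, SDs, and sample sizes are hypothetical, not taken from the reviewed studies.

```python
import math

def cohens_d(m1, sd1, n1, m2, sd2, n2):
    """Standardized mean difference (Cohen's d) using the pooled SD --
    the kind of per-study effect size combined in a meta-analysis."""
    pooled = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2)
                       / (n1 + n2 - 2))
    return (m1 - m2) / pooled

# Hypothetical post- vs pre-training muscle-volume comparison
d = cohens_d(105.0, 10.0, 20, 100.0, 10.0, 20)
print(d)  # 0.5
```

    With equal SDs of 10 and a mean difference of 5, the pooled SD is 10 and the effect size is exactly 0.5, conventionally a "medium" effect.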

  15. Analysis meets geometry the Mikael Passare memorial volume

    CERN Document Server

    Boman, Jan; Kiselman, Christer; Kurasov, Pavel; Sigurdsson, Ragnar

    2017-01-01

    This book is dedicated to the memory of Mikael Passare, an outstanding Swedish mathematician who devoted his life to developing the theory of analytic functions in several complex variables and exploring geometric ideas first-hand. It includes several papers describing Mikael’s life as well as his contributions to mathematics, written by friends of Mikael’s who share his attitude and passion for science. A major section of the book presents original research articles that further develop Mikael’s ideas and which were written by his former students and co-authors. All these mathematicians work at the interface of analysis and geometry, and Mikael’s impact on their research cannot be overestimated. Most of the contributors were invited speakers at the conference organized at Stockholm University in his honor. This book is an attempt to express our gratitude towards this great mathematician, who left us full of energy and new creative mathematical ideas.

  16. A task analysis-linked approach for integrating the human factor in reliability assessments of nuclear power plants

    International Nuclear Information System (INIS)

    Ryan, T.G.

    1988-01-01

    This paper describes an emerging Task Analysis-Linked Evaluation Technique (TALENT) for assessing the contributions of human error to nuclear power plant systems unreliability and risk. Techniques such as TALENT have emerged from the recognition that human error is a primary contributor to plant risk, yet has historically received only peripheral consideration in plant reliability evaluations. TALENT also recognizes that the involvement of persons with behavioral science expertise is required to support plant reliability and risk analyses. A number of state-of-knowledge human reliability analysis tools are also discussed which support the TALENT process. The core of TALENT is comprised of task, timeline and interface analysis data which provide the technology base for event and fault tree development, serve as criteria for selecting and evaluating performance shaping factors, and provide a basis for auditing TALENT results. Finally, programs and case studies used to refine the TALENT process are described along with future research needs in the area. (author)

  17. Oak Ridge Health Studies Phase 1 report, Volume 2: Part D, Dose Reconstruction Feasibility Study. Tasks 6, Hazard summaries for important materials at the Oak Ridge Reservation

    Energy Technology Data Exchange (ETDEWEB)

    Bruce, G.M.; Walker, L.B.; Widner, T.E.

    1993-09-01

    The purpose of Task 6 of the Oak Ridge Phase I Health Studies is to provide summaries of current knowledge of the toxic and hazardous properties of materials that are important for the Oak Ridge Reservation. The information gathered in the course of Task 6 investigations will support the task of focusing any future health study efforts on those operations and emissions which have likely been most significant in terms of off-site health risk. The information gathered in Task 6 efforts will likely also be of value to individuals evaluating the feasibility of additional health study efforts (such as epidemiological investigations) in the Oak Ridge area and as a resource for citizens seeking information on historical emissions.

  18. Automated acoustic analysis of task dependency in adductor spasmodic dysphonia versus muscle tension dysphonia.

    Science.gov (United States)

    Roy, Nelson; Mazin, Alqhazo; Awan, Shaheen N

    2014-03-01

    Distinguishing muscle tension dysphonia (MTD) from adductor spasmodic dysphonia (ADSD) can be difficult. Unlike MTD, ADSD is described as "task-dependent," implying that dysphonia severity varies depending upon the demands of the vocal task, with connected speech thought to be more symptomatic than sustained vowels. This study used an acoustic index of dysphonia severity (i.e., the Cepstral Spectral Index of Dysphonia [CSID]) to: 1) assess the value of "task dependency" to distinguish ADSD from MTD, and to 2) examine associations between the CSID and listener ratings. Case-Control Study. CSID estimates of dysphonia severity for connected speech and sustained vowels of patients with ADSD (n = 36) and MTD (n = 45) were compared. The diagnostic precision of task dependency (as evidenced by differences in CSID-estimated dysphonia severity between connected speech and sustained vowels) was examined. In ADSD, CSID-estimated severity for connected speech (M = 39.2, SD = 22.0) was significantly worse than for sustained vowels (M = 29.3, SD = 21.9), [P = .020]. Whereas in MTD, no significant difference in CSID-estimated severity was observed between connected speech (M = 55.1, SD = 23.8) and sustained vowels (M = 50.0, SD = 27.4), [P = .177]. CSID evidence of task dependency correctly identified 66.7% of ADSD cases (sensitivity) and 64.4% of MTD cases (specificity). CSID and listener ratings were significantly correlated. Task dependency in ADSD, as revealed by differences in acoustically-derived estimates of dysphonia severity between connected speech and sustained vowel production, is a potentially valuable diagnostic marker. © 2013 The American Laryngological, Rhinological and Otological Society, Inc.
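    The reported sensitivity and specificity follow directly from the confusion-matrix counts. The cell counts below are back-calculated from the abstract's percentages (66.7% of 36 ADSD cases, 64.4% of 45 MTD cases) and are therefore inferred, not reported by the study.

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Inferred counts: 24/36 ADSD flagged task-dependent, 29/45 MTD not flagged
sens, spec = sens_spec(tp=24, fn=12, tn=29, fp=16)
print(round(100 * sens, 1), round(100 * spec, 1))  # 66.7 64.4
```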

  19. A hydrogen energy carrier. Volume 2: Systems analysis

    Science.gov (United States)

    Savage, R. L. (Editor); Blank, L. (Editor); Cady, T. (Editor); Cox, K. (Editor); Murray, R. (Editor); Williams, R. D. (Editor)

    1973-01-01

    A systems analysis of hydrogen as an energy carrier in the United States indicated that it is feasible to use hydrogen in all energy use areas, except some types of transportation. These use areas are industrial, residential and commercial, and electric power generation. Saturation concept and conservation concept forecasts of future total energy demands were made. Projected costs of producing hydrogen from coal or from nuclear heat combined with thermochemical decomposition of water are in the range $1.00 to $1.50 per million Btu of hydrogen produced. Other methods are estimated to be more costly. The use of hydrogen as a fuel will require the development of large-scale transmission and storage systems. A pipeline system similar to the existing natural gas pipeline system appears practical, if design factors are included to avoid hydrogen environment embrittlement of pipeline metals. Conclusions from the examination of the safety, legal, environmental, economic, political and societal aspects of hydrogen fuel are that a hydrogen energy carrier system would be compatible with American values and the existing energy system.

  20. Engineering characterization of ground motion. Task II. Effects of ground motion characteristics on structural response considering localized structural nonlinearities and soil-structure interaction effects. Volume 2

    International Nuclear Information System (INIS)

    Kennedy, R.P.; Kincaid, R.H.; Short, S.A.

    1985-03-01

    This report presents the results of part of a two-task study on the engineering characterization of earthquake ground motion for nuclear power plant design. Task I of the study, which is presented in NUREG/CR-3805, Vol. 1, developed a basis for selecting design response spectra taking into account the characteristics of free-field ground motion found to be significant in causing structural damage. Task II incorporates additional considerations of effects of spatial variations of ground motions and soil-structure interaction on foundation motions and structural response. The results of Task II are presented in four parts: (1) effects of ground motion characteristics on structural response of a typical PWR reactor building with localized nonlinearities and soil-structure interaction effects; (2) empirical data on spatial variations of earthquake ground motion; (3) soil-structure interaction effects on structural response; and (4) summary of conclusions and recommendations based on Tasks I and II studies. This report presents the results of the first part of Task II. The results of the other parts will be presented in NUREG/CR-3805, Vols. 3 to 5

  1. Development of the complex of nuclear-physical methods of analysis for geology and technology tasks in Kazakhstan

    International Nuclear Information System (INIS)

    Solodukhin, V.; Silachyov, I.; Poznyak, V.; Gorlachev, I.

    2016-01-01

    The paper describes the development of nuclear-physical methods of analysis and their applications in Kazakhstan for geological and technological tasks. The basic methods of this complex include instrumental neutron-activation analysis, X-ray fluorescence analysis and instrumental γ-spectrometry. The following aspects are discussed: applications of developed and adopted analytical techniques for the assessment and calculation of rare-earth metal reserves at various deposits in Kazakhstan, for technology development of mining and extraction from uranium-phosphorous ore and wastes, for radioactive coal gasification technology, and for studies of rare metal contents in chromite, bauxites, black shales and their processing products. (author)

  2. Quantitative Analysis of Language Production in Parkinson's Disease Using a Cued Sentence Generation Task

    Science.gov (United States)

    Vanhoutte, Sarah; De Letter, Miet; Corthals, Paul; Van Borsel, John; Santens, Patrick

    2012-01-01

    The present study examined language production skills in Parkinson's disease (PD) patients. A unique cued sentence generation task was created in order to reduce demands on memory and attention. Differences in sentence production abilities according to disease severity and cognitive impairments were assessed. Language samples were obtained from 20…

  3. A meta-analysis of the impact of situationally induced achievement goals on task performance

    NARCIS (Netherlands)

    Van Yperen, Nico W.; Blaga, Monica; Postmes, Thomas

    2015-01-01

    The purpose of this research was to meta-analyze studies which experimentally induced an achievement goal state to examine its causal effect on the individual’s performance at the task at hand, and to investigate the moderator effects of feedback anticipation and time pressure. The data set

  4. A Task-Specific Analysis of the Benefit of Haptic Shared Control During Tele-Manipulation

    NARCIS (Netherlands)

    Boessenkool, H.; Abbink, D. A.; Heemskerk, C. J. M.; van der Helm, F. C. T.; Wildenbeest, J. G. W.

    2013-01-01

    Tele-manipulation allows humans to perform operations in a remote environment, but task performance and required time are negatively influenced when (haptic) feedback is limited. Improvement of transparency (reflected forces) is an important focus in the literature, but despite significant progress,

  5. Difficulties in solving context-based PISA mathematics tasks : An analysis of students' errors

    NARCIS (Netherlands)

    Wijaya, Ariyadi; van den Heuvel-Panhuizen, Marja; Doorman, Michiel; Robitzsch, Alexander

    2014-01-01

    The intention of this study was to clarify students' difficulties in solving context-based mathematics tasks as used in the Programme for International Student Assessment (PISA). The study was carried out with 362 Indonesian ninth- and tenth-grade students. In the study we used 34 released PISA

  6. Path Analysis Examining Self-Efficacy and Decision-Making Performance on a Simulated Baseball Task

    Science.gov (United States)

    Hepler, Teri J.; Feltz, Deborah L.

    2012-01-01

    The purpose of this study was to examine the relationship between decision-making self-efficacy and decision-making performance in sport. Undergraduate students (N = 78) performed 10 trials of a decision-making task in baseball. Self-efficacy was measured before performing each trial. Decision-making performance was assessed by decision speed and…

  7. A Task Analysis of a Sport Education Physical Education Season for Fourth Grade Students

    Science.gov (United States)

    Layne, Todd; Hastie, Peter

    2015-01-01

    Background: Previous research on Sport Education in which the participants were in the primary grades has focused on perceptions of fun and enjoyment as well as other components of motivation. To date, no study in Sport Education has examined the accomplishment of the various instructional and managerial tasks by upper primary school children,…

  8. Exploring strategies in integrated container terminal planning tasks : A data-intensive simulation game analysis

    NARCIS (Netherlands)

    Kurapati, S.; Lukosch, H.K.; Cunningham, S.; Kwakkel, J.H.; Verbraeck, A.

    2016-01-01

    Planning tasks in modern, fully automated container terminals require a high awareness of the complex situation, and successful planning strategies. Operational planning includes both strategies of planning and resource management. As planning procedures are not (yet) fully automated, a skilled

  9. Task and person-focused leadership behaviors and team performance : A meta-analysis

    NARCIS (Netherlands)

    Ceri-Booms, Meltem; Curseu, P.L.; Oerlemans, L.A.G.

    2017-01-01

    This paper reports the results of a meta-analytic review of the relationship between person and task oriented leader behaviors, on the one hand, and team performance, on the other hand. The results, based on 89 independent samples, show a moderate positive (ρ=.33) association between both types of

  10. Gender Perspectives on Spatial Tasks in a National Assessment: A Secondary Data Analysis

    Science.gov (United States)

    Logan, Tracy; Lowrie, Tom

    2017-01-01

    Most large-scale summative assessments present results in terms of cumulative scores. Although such descriptions can provide insights into general trends over time, they do not provide detail of how students solved the tasks. Less restrictive access to raw data from these summative assessments has occurred in recent years, resulting in…

  11. A diffusion decision model analysis of evidence variability in the lexical decision task

    NARCIS (Netherlands)

    Tillman, Gabriel; Osth, Adam F.; van Ravenzwaaij, Don; Heathcote, Andrew

    2017-01-01

    The lexical-decision task is among the most commonly used paradigms in psycholinguistics. In both the signal-detection theory and Diffusion Decision Model (DDM; Ratcliff, Gomez, & McKoon, Psychological Review, 111, 159–182, 2004) frameworks, lexical-decisions are based on a continuous source of

  12. How Can Writing Tasks Be Characterized in a Way Serving Pedagogical Goals and Automatic Analysis Needs?

    Science.gov (United States)

    Quixal, Martí; Meurers, Detmar

    2016-01-01

    The paper tackles a central question in the field of Intelligent Computer-Assisted Language Learning (ICALL): How can language learning tasks be conceptualized and made explicit in a way that supports the pedagogical goals of current Foreign Language Teaching and Learning and at the same time provides an explicit characterization of the Natural…

  13. Iowa Gambling Task in patients with early-onset Parkinson’s disease: strategy analysis

    Czech Academy of Sciences Publication Activity Database

    Gescheidt, T.; Czekóová, Kristína; Urbánek, Tomáš; Mareček, R.; Mikl, M.; Kubíková, R.; Telecká, S.; Andrlová, H.; Husárová, I.; Bareš, M.

    2012-01-01

    Roč. 33, č. 6 (2012), s. 1329-1335 ISSN 1590-1874 R&D Projects: GA ČR(CZ) GAP407/12/2432 Institutional support: RVO:68081740 Keywords : Parkinson’s disease * decision making * Iowa gambling task * executive function Subject RIV: FL - Psychiatry, Sexuology Impact factor: 1.412, year: 2012

  14. Different Bilingual Experiences Might Modulate Executive Tasks Advantages: Comparative Analysis between Monolinguals, Translators, and Interpreters.

    Science.gov (United States)

    Henrard, Sébastien; Van Daele, Agnès

    2017-01-01

    Many studies have shown that being bilingual presents an advantage in executive control. However, it appears that knowing two (or more) languages is not enough to improve executive control. According to the adaptive control hypothesis (Green and Abutalebi, 2013), the interactional context in which bilinguals behave is a key factor that modulates cognitive advantage in executive control. Translation and simultaneous interpretation are performed in a dual-language context: professional bi- and multilinguals use two or more languages within the same context (at work). Simultaneous interpretation differs from translation though, because of its higher level of time pressure, which increases the cognitive demands on executive control. The main objective of the present study is to investigate the relationship between simultaneous interpretation and some aspects of executive control. To this end, we compare the performance of three groups (60 interpreters, 60 translators, and 60 monolinguals) in five computerized tasks designed to assess different executive processes as well as the speed of information processing. The results show that the interpreters perform better than the monolinguals in all tasks and better than the translators in all tasks except for the one designed to assess flexibility. The results also show that the age variable does not have the same effect on performance in tasks designed to assess updating, flexibility, and resistance of proactive inhibition in bilinguals (both interpreters and translators), or in tasks designed to assess the speed of information processing and inhibition of a prepotent response in interpreters only. In addition to the advantage that being bilingual presents in some aspects of executive control, the results suggest that interpreters have an additional advantage that may be explained by the characteristics of their work activity (especially heavy time pressure) and by how much experience they have in this activity (in terms of

  15. Analysis of the posture pattern during robotic simulator tasks using an optical motion capture system.

    Science.gov (United States)

    Takayasu, Kenta; Yoshida, Kenji; Mishima, Takao; Watanabe, Masato; Matsuda, Tadashi; Kinoshita, Hidefumi

    2018-01-01

    Surgeons are sometimes forced to maintain uncomfortable joint positions during robotic surgery despite the high degree of instrument maneuverability. This study aimed to use an optical motion capture system to analyze the differences in posture patterns during robotic simulator tasks between surgeons at two skill levels. Ten experienced and ten novice surgeons performed two tasks in a da Vinci Skills Simulator: Suture Sponge 1 (SP) and Tubes (TU). The participants' upper body motion during each task was captured, including the joint angles (axilla, elbow, and wrist), the percentage of time when the wrist height was lower than the elbow height (PTW), and the height of the elbow and wrist relative to the armrest. The novice group showed significantly more excess extension in both elbow angles and extension (>50°) in both wrist angles than did the experienced group. The novice group had significantly lower PTW than the experienced group on the right side in both tasks (both p < 0.001), and on the left side in SP (p < 0.001). Compared with the experienced group, the novice group had a significantly higher elbow relative to the armrest on the right side (SP, TU: p < 0.05), and a significantly lower wrist relative to the armrest on the right side (SP, TU: p < 0.05). An optical motion capture system can detect the differences in posture patterns in the positional relationship between the elbow and wrist and the joint angles of the upper limb between two groups of surgeons at different skill levels during robotic simulator tasks.
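    Joint angles of the kind compared in this study can be computed from three optical-marker positions via the angle between the two limb-segment vectors. The marker coordinates below are made up for illustration; this is not the study's pipeline.

```python
import numpy as np

def joint_angle(a, b, c):
    """Angle (degrees) at joint b formed by 3D marker positions a, b, c."""
    v1 = np.asarray(a, float) - np.asarray(b, float)
    v2 = np.asarray(c, float) - np.asarray(b, float)
    cosang = v1 @ v2 / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return float(np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0))))

# Made-up markers (m): shoulder directly above the elbow, wrist straight ahead
elbow_angle = joint_angle([0, 0, 0.3], [0, 0, 0], [0.25, 0, 0])
print(elbow_angle)  # 90.0
```

    The clip on the cosine guards against floating-point values marginally outside [-1, 1], which would otherwise make arccos return NaN for nearly straight joints.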

  16. A 259.6 μW HRV-EEG Processor With Nonlinear Chaotic Analysis During Mental Tasks.

    Science.gov (United States)

    Roh, Taehwan; Hong, Sunjoo; Cho, Hyunwoo; Yoo, Hoi-Jun

    2016-02-01

    A system-on-chip (SoC) with nonlinear chaotic analysis (NCA) is presented for mental task monitoring. The proposed processor handles both heart rate variability (HRV) and electroencephalography (EEG). An independent component analysis (ICA) accelerator decreases the error of HRV extraction from 5.94% to 1.84% in the preprocessing step. Largest Lyapunov exponents (LLE), as well as linear features such as mean, standard deviation, and sub-band power, are calculated with NCA acceleration. Measurements with mental task protocols result in a confidence level of 95%. Thanks to the hardware acceleration, the chaos processor, fabricated in 0.13 μm CMOS technology, consumes only 259.6 μW.
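    A Lyapunov-exponent computation of the kind the NCA block accelerates can be illustrated on the logistic map, a standard chaotic benchmark. This is a textbook sketch, not the chip's algorithm, and the parameters are illustrative.

```python
import math

# Largest Lyapunov exponent of the logistic map x -> r*x*(1-x):
# average log|f'(x)| = log|r*(1 - 2x)| along the orbit.
r, x = 4.0, 0.3
total, n = 0.0, 10000
for _ in range(n):
    total += math.log(abs(r * (1.0 - 2.0 * x)))
    x = r * x * (1.0 - x)
lle = total / n
print(lle)  # near ln(2) ~ 0.693 for r = 4
```

    A positive exponent signals sensitive dependence on initial conditions; for physiological series like HRV, estimators such as Rosenstein's method play the analogous role.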

  17. Expectation and task for constructing the volume reduction system of removed soils. In search of the technical integrity from the intermediate storage to final disposal

    International Nuclear Information System (INIS)

    Mori, Hisaki

    2016-01-01

    The volume of removed soils and incineration ash destined for intermediate storage in Fukushima is estimated at about 22 million cubic meters. Under the law, final disposal of these removed soils and incineration ash outside Fukushima Prefecture must take place within 30 years of the start of intermediate storage. Because the removed soils have very low radioactivity, volume reduction is the most effective way to lessen the burden of final disposal. As volume reduction technology is still at the research and development stage, the possibility of introducing a volume reduction technology consistent with the final disposal technology is evaluated from the standpoint of cost. Since this undertaking carries economic and technical risks for private companies to implement, the project is considered appropriate to be carried out as a national project. (author)

  18. Analysis of Load-Carrying Capacity for Redundant Free-Floating Space Manipulators in Trajectory Tracking Task

    OpenAIRE

    Jia, Qingxuan; Liu, Yong; Chen, Gang; Sun, Hanxu; Peng, Junjie

    2014-01-01

    The aim of this paper is to analyze load-carrying capacity of redundant free-floating space manipulators (FFSM) in trajectory tracking task. Combined with the analysis of influential factors in load-carrying process, evaluation of maximum load-carrying capacity (MLCC) is described as multiconstrained nonlinear programming problem. An efficient algorithm based on repeated line search within discontinuous feasible region is presented to determine MLCC for a given trajectory of the end-effector ...

  19. A hierarchical task analysis of shoulder arthroscopy for a virtual arthroscopic tear diagnosis and evaluation platform (VATDEP).

    Science.gov (United States)

    Demirel, Doga; Yu, Alexander; Cooper-Baer, Seth; Dendukuri, Aditya; Halic, Tansel; Kockara, Sinan; Kockara, Nizamettin; Ahmadi, Shahryar

    2017-09-01

    Shoulder arthroscopy is a minimally invasive surgical procedure for the diagnosis and treatment of shoulder pathology. The procedure is performed with a fiber-optic camera, called an arthroscope, and instruments inserted through very small incisions made around the shoulder. The confined shoulder space, unintuitive camera orientation and constrained instrument motions complicate the procedure. Therefore, surgical competence in arthroscopy entails extensive training, especially for psychomotor skills development. Conventional arthroscopy training methods such as mannequins, cadavers or the apprenticeship model have limited use attributed to their low fidelity in realism, cost inefficiency or high risk. However, virtual reality (VR) based surgical simulators offer a realistic, low-cost, risk-free training and assessment platform where trainees can repeatedly perform arthroscopy and receive quantitative feedback on their performance. Therefore, we are developing a VR-based shoulder arthroscopy simulation, specifically for rotator cuff ailments, that can quantify surgical performance. Development of such a VR simulation requires a thorough task analysis that describes the steps and goals of the procedure and comprehensive metrics for quantitative and objective assessment of skills and surgical technique. We analyzed shoulder arthroscopic rotator cuff surgeries and created a hierarchical task tree. We introduced novel surgery metrics to reduce the subjectivity of the existing grading metrics and performed video analysis of 14 surgery recordings in the operating room (OR). We also analyzed our video analysis results with respect to the existing proposed metrics in the literature. We used Pearson's correlation tests to find any correlations among the task times, scores and surgery-specific information. We determined strong positive correlations between cleaning time vs difficulty in tying suture, cleaning time vs difficulty in passing suture, cleaning time vs scar
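    Correlations like those reported can be computed with a standard Pearson test. The task-time and difficulty-score vectors below are invented for illustration, not the study's 14-surgery data set.

```python
from scipy import stats

# Invented per-surgery measurements: cleaning time (min) vs suture difficulty
cleaning_time     = [4.2, 6.1, 5.5, 7.8, 3.9, 6.6, 8.0, 5.0]
suture_difficulty = [2.0, 3.1, 2.8, 4.0, 1.9, 3.3, 4.2, 2.5]

r, p = stats.pearsonr(cleaning_time, suture_difficulty)
print(round(r, 3), p < 0.05)
```

    Pearson's r only captures linear association, so with n = 14 recordings a strong correlation is suggestive rather than conclusive, which is consistent with the study's cautious framing.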

  20. Decision-making in obesity without eating disorders: a systematic review and meta-analysis of Iowa gambling task performances

    OpenAIRE

    Rotge, J. Y.; Poitou, C.; Fossati, P.; Aron-Wisnewsky, J.; Oppert, J.-M.

    2017-01-01

    Background: There is evidence that obesity is associated with impairments in executive functions, such as deficits in decision-making, planning or problem solving, which might interfere with weight loss in obese individuals. We performed a systematic review and meta-analysis of decision-making abilities, as measured with the Iowa gambling task (IGT), in obesity without eating disorders. Methods: A systematic search was conducted to identify studies comparing IGT perform...

  1. PRT Impact Study Pre-PRT Phase : Volume 1. Travel Analysis.

    Science.gov (United States)

    1976-03-01

    Part of a three-volume work, this report describes the analysis performed on travel data collected for the Pre-PRT Impact Study. The data analyzed consist of travel behavior, travel patterns, model utilization and travel costs of various modes of tra...

  2. Geotechnical Analysis Report for July 2004 - June 2005, Volume 2, Supporting Data

    International Nuclear Information System (INIS)

    2006-01-01

    This report is a compilation of geotechnical data presented as plots for each active instrument installed in the underground at the Waste Isolation Pilot Plant (WIPP) through June 30, 2005. A summary of the geotechnical analyses that were performed using the enclosed data is provided in Volume 1 of the Geotechnical Analysis Report (GAR).

  3. Geotechnical Analysis Report for July 2004 - June 2005, Volume 2, Supporting Data

    Energy Technology Data Exchange (ETDEWEB)

    Washington TRU Solutions LLC

    2006-03-20

    This report is a compilation of geotechnical data presented as plots for each active instrument installed in the underground at the Waste Isolation Pilot Plant (WIPP) through June 30, 2005. A summary of the geotechnical analyses that were performed using the enclosed data is provided in Volume 1 of the Geotechnical Analysis Report (GAR).

  4. Waste Isolation Pilot Plant Geotechnical Analysis Report for July 2005 - June 2006, Volume 2, Supporting Data

    Energy Technology Data Exchange (ETDEWEB)

    Washington TRU Solutions LLC

    2007-03-25

    This report is a compilation of geotechnical data presented as plots for each active instrument installed in the underground at the Waste Isolation Pilot Plant (WIPP) through June 30, 2006. A summary of the geotechnical analyses that were performed using the enclosed data is provided in Volume 1 of the Geotechnical Analysis Report (GAR).

  5. Physiological adaptation of maternal plasma volume during pregnancy: a systematic review and meta-analysis

    NARCIS (Netherlands)

    Haas, S.; Ghossein-Doha, C.; Kuijk, S.M. van; Drongelen, J. van; Spaanderman, M.E.A.

    2017-01-01

    OBJECTIVE: To describe the physiological pattern of gestational plasma volume adjustments in normal singleton pregnancy and compare this with the pattern in pregnancies complicated by pregnancy-induced hypertension, pre-eclampsia or fetal growth restriction. METHODS: We performed a meta-analysis of

  6. An Analysis of U.S. Sex Education Programs and Evaluation Methods. Volume I.

    Science.gov (United States)

    Kirby, Douglas; And Others

    The volume, first in a series of five, presents an analysis of sex education programs in the United States. It is presented in six chapters. Chapter I provides a brief overview of sex education in the public schools and summarizes goals, forms, and prevalence of sex education. Chapter II reviews literature on the effects of school sex education…

  7. SOLVENT-BASED TO WATERBASED ADHESIVE-COATED SUBSTRATE RETROFIT - VOLUME I: COMPARATIVE ANALYSIS

    Science.gov (United States)

    This volume represents the analysis of case study facilities' experience with waterbased adhesive use and retrofit requirements. (NOTE: The coated and laminated substrate manufacturing industry was selected as part of NRMRL's support of the 33/50 Program because of its significan...

  8. Doppler sonography of diabetic feet: Quantitative analysis of blood flow volume

    International Nuclear Information System (INIS)

    Seo, Young Lan; Kim, Ho Chul; Choi, Chul Soon; Yoon, Dae Young; Han, Dae Hee; Moon, Jeung Hee; Bae, Sang Hoon

    2002-01-01

    To analyze Doppler sonographic findings of diabetic feet by estimating the quantitative blood flow volume and by analyzing the waveform on Doppler. Doppler sonography was performed in thirty-four patients (10 diabetic patients with foot ulceration, 14 diabetic patients without ulceration and 10 normal patients as the normal control group) to measure the flow volume of the arteries of the lower extremities (posterior and anterior tibial arteries, and distal femoral artery). Analysis of Doppler waveforms was also done to evaluate the nature of the changed blood flow volume of diabetic patients, and the waveforms were classified into triphasic, biphasic-1, biphasic-2 and monophasic patterns. Flow volume of arteries in diabetic patients with foot ulceration was increased with statistical significance when compared to that of diabetic patients without foot ulceration or that of the normal control group (P<0.05). Analysis of Doppler waveforms revealed that the frequency of the biphasic-2 pattern was significantly higher in diabetic patients than in the normal control group (p<0.05). Doppler sonography in diabetic feet showed increased flow volume and a biphasic Doppler waveform, and these findings suggest neuropathy rather than ischemic changes in diabetic feet.

  9. Planning performance in schizophrenia patients: a meta-analysis of the influence of task difficulty and clinical and sociodemographic variables.

    Science.gov (United States)

    Knapp, F; Viechtbauer, W; Leonhart, R; Nitschke, K; Kaller, C P

    2017-08-01

    Despite a large body of research on planning performance in adult schizophrenia patients, results of individual studies are equivocal, suggesting either no, moderate or severe planning deficits. This meta-analysis therefore aimed to quantify planning deficits in schizophrenia and to examine potential sources of the heterogeneity seen in the literature. The meta-analysis comprised outcomes of planning accuracy of 1377 schizophrenia patients and 1477 healthy controls from 31 different studies which assessed planning performance using tower tasks such as the Tower of London, the Tower of Hanoi and the Stockings of Cambridge. A meta-regression analysis was applied to assess the influence of potential moderator variables (i.e. sociodemographic and clinical variables as well as task difficulty). The findings indeed demonstrated a planning deficit in schizophrenia patients (mean effect size: ; 95% confidence interval 0.56-0.78) that was moderated by task difficulty in terms of the minimum number of moves required for a solution. The results did not reveal any significant relationship between the extent of planning deficits and sociodemographic or clinical variables. The current results provide the first meta-analytic evidence for the commonly assumed impairments of planning performance in schizophrenia. Deficits are more likely to become manifest in problem items with higher demands on planning ahead, which may at least partly explain the heterogeneity of previous findings. As only a small fraction of studies reported coherent information on sample characteristics, future meta-analyses would benefit from more systematic reporting of those variables.

  10. High-resolution EEG analysis of power spectral density maps and coherence networks in a proportional reasoning task.

    Science.gov (United States)

    Vecchiato, Giovanni; Susac, Ana; Margeti, Stavroula; De Vico Fallani, Fabrizio; Maglione, Anton Giulio; Supek, Selma; Planinic, Maja; Babiloni, Fabio

    2013-04-01

    Proportional reasoning is a very important logical skill required in mathematics and science problem solving as well as in everyday life decisions. However, there is a lack of studies on the neurophysiological correlates of proportional reasoning. To explore the brain activity of healthy adults while performing a balance scale task, we used high-resolution EEG techniques and graph-theory-based connectivity analysis. After unskilled subjects learned how to properly solve the task, their cortical power spectral density (PSD) maps revealed increased parietal activity in the beta band, indicating that subjects had started to perform calculations. In addition, the number of inter-hemispheric connections decreased after learning, implying a rearrangement of brain activity. Repeated performance of the task led to a PSD decrease in the beta and gamma bands among parietal and frontal regions, along with a synchronization of lower frequencies. These findings suggest that repetition led to more automatic task performance. Subjects were also divided into two groups according to their scores on the test of logical thinking (TOLT). Although no group differences in accuracy or reaction times were found, EEG data showed higher activity in the beta and gamma bands for the group that scored better on the TOLT. Learning- and repetition-induced changes in the pattern of functional connectivity were evident in all frequency bands. Overall, the results indicate that higher-frequency oscillations in frontal and parietal regions are particularly important for proportional reasoning.
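The band-limited PSD analysis described above can be illustrated with SciPy's Welch estimator. The sampling rate, band edges, and synthetic signal below are assumptions for the sketch, not the study's recording parameters.

```python
# Illustrative Welch PSD and beta-band (13-30 Hz) power computation on a
# synthetic EEG-like signal: a 20 Hz (beta) oscillation buried in noise.
import numpy as np
from scipy.signal import welch
from scipy.integrate import trapezoid

fs = 256                                        # sampling rate in Hz (assumed)
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 20 * t) + 0.5 * rng.standard_normal(t.size)

freqs, psd = welch(eeg, fs=fs, nperseg=512)
band = (freqs >= 13) & (freqs <= 30)
beta_power = trapezoid(psd[band], freqs[band])  # integrate PSD over the band
print(f"beta-band power: {beta_power:.3f}")
```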

  11. Neural markers for immediate performance accuracy in a Stroop color-word matching task: an event-related potentials analysis.

    Science.gov (United States)

    Shou, Guofa; Ding, Lei

    2014-01-01

    The present study examined neural markers, measured in event-related potentials (ERPs), of immediate performance accuracy during a cognitive task with less conflict, i.e., a Stroop color-word matching task, in which participants were required to judge the congruency of two feature dimensions of a stimulus. In an effort to make ERP components more specific to distinct underlying neural substrates, the recorded EEG signals were first decomposed into multiple independent components (ICs) using independent component analysis (ICA). Thereafter, individual ICs with prominent sensory- or cognitive-related ERP components were selected to separately reconstruct scalp EEG signals at representative channels, from which ERP waveforms were built. Statistical comparisons of the amplitudes of stimulus-locked ERP components, i.e., prefrontal P2 and N2, parietal P3, and bilateral occipital P1 and N1, revealed significantly reduced P3 amplitude in error trials compared with correct trials. In addition, a significant ERN was observed in error trials but not in correct trials. Considering the temporal locus of semantic conflict in the present task, we conclude that the reduced P3 amplitude in error trials reflects an impaired process of resolving semantic conflict, which further leads to performance errors in the Stroop color-word matching task.
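The ICA decomposition step can be sketched with scikit-learn's FastICA on a toy two-channel mixture. The sources and mixing matrix are invented for illustration and are far simpler than multichannel EEG.

```python
# Hedged ICA sketch: unmix two synthetic sources from a two-channel
# mixture with FastICA. Sources and mixing matrix are invented.
import numpy as np
from sklearn.decomposition import FastICA

n = 2000
t = np.linspace(0, 8, n)
s1 = np.sin(2 * np.pi * 5 * t)               # oscillatory source
s2 = np.sign(np.sin(2 * np.pi * 3 * t))      # square-wave "artifact" source
S = np.c_[s1, s2]
A = np.array([[1.0, 0.5], [0.7, 1.2]])       # mixing into two channels
X = S @ A.T

ica = FastICA(n_components=2, random_state=0)
components = ica.fit_transform(X)            # recovered ICs (up to scale/order)
print(components.shape)                      # (2000, 2)
```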

  12. The analysis of probability task completion; Taxonomy of probabilistic thinking-based across gender in elementary school students

    Science.gov (United States)

    Sari, Dwi Ivayana; Budayasa, I. Ketut; Juniati, Dwi

    2017-08-01

    The formulation of mathematical learning goals is now oriented not only toward cognitive products but also toward cognitive processes, including probabilistic thinking. Probabilistic thinking is needed by students to make decisions. Elementary school students are required to develop probabilistic thinking as a foundation for learning probability at a higher level. A framework of students' probabilistic thinking had been developed using the SOLO taxonomy, which consists of prestructural, unistructural, multistructural and relational probabilistic thinking. This study aimed to analyze probability task completion based on this taxonomy of probabilistic thinking. The subjects were two fifth-grade students, a boy and a girl, selected by administering a test of mathematical ability and choosing students with high mathematical ability. Subjects were given probability tasks consisting of sample space, probability of an event and probability comparison. The data analysis consisted of categorization, reduction, interpretation and conclusion. Credibility of the data was established through time triangulation. The results indicated that the boy's probabilistic thinking in completing the probability tasks was at the multistructural level, while the girl's was at the unistructural level; that is, the boy's level of probabilistic thinking was higher than the girl's. These results could help curriculum developers in formulating probability learning goals for elementary school students, and teachers could teach probability with regard to gender differences.

  13. [Environmental investigation of ground water contamination at Wright- Patterson Air Force Base, Ohio]. Volume 4, Health and Safety Plan (HSP); Phase 1, Task 4 Field Investigation report: Draft

    Energy Technology Data Exchange (ETDEWEB)

    1991-10-01

    This Health and Safety Plan (HSP) was developed for the Environmental Investigation of Ground-water Contamination Investigation at Wright-Patterson Air Force Base near Dayton, Ohio, based on the projected scope of work for the Phase 1, Task 4 Field Investigation. The HSP describes hazards that may be encountered during the investigation, assesses the hazards, and indicates what type of personal protective equipment is to be used for each task performed. The HSP also addresses the medical monitoring program, decontamination procedures, air monitoring, training, site control, accident prevention, and emergency response.

  14. EEG Analysis during complex diagnostic tasks in Nuclear Power Plants - Simulator-based Experimental Study

    International Nuclear Information System (INIS)

    Ha, Jun Su; Seong, Poong Hyun

    2005-01-01

    In the literature there are many studies based on EEG signals during the cognitive activities of human beings, but most of them deal with simple cognitive activities such as transforming letters into Morse code, subtraction, reading, semantic memory search, visual search, and memorizing a set of words. In this work, EEG signals were analyzed during complex diagnostic tasks in an NPP simulator-based environment. The theta, alpha, beta, and gamma band EEG powers during the diagnostic tasks are investigated. The experimental design and procedure are presented in section 2 and the results are shown in section 3. Finally, some considerations are discussed and the direction for further work is proposed in section 4.

  15. Analysis of the financial task generated by the construction of a nuclear power plant in Mexico

    International Nuclear Information System (INIS)

    Alonso, G.; Ramirez, R.; Palacios, J.; Delfin, A.

    2011-11-01

    The construction of new nuclear reactors requires a high investment, making them capital-intensive projects that need a minimum of 5 years for their construction. The financial burden this represents for the electric utility is of vital importance; in other countries it prevents private companies from entering this type of project unless they have government support. In the case of Mexico, a vertically integrated electric utility can obtain financing to carry out this type of investment. This study analyzes the viability of constructing new nuclear reactors in Mexico based on the financial burden that their construction represents for the electric utility. (Author)

  16. Analysis of measurement process of placental volume in early pregnancy: an interobserver reliability study.

    Science.gov (United States)

    Florido, Jesús; Ocón, Olga; Luna del Castillo, Juan de Dios; Vega-Cañadas, Javier; Manrique-Espinoza, Nadya; Navarrete, Luis

    2014-09-01

    To examine concordance among the results obtained in the measurement of first-trimester placental volume using 3D ultrasound and eXtended Imaging Virtual Organ Computed-aided AnaLysis (XI-VOCAL®, 3DXITM, Medison, Seoul, Korea) image analysis by three different operators. Twenty first-trimester normal pregnancy cases were selected for placental volume measurement using a Medison SA 8000 Live Prime® (Medison, Seoul, Korea) with a convex volumetric multifrequency abdominal probe. Images were processed and studied independently by three operators with different levels of training. Each operator obtained 50 slices per case. Thus, this study is based on 1000 measurements, which generated four different sets of placental volume determinations based on 5, 10, 15, and 20 slices, respectively. The results of the measurement process were analyzed using reliability coefficients. There was a good degree of concordance in the placental length values obtained by all operators, and it did not depend on the number of slices measured [intraclass correlation coefficient (ICC)=0.734]. However, the number of slices is important for obtaining a more accurate placental volume. Reliability coefficients were low when determining placental volume adjusted to placental length (ICC=0.293), but the combined results of the two operators who were trained in the same way showed higher coefficients of reliability (ICC=0.682), and therefore greater concordance, compared with the operator who was not trained in the same way. Higher coefficients of reliability guarantee high grades of concordance in the results among operators when measuring placental volumes independently; however, the contouring process introduces high variability. Training in how best to use the image analysis software effectively assists in obtaining higher coefficients of reliability.
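A simple form of the reliability coefficient reported above is the one-way intraclass correlation, ICC(1,1), which can be computed directly from ANOVA mean squares. The ratings matrix below is invented, and the study's ICC model may differ (e.g. a two-way model for fixed operators).

```python
# Illustrative one-way intraclass correlation, ICC(1,1), as a simplified
# stand-in for the reliability coefficients reported above.
# Rows are cases, columns are operators; the numbers are invented.
import numpy as np

ratings = np.array([
    [61.0, 63.0, 60.0],
    [72.0, 70.0, 74.0],
    [55.0, 57.0, 54.0],
    [80.0, 78.0, 82.0],
    [66.0, 65.0, 68.0],
])
n, k = ratings.shape
row_means = ratings.mean(axis=1)
grand_mean = ratings.mean()

msb = k * np.sum((row_means - grand_mean) ** 2) / (n - 1)          # between cases
msw = np.sum((ratings - row_means[:, None]) ** 2) / (n * (k - 1))  # within cases
icc = (msb - msw) / (msb + (k - 1) * msw)
print(f"ICC(1,1) = {icc:.3f}")
```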

  17. Analysis of dual-task elderly gait in fallers and non-fallers using wearable sensors.

    Science.gov (United States)

    Howcroft, Jennifer; Kofman, Jonathan; Lemaire, Edward D; McIlroy, William E

    2016-05-03

    Dual-task (DT) gait involves walking while simultaneously performing an attention-demanding task and can be used to identify impaired gait or executive function in older adults. Advancement is needed in techniques that quantify the influence of dual tasking to improve predictive and diagnostic potential. This study investigated the viability of wearable sensor measures to identify DT gait changes in older adults and to distinguish between elderly fallers and non-fallers. A convenience sample of 100 older individuals (75.5±6.7 years; 76 non-fallers, 24 fallers based on 6-month retrospective fall occurrence) walked 7.62 m under single-task (ST) and DT conditions while wearing pressure-sensing insoles and tri-axial accelerometers at the head, pelvis, and left and right shanks. Differences between ST and DT gait were identified for temporal measures, acceleration descriptive statistics, Fast Fourier Transform (FFT) quartiles, the ratio of even to odd harmonics, the center of pressure (CoP) stance path coefficient of variation, and deviations from the expected CoP stance path. Increased posterior CoP stance path deviations, increased coefficient of variation, decreased FFT quartiles, and a decreased ratio of even to odd harmonics suggested increased DT gait variability. Decreased gait velocity and decreased acceleration standard deviations (SD) at the pelvis and shanks could represent compensatory gait strategies that maintain stability. Differences in acceleration between fallers and non-fallers were identified in head posterior SD and the pelvis AP ratio of even to odd harmonics during ST gait, and in the pelvis vertical maximum Lyapunov exponent during DT gait. Wearable-sensor-based DT gait assessments could be used in point-of-care environments to identify gait deficits. Copyright © 2016 Elsevier Ltd. All rights reserved.

  18. An Analysis of Individual Differences for a Computer Self-Regulated Reading Comprehension Task

    OpenAIRE

    López Escribano, Carmen; Moreno Ingelmo, Antonio

    2016-01-01

    Recent developments in brain imagery have made it possible to explore links between brain functions and psychological phenomena, opening a window between mind, brain and behavior. However, behavior cannot be understood by looking at the brain alone; the roles of context, task, and practice are potent forces in shaping behavior. In line with these ideas, we present a work experience to reflect on: 1) the variations of how people learn, 2) the learning potential of students with lear...

  19. Organizational analysis and safety for utilities with nuclear power plants: an organizational overview. Volume 1

    International Nuclear Information System (INIS)

    Osborn, R.N.; Olson, J.; Sommers, P.E.; McLaughlin, S.D.; Jackson, M.S.; Scott, W.G.; Connor, P.E.

    1983-08-01

    This two-volume report presents the results of initial research on the feasibility of applying organizational factors in nuclear power plant (NPP) safety assessment. A model is introduced for the purposes of organizing the literature review and showing key relationships among identified organizational factors and nuclear power plant safety. Volume I of this report contains an overview of the literature, a discussion of available safety indicators, and a series of recommendations for more systematically incorporating organizational analysis into investigations of nuclear power plant safety

  20. Improving the clinical correlation of multiple sclerosis black hole volume change by paired-scan analysis.

    Science.gov (United States)

    Tam, Roger C; Traboulsee, Anthony; Riddehough, Andrew; Li, David K B

    2012-01-01

    The change in T1-hypointense lesion ("black hole") volume is an important marker of pathological progression in multiple sclerosis (MS). Black hole boundaries often have low contrast and are difficult to determine accurately, and most (semi-)automated segmentation methods first compute the T2-hyperintense lesions, which are a superset of the black holes and are typically more distinct, to form a search space for the T1w lesions. Two main potential sources of measurement noise in longitudinal black hole volume computation are partial volume and variability in the T2w lesion segmentation. A paired analysis approach is proposed herein that uses registration to equalize partial volume and lesion mask processing to combine T2w lesion segmentations across time. The scans of 247 MS patients are used to compare a selected black hole computation method with an enhanced version incorporating paired analysis, using rank correlation to a clinical variable (MS functional composite) as the primary outcome measure. The comparison is done at nine different levels of intensity, as a previous study suggests that darker black holes may yield stronger correlations. The results demonstrate that paired analysis can strongly improve longitudinal correlation (from -0.148 to -0.303 in this sample) and may produce segmentations that are more sensitive to clinically relevant changes.
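The primary outcome measure above, rank correlation between a volume change and a clinical variable, can be sketched with SciPy. The data values are hypothetical, not from the 247-patient sample.

```python
# Sketch of a rank (Spearman) correlation between longitudinal lesion
# volume change and a clinical score. Data values are invented.
from scipy.stats import spearmanr

volume_change = [0.12, -0.05, 0.30, 0.08, -0.10, 0.22, 0.02, 0.15]  # mL
clinical_score = [-0.4, 0.2, -0.9, -0.1, 0.5, -0.6, 0.1, -0.3]      # MSFC-like

rho, p = spearmanr(volume_change, clinical_score)
print(f"Spearman rho = {rho:.3f}, p = {p:.4f}")
```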

  1. Leukocyte telomere length and hippocampus volume: a meta-analysis [version 1; referees: 2 approved

    Directory of Open Access Journals (Sweden)

    Gustav Nilsonne

    2015-10-01

    Leukocyte telomere length has been shown to correlate with hippocampus volume, but effect estimates differ in magnitude and are not uniformly positive. This study aimed primarily to investigate the relationship between leukocyte telomere length and hippocampus gray matter volume by meta-analysis, and secondarily to investigate possible effect moderators. Five studies were included with a total of 2107 participants, of which 1960 were contributed by one single influential study. A random-effects meta-analysis estimated the effect to be r = 0.12 [95% CI -0.13, 0.37] in the presence of heterogeneity and a subjectively estimated moderate to high risk of bias. There was no evidence that apolipoprotein E (APOE) genotype was an effect moderator, nor that the ratio of leukocyte telomerase activity to telomere length was a better predictor than leukocyte telomere length for hippocampus volume. This meta-analysis, while not proving a positive relationship, is also not able to disprove the earlier finding of a positive correlation in the one large study included in the analyses. We propose that a relationship between leukocyte telomere length and hippocampus volume may be mediated by transmigrating monocytes which differentiate into microglia in the brain parenchyma.
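A random-effects pooled correlation like the r = 0.12 [95% CI -0.13, 0.37] reported above is typically computed on Fisher-z-transformed correlations with a DerSimonian-Laird estimate of between-study variance. The sketch below uses invented (r, n) pairs, not the five included studies.

```python
# Hedged sketch of a random-effects meta-analysis of correlations using
# Fisher's z transform and the DerSimonian-Laird tau^2 estimator.
import numpy as np

studies = [(0.35, 50), (0.10, 120), (-0.05, 80), (0.40, 35), (0.12, 1960)]
z = np.array([np.arctanh(r) for r, n in studies])    # Fisher z per study
v = np.array([1.0 / (n - 3) for r, n in studies])    # within-study variance
w = 1.0 / v

# DerSimonian-Laird between-study variance tau^2
z_fixed = np.sum(w * z) / np.sum(w)
q = np.sum(w * (z - z_fixed) ** 2)
c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (q - (len(z) - 1)) / c)

w_star = 1.0 / (v + tau2)                            # random-effects weights
z_re = np.sum(w_star * z) / np.sum(w_star)
se = np.sqrt(1.0 / np.sum(w_star))
r_pooled = np.tanh(z_re)
ci = np.tanh([z_re - 1.96 * se, z_re + 1.96 * se])
print(f"pooled r = {r_pooled:.2f}, 95% CI [{ci[0]:.2f}, {ci[1]:.2f}]")
```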

  2. Engineering task plan for development, fabrication, and deployment of nested, fixed depth fluidic sampling and at-tank analysis systems

    International Nuclear Information System (INIS)

    REICH, F.R.

    1999-01-01

    An engineering task plan was developed that presents the resources, responsibilities, and schedules for the development, test, and deployment of the nested, fixed-depth fluidic sampling and at-tank analysis system. The sampling system, deployed in the privatization contract double-shell tank feed tank, will provide waste samples for assuring the readiness of the tank for shipment to the privatization contractor for vitrification. The at-tank analysis system will provide "real-time" assessments of the sampled wastes' chemical and physical properties. These systems support the Hanford Phase 1B Privatization Contract

  3. Changes in balance coordination and transfer to an unlearned balance task after slackline training: a self-organizing map analysis.

    Science.gov (United States)

    Serrien, Ben; Hohenauer, Erich; Clijsen, Ron; Taube, Wolfgang; Baeyens, Jean-Pierre; Küng, Ursula

    2017-11-01

    How humans maintain balance and change postural control due to age, injury, immobility or training is one of the basic questions in motor control. One of the problems in understanding postural control is the large set of degrees of freedom in the human motor system. Therefore, a self-organizing map (SOM), a type of artificial neural network, was used in the present study to extract and visualize information about high-dimensional balance strategies before and after a 6-week slackline training intervention. Thirteen subjects performed a flamingo and a slackline balance task before and after the training while full-body kinematics were measured. Range of motion, velocity and frequency of the center of mass and of the joint angles from the pelvis, trunk and lower leg (45 variables) were calculated and subsequently analyzed with an SOM. Subjects increased their standing time significantly on the flamingo (average +2.93 s, Cohen's d = 1.04) and slackline (+9.55 s, d = 3.28) tasks, but the effect size was more than three times larger on the slackline. The SOM analysis, followed by k-means clustering and a marginal homogeneity test, showed that the balance coordination pattern was significantly different between pre- and post-test for the slackline task only (χ² = 82.247). Balance on the slackline could be characterized by an increase in range of motion and a decrease in velocity and frequency in nearly all degrees of freedom simultaneously. The observation of low transfer of coordination strategies to the flamingo task adds further evidence for the task-specificity principle of balance training, meaning that slackline training alone will be insufficient to increase postural control in other challenging situations.
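A self-organizing map of the kind used above can be sketched in a few lines of NumPy: a grid of units is trained by repeatedly moving the best-matching unit and its neighbors toward each sample. The grid size, learning-rate schedule, and random data below are illustrative assumptions, not the study's configuration.

```python
# Minimal self-organizing map (SOM) sketch for high-dimensional data,
# e.g. 45 kinematic variables per observation as in the study above.
# All hyperparameters and data here are illustrative.
import numpy as np

rng = np.random.default_rng(0)
data = rng.standard_normal((200, 45))        # 200 observations x 45 variables
grid_h, grid_w = 6, 6
weights = rng.standard_normal((grid_h * grid_w, 45)) * 0.1
coords = np.array([(i, j) for i in range(grid_h) for j in range(grid_w)], float)

n_epochs = 20
for epoch in range(n_epochs):
    lr = 0.5 * (1 - epoch / n_epochs)        # decaying learning rate
    sigma = 3.0 * (1 - epoch / n_epochs) + 0.5
    for x in data:
        bmu = np.argmin(np.sum((weights - x) ** 2, axis=1))  # best-matching unit
        d2 = np.sum((coords - coords[bmu]) ** 2, axis=1)
        h = np.exp(-d2 / (2 * sigma ** 2))   # Gaussian neighborhood function
        weights += lr * h[:, None] * (x - weights)

# Quantization error: mean distance from each sample to its BMU
qe = np.mean([np.min(np.linalg.norm(weights - x, axis=1)) for x in data])
print(f"quantization error: {qe:.3f}")
```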

  4. Organizational analysis and safety for utilities with nuclear power plants: perspectives for organizational assessment. Volume 2

    International Nuclear Information System (INIS)

    Osborn, R.N.; Olson, J.; Sommers, P.E.

    1983-08-01

    This two-volume report presents the results of initial research on the feasibility of applying organizational factors in nuclear power plant (NPP) safety assessment. Volume 1 of this report contains an overview of the literature, a discussion of available safety indicators, and a series of recommendations for more systematically incorporating organizational analysis into investigations of nuclear power plant safety. The six chapters of this volume discuss the major elements in our general approach to safety in the nuclear industry. The chapters include information on organizational design and safety; organizational governance; utility environment and safety related outcomes; assessments by selected federal agencies; review of data sources in the nuclear power industry; and existing safety indicators

  5. Analysis of Load-Carrying Capacity for Redundant Free-Floating Space Manipulators in Trajectory Tracking Task

    Directory of Open Access Journals (Sweden)

    Qingxuan Jia

    2014-01-01

    The aim of this paper is to analyze the load-carrying capacity of redundant free-floating space manipulators (FFSM) in a trajectory tracking task. Combined with an analysis of the influential factors in the load-carrying process, evaluation of the maximum load-carrying capacity (MLCC) is described as a multiconstrained nonlinear programming problem. An efficient algorithm based on repeated line search within a discontinuous feasible region is presented to determine the MLCC for a given trajectory of the end-effector and the corresponding joint path. Then, considering the influence on MLCC of different initial configurations for the starting point of a given trajectory, a maximum payload initial configuration planning method is proposed using the PSO algorithm. Simulations are performed for a particular trajectory tracking task of a 7-DOF space manipulator, for which MLCC is evaluated quantitatively. In-depth examination of the simulation results reveals a significant gap between the values of MLCC obtained with different initial configurations and illustrates the discontinuity of the allowable load-carrying capacity. The proposed analytical method can be taken as a theoretical foundation for feasibility analysis, trajectory optimization, and optimal control of trajectory tracking tasks in on-orbit load-carrying operations.
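The PSO step can be illustrated with a minimal particle swarm minimizing a toy quadratic in 7 dimensions (standing in for a 7-DOF joint configuration). The objective, bounds, and hyperparameters are assumptions for the sketch, not the paper's formulation.

```python
# Minimal particle swarm optimization (PSO) sketch. The quadratic
# objective is a toy stand-in for the (negated) payload objective.
import numpy as np

rng = np.random.default_rng(1)

def objective(x):                       # toy stand-in, minimum at x = 3
    return np.sum((x - 3.0) ** 2)

n_particles, dim, iters = 30, 7, 100    # dim 7, e.g. a 7-DOF configuration
pos = rng.uniform(-5, 5, (n_particles, dim))
vel = np.zeros((n_particles, dim))
pbest = pos.copy()
pbest_val = np.array([objective(p) for p in pos])
gbest = pbest[np.argmin(pbest_val)].copy()

w, c1, c2 = 0.7, 1.5, 1.5               # inertia and acceleration weights
for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, dim))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    vals = np.array([objective(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[np.argmin(pbest_val)].copy()

print(f"best value: {objective(gbest):.6f}")   # should approach 0
```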

  6. Advances on Automatic Speech Analysis for Early Detection of Alzheimer Disease: A Non-linear Multi-task Approach.

    Science.gov (United States)

    Lopez-de-Ipina, Karmele; Martinez-de-Lizarduy, Unai; Calvo, Pilar M; Mekyska, Jiri; Beitia, Blanca; Barroso, Nora; Estanga, Ainara; Tainta, Milkel; Ecay-Torres, Mirian

    2018-01-01

    Nowadays, proper detection of cognitive impairment has become a challenge for the scientific community. Alzheimer's Disease (AD), the most common cause of dementia, has a high prevalence that is increasing at a fast pace towards epidemic level. In the not-so-distant future this fact could have a dramatic social and economic impact. In this scenario, an early and accurate diagnosis of AD could help to decrease its effects on patients, relatives and society. Over the last decades there have been useful advances not only in classic assessment techniques, but also in novel non-invasive screening methodologies. Among these methods, automatic analysis of speech, one of the first skills damaged in AD patients, is a natural and useful low-cost tool for diagnosis. In this paper a non-linear multi-task approach based on automatic speech analysis is presented. Three tasks with different language complexity levels are analyzed, and promising results that encourage a deeper assessment are obtained. Automatic classification was carried out using classic Multilayer Perceptrons (MLP) and Deep Learning by means of Convolutional Neural Networks (CNN), biologically-inspired variants of MLPs, over the tasks with classic linear features, perceptual features, Castiglioni fractal dimension and Multiscale Permutation Entropy. Finally, the most relevant features are selected by means of the non-parametric Mann-Whitney U-test. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.
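The final feature-selection step, ranking features by the Mann-Whitney U test between patient and control groups, can be sketched as follows. The feature values are synthetic, not speech features from the study.

```python
# Sketch of Mann-Whitney U feature ranking between two groups.
# Synthetic data: only feature 2 actually differs between groups.
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(7)
controls = rng.normal(0.0, 1.0, (40, 5))       # 40 subjects x 5 features
patients = rng.normal(0.0, 1.0, (40, 5))
patients[:, 2] += 1.5                          # inject a group difference

p_values = [mannwhitneyu(controls[:, j], patients[:, j]).pvalue
            for j in range(5)]
best = int(np.argmin(p_values))
print(f"most discriminative feature: {best} (p = {p_values[best]:.4g})")
```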

  7. Decision-making in obesity without eating disorders: a systematic review and meta-analysis of Iowa gambling task performances.

    Science.gov (United States)

    Rotge, J-Y; Poitou, C; Fossati, P; Aron-Wisnewsky, J; Oppert, J-M

    2017-08-01

    There is evidence that obesity is associated with impairments in executive functions, such as deficits in decision-making, planning or problem solving, which might interfere with weight loss in obese individuals. We performed a systematic review and meta-analysis of decision-making abilities, as measured with the Iowa gambling task (IGT), in obesity without eating disorders. A systematic search was conducted to identify studies comparing IGT performances between groups of obese patients without eating disorders and groups of healthy controls. The standardized mean differences were calculated for the total IGT scores and for the course of IGT scores. Meta-regression analyses were performed to explore the influence of clinical variables on standardized mean differences. Total IGT scores were significantly lower in obese patients compared with normal-weight healthy controls. IGT performances did not differ between groups for the first trials of the task. Significant effect sizes for the last trials of the task were subjected to a high degree of heterogeneity. Risky decision-making is impaired in obesity. The clinical importance of non-food-related decision-making impairments remains to be assessed especially in terms of consequences in daily life or the achievement of weight loss. This meta-analysis has been registered in the Prospero database (CRD42016037533). © 2017 World Obesity Federation.
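The pooled effect sizes behind such a meta-analysis are standardized mean differences; a minimal Cohen's d with the usual pooled standard deviation can be sketched as follows (the sample IGT scores are invented for illustration):

```python
import math

def cohens_d(group_a, group_b):
    """Standardized mean difference (Cohen's d) using the pooled standard deviation."""
    na, nb = len(group_a), len(group_b)
    ma = sum(group_a) / na
    mb = sum(group_b) / nb
    va = sum((x - ma) ** 2 for x in group_a) / (na - 1)  # sample variances
    vb = sum((x - mb) ** 2 for x in group_b) / (nb - 1)
    pooled_sd = math.sqrt(((na - 1) * va + (nb - 1) * vb) / (na + nb - 2))
    return (ma - mb) / pooled_sd

# Hypothetical total IGT scores: lower scores in the patient group give a negative d
patients = [20, 25, 30, 22, 28]
controls = [35, 40, 38, 42, 36]
d = cohens_d(patients, controls)
```

Published meta-analyses typically apply a small-sample correction (Hedges' g) on top of this quantity before pooling across studies.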

  8. Dexterity: A MATLAB-based analysis software suite for processing and visualizing data from tasks that measure arm or forelimb function.

    Science.gov (United States)

    Butensky, Samuel D; Sloan, Andrew P; Meyers, Eric; Carmel, Jason B

    2017-07-15

    Hand function is critical for independence, and neurological injury often impairs dexterity. To measure hand function in people or forelimb function in animals, sensors are employed to quantify manipulation. These sensors make assessment easier and more quantitative and allow automation of these tasks. While automated tasks improve objectivity and throughput, they also produce large amounts of data that can be burdensome to analyze. We created software called Dexterity that simplifies data analysis of automated reaching tasks. Dexterity is MATLAB software that enables quick analysis of data from forelimb tasks. Through a graphical user interface, files are loaded and data are identified and analyzed. These data can be annotated or graphed directly. Analysis is saved, and the graph and corresponding data can be exported. For additional analysis, Dexterity provides access to custom scripts created by other users. To determine the utility of Dexterity, we performed a study to evaluate the effects of task difficulty on the degree of impairment after injury. Dexterity analyzed two months of data and allowed new users to annotate the experiment, visualize results, and save and export data easily. Previous analysis of tasks was performed with custom data analysis, requiring expertise with analysis software. Dexterity made the tools required to analyze, visualize and annotate data easy to use by investigators without data science experience. Dexterity increases accessibility to automated tasks that measure dexterity by making analysis of large data intuitive, robust, and efficient. Copyright © 2017 Elsevier B.V. All rights reserved.

  9. Patients with Schizophrenia Fail to Up-Regulate Task-Positive and Down-Regulate Task-Negative Brain Networks: An fMRI Study Using an ICA Analysis Approach.

    Science.gov (United States)

    Nygård, Merethe; Eichele, Tom; Løberg, Else-Marie; Jørgensen, Hugo A; Johnsen, Erik; Kroken, Rune A; Berle, Jan Øystein; Hugdahl, Kenneth

    2012-01-01

    Recent research suggests that the cerebral correlates of cognitive deficits in schizophrenia are nested in the activity of widespread, inter-regional networks rather than being restricted to any specific brain location. One of the networks that have received focus lately is the default mode network. Parts of this network have been reported as hyper-activated in schizophrenia patients (SZ) during rest and during task performance compared to healthy controls (HC), although other parts have been found to be hypo-activated. In contrast to this network, task-positive networks have been reported as hypo-activated in SZ during task performance compared to HC. However, the results are mixed, with, e.g., the dorsolateral prefrontal cortex showing both hyper- and hypo-activation in SZ. In this study we were interested in signal increase and decrease differences between a group of SZ and HC in cortical networks, assuming that the regulatory dynamics of alternating task-positive and task-negative neuronal processes are aberrant in SZ. We compared 31 SZ to age- and gender-matched HC, and used fMRI and independent component analysis (ICA) in order to identify relevant networks. We selected the independent components (ICs) with the largest signal intensity increases (STG, insula, supplementary motor cortex, anterior cingulate cortex, and MTG) and decreases (fusiform gyri, occipital lobe, PFC, cingulate, precuneus, and angular gyrus) in response to a dichotic auditory cognitive task. These ICs were then tested for group differences. Our findings showed deficient up-regulation of the executive network and a corresponding deficit in the down-regulation of the anterior default mode, or effort network during task performance in SZ when compared with HC. These findings may indicate a deficit in the dynamics of alternating task-dependent and task-independent neuronal processes in SZ. The results may cast new light on the mechanisms underlying cognitive deficits in schizophrenia, and may be of …

  10. A Work-Demand Analysis Compatible with Preemption-Aware Scheduling for Power-Aware Real-Time Tasks

    Directory of Open Access Journals (Sweden)

    Da-Ren Chen

    2013-01-01

    Full Text Available Due to the importance of slack-time utilization for power-aware scheduling algorithms, we propose a work-demand analysis method called the parareclamation algorithm (PRA) to increase the slack-time utilization of existing real-time DVS algorithms. PRA is an online scheduling method for power-aware real-time tasks under the rate-monotonic (RM) policy. It can be implemented alongside preemption-aware or transition-aware scheduling algorithms, with which it is fully compatible, without increasing their computational complexities. The key technique of the heuristic doubles the analytical interval and turns the deferrable workload into potential slack time. Theoretical proofs show that PRA guarantees the task deadlines in a feasible RM schedule and takes linear time and space complexities. Experimental results indicate that the proposed method, combined with preemption-aware methods, seamlessly reduces energy consumption by 14% on average over the original algorithms.
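PRA itself is not reproduced here, but the rate-monotonic setting it operates in rests on standard feasibility checks. A sketch of the Liu-Layland utilization bound (sufficient only) and the exact response-time analysis for fixed-priority RM tasks, with an invented task set, illustrates why the exact test matters:

```python
import math

def rm_utilization_test(tasks):
    """Sufficient RM schedulability test: total utilization <= n(2^(1/n) - 1).
    tasks: list of (wcet, period); shorter period implies higher priority."""
    n = len(tasks)
    return sum(c / t for c, t in tasks) <= n * (2 ** (1 / n) - 1)

def response_time(tasks, i):
    """Exact worst-case response time of task i under RM.
    tasks must be sorted by period ascending; deadline = period."""
    c_i, t_i = tasks[i]
    r = c_i
    while True:
        # Interference from all higher-priority tasks released in [0, r)
        interference = sum(math.ceil(r / t) * c for c, t in tasks[:i])
        r_next = c_i + interference
        if r_next == r:
            return r          # converged; schedulable iff r <= t_i
        if r_next > t_i:
            return None       # deadline miss
        r = r_next

tasks = [(1, 4), (2, 6), (3, 12)]  # (WCET, period), sorted by period
```

This set fails the utilization bound (U ≈ 0.833 > 0.780) yet every response time fits within its period, so it is in fact RM-schedulable; slack-reclamation schemes like PRA exploit exactly the gap between worst-case demand and actual demand.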

  11. Designing simulator-based training: an approach integrating cognitive task analysis and four-component instructional design.

    Science.gov (United States)

    Tjiam, Irene M; Schout, Barbara M A; Hendrikx, Ad J M; Scherpbier, Albert J J M; Witjes, J Alfred; van Merriënboer, Jeroen J G

    2012-01-01

    Most studies of simulator-based surgical skills training have focused on the acquisition of psychomotor skills, but surgical procedures are complex tasks requiring both psychomotor and cognitive skills. As skills training is modelled on expert performance, which consists partly of unconscious automatic processes that experts are not always able to explicate, simulator developers should collaborate with educational experts and physicians in developing efficient and effective training programmes. This article presents an approach to designing simulator-based skill training comprising cognitive task analysis integrated with instructional design according to the four-component instructional design (4C/ID) model. This theory-driven approach is illustrated by a description of how it was used in the development of simulator-based training for the nephrostomy procedure.

  12. The Iowa Gambling Task in Parkinson's disease: A meta-analysis on effects of disease and medication.

    Science.gov (United States)

    Evens, Ricarda; Hoefler, Michael; Biber, Karolina; Lueken, Ulrike

    2016-10-01

    Decision-making under uncertainty as measured by the Iowa Gambling Task has frequently been studied in Parkinson's disease. The dopamine overdose hypothesis assumes that dopaminergic effects follow an inverted U-shaped function, restoring some cognitive functions while overdosing others. The present work quantitatively summarizes disease and medication effects on task performance and evaluates evidence for the dopamine overdose hypothesis of impaired decision-making in Parkinson's disease. A systematic literature search was performed to identify studies examining the Iowa Gambling Task in patients with Parkinson's disease. Outcomes were quantitatively combined, with separate estimates for the clinical (patients ON medication vs. healthy controls), disease (patients OFF medication vs. healthy controls), and medication effects (patients ON vs. OFF medication). Furthermore, using meta-regression analysis it was explored whether the study characteristics drug level, disease duration, and motor symptoms explained heterogeneous performance between studies. Patients with Parkinson's disease ON dopaminergic medication showed significantly impaired Iowa Gambling Task performance compared to healthy controls. This impairment was not normalized by short-term withdrawal of medication. Heterogeneity across studies was not explained by dopaminergic drug levels, disease durations or motor symptoms. While this meta-analysis showed significantly impaired decision-making performance in Parkinson's disease, there was no evidence that this impairment was related to dopamine overdosing. However, only very few studies assessed patients OFF medication and future studies are needed to concentrate on the modulation of dopaminergic drug levels and pay particular attention to problems related to repeated testing. Furthermore, short- vs. long-term medication effects demand further in-depth investigation. Copyright © 2016 Elsevier Ltd. All rights reserved.

  13. A task analysis of the content creators of a European digital library

    OpenAIRE

    Nasomtrug, Kanokporn

    2009-01-01

    This study explores the tasks of the digital library (DL) content creators of a European DL, namely European NAvigator (ENA), developed by the Centre Virtuel de la Connaissance sur l’Europe (Virtual Resource Centre for Knowledge about Europe), CVCE, which is located in Luxembourg. The CVCE is in the process of enhancing the DL by creating a new DL management system, partly in response to the European Union’s i2010 policy, as well as to newly emerging DL technologies and applications. The...

  14. Volume Segmentation and Analysis of Biological Materials Using SuRVoS (Super-region Volume Segmentation) Workbench.

    Science.gov (United States)

    Darrow, Michele C; Luengo, Imanol; Basham, Mark; Spink, Matthew C; Irvine, Sarah; French, Andrew P; Ashton, Alun W; Duke, Elizabeth M H

    2017-08-23

    Segmentation is the process of isolating specific regions or objects within an imaged volume, so that further study can be undertaken on these areas of interest. When considering the analysis of complex biological systems, the segmentation of three-dimensional image data is a time-consuming and labor-intensive step. With the increased availability of many imaging modalities and with automated data collection schemes, this poses an increased challenge for the modern experimental biologist to move from data to knowledge. This publication describes the use of SuRVoS Workbench, a program designed to address these issues by providing methods to semi-automatically segment complex biological volumetric data. Three datasets of differing magnification and imaging modalities are presented here, each highlighting different strategies of segmenting with SuRVoS. Phase contrast X-ray tomography (microCT) of the fruiting body of a plant is used to demonstrate segmentation using model training, cryo electron tomography (cryoET) of human platelets is used to demonstrate segmentation using super- and megavoxels, and cryo soft X-ray tomography (cryoSXT) of a mammalian cell line is used to demonstrate the label splitting tools. Strategies and parameters for each datatype are also presented. By blending a selection of semi-automatic processes into a single interactive tool, SuRVoS provides several benefits. Overall time to segment volumetric data is reduced by a factor of five when compared to manual segmentation, a mainstay in many image processing fields. This is a significant saving when full manual segmentation can take weeks of effort. Additionally, subjectivity is addressed through the use of computationally identified boundaries, and by splitting complex collections of objects by their calculated properties rather than on a case-by-case basis.

  15. Tools for the analysis of dose optimization: I. Effect-volume histogram

    International Nuclear Information System (INIS)

    Alber, M.; Nuesslin, F.

    2002-01-01

    With the advent of dose optimization algorithms, predominantly for intensity-modulated radiotherapy (IMRT), computer software has progressed beyond the point of being merely a tool at the hands of an expert and has become an active, independent mediator of the dosimetric conflicts between treatment goals and risks. To understand and control the internal decision finding as well as to provide means to influence it, a tool for the analysis of the dose distribution is presented which reveals the decision-making process performed by the algorithm. The internal trade-offs between partial volumes receiving high or low doses are driven by functions which attribute a weight to each volume element. The statistics of the distribution of these weights is cast into an effect-volume histogram (EVH) in analogy to dose-volume histograms. The analysis of the EVH reveals which traits of the optimum dose distribution result from the defined objectives, and which are a random consequence of under- or misspecification of treatment goals. The EVH can further assist in the process of finding suitable objectives and balancing conflicting objectives. If biologically inspired objectives are used, the EVH shows the distribution of local dose effect relative to the prescribed level. (author)
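An EVH, like a DVH, is a cumulative curve: the fraction of volume elements whose value meets or exceeds each threshold. A minimal sketch of that construction (the per-voxel weights and bin edges below are illustrative, not output of any optimizer):

```python
def cumulative_volume_histogram(values, bin_edges):
    """Fraction of volume elements with value >= each bin edge (DVH/EVH-style curve)."""
    n = len(values)
    return [sum(1 for v in values if v >= edge) / n for edge in bin_edges]

# Hypothetical per-voxel effect weights, expressed relative to the prescribed level
weights = [0.1, 0.4, 0.5, 0.8, 0.9, 1.0, 1.1, 0.2]
edges = [0.0, 0.25, 0.5, 0.75, 1.0]
curve = cumulative_volume_histogram(weights, edges)
```

Replacing `weights` with voxel doses and `edges` with dose levels yields the familiar DVH; the EVH simply substitutes the optimizer's per-voxel weight function for dose.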

  16. Mathematical tasks, study approaches, and course grades in undergraduate mathematics: a year-by-year analysis

    Science.gov (United States)

    Maciejewski, Wes; Merchant, Sandra

    2016-04-01

    Students approach learning in different ways, depending on the experienced learning situation. A deep approach is geared toward long-term retention and conceptual change while a surface approach focuses on quickly acquiring knowledge for immediate use. These approaches ultimately affect the students' academic outcomes. This study takes a cross-sectional look at the approaches to learning used by students from courses across all four years of undergraduate mathematics and analyses how these relate to the students' grades. We find that deep learning correlates with grade in the first year and not in the upper years. Surface learning has no correlation with grades in the first year and a strong negative correlation with grades in the upper years. Using Bloom's taxonomy, we argue that the nature of the tasks given to students is fundamentally different in lower and upper year courses. We find that first-year courses emphasize tasks that require only low-level cognitive processes. Upper year courses require higher level processes but, surprisingly, have a simultaneous greater emphasis on recall and understanding. These observations explain the differences in correlations between approaches to learning and course grades. We conclude with some concerns about the disconnect between first year and upper year mathematics courses and the effect this may have on students.

  17. A diffusion decision model analysis of evidence variability in the lexical decision task.

    Science.gov (United States)

    Tillman, Gabriel; Osth, Adam F; van Ravenzwaaij, Don; Heathcote, Andrew

    2017-12-01

    The lexical-decision task is among the most commonly used paradigms in psycholinguistics. In both the signal-detection theory and Diffusion Decision Model (DDM; Ratcliff, Gomez, & McKoon, Psychological Review, 111, 159-182, 2004) frameworks, lexical decisions are based on a continuous source of word-likeness evidence for both words and non-words. The Retrieving Effectively from Memory model of Lexical-Decision (REM-LD; Wagenmakers et al., Cognitive Psychology, 48(3), 332-367, 2004) provides a comprehensive explanation of lexical-decision data and makes the prediction that word-likeness evidence is more variable for words than non-words and that higher frequency words are more variable than lower frequency words. To test these predictions, we analyzed five lexical-decision data sets with the DDM. For all data sets, drift-rate variability changed across word frequency and non-word conditions. For the most part, REM-LD's predictions about the ordering of evidence variability across stimuli in the lexical-decision task were confirmed.
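The DDM fits described above estimate the drift rate and its across-trial variability (the model's eta parameter). As a sketch of the generative process being fitted, here is a single-trial random-walk simulation with normally distributed trial-to-trial drift; all parameter values are illustrative, not fitted quantities from the paper:

```python
import random

def simulate_ddm_trial(drift, sv, boundary, dt=0.001, noise=1.0, seed=None):
    """One diffusion-decision trial: evidence accumulates from 0 until it hits +/-boundary.
    sv is the across-trial standard deviation of drift (the DDM's eta parameter)."""
    rng = random.Random(seed)
    v = rng.gauss(drift, sv)       # trial-specific drift rate
    x, t = 0.0, 0.0
    step_sd = noise * dt ** 0.5    # within-trial diffusion noise per time step
    while abs(x) < boundary:
        x += v * dt + rng.gauss(0.0, step_sd)
        t += dt
    return ("word" if x > 0 else "nonword"), t

resp, rt = simulate_ddm_trial(drift=2.0, sv=0.5, boundary=1.0, seed=42)
```

Larger `sv` for word stimuli than non-words is exactly the REM-LD prediction the paper tests: it fattens the tails of the response-time distribution and raises error rates for nominally easy items.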

  18. Neural correlates of interference resolution in the multi-source interference task: a meta-analysis of functional neuroimaging studies.

    Science.gov (United States)

    Deng, Yuqin; Wang, Xiaochun; Wang, Yan; Zhou, Chenglin

    2018-04-10

    Interference resolution refers to cognitive control processes enabling one to focus on task-related information while filtering out unrelated information. However, the exact neural areas that underlie interference resolution in a specific cognitive task remain equivocal. The multi-source interference task (MSIT), as a particular cognitive task, is a well-established experimental paradigm used to evaluate interference resolution. Studies combining the MSIT with functional magnetic resonance imaging (fMRI) have shown that the MSIT evokes the dorsal anterior cingulate cortex (dACC) and cingulate-frontal-parietal cognitive-attentional networks. However, these brain areas have not been evaluated quantitatively and these findings have not been replicated. In the current study, we report the first voxel-based meta-analysis of functional brain activation associated with the MSIT, so as to identify the localization of interference resolution in this specific cognitive task. Articles on MSIT-related fMRI published between 2003 and July 2017 were eligible. The electronic databases searched included PubMed, Web of Knowledge, and Google Scholar. Differential BOLD activation patterns between the incongruent and congruent conditions were meta-analyzed in anisotropic effect-size signed differential mapping software. Robustness meta-analysis indicated that two significant activation clusters had reliable functional activity in comparisons between incongruent and congruent conditions. The first reliable activation cluster, which included the dACC, medial prefrontal cortex, and supplementary motor area, replicated the previous MSIT-related fMRI study results. Furthermore, we found another reliable activation cluster comprising areas of the right insula, right inferior frontal gyrus, and right lenticular nucleus-putamen, which were not typically discussed in previous MSIT-related fMRI studies. The current meta-analysis presents the reliable brain activation patterns …

  19. GRPAUT: a program for Pu isotopic analysis (a user's guide). ISPO task A. 76

    Energy Technology Data Exchange (ETDEWEB)

    Fleissner, J G

    1981-01-30

    GRPAUT is a modular program for performing automated Pu isotopic analysis supplied to the International Atomic Energy Agency (IAEA) per ISPO Task A.76. Section I of this user's guide for GRPAUT presents an overview of the various programs and disk files that are used in performing a Pu isotopic analysis. Section II describes the program GRFEDT which is used in creating and editing the analysis parameter file that contains all the spectroscopic information needed at runtime by GRPAUT. An example of the dialog and output of GRFEDT is shown in Appendix B. Section III describes the operation of the various GRPAUT modules: GRPNL2, the peak stripping module; EFFCH2, the efficiency calculation module; and ISOAUT, the isotopic calculation module. (A description of the peak fitting methodology employed by GRPNL2 is presented in Appendix A.) Finally, Section IV outlines the procedure for determining the peak shape constants for a detector system and describes the operation of the program used to create and edit the peak shape parameter files. An output of GRPAUT, showing an example of a complete isotopic analysis, is presented in Appendix C. Source listings of all the Fortran programs supplied to the Agency under ISPO Task A.76 are contained in Appendix E.

  20. Oak Ridge Reservation volume I. Y-12 mercury task force files: A guide to record series of the Department of Energy and its contractors

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-02-17

    The purpose of this guide is to describe each of the series of records identified in the documents of the Y-12 Mercury Task Force Files that pertain to the use of mercury in the separation and enrichment of lithium isotopes at the Department of Energy's (DOE) Y-12 Plant in Oak Ridge, Tennessee. History Associates Incorporated (HAI) prepared this guide as part of DOE's Epidemiologic Records Inventory Project, which seeks to verify and conduct inventories of epidemiologic and health-related records at various DOE and DOE contractor sites. This introduction briefly describes the Epidemiologic Records Inventory Project and HAI's role in the project. Specific attention will be given to the history of the DOE-Oak Ridge Reservation, the development of the Y-12 Plant, and the use of mercury in the production of nuclear weapons during the 1950s and early 1960s. This introduction provides background information on the Y-12 Mercury Task Force Files, an assembly of documents resulting from the 1983 investigation of the Mercury Task Force into the effects of mercury toxicity upon workplace hygiene and worker health, the unaccountable loss of mercury, and the impact of those losses upon the environment. This introduction also explains the methodology used in the selection and inventory of these record series. Other topics include the methodology used to produce this guide, the arrangement of the detailed record series descriptions, and information concerning access to the collection.

  1. Analysis of international and European policy instruments: pollution swapping. Task 2 Service contract "Integrated measures in agriculture to reduce ammonia emissions"

    NARCIS (Netherlands)

    Oenema, O.; Velthof, G.L.

    2007-01-01

    This Report describes the results of Task 2 ‘Analysis of International and European policy instruments’. The aim of this task is to analyze the existing International and European policy instruments aiming at reducing emissions of ammonia, nitrous oxide and methane to the atmosphere and nitrate to

  2. HYDRA-II: A hydrothermal analysis computer code: Volume 1, Equations and numerics

    International Nuclear Information System (INIS)

    McCann, R.A.

    1987-04-01

    HYDRA-II is a hydrothermal computer code capable of three-dimensional analysis of coupled conduction, convection, and thermal radiation problems. This code is especially appropriate for simulating the steady-state performance of spent fuel storage systems. The code has been evaluated for this application for the US Department of Energy's Commercial Spent Fuel Management Program. HYDRA-II provides a finite difference solution in Cartesian coordinates to the equations governing the conservation of mass, momentum, and energy. A cylindrical coordinate system may also be used to enclose the Cartesian coordinate system. This exterior coordinate system is useful for modeling cylindrical cask bodies. The difference equations for conservation of momentum are enhanced by the incorporation of directional porosities and permeabilities that aid in modeling solid structures whose dimensions may be smaller than the computational mesh. The equation for conservation of energy permits modeling of orthotropic physical properties and film resistances. Several automated procedures are available to model radiation transfer within enclosures and from fuel rod to fuel rod. The documentation of HYDRA-II is presented in three separate volumes. This volume, Volume I - Equations and Numerics, describes the basic differential equations, illustrates how the difference equations are formulated, and gives the solution procedures employed. Volume II - User's Manual contains code flow charts, discusses the code structure, provides detailed instructions for preparing an input file, and illustrates the operation of the code by means of a model problem. The final volume, Volume III - Verification/Validation Assessments, presents results of numerical simulations of single- and multiassembly storage systems and comparisons with experimental data. 4 refs
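The conservation equations HYDRA-II discretizes reduce, in the simplest case of steady conduction with uniform conductivity, to a discrete Laplace equation on the mesh. A much-simplified 1-D finite-difference relaxation sketch (not HYDRA-II's actual scheme, coordinates, or boundary handling) shows the basic idea:

```python
def solve_1d_conduction(t_left, t_right, n_interior, iters=5000):
    """Steady 1-D heat conduction with fixed end temperatures: each interior node
    relaxes to the average of its neighbors (Jacobi iteration on the discrete
    Laplace equation)."""
    t = [t_left] + [0.0] * n_interior + [t_right]
    for _ in range(iters):
        # Build the new field from the old one (Jacobi, not Gauss-Seidel)
        t = [t[0]] + [(t[i - 1] + t[i + 1]) / 2 for i in range(1, len(t) - 1)] + [t[-1]]
    return t

profile = solve_1d_conduction(100.0, 0.0, 9)  # converges to a linear profile
```

The production code solves the coupled mass, momentum, and energy equations in 3-D with porosities, permeabilities, and radiation exchange; this sketch only illustrates the finite-difference relaxation at the core of such solvers.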

  3. HYDRA-II: A hydrothermal analysis computer code: Volume 3, Verification/validation assessments

    Energy Technology Data Exchange (ETDEWEB)

    McCann, R.A.; Lowery, P.S.

    1987-10-01

    HYDRA-II is a hydrothermal computer code capable of three-dimensional analysis of coupled conduction, convection, and thermal radiation problems. This code is especially appropriate for simulating the steady-state performance of spent fuel storage systems. The code has been evaluated for this application for the US Department of Energy's Commercial Spent Fuel Management Program. HYDRA-II provides a finite difference solution in cartesian coordinates to the equations governing the conservation of mass, momentum, and energy. A cylindrical coordinate system may also be used to enclose the cartesian coordinate system. This exterior coordinate system is useful for modeling cylindrical cask bodies. The difference equations for conservation of momentum are enhanced by the incorporation of directional porosities and permeabilities that aid in modeling solid structures whose dimensions may be smaller than the computational mesh. The equation for conservation of energy permits modeling of orthotropic physical properties and film resistances. Several automated procedures are available to model radiation transfer within enclosures and from fuel rod to fuel rod. The documentation of HYDRA-II is presented in three separate volumes. Volume I - Equations and Numerics describes the basic differential equations, illustrates how the difference equations are formulated, and gives the solution procedures employed. Volume II - User's Manual contains code flow charts, discusses the code structure, provides detailed instructions for preparing an input file, and illustrates the operation of the code by means of a model problem. This volume, Volume III - Verification/Validation Assessments, provides a comparison between the analytical solution and the numerical simulation for problems with a known solution. This volume also documents comparisons between the results of simulations of single- and multiassembly storage systems and actual experimental data. 11 refs., 55 figs., 13 tabs.

  4. HYDRA-II: A hydrothermal analysis computer code: Volume 2, User's manual

    Energy Technology Data Exchange (ETDEWEB)

    McCann, R.A.; Lowery, P.S.; Lessor, D.L.

    1987-09-01

    HYDRA-II is a hydrothermal computer code capable of three-dimensional analysis of coupled conduction, convection, and thermal radiation problems. This code is especially appropriate for simulating the steady-state performance of spent fuel storage systems. The code has been evaluated for this application for the US Department of Energy's Commercial Spent Fuel Management Program. HYDRA-II provides a finite-difference solution in cartesian coordinates to the equations governing the conservation of mass, momentum, and energy. A cylindrical coordinate system may also be used to enclose the cartesian coordinate system. This exterior coordinate system is useful for modeling cylindrical cask bodies. The difference equations for conservation of momentum incorporate directional porosities and permeabilities that are available to model solid structures whose dimensions may be smaller than the computational mesh. The equation for conservation of energy permits modeling of orthotropic physical properties and film resistances. Several automated methods are available to model radiation transfer within enclosures and from fuel rod to fuel rod. The documentation of HYDRA-II is presented in three separate volumes. Volume 1 - Equations and Numerics describes the basic differential equations, illustrates how the difference equations are formulated, and gives the solution procedures employed. This volume, Volume 2 - User's Manual, contains code flow charts, discusses the code structure, provides detailed instructions for preparing an input file, and illustrates the operation of the code by means of a sample problem. The final volume, Volume 3 - Verification/Validation Assessments, provides a comparison between the analytical solution and the numerical simulation for problems with a known solution. 6 refs.

  6. Predicting Nonauditory Adverse Radiation Effects Following Radiosurgery for Vestibular Schwannoma: A Volume and Dosimetric Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Hayhurst, Caroline; Monsalves, Eric; Bernstein, Mark; Gentili, Fred [Gamma Knife Unit, Division of Neurosurgery, University Health Network, Toronto (Canada); Heydarian, Mostafa; Tsao, May [Radiation Medicine Program, Princess Margaret Hospital, Toronto (Canada); Schwartz, Michael [Radiation Oncology Program and Division of Neurosurgery, Sunnybrook Hospital, Toronto (Canada); Prooijen, Monique van [Radiation Medicine Program, Princess Margaret Hospital, Toronto (Canada); Millar, Barbara-Ann; Menard, Cynthia [Radiation Oncology Program, Princess Margaret Hospital, Toronto (Canada); Kulkarni, Abhaya V. [Division of Neurosurgery, Hospital for Sick Children, University of Toronto (Canada); Laperriere, Norm [Radiation Oncology Program, Princess Margaret Hospital, Toronto (Canada); Zadeh, Gelareh, E-mail: Gelareh.Zadeh@uhn.on.ca [Gamma Knife Unit, Division of Neurosurgery, University Health Network, Toronto (Canada)

    2012-04-01

Purpose: To define clinical and dosimetric predictors of nonauditory adverse radiation effects after radiosurgery for vestibular schwannoma treated with a 12 Gy prescription dose. Methods: We retrospectively reviewed our experience of vestibular schwannoma patients treated between September 2005 and December 2009. Two hundred patients were treated at a 12 Gy prescription dose; 80 had complete clinical and radiological follow-up for at least 24 months (median, 28.5 months). All treatment plans were reviewed for target volume and dosimetry characteristics: gradient index; homogeneity index, defined as the maximum dose in the treatment volume divided by the prescription dose; conformity index; brainstem dose; and trigeminal nerve dose. All adverse radiation effects (ARE) were recorded. Because the intent of our study was to focus on the nonauditory adverse effects, hearing outcome was not evaluated in this study. Results: Twenty-seven (33.8%) patients developed ARE: 5 (6%) developed hydrocephalus, 10 (12.5%) reported new ataxia, 17 (21%) developed trigeminal dysfunction, 3 (3.75%) had facial weakness, and 1 patient developed hemifacial spasm. The development of edema within the pons was significantly associated with ARE (p = 0.001). On multivariate analysis, only target volume was a significant predictor of ARE (p = 0.001). There is a target volume threshold of 5 cm³, above which ARE are more likely. The treatment plan dosimetric characteristics were not associated with ARE, although the maximum dose to the 5th nerve was a significant predictor of trigeminal dysfunction, with a threshold of 9 Gy. The overall 2-year tumor control rate was 96%. Conclusions: Target volume is the most important predictor of adverse radiation effects, and we identified the significant treatment volume threshold to be 5 cm³. We also established through our series that the maximum tolerable dose to the 5th nerve is 9 Gy.
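The homogeneity index as defined in this abstract (maximum dose in the treatment volume divided by the prescription dose) is straightforward to compute from voxel doses. A minimal sketch, with made-up dose values purely for illustration:

```python
import numpy as np

def homogeneity_index(target_doses, prescription_dose):
    """Homogeneity index as defined in the abstract: maximum dose inside
    the treatment volume divided by the prescription dose."""
    return float(np.max(target_doses)) / prescription_dose

# hypothetical voxel doses (Gy) inside the target, 12 Gy prescription
doses = np.array([12.1, 13.5, 14.2, 12.8, 15.0])
hi_index = homogeneity_index(doses, 12.0)   # 15.0 / 12.0 = 1.25
```

Note that other planning systems define homogeneity differently (e.g. ratios of near-maximum to near-minimum dose); the definition above is the one this study states.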

  7. Development and application of a new tool in driver task analysis; Entwicklung und Anwendung einer neuen Methodik zur Fahreraufgabenanalyse

    Energy Technology Data Exchange (ETDEWEB)

    Fastenmeier, W.; Gstalter, H. [mensch-verkehr-umwelt Institut fuer Angewandte Psychologie, Muenchen (Germany)

    2003-07-01

A framework for driver task analysis is derived from a model of the driver's information processing. The first step of the procedure (SAFE) is to divide a given driving task (e.g., turning left at a signalized intersection of a certain type) into subtasks. These subtasks are assigned to defined stretches of road, and their time structure is determined. For each subtask an analysis format is used that organizes the requirements into perception, expectation, judgement, memory, decision, and driver action. In a next step, typical driver errors are assigned to the subtasks. The information gathered in the analysis procedure is then condensed into ratings of the complexity and risk of each subtask. Once the crucial subtasks of a driving task have been determined, a list of the requirements that led to those ratings can be compiled. These lists indicate the potential for aiding the driver, e.g., by driver assistance systems. (orig.) [Translated from the German original] In traffic science, no traffic-specific standard procedures have so far existed that permit a requirements analysis at the level of driving tasks. This contribution presents the outline of a procedure for driver task analysis (SAFE: Situative Anforderungsanalyse von Fahraufgaben, i.e., situational requirements analysis of driving tasks). It is based on a model of the information-processing operations that play a role while the driving task is being carried out. The first step of the task analysis is to decompose a selected driving task (e.g., turning left at a signal-controlled intersection of a certain type) into subtasks and to structure these subtasks in time. The subtasks are assigned to spatial segments that must be traversed in sequence.
For each subtask, the actual requirements analysis is then performed, organized into the chapters perception, expectation formation, judgement, memory processes, decision/planning, and vehicle operation, each with

  8. Event-related fast optical signal in a rapid object recognition task: improving detection by the independent component analysis.

    Science.gov (United States)

    Medvedev, Andrei V; Kainerstorfer, Jana; Borisov, Sergey V; Barbour, Randall L; VanMeter, John

    2008-10-21

Noninvasive recording of fast optical signals presumably reflecting neuronal activity is a challenging task because of a relatively low signal-to-noise ratio. To improve detection of those signals in rapid object recognition tasks, we used independent component analysis (ICA) to reduce "global interference" (heartbeat and the contribution of superficial layers). We recorded optical signals from the left prefrontal cortex in 10 right-handed participants with a continuous-wave instrument (DYNOT, NIRx, Brooklyn, NY). Visual stimuli were pictures of urban, landscape and seashore scenes with various vehicles as targets (target-to-non-target ratio 1:6) presented at ISI = 166 ms or 250 ms. Subjects mentally counted targets. Data were filtered at 2-30 Hz, and artifactual components were identified visually (for heartbeat) and using the ICA weight matrix (for superficial layers). Optical signals were restored from the ICA components with the artifactual components removed and then averaged over target and non-target epochs. After ICA processing, the event-related response was detected in 70%-100% of subjects. The refined signal showed a significant decrease from baseline within 200-300 ms after targets and a slight increase after non-targets. The temporal profile of the optical signal corresponded well to the profile of a "differential ERP response", the difference between targets and non-targets, which peaks at 200 ms in similar object detection tasks. These results demonstrate that the detection of fast optical responses with continuous-wave instruments can be improved through the ICA method, which is capable of removing noise, global interference, and the activity of superficial layers. Fast optical signals may provide further information on brain processing during higher-order cognitive tasks such as rapid categorization of objects.
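The remove-and-reconstruct step described above (unmix the channels into components, zero the artifactual ones, remix) can be sketched with plain NumPy. Here the mixing matrix is assumed known, whereas in the study it was estimated by ICA; the sources, frequencies, and channel count below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 500)

# two "neural" sources and one artifactual source (e.g. heartbeat)
neural1 = np.sin(2 * np.pi * 5 * t)
neural2 = np.sign(np.sin(2 * np.pi * 3 * t))
heartbeat = np.sin(2 * np.pi * 1.2 * t)
S = np.stack([neural1, neural2, heartbeat], axis=1)   # (samples, sources)

A = rng.normal(size=(3, 3))   # mixing matrix (estimated by ICA in practice)
X = S @ A.T                   # recorded optical channels

# unmix, zero the artifactual component, remix the cleaned channels
S_hat = X @ np.linalg.inv(A).T
S_hat[:, 2] = 0.0
X_clean = S_hat @ A.T
```

With a perfectly known mixing matrix the cleaned channels are exactly the mixture of the two neural sources; with ICA-estimated components the separation is only approximate.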

  9. Does repetitive task training improve functional activity after stroke? A Cochrane systematic review and meta-analysis.

    Science.gov (United States)

    French, Beverley; Thomas, Lois; Leathley, Michael; Sutton, Christopher; McAdam, Joanna; Forster, Anne; Langhorne, Peter; Price, Christopher; Walker, Andrew; Watkins, Caroline

    2010-01-01

    To determine if repetitive task training after stroke improves functional activity. Systematic review and meta-analysis of trials comparing repetitive task training with attention control or usual care. The Cochrane Stroke Trials Register, electronic databases of published, unpublished and non-English language papers; conference proceedings, reference lists, and trial authors. Included studies were randomized/quasi-randomized trials in adults after stroke where an active motor sequence aiming to improve functional activity was performed repetitively within a single training session. We used Cochrane Collaboration methods, resources, and software. We included 14 trials with 17 intervention-control pairs and 659 participants. Results were statistically significant for walking distance (mean difference 54.6, 95% confidence interval (95% CI) 17.5, 91.7); walking speed (standardized mean difference (SMD) 0.29, 95% CI 0.04, 0.53); sit-to-stand (standard effect estimate 0.35, 95% CI 0.13, 0.56), and activities of daily living: SMD 0.29, 95% CI 0.07, 0.51; and of borderline statistical significance for measures of walking ability (SMD 0.25, 95% CI 0.00, 0.51), and global motor function (SMD 0.32, 95% CI -0.01, 0.66). There were no statistically significant differences for hand/arm functional activity, lower limb functional activity scales, or sitting/standing balance/reach. Repetitive task training resulted in modest improvement across a range of lower limb outcome measures, but not upper limb outcome measures. Training may be sufficient to have a small impact on activities of daily living. Interventions involving elements of repetition and task training are diverse and difficult to classify: the results presented are specific to trials where both elements are clearly present in the intervention, without major confounding by other potential mechanisms of action.
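The pooled mean differences and confidence intervals above come from the Cochrane Collaboration's meta-analytic tooling; the core inverse-variance fixed-effect pooling step can be sketched as follows (the per-trial values below are hypothetical, not the review's data):

```python
import math

def pool_fixed_effect(estimates, std_errors):
    """Inverse-variance (fixed-effect) pooling of per-trial effect
    estimates, returning the pooled estimate and its 95% CI."""
    weights = [1.0 / se ** 2 for se in std_errors]
    pooled = sum(w * y for w, y in zip(weights, estimates)) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    return pooled, (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)

# hypothetical walking-distance mean differences (metres) from three trials
est, ci = pool_fixed_effect([60.0, 45.0, 58.0], [20.0, 15.0, 25.0])
```

Random-effects pooling, as used for heterogeneous outcomes, additionally inflates each trial's variance by a between-trial component before weighting.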

  10. Accuracy Maximization Analysis for Sensory-Perceptual Tasks: Computational Improvements, Filter Robustness, and Coding Advantages for Scaled Additive Noise.

    Directory of Open Access Journals (Sweden)

    Johannes Burge

    2017-02-01

Full Text Available Accuracy Maximization Analysis (AMA) is a recently developed Bayesian ideal observer method for task-specific dimensionality reduction. Given a training set of proximal stimuli (e.g. retinal images), a response noise model, and a cost function, AMA returns the filters (i.e. receptive fields) that extract the most useful stimulus features for estimating a user-specified latent variable from those stimuli. Here, we first contribute two technical advances that significantly reduce AMA's compute time: we derive gradients of cost functions for which two popular estimators are appropriate, and we implement a stochastic gradient descent (AMA-SGD) routine for filter learning. Next, we show how the method can be used to simultaneously probe the impact on neural encoding of natural stimulus variability, the prior over the latent variable, noise power, and the choice of cost function. Then, we examine the geometry of AMA's unique combination of properties that distinguish it from better-known statistical methods. Using binocular disparity estimation as a concrete test case, we develop insights that have general implications for understanding neural encoding and decoding in a broad class of fundamental sensory-perceptual tasks connected to the energy model. Specifically, we find that non-orthogonal (partially redundant) filters with scaled additive noise tend to outperform orthogonal filters with constant additive noise; non-orthogonal filters and scaled additive noise can interact to sculpt noise-induced stimulus encoding uncertainty to match task-irrelevant stimulus variability. Thus, we show that some properties of neural response thought to be biophysical nuisances can confer coding advantages to neural systems. Finally, we speculate that, if repurposed for the problem of neural systems identification, AMA may be able to overcome a fundamental limitation of standard subunit model estimation. As natural stimuli become more widely used in the study of

  11. Accuracy Maximization Analysis for Sensory-Perceptual Tasks: Computational Improvements, Filter Robustness, and Coding Advantages for Scaled Additive Noise.

    Science.gov (United States)

    Burge, Johannes; Jaini, Priyank

    2017-02-01

    Accuracy Maximization Analysis (AMA) is a recently developed Bayesian ideal observer method for task-specific dimensionality reduction. Given a training set of proximal stimuli (e.g. retinal images), a response noise model, and a cost function, AMA returns the filters (i.e. receptive fields) that extract the most useful stimulus features for estimating a user-specified latent variable from those stimuli. Here, we first contribute two technical advances that significantly reduce AMA's compute time: we derive gradients of cost functions for which two popular estimators are appropriate, and we implement a stochastic gradient descent (AMA-SGD) routine for filter learning. Next, we show how the method can be used to simultaneously probe the impact on neural encoding of natural stimulus variability, the prior over the latent variable, noise power, and the choice of cost function. Then, we examine the geometry of AMA's unique combination of properties that distinguish it from better-known statistical methods. Using binocular disparity estimation as a concrete test case, we develop insights that have general implications for understanding neural encoding and decoding in a broad class of fundamental sensory-perceptual tasks connected to the energy model. Specifically, we find that non-orthogonal (partially redundant) filters with scaled additive noise tend to outperform orthogonal filters with constant additive noise; non-orthogonal filters and scaled additive noise can interact to sculpt noise-induced stimulus encoding uncertainty to match task-irrelevant stimulus variability. Thus, we show that some properties of neural response thought to be biophysical nuisances can confer coding advantages to neural systems. Finally, we speculate that, if repurposed for the problem of neural systems identification, AMA may be able to overcome a fundamental limitation of standard subunit model estimation. As natural stimuli become more widely used in the study of psychophysical and

  12. [Analysis of the eye movement patterns in visual search tasks: effect of familiarity and stimulus features].

    Science.gov (United States)

    Macedo, Elizeu Coutinho de; Covre, Priscila; Orsati, Fernanda Tebexreni; Oliveira, Maira Okada de; Schwartzman, José Salomão

    2007-01-01

To analyze eye movements in asymmetric visual search using normal and mirror-reversed letters, and to evaluate the effect of familiarity and stimulus features. Eighty-three university students with normal or corrected-to-normal vision were asked to search for a target letter whose orientation was the reverse of the surrounding letters, in groups of either normal or mirrored letters. Four types of letters were used (Z, N, E and G), and the eye movements were tracked by a specialized computer-based eye-tracking system. The analyzed measurements were: reaction time, fixation number and duration, and saccade distance and duration. All measures varied with the type of letter. Reaction time, fixation number, and saccade distance were higher when the task was to find the normal letter in a group of mirrored letters. In this condition, fixation duration was smaller. Interaction was found between familiarity and the type of letter for reaction time, fixation number, and fixation duration. Reaction time and fixation number increased with stimulus complexity, with a greater increase for the normal letter target. Fixation duration, however, decreased with the complexity of the stimuli and the search condition. Finding a mirrored letter among normal letters proved to be easier than the contrary. The letter type also affected performance. When the context is formed of unfamiliar complex stimuli, fixation duration is shorter, indicating a narrower span for visual processing. Therefore, a greater number of fixations with shorter durations are needed for the unfamiliar context, while fewer fixations with longer durations are needed for the familiar context.

  13. Second generation pressurized fluidized-bed combustion (PFBC) research and development, Phase 2 --- Task 4, carbonizer testing. Volume 2, Data reconciliation

    Energy Technology Data Exchange (ETDEWEB)

    Froehlich, R.; Robertson, A.; Vanhook, J.; Goyal, A.; Rehmat, A.; Newby, R.

    1994-11-01

During the period beginning November 1991 and ending September 1992, a series of tests was conducted at Foster Wheeler Development Corporation in a fluidized-bed coal carbonizer to determine its performance characteristics. The carbonizer was operated for 533 hours in a jetting fluidized-bed configuration, during which 36 set points (steady-state periods) were achieved. Extensive data were collected on the feed and product stream compositions, heating values, temperatures, and flow rates. With these data, elemental and energy balances were computed to evaluate and confirm the accuracy of the data. The carbonizer data were not as self-consistent as could be desired (imperfect balance closure). A software package developed by Science Ventures, Inc., of California, called BALAID, was used to reconcile the carbonizer data. The reconciled data for the carbonizer were rigorously analyzed, correlations were developed, and the model was updated accordingly. The model was then used to simulate each of the 36 steady-state periods achieved in the pilot plant; the details of that analysis are given in Volume 1 of this report. This Volume 2 provides details of the carbonizer data reconciliation.

  14. Effect of varicocelectomy on testis volume and semen parameters in adolescents: a meta-analysis

    Directory of Open Access Journals (Sweden)

    Tie Zhou

    2015-01-01

Full Text Available Varicocele repair in adolescents remains controversial. Our aim was to identify and combine the clinical trial results published thus far to ascertain the efficacy of varicocelectomy in improving testis volume and semen parameters compared with nontreatment control. A literature search was performed using Medline, Embase and Web of Science, which included results obtained from meta-analyses, randomized and nonrandomized controlled studies. The study population was adolescents with clinically palpable varicocele with or without testicular asymmetry or abnormal semen parameters. Cases were allocated to treatment and observation groups, and testis volume or semen parameters were adopted as outcome measures. As a result, seven randomized controlled trials (RCTs) and nonrandomized controlled trials studying bilateral testis volume or semen parameters in both treatment and observation groups were identified. Using a random effect model, the mean difference in testis volume between the treatment group and the observation group was 2.9 ml (95% confidence interval [CI]: 0.6, 5.2; P < 0.05) for the varicocele side and 1.5 ml (95% CI: 0.3, 2.7; P < 0.05) for the healthy side. The random effect model analysis demonstrated that the mean difference in semen concentration, total semen motility, and normal morphology between the two groups was 13.7 × 10⁶ ml⁻¹ (95% CI: −1.4, 28.8; P = 0.075), 2.5% (95% CI: −3.6, 8.6; P = 0.424), and 2.9% (95% CI: −3.0, 8.7; P = 0.336), respectively. In conclusion, although varicocelectomy significantly improved bilateral testis volume in adolescents with varicocele compared with observation, semen parameters did not differ to a statistically significant degree between the two groups. Well-planned, properly conducted RCTs are needed to confirm the above conclusion and to explore whether varicocele repair in adolescents could subsequently improve spontaneous pregnancy rates.

  15. Automated mediastinal lymph node detection from CT volumes based on intensity targeted radial structure tensor analysis.

    Science.gov (United States)

    Oda, Hirohisa; Bhatia, Kanwal K; Oda, Masahiro; Kitasaka, Takayuki; Iwano, Shingo; Homma, Hirotoshi; Takabatake, Hirotsugu; Mori, Masaki; Natori, Hiroshi; Schnabel, Julia A; Mori, Kensaku

    2017-10-01

This paper presents a local intensity structure analysis based on an intensity targeted radial structure tensor (ITRST) and a blob-like structure enhancement filter built on it (the ITRST filter) for mediastinal lymph node detection in chest computed tomography (CT) volumes. Although a filter based on conventional radial structure tensor (RST) analysis can be utilized to detect lymph nodes, some lymph nodes adjacent to regions with extremely high or low intensities cannot be detected. Therefore, we propose the ITRST filter, which integrates prior knowledge of the detection target's intensity range into the RST filter. Our lymph node detection algorithm consists of two steps: (1) obtaining candidate regions using the ITRST filter and (2) removing false positives (FPs) using a support vector machine classifier. We evaluated the lymph node detection performance of the ITRST filter on 47 contrast-enhanced chest CT volumes and compared it with the RST and Hessian filters. The detection rate of the ITRST filter was 84.2% with 9.1 FPs/volume for lymph nodes whose short axis was at least 10 mm, outperforming the RST and Hessian filters.

  16. Uncertainty modelling and analysis of volume calculations based on a regular grid digital elevation model (DEM)

    Science.gov (United States)

    Li, Chang; Wang, Qing; Shi, Wenzhong; Zhao, Sisi

    2018-05-01

    The accuracy of earthwork calculations that compute terrain volume is critical to digital terrain analysis (DTA). The uncertainties in volume calculations (VCs) based on a DEM are primarily related to three factors: 1) model error (ME), which is caused by an adopted algorithm for a VC model, 2) discrete error (DE), which is usually caused by DEM resolution and terrain complexity, and 3) propagation error (PE), which is caused by the variables' error. Based on these factors, the uncertainty modelling and analysis of VCs based on a regular grid DEM are investigated in this paper. Especially, how to quantify the uncertainty of VCs is proposed by a confidence interval based on truncation error (TE). In the experiments, the trapezoidal double rule (TDR) and Simpson's double rule (SDR) were used to calculate volume, where the TE is the major ME, and six simulated regular grid DEMs with different terrain complexity and resolution (i.e. DE) were generated by a Gauss synthetic surface to easily obtain the theoretical true value and eliminate the interference of data errors. For PE, Monte-Carlo simulation techniques and spatial autocorrelation were used to represent DEM uncertainty. This study can enrich uncertainty modelling and analysis-related theories of geographic information science.
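As a sketch of the volume-calculation (VC) step itself, the trapezoidal double rule (TDR) mentioned above weights corner, edge, and interior grid posts by 1/4, 1/2, and 1 respectively, times the cell area. The grid sizes and heights below are invented for illustration:

```python
import numpy as np

def volume_trapezoidal(dem, dx, dy):
    """Volume under a regular-grid DEM via the trapezoidal double rule.

    The 2-D composite trapezoidal weights are separable: 0.5 at the first
    and last row/column, 1.0 elsewhere, so corners get 0.25, edges 0.5,
    and interior posts 1.0, all scaled by the cell area dx * dy.
    """
    w_x = np.ones(dem.shape[1]); w_x[0] = w_x[-1] = 0.5
    w_y = np.ones(dem.shape[0]); w_y[0] = w_y[-1] = 0.5
    return dx * dy * float(w_y @ dem @ w_x)

# flat surface of height 2 over a 10 x 10 grid with unit spacing:
# the domain is 9 x 9 cells, so the true volume is 9 * 9 * 2 = 162
v = volume_trapezoidal(np.full((10, 10), 2.0), 1.0, 1.0)
```

Simpson's double rule (SDR) replaces these weights with the 1-4-2-4-1 pattern in each direction, trading a finer smoothness requirement for a smaller truncation error on smooth terrain.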

  17. Big Cat Coalitions: A Comparative Analysis of Regional Brain Volumes in Felidae

    Science.gov (United States)

    Sakai, Sharleen T.; Arsznov, Bradley M.; Hristova, Ani E.; Yoon, Elise J.; Lundrigan, Barbara L.

    2016-01-01

    Broad-based species comparisons across mammalian orders suggest a number of factors that might influence the evolution of large brains. However, the relationship between these factors and total and regional brain size remains unclear. This study investigated the relationship between relative brain size and regional brain volumes and sociality in 13 felid species in hopes of revealing relationships that are not detected in more inclusive comparative studies. In addition, a more detailed analysis was conducted of four focal species: lions (Panthera leo), leopards (Panthera pardus), cougars (Puma concolor), and cheetahs (Acinonyx jubatus). These species differ markedly in sociality and behavioral flexibility, factors hypothesized to contribute to increased relative brain size and/or frontal cortex size. Lions are the only truly social species, living in prides. Although cheetahs are largely solitary, males often form small groups. Both leopards and cougars are solitary. Of the four species, leopards exhibit the most behavioral flexibility, readily adapting to changing circumstances. Regional brain volumes were analyzed using computed tomography. Skulls (n = 75) were scanned to create three-dimensional virtual endocasts, and regional brain volumes were measured using either sulcal or bony landmarks obtained from the endocasts or skulls. Phylogenetic least squares regression analyses found that sociality does not correspond with larger relative brain size in these species. However, the sociality/solitary variable significantly predicted anterior cerebrum (AC) volume, a region that includes frontal cortex. This latter finding is despite the fact that the two social species in our sample, lions and cheetahs, possess the largest and smallest relative AC volumes, respectively. Additionally, an ANOVA comparing regional brain volumes in four focal species revealed that lions and leopards, while not significantly different from one another, have relatively larger AC volumes

  18. Big Cat Coalitions: A comparative analysis of regional brain volumes in Felidae

    Directory of Open Access Journals (Sweden)

    Sharleen T Sakai

    2016-10-01

Full Text Available Broad-based species comparisons across mammalian orders suggest a number of factors that might influence the evolution of large brains. However, the relationship between these factors and total and regional brain size remains unclear. This study investigated the relationship between relative brain size and regional brain volumes and sociality in 13 felid species in hopes of revealing relationships that are not detected in more inclusive comparative studies. In addition, a more detailed analysis was conducted of 4 focal species: lions (Panthera leo), leopards (Panthera pardus), cougars (Puma concolor), and cheetahs (Acinonyx jubatus). These species differ markedly in sociality and behavioral flexibility, factors hypothesized to contribute to increased relative brain size and/or frontal cortex size. Lions are the only truly social species, living in prides. Although cheetahs are largely solitary, males often form small groups. Both leopards and cougars are solitary. Of the four species, leopards exhibit the most behavioral flexibility, readily adapting to changing circumstances. Regional brain volumes were analyzed using computed tomography (CT). Skulls (n = 75) were scanned to create three-dimensional virtual endocasts, and regional brain volumes were measured using either sulcal or bony landmarks obtained from the endocasts or skulls. Phylogenetic least squares (PGLS) regression analyses found that sociality does not correspond with larger relative brain size in these species. However, the sociality/solitary variable significantly predicted anterior cerebrum (AC) volume, a region that includes frontal cortex. This latter finding is despite the fact that the two social species in our sample, lions and cheetahs, possess the largest and smallest relative AC volumes, respectively. Additionally, an ANOVA comparing regional brain volumes in 4 focal species revealed that lions and leopards, while not significantly different from one another, have relatively

  19. Sealed source and device design safety testing. Volume 4: Technical report on the findings of Task 4, Investigation of sealed source for paper mill digester

    International Nuclear Information System (INIS)

    Benac, D.J.; Iddings, F.A.

    1995-10-01

This report covers the Task 4 activities for the Sealed Source and Device Safety testing program. SwRI was contracted to investigate a suspected leaking radioactive source that was installed in a gauge on a paper mill digester. The actual source that was leaking was not available; therefore, SwRI examined another source. SwRI concluded that the encapsulated source it examined was not leaking. However, the presence of Cs-137 on the interior and exterior of the outer encapsulation and handling tube suggests that contamination probably occurred when the source was first manufactured and then installed in the handling tube

  20. Fabrication, testing, and analysis of anisotropic carbon/glass hybrid composites: volume 1: technical report.

    Energy Technology Data Exchange (ETDEWEB)

    Wetzel, Kyle K. (Wetzel Engineering, Inc. Lawrence, Kansas); Hermann, Thomas M. (Wichita state University, Wichita, Kansas); Locke, James (Wichita state University, Wichita, Kansas)

    2005-11-01

…° from the long axis for approximately two-thirds of the laminate volume (discounting skin layers), with reinforcing carbon fibers oriented axially comprising the remaining one-third of the volume. Finite element analysis of each laminate has been performed to examine first ply failure. Three failure criteria--maximum stress, maximum strain, and Tsai-Wu--have been compared. Failure predicted by all three criteria proves generally conservative, with the stress-based criteria the most conservative. For laminates that respond nonlinearly to loading, large error is observed in the prediction of failure using maximum strain as the criterion. This report documents the methods and results in two volumes. Volume 1 contains descriptions of the laminates, their fabrication and testing, the methods of analysis, the results, and the conclusions and recommendations. Volume 2 contains a comprehensive summary of the individual test results for all laminates.

  1. Performance Prediction Modelling for Flexible Pavement on Low Volume Roads Using Multiple Linear Regression Analysis

    Directory of Open Access Journals (Sweden)

    C. Makendran

    2015-01-01

Full Text Available Prediction models for low volume village roads in India are developed to evaluate the progression of different types of distress such as roughness, cracking, and potholes. Even though the Government of India is investing a huge amount of money in road construction every year, poor control over the quality of road construction and its subsequent maintenance is leading to faster road deterioration. In this regard, it is essential that scientific maintenance procedures be evolved on the basis of the performance of low volume flexible pavements. Considering the above, an attempt has been made in this research endeavor to develop prediction models to understand the progression of roughness, cracking, and potholes in flexible pavements subjected to little or no routine maintenance. Distress data were collected from low volume rural roads covering about 173 stretches spread across Tamil Nadu state in India. Based on the collected data, distress prediction models have been developed using multiple linear regression analysis. Further, the models have been validated using independent field data. It can be concluded that the models developed in this study can serve as useful tools for practicing engineers maintaining flexible pavements on low volume roads.
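A distress-progression model of the kind described can be fitted by ordinary least squares. A minimal sketch with invented observations and predictor variables (the paper's actual predictors, coefficients, and field data are not reproduced here):

```python
import numpy as np

# hypothetical observations: pavement age (years), cumulative traffic
# (thousands of standard axles), and measured roughness (IRI, m/km)
age     = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
traffic = np.array([10.0, 25.0, 35.0, 50.0, 70.0, 85.0])
iri     = np.array([2.1, 2.6, 3.0, 3.4, 4.1, 4.6])

# design matrix with an intercept column, fitted by least squares
X = np.column_stack([np.ones_like(age), age, traffic])
coef, *_ = np.linalg.lstsq(X, iri, rcond=None)

def predict_iri(a, t):
    """Predict roughness for a given age and traffic level."""
    return coef[0] + coef[1] * a + coef[2] * t
```

Validation against independent field data, as the study does, guards against the overfitting that small samples and correlated predictors (age and traffic) invite.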

  2. Design and analysis of a dead volume control for a solar Stirling engine with induction generator

    International Nuclear Information System (INIS)

    Beltrán-Chacon, Ricardo; Leal-Chavez, Daniel; Sauceda, D.; Pellegrini-Cervantes, Manuel; Borunda, Mónica

    2015-01-01

In this work, a dish/Stirling power generation system with a cavity receiver and an electrical induction generator was simulated. We propose a control system using a variable dead volume and analyze its influence on the mechanical performance. A system with a dead volume of 160 cm³ was designed to control the power and speed of the engine, considering annual insolation, the mechanical properties of the heater, and the frequency and voltage limits for systems interconnected to the electricity network. The designed system achieves a net solar-to-electric conversion efficiency of 23.38% at an irradiance of 975 W/m² and allows an annual increase of 18% in useful electrical energy compared to a system without control. - Highlights: • Numerical simulation of a nitrogen-charged solar Stirling engine for electric power generation. • Design and analysis of a dead volume control for performance increase and power modulation. • Effect of dead space on average working pressure and mass flow rate. • Comparison between dead volume and average pressure control methods. • Impact of Stirling engine control settings on annual generated electric power.

  3. Man-machine analysis of translation and work tasks of Skylab films

    Science.gov (United States)

    Hosler, W. W.; Boelter, J. G.; Morrow, J. R., Jr.; Jackson, J. T.

    1979-01-01

    An objective approach to determine the concurrent validity of computer-graphic models is real time film analysis. This technique was illustrated through the procedures and results obtained in an evaluation of translation of Skylab mission astronauts. The quantitative analysis was facilitated by the use of an electronic film analyzer, minicomputer, and specifically supportive software. The uses of this technique for human factors research are: (1) validation of theoretical operator models; (2) biokinetic analysis; (3) objective data evaluation; (4) dynamic anthropometry; (5) empirical time-line analysis; and (6) consideration of human variability. Computer assisted techniques for interface design and evaluation have the potential for improving the capability for human factors engineering.

  4. Effects of elevated vacuum on in-socket residual limb fluid volume: Case study results using bioimpedance analysis

    OpenAIRE

    Sanders, JE; Harrison, DS; Myers, TR; Allyn, KJ

    2011-01-01

    Bioimpedance analysis was used to measure residual limb fluid volume on seven trans-tibial amputee subjects using elevated vacuum sockets and non-elevated vacuum sockets. Fluid volume changes were assessed during sessions with the subjects sitting, standing, and walking. In general, fluid volume losses during 3 or 5 min walks and losses over the course of the 30-min test session were less for elevated vacuum than for suction. A number of variables including the time of day data were collected...

  5. Leveraging Data Analysis for Domain Experts: An Embeddable Framework for Basic Data Science Tasks

    Science.gov (United States)

    Lohrer, Johannes-Y.; Kaltenthaler, Daniel; Kröger, Peer

    2016-01-01

    In this paper, we describe a framework for data analysis that can be embedded into a base application. Since it is important to analyze the data directly inside the application where the data is entered, a tool that allows scientists to easily work with their data supports and motivates further analysis of that data, which…

  6. Anger/frustration, task persistence, and conduct problems in childhood: a behavioral genetic analysis.

    Science.gov (United States)

    Deater-Deckard, Kirby; Petrill, Stephen A; Thompson, Lee A

    2007-01-01

    Individual differences in conduct problems arise in part from proneness to anger/frustration and poor self-regulation of behavior. However, the genetic and environmental etiology of these connections is not known. Using a twin design, we examined genetic and environmental covariation underlying the well-documented correlations between anger/frustration, poor attention regulation (i.e., task persistence), and conduct problems in childhood. Participants included 105 pairs of MZ twins and 154 pairs of same-sex DZ twins (4-8 year olds). Independent observers rated child persistence and affect based on behavior during a challenging in-home cognitive and literacy assessment. Teachers and parents provided reports of conduct problems. Persistence, anger/frustration, and conduct problems included moderate heritable and nonshared environmental variance; conduct problems included moderate shared environmental variance as well. Persistence and anger/frustration had independent genetic covariance with conduct problems and nonshared environmental covariance with each other. The findings indicate genetically distinct though inter-related influences linking affective and self-regulatory aspects of temperament with behavior problems in childhood.
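The ACE logic behind such twin analyses can be illustrated with Falconer's classical estimates from MZ and DZ twin correlations. The correlation values below are hypothetical, not taken from this study:

```python
# Illustrative Falconer estimates from twin correlations
# (correlation values are hypothetical, not the study's).
def falconer(r_mz, r_dz):
    a2 = 2 * (r_mz - r_dz)   # additive genetic variance (heritability)
    c2 = 2 * r_dz - r_mz     # shared environmental variance
    e2 = 1 - r_mz            # nonshared environment plus measurement error
    return a2, c2, e2

a2, c2, e2 = falconer(r_mz=0.62, r_dz=0.38)
print(round(a2, 2), round(c2, 2), round(e2, 2))  # → 0.48 0.14 0.38
```

Full twin studies like this one fit the ACE components by structural equation modeling rather than these closed-form contrasts, but the intuition (MZ pairs share all genes, DZ pairs half) is the same.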

  7. Anger/frustration, task persistence, and conduct problems in childhood: a behavioral genetic analysis

    Science.gov (United States)

    Deater-Deckard, Kirby; Petrill, Stephen A.; Thompson, Lee A.

    2009-01-01

    Background Individual differences in conduct problems arise in part from proneness to anger/frustration and poor self-regulation of behavior. However, the genetic and environmental etiology of these connections is not known. Method Using a twin design, we examined genetic and environmental covariation underlying the well-documented correlations between anger/frustration, poor attention regulation (i.e., task persistence), and conduct problems in childhood. Participants included 105 pairs of MZ twins and 154 pairs of same-sex DZ twins (4–8 year olds). Independent observers rated child persistence and affect based on behavior during a challenging in-home cognitive and literacy assessment. Teachers and parents provided reports of conduct problems. Results Persistence, anger/frustration, and conduct problems included moderate heritable and nonshared environmental variance; conduct problems included moderate shared environmental variance as well. Persistence and anger/frustration had independent genetic covariance with conduct problems and nonshared environmental covariance with each other. Conclusions The findings indicate genetically distinct though inter-related influences linking affective and self-regulatory aspects of temperament with behavior problems in childhood. PMID:17244273

  8. Procedures for handling and analysis of uranium hexafluoride. Volume 2. Analysis

    Energy Technology Data Exchange (ETDEWEB)

    None, None

    1972-04-01

    Volume 2 of this report contains analytical procedures which may be used by AEC customers and UF6 processors to determine conformance to the Federal Register specifications and other properties for UF6. The procedures presented are typical of those currently in use at the AEC-owned gaseous diffusion plants. Other procedures may be used provided analytical results of comparable precision and accuracy are obtained.

  9. ABOUT THE SYSTEM ANALYSIS OF UNEMPLOYMENT OF YOUTH: GENERAL TASKS AND PRIVATE MODELS OF MARKET INDICATORS

    Directory of Open Access Journals (Sweden)

    Natalia V. Kontsevaya

    2016-01-01

    Full Text Available In this work, an attempt is made at a systems approach to the analysis of the youth labor market: the place and role of the youth labor exchange are defined, opportunities and methods of state regulation are described, and contradictions in the analysis of the main market indicators are identified. Within the systems approach to the analysis of the dynamics of market processes, modeling of the main indicators of the labor market at the regional scale is shown. This approach can be useful when developing effective and economically sound mechanisms for youth employment, both at the level of regional employment services and at the national scale.

  10. Final safety analysis report for the Galileo Mission: Volume 1, Reference design document

    Energy Technology Data Exchange (ETDEWEB)

    1988-05-01

    The Galileo mission uses nuclear power sources called Radioisotope Thermoelectric Generators (RTGs) to provide the spacecraft's primary electrical power. Because these generators contain nuclear material, a Safety Analysis Report (SAR) is required. A preliminary SAR and an updated SAR were previously issued that provided an evolving status report on the safety analysis. As a result of the Challenger accident, the launch dates for both Galileo and Ulysses missions were later rescheduled for November 1989 and October 1990, respectively. The decision was made by agreement between the DOE and the NASA to have a revised safety evaluation and report (FSAR) prepared on the basis of these revised vehicle accidents and environments. The results of this latest revised safety evaluation are presented in this document (Galileo FSAR). Volume I, this document, provides the background design information required to understand the analyses presented in Volumes II and III. It contains descriptions of the RTGs, the Galileo spacecraft, the Space Shuttle, the Inertial Upper Stage (IUS), the trajectory and flight characteristics including flight contingency modes, and the launch site. There are two appendices in Volume I which provide detailed material properties for the RTG.

  11. Economic analysis of a volume reduction/polyethylene solidification system for low-level radioactive wastes

    International Nuclear Information System (INIS)

    Kalb, P.D.; Colombo, P.

    1985-01-01

    A study was conducted at Brookhaven National Laboratory to determine the economic feasibility of a fluidized bed volume reduction/polyethylene solidification system for low-level radioactive wastes. These results are compared with the ''null'' alternative of no volume reduction and solidification of aqueous waste streams in hydraulic cement. The economic analysis employed a levelized revenue requirement (LRR) technique conducted over a ten year period. An interactive computer program was written to conduct the LRR calculations. Both of the treatment/solidification options were considered for a number of scenarios including type of plant (BWR or PWR) and transportation distance to the disposal site. If current trends in the escalation rates of cost components continue, the volume reduction/polyethylene solidification option will be cost effective for both BWRs and PWRs. Data indicate that a minimum net annual savings of $0.8 million per year (for a PWR shipping its waste 750 miles) and a maximum net annual savings of $9 million per year (for a BWR shipping its waste 2500 miles) can be achieved. A sensitivity analysis was performed for the burial cost escalation rate, which indicated that variation of this factor will impact the total levelized revenue requirement. The burial cost escalation rate which yields a break-even condition was determined for each scenario considered. 11 refs., 8 figs., 39 tabs
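A levelized revenue requirement comparison of this kind reduces, in sketch form, to discounting an escalating annual cost stream and annuitizing the result. The cost figures, escalation rates, and discount rate below are all hypothetical, not the Brookhaven study's inputs:

```python
def levelized_cost(first_year_cost, escalation, discount, years=10):
    """Present value of an escalating annual cost stream, converted to a
    levelized (constant) annual amount. All inputs are illustrative."""
    pv = sum(first_year_cost * (1 + escalation) ** t / (1 + discount) ** (t + 1)
             for t in range(years))
    # Capital recovery factor turns the PV back into a level annuity
    crf = discount * (1 + discount) ** years / ((1 + discount) ** years - 1)
    return pv * crf

# Hypothetical options: no volume reduction with cement solidification vs.
# volume reduction with polyethylene (lower disposal volume, lower escalation)
base = levelized_cost(5.0e6, escalation=0.12, discount=0.08)
vr   = levelized_cost(3.5e6, escalation=0.06, discount=0.08)
print(round(base / 1e6, 2), round(vr / 1e6, 2))  # levelized $M/yr for each option
```

A sensitivity analysis like the study's burial-cost sweep would simply re-run `levelized_cost` over a grid of escalation rates and find where the two options cross.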

  12. Human factors evaluation of remote afterloading brachytherapy: Human error and critical tasks in remote afterloading brachytherapy and approaches for improved system performance. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Callan, J.R.; Kelly, R.T.; Quinn, M.L. [Pacific Science and Engineering Group, San Diego, CA (United States)] [and others]

    1995-05-01

    Remote Afterloading Brachytherapy (RAB) is a medical process used in the treatment of cancer. RAB uses a computer-controlled device to remotely insert and remove radioactive sources close to a target (or tumor) in the body. Some RAB problems affecting the radiation dose to the patient have been reported and attributed to human error. To determine the root cause of human error in the RAB system, a human factors team visited 23 RAB treatment sites in the US. The team observed RAB treatment planning and delivery, interviewed RAB personnel, and performed walk-throughs, during which staff demonstrated the procedures and practices used in performing RAB tasks. Factors leading to human error in the RAB system were identified. The impact of those factors on the performance of RAB was then evaluated and prioritized in terms of safety significance. Finally, the project identified and evaluated alternative approaches for resolving the safety-significant problems related to human error.

  13. Human factors evaluation of remote afterloading brachytherapy: Human error and critical tasks in remote afterloading brachytherapy and approaches for improved system performance. Volume 1

    International Nuclear Information System (INIS)

    Callan, J.R.; Kelly, R.T.; Quinn, M.L.

    1995-05-01

    Remote Afterloading Brachytherapy (RAB) is a medical process used in the treatment of cancer. RAB uses a computer-controlled device to remotely insert and remove radioactive sources close to a target (or tumor) in the body. Some RAB problems affecting the radiation dose to the patient have been reported and attributed to human error. To determine the root cause of human error in the RAB system, a human factors team visited 23 RAB treatment sites in the US. The team observed RAB treatment planning and delivery, interviewed RAB personnel, and performed walk-throughs, during which staff demonstrated the procedures and practices used in performing RAB tasks. Factors leading to human error in the RAB system were identified. The impact of those factors on the performance of RAB was then evaluated and prioritized in terms of safety significance. Finally, the project identified and evaluated alternative approaches for resolving the safety-significant problems related to human error.

  14. Engineering Task Plan for Development and Fabrication and Deployment of a mobile, variable depth sampling At-Tank Analysis Systems

    International Nuclear Information System (INIS)

    BOGER, R.M.

    2000-01-01

    This engineering task plan identifies the resources, responsibilities, and schedules for the development and deployment of a mobile, variable-depth sampling system and an at-tank analysis system. The mobile, variable-depth sampling system concept was developed after a cost assessment indicated a high cost for multiple deployments of the nested, fixed-depth sampling system. The sampling will provide double-shell tank (DST) staging tank waste samples for assuring the readiness of the waste for shipment to the LAW/HLW plant for treatment and immobilization. The at-tank analysis system will provide "real-time" assessments of the samples' chemical and physical properties. These systems support the Hanford Phase 1B vitrification project.

  15. Indefinite inner product spaces, Schur analysis, and differential equations a volume dedicated to Heinz Langer

    CERN Document Server

    Kirstein, Bernd

    2018-01-01

    This volume, which is dedicated to Heinz Langer, includes biographical material and carefully selected papers. Heinz Langer has made fundamental contributions to operator theory. In particular, he has studied the domains of operator pencils and nonlinear eigenvalue problems, the theory of indefinite inner product spaces, operator theory in Pontryagin and Krein spaces, and applications to mathematical physics. His works include studies on and applications of Schur analysis in the indefinite setting, where the factorization theorems put forward by Krein and Langer for generalized Schur functions, and by Dijksma-Langer-Luger-Shondin, play a key role. The contributions in this volume reflect Heinz Langer’s chief research interests and will appeal to a broad readership whose work involves operator theory.

  16. Systems Studies Department FY 78 activity report. Volume 2. Systems analysis

    International Nuclear Information System (INIS)

    Gold, T.S.

    1979-02-01

    The Systems Studies Department at Sandia Laboratories Livermore (SLL) has two primary responsibilities: to provide computational and mathematical services and to perform systems analysis studies. This document (Volume 2) describes the FY 78 systems analysis highlights. The description is an unclassified overview of activities and is not complete or exhaustive. The objective of the systems analysis activities is to evaluate the relative value of alternative concepts and systems. SLL systems analysis activities reflect Sandia Laboratory programs and in 1978 consisted of study efforts in three areas: national security (evaluations of strategic, theater, and navy nuclear weapons issues); energy technology (particularly in support of Sandia's solar thermal programs); and nuclear fuel cycle physical security (a special project conducted for the Nuclear Regulatory Commission). Highlights of these activities are described in the following sections. 7 figures

  17. The effect of duration of illness and antipsychotics on subcortical volumes in schizophrenia: Analysis of 778 subjects

    Directory of Open Access Journals (Sweden)

    Naoki Hashimoto

    2018-01-01

    Discussion: A large sample size, uniform data collection methodology, and robust statistical analysis are strengths of the current study. This result suggests that special attention is needed when discussing the relationship between subcortical regional brain volumes and the pathophysiology of schizophrenia, because regional brain volumes may be affected by antipsychotic medication.

  18. External validation of a forest inventory and analysis volume equation and comparisons with estimates from multiple stem-profile models

    Science.gov (United States)

    Christopher M. Oswalt; Adam M. Saunders

    2009-01-01

    Sound estimation procedures are desideratum for generating credible population estimates to evaluate the status and trends in resource conditions. As such, volume estimation is an integral component of the U.S. Department of Agriculture, Forest Service, Forest Inventory and Analysis (FIA) program's reporting. In effect, reliable volume estimation procedures are...

  19. The effect of duration of illness and antipsychotics on subcortical volumes in schizophrenia: Analysis of 778 subjects.

    Science.gov (United States)

    Hashimoto, Naoki; Ito, Yoichi M; Okada, Naohiro; Yamamori, Hidenaga; Yasuda, Yuka; Fujimoto, Michiko; Kudo, Noriko; Takemura, Ariyoshi; Son, Shuraku; Narita, Hisashi; Yamamoto, Maeri; Tha, Khin Khin; Katsuki, Asuka; Ohi, Kazutaka; Yamashita, Fumio; Koike, Shinsuke; Takahashi, Tsutomu; Nemoto, Kiyotaka; Fukunaga, Masaki; Onitsuka, Toshiaki; Watanabe, Yoshiyuki; Yamasue, Hidenori; Suzuki, Michio; Kasai, Kiyoto; Kusumi, Ichiro; Hashimoto, Ryota

    2018-01-01

    The reported effects of duration of illness and antipsychotic medication on the volumes of subcortical structures in schizophrenia are inconsistent among previous studies. We implemented a large-sample analysis utilizing clinical data from 11 institutions included in a previous meta-analysis. Imaging and clinical data of 778 schizophrenia subjects were taken from a prospective meta-analysis conducted by the COCORO consortium in Japan. The effects of duration of illness and of daily dose and type of antipsychotics were assessed using a linear mixed-effect model in which the volumes of subcortical structures computed by FreeSurfer were the dependent variable; age, sex, duration of illness, daily dose of antipsychotics, and intracranial volume were independent variables; and the type of protocol was incorporated as a random effect on the intercept. The statistical significance of each fixed effect was assessed. Daily dose of antipsychotics was positively associated with left globus pallidus volume and negatively associated with right hippocampus volume. It was also positively associated with the laterality index of the globus pallidus. Duration of illness was positively associated with bilateral globus pallidus volumes. Type of antipsychotic did not have any effect on the subcortical volumes. A large sample size, uniform data collection methodology, and robust statistical analysis are strengths of the current study. This result suggests that special attention is needed when discussing the relationship between subcortical regional brain volumes and the pathophysiology of schizophrenia, because regional brain volumes may be affected by antipsychotic medication.
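The model structure described above can be approximated in a few lines. The sketch below uses synthetic data and a within-protocol centering (fixed-effects) shortcut in place of the paper's true random-intercept mixed model; every coefficient, distribution, and variable name is an assumption for illustration only:

```python
import numpy as np

rng = np.random.default_rng(1)
n, n_protocols = 300, 6
protocol = rng.integers(0, n_protocols, n)

# Synthetic data loosely mimicking the design (all values invented)
dose     = rng.gamma(2.0, 200.0, n)       # antipsychotic dose equivalent, mg
duration = rng.uniform(0.5, 30.0, n)      # duration of illness, years
age      = rng.uniform(20, 60, n)         # years
icv      = rng.normal(1500, 120, n)       # intracranial volume, cm^3
protocol_shift = rng.normal(0, 40, n_protocols)[protocol]
pallidum = (1.2 + 0.002 * dose + 0.8 * duration + 0.01 * icv
            + protocol_shift + rng.normal(0, 5, n))

# Absorb the protocol effect by centering every variable within protocol,
# then fit OLS on the centered data: a fixed-effects "within" estimator,
# a crude stand-in for the random-intercept model used in the paper.
def center(x):
    out = x.astype(float).copy()
    for g in range(n_protocols):
        m = protocol == g
        out[m] -= out[m].mean()
    return out

X = np.column_stack([center(v) for v in (dose, duration, age, icv)])
y = center(pallidum)
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.round(beta, 4))  # recovers the planted dose/duration/ICV effects
```

A full analysis would instead fit a proper mixed model (e.g. REML with a random intercept per protocol) so that between-protocol information is weighted rather than discarded.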

  20. Dose-volume analysis of hypothyroidism in patients irradiated to the neck

    International Nuclear Information System (INIS)

    Te, Vuong; Liu, Mitchell C.C.; Parker, William; Curtin-Savard, Arthur J.; Clark, Brenda

    1997-01-01

    Purpose: To determine if the incidence of hypothyroidism in patients who have received radiation therapy to the neck region has any relationship with the total dose to the thyroid and the volume of thyroid irradiated. Methods and Materials: From 1988 to 1996, TSH levels were measured at regular intervals of every 3 to 6 months in 528 patients with head and neck cancers or lymphomas (Hodgkin and non-Hodgkin) who had received radiation therapy to the neck region. Hypothyroidism was defined by TSH of ≥ 5 (normal range: 0.5 - 4 mU/L). Medical charts, radiotherapy charts, treatment planning films, dosimetry and CT scans/MRI were reviewed. Thyroid volume was determined utilizing treatment planning films and CT scans/MRI. Four hundred and six patients had normal TSH prior to radiation and sufficient information to be eligible for analysis. There were 264 (65%) male and 142 (35%) female; median age was 59 yr (range: 12 - 85). Median follow-up was 39.5 months (range: 1 - 289 months). Results: Of the 406 eligible patients, 152 (37%) developed hypothyroidism. The actuarial incidence of hypothyroidism at 1 yr, 3 yr and 5 yr is 9.1%, 29% and 38.5%, respectively. Analysis of the volume and dose effects is as follows: When the radiation dose to the thyroid and the volume of thyroid irradiated are analyzed together, the group of patients who received ≥ 60 Gy to half of the thyroid or ≥ 30 Gy to the whole thyroid has an increased risk of developing hypothyroidism as compared to those receiving < 60 Gy to half the thyroid or < 30 Gy to the whole thyroid (p=.0001). Conclusions: The actuarial incidence of hypothyroidism at 5 years in patients who had received radiation to the neck is 38.5%. Patients who received ≥ 60 Gy to half the thyroid or ≥ 30 Gy to the whole thyroid are at higher risk of developing hypothyroidism.
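Actuarial (Kaplan-Meier style) incidence estimates like those reported here can be computed as follows. The follow-up times and event flags below are invented for illustration, not the study's data:

```python
import numpy as np

def km_event_incidence(time, event, t_query):
    """Kaplan-Meier estimate of cumulative incidence (1 - survival) at
    time t_query. `event` is 1 for hypothyroidism onset, 0 for censored."""
    order = np.argsort(time)
    time, event = np.asarray(time)[order], np.asarray(event)[order]
    at_risk = len(time)
    surv = 1.0
    for t, e in zip(time, event):
        if t > t_query:
            break
        if e:
            surv *= 1 - 1.0 / at_risk  # one event among those still at risk
        at_risk -= 1                   # event or censoring leaves the risk set
    return 1 - surv

# Hypothetical follow-up (months) and event flags, not the study data
t = [6, 12, 14, 20, 24, 30, 36, 40, 48, 60]
e = [0,  1,  0,  1,  0,  1,  0,  1,  0,  1]
print(round(km_event_incidence(t, e, 36), 3))
```

The actuarial figures in the abstract (9.1%, 29%, 38.5% at 1, 3, and 5 years) are exactly this kind of estimate evaluated at three query times, with censoring handling the wide spread of follow-up.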

  1. Analysis in Banach spaces volume II probabilistic methods and operator theory

    CERN Document Server

    Hytönen, Tuomas; Veraar, Mark; Weis, Lutz

    2017-01-01

    This second volume of Analysis in Banach Spaces, Probabilistic Methods and Operator Theory, is the successor to Volume I, Martingales and Littlewood-Paley Theory. It presents a thorough study of the fundamental randomisation techniques and the operator-theoretic aspects of the theory. The first two chapters address the relevant classical background from the theory of Banach spaces, including notions like type, cotype, K-convexity and contraction principles. In turn, the next two chapters provide a detailed treatment of the theory of R-boundedness and Banach space valued square functions developed over the last 20 years. In the last chapter, this content is applied to develop the holomorphic functional calculus of sectorial and bi-sectorial operators in Banach spaces. Given its breadth of coverage, this book will be an invaluable reference to graduate students and researchers interested in functional analysis, harmonic analysis, spectral theory, stochastic analysis, and the operator-theoretic approac...

  2. 241-AW/AN waste storage tanks: Supplemental gravity load analysis. Volume 1

    International Nuclear Information System (INIS)

    Julyk, L.J.

    1994-01-01

    An analysis of the 241SY tanks performed by ADVENT (1994b) to resolve dome overload issues indicated that the tank can sustain the dome loads resulting from additional soil overburden depth, increased soil density, and increased concentrated load. Similar issues exist for the 241AW/AN tanks and therefore, an interim analysis of the 241AW/AN tanks is presented herein. The scope of this effort is to review and compare all design drawings pertaining to the 241AW and 241AN tanks with those pertaining to the 241SY tanks; to modify the axisymmetric model of the 241SY tanks to represent the 241AW/AN tanks; and to evaluate the effect of additional dome load on the 241AW/AN tanks by performing a structural analysis for gravity loads (dead load + live load). ADVENT's additional scope of work is to perform a qualitative evaluation of the 241AW/AN tanks for seismic and thermal loadings (Blume 1976). This qualitative evaluation does not include any detailed finite element analysis of the tanks. Volume 1 of this report contains the text and calculations. Volume 2 contains a printed copy of the computer files used in these analyses.

  3. Pressurized fluidized-bed hydroretorting of eastern oil shales. Volume 2, Task 3, Testing of process improvement concepts: Final report, September 1987--May 1991

    Energy Technology Data Exchange (ETDEWEB)

    1992-03-01

    This final report, Volume 2, on "Process Improvement Concepts" presents the results of work conducted by the Institute of Gas Technology (IGT), the Illinois Institute of Technology (IIT), and the Ohio State University (OSU) to develop three novel approaches for desulfurization that have shown good potential with coal and could be cost-effective for oil shales. These are (1) In-Bed Sulfur Capture using different sorbents (IGT), (2) Electrostatic Desulfurization (IIT), and (3) Microbial Desulfurization and Denitrification (OSU and IGT). Results of work on electroseparation of shale oil and fines conducted by IIT are included in this report, as well as work conducted by IGT to evaluate the restricted pipe discharge system. The work was conducted as part of the overall program on "Pressurized Fluidized-Bed Hydroretorting of Eastern Oil Shales."

  4. The report of Task Group 100 of the AAPM: Application of risk analysis methods to radiation therapy quality management

    Science.gov (United States)

    Huq, M. Saiful; Fraass, Benedick A.; Dunscombe, Peter B.; Gibbons, John P.; Mundt, Arno J.; Mutic, Sasa; Palta, Jatinder R.; Rath, Frank; Thomadsen, Bruce R.; Williamson, Jeffrey F.; Yorke, Ellen D.

    2016-01-01

    The increasing complexity of modern radiation therapy planning and delivery challenges traditional prescriptive quality management (QM) methods, such as many of those included in guidelines published by organizations such as the AAPM, ASTRO, ACR, ESTRO, and IAEA. These prescriptive guidelines have traditionally focused on monitoring all aspects of the functional performance of radiotherapy (RT) equipment by comparing parameters against tolerances set at strict but achievable values. Many errors that occur in radiation oncology are not due to failures in devices and software; rather they are failures in workflow and process. A systematic understanding of the likelihood and clinical impact of possible failures throughout a course of radiotherapy is needed to direct limited QM resources efficiently to produce maximum safety and quality of patient care. Task Group 100 of the AAPM has taken a broad view of these issues and has developed a framework for designing QM activities, based on estimates of the probability of identified failures and their clinical outcome through the RT planning and delivery process. The Task Group has chosen a specific radiotherapy process required for “intensity modulated radiation therapy (IMRT)” as a case study. The goal of this work is to apply modern risk-based analysis techniques to this complex RT process in order to demonstrate to the RT community that such techniques may help identify more effective and efficient ways to enhance the safety and quality of our treatment processes. The task group generated by consensus an example quality management program strategy for the IMRT process performed at the institution of one of the authors. This report describes the methodology and nomenclature developed, presents the process maps, FMEAs, fault trees, and QM programs developed, and makes suggestions on how this information could be used in the clinic. The development and implementation of risk-assessment techniques will make radiation
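The FMEA step at the heart of the TG-100 approach ranks failure modes by a risk priority number (RPN = occurrence × severity × detectability). The failure modes and scores below are hypothetical illustrations, not TG-100's published worksheet:

```python
# Hypothetical failure modes for an IMRT-like process; the O/S/D scores
# (each on a 1-10 scale) are illustrative, not TG-100's published values.
failure_modes = {
    "wrong patient selected at console": (2, 9, 4),
    "contour transferred to wrong image set": (3, 8, 6),
    "MU second check skipped": (4, 7, 5),
    "couch shift entered with wrong sign": (3, 8, 3),
}

# FMEA risk priority number: occurrence x severity x (lack of) detectability
rpn = {mode: o * s * d for mode, (o, s, d) in failure_modes.items()}
for mode, score in sorted(rpn.items(), key=lambda kv: -kv[1]):
    print(f"{score:4d}  {mode}")
```

QM effort is then concentrated on the top-ranked modes, with fault trees tracing each high-RPN mode back to its upstream causes.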

  5. Sparse Probabilistic Parallel Factor Analysis for the Modeling of PET and Task-fMRI Data

    DEFF Research Database (Denmark)

    Beliveau, Vincent; Papoutsakis, Georgios; Hinrich, Jesper Løve

    2017-01-01

    Modern datasets are often multiway in nature and can contain patterns common to a mode of the data (e.g. space, time, and subjects). Multiway decomposition such as parallel factor analysis (PARAFAC) take into account the intrinsic structure of the data, and sparse versions of these methods improve...
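A plain (non-sparse, non-probabilistic) PARAFAC decomposition by alternating least squares conveys the core idea. This minimal numpy sketch is a stand-in for, not an implementation of, the sparse probabilistic model in the paper:

```python
import numpy as np

def unfold(t, mode):
    """Mode-n unfolding with C-order columns (last axis fastest)."""
    return np.moveaxis(t, mode, 0).reshape(t.shape[mode], -1)

def khatri_rao(a, b):
    """Column-wise Kronecker product of (I,R) and (J,R) -> (I*J, R)."""
    return np.einsum("ir,jr->ijr", a, b).reshape(-1, a.shape[1])

def cp_als(t, rank, n_iter=300, seed=0):
    """Plain CP/PARAFAC by alternating least squares on a 3-way tensor."""
    rng = np.random.default_rng(seed)
    factors = [rng.standard_normal((s, rank)) for s in t.shape]
    for _ in range(n_iter):
        for mode in range(3):
            others = [f for m, f in enumerate(factors) if m != mode]
            kr = khatri_rao(others[0], others[1])  # matches C-order unfolding
            factors[mode] = np.linalg.lstsq(kr, unfold(t, mode).T, rcond=None)[0].T
    return factors

# Recover a synthetic rank-2 tensor (e.g. space x time x subjects)
rng = np.random.default_rng(42)
a, b, c = (rng.standard_normal((s, 2)) for s in (6, 5, 4))
t = np.einsum("ir,jr,kr->ijk", a, b, c)
fa, fb, fc = cp_als(t, rank=2)
rec = np.einsum("ir,jr,kr->ijk", fa, fb, fc)
print(np.linalg.norm(t - rec) / np.linalg.norm(t))  # relative error
```

The sparse probabilistic variant in the paper replaces these least-squares updates with Bayesian inference and sparsity-inducing priors on the factor matrices.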

  6. Guidelines on routine cerebrospinal fluid analysis. Report from an EFNS task force

    DEFF Research Database (Denmark)

    Deisenhammer, F; Bartos, A; Egg, R

    2006-01-01

    A great variety of neurological diseases require investigation of cerebrospinal fluid (CSF) to prove the diagnosis or to rule out relevant differential diagnoses. The objectives were to evaluate the theoretical background and provide guidelines for clinical use in routine CSF analysis including...

  7. Traffic analysis toolbox volume XI : weather and traffic analysis, modeling and simulation.

    Science.gov (United States)

    2010-12-01

    This document presents a weather module for the traffic analysis tools program. It provides traffic engineers, transportation modelers and decision makers with a guide that can incorporate weather impacts into transportation system analysis and mode...

  8. Cost-effectiveness analysis alongside clinical trials II-An ISPOR Good Research Practices Task Force report.

    Science.gov (United States)

    Ramsey, Scott D; Willke, Richard J; Glick, Henry; Reed, Shelby D; Augustovski, Federico; Jonsson, Bengt; Briggs, Andrew; Sullivan, Sean D

    2015-03-01

    Clinical trials evaluating medicines, medical devices, and procedures now commonly assess the economic value of these interventions. The growing number of prospective clinical/economic trials reflects both widespread interest in economic information for new technologies and the regulatory and reimbursement requirements of many countries that now consider evidence of economic value along with clinical efficacy. As decision makers increasingly demand evidence of economic value for health care interventions, conducting high-quality economic analyses alongside clinical studies is desirable because they broaden the scope of information available on a particular intervention, and can efficiently provide timely information with high internal and, when designed and analyzed properly, reasonable external validity. In 2005, ISPOR published the Good Research Practices for Cost-Effectiveness Analysis Alongside Clinical Trials: The ISPOR RCT-CEA Task Force report. ISPOR initiated an update of the report in 2014 to include the methodological developments over the last 9 years. This report provides updated recommendations reflecting advances in several areas related to trial design, selecting data elements, database design and management, analysis, and reporting of results. Task force members note that trials should be designed to evaluate effectiveness (rather than efficacy) when possible, should include clinical outcome measures, and should obtain health resource use and health state utilities directly from study subjects. Collection of economic data should be fully integrated into the study. An incremental analysis should be conducted with an intention-to-treat approach, complemented by relevant subgroup analyses. Uncertainty should be characterized. Articles should adhere to established standards for reporting results of cost-effectiveness analyses. Economic studies alongside trials are complementary to other evaluations (e.g., modeling studies) as information for decision
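The incremental analysis and uncertainty characterization recommended by the task force can be sketched as a point ICER with a nonparametric bootstrap. All per-patient data below are simulated, not from any trial:

```python
import numpy as np

rng = np.random.default_rng(7)

# Simulated per-patient trial data: costs ($) and QALYs in two arms
cost_new = rng.normal(42000, 8000, 250)
qaly_new = rng.normal(1.90, 0.45, 250)
cost_std = rng.normal(35000, 7000, 250)
qaly_std = rng.normal(1.70, 0.45, 250)

def icer(cn, qn, cs, qs):
    """Incremental cost-effectiveness ratio: extra cost per extra QALY."""
    return (cn.mean() - cs.mean()) / (qn.mean() - qs.mean())

point = icer(cost_new, qaly_new, cost_std, qaly_std)

# Characterize uncertainty with a nonparametric bootstrap over patients
boot = []
for _ in range(2000):
    i = rng.integers(0, 250, 250)
    j = rng.integers(0, 250, 250)
    boot.append(icer(cost_new[i], qaly_new[i], cost_std[j], qaly_std[j]))
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"ICER {point:,.0f} $/QALY (95% bootstrap CI {lo:,.0f} to {hi:,.0f})")
```

In practice the task force also recommends intention-to-treat assignment, subgroup analyses, and presentation on the cost-effectiveness plane; the bootstrap cloud here is the usual input to that plane and to acceptability curves.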

  9. Analysis of action tremor and impaired control of movement velocity in multiple sclerosis during visually guided wrist-tracking tasks.

    Science.gov (United States)

    Liu, X; Miall, C; Aziz, T Z; Palace, J A; Haggard, P N; Stein, J F

    1997-11-01

    We investigated the relationship between action tremor (AT) and impaired control of movement velocity (MV) in visually guided tracking tasks, in normal subjects and in patients with multiple sclerosis (MS) with or without motor deficits. The effects of withdrawing visual feedback of either the target or the cursor were then investigated. Visually cued simple reaction times (SRTs) were also measured. The effects of thalamotomy on motor performance in these tasks were evaluated in seven patients. In the MS patients with tremor, there was no correlation between AT and impairment in control of MV, but the latter was highly correlated with an increased delay in SRT. Withdrawal of visually guiding cues increased the error significantly in MV, but reduced AT by approximately 30% in magnitude. Frequency analysis indicated that the AT had two components: (1) non-visually dependent, oscillatory movements, mainly at 4 Hz; and (2) visually dependent, repetitive movements, with significant power at 1-2 Hz. Thalamotomy significantly reduced AT but hardly improved accuracy in MV. These results suggest that visual feedback of a spatial mismatch signal may provoke a visually dependent repetitive movement contributing to AT. Conduction delays along either the cortico-cerebello-cortical or the proprioceptive pathways and impaired working memory caused by MS may be responsible for the movement disorders in these patients.
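The frequency analysis used to separate the two tremor components can be illustrated with a periodogram of a synthetic trace. The signal amplitudes, sampling rate, and noise level below are assumptions, not the study's recordings:

```python
import numpy as np

fs, dur = 100.0, 30.0                      # sample rate (Hz), duration (s)
t = np.arange(0, dur, 1 / fs)
rng = np.random.default_rng(0)

# Synthetic wrist-position trace: a 4 Hz oscillatory component plus a
# slower 1.5 Hz visually dependent component, buried in noise
x = (1.0 * np.sin(2 * np.pi * 4.0 * t)
     + 0.8 * np.sin(2 * np.pi * 1.5 * t)
     + 0.3 * rng.standard_normal(t.size))

# Periodogram: squared FFT magnitude at the positive frequencies
freqs = np.fft.rfftfreq(t.size, 1 / fs)
power = np.abs(np.fft.rfft(x)) ** 2 / t.size
peak = freqs[np.argmax(power[1:]) + 1]     # skip the DC bin
print(round(peak, 2))
```

On a real recording, the two bands would appear as distinct spectral peaks whose relative power shifts when visual feedback is withdrawn, matching the ~30% tremor reduction the abstract describes.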

  10. JV Task 99-Integrated Risk Analysis and Contaminant Reduction, Watford City, North Dakota

    Energy Technology Data Exchange (ETDEWEB)

    Jaroslav Solc; Barry W. Botnen

    2007-05-31

    The Energy & Environmental Research Center (EERC) conducted a limited site investigation and risk analysis for hydrocarbon-contaminated soils and groundwater at a Construction Services, Inc., site in Watford City, North Dakota. The site investigation confirmed the presence of free product and high concentrations of residual gasoline-based contaminants in several wells, the presence of 1,2-dichloroethane, and extremely high levels of electrical conductivity indicative of brine residuals in the tank area south of the facility. The risk analysis was based on a compilation of information from the site-specific geotechnical investigation, including a multiphase extraction pilot test, laser-induced fluorescence probing, evaluation of contaminant properties, a receptor survey, capture zone analysis, and evaluation of the wellhead protection area for the municipal well field. The project results indicate that the risks associated with contaminant occurrence at the Construction Services, Inc., site are low and that, under current conditions, there is no direct or indirect exposure pathway between the contaminated groundwater and soils and potential receptors.

  11. Quantitative analysis of macular retinal thickness and macular volume in diabetic retinopathy

    Directory of Open Access Journals (Sweden)

    Ying Zhao

    2017-12-01

    The difference was statistically significant. CONCLUSION: The changes of retinal thickness and volume in the macular central fovea were related to the progression of diabetic retinopathy. Using OCT to analyze macular thickness and macular volume in different stages of diabetic retinopathy helps physicians understand the morphological changes of the macular region and surrounding macular degeneration as diabetic retinopathy becomes more severe, and provides a basis for better analysis of changes in macular structure across different severities of diabetic retinopathy.

  12. Improved Duct Systems Task Report with StageGate 2 Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Moyer, Neil [Florida Solar Energy Center, Cocoa, FL (United States); Stroer, Dennis [Calcs-Plus, Venice, FL (United States)

    2007-12-31

    This report describes the Building America Industrialized Housing Partnership's work with two industry partners, Davalier Homes and Southern Energy Homes, in constructing and evaluating prototype interior duct systems. Issues of energy performance, comfort, DAPIA approval, manufacturability, and cost are addressed. A Stage Gate 2 analysis addresses the current status of the project, showing that refinements are still needed to the process of incorporating all of the ducts within the air and thermal boundaries of the envelope.

  13. Status report on activities of ASTM E10.05.01 Task Group on Uncertainty Analysis

    International Nuclear Information System (INIS)

    Kam, F.B.K.; Stallman, F.W.

    1979-01-01

    Uncertainties in the field of reactor dosimetry are analyzed. A survey of uncertainty analysis as practiced at leading laboratories involved in reactor dosimetry is described. A questionnaire was prepared and mailed to about 45 installations and researchers. Nine replies were received; several of them were prepared by more than one author. Three of the nine came from installations outside the US. Results and the questionnaire are presented.

  14. Periodic breathing during ascent to extreme altitude quantified by spectral analysis of the respiratory volume signal.

    Science.gov (United States)

    Garde, A; Giraldo, B F; Jane, R; Latshang, T D; Turk, A J; Hess, T; Bosch, M M; Barthelmes, D; Hefti, J Pichler; Maggiorini, M; Hefti, U; Merz, T M; Schoch, O D; Bloch, K E

    2012-01-01

    High altitude periodic breathing (PB) shares some common pathophysiologic aspects with sleep apnea, Cheyne-Stokes respiration and PB in heart failure patients. Methods that allow quantifying instabilities of respiratory control provide valuable insights into physiologic mechanisms and help to identify therapeutic targets. Under the hypothesis that high altitude PB appears even during physical activity and can be identified, in comparison with visual analysis, in conditions of low SNR, this study aims to identify PB by characterizing the respiratory pattern through the respiratory volume signal. A number of spectral parameters are extracted from the power spectral density (PSD) of the volume signal, derived from respiratory inductive plethysmography, and evaluated through a linear discriminant analysis. A dataset of 34 healthy mountaineers ascending to Mt. Muztagh Ata, China (7,546 m), visually labeled as PB and non-periodic breathing (nPB), is analyzed. All climbing periods within all the ascents are considered (total climbing periods: 371 nPB and 40 PB). The best cross-validated result classifying PB and nPB is obtained with Pm (power of the modulation frequency band) and R (ratio between modulation and respiration power), with an accuracy of 80.3% and an area under the receiver operating characteristic curve of 84.5%. Comparing the subjects from 1st and 2nd ascents (at the same altitudes, but the latter more acclimatized), the effect of acclimatization is evaluated. SaO(2) and periodic breathing cycles significantly increased with acclimatization (p-value < 0.05). Higher Pm and higher respiratory frequencies are observed at lower SaO(2), through a significant negative correlation (p-value < 0.01). Higher Pm is observed at climbing periods visually labeled as PB with > 5 periodic breathing cycles, through a significant positive correlation (p-value < 0.01). Our data demonstrate that quantification of the respiratory volume signal using spectral analysis is suitable to identify PB.
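
    The feature extraction step described above (band powers taken from the PSD of the respiratory volume signal, then fed to a discriminant) can be sketched as follows; the sampling rate, the band limits, and the rectification step used to expose the breathing envelope are illustrative assumptions, not the authors' exact definitions.

```python
import numpy as np

def band_power(psd, freqs, lo, hi):
    """Integrate the PSD over [lo, hi) Hz (rectangle rule)."""
    mask = (freqs >= lo) & (freqs < hi)
    return psd[mask].sum() * (freqs[1] - freqs[0])

def pb_features(volume, fs, mod_band=(0.01, 0.08), resp_band=(0.15, 0.6)):
    """Illustrative Pm (modulation-band power) and R = Pm / respiration-band
    power from a respiratory volume signal sampled at fs Hz."""
    def psd(sig):
        sig = sig - sig.mean()
        freqs = np.fft.rfftfreq(len(sig), d=1.0 / fs)
        power = np.abs(np.fft.rfft(sig)) ** 2 / (fs * len(sig))
        return freqs, power

    x = np.asarray(volume, dtype=float)
    freqs, p_raw = psd(x)
    pr = band_power(p_raw, freqs, *resp_band)
    # Full-wave rectification demodulates the waxing/waning (periodic
    # breathing) envelope so it appears at the modulation frequency.
    freqs_e, p_env = psd(np.abs(x - x.mean()))
    pm = band_power(p_env, freqs_e, *mod_band)
    return pm, pm / pr
```

    On a synthetic 0.3 Hz breathing signal whose amplitude waxes and wanes at 0.03 Hz, both Pm and R come out clearly higher than for steady breathing of the same rate.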

  15. Update of Part 61 Impacts Analysis Methodology. Methodology report. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Oztunali, O.I.; Roles, G.W.

    1986-01-01

    Under contract to the US Nuclear Regulatory Commission, the Envirosphere Company has expanded and updated the impacts analysis methodology used during the development of the 10 CFR Part 61 rule to allow improved consideration of the costs and impacts of treatment and disposal of low-level waste that is close to or exceeds Class C concentrations. The modifications described in this report principally include: (1) an update of the low-level radioactive waste source term, (2) consideration of additional alternative disposal technologies, (3) expansion of the methodology used to calculate disposal costs, (4) consideration of an additional exposure pathway involving direct human contact with disposed waste due to a hypothetical drilling scenario, and (5) use of updated health physics analysis procedures (ICRP-30). Volume 1 of this report describes the calculational algorithms of the updated analysis methodology.

  16. Update of Part 61 Impacts Analysis Methodology. Methodology report. Volume 1

    International Nuclear Information System (INIS)

    Oztunali, O.I.; Roles, G.W.

    1986-01-01

    Under contract to the US Nuclear Regulatory Commission, the Envirosphere Company has expanded and updated the impacts analysis methodology used during the development of the 10 CFR Part 61 rule to allow improved consideration of the costs and impacts of treatment and disposal of low-level waste that is close to or exceeds Class C concentrations. The modifications described in this report principally include: (1) an update of the low-level radioactive waste source term, (2) consideration of additional alternative disposal technologies, (3) expansion of the methodology used to calculate disposal costs, (4) consideration of an additional exposure pathway involving direct human contact with disposed waste due to a hypothetical drilling scenario, and (5) use of updated health physics analysis procedures (ICRP-30). Volume 1 of this report describes the calculational algorithms of the updated analysis methodology

  17. Flight Technical Error Analysis of the SATS Higher Volume Operations Simulation and Flight Experiments

    Science.gov (United States)

    Williams, Daniel M.; Consiglio, Maria C.; Murdoch, Jennifer L.; Adams, Catherine H.

    2005-01-01

    This paper provides an analysis of Flight Technical Error (FTE) from recent SATS experiments, called the Higher Volume Operations (HVO) Simulation and Flight experiments, which NASA conducted to determine pilot acceptability of the HVO concept for normal operating conditions. Reported are FTE results from simulation and flight experiment data indicating the SATS HVO concept is viable and acceptable to low-time instrument rated pilots when compared with today's system (baseline). Described is the comparative FTE analysis of lateral, vertical, and airspeed deviations from the baseline and SATS HVO experimental flight procedures. Based on FTE analysis, all evaluation subjects, low-time instrument-rated pilots, flew the HVO procedures safely and proficiently in comparison to today's system. In all cases, the results of the flight experiment validated the results of the simulation experiment and confirm the utility of the simulation platform for comparative Human in the Loop (HITL) studies of SATS HVO and baseline operations.

  18. Rate Transient Analysis for Multistage Fractured Horizontal Well in Tight Oil Reservoirs considering Stimulated Reservoir Volume

    Directory of Open Access Journals (Sweden)

    Ruizhong Jiang

    2014-01-01

    A mathematical model of a multistage fractured horizontal well (MsFHW) considering stimulated reservoir volume (SRV) was presented for tight oil reservoirs. Both the inner and outer regions were assumed to be single-porosity media but with different formation parameters. The Laplace transformation method, point source function integration method, superposition principle, Stehfest numerical algorithm, and Duhamel's theorem were used comprehensively to obtain the semianalytical solution. Different flow regimes were identified based on pressure transient analysis (PTA) curves. Based on rate transient analysis (RTA), the effects of related parameters such as SRV radius, storativity ratio, mobility ratio, fracture number, fracture half-length, and fracture spacing were analyzed. The presented model and results enrich the performance analysis models of MsFHW considering SRV.
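
    Of the tools listed above, the Stehfest numerical algorithm is the step that converts a Laplace-space solution back to the time domain. A minimal sketch of Stehfest inversion follows (the standard published formula; the even order n = 12 is a common default, not a value taken from this paper):

```python
import math

def stehfest_coefficients(n):
    """Stehfest weights V_k (n must be even)."""
    half = n // 2
    v = []
    for k in range(1, n + 1):
        total = 0.0
        for j in range((k + 1) // 2, min(k, half) + 1):
            total += (j ** half * math.factorial(2 * j) /
                      (math.factorial(half - j) * math.factorial(j) *
                       math.factorial(j - 1) * math.factorial(k - j) *
                       math.factorial(2 * j - k)))
        v.append((-1) ** (k + half) * total)
    return v

def stehfest_invert(F, t, n=12):
    """Approximate f(t) = L^{-1}[F](t) for t > 0 by evaluating F at
    s = k ln2 / t and combining with the Stehfest weights."""
    ln2 = math.log(2.0)
    v = stehfest_coefficients(n)
    return ln2 / t * sum(v[k - 1] * F(k * ln2 / t) for k in range(1, n + 1))
```

    On a known pair such as F(s) = 1/(s + 1), whose inverse is e^(-t), the n = 12 approximation agrees to several significant digits; accuracy degrades for oscillatory or discontinuous f(t), which is why the method suits smooth pressure solutions.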

  19. Georgetown University Integrated Community Energy System (GU-ICES). Phase III, Stage I. Feasibility analysis. Final report. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    None

    1980-10-01

    This Feasibility Analysis covers a wide range of studies and evaluations. The Report is divided into five parts. Section 1 contains all material relating to the Institutional Assessment, including consideration of the requirements and position of the Potomac Electric Co. as they relate to cogeneration at Georgetown in parallel with the utility (Task 1). Sections 2 through 7 contain all technical information relating to the Alternative Subsystems Analysis (Task 4). This includes the energy demand profiles upon which the evaluations were based (Task 3). It further includes the results of the Life-Cycle-Cost Analyses (Task 5), which are developed in detail in the Appendix for evaluation in the Technical Report. Also included is the material relating to Incremental Savings and Optimization (Task 6) and the Conceptual Design for candidate alternate subsystems (Task 7). Section 8 contains all material relating to the Environmental Impact Assessment (Task 2). The Appendix contains supplementary material, including the budget cost estimates used in the life-cycle-cost analyses, the basic assumptions upon which the life-cycle analyses were developed, and the detailed life-cycle-cost analysis for each subsystem considered in detail.

  20. Analysis of tidal breathing flow volume loop in dogs with tracheal masses.

    Science.gov (United States)

    Adamama-Moraitou, Kk; Pardali, D; Prassinos, N N; Papazoglou, L G; Makris, D; Gourgoulianis, K I; Papaioannou, N; Rallis, T S

    2010-09-01

    To investigate whether there are any changes in the tidal breathing flow volume loop (TBFVL) in calm, non-dyspnoeic dogs with intratracheal masses, we compared 4 dogs with intratracheal masses (group 1) with 10 healthy dogs (group 2). Routine clinical and laboratory examinations of the dogs were unremarkable, except for episodic upper respiratory obstructive signs in the dogs in group 1. Lateral radiography of the neck and thorax showed that group 1 dogs had masses that appeared to protrude into the tracheal lumen. Tracheoscopy and surgery or necropsy was performed to confirm the presence of the mass. Arterial blood gas and TBFVL analyses were carried out in all dogs to assess respiratory status. The shape of the TBFVL for dogs in group 1 was narrower and ovoid compared with that for the group 2 dogs. Tidal volume and expiratory and inspiratory times were significantly reduced, whereas the respiratory rate was increased, for dogs in group 1 compared with dogs in group 2. Arterial blood gas analysis was unremarkable for all dogs. TBFVL is a non-invasive technique that is easy to perform and well tolerated by dogs. In the absence of abnormalities detected by routine diagnostic evaluations and arterial blood gas analysis in dogs with intratracheal masses, the TBFVL contributes to the definition of the physiologic status of the airways at the time of testing, and the results suggest that these dogs breathe quite normally when they are calm and non-dyspnoeic.
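
    The loop parameters reported above (tidal volume, inspiratory and expiratory times, respiratory rate) can be derived from a flow signal by integration. The sketch below assumes a single-breath recording with inspiratory flow positive, a sign convention that varies between systems:

```python
import numpy as np

def tbfvl_parameters(flow, fs):
    """Tidal volume (L), inspiratory/expiratory time (s) and respiratory rate
    (breaths/min) for one breath of a flow signal in L/s sampled at fs Hz.
    Assumes inspiratory flow is positive and the record spans one breath."""
    flow = np.asarray(flow, dtype=float)
    insp = flow > 0
    ti = insp.sum() / fs                 # inspiratory time
    te = len(flow) / fs - ti             # expiratory time
    vt = flow[insp].sum() / fs           # integral of inspiratory flow -> Vt
    rr = 60.0 * fs / len(flow)           # one breath per record
    return vt, ti, te, rr
```

    For a sinusoidal 4-second breath of peak flow 1 L/s this yields Vt = 4/pi (about 1.27 L), Ti = Te = 2 s, and a rate of 15 breaths/min, matching the analytic integral.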

  1. VOLUME STUDY WITH HIGH DENSITY OF PARTICLES BASED ON CONTOUR AND CORRELATION IMAGE ANALYSIS

    Directory of Open Access Journals (Sweden)

    Tatyana Yu. Nikolaeva

    2014-11-01

    The subject of study is the techniques of particle statistics evaluation, in particular, processing methods for particle images obtained under coherent illumination. This paper considers the problem of recognition and statistical accounting for individual images of small scattering particles in an arbitrary section of the volume in the case of high concentrations. For automatic recognition of focused particle images, a special algorithm for statistical analysis based on contouring and thresholding was used. By means of the mathematical formalism of scalar diffraction theory, coherent images of the particles formed by an optical system with high numerical aperture were simulated. Numerical testing of the proposed method for different concentrations and distributions of particles in the volume was performed. As a result, distributions of density and mass fraction of the particles were obtained, and the efficiency of the method at different concentrations of particles was evaluated. At high concentrations, the effect of coherent superposition of particles from adjacent planes strengthens, which makes it difficult to recognize images of particles using the algorithm considered in the paper. In this case, we propose to supplement the method by calculating the cross-correlation function of particle images from adjacent segments of the volume and evaluating the ratio between the height of the correlation peak and the height of the function pedestal for different distribution characters. The method of statistical accounting of particles considered in this paper is of practical importance in the study of volumes with particles of different nature, for example, in problems of biology and oceanography. Effective work in the regime of high concentrations expands the limits of applicability of these methods for practically important cases and helps to optimize the time needed to determine the distribution character.
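
    The proposed supplement, the ratio between the cross-correlation peak and the pedestal for images from adjacent volume segments, can be sketched with an FFT-based circular cross-correlation. Estimating the pedestal as the median absolute correlation value is an illustrative choice, not the authors' definition:

```python
import numpy as np

def correlation_peak_ratio(img_a, img_b):
    """Circularly cross-correlate two volume sections via the FFT and return
    the ratio of the correlation peak to the background ('pedestal') level."""
    a = np.asarray(img_a, dtype=float)
    b = np.asarray(img_b, dtype=float)
    a = a - a.mean()
    b = b - b.mean()
    # Correlation theorem: corr(a, b) = IFFT(FFT(a) * conj(FFT(b)))
    corr = np.fft.irfft2(np.fft.rfft2(a) * np.conj(np.fft.rfft2(b)),
                         s=a.shape)
    peak = corr.max()
    pedestal = np.median(np.abs(corr))
    return peak / max(pedestal, 1e-12)
```

    For two sections containing the same particle field the ratio is large (sharp peak over a flat pedestal); for unrelated sections the peak barely rises above the pedestal, which is the discrimination the paper relies on.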

  2. Body fluid volume and nutritional status in hemodialysis: vector bioelectric impedance analysis.

    Science.gov (United States)

    Espinosa Cuevas, M A; Navarrete Rodriguez, G; Villeda Martinez, M E; Atilano Carsi, X; Miranda Alatriste, P; Tostado Gutiérrez, T; Correa-Rotter, R

    2010-04-01

    Protein-energy malnutrition and hypervolemia are major causes of morbidity and mortality in patients on chronic hemodialysis (CHD). The methods used to evaluate nutritional status and volume status remain controversial. Vector bioelectric impedance analysis (vector-BIA) has recently been developed to assess both nutritional status and tissue hydration. The purpose of the study was to assess the nutritional status and volume status of patients on CHD with conventional nutritional assessment methods and with vector-BIA, and then to compare the resulting findings. 76 Mexican patients on CHD were studied. Nutritional status and body composition were assessed with anthropometry, biochemical variables, and the modified Bilbrey nutritional index (mBNI); the results were compared with both conventional BIA and vector-BIA. The mBNI was used to determine the number of patients with normal nutritional status (n = 27, 35.5%), and mild (n = 31, 40.8%), moderate (n = 10, 13.2%) and severe malnutrition (n = 8, 10.5%). Patients displayed shorter vectors with smaller phase angles, or with an overhydration vectorial pattern, before the initiation of their hemodialysis session. There was general improvement to normal hydration status after the hemodialysis session. Diabetics and those with moderate or severe malnutrition were more overhydrated, which is a condition that may be associated with increased cardiovascular morbidity. Because nutritional and volume status are important factors associated with morbidity and mortality in CHD patients, we focused on optimizing the use of existing methods. Our studies suggest that vector-BIA offers a comprehensive, reliable, and reproducible means of assessing both volume and body masses at the bedside and can complement the traditional methods.
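
    Vector BIA plots the height-normalized impedance components directly rather than converting them to body compartments. A minimal sketch of the quantities involved (resistance R, reactance Xc, phase angle, vector length), with purely illustrative numbers:

```python
import math

def bia_vector(resistance_ohm, reactance_ohm, height_m):
    """Height-normalized impedance vector (R/H, Xc/H in ohm/m), phase angle
    (degrees) and vector length, as plotted on a vector-BIA (RXc) graph."""
    r_h = resistance_ohm / height_m
    xc_h = reactance_ohm / height_m
    phase_deg = math.degrees(math.atan2(reactance_ohm, resistance_ohm))
    length = math.hypot(r_h, xc_h)       # shorter vector -> more hydration
    return r_h, xc_h, phase_deg, length
```

    As the abstract describes, a shorter vector corresponds to overhydration and a smaller phase angle accompanies it; the worked values here (500 and 50 ohm at 1.70 m) are an assumption for illustration only.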

  3. ELT Teacher Education Flipped Classroom: An Analysis of Task Challenge and Student Teachers' Views and Expectations

    Science.gov (United States)

    Karaaslan, Hatice; Çelebi, Hatice

    2017-01-01

    In this study, we explore the interplay between task complexity, task conditions and task difficulty introduced by Robinson (2001) in flipped classroom instruction at tertiary level through the data we collected from undergraduate English Language Teaching (ELT) department students studying at an English-medium state university. For the…

  4. Variability and Variation of L2 Grammar: A Cross-Sectional Analysis of German Learners' Performance on Two Tasks

    Science.gov (United States)

    Abrams, Zsuzsanna; Rott, Susanne

    2017-01-01

    Research on second language (L2) grammar in task-based language learning has yielded inconsistent results regarding the effects of task-complexity, prompting calls for more nuanced analyses of L2 development and task performance. The present cross-sectional study contributes to this discussion by comparing the performance of 245 learners of German…

  5. Low Tidal Volume versus Non-Volume-Limited Strategies for Patients with Acute Respiratory Distress Syndrome. A Systematic Review and Meta-Analysis.

    Science.gov (United States)

    Walkey, Allan J; Goligher, Ewan C; Del Sorbo, Lorenzo; Hodgson, Carol L; Adhikari, Neill K J; Wunsch, Hannah; Meade, Maureen O; Uleryk, Elizabeth; Hess, Dean; Talmor, Daniel S; Thompson, B Taylor; Brower, Roy G; Fan, Eddy

    2017-10-01

    Trials investigating use of lower tidal volumes and inspiratory pressures for patients with acute respiratory distress syndrome (ARDS) have shown mixed results. To compare clinical outcomes of mechanical ventilation strategies that limit tidal volumes and inspiratory pressures (LTV) to strategies with tidal volumes of 10 to 15 ml/kg among patients with ARDS. This is a systematic review and meta-analysis of clinical trials investigating LTV mechanical ventilation strategies. We used random effects models to evaluate the effect of LTV on 28-day mortality, organ failure, ventilator-free days, barotrauma, oxygenation, and ventilation. Our primary analysis excluded trials for which the LTV strategy was combined with the additional strategy of higher positive end-expiratory pressure (PEEP), but these trials were included in a stratified sensitivity analysis. We performed metaregression of the tidal volume gradient achieved between intervention and control groups on mortality effect estimates. We used Grading of Recommendations Assessment, Development, and Evaluation methodology to determine the quality of evidence. Seven randomized trials involving 1,481 patients met eligibility criteria for this review. Mortality was not significantly lower for patients receiving an LTV strategy (33.6%) as compared with control strategies (40.4%) (relative risk [RR], 0.87; 95% confidence interval [CI], 0.70-1.08; heterogeneity statistic I² = 46%), nor did an LTV strategy significantly decrease barotrauma or ventilator-free days when compared with a lower PEEP strategy. Quality of evidence for clinical outcomes was downgraded for imprecision. Metaregression showed a significant inverse association between larger tidal volume gradient between LTV and control groups and log odds ratios for mortality (β, -0.1587; P = 0.0022). Sensitivity analysis including trials that protocolized an LTV/high PEEP cointervention showed lower mortality associated with LTV (nine trials).
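
    The random-effects pooling described above can be illustrated with the DerSimonian-Laird estimator on log relative risks, a standard choice for this kind of meta-analysis (the review's exact model is not specified in the abstract):

```python
import math

def log_rr(events_t, n_t, events_c, n_c):
    """Log relative risk and its approximate variance from 2x2 trial counts."""
    lr = math.log((events_t / n_t) / (events_c / n_c))
    var = 1 / events_t - 1 / n_t + 1 / events_c - 1 / n_c
    return lr, var

def dersimonian_laird(log_effects, variances):
    """Random-effects pooled log effect via the DerSimonian-Laird tau^2."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * yi for wi, yi in zip(w, log_effects)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, log_effects))
    df = len(log_effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                 # between-trial variance
    w_re = [1.0 / (v + tau2) for v in variances]  # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_re, log_effects)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    return pooled, se, tau2
```

    Exponentiating `pooled` and `pooled ± 1.96·se` gives the pooled RR and its 95% CI in the form the review reports (e.g., RR 0.87, CI 0.70-1.08).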

  6. Ocean thermal energy conversion cold water pipe preliminary design project. Task 2. Analysis for concept selection

    Energy Technology Data Exchange (ETDEWEB)

    None

    1979-04-01

    The successful performance of the CWP is of crucial importance to the overall OTEC system; the pipe itself is considered the most critical part of the entire operation. Because of the importance of the CWP, a project for the analysis and design of CWPs was begun in the fall of 1978. The goals of this project were to study a variety of concepts for delivering cold water to an OTEC plant, to analyze and rank these concepts based on their relative cost and risk, and to develop preliminary designs for those concepts which seemed most promising. Two representative platforms and sites were chosen: a spar buoy of a Gibbs and Cox design to be moored at a site off Punta Tuna, Puerto Rico, and a barge designed by APL/Johns Hopkins University, grazing about a site approximately 200 miles east of the coast of Brazil. The approach was to concentrate on the most promising concepts and on those which were either of general interest or espoused by others (e.g., steel and concrete concepts). Much of the overall attention, therefore, focused on analyzing rigid and compliant wall designs, while stockade (except for the special case of the FRP stockade) and bottom-mounted concepts received less attention. A total of 67 CWP concepts were initially generated and subjected to a screening process. Of these, 16 were carried through design analysis, costing, and ranking. Study results are presented in detail. (WHK)

  7. The usability of the fashion product - analysis of user tasks in the creation and production of clothing

    Directory of Open Access Journals (Sweden)

    Tamissa Juliana Barreto Berton

    2016-12-01

    Clothing is an object in direct contact with the body, and when poorly designed it can limit the individual in daily activities, compromising the usability of the product. This article aims to highlight the analysis of user tasks during the development of fashion product design, showing in which steps knowledge of the movements performed by the human body should be inserted. By presenting the stages from research and creation through modeling, at which construction of the product begins, it becomes clear how ergonomics and anthropometry influence the process, and that knowledge of the human body in its entirety is essential for the construction of clothing. For this, a literature review was performed to unite the concepts necessary to achieve the objective of the work. The subjects covered are intended to educate the fashion designer, since understanding the user's activities will influence the quality of the designed product.

  8. Task 7a - dynamic analysis of Paks NPP structures: Reactor building

    International Nuclear Information System (INIS)

    Zola, M.

    1995-01-01

    This report describes the activities of a sub-contract to Project RER/9/046, awarded to ISMES by the International Atomic Energy Agency (IAEA) of Vienna, to compare the results obtained from the experimental activities performed by ISMES under a previous contract with those coming from analytical studies performed in the framework of the Coordinated Research Programme (CRP) on 'Benchmark Study for the Seismic Analysis and Testing of WWER-type Nuclear Power Plants' by other institutions. After a brief introduction to the problem in Chapter 1, the identification of the comparison positions and reference directions is given in Chapter 3. A brief description of the performed experimental tests is given in Chapter 4, whereas the characteristics of both experimental and analytical data are presented in Chapter 5. The data processing procedures are reported in Chapter 6 and some simple remarks are given in Chapter 7. (author)

  9. Analysis of main tasks of precision farming solved with the use of robotic means

    Directory of Open Access Journals (Sweden)

    Nguyen Vinh

    2017-01-01

    The main classes of precision farming problems solved with the involvement of robotics are discussed. The use of precision navigation systems for robotic agricultural machines allows actual soil parameters to be determined and cartograms for fertilizer application to be calculated, taking into account the spatial variability of the agricultural contour. A promising method is the use of unmanned aerial vehicles for surveying and selecting intra-field agricultural contours from a small height. To reduce the variability of soil characteristics in the fields, various robotic means of processing and preparation are used, including machines for leveling the soil. Because crops are planted in regular rows, automatic parallel-driving systems for agricultural machinery are gaining popularity. Based on the analysis of modern robotic agrosystems, promising areas for further research are formulated.

  10. THE ROLE AND PLACE OF LOGISTIC REGRESSION AND ROC ANALYSIS IN SOLVING MEDICAL DIAGNOSTIC TASK

    Directory of Open Access Journals (Sweden)

    S. G. Grigoryev

    2016-01-01

    Diagnostics, equally with prevention and treatment, is a basis of medical science and practice. Over its history, medicine has accumulated a great variety of diagnostic methods for different diseases and pathologic conditions; nevertheless, new tests, methods, and tools are still being developed and recommended for application. Indicators such as sensitivity and specificity, defined on the basis of fourfold contingency tables or of the ROC (receiver operating characteristic) analysis method with ROC-curve modelling, are used to estimate diagnostic capability. The fourfold table is used to estimate a method that confirms or denies the diagnosis, i.e., a quality indicator. The ROC curve, being a graph, allows estimating the quality of a model that separates two classes on the basis of a cutoff point for a continuous or discrete quantitative attribute. Logistic regression is introduced as a tool for developing a mathematical-statistical model that forecasts the probability of the event of interest when there are two possible variants of the outcome. ROC analysis is chosen and described in detail as a tool for estimating model quality. The capabilities of these methods are demonstrated with a real example: the creation and efficiency estimation (sensitivity and specificity) of a model forecasting the probability of complications, in the form of pyodermatitis, in children with atopic dermatitis.
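
    The two evaluation tools named above, the fourfold table (sensitivity and specificity at a cutoff) and the ROC curve (summarized by its area), reduce to a few lines. This is a generic sketch, not the authors' implementation:

```python
def fourfold(scores, labels, threshold):
    """Sensitivity and specificity from a 2x2 (fourfold) table at a cutoff;
    labels are 1 (disease present) or 0 (absent)."""
    tp = sum(1 for s, y in zip(scores, labels) if s >= threshold and y == 1)
    fn = sum(1 for s, y in zip(scores, labels) if s < threshold and y == 1)
    tn = sum(1 for s, y in zip(scores, labels) if s < threshold and y == 0)
    fp = sum(1 for s, y in zip(scores, labels) if s >= threshold and y == 0)
    return tp / (tp + fn), tn / (tn + fp)

def roc_auc(scores, labels):
    """AUC as the probability that a positive case outscores a negative one
    (ties count 1/2) - equivalent to the area under the ROC curve."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

    In practice the scores would be the predicted probabilities from the fitted logistic regression, and the cutoff would be chosen from the ROC curve to balance sensitivity against specificity.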

  11. Associations Between Daily Mood States and Brain Gray Matter Volume, Resting-State Functional Connectivity and Task-Based Activity in Healthy Adults

    Directory of Open Access Journals (Sweden)

    Elmira Ismaylova

    2018-05-01

    Numerous studies have shown differences in functioning in the areas of the frontal-limbic circuitry between depressed patients and controls. However, current knowledge on the frontal-limbic neural substrates of individual differences in mood states in everyday life in healthy individuals is scarce. The present study investigates anatomical, resting-state, and functional neural correlates of daily mood states in healthy individuals. We expected to observe associations between mood and the frontal-limbic circuitry and the default-mode network (DMN). A total of 42 healthy adults (19 men, 23 women; 34 ± 1.2 years), regularly followed for behavior and psychosocial functioning since the age of 6, underwent a functional magnetic resonance imaging scan and completed a daily diary of mood states and related cognitions for 5 consecutive days. Results showed that individuals with smaller left hippocampal gray matter volumes experienced more negative mood and rumination in their daily life. Greater resting-state functional connectivity (rsFC) within the DMN, namely between the posterior cingulate cortex (PCC) and medial prefrontal cortex regions as well as between the PCC and precuneus, was associated with both greater negative and greater positive mood states in daily life. These rsFC results could be indicative of the role of DMN regional functioning in emotional arousal, irrespective of valence. Lastly, greater daily positive mood was associated with greater activation in response to negative emotional stimuli in the precentral gyri, previously linked to emotional interference with cognitive control. Altogether, the present findings might reflect neural mechanisms underlying daily affect and cognition among healthy individuals.

  12. Accuracy evaluation of Fourier series analysis and singular spectrum analysis for predicting the volume of motorcycle sales in Indonesia

    Science.gov (United States)

    Sasmita, Yoga; Darmawan, Gumgum

    2017-08-01

    This research aims to evaluate the forecasting performance of Fourier Series Analysis (FSA) and Singular Spectrum Analysis (SSA), which are more explorative and do not require parametric assumptions. The methods are applied to predicting the monthly volume of motorcycle sales in Indonesia from January 2005 to December 2016. Both models are suitable for data with seasonal and trend components. Technically, FSA represents the time series as trend and seasonal components at different frequencies, which are difficult to identify in time-domain analysis. With a hidden period of 2.918 ≈ 3 and a significant model order of 3, the FSA model is used to predict the testing data. SSA, meanwhile, has two main stages, decomposition and reconstruction: SSA decomposes the time series into different components, and the reconstruction process starts with grouping the decomposition results based on the similarity of each component's period in the trajectory matrix. With the optimum window length (L = 53) and grouping effect (r = 4), SSA predicts the testing data. Forecasting accuracy is evaluated using Mean Absolute Percentage Error (MAPE), Mean Absolute Error (MAE), and Root Mean Square Error (RMSE). The results show that over the next 12 months SSA has MAPE = 13.54 percent, MAE = 61,168.43, and RMSE = 75,244.92, while FSA has MAPE = 28.19 percent, MAE = 119,718.43, and RMSE = 142,511.17. Therefore, the volume of motorcycle sales in the next period should be predicted with SSA, which performs better on these accuracy measures.
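
    The three accuracy measures used for the comparison are straightforward to compute; a minimal sketch:

```python
import math

def forecast_errors(actual, predicted):
    """MAPE (%), MAE and RMSE, the measures used to compare FSA and SSA."""
    n = len(actual)
    mape = 100.0 / n * sum(abs((a - p) / a) for a, p in zip(actual, predicted))
    mae = sum(abs(a - p) for a, p in zip(actual, predicted)) / n
    rmse = math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / n)
    return mape, mae, rmse
```

    For example, actual sales of [100, 200] against forecasts of [110, 190] give MAPE = 7.5%, MAE = 10, and RMSE = 10; note that MAPE assumes no zero values in the actual series.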

  13. Principal components analysis of reward prediction errors in a reinforcement learning task.

    Science.gov (United States)

    Sambrook, Thomas D; Goslin, Jeremy

    2016-01-01

    Models of reinforcement learning represent reward and punishment in terms of reward prediction errors (RPEs), quantitative signed terms describing the degree to which outcomes are better than expected (positive RPEs) or worse (negative RPEs). An electrophysiological component known as the feedback-related negativity (FRN) occurs at frontocentral sites 240-340 ms after feedback on whether a reward or punishment is obtained, and has been claimed to neurally encode an RPE. An outstanding question, however, is whether the FRN is sensitive to the size of both positive and negative RPEs. Previous attempts to answer this question have examined the simple effects of RPE size for positive and negative RPEs separately. However, this methodology can be compromised by overlap from components coding for unsigned prediction error size, or "salience", which are sensitive to the absolute size of a prediction error but not its valence. In our study, positive and negative RPEs were parametrically modulated using both reward likelihood and magnitude, with principal components analysis used to separate out overlying components. This revealed a single RPE-encoding component responsive to the size of positive RPEs, peaking at ~330 ms and occupying the delta frequency band. Other components responsive to unsigned prediction error size were found, but no component sensitive to negative RPE size. Copyright © 2015 Elsevier Inc. All rights reserved.
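The separation logic can be illustrated with a toy simulation: single-trial waveforms mix a signed-RPE temporal component with an unsigned-salience component, and PCA (here via SVD of the mean-centred trial matrix) recovers orthogonal temporal components ordered by explained variance. All waveform shapes, latencies, and amplitudes below are illustrative assumptions, not the study's EEG data:

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 0.6, 120)                              # 600 ms epoch

# Two overlapping Gaussian temporal components (hypothetical shapes)
comp_rpe = np.exp(-((t - 0.33) ** 2) / (2 * 0.03 ** 2))     # signed RPE, ~330 ms
comp_salience = np.exp(-((t - 0.25) ** 2) / (2 * 0.04 ** 2))

# Hypothetical single-trial amplitudes: the signed RPE drives one component,
# its absolute value ("salience") drives the other.
rpe = rng.uniform(-1.0, 1.0, size=200)
trials = (np.outer(rpe, comp_rpe)
          + np.outer(np.abs(rpe), comp_salience)
          + 0.05 * rng.standard_normal((200, t.size)))      # sensor noise

# PCA via SVD on the mean-centred trials: rows of Vt are temporal components,
# squared singular values give the variance each component explains.
X = trials - trials.mean(axis=0)
U, S, Vt = np.linalg.svd(X, full_matrices=False)
explained = S ** 2 / np.sum(S ** 2)
```

Regressing each trial's component scores (columns of `U * S`) on the signed and unsigned RPE would then separate the two sources, which is the overlap problem the study addresses.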

  14. Some considerations on the stress analysis of the Swiss coil for the Large Coil Task

    International Nuclear Information System (INIS)

    Jaeger, J.F.; Stiefel, U.; Dean, J.R.; Klauser, P.

    1981-08-01

    A finite element stress analysis of the coil under consideration is unavoidable because the stresses approach the 0.2% yield strength. The casing of this D-shaped coil is both a hollow body and a looped one. This leads to enormous computing costs and memory requirements, which preclude any parametric study. To reduce computer costs, a newly developed code, FLASH, has been used. It has a hybrid stress model, leading to more rapid convergence, and thick-plate elements that allow bending moments to be computed. Only one thick plate is needed across the thickness of the casing, and local stress concentrations are obtained from the mean stress and the bending moment. Several models were developed, most of which can be set up automatically. Comparisons between the models, and with ASKA finite element results from BROWN BOVERI Co., essentially show agreement. The casing of individual conductors has also been investigated with the same code. Both the effects of the Lorentz forces and those arising from the quench pressure due to helium heating on loss of superconductivity have been considered. (Auth.)

  15. Network analysis of returns and volume trading in stock markets: The Euro Stoxx case

    Science.gov (United States)

    Brida, Juan Gabriel; Matesanz, David; Seijas, Maria Nela

    2016-02-01

    This study applies network analysis to the structure of the Euro Stoxx market over the long period from 2002 to 2014. The paper generalizes previous research on stock market networks by including both asset returns and trading volume as the main variables for studying the financial market. A multidimensional generalization of the minimal spanning tree (MST) concept is introduced by adding the role of trading volume to the traditional approach, which only includes price returns. Additionally, we apply symbolization methods to the raw data to study the behavior of the market structure in different situations, both normal and critical. The hierarchical organization of the network is derived, and the MST for different sub-periods of 2002-2014 is created to illustrate how the structure of the market evolves over time. From the structural topologies of these trees, different clusters of companies are identified and analyzed according to their geographical and economic links. Two important results are achieved. Firstly, as other studies have highlighted, at the time of the financial crisis after 2008 the network becomes more centralized. Secondly, and most importantly, during our second period of analysis, 2008-2014, we observe that the hierarchy becomes more country-specific, with different sub-clusters of stocks belonging to France, Germany, Spain or Italy found apart from their business sector groups. This result may suggest that during this period financial investors were most worried about country-specific economic circumstances.
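A correlation-based MST of this kind is conventionally built from the Mantegna distance d_ij = sqrt(2(1 - rho_ij)), so that highly correlated assets sit close together, and then pruned to a spanning tree. A minimal sketch using Prim's algorithm (the correlation matrix is hypothetical, and the paper's volume-augmented multidimensional metric is not reproduced here):

```python
import math

def corr_to_dist(rho):
    """Mantegna's metric: d_ij = sqrt(2 * (1 - rho_ij))."""
    return [[math.sqrt(2.0 * (1.0 - r)) for r in row] for row in rho]

def mst_edges(dist):
    """Prim's algorithm on a dense symmetric distance matrix.
    Returns the (n - 1) edges of the minimal spanning tree."""
    n = len(dist)
    in_tree = {0}
    edges = []
    while len(in_tree) < n:
        # Cheapest edge crossing the cut between tree and non-tree nodes
        _, i, j = min((dist[i][j], i, j)
                      for i in in_tree for j in range(n) if j not in in_tree)
        edges.append((i, j))
        in_tree.add(j)
    return edges

# Hypothetical return-correlation matrix for four assets:
# assets 0-1 and 2-3 form two tight clusters.
rho = [[1.0, 0.8, 0.2, 0.1],
       [0.8, 1.0, 0.3, 0.2],
       [0.2, 0.3, 1.0, 0.7],
       [0.1, 0.2, 0.7, 1.0]]
edges = mst_edges(corr_to_dist(rho))
```

On this toy matrix the tree links the two clusters through their most correlated bridge, which is how country or sector sub-clusters appear as branches in the market-wide MST.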

  16. Object-based representation and analysis of light and electron microscopic volume data using Blender.

    Science.gov (United States)

    Asadulina, Albina; Conzelmann, Markus; Williams, Elizabeth A; Panzera, Aurora; Jékely, Gáspár

    2015-07-25

    Rapid improvements in light and electron microscopy imaging techniques and the development of 3D anatomical atlases necessitate new approaches for the visualization and analysis of image data. Pixel-based representations of raw light microscopy data suffer from limitations in the number of channels that can be visualized simultaneously. Complex electron microscopic reconstructions from large tissue volumes are also challenging to visualize and analyze. Here we exploit the advanced visualization capabilities and flexibility of the open-source platform Blender to visualize and analyze anatomical atlases. We use light-microscopy-based gene expression atlases and electron microscopy connectome volume data from larval stages of the marine annelid Platynereis dumerilii. We build object-based larval gene expression atlases in Blender and develop tools for annotation and coexpression analysis. We also represent and analyze connectome data including neuronal reconstructions and underlying synaptic connectivity. We demonstrate the power and flexibility of Blender for visualizing and exploring complex anatomical atlases. The resources we have developed for Platynereis will facilitate data sharing and the standardization of anatomical atlases for this species. The flexibility of Blender, particularly its embedded Python application programming interface, means that our methods can be easily extended to other organisms.
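The object-based idea, representing each cell as a named annotated object that carries its own synaptic connectivity, can be sketched independently of Blender itself. The class and neuron names below are hypothetical illustrations, not the authors' Blender/Python tooling:

```python
class Neuron:
    """Minimal stand-in for an annotated 3D object (e.g. a Blender mesh
    with custom properties): a named cell with outgoing synapses."""

    def __init__(self, name):
        self.name = name
        self.synapses = {}          # postsynaptic neuron name -> synapse count

    def connect(self, other, count=1):
        """Record `count` synapses onto another neuron."""
        self.synapses[other.name] = self.synapses.get(other.name, 0) + count

def strongest_partner(neuron):
    """Name of the postsynaptic partner receiving the most synapses."""
    return max(neuron.synapses, key=neuron.synapses.get)

# Hypothetical three-cell circuit: sensory -> interneuron -> motor
sensory = Neuron("eye_PRC")
inter = Neuron("INT1")
motor = Neuron("MN_ant")
sensory.connect(inter, 12)
sensory.connect(motor, 2)
inter.connect(motor, 7)
```

In an actual Blender workflow the same annotations would live on scene objects and be queried through the embedded Python API, which is what makes coexpression and connectivity analyses scriptable.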

  17. The Effect of a Workload-Preview on Task-Prioritization and Task-Performance

    Science.gov (United States)

    Minotra, Dev

    2012-01-01

    With the increased volume and sophistication of cyber attacks in recent years, maintaining situation awareness and an effective task-prioritization strategy is critical to the work of cybersecurity analysts. However, the high mental workload associated with the cybersecurity analyst's task limits their ability to prioritize tasks.…

  18. The effect of a dual task on gait speed in community dwelling older adults: A systematic review and meta-analysis.

    Science.gov (United States)

    Smith, Erin; Cusack, Tara; Blake, Catherine

    2016-02-01

    Reduced walking speed in older adults is associated with adverse health outcomes. This review aims to examine the effect of a cognitive dual-task on the gait speed of community-dwelling older adults with no significant pathology affecting gait. Electronic database searches were performed in Web of Science, PubMed, SCOPUS, Embase and PsycINFO. Eligibility and methodological quality were assessed by two independent reviewers. The effect size on gait speed was measured as the raw mean difference (95% confidence interval) between single- and dual-task performance. Pooled estimates of the overall effect were computed using a random-effects method and forest plots were generated. 22 studies (27 data sets) with a population of 3728 were reviewed and pooled for meta-analysis. The mean walking speed of participants included in all studies was > 1.0 m/s, and all studies reported the effect of a cognitive dual-task on gait speed. Sub-analysis examined the effect of the type of cognitive task (mental-tracking vs. verbal-fluency). Mean single-task gait speed was 1.21 (0.13) m/s; the addition of a dual-task reduced speed by 0.19 m/s to 1.02 (0.16) m/s, a statistically significant reduction in gait speed. The cross-sectional design of the studies made quality assessment difficult. Despite efforts, high heterogeneity remained, possibly due to participant characteristics and testing protocols. This meta-analysis shows that in community-dwelling older adults the addition of a dual-task significantly reduces gait speed, and may indicate the value of including dual-task walking as part of the standard clinical assessment of older people. Copyright © 2015 Elsevier B.V. All rights reserved.

  19. Effects of elevated vacuum on in-socket residual limb fluid volume: Case study results using bioimpedance analysis

    Science.gov (United States)

    Sanders, JE; Harrison, DS; Myers, TR; Allyn, KJ

    2015-01-01

    Bioimpedance analysis was used to measure residual limb fluid volume in seven trans-tibial amputee subjects using elevated vacuum sockets and non-elevated vacuum sockets. Fluid volume changes were assessed during sessions with the subjects sitting, standing, and walking. In general, fluid volume losses during 3- or 5-min walks and over the course of the 30-min test session were less with elevated vacuum than with suction. A number of variables, including the time of day the data were collected, soft tissue consistency, socket-to-limb size and shape differences, and subject health, may have affected the results and had an equivalent or greater impact on limb fluid volume than elevated vacuum. Researchers should carefully consider these variables in the design of future investigations of the effects of elevated vacuum on residual limb volume. PMID:22234667

  20. The history of NATO TNF policy: The role of studies, analysis and exercises conference proceedings. Volume 2: Papers and presentations

    Energy Technology Data Exchange (ETDEWEB)

    Rinne, R.L.

    1994-02-01

    This conference was organized to study and analyze the role of simulation, analysis, modeling, and exercises in the history of NATO policy. The premise was not that the results of past studies will apply to future policy, but rather that understanding what influenced the decision process -- and how -- would be of value. The structure of the conference was built around discussion panels. The panels were augmented by a series of papers and presentations focusing on particular TNF events, issues, studies, or exercises. The conference proceedings consist of three volumes. Volume 1 contains the conference introduction, agenda, biographical sketches of principal participants, and analytical summary of the presentations and panels. This volume contains a short introduction and the papers and presentations from the conference. Volume 3 contains selected papers by Brig. Gen. Robert C. Richardson III (Ret.). Individual papers in this volume were abstracted and indexed for the database.