WorldWideScience

Sample records for analysis task volume

  1. Underground Test Area Subproject Phase I Data Analysis Task. Volume VI - Groundwater Flow Model Documentation Package

    Energy Technology Data Exchange (ETDEWEB)

    None

    1996-11-01

    Volume VI of the documentation for the Phase I Data Analysis Task performed in support of the current Regional Flow Model, Transport Model, and Risk Assessment for the Nevada Test Site Underground Test Area Subproject contains the groundwater flow model data. Because of the size and complexity of the model area, a considerable quantity of data was collected and analyzed in support of the modeling efforts. The data analysis task was consequently broken into eight subtasks, and descriptions of each subtask's activities are contained in one of the eight volumes that comprise the Phase I Data Analysis Documentation.

  2. Underground Test Area Subproject Phase I Data Analysis Task. Volume VII - Tritium Transport Model Documentation Package

    Energy Technology Data Exchange (ETDEWEB)

    None

    1996-12-01

    Volume VII of the documentation for the Phase I Data Analysis Task performed in support of the current Regional Flow Model, Transport Model, and Risk Assessment for the Nevada Test Site Underground Test Area Subproject contains the tritium transport model documentation. Because of the size and complexity of the model area, a considerable quantity of data was collected and analyzed in support of the modeling efforts. The data analysis task was consequently broken into eight subtasks, and descriptions of each subtask's activities are contained in one of the eight volumes that comprise the Phase I Data Analysis Documentation.

  3. Underground Test Area Subproject Phase I Data Analysis Task. Volume IV - Hydrologic Parameter Data Documentation Package

    Energy Technology Data Exchange (ETDEWEB)

    None

    1996-09-01

    Volume IV of the documentation for the Phase I Data Analysis Task performed in support of the current Regional Flow Model, Transport Model, and Risk Assessment for the Nevada Test Site Underground Test Area Subproject contains the hydrologic parameter data. Because of the size and complexity of the model area, a considerable quantity of data was collected and analyzed in support of the modeling efforts. The data analysis task was consequently broken into eight subtasks, and descriptions of each subtask's activities are contained in one of the eight volumes that comprise the Phase I Data Analysis Documentation.

  4. Underground Test Area Subproject Phase I Data Analysis Task. Volume VIII - Risk Assessment Documentation Package

    Energy Technology Data Exchange (ETDEWEB)

    None

    1996-12-01

    Volume VIII of the documentation for the Phase I Data Analysis Task performed in support of the current Regional Flow Model, Transport Model, and Risk Assessment for the Nevada Test Site Underground Test Area Subproject contains the risk assessment documentation. Because of the size and complexity of the model area, a considerable quantity of data was collected and analyzed in support of the modeling efforts. The data analysis task was consequently broken into eight subtasks, and descriptions of each subtask's activities are contained in one of the eight volumes that comprise the Phase I Data Analysis Documentation.

  5. Human factors evaluation of teletherapy: Function and task analysis. Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    Kaye, R.D.; Henriksen, K.; Jones, R. [Hughes Training, Inc., Falls Church, VA (United States); Morisseau, D.S.; Serig, D.I. [Nuclear Regulatory Commission, Washington, DC (United States). Div. of Systems Technology

    1995-07-01

    As a treatment methodology, teletherapy selectively destroys cancerous and other tissue by exposure to an external beam of ionizing radiation. Sources of radiation are either a radioactive isotope, typically Cobalt-60 (Co-60), or a linear accelerator. Records maintained by the NRC have identified instances of teletherapy misadministration where the delivered radiation dose has differed from the radiation prescription (e.g., instances where fractions were delivered to the wrong patient, to the wrong body part, or were too great or too little with respect to the defined treatment volume). Both human error and machine malfunction have led to misadministrations. Effective and safe treatment requires a concern for precision and consistency of human-human and human-machine interactions throughout the course of therapy. The present study is the first part of a series of human factors evaluations for identifying the root causes that lead to human error in the teletherapy environment. The human factors evaluations included: (1) a function and task analysis of teletherapy activities, (2) an evaluation of the human-system interfaces, (3) an evaluation of procedures used by teletherapy staff, (4) an evaluation of the training and qualifications of treatment staff (excluding the oncologists), (5) an evaluation of organizational practices and policies, and (6) an identification of problems and alternative approaches for NRC and industry attention. The present report addresses the function and task analysis of teletherapy activities and provides the foundation for the conduct of the subsequent evaluations. The report includes sections on background, methodology, a description of the function and task analysis, and use of the task analysis findings for the subsequent tasks. The function and task analysis database is also included.

  6. Underground Test Area Subproject Phase I Data Analysis Task. Volume III - Groundwater Recharge and Discharge Data Documentation Package

    Energy Technology Data Exchange (ETDEWEB)

    None

    1996-10-01

    Volume III of the documentation for the Phase I Data Analysis Task performed in support of the current Regional Flow Model, Transport Model, and Risk Assessment for the Nevada Test Site Underground Test Area Subproject contains the data covering groundwater recharge and discharge. Because of the size and complexity of the model area, a considerable quantity of data was collected and analyzed in support of the modeling efforts. The data analysis task was consequently broken into eight subtasks, and descriptions of each subtask's activities are contained in one of the eight volumes that comprise the Phase I Data Analysis Documentation.

  7. Human factors evaluation of remote afterloading brachytherapy. Volume 2, Function and task analysis

    Energy Technology Data Exchange (ETDEWEB)

    Callan, J.R.; Gwynne, J.W. III; Kelly, T.T.; Muckler, F.A. [Pacific Science and Engineering Group, San Diego, CA (United States); Saunders, W.M.; Lepage, R.P.; Chin, E. [University of California San Diego Medical Center, CA (United States). Div. of Radiation Oncology; Schoenfeld, I.; Serig, D.I. [Nuclear Regulatory Commission, Washington, DC (United States). Div. of Systems Technology

    1995-05-01

    A human factors project on the use of nuclear by-product material to treat cancer using remotely operated afterloaders was undertaken by the Nuclear Regulatory Commission. The purpose of the project was to identify factors that contribute to human error in the system for remote afterloading brachytherapy (RAB). This report documents the findings from the first phase of the project, which involved an extensive function and task analysis of RAB. This analysis identified the functions and tasks in RAB, made preliminary estimates of the likelihood of human error in each task, and determined the skills needed to perform each RAB task. The findings of the function and task analysis served as the foundation for the remainder of the project, which evaluated four major aspects of the RAB system linked to human error: human-system interfaces; procedures and practices; training and qualifications of RAB staff; and organizational practices and policies. At its completion, the project identified and prioritized areas for recommended NRC and industry attention based on all of the evaluations and analyses.

  8. Subcortical volume analysis in traumatic brain injury: the importance of the fronto-striato-thalamic circuit in task switching.

    Science.gov (United States)

    Leunissen, Inge; Coxon, James P; Caeyenberghs, Karen; Michiels, Karla; Sunaert, Stefan; Swinnen, Stephan P

    2014-02-01

    Traumatic brain injury (TBI) is associated with neuronal loss, diffuse axonal injury and executive dysfunction. Whereas executive dysfunction has traditionally been associated with prefrontal lesions, ample evidence suggests that those functions requiring behavioral flexibility critically depend on the interaction between frontal cortex, basal ganglia and thalamus. To test whether structural integrity of this fronto-striato-thalamic circuit can account for executive impairments in TBI we automatically segmented the thalamus, putamen and caudate of 25 patients and 21 healthy controls and obtained diffusion weighted images. We assessed components of executive function using the local-global task, which requires inhibition, updating and switching between actions. Shape analysis revealed localized atrophy of the limbic, executive and rostral-motor zones of the basal ganglia, whereas atrophy of the thalami was more global in TBI. This subcortical atrophy was related to white matter microstructural organization in TBI, suggesting that axonal injuries possibly contribute to subcortical volume loss. Global volume of the nuclei showed no clear relationship with task performance. However, the shape analysis revealed that participants with smaller volume of those subregions that have connections with the prefrontal cortex and rostral motor areas showed higher switch costs and mixing costs, and made more errors while switching. These results support the idea that flexible cognitive control over action depends on interactions within the fronto-striato-thalamic circuit.

  9. Blade loss transient dynamics analysis, volume 1. Task 2: TETRA 2 theoretical development

    Science.gov (United States)

    Gallardo, Vincente C.; Black, Gerald

    1986-01-01

    The theoretical development of the forced steady state analysis of the structural dynamic response of a turbine engine having nonlinear connecting elements is discussed. Based on modal synthesis and the principle of harmonic balance, the governing relations are the compatibility of displacements at the nonlinear connecting elements. There are four displacement compatibility equations at each nonlinear connection, which are solved by iteration for the principal harmonic of the excitation frequency. The resulting computer program, TETRA 2, combines the original TETRA transient analysis (with flexible bladed disk) with the steady state capability. A more versatile nonlinear rub or bearing element, which contains a hardening (or softening) spring, with or without deadband, is also incorporated.

  10. Blade loss transient dynamics analysis, volume 1. Task 1: Survey and perspective. [aircraft gas turbine engines

    Science.gov (United States)

    Gallardo, V. C.; Gaffney, E. F.; Bach, L. J.; Stallone, M. J.

    1981-01-01

    An analytical technique was developed to predict the behavior of a rotor system subjected to sudden unbalance. The technique is implemented in the Turbine Engine Transient Rotor Analysis (TETRA) computer program using the component element method. The analysis was particularly aimed toward blade-loss phenomena in gas turbine engines. A dual-rotor, casing, and pylon structure can be modeled by the computer program. Blade tip rubs, Coriolis forces, and mechanical clearances are included. The analytical system was verified by modeling and simulating actual test conditions for a rig test as well as a full-engine, blade-release demonstration.

  11. Genetic Inventory Task Final Report. Volume 2

    Science.gov (United States)

    Venkateswaran, Kasthuri; LaDuc, Myron T.; Vaishampayan, Parag

    2012-01-01

    Contaminant terrestrial microbiota could profoundly impact the scientific integrity of extraterrestrial life-detection experiments. It is therefore important to know what organisms persist on spacecraft surfaces so that their presence can be eliminated or discriminated from authentic extraterrestrial biosignatures. Although there is a growing understanding of the biodiversity associated with spacecraft and cleanroom surfaces, it remains challenging to assess the risk of these microbes confounding life-detection or sample-return experiments. A key challenge is to provide a comprehensive inventory of microbes present on spacecraft surfaces. To assess the phylogenetic breadth of microorganisms on spacecraft and associated surfaces, the Genetic Inventory team used three technologies: conventional cloning techniques, PhyloChip DNA microarrays, and 454 tag-encoded pyrosequencing, together with a methodology to systematically collect, process, and archive nucleic acids. These three analysis methods yielded considerably different results: Traditional approaches provided the least comprehensive assessment of microbial diversity, while PhyloChip and pyrosequencing illuminated more diverse microbial populations. The overall results stress the importance of selecting sample collection and processing approaches based on the desired target and required level of detection. The DNA archive generated in this study can be made available to future researchers as genetic-inventory-oriented technologies further mature.

  12. [Environmental investigation of ground water contamination at Wright-Patterson Air Force Base, Ohio]. Volume 3, Sampling and analysis plan (SAP): Phase 1, Task 4, Field Investigation: Draft

    Energy Technology Data Exchange (ETDEWEB)

    1991-10-01

    In April 1990, Wright-Patterson Air Force Base (WPAFB) initiated an investigation to evaluate a potential Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) removal action to prevent, to the extent practicable, the offsite migration of contaminated ground water from WPAFB. WPAFB retained the services of the Environmental Management Operations (EMO) and its principal subcontractor, International Technology Corporation (IT), to complete Phase 1 of the environmental investigation of ground-water contamination at WPAFB. Phase 1 of the investigation involves the short-term evaluation and potential design for a program to remove ground-water contamination that appears to be migrating across the western boundary of Area C, and across the northern boundary of Area B along Springfield Pike. Primarily, Task 4 of Phase 1 focuses on collection of information at the Area C and Springfield Pike boundaries of WPAFB. This Sampling and Analysis Plan (SAP) has been prepared to assist in completion of the Task 4 field investigation and comprises the Quality Assurance Project Plan (QAPP) and the Field Sampling Plan (FSP).

  13. Using Task Data in Diagnostic Radiology. Research Report No. 8. Volume 2. Curriculum Objectives for Radiologic Technology.

    Science.gov (United States)

    Gilpatrick, Eleanor; Gullion, Christina

    This volume is the result of the application of the Health Services Mobility Study (HSMS) curriculum design method in radiologic technology and is presented in conjunction with volume 1, which reports the task analysis results. Volume 2 contains job-related behavioral curriculum objectives for the aide, technician, and technologist levels in…

  14. Bare-Hand Volume Cracker for Raw Volume Data Analysis

    Directory of Open Access Journals (Sweden)

    Bireswar Laha

    2016-09-01

    Analysis of raw volume data generated from different scanning technologies faces a variety of challenges, related to search, pattern recognition, spatial understanding, quantitative estimation, and shape description. In a previous study, we found that the Volume Cracker (VC) 3D interaction (3DI) technique mitigated some of these problems, but this result was from a tethered glove-based system with users analyzing simulated data. Here, we redesigned the VC by using untethered bare-hand interaction with real volume datasets, with a broader aim of adoption of this technique in research labs. We developed symmetric and asymmetric interfaces for the Bare-Hand Volume Cracker (BHVC) through design iterations with a biomechanics scientist. We evaluated our asymmetric BHVC technique against standard 2D and widely used 3D interaction techniques with experts analyzing scanned beetle datasets. We found that our BHVC design significantly outperformed the other two techniques. This study contributes a practical 3DI design for scientists, documents lessons learned while redesigning for bare-hand trackers, and provides evidence suggesting that 3D interaction could improve volume data analysis for a variety of visual analysis tasks. Our contribution is in the realm of 3D user interfaces tightly integrated with visualization, for improving the effectiveness of visual analysis of volume datasets. Based on our experience, we also provide some insights into hardware-agnostic principles for design of effective interaction techniques.

  15. Analysis of the structural parameters that influence gas production from the Devonian shale. Volume 1. Executive Summary and Task Reports. Annual progress report

    Energy Technology Data Exchange (ETDEWEB)

    Shumaker, R.C.; de Wys, J.N.; Dixon, J.M.

    1978-10-01

    The first portion of the report, from the Executive Summary (page 1) through the Schedule of Milestones (page 10), gives a general overview which highlights our progress and problems for the second year. The Task report portion of the text, written by individual task investigators, is designed primarily for scientists interested in technical details of the second year's work. The second portion of the report consists of appendices of data compiled by the principal investigators.

  16. Task Descriptions in Diagnostic Radiology. Research Report No. 7. Volume 2, Radiologic Technologist Tasks Dealing with Patient Procedures. Part I: Tasks 7 through 386.

    Science.gov (United States)

    Gilpatrick, Eleanor

    Part I of the second of four volumes in Research Report No. 7 of the Health Services Mobility Study (HSMS), this book contains 76 task descriptions covering most of the medical activities carried out by radiologic technologists. Chapter I of this volume defines "tasks" and tells how the descriptions were developed. Chapter 2 lists the…

  17. Deriving directions through procedural task analysis.

    Science.gov (United States)

    Yuen, H K; D'Amico, M

    1998-01-01

    Task analysis is one of the essential components of activity analysis. Procedural task analysis involves breaking down an activity into a sequence of steps. Directions are the sequence of steps resulting from the task analysis (i.e., the product of the task analysis). Directions become a guide that caregivers or trainers use in teaching clients a specific skill. However, occupational therapy students often have difficulty in writing directions that are clear enough for caregivers or trainers to carry out. Books on activity analysis only provide examples of directions without giving guidelines on how to perform the writing process. The purposes of this paper are to describe the process of procedural task analysis and to provide a guideline for writing steps of directions.

  18. A Brief Analysis of Communication Tasks in Task-based Teaching

    Institute of Scientific and Technical Information of China (English)

    Xu Xiaoying

    2011-01-01

    Task-Based Language Teaching (TBLT) aims at providing opportunities for learners to experiment with and explore both spoken and written language through learning activities. This paper further examines whether four types of communicative tasks (jigsaw tasks, role-play tasks, problem-solving tasks, and information-gap tasks) can assist classroom learning.

  19. Analysis of Human Communication during Assembly Tasks.

    Science.gov (United States)

    1986-06-01

    Analysis of Human Communication During Assembly Tasks. K. Suzanne Barber and Gerald J. Agin, Robotics Institute, Carnegie-Mellon University, Pittsburgh, PA. Interim report, CMU-RI-TR-86-1.

  20. Task Analysis of the UH-60 Mission and Decision Rules for Developing a UH-60 Workload Prediction Model. Volume 1. Summary Report

    Science.gov (United States)

    1989-02-01

    as a baseline for all proposed model changes or other proposed multistage improvement program (MSIP). A computer model of this analysis was used to... support in the coordination of activities with F Company. The authors wish to thank Ms. Cassandra Hocutt, Anacapa Sciences, Inc., for her assistance in... develop smooth-flowing function and segment decision rules. Her assistance is greatly appreciated. The authors also wish to thank Ms. Nadine McCollim

  1. Task Descriptions in Diagnostic Radiology. Research Report No. 7. Volume 2, Radiologic Technologist Tasks Dealing with Patient Procedures. Part II: Tasks 387 through 526.

    Science.gov (United States)

    Gilpatrick, Eleanor

    Part II of the second of four volumes in Research Report No. 7 of the Health Services Mobility Study (HSMS), this book is the remainder of Chapter 3, which contains 76 task descriptions covering most of the medical activities carried out by radiologic technologists. The steps of the task descriptions are presented in logical sequence in…

  2. An ergonomic task analysis of spinal anaesthesia.

    LENUS (Irish Health Repository)

    Ajmal, Muhammad

    2009-12-01

    Ergonomics is the study of physical interaction between humans and their working environment. The objective of this study was to characterize the performance of spinal anaesthesia in an acute hospital setting, applying ergonomic task analysis.

  3. Task force on compliance and enforcement. Final report. Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    1978-03-01

    Recommendations for measures to strengthen the FEA enforcement program in the area of petroleum price regulation are presented. Results of task force efforts are presented in report and recommendations sections concerned with pending cases, compliance program organization, enforcement powers, compliance strategy, and audit staffing and techniques. (JRD)

  4. A Cognitive Task Analysis for Dental Hygiene.

    Science.gov (United States)

    Cameron, Cheryl A.; Beemsterboer, Phyllis L.; Johnson, Lynn A.; Mislevy, Robert J.; Steinberg, Linda S.; Breyer, F. Jay

    2000-01-01

    As part of the development of a scoring algorithm for a simulation-based dental hygiene initial licensure examination, this effort conducted a task analysis of the dental hygiene domain. Broad classes of behaviors that distinguish along the dental hygiene expert-novice continuum were identified and applied to the design of nine paper-based cases…

  5. National facilities study. Volume 5: Space research and development facilities task group

    Science.gov (United States)

    1994-01-01

    With the beginnings of the U.S. space program, there was a pressing need to develop facilities that could support the technology research and development, testing, and operations of evolving space systems. Redundancy in facilities that was once an advantage in providing flexibility and schedule accommodation is instead fast becoming a burden on scarce resources. As a result, there is a clear perception in many sectors that the U.S. has many space R&D facilities that are under-utilized and which are no longer cost-effective to maintain. At the same time, it is clear that the U.S. continues to possess many space R&D facilities which are the best -- or among the best -- in the world. In order to remain world class in key areas, careful assessment of current capabilities and planning for new facilities is needed. The National Facility Study (NFS) was initiated in 1992 to develop a comprehensive and integrated long-term plan for future aerospace facilities that meets current and projected government and commercial needs. In order to assess the nation's capability to support space research and development (R&D), a Space R&D Task Group was formed. The Task Group was co-chaired by NASA and DOD. The Task Group formed four major, technologically- and functionally-oriented working groups: Human and Machine Operations; Information and Communications; Propulsion and Power; and Materials, Structures, and Flight Dynamics. In addition to these groups, three supporting working groups were formed: Systems Engineering and Requirements; Strategy and Policy; and Costing Analysis. The Space R&D Task Group examined several hundred facilities against the template of a baseline mission and requirements model (developed in common with the Space Operations Task Group) and a set of excursions from the baseline. The model and excursions are described in Volume 3 of the NFS final report. In addition, as a part of the effort, the group examined key strategic issues associated with space R&D.

  6. Final report on the Pathway Analysis Task

    Energy Technology Data Exchange (ETDEWEB)

    Whicker, F.W.; Kirchner, T.B. [Colorado State Univ., Fort Collins, CO (United States)

    1993-04-01

    The Pathway Analysis Task constituted one of several multi-laboratory efforts to estimate radiation doses to people, considering all important pathways of exposure, from the testing of nuclear devices at the Nevada Test Site (NTS). The primary goal of the Pathway Analysis Task was to predict radionuclide ingestion by residents of Utah, Nevada, and portions of seven other adjoining western states following radioactive fallout deposition from individual events at the NTS. This report provides comprehensive documentation of the activities and accomplishments of Colorado State University's Pathway Analysis Task during the entire period of support (1979--91). The history of the project will be summarized, indicating the principal dates and milestones, personnel involved, subcontractors, and budget information. Accomplishments, both primary and auxiliary, will be summarized with general results rather than technical details being emphasized. This will also serve as a guide to the reports and open literature publications produced, where the methodological details and specific results are documented. Selected examples of results on internal dose estimates are provided in this report because the data have not been published elsewhere.

  7. Research and development of a heat-pump water heater. Volume 2. R and D task reports

    Energy Technology Data Exchange (ETDEWEB)

    Dunning, R.L.; Amthor, F.R.; Doyle, E.J.

    1978-08-01

    The heat pump water heater is a device that works much like a window air conditioner except that heat from the home is pumped into a water tank rather than to the outdoors. The objective established for the device is to operate with a Coefficient of Performance (COP) of 3; that is, an input of one unit of electric energy would create three units of heat energy in the form of hot water. With such a COP, the device would use only one-third the energy, and incur only one-third the cost, of a standard resistance water heater. This Volume 2 contains the final reports of the three major tasks performed in Phase I. In Task 2, a market study identifies the future market and selects an initial target market and channel of distribution, all based on an analysis of the parameters affecting feasibility of the device and the factors that will affect its market acceptance. In the Task 3 report, the results of a design and test program to arrive at final designs of heat pumps for both new water heaters and for retrofitting existing water heaters are presented. In the Task 4 report, a plan for an extensive field demonstration involving use in actual homes is presented. Volume 1 contains a final summary report of the information in Volume 2.
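
    The COP claim is simple arithmetic and easy to check. Below is a minimal sketch in Python; the tank size and temperature rise are illustrative assumptions, not figures from the report.

    ```python
    # Heat required to warm a tank of water: Q = m * c * dT.
    SPECIFIC_HEAT_WATER = 4186.0  # J/(kg*K)

    def heater_input_kwh(mass_kg, delta_t_k, cop):
        """Electrical input needed to deliver the heat Q at a given COP."""
        heat_joules = mass_kg * SPECIFIC_HEAT_WATER * delta_t_k
        return heat_joules / cop / 3.6e6  # convert J to kWh

    # Assumed example: 150 kg (about 150 L) of water warmed by 45 K.
    resistance = heater_input_kwh(150, 45, cop=1.0)  # resistance heater
    heat_pump = heater_input_kwh(150, 45, cop=3.0)   # heat pump at COP 3
    print(f"resistance: {resistance:.2f} kWh, heat pump: {heat_pump:.2f} kWh")
    print(f"energy ratio: {heat_pump / resistance:.2f}")  # 0.33, one-third
    ```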

  8. Data analysis & probability task & drill sheets

    CERN Document Server

    Cook, Tanya

    2011-01-01

    For grades 3-5, our State Standards-based combined resource meets the data analysis & probability concepts addressed by the NCTM standards and encourages your students to review the concepts in unique ways. The task sheets introduce the mathematical concepts to the students around a central problem taken from real-life experiences, while the drill sheets provide warm-up and timed practice questions for the students to strengthen their procedural proficiency skills. Included in our resource are activities to help students learn how to collect, organize, analyze, interpret, and predict data.

  9. Task Decomposition in Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald Laurids [Idaho National Laboratory; Joe, Jeffrey Clark [Idaho National Laboratory

    2014-06-01

    In the probabilistic safety assessments (PSAs) used in the nuclear industry, human failure events (HFEs) are determined as a subset of hardware failures, namely those hardware failures that could be triggered by human action or inaction. This approach is top-down, starting with hardware faults and deducing human contributions to those faults. Elsewhere, more traditional human-factors-driven approaches would tend to look at opportunities for human errors first in a task analysis and then identify which of those errors is risk significant. The intersection of top-down and bottom-up approaches to defining HFEs has not been carefully studied. Ideally, both approaches should arrive at the same set of HFEs. This question remains central as human reliability analysis (HRA) methods are generalized to new domains like oil and gas. The HFEs used in nuclear PSAs tend to be top-down, defined as a subset of the PSA, whereas the HFEs used in petroleum quantitative risk assessments (QRAs) are more likely to be bottom-up, derived from a task analysis conducted by human factors experts. The marriage of these approaches is necessary in order to ensure that HRA methods developed for top-down HFEs are also sufficient for bottom-up applications.

  10. Task Analysis in Instructional Program Development. Theoretical Paper No. 52.

    Science.gov (United States)

    Bernard, Michael E.

    A review of task analysis procedures beginning with the military training and systems development approach and covering the more recent work of Gagne, Klausmeier, Merrill, Resnick, and others is presented along with a plan for effective instruction based on the review of task analysis. Literature dealing with the use of task analysis in programmed…

  11. Cognitive task analysis: Techniques applied to airborne weapons training

    Energy Technology Data Exchange (ETDEWEB)

    Terranova, M.; Seamster, T.L.; Snyder, C.E.; Treitler, I.E. (Oak Ridge National Lab., TN (USA); Carlow Associates, Inc., Fairfax, VA (USA); Martin Marietta Energy Systems, Inc., Oak Ridge, TN (USA); Tennessee Univ., Knoxville, TN (USA))

    1989-01-01

    This is an introduction to cognitive task analysis as it may be used in Naval Air Systems Command (NAVAIR) training development. The focus of a cognitive task analysis is human knowledge, and its methods of analysis are those developed by cognitive psychologists. This paper explains the role of cognitive task analysis in training development and presents the findings from a preliminary cognitive task analysis of airborne weapons operators. Cognitive task analysis is a collection of powerful techniques that are quantitative, computational, and rigorous. The techniques are currently not in wide use in the training community, so examples of this methodology are presented along with the results. 6 refs., 2 figs., 4 tabs.

  12. Cue Representation and Situational Awareness in Task Analysis

    Science.gov (United States)

    Carl, Diana R.

    2009-01-01

    Task analysis in human performance technology is used to determine how human performance can be well supported with training, job aids, environmental changes, and other interventions. Early work by Miller (1953) and Gilbert (1969, 1974) addressed cue processing in task execution and recommended cue descriptions in task analysis. Modern task…

  13. Analysis of Task-based Syllabus

    Institute of Scientific and Technical Information of China (English)

    马进胜

    2011-01-01

    Task-based language teaching is very popular in modern English teaching. It is based on the task-based syllabus, which focuses on the learners' communicative competence and stresses learning by doing. Starting from the theoretical assumptions and definitions of the task, the paper analyzes the components of the task and then points out the merits and demerits of the syllabus. By this means the paper may give some tips to teachers and students when they use task-based language teaching.

  14. Spatial Visualization Tasks To Support Students’ Spatial Structuring In Learning Volume Measurement

    Directory of Open Access Journals (Sweden)

    Shintia Revina

    2011-07-01

    Many prior studies have found that most students in grade five tend to have difficulty in fully grasping the concept of volume measurement because they have to build their competence in spatial structuring. The unit of volume “packing” measurement must be integrated and coordinated in three dimensions. On the other hand, the errors that students make on volume measurement tasks with three-dimensional cube arrays are related to some aspects of spatial visualization, such as the skill to "read off" two-dimensional representations of solid objects. For those reasons, this research aims to develop classroom activities with the use of spatial visualization tasks to support students’ spatial structuring in learning volume measurement. Consequently, design research was chosen as an appropriate means to achieve this research goal. In this research, a sequence of instructional activities is designed and developed based on the hypothesis of students’ learning processes. This research was conducted in grade 5 of SD Pupuk Sriwijaya Palembang, Indonesia. Keywords: volume measurement, spatial structuring, spatial visualization, design research. DOI: http://dx.doi.org/10.22342/jme.2.2.745.127-146

  15. Workplace for analysis of task performance

    NARCIS (Netherlands)

    Bos, J; Mulder, LJM; van Ouwerkerk, RJ; Maarse, FJ; Akkerman, AE; Brand, AN; Mulder, LJM

    2003-01-01

    In current research on mental workload and task performance, a large gap exists between laboratory-based studies and research projects in real-life working practice. Tasks conducted within a laboratory environment often lack a strong resemblance to real-life working situations. This paper presents

  16. IEA Wind Task 24 Integration of Wind and Hydropower Systems; Volume 2: Participant Case Studies

    Energy Technology Data Exchange (ETDEWEB)

    Acker, T.

    2011-12-01

    This report describes the background, concepts, issues and conclusions related to the feasibility of integrating wind and hydropower, as investigated by the members of IEA Wind Task 24. It is the result of a four-year effort involving seven IEA member countries and thirteen participating organizations. The companion report, Volume 2, describes in detail the study methodologies and participant case studies, and exists as a reference for this report.

  17. Task Analysis Assessment on Intrastate Bus Traffic Controllers

    Science.gov (United States)

    Yen Bin, Teo; Azlis-Sani, Jalil; Nur Annuar Mohd Yunos, Muhammad; Ismail, S. M. Sabri S. M.; Tajedi, Noor Aqilah Ahmad

    2016-11-01

    Public transportation acts as social mobility and caters to the daily needs of society, carrying passengers from one place to another. This is true for a country like Malaysia, where international trade has been growing significantly over the past few decades. Task analysis assessment was conducted with consideration of the cognitive ergonomic view of problems related to human factors. Conducting research on the task analysis of bus traffic controllers allowed a better understanding of the nature of the work and the overall monitoring activities of the bus services. This paper studies the task analysis assessment of intrastate bus traffic controllers. Task analysis for the bus traffic controllers was developed via Hierarchical Task Analysis (HTA). There are a total of five subsidiary tasks on level one, and only two could be further broken down at level two. Development of the HTA allowed a better understanding of the work and could further ease the evaluation of the tasks conducted by the bus traffic controllers. Thus, human error could be reduced for the safety of all passengers, and the overall efficiency of the system increased. Besides, it could assist in improving the operation of the bus traffic controllers by modelling or synthesizing the existing tasks if necessary.
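
    HTA output is naturally tree-structured: a top-level goal, numbered subsidiary tasks, and optional further decomposition. Below is a minimal sketch of how such a hierarchy might be encoded and printed (Python); the task names are hypothetical placeholders, not the actual HTA from this study.

    ```python
    # Hypothetical two-level HTA: five level-one subsidiary tasks, of which
    # only two are decomposed further (mirroring the structure reported above).
    hta = {
        "0. Monitor intrastate bus operations": {
            "1. Track bus locations": [],
            "2. Handle schedule deviations": ["2.1 Identify the delayed bus",
                                              "2.2 Notify affected terminals"],
            "3. Communicate with drivers": ["3.1 Receive status calls",
                                            "3.2 Issue rerouting instructions"],
            "4. Log incidents": [],
            "5. Coordinate with maintenance": [],
        }
    }

    def print_hta(node, indent=0):
        """Depth-first walk of the HTA, printed as an indented plan."""
        if isinstance(node, dict):
            for task, subtasks in node.items():
                print(" " * indent + task)
                print_hta(subtasks, indent + 2)
        else:  # a list of leaf subtasks
            for task in node:
                print(" " * indent + task)

    print_hta(hta)
    ```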

  18. The development of a task analysis method applicable to the tasks of nuclear power plant

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, Wan Chul; Park, Ji Soo; Baek, Dong Hyeon; Ham, Dong Han; Kim, Huhn [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1996-07-01

    While task analysis is one of the essential processes for human factors studies, traditional methods reveal weaknesses in dealing with cognitive aspects, which become more critical in modern complex systems. This report proposes a cognitive task analysis (CTA) method for identifying the cognitive requirements of operators' tasks in nuclear power plants. The proposed CTA method is characterized by an information-oriented concept and a procedure-based approach. The task prescription identifies the information requirements and traces the information flow to reveal the cognitive organization of the task procedure, with emphasis on the relations among the information requirements. The cognitive requirements are then analyzed in terms of the cognitive span of task information, the cognitive envelope and working memory relief points of the procedures, and working memory load. The proposed method is relatively simple and, possibly being incorporated in a full task analysis scheme, directly applicable to the design/evaluation of human-machine interfaces and operating procedures. A prototype of a computerized support system is developed to support the practicality of the proposed method. (Author) 104 refs., 8 tabs., 7 figs.

  19. High volume data storage architecture analysis

    Science.gov (United States)

    Malik, James M.

    1990-01-01

    A High Volume Data Storage Architecture Analysis was conducted. The results, presented in this report, will be applied to problems of high volume data requirements such as those anticipated for the Space Station Control Center. High volume data storage systems at several different sites were analyzed for archive capacity, storage hierarchy and migration philosophy, and retrieval capabilities. Proposed architectures were solicited from the sites selected for in-depth analysis. Model architectures for a hypothetical data archiving system, for a high speed file server, and for high volume data storage are attached.

  1. A Task-Content Analysis of an Introductory Entomology Curriculum.

    Science.gov (United States)

    Brandenburg, R.

    Described is an analysis of the content, tasks, and strategies needed by students to enable them to identify insects to order by sight and to family by use of a standard dichotomous taxonomic key. Tasks and strategies are broken down and arranged progressively in the approximate order in which students should progress. Included are listings of…

  2. Professional Development for Mathematics Teachers: Using Task Design and Analysis

    Science.gov (United States)

    Lee, Hea-Jin; Özgün-Koca, S. Asli

    2016-01-01

    This study is based on a Task Design and Analysis activity from a year-long professional development program. The activity was designed to increase teacher growth in several areas, including knowledge of mathematics, understanding of students' cognitive activity, knowledge of good questions, and ability to develop and improve high quality tasks.…

  3. A Task that Elicits Reasoning: A Dual Analysis

    Science.gov (United States)

    Yankelewitz, Dina; Mueller, Mary; Maher, Carolyn A.

    2010-01-01

    This paper reports on the forms of reasoning elicited as fourth grade students in a suburban district and sixth grade students in an urban district worked on similar tasks involving reasoning with the use of Cuisenaire rods. Analysis of the two data sets shows similarities in the reasoning used by both groups of students on specific tasks, and the…

  4. Sentiment Analysis of Suicide Notes: A Shared Task.

    Science.gov (United States)

    Pestian, John P; Matykiewicz, Pawel; Linn-Gust, Michelle; South, Brett; Uzuner, Ozlem; Wiebe, Jan; Cohen, K Bretonnel; Hurdle, John; Brew, Christopher

    2012-01-30

    This paper reports on a shared task involving the assignment of emotions to suicide notes. Two features distinguished this task from previous shared tasks in the biomedical domain. One is that it resulted in a corpus of fully anonymized clinical text and annotated suicide notes. This resource is permanently available and will (we hope) facilitate future research. The other key feature of the task is that it required categorization with respect to a large set of labels. The number of participants was larger than in any previous biomedical challenge task. We describe the data production process and the evaluation measures, and give a preliminary analysis of the results. Many systems performed at levels approaching the inter-coder agreement, suggesting that human-like performance on this task is within the reach of currently available technologies.
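
    Categorization against a large label set of this kind is commonly scored with micro-averaged precision, recall, and F1 over (note, label) pairs. A minimal sketch follows (Python); the labels and predictions are invented, and this is a generic scorer, not the shared task's official evaluation code.

    ```python
    def micro_prf(gold, predicted):
        """Micro-averaged precision/recall/F1 over sets of (item, label) pairs."""
        tp = len(gold & predicted)
        p = tp / len(predicted) if predicted else 0.0
        r = tp / len(gold) if gold else 0.0
        f1 = 2 * p * r / (p + r) if (p + r) else 0.0
        return p, r, f1

    gold = {(1, "hopelessness"), (1, "love"), (2, "guilt")}
    pred = {(1, "hopelessness"), (2, "guilt"), (2, "anger")}
    print(micro_prf(gold, pred))  # (0.667, 0.667, 0.667) up to rounding
    ```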

  5. Swept Volume Parameterization for Isogeometric Analysis

    Science.gov (United States)

    Aigner, M.; Heinrich, C.; Jüttler, B.; Pilgerstorfer, E.; Simeon, B.; Vuong, A.-V.

    Isogeometric Analysis uses NURBS representations of the domain for performing numerical simulations. The first part of this paper presents a variational framework for generating NURBS parameterizations of swept volumes. The class of these volumes covers a number of interesting free-form shapes, such as blades of turbines and propellers, ship hulls or wings of airplanes. The second part of the paper reports the results of isogeometric analysis which were obtained with the help of the generated NURBS volume parameterizations. In particular we discuss the influence of the chosen parameterization and the incorporation of boundary conditions.
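
    For reference, a trivariate NURBS volume takes the standard textbook form below (the usual notation, not an equation copied from the paper): control points P_ijk, weights w_ijk, and B-spline basis functions N of degrees p, q, and r in the three parameters. Isogeometric analysis then represents the unknown fields in the same basis.

    ```latex
    \mathbf{V}(u,v,w) =
    \frac{\sum_{i=0}^{n}\sum_{j=0}^{m}\sum_{k=0}^{l}
          w_{ijk}\,\mathbf{P}_{ijk}\,N_i^{p}(u)\,N_j^{q}(v)\,N_k^{r}(w)}
         {\sum_{i=0}^{n}\sum_{j=0}^{m}\sum_{k=0}^{l}
          w_{ijk}\,N_i^{p}(u)\,N_j^{q}(v)\,N_k^{r}(w)}
    ```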

  6. International data collection and analysis. Task 1

    Energy Technology Data Exchange (ETDEWEB)

    Fox, J.B.; Stobbs, J.J.; Collier, D.M.; Hobbs, J.S.

    1979-04-01

    Commercial nuclear power has grown to the point where 13 nations now operate commercial nuclear power plants. Another four countries should join this list before the end of 1980. In the Nonproliferation Alternative Systems Assessment Program (NASAP), the US DOE is evaluating a series of alternate possible power systems. The objective is to determine practical nuclear systems which could reduce proliferation risk while still maintaining the benefits of nuclear power. Part of that effort is the development of a data base denoting the energy needs, resources, technical capabilities, commitment to nuclear power, and projected future trends for various non-US countries. The data are presented by country for each of 28 non-US countries. Data are compiled in this volume on Canada, Egypt, Federal Republic of Germany, Finland, and France.

  7. International data collection and analysis. Task 1

    Energy Technology Data Exchange (ETDEWEB)

    1979-04-01

    Commercial nuclear power has grown to the point where 13 nations now operate commercial nuclear power plants. Another four countries should join this list before the end of 1980. In the Nonproliferation Alternative Systems Assessment Program (NASAP), the US DOE is evaluating a series of alternate possible power systems. The objective is to determine practical nuclear systems which could reduce proliferation risk while still maintaining the benefits of nuclear power. Part of that effort is the development of a data base denoting the energy needs, resources, technical capabilities, commitment to nuclear power, and projected future trends for various non-US countries. The data are presented by country for each of 28 non-US countries. This volume contains compiled data on Mexico, Netherlands, Pakistan, Philippines, South Africa, South Korea, and Spain.

  8. Units of analysis in task-analytic research.

    Science.gov (United States)

    Haring, T G; Kennedy, C H

    1988-01-01

    We develop and discuss four criteria for evaluating the appropriateness of units of analysis for task-analytic research and suggest potential alternatives to the units of analysis currently used. Of the six solutions discussed, the most commonly used unit of analysis in current behavior analytic work, percentage correct, meets only one of the four criteria. Five alternative units of analysis are presented and evaluated: (a) percentage of opportunities to perform meeting criterion, (b) trials to criteria, (c) cumulative competent performances, (d) percentage correct with competent performance coded, and (e) percentage correct with competent performance coded and a grid showing performance on individual steps of the task analysis. Of the solutions evaluated, only one--percentage correct with competent performance coded and a task analysis grid--met all four criteria.
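
    The contrast between these units is easy to make concrete. A minimal sketch (Python; the per-step outcomes are invented for illustration):

    ```python
    # Each session is a list of per-step outcomes for a 4-step task analysis:
    # True = step performed correctly.
    sessions = [
        [True, False, True, False],
        [True, True, True, False],
        [True, True, True, True],
    ]

    def percentage_correct(session):
        """The commonly used unit: percent of task-analysis steps correct."""
        return 100 * sum(session) / len(session)

    def trials_to_criterion(all_sessions):
        """An alternative unit: first session in which all steps are correct."""
        for n, session in enumerate(all_sessions, start=1):
            if all(session):
                return n
        return None  # criterion never met

    print([percentage_correct(s) for s in sessions])  # [50.0, 75.0, 100.0]
    print(trials_to_criterion(sessions))              # 3
    ```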

  9. International Behavior Analysis: The Operationalization Task

    Science.gov (United States)

    1976-02-15

    both source analysis and process analysis: (1) psychological; (2) political; (3) societal; (4) interstate; and (5) global. Nations have been... dimensions, generating 27 variables for 56 nations. Events are classified on the basis of six dimensions: (1) spatial; (2) relational; (3) temporal; (4)... stimuli. Accordingly, it is possible to identify three domestic (or internal) and two foreign (external) stimuli: (1) psychological; (2) political

  10. Recurrence interval analysis of trading volumes.

    Science.gov (United States)

    Ren, Fei; Zhou, Wei-Xing

    2010-06-01

    We study the statistical properties of the recurrence intervals τ between successive trading volumes exceeding a certain threshold q. The recurrence interval analysis is carried out for the 20 liquid Chinese stocks covering a period from January 2000 to May 2009, and two Chinese indices from January 2003 to April 2009. Similar to the recurrence interval distribution of the price returns, the tail of the recurrence interval distribution of the trading volumes follows a power-law scaling, and the results are verified by the goodness-of-fit tests using the Kolmogorov-Smirnov (KS) statistic, the weighted KS statistic and the Cramér-von Mises criterion. The measurements of the conditional probability distribution and the detrended fluctuation function show that both short-term and long-term memory effects exist in the recurrence intervals between trading volumes. We further study the relationship between trading volumes and price returns based on the recurrence interval analysis method. It is found that large trading volumes are more likely to occur following large price returns, and the comovement between trading volumes and price returns is more pronounced for large trading volumes.
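
    The core computation behind the method is short: record the waiting times between successive volumes exceeding a threshold q, then examine the tail of their distribution. A minimal sketch (Python with NumPy; synthetic lognormal data stand in for the actual stock volumes):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    volumes = rng.lognormal(mean=0.0, sigma=1.0, size=10_000)  # synthetic

    def recurrence_intervals(series, q):
        """Intervals (in samples) between successive values exceeding q."""
        exceedances = np.flatnonzero(series > q)
        return np.diff(exceedances)

    taus = recurrence_intervals(volumes, q=np.quantile(volumes, 0.95))

    # Empirical survival function of tau; a power-law tail would appear
    # roughly linear on log-log axes.
    tau_sorted = np.sort(taus)
    survival = 1.0 - np.arange(1, len(tau_sorted) + 1) / len(tau_sorted)
    print(tau_sorted[:5], survival[:5])
    ```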

  11. User's operating procedures. Volume 2: Scout project financial analysis program

    Science.gov (United States)

    Harris, C. G.; Haris, D. K.

    1985-01-01

    A review is presented of the user's operating procedures for the Scout Project Automatic Data System, called SPADS. SPADS is the result of the past seven years of software development on a Prime mini-computer located at the Scout Project Office, NASA Langley Research Center, Hampton, Virginia. SPADS was developed as a single-entry, multiple cross-reference data management and information retrieval system for the automation of Project office tasks, including engineering, financial, managerial, and clerical support. This volume, two (2) of three (3), provides the instructions to operate the Scout Project Financial Analysis program in data retrieval and file maintenance via the user-friendly menu drivers.

  12. Bifilar analysis study, volume 1

    Science.gov (United States)

    Miao, W.; Mouzakis, T.

    1980-01-01

    A coupled rotor/bifilar/airframe analysis was developed and utilized to study the dynamic characteristics of the centrifugally tuned, rotor-hub-mounted, bifilar vibration absorber. The analysis contains the major components that impact the bifilar absorber performance, namely, an elastic rotor with hover aerodynamics, a flexible fuselage, and nonlinear individual degrees of freedom for each bifilar mass. Airspeed, rotor speed, bifilar mass and tuning variations are considered. The performance of the bifilar absorber is shown to be a function of its basic parameters: dynamic mass, damping and tuning, as well as the impedance of the rotor hub. The effect of the dissimilar responses of the individual bifilar masses which are caused by tolerance induced mass, damping and tuning variations is also examined.

  13. Designing and Examining the Effects of a Dynamic Geometry Task Analysis Framework on Teachers' Written Geometer's Sketchpad Tasks

    Science.gov (United States)

    Trocki, Aaron David

    2015-01-01

    This study investigates the usefulness of a Dynamic Geometry Task Analysis Framework for indicating task quality in dynamic geometry environments in general, and The Geometer's Sketchpad in particular. This research sought to first establish the validity of the framework for indicating task quality, and to second explore the effects of the…

  14. Probabilistic analysis of manipulation tasks: A conceptual framework

    Energy Technology Data Exchange (ETDEWEB)

    Brost, R.C. [Sandia National Lab., Albuquerque, NM (United States); Christiansen, A.D. [Tulane Univ., New Orleans, LA (United States)

    1996-02-01

    This article addresses the problem of manipulation planning in the presence of uncertainty. We begin by reviewing the worst-case planning techniques introduced by Lozano-Perez et al. (1984) and show that these methods are limited by an information gap inherent to worst-case analysis techniques. As the task uncertainty increases, these methods fail to produce useful information, even though a high-quality plan may exist. To fill this gap, we present the notion of a probabilistic backprojection, which describes the likelihood that a given action will achieve the task goal from a given initial state. We provide a constructive definition of the probabilistic backprojection and related probabilistic models of manipulation task mechanics and show how these models unify and enhance several past results in manipulation planning. These models capture the fundamental nature of the task behavior but appear to be very complex. We present the results of laboratory experiments, comprising over 100,000 grasping trials, that measured the probabilistic backprojection of a grasping task under varying conditions. The resulting data support the probabilistic backprojection model and illustrate a task in which probabilistic analysis is required. We sketch methods for computing these models and using them to construct multiple-step plans. 25 refs., 20 figs.
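
    A probabilistic backprojection can be read as a map from initial states to the probability that a given action achieves the goal. A minimal Monte Carlo sketch (Python; the one-step dynamics and goal interval are toy stand-ins, not the grasping mechanics measured in the article):

    ```python
    import random

    def simulate(action, state, noise=0.1):
        """Toy one-step task mechanics: the action drives the state toward
        the goal, corrupted by Gaussian uncertainty."""
        return state + action + random.gauss(0.0, noise)

    def backprojection_probability(action, state, goal=(0.9, 1.1),
                                   trials=10_000):
        """Estimate P(action achieves goal | initial state) by sampling."""
        lo, hi = goal
        hits = sum(lo <= simulate(action, state) <= hi for _ in range(trials))
        return hits / trials

    for s0 in (0.0, 0.1, 0.5):
        print(s0, backprojection_probability(action=1.0, state=s0))
    ```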

  15. Feasibility of developing a portable driver performance data acquisition system for human factors research: Technical tasks. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Carter, R.J.; Barickman, F.S.; Spelt, P.F.; Schmoyer, R.L.; Kirkpatrick, J.R.

    1998-01-01

    A two-phase, multi-year research program entitled "development of a portable driver performance data acquisition system for human factors research" was recently completed. The primary objective of the project was to develop a portable data acquisition system for crash avoidance research (DASCAR) that would allow driver performance data to be collected using a large variety of vehicle types and that would be capable of being installed on a given vehicle type within a relatively short time frame. During phase 1 a feasibility study for designing and fabricating DASCAR was conducted. In phase 2 of the research DASCAR was actually developed and validated. This technical memorandum documents the results from the feasibility study. It is subdivided into three volumes. Volume one (this report) addresses the last five items in the phase 1 research and the first issue in the second phase of the project. Volumes two and three present the related appendices and the design specifications developed for DASCAR, respectively. The six tasks were oriented toward: identifying parameters and measures; identifying analysis tools and methods; identifying measurement techniques and state-of-the-art hardware and software; developing design requirements and specifications; determining the cost of one or more copies of the proposed data acquisition system; and designing a development plan and constructing DASCAR. This report also covers: the background to the program; the requirements for the project; micro camera testing; heat load calculations for the DASCAR instrumentation package in automobile trunks; phase 2 of the research; the DASCAR hardware and software delivered to the National Highway Traffic Safety Administration; and crash avoidance problems that can be addressed by DASCAR.

  16. A Validated Task Analysis of the Single Pilot Operations Concept

    Science.gov (United States)

    Wolter, Cynthia A.; Gore, Brian F.

    2015-01-01

    The current day flight deck operational environment consists of a two-person Captain/First Officer crew. A concept of operations (ConOps) to reduce the commercial cockpit from the current two-pilot crew to a single pilot is termed Single Pilot Operations (SPO). This concept has been under study by researchers in the Flight Deck Display Research Laboratory (FDDRL) at the National Aeronautics and Space Administration's (NASA) Ames Research Center (Johnson, Comerford, Lachter, Battiste, Feary, and Mogford, 2012) and researchers from Langley Research Center (Schutte et al., 2007). Transitioning from a two-pilot crew to a single-pilot crew will undoubtedly require changes in operational procedures, crew coordination, use of automation, and in how the roles and responsibilities of the flight deck and ATC are conceptualized in order to maintain the high levels of safety expected of the US National Airspace System. These modifications will affect the roles and the subsequent tasks that are required of the various operators in the NextGen environment. The current report outlines the process taken to identify and document the tasks required by the crew according to a number of operational scenarios studied by the FDDRL between the years 2012-2014. A baseline task decomposition has been refined to represent the tasks consistent with a new set of entities, tasks, roles, and responsibilities being explored by the FDDRL as the move is made towards SPO. Information from Subject Matter Expert interviews, participation in FDDRL experimental design meetings, and study observation was used to populate and refine task sets that were developed as part of the SPO task analyses. The task analysis is based upon the proposed ConOps for the third FDDRL SPO study. This experiment possessed nine different entities operating in six scenarios using a variety of SPO-related automation and procedural activities required to guide safe and efficient aircraft operations. The task analysis presents the roles and

  17. Affordance Analysis--Matching Learning Tasks with Learning Technologies

    Science.gov (United States)

    Bower, Matt

    2008-01-01

    This article presents a design methodology for matching learning tasks with learning technologies. First a working definition of "affordances" is provided based on the need to describe the action potentials of the technologies (utility). Categories of affordances are then proposed to provide a framework for analysis. Following this, a…

  18. An Analysis of Tasks Performed in the Ornamental Horticulture Industry.

    Science.gov (United States)

    Berkey, Arthur L.; Drake, William E.

    This publication is the result of a detailed task analysis study of the ornamental horticulture industry in New York State. Nine types of horticulture businesses identified were: (1) retail florists, (2) farm and garden supply store, (3) landscape services, (4) greenhouse production, (5) nursery production, (6) turf production, (7) arborist…

  19. Partial Derivative Games in Thermodynamics: A Cognitive Task Analysis

    Science.gov (United States)

    Kustusch, Mary Bridget; Roundy, David; Dray, Tevian; Manogue, Corinne A.

    2014-01-01

    Several studies in recent years have demonstrated that upper-division students struggle with the mathematics of thermodynamics. This paper presents a task analysis based on several expert attempts to solve a challenging mathematics problem in thermodynamics. The purpose of this paper is twofold. First, we highlight the importance of cognitive task…

  20. Sandia-Power Surety Task Force Hawaii foam analysis.

    Energy Technology Data Exchange (ETDEWEB)

    McIntyre, Annie

    2010-11-01

    The Office of Secretary of Defense (OSD) Power Surety Task Force was officially created in early 2008, after nearly two years of work in demand reduction and renewable energy technologies to support the Warfighter in Theater. The OSD Power Surety Task Force is tasked with identifying efficient energy solutions that support mission requirements. Spray foam insulation demonstrations were recently expanded beyond field structures to include military housing at Ft. Belvoir. Initial results from using the foam in both applications are favorable. This project will address the remaining key questions: (1) Can this technology help to reduce utility costs for the Installation Commander? (2) Is the foam cost effective? (3) What application differences in housing affect those key metrics? The critical need for energy solutions in Hawaii and the existing relationships among Sandia, the Department of Defense (DOD), the Department of Energy (DOE), and Forest City make this location a logical choice for a foam demonstration. This project includes application and analysis of foam to a residential duplex at the Waikulu military community on Oahu, Hawaii, as well as reference to spray foam applied to a PACOM facility and additional foamed units on Maui, conducted during this project phase. This report concludes the analysis and describes the utilization of foam insulation at military housing in Hawaii and the subsequent data gathering and analysis.

  1. Working with horses: an OWAS work task analysis.

    Science.gov (United States)

    Löfqvist, L; Pinzke, S

    2011-01-01

    Most work in horse stables is performed manually in much the same way as a century ago. It is the least mechanized sector dealing with large animals. People working with horses are exposed to several types of risk for developing musculoskeletal problems, but the work tasks and workload have not been investigated in detail. The aim of this study was to estimate the postural load of the work tasks performed around horses to find those that were harmful and required measures to be taken to reduce physical strain. Altogether, 20 subjects (stable attendants and riding instructors) were video recorded while carrying out their work in the stable and while preparing and conducting riding lessons. The work was analyzed with the Ovako Working posture Analysis System (OWAS) to determine the postural load and to categorize the potential harmfulness of the work postures. Three work tasks involved about 50% of the work positions in the three OWAS categories (AC2 to AC4) where measures for improvement are needed: "mucking out" (50%), "bedding preparation" (48%), and "sweeping" (48%). These work tasks involved over 60% of work postures in which the back was bent, twisted, or both bent and twisted. Therefore, it is important to find preventive measures to reduce the workload, which could include improved tools, equipment, and work technique.

  2. Relationship between stroke volume and pulse pressure during blood volume perturbation: a mathematical analysis.

    Science.gov (United States)

    Bighamian, Ramin; Hahn, Jin-Oh

    2014-01-01

    Arterial pulse pressure has been widely used as a surrogate for stroke volume, for example, in the guidance of fluid therapy. However, recent experimental investigations suggest that arterial pulse pressure is not linearly proportional to stroke volume, and the mechanisms underlying the relation between the two have not been clearly understood. The goal of this study was to elucidate how arterial pulse pressure and stroke volume respond to a perturbation in the left ventricular blood volume based on a systematic mathematical analysis. Both our mathematical analysis and experimental data showed that the relative change in arterial pulse pressure due to a left ventricular blood volume perturbation was consistently smaller than the corresponding relative change in stroke volume, due to the nonlinear left ventricular pressure-volume relation during diastole that reduces the sensitivity of arterial pulse pressure to perturbations in the left ventricular blood volume. Therefore, arterial pulse pressure must be used with care when used as a surrogate for stroke volume in guiding fluid therapy.
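
    The abstract's central finding can be stated compactly (a sketch of the claim, with C_a a lumped arterial compliance introduced here for illustration, not a quantity from the paper):

        \[
        \frac{\Delta PP}{PP} \;<\; \frac{\Delta SV}{SV}
        \]

    Under a purely linear arterial model, \( PP = SV / C_a \), the two relative changes would be equal; the observed inequality is what the authors attribute to the nonlinear diastolic left-ventricular pressure-volume relation.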

  3. Learners Performing Tasks in a Japanese EFL Classroom: A Multimodal and Interpersonal Approach to Analysis

    Science.gov (United States)

    Stone, Paul

    2012-01-01

    In this paper I describe and analyse learner task-based interactions from a multimodal perspective with the aim of better understanding how learners' interpersonal relationships might affect task performance. Task-based pedagogy is focused on classroom interaction between learners, yet analysis of tasks has often neglected the analysis of this…

  4. Pressurized fluidized-bed hydroretorting of eastern oil shales. Volume 4, Task 5, Operation of PFH on beneficiated shale, Task 6, Environmental data and mitigation analyses and Task 7, Sample procurement, preparation, and characterization: Final report, September 1987--May 1991

    Energy Technology Data Exchange (ETDEWEB)

    1992-03-01

    The objective of Task 5 (Operation of Pressurized Fluidized-Bed Hydro-Retorting (PFH) on Beneficiated Shale) was to modify the PFH process to facilitate its use for fine-sized, beneficiated Eastern shales. This task was divided into three subtasks: Non-Reactive Testing, Reactive Testing, and Data Analysis and Correlations. The potential environmental impacts of PFH processing of oil shale must be assessed throughout the development program to ensure that the appropriate technologies are in place to mitigate any adverse effects. The overall objectives of Task 6 (Environmental Data and Mitigation Analyses) were to obtain environmental data relating to PFH and shale beneficiation and to analyze the potential environmental impacts of the integrated PFH process. The task was divided into the following four subtasks: 6.1. Characterization of Processed Shales (IGT), 6.2. Water Availability and Treatment Studies, 6.3. Heavy Metals Removal, and 6.4. PFH Systems Analysis. The objective of Task 7 (Sample Procurement, Preparation, and Characterization) was to procure, prepare, and characterize raw and beneficiated bulk samples of Eastern oil shale for all of the experimental tasks in the program. Accomplishments for these tasks are presented.

  5. Using Cognitive Task Analysis and Eye Tracking to Understand Imagery Analysis

    Science.gov (United States)

    2006-01-01

    Using Cognitive Task Analysis and Eye Tracking to Understand Imagery Analysis. Laura Kurland, Abigail Gertner, Tom Bartee, Michael Chisholm and ... environment to capture qualitative and quantitative data by recording time-stamped eye tracking, screen capture of an Electronic Light Table ...

  6. Spatial Visualization Tasks to Support Students' Spatial Structuring in Learning Volume Measurement

    Science.gov (United States)

    Revina, Shintia; Zulkardi; Darmawijoyo; van Galen, Frans

    2011-01-01

    Much prior research has found that most students in grade five tend to have difficulty in fully grasping the concept of volume measurement because they have to build their competence in spatial structuring. The unit of volume "packing" measurement must be integrated and coordinated in three dimensions. On the other hand, it is revealed…

  7. U. S. Army Land Warfare Laboratory. Volume II Appendix B. Task Sheets

    Science.gov (United States)

    1974-06-01

    Personnel Marker Grenade (RC), B-367; 12-C-69 Characterization of Effluvia From Cannabis, B-368; 01-C-70 Man-Portable Pyrotechnic Searchlight, B-369 ... AUTHORIZED FUNDING: $99,656. TASK DURATION: 4 August 1970 to 29 June 1973. CONTRACTOR: Breed Corporation. DESCRIPTION AND RESULTS: The Pursuit Deterring ... could commence 90 days from award of contract. B-368. TASK NUMBER: 12-C-69. TITLE: Characterization of Effluvia from Cannabis. AUTHORIZED FUNDING

  8. Aerobic fitness relates to learning on a virtual morris water task and hippocampal volume in adolescents

    OpenAIRE

    Herting, Megan M.; Nagel, Bonnie J

    2012-01-01

    In rodents, exercise increases hippocampal neurogenesis and allows for better learning and memory performance on water maze tasks. While exercise has also been shown to be beneficial for the brain and behavior in humans, no study has examined how exercise impacts spatial learning using a directly translational water maze task, or if these relationships exist during adolescence – a developmental period which the animal literature has shown to be especially vulnerable to exercise effects. In th...

  9. Physical and cognitive task analysis in interventional radiology

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, S. [School of Psychology, University of Liverpool, Liverpool (United Kingdom)]. E-mail: sheenajj@liv.ac.uk; Healey, A. [Royal Liverpool University Hospital, Liverpool (United Kingdom); Evans, J. [Royal Liverpool University Hospital, Liverpool (United Kingdom); Murphy, M. [Royal Liverpool University Hospital, Liverpool (United Kingdom); Crawshaw, M. [Department of Psychology, University of Hull, Hull (United Kingdom); Gould, D. [Royal Liverpool University Hospital, Liverpool (United Kingdom)

    2006-01-15

    AIM: To identify, describe and detail the cognitive thought processes, decision-making, and physical actions involved in the preparation and successful performance of core interventional radiology procedures. MATERIALS AND METHODS: Five commonly performed core interventional radiology procedures were selected for cognitive task analysis. Several examples of each procedure being performed by consultant interventional radiologists were videoed. The videos of those procedures, and the steps required for successful outcome, were analysed by a psychologist and an interventional radiologist. Once a skeleton algorithm of the procedures was defined, further refinement was achieved using individual interview techniques with consultant interventional radiologists. Additionally a critique of each iteration of the established algorithm was sought from non-participating independent consultant interventional radiologists. RESULTS: Detailed task descriptions and decision protocols were developed for five interventional radiology procedures (arterial puncture, nephrostomy, venous access, biopsy (using both ultrasound and computed tomography), and percutaneous transhepatic cholangiogram). Identical tasks performed within these procedures were identified and standardized within the protocols. CONCLUSIONS: Complex procedures were broken down and their constituent processes identified. This might be suitable for use as a training protocol to provide a universally acceptable safe practice at the most fundamental level. It is envisaged that data collected in this way can be used as an educational resource for trainees and could provide the basis for a training curriculum in interventional radiology. It will direct trainees towards safe practice of the highest standard. It will also provide performance objectives of a simulator model.

  10. Using Goal Setting and Task Analysis to Enhance Task-Based Language Learning and Teaching

    Science.gov (United States)

    Rubin, Joan

    2015-01-01

    Task-Based Language Learning and Teaching has received sustained attention from teachers and researchers for over thirty years. It is a well-established pedagogy that includes the following characteristics: major focus on authentic and real-world tasks, choice of linguistic resources by learners, and a clearly defined non-linguistic outcome. This…

  11. Across-Task Priming Revisited: Response and Task Conflicts Disentangled Using Ex-Gaussian Distribution Analysis

    Science.gov (United States)

    Moutsopoulou, Karolina; Waszak, Florian

    2012-01-01

    The differential effects of task and response conflict in priming paradigms where associations are strengthened between a stimulus, a task, and a response have been demonstrated in recent years with neuroimaging methods. However, such effects are not easily disentangled with only measurements of behavior, such as reaction times (RTs). Here, we…

  12. Theoretical Analysis of Task-Based Language Teaching Pedagogy

    Institute of Scientific and Technical Information of China (English)

    HUANG Li-na

    2013-01-01

    Since the implementation of the new English curriculum (XinCheng), English teachers have been actively studying the task-based language teaching approach and trying to use task-based language teaching in their classroom teaching. This article draws on the implementation of task-based language teaching to discuss its application in English teaching.

  13. Evaluation of stereoscopic 3D displays for image analysis tasks

    Science.gov (United States)

    Peinsipp-Byma, E.; Rehfeld, N.; Eck, R.

    2009-02-01

    In many application domains the analysis of aerial or satellite images plays an important role. The use of stereoscopic display technologies can enhance the image analyst's ability to detect or to identify certain objects of interest, which results in a higher performance. Changing image acquisition from analog to digital techniques entailed the change of stereoscopic visualisation techniques. Recently different kinds of digital stereoscopic display techniques with affordable prices have appeared on the market. At Fraunhofer IITB usability tests were carried out to find out (1) with which kind of these commercially available stereoscopic display techniques image analysts achieve the best performance and (2) which of these techniques achieve a high acceptance. First, image analysts were interviewed to define typical image analysis tasks which were expected to be solved with a higher performance using stereoscopic display techniques. Next, observer experiments were carried out whereby image analysts had to solve defined tasks with different visualization techniques. Based on the experimental results (performance parameters and qualitative subjective evaluations of the used display techniques) two of the examined stereoscopic display technologies were found to be very good and appropriate.

  14. Structured information analysis for human reliability analysis of emergency tasks in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Won Dea; Kim, Jae Whan; Park, Jin Kyun; Ha, Jae Joo [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2000-02-01

    More than twenty HRA (Human Reliability Analysis) methodologies have been developed and used for safety analysis in the nuclear field during the past two decades. However, no methodology appears to have been universally accepted, as various limitations have been raised for the more widely used ones. One of the most important limitations of conventional HRA is insufficient analysis of the task structure and problem space. To resolve this problem, we suggest SIA (Structured Information Analysis) for HRA. The proposed SIA consists of three parts. The first part is the scenario analysis that investigates the contextual information related to the given task on the basis of selected scenarios. The second is the goals-means analysis to define the relations between the cognitive goal and task steps. The third is the cognitive function analysis module that identifies the cognitive patterns and information flows involved in the task. Through the three-part analysis, systematic investigation is made possible, from the macroscopic information on the tasks to the microscopic information on the specific cognitive processes. It is expected that analysts can attain a structured set of information that helps to predict the types and possibility of human error in the given task. 48 refs., 12 figs., 11 tabs. (Author)

  15. Consensus statement of the ESICM task force on colloid volume therapy in critically ill patients

    DEFF Research Database (Denmark)

    Reinhart, Konrad; Perner, Anders; Sprung, Charles L

    2012-01-01

    PURPOSE: Colloids are administered to more patients than crystalloids, although recent evidence suggests that colloids may possibly be harmful in some patients. The European Society of Intensive Care Medicine therefore assembled a task force to compile consensus recommendations based on the curre...

  16. National facilities study. Volume 2: Task group on aeronautical research and development facilities report

    Science.gov (United States)

    1994-01-01

    The Task Group on Aeronautics R&D Facilities examined the status and requirements for aeronautics facilities against the competitive need. Emphasis was placed on ground-based facilities for subsonic, supersonic and hypersonic aerodynamics, and propulsion. Subsonic and transonic wind tunnels were judged to be most critical and of highest priority. Results of the study are presented.

  17. Task analysis and computer aid development for human reliability analysis in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, W. C.; Kim, H.; Park, H. S.; Choi, H. H.; Moon, J. M.; Heo, J. Y.; Ham, D. H.; Lee, K. K.; Han, B. T. [Korea Advanced Institute of Science and Technology, Taejeon (Korea)

    2001-04-01

    The importance of human reliability analysis (HRA), which predicts the possibility of human error occurrence in quantitative and qualitative manners, has gradually increased owing to the effects of human error on system safety. HRA needs a task analysis as a prerequisite step, but extant task analysis techniques have the problem that the collection of information about the situation in which the human error occurs depends entirely on the HRA analyzers. This problem makes the results of the task analysis inconsistent and unreliable. To complement this problem, KAERI developed the structural information analysis (SIA) that helps to analyze a task's structure and situations systematically. In this study, the SIA method was evaluated by HRA experts, and a prototype computerized supporting system named CASIA (Computer Aid for SIA) was developed to support performing HRA using the SIA method. Additionally, through applying the SIA method to emergency operating procedures, we derived generic task types used in emergencies and accumulated the analysis results in the database of the CASIA. The CASIA is expected to help HRA analyzers perform the analysis more easily and consistently. If more analyses are performed and more data are accumulated in the CASIA's database, HRA analyzers can freely share and smoothly spread their analysis experiences, and thereby the quality of the HRA analysis will be improved. 35 refs., 38 figs., 25 tabs. (Author)

  18. Taste perception analysis using a semantic verbal fluency task

    Directory of Open Access Journals (Sweden)

    Ghemulet M

    2014-09-01

    Maria Ghemulet,1,2 Maria Baskini,3 Lambros Messinis,2,4 Eirini Mouza,1 Hariklia Proios1,5 1Department of Speech Therapy, Anagennisis (Revival) Physical Recovery and Rehabilitation Centre, Nea Raidestos, Filothei, Thessaloniki, Greece; 2Department of Speech and Language Therapy, Technological Institute of Western Greece, Patra, Greece; 3Department of Neurosurgery, Interbalkan European Medical Centre, Thessaloniki, Greece; 4Neuropsychology Section, Department of Neurology, University of Patras, Medical School, Patras, Greece; 5Department of Education and Social Policy, University of Macedonia, Thessaloniki, Greece Abstract: A verbal fluency (VF) task is a test used to examine cognitive perception. The main aim of this study was to explore a possible relationship between taste perception in the basic taste categories (sweet, salty, sour, and bitter) and subjects’ taste preferences, using a VF task in healthy and dysphagic subjects. In addition, we correlated the results of the VF task with body mass index (BMI). The hypothesis is that categorical preferences would be consistent with the number of verbal responses. We also hypothesized that higher BMI (>30 kg/m2) would correlate with more responses in either some or all four categories. VF tasks were randomly administered. Analysis criteria included number of verbally produced responses, number of clusters, number of switches, number and type of errors, and VF consistency with taste preferences. Sixty Greek-speaking individuals participated in this study. Forty-three healthy subjects were selected with a wide range of ages, sex, and education levels. Seventeen dysphagic patients were then matched with 17 healthy subjects according to age, sex, and BMI. Quantitative one-way analysis of variance (between groups as well as repeated measures), post hoc, and chi-square, and qualitative analyses were performed. In the healthy subjects’ group, the differences among the mean number of responses for the four

  19. Control volume based hydrocephalus research; analysis of human data

    Science.gov (United States)

    Cohen, Benjamin; Wei, Timothy; Voorhees, Abram; Madsen, Joseph; Anor, Tomer

    2010-11-01

    Hydrocephalus is a neuropathophysiological disorder primarily diagnosed by increased cerebrospinal fluid volume and pressure within the brain. To date, utilization of clinical measurements has been limited to understanding the relative amplitude and timing of flow, volume and pressure waveforms; qualitative approaches without a clear framework for meaningful quantitative comparison. Pressure-volume models and electric circuit analogs enforce volume conservation principles in terms of pressure. Control volume analysis, through the integral mass and momentum conservation equations, ensures that pressure and volume are accounted for using first-principles fluid physics. This approach is able to directly incorporate the diverse measurements obtained by clinicians into a simple, direct and robust mechanics-based framework. Clinical data obtained for analysis are discussed along with the data processing techniques used to extract terms in the conservation equations. Control volume analysis provides a non-invasive, physics-based approach to extracting pressure information from magnetic resonance velocity data that cannot be measured directly by pressure instrumentation.
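
    The integral conservation laws invoked above have the standard control-volume form for a volume CV bounded by a surface CS (textbook equations, not expressions quoted from the paper):

        \[
        \frac{d}{dt}\int_{CV}\rho\,dV \;+\; \oint_{CS}\rho\,(\mathbf{u}\cdot\mathbf{n})\,dA \;=\; 0
        \qquad\text{(mass)}
        \]
        \[
        \frac{d}{dt}\int_{CV}\rho\,\mathbf{u}\,dV \;+\; \oint_{CS}\rho\,\mathbf{u}\,(\mathbf{u}\cdot\mathbf{n})\,dA \;=\; \sum\mathbf{F}
        \qquad\text{(momentum)}
        \]

    Here u is the fluid velocity, obtainable for instance from magnetic resonance velocity data, and the surface flux terms are the quantities extracted from clinical measurements.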

  20. Comparative analysis of cognitive tasks for modeling mental workload with electroencephalogram.

    Science.gov (United States)

    Hwang, Taeho; Kim, Miyoung; Hwangbo, Minsu; Oh, Eunmi

    2014-01-01

    Previous electroencephalogram (EEG) studies have shown that cognitive workload can be estimated by using several types of cognitive tasks. In this study, we attempted to characterize cognitive tasks that have been used to manipulate workload for generating classification models. We carried out a comparative analysis between two representative types of working memory tasks: the n-back task and the mental arithmetic task. Based on experiments with 7 healthy subjects using Emotiv EPOC, we compared the consistency, robustness, and efficiency of each task in determining cognitive workload in a short training session. The mental arithmetic task seems consistent and robust in manipulating clearly separable high and low levels of cognitive workload with less training. In addition, the mental arithmetic task shows consistency despite repeated usage over time and without notable task adaptation in users. The current study successfully quantifies the quality and efficiency of cognitive workload modeling depending on the type and configuration of training tasks.

  1. Aircrew/Groundcrew Life Support Systems Research. Volume 2. CLIN 0002 Task Order Requirements

    Science.gov (United States)

    1993-07-01

    T. Webb, Dr. Lisa F. Weinstein, and Ms. Janet F. Wiegman. The author cross-reference in Part D allows the reader to find the task report ... exposures. B.13.2 Accomplishments: The results of this effort are described in the following technical report: Wiegman JF, McLean SA, Olson RM, Pilmanis AA. Acquisition of physiological data during G-induced loss of consciousness (G-LOC). AL-TP-1992-0053. 1993:12pp. [T.O. 243 Wiegman JF, McLean SA, Olson RM

  2. Rectal cancer surgery: volume-outcome analysis.

    LENUS (Irish Health Repository)

    Nugent, Emmeline

    2010-12-01

    There is strong evidence supporting the importance of the volume-outcome relationship with respect to lung and pancreatic cancers. This relationship for rectal cancer surgery, however, remains unclear. We review the currently available literature to assess the evidence base for the volume-outcome relationship in rectal cancer surgery.

  3. Handbook of Systems Analysis: Volume 1. Overview. Chapter 2. The Genesis of Applied Systems Analysis

    OpenAIRE

    1981-01-01

    The International Institute for Applied Systems Analysis is preparing a Handbook of Systems Analysis, which will appear in three volumes: Volume 1: Overview is aimed at a widely varied audience of producers and users of systems analysis studies. Volume 2: Methods is aimed at systems analysts and other members of systems analysis teams who need basic knowledge of methods in which they are not expert; this volume contains introductory overviews of such methods. Volume 3: Cases co...

  4. The assessment of risky decision making: a factor analysis of performance on the Iowa Gambling Task, Balloon Analogue Risk Task, and Columbia Card Task.

    Science.gov (United States)

    Buelow, Melissa T; Blaine, Amber L

    2015-09-01

    Researchers and clinicians frequently use behavioral measures to assess decision making. The most common task that is marketed to clinicians is the Iowa Gambling Task (IGT), thought to assess risky decision making. How does performance on the IGT relate to performance on other common measures of decision making? The present study sought to examine relationships between the IGT, the Balloon Analogue Risk Task (BART), and the Columbia Card Task (CCT). Participants were 390 undergraduate students who completed the IGT, BART, and either the "hot" or "cold" CCT. Principal components factor analysis on the IGT, BART, and CCT-cold (n = 112) indicated that the IGT measures a different component of decision making than the BART, and the CCT-cold weakly correlated with early IGT trials. Results of the exploratory factor analysis on the IGT, BART, and CCT-hot (n = 108) revealed a similar picture: the IGT and BART assessed different types of decision making, and the BART and CCT-hot were weakly correlated. A confirmatory factor analysis (n = 170) indicated that a 3-factor model without the CCT-cold (Factor 1: later IGT trials; Factor 2: BART; and Factor 3: early IGT trials) was a better fitting model than one that included the CCT-cold and early IGT trials on the same factor. Collectively, the present results suggest that the IGT, BART, and CCT all measure unique, nonoverlapping decision making processes. Further research is needed to more fully understand the neuropsychological construct of decision making.
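
    As a sketch of the exploratory step described above (principal components over z-scored task indices), using synthetic data with a planted two-factor structure rather than the study's scores; all loadings below are invented for illustration:

        import numpy as np
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(1)
        n = 390                                       # same N as the study; data are synthetic
        f_igt, f_bart = rng.standard_normal((2, n))   # latent "IGT" and "BART" factors
        scores = np.column_stack([
            0.8 * f_igt + 0.6 * rng.standard_normal(n),    # later IGT trials
            0.4 * f_igt + 0.9 * rng.standard_normal(n),    # early IGT trials
            0.8 * f_bart + 0.6 * rng.standard_normal(n),   # BART adjusted pumps
        ])
        scores = (scores - scores.mean(0)) / scores.std(0)  # z-score each column
        pca = PCA(n_components=2).fit(scores)
        print(pca.components_.round(2))               # loadings: IGT and BART separate
        print(pca.explained_variance_ratio_.round(2))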

  5. National Aviation Fuel Scenario Analysis Program (NAFSAP). Volume I. Model Description. Volume II. User Manual.

    Science.gov (United States)

    1980-03-01

    NATIONAL AVIATION FUEL SCENARIO ANALYSIS PROGRAM. VOLUME I: MODEL DESCRIPTION. VOLUME II: USER MANUAL. ... (executes post processor which translates results of the graphics program to machine readable code used by the pen plotter) cr (depressing the carriage

  6. ALE Meta-Analysis of Schizophrenics Performing the N-Back Task

    Science.gov (United States)

    Harrell, Zachary

    2010-10-01

    MRI/fMRI has already proven itself as a valuable tool in the diagnosis and treatment of many illnesses of the brain, including cognitive problems. By exploiting the differences in magnetic susceptibility between oxygenated and deoxygenated hemoglobin, fMRI can measure blood flow in various regions of interest within the brain. This can determine the level of brain activity in relation to motor or cognitive functions and provide a metric for tissue damage or illness symptoms. Structural imaging techniques have shown lesions or deficiencies in tissue volumes in schizophrenics corresponding to areas primarily in the frontal and temporal lobes. These areas are currently known to be involved in working memory and attention, which many schizophrenics have trouble with. The ALE (Activation Likelihood Estimation) Meta-Analysis is able to statistically determine the significance of brain area activations based on the post-hoc combination of multiple studies. This process is useful for giving a general model of brain function in relation to a particular task designed to engage the affected areas (such as working memory for the n-back task). The advantages of the ALE Meta-Analysis include elimination of single subject anomalies, elimination of false/extremely weak activations, and verification of function/location hypotheses.

  7. Applying of the NVIDIA CUDA to the video processing in the task of the roundwood volume estimation

    Directory of Open Access Journals (Sweden)

    Kruglov Artem

    2016-01-01

    The paper is devoted to parallel computing. The algorithm for roundwood volume estimation had insufficient performance, so it was decided to port its bottleneck part to the GPU. An analysis of various GPGPU techniques was carried out, and the NVIDIA CUDA technology was chosen for the implementation. The results of the research show the high potential of the GPU implementation for improving the performance of the computation. The speedup of the algorithm for roundwood volume estimation is more than 300% after porting to the GPU using the CUDA technology. This makes it possible to apply the machine vision algorithm in a real-time system.
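
    A minimal sketch of the kind of GPU port the paper describes, assuming the bottleneck is a per-log volume computation over arrays produced by the vision pipeline; the cylinder formula and all names are illustrative, and Numba's CUDA support stands in here for a native CUDA implementation:

        import numpy as np
        from numba import cuda

        @cuda.jit
        def log_volumes(diameters, lengths, volumes):
            i = cuda.grid(1)                     # global thread index
            if i < diameters.size:               # guard the last partial block
                r = diameters[i] * 0.5
                volumes[i] = 3.141592653589793 * r * r * lengths[i]  # cylinder model

        n = 1_000_000
        diam = np.random.uniform(0.1, 0.6, n).astype(np.float32)  # metres
        leng = np.random.uniform(3.0, 6.0, n).astype(np.float32)  # metres
        vol = np.zeros(n, dtype=np.float32)

        threads = 256
        blocks = (n + threads - 1) // threads
        log_volumes[blocks, threads](diam, leng, vol)  # Numba handles host/device copies
        print(vol[:3])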

  8. Improving the quality of team task analysis: experimental validation of guidelines

    NARCIS (Netherlands)

    Berlo, M.P.W. van

    2004-01-01

    Traditionally, task analysis is conducted at the level of individual task performers. Instructional designers for team training find it hard to conduct task analyses at the level of a team. It is difficult to capture the interactivity and interdependency between the various team members, and to form

  9. Bringing Reading-to-Write and Writing-Only Assessment Tasks Together: A Generalizability Analysis

    Science.gov (United States)

    Gebril, Atta

    2010-01-01

    Integrated tasks are currently employed in a number of L2 exams since they are perceived as an addition to the writing-only task type. Given this trend, the current study investigates composite score generalizability of both reading-to-write and writing-only tasks. For this purpose, a multivariate generalizability analysis is used to investigate…

  10. A Longitudinal Behavioral Genetic Analysis of Task Persistence

    Science.gov (United States)

    Deater-Deckard, Kirby; Petrill, Stephen A.; Thompson, Lee A.; DeThorne, Laura S.

    2006-01-01

    Change in task persistence was assessed in two annual assessments using teachers', testers', and observers' ratings. Participants included 79 monozygotic and 116 same-sex dizygotic twin pairs who were in Kindergarten or 1st grade (4.3 to 7.9 years old) at the initial assessment. Task persistence was widely distributed and higher among older…

  11. Revisiting corpus creation and analysis tools for translation tasks

    Directory of Open Access Journals (Sweden)

    Claudio Fantinuoli

    2016-06-01

    Many translation scholars have proposed the use of corpora to allow professional translators to produce high-quality texts which read like originals. Yet the diffusion of this methodology has been modest, one reason being that software for corpus analysis has been developed with the linguist in mind, which means that it is generally complex and cumbersome, offering many advanced features but lacking the level of usability and the specific features that meet translators’ needs. To overcome this shortcoming, we have developed TranslatorBank, a free corpus creation and analysis tool designed for translation tasks. TranslatorBank supports the creation of specialized monolingual corpora from the web; it includes a concordancer with a query system similar to a search engine; it uses basic statistical measures to indicate the reliability of results; it accesses the original documents directly for more contextual information; it includes a statistical and linguistic terminology extraction utility to extract the relevant terminology of the domain and the typical collocations of a given term. Designed to be easy and intuitive to use, the tool may help translation students as well as professionals to increase their translation quality by adhering to the specific linguistic variety of the target text corpus.

  12. Identification and Analysis of Multi-tasking Product Information Search Sessions with Query Logs

    Directory of Open Access Journals (Sweden)

    Xiang Zhou

    2016-09-01

    Purpose: This research aims to identify product search tasks in online shopping and analyze the characteristics of consumer multi-tasking search sessions. Design/methodology/approach: The experimental dataset contains 8,949 queries of 582 users from 3,483 search sessions. A sequential comparison of the Jaccard similarity coefficient between two adjacent search queries and hierarchical clustering of queries is used to identify search tasks. Findings: (1) Users issued a similar number of queries (1.43 to 1.47) with similar lengths (7.3-7.6 characters) per task in mono-tasking and multi-tasking sessions, and (2) users spent more time on average in sessions with more tasks, but spent less time on each task as the number of tasks in a session increased. Research limitations: The task identification method, which relies only on query terms, does not completely reflect the complex nature of consumer shopping behavior. Practical implications: These results provide an exploratory understanding of the relationships among multiple shopping tasks, and can be useful for product recommendation and shopping task prediction. Originality/value: The originality of this research is its use of query clustering with online shopping task identification and analysis, and the analysis of product search session characteristics.
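
    A minimal sketch of the session-segmentation idea described above: compare adjacent queries with the Jaccard coefficient over their term sets and start a new task when similarity drops below a threshold. The 0.2 cut-off and whitespace tokenization are assumptions for illustration, not the paper's settings (which also include hierarchical clustering):

        def jaccard(a: set, b: set) -> float:
            return len(a & b) / len(a | b) if a | b else 0.0

        def split_into_tasks(queries, threshold=0.2):
            tasks, current = [], [queries[0]]
            for prev, nxt in zip(queries, queries[1:]):
                if jaccard(set(prev.split()), set(nxt.split())) >= threshold:
                    current.append(nxt)      # enough shared terms: same task
                else:
                    tasks.append(current)    # low overlap: assume a new task begins
                    current = [nxt]
            tasks.append(current)
            return tasks

        print(split_into_tasks([
            "wireless mouse", "wireless mouse ergonomic", "usb c hub", "usb c hub 4k",
        ]))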

  13. Cognitive tasks in information analysis: Use of event dwell time to characterize component activities

    Energy Technology Data Exchange (ETDEWEB)

    Sanquist, Thomas F.; Greitzer, Frank L.; Slavich, Antoinette L.; Littlefield, Rik J.; Littlefield, Janis S.; Cowley, Paula J.

    2004-09-28

    Technology-based enhancement of information analysis requires a detailed understanding of the cognitive tasks involved in the process. The information search and report production tasks of the information analysis process were investigated through evaluation of time-stamped workstation data gathered with custom software. Model tasks simulated the search and production activities, and a sample of actual analyst data was also evaluated. Task event durations were calculated on the basis of millisecond-level time stamps, and distributions were plotted for analysis. The data indicate that task event time shows a cyclic pattern of variation, with shorter event durations (< 2 sec) reflecting information search and filtering, and longer event durations (> 10 sec) reflecting information evaluation. Application of cognitive principles to the interpretation of task event time data provides a basis for developing “cognitive signatures” of complex activities, and can facilitate the development of technology aids for information-intensive tasks.
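
    A small sketch of the dwell-time binning suggested by these findings, using the cut-offs reported in the abstract (under 2 s for search/filtering, over 10 s for evaluation); the function and its inputs are illustrative:

        def classify_events(timestamps_ms):
            """Label inter-event gaps as search/filter, intermediate, or evaluation."""
            labels = []
            for t0, t1 in zip(timestamps_ms, timestamps_ms[1:]):
                dwell = (t1 - t0) / 1000.0   # gap between events, in seconds
                if dwell < 2.0:
                    labels.append(("search/filter", dwell))
                elif dwell > 10.0:
                    labels.append(("evaluation", dwell))
                else:
                    labels.append(("intermediate", dwell))
            return labels

        print(classify_events([0, 800, 2100, 14500, 15200]))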

  14. Combined analysis of job and task benzene air exposures among workers at four US refinery operations.

    Science.gov (United States)

    Burns, Amanda; Shin, Jennifer Mi; Unice, Ken M; Gaffney, Shannon H; Kreider, Marisa L; Gelatt, Richard H; Panko, Julie M

    2017-03-01

    Workplace air samples analyzed for benzene at four US refineries from 1976 to 2007 were pooled into a single dataset to characterize similarities and differences between job titles, tasks and refineries, and to provide a robust dataset for exposure reconstruction. Approximately 12,000 non-task (>180 min) personal samples associated with 50 job titles and 4,000 task (<180 min) samples were evaluated on the basis of ... job titles and task codes across all four refineries, and (5) our analysis of variance (ANOVA) of the distribution of benzene air concentrations for select jobs/tasks across all four refineries. The jobs and tasks most frequently sampled included those with the highest potential contact with refinery product streams containing benzene, which reflected the targeted sampling approach utilized by the facility industrial hygienists. Task and non-task data were analyzed to identify and account for significant differences within job-area, task-job, and task-area categories. This analysis demonstrated that, in general, areas with benzene-containing process streams were associated with greater benzene air concentrations compared to areas with process streams containing little to no benzene. For several job titles and tasks analyzed, there was a statistically significant decrease in benzene air concentration after 1990. This study provides a job- and task-focused analysis of occupational exposure to benzene during refinery operations, and it should be useful for reconstructing refinery workers' exposures to benzene over the past 30 years.

  15. Darwinian algorithms and the Wason selection task: a factorial analysis of social contract selection task problems.

    Science.gov (United States)

    Platt, R D; Griggs, R A

    1993-08-01

    In four experiments with 760 subjects, the present study examined Cosmides' Darwinian algorithm theory of reasoning: specifically, its explanation of facilitation on the Wason selection task. The first experiment replicated Cosmides' finding of facilitation for social contract versions of the selection task, using both her multiple-problem format and a single-problem format. Experiment 2 examined performance on Cosmides' three main social contract problems while manipulating the perspective of the subject and the presence and absence of cost-benefit information. The presence of cost-benefit information improved performance in two of the three problems while the perspective manipulation had no effect. In Experiment 3, the cost-benefit effect was replicated; and performance on one of the three problems was enhanced by the presence of explicit negatives on the NOT-P and NOT-Q cards. Experiment 4 examined the role of the deontic term "must" in the facilitation observed for two of the social contract problems. The presence of "must" led to a significant improvement in performance. The results of these experiments are strongly supportive of social contract theory in that cost-benefit information is necessary for substantial facilitation to be observed in Cosmides' problems. These findings also suggest the presence of other cues that can help guide subjects to a deontic social contract interpretation when the social contract nature of the problem is not clear.

  16. Economic Analysis. Volume V. Course Segments 65-79.

    Science.gov (United States)

    Sterling Inst., Washington, DC. Educational Technology Center.

    The fifth volume of the multimedia, individualized course in economic analysis produced for the United States Naval Academy covers segments 65-79 of the course. Included in the volume are discussions of monopoly markets, monopolistic competition, oligopoly markets, and the theory of factor demand and supply. Other segments of the course, the…

  17. A task analysis of underway replenishment for virtual environment ship-handling simulator scenario development

    OpenAIRE

    Norris, Steven D.

    1998-01-01

    Approved for public release; distribution is unlimited While developing a Virtual Reality (VR) Ship handling simulator for the Surface Warfare Officer School (SWOS) in Newport, RI, researchers at the Naval Air Warfare Center Training Systems Division (NAWCTSD) in Orlando, FL discovered a need for a task analysis of a Conning Officer during an Underway Replenishment (UNREP). The purpose of this task analysis was to document the tasks the Conning Officer performs and cues used to accomplish ...

  18. A comparative critical analysis of modern task-parallel runtimes.

    Energy Technology Data Exchange (ETDEWEB)

    Wheeler, Kyle Bruce; Stark, Dylan; Murphy, Richard C.

    2012-12-01

    The rise in node-level parallelism has increased interest in task-based parallel runtimes for a wide array of application areas. Applications have a wide variety of task spawning patterns which frequently change during the course of application execution, based on the algorithm or solver kernel in use. Task scheduling and load balance regimes, however, are often highly optimized for specific patterns. This paper uses four basic task spawning patterns to quantify the impact of specific scheduling policy decisions on execution time. We compare the behavior of six publicly available tasking runtimes: Intel Cilk, Intel Threading Building Blocks (TBB), Intel OpenMP, GCC OpenMP, Qthreads, and High Performance ParalleX (HPX). With the exception of Qthreads, the runtimes prove to have schedulers that are highly sensitive to application structure. No runtime is able to provide the best performance in all cases, and those that do provide the best performance in some cases, unfortunately, provide extremely poor performance when the application structure does not match the scheduler's assumptions.

  19. EEG based topography analysis in string recognition task

    Science.gov (United States)

    Ma, Xiaofei; Huang, Xiaolin; Shen, Yuxiaotong; Qin, Zike; Ge, Yun; Chen, Ying; Ning, Xinbao

    2017-03-01

    Vision perception and recognition is a complex process, during which different parts of the brain are involved depending on the specific modality of the vision target, e.g. face, character, or word. In this study, brain activities in a string recognition task, compared with an idle control state, are analyzed through topographies based on multiple measurements, i.e. sample entropy, symbolic sample entropy and normalized rhythm power, extracted from simultaneously collected scalp EEG. Our analyses show that, for most subjects, both symbolic sample entropy and normalized gamma power in the string recognition task are significantly higher than those in the idle state, especially at locations P4, O2, T6 and C4. This implies that these regions are highly involved in the string recognition task. Since symbolic sample entropy measures complexity, from the perspective of new information generation, and normalized rhythm power reveals the power distribution in the frequency domain, complementary information about the underlying dynamics can be provided through the two types of indices.
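
    A compact sketch of one of the measurements named above, sample entropy, for a single EEG channel; this is the standard Richman-Moorman formulation, and the symbolic variant and normalized rhythm power used in the paper are not reproduced here:

        import numpy as np

        def sample_entropy(x, m=2, r_factor=0.2):
            """SampEn = -ln(A/B): A, B = counts of length-(m+1) and length-m matches."""
            x = np.asarray(x, dtype=float)
            r = r_factor * x.std()               # tolerance as a fraction of the SD
            def count_matches(length):
                templates = np.lib.stride_tricks.sliding_window_view(x, length)
                count = 0
                for i in range(len(templates) - 1):
                    # Chebyshev distance from template i to all later templates
                    d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
                    count += np.sum(d <= r)
                return count
            b, a = count_matches(m), count_matches(m + 1)
            return -np.log(a / b) if a > 0 and b > 0 else np.inf

        rng = np.random.default_rng(0)
        print(sample_entropy(rng.standard_normal(500)))   # white noise: high entropy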

  20. 49 CFR 236.923 - Task analysis and basic requirements.

    Science.gov (United States)

    2010-10-01

    ... employer shall, at a minimum: (1) Identify the specific goals of the training program with regard to the... for the performance of the tasks identified; (4) Identify the additional knowledge, skills, and... training curriculum that includes classroom, simulator, computer-based, hands-on, or other...

  1. A meta-analysis of reinforcement sensitivity theory: on performance parameters in reinforcement tasks.

    Science.gov (United States)

    Leue, Anja; Beauducel, André

    2008-11-01

    J. A. Gray's Reinforcement Sensitivity Theory (RST) has produced a wealth of quasi-experimental studies in more than 35 years of research on personality and reinforcement sensitivity. The present meta-analysis builds on this literature by investigating RST in conflict and nonconflict reinforcement tasks in humans. Based on random-effects meta-analysis, we confirmed RST predictions of performance parameters (e.g., number of responses, reaction time) in reinforcement tasks for impulsivity- and anxiety-related traits. In studies on anxiety-related traits, the effect size variance was smaller for conflict tasks than for nonconflict tasks. A larger mean effect size and a larger variability of effect sizes were found for conflict compared to nonconflict tasks in studies on impulsivity-related traits. Our results suggest that problems with RST confirmation in reinforcement tasks are at least partly caused by insufficient statistical power of primary studies, and thus, encourage future research on RST.
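
    A minimal sketch of the random-effects pooling used in meta-analyses of this kind, via the DerSimonian-Laird estimator; the effect sizes and variances below are made up for illustration:

        import numpy as np

        def random_effects(effects, variances):
            y, v = np.asarray(effects, float), np.asarray(variances, float)
            w = 1.0 / v                                 # fixed-effect weights
            y_fixed = np.sum(w * y) / np.sum(w)
            q = np.sum(w * (y - y_fixed) ** 2)          # Cochran's Q heterogeneity
            c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
            tau2 = max(0.0, (q - (len(y) - 1)) / c)     # between-study variance
            w_star = 1.0 / (v + tau2)                   # random-effects weights
            pooled = np.sum(w_star * y) / np.sum(w_star)
            se = np.sqrt(1.0 / np.sum(w_star))
            return pooled, se, tau2

        print(random_effects([0.30, 0.45, 0.12, 0.50], [0.02, 0.03, 0.01, 0.04]))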

  2. Information architecture. Volume 2, Part 1: Baseline analysis summary

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-12-01

    The Department of Energy (DOE) Information Architecture, Volume 2, Baseline Analysis, is a collaborative and logical next-step effort in the processes required to produce a Departmentwide information architecture. The baseline analysis serves a diverse audience of program management and technical personnel and provides an organized way to examine the Department's existing or de facto information architecture. A companion document to Volume 1, The Foundations, it furnishes the rationale for establishing a Departmentwide information architecture. This volume, consisting of the Baseline Analysis Summary (part 1), Baseline Analysis (part 2), and Reference Data (part 3), is of interest to readers who wish to understand how the Department's current information architecture technologies are employed. The analysis identifies how and where current technologies support business areas, programs, sites, and corporate systems.

  3. Screening Analysis : Volume 1, Description and Conclusions.

    Energy Technology Data Exchange (ETDEWEB)

    Bonneville Power Administration; Corps of Engineers; Bureau of Reclamation

    1992-08-01

    The SOR consists of three analytical phases leading to a Draft EIS. The first phase, Pilot Analysis, was performed for the purpose of testing the decision analysis methodology being used in the SOR. The Pilot Analysis is described later in this chapter. The second phase, Screening Analysis, examines all possible operating alternatives using a simplified analytical approach. It is described in detail in this and the next chapter. This document also presents the results of screening. The final phase, Full-Scale Analysis, will be documented in the Draft EIS and is intended to evaluate comprehensively the few, best alternatives arising from the screening analysis. The purpose of screening is to analyze a wide variety of differing ways of operating the Columbia River system to test the reaction of the system to change. The many alternatives considered reflect the range of needs and requirements of the various river users and interests in the Columbia River Basin. While some of the alternatives might be viewed as extreme, the information gained from the analysis is useful in highlighting issues and conflicts in meeting operating objectives. Screening is also intended to develop a broad technical basis for evaluation including regional experts and to begin developing an evaluation capability for each river use that will support full-scale analysis. Finally, screening provides a logical method for examining all possible options and reaching a decision on a few alternatives worthy of full-scale analysis. An organizational structure was developed and staffed to manage and execute the SOR, specifically during the screening phase and the upcoming full-scale analysis phase. The organization involves ten technical work groups, each representing a particular river use. Several other groups exist to oversee or support the efforts of the work groups.

  4. Fluid Vessel Quantity Using Non-invasive PZT Technology Flight Volume Measurements Under Zero G Analysis

    Science.gov (United States)

    Garofalo, Anthony A

    2013-01-01

    The purpose of the project is to perform analysis of data using the Systems Engineering Educational Discovery (SEED) program data from 2011 and 2012 Fluid Vessel Quantity using Non-Invasive PZT Technology flight volume measurements under Zero G conditions (parabolic plane flight data). The project also involved experimental planning and lab work for future sub-orbital experiments using the NASA PZT technology for fluid volume measurement. Along with conducting data analysis of flight data, I also did a variety of other tasks. I provided the lab with detailed technical drawings, experimented with 3D printers, made changes to the liquid nitrogen skid schematics, and learned how to weld. I also programmed microcontrollers to interact with various sensors and helped with other things going on around the lab.

  6. Cognitive Process Analysis of Aptitude: The Nature of Inductive Reasoning Tasks.

    Science.gov (United States)

    Glaser, Robert; Pellegrino, James W.

    This paper provides an overview of the cognitive process analysis of tasks used to measure aptitude and intelligence. As an illustration of this approach, performance on inductive reasoning tasks such as series extrapolation and analogy problems is considered in terms of the factors that contribute to item difficulty and individual differences in…

  7. Procedural and Conceptual Difficulties with Slope: An Analysis of Students' Mistakes on Routine Tasks

    Science.gov (United States)

    Cho, Peter; Nagle, Courtney

    2017-01-01

    This study extends past research on students' understanding of slope by analyzing college students' mistakes on routine tasks involving slope. We conduct both quantitative and qualitative analysis of students' mistakes on common slope tasks to extract information regarding procedural proficiencies and conceptual underpinnings required in order for…

  8. Financial Mathematical Tasks in a Middle School Mathematics Textbook Series: A Content Analysis

    Science.gov (United States)

    Hamburg, Maryanna P.

    2009-01-01

    This content analysis examined the distribution of financial mathematical tasks (FMTs), mathematical tasks that contain financial terminology and require financially related solutions, across the National Standards in K-12 Personal Finance Education categories (JumpStart Coalition, 2007), the thinking skills as identified by "A Taxonomy for…

  9. A Cross-Sectional Behavioral Genetic Analysis of Task Persistence in the Transition to Middle Childhood

    Science.gov (United States)

    Deater-Deckard, Kirby; Petrill, Stephen A.; Thompson, Lee A.; DeThorne, Laura S.

    2005-01-01

    Task persistence, measured by a composite score of independent teacher, tester and observer reports, was examined using behavioral genetic analysis. Participants included 92 monozygotic and 137 same-sex dizygotic twin pairs in Kindergarten or 1st grade (4.3 to 7.9 years old). Task persistence was widely distributed, higher among older children,…

  10. Increasing Pizza Box Assembly Using Task Analysis and a Least-to-Most Prompting Hierarchy

    Science.gov (United States)

    Stabnow, Erin F.

    2015-01-01

    This study was designed to use a task analysis and a least-to-most prompting hierarchy to teach students with cognitive disabilities pizza box assembly skills. The purpose of this study was to determine whether a least-to-most prompting hierarchy was effective in teaching students with cognitive disabilities to increase the number of task-analyzed…

  11. Using Heuristic Task Analysis to Create Web-Based Instructional Design Theory

    Science.gov (United States)

    Fiester, Herbert R.

    2010-01-01

    The first purpose of this study was to identify procedural and heuristic knowledge used when creating web-based instruction. The second purpose of this study was to develop suggestions for improving the Heuristic Task Analysis process, a technique for eliciting, analyzing, and representing expertise in cognitively complex tasks. Three expert…

  12. Analysis of urea distribution volume in hemodialysis.

    Science.gov (United States)

    Maduell, F; Sigüenza, F; Caridad, A; Miralles, F; Serrato, F

    1994-01-01

    According to the urea kinetic model it is considered that the urea distribution volume (V) is that of body water, and that it is distributed in only one compartment. Since the V value is difficult to measure, it is normal to use 58% of body weight, in spite of the fact that it may range from 35 to 75%. In this study, we have calculated the value of V by using an accurate method based on the total elimination of urea from the dialysate. We have studied the V, and also whether the different dialysis characteristics modify it. Thirty-five patients were included in this study, 19 men and 16 women, under a chronic hemodialysis programme. The dialysate was collected in a graduated tank, and the concentration of urea in plasma and in dialysate were determined every hour. Every patient received six dialysis sessions, changing the blood flow (250 or 350 ml/min), the ultrafiltration (0.5 or 1.5 l/h), membrane (cuprophane or polyacrylonitrile) and/or buffer (bicarbonate or acetate). At the end of the hemodialysis session, the V value ranged from 43 to 72% of body weight; nevertheless, this value was practically constant in every patient. The V value gradually increased throughout the dialysis session, 42.1 +/- 6.9% of body weight in the first hour, 50.7 +/- 7.5% in the second hour and 55.7 +/- 7.9% at the end of the dialysis session. The change of blood flow, ultrafiltration, membrane or buffer did not alter the results. The V value was significantly higher in men in comparison with women, 60.0 +/- 6.6% vs. 50.5 +/- 5.9% of body weight (p < 0.001).
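
    A single-compartment mass-balance sketch of the dialysate-side calculation described above (an illustrative simplification that neglects urea generation and post-dialysis rebound):

        \[
        V \;\approx\; \frac{M_{dialysate}}{C_{pre} - C_{post}}
        \]

    where M_dialysate is the total urea mass recovered in the graduated tank and C_pre, C_post are the plasma urea concentrations at the start and end of the session.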

  13. [Ocra Method: development of a new procedure for analysis of multiple tasks subject to infrequent rotation].

    Science.gov (United States)

    Occhipinti, E; Colombini, Daniela; Occhipinti, M

    2008-01-01

    In the Ocra methods (Ocra index and Ocra Checklist), when computing the final indices (Ocra index or checklist score) in the case of more than one repetitive task, a "traditional" procedure was already proposed, the results of which could be defined as a "time-weighted average". This approach appears to be appropriate when considering rotations among tasks that are performed very frequently, for instance almost once every hour (or for shorter periods). However, when rotation among repetitive tasks is less frequent (i.e. once every 1 1/2 or more hours), the "time-weighted average" approach could result in an underestimation of the exposure level (as it practically flattens peaks of high exposures). For those scenarios an alternative approach based on the "most stressful task as minimum" might be more realistic. This latter approach has already been included in the NIOSH approach for multiple sequential lifting tasks and, given the recent availability in the Ocra method of more detailed duration multipliers (practically a different DuM for each one-hour step of duration of the repetitive task), it is now possible to define a particular procedure to compute the complex Ocra Multitask Index (cOCRA) and the complex Checklist Score (cCHESCO) for the analysis of two or more repetitive tasks when rotations are infrequent (rotations every 1 1/2 hours or more). The result of this approach will be at least equal to the index of the most stressful task considered for its individual daily duration and at most equal to the index of the most stressful task when it is (only theoretically) considered as lasting for the overall daily duration of all examined repetitive tasks. The procedure is based on the following formula: complex Ocra Multitask Index = ocra1(DuM1) + (Δocra1 × K), where 1, 2, 3, ..., N = repetitive tasks ordered by ocra index values (1 = highest; N = lowest) computed considering respective real duration multipliers (DuM(i)); ocra1 = ocra index of
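
    Read as a formula, with the Δ term reconstructed from the bounds the abstract itself states (a sketch, not the authors' complete definition):

        \[
        \mathrm{cOCRA} \;=\; \mathrm{ocra}_1(\mathrm{DuM}_1) \;+\; \Delta\mathrm{ocra}_1 \cdot K,
        \qquad
        \Delta\mathrm{ocra}_1 \;=\; \mathrm{ocra}_1(\mathrm{DuM}_{tot}) \;-\; \mathrm{ocra}_1(\mathrm{DuM}_1)
        \]

    so that for 0 ≤ K ≤ 1 the index runs from ocra1(DuM1), the most stressful task at its own daily duration, up to ocra1(DuMtot), the most stressful task considered (only theoretically) as lasting for the overall daily duration of all tasks.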

  14. Prototype Task Network Model to Simulate the Analysis of Narrow Band Sonar Data and the Effects of Automation on Critical Operator Tasks

    Science.gov (United States)

    2016-06-07

    generate appropriate functions and tasks to be modelled. The Integrated Performance Modelling Environment (IPME) software was used to build a task...repeated, low-level cognitive analysis. This would free up operators to perform the more complex and critical tasks of analysing contact signatures...could be created by an automated function that deals with lines that are easy to detect and identify, thereby freeing the operator to focus on more

  15. Task versus relationship conflict, team performance and team member satisfaction: a meta-analysis

    NARCIS (Netherlands)

    de Dreu, C.K.W.; Weingart, L.R.

    2003-01-01

    This study provides a meta-analysis of research on the associations between relationship conflict, task conflict, team performance, and team member satisfaction. Consistent with past theorizing, results revealed strong and negative correlations between relationship conflict, team performance, and tea

  16. Dashboard Task Monitor for managing ATLAS user analysis on the Grid

    CERN Document Server

    Sargsyan, L; The ATLAS collaboration; Jha, M; Karavakis, E; Kokoszkiewicz, L; Saiz, P; Schovancova, J; Tuckett, D

    2013-01-01

    The organization of the distributed user analysis on the Worldwide LHC Computing Grid (WLCG) infrastructure is one of the most challenging tasks among the computing activities at the Large Hadron Collider. The Experiment Dashboard offers a solution that not only monitors but also manages (kill, resubmit) user tasks and jobs via a web interface. The ATLAS Dashboard Task Monitor provides analysis users with a tool that is operating system and GRID environment independent. This contribution describes the functionality of the application and its implementation details, in particular authentication, authorization and audit of the management operations.

  17. Dashboard Task Monitor for managing ATLAS user analysis on the Grid

    CERN Document Server

    Sargsyan, L; Jha, M; Karavakis, E; Kokoszkiewicz, L; Saiz, P; Schovancova, J; Tuckett, D

    2014-01-01

    The organization of the distributed user analysis on the Worldwide LHC Computing Grid (WLCG) infrastructure is one of the most challenging tasks among the computing activities at the Large Hadron Collider. The Experiment Dashboard offers a solution that not only monitors but also manages (kill, resubmit) user tasks and jobs via a web interface. The ATLAS Dashboard Task Monitor provides analysis users with a tool that is independent of the operating system and GRID environment. This contribution describes the functionality of the application and its implementation details, in particular authentication, authorization and audit of the management operations.

  18. Space and Earth Sciences, Computer Systems, and Scientific Data Analysis Support, Volume 1

    Science.gov (United States)

    Estes, Ronald H. (Editor)

    1993-01-01

    This Final Progress Report covers the specific technical activities of Hughes STX Corporation for the last contract triannual period of 1 June through 30 Sep. 1993, in support of assigned task activities at Goddard Space Flight Center (GSFC). It also provides a brief summary of work throughout the contract period of performance on each active task. Technical activity is presented in Volume 1, while financial and level-of-effort data is presented in Volume 2. Technical support was provided to all Division and Laboratories of Goddard's Space Sciences and Earth Sciences Directorates. Types of support include: scientific programming, systems programming, computer management, mission planning, scientific investigation, data analysis, data processing, data base creation and maintenance, instrumentation development, and management services. Mission and instruments supported include: ROSAT, Astro-D, BBXRT, XTE, AXAF, GRO, COBE, WIND, UIT, SMM, STIS, HEIDI, DE, URAP, CRRES, Voyagers, ISEE, San Marco, LAGEOS, TOPEX/Poseidon, Pioneer-Venus, Galileo, Cassini, Nimbus-7/TOMS, Meteor-3/TOMS, FIFE, BOREAS, TRMM, AVHRR, and Landsat. Accomplishments include: development of computing programs for mission science and data analysis, supercomputer applications support, computer network support, computational upgrades for data archival and analysis centers, end-to-end management for mission data flow, scientific modeling and results in the fields of space and Earth physics, planning and design of GSFC VO DAAC and VO IMS, fabrication, assembly, and testing of mission instrumentation, and design of mission operations center.

  19. Comparative analysis of operational forecasts versus actual weather conditions in airline flight planning, volume 2

    Science.gov (United States)

    Keitz, J. F.

    1982-01-01

    The impact of more timely and accurate weather data on airline flight planning with the emphasis on fuel savings is studied. This volume of the report discusses the results of Task 2 of the four major tasks included in the study. Task 2 compares various categories of flight plans and flight tracking data produced by a simulation system developed for the Federal Aviation Administration by SRI International. (Flight tracking data simulate actual flight tracks of all aircraft operating at a given time and provide for rerouting of flights as necessary to resolve traffic conflicts.) The comparisons of flight plans on the forecast to flight plans on the verifying analysis confirm Task 1 findings that wind speeds are generally underestimated. Comparisons involving flight tracking data indicate that actual fuel burn is always higher than planned, in either direction, and even when the same weather data set is used. Since the flight tracking model output results in more diversions than is known to be the case, it was concluded that there is an error in the flight tracking algorithm.

  20. IEA Wind Task 24 Integration of Wind and Hydropower Systems; Volume 1: Issues, Impacts, and Economics of Wind and Hydropower Integration

    Energy Technology Data Exchange (ETDEWEB)

    Acker, T.

    2011-12-01

    This report describes the background, concepts, issues and conclusions related to the feasibility of integrating wind and hydropower, as investigated by the members of IEA Wind Task 24. It is the result of a four-year effort involving seven IEA member countries and thirteen participating organizations. The companion report, Volume 2, describes in detail the study methodologies and participant case studies, and exists as a reference for this report.

  1. Task versus relationship conflict, team performance, and team member satisfaction: a meta-analysis.

    Science.gov (United States)

    De Dreu, Carsten K W; Weingart, Laurie R

    2003-08-01

    This study provides a meta-analysis of research on the associations between relationship conflict, task conflict, team performance, and team member satisfaction. Consistent with past theorizing, results revealed strong and negative correlations between relationship conflict, team performance, and team member satisfaction. In contrast to what has been suggested in both academic research and introductory textbooks, however, results also revealed strong and negative (instead of the predicted positive) correlations between task conflict, team performance, and team member satisfaction. As predicted, conflict had stronger negative relations with team performance in highly complex (decision making, project, mixed) than in less complex (production) tasks. Finally, task conflict was less negatively related to team performance when task conflict and relationship conflict were weakly, rather than strongly, correlated.

  2. Analysis of Spectral Features of EEG during four different Cognitive Tasks

    Directory of Open Access Journals (Sweden)

    S.BAGYARAJ

    2014-05-01

    Full Text Available Cognition is a group of information processing activities that involves visual attention, visual awareness, problem solving and decision making. Finding the cognitive-task-related regional cerebral activations is of great interest among researchers in cognitive neuroscience. In this study four different types of cognitive tasks, namely tracking pendulum movement and counting, red flash counting, sequential subtraction, and spot the difference, were performed by 32 subjects and the EEG signals were acquired using a 24-channel RMS EEG-32 Super Spec machine. The analyses of the EEG signals were done using well-known spectral methods. The band powers were calculated in the frequency domain by using the Welch method. The task-relax relative band power values and the ratios of theta band power/beta band power are the two variables used to find the regional cerebral activations during the four different cognitive tasks. The statistical paired t test is used to evaluate the significant difference between the particular task-related cerebral activations and relaxation. The statistical significance level is set at p < 0.05. During the tracking pendulum movement and counting task, the cerebral activations are found in bilateral prefrontal, frontal, right central and temporal regions. The red flash counting task has activations in bilateral prefrontal, frontal, right central, right parietal and right occipital lobes. Bilateral prefrontal regions are activated during the sequential subtraction task. The spot the difference task has activations in the left and right prefrontal cortex. The unique and common activation regions for the four selected cognitive tasks are found to be the left and right prefrontal cortex. The prefrontal lobe electrodes, namely Fp1 & Fp2, can be used as the recording electrodes for detailed cognitive task analysis, where cerebral activations are observed when compared with the other cerebral regions.
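
    The Welch band-power computation named above is straightforward to sketch. The following fragment is illustrative only: the sampling rate and band edges are assumptions, not values taken from the study.

```python
# Sketch of Welch band-power analysis for one EEG channel; sampling rate
# and band edges are assumed, since the record does not state them.
import numpy as np
from scipy.signal import welch

FS = 256  # sampling rate in Hz (assumption)
BANDS = {"theta": (4.0, 8.0), "beta": (13.0, 30.0)}

def band_powers(eeg, fs=FS):
    """Return absolute power per band for a 1-D EEG signal."""
    freqs, psd = welch(eeg, fs=fs, nperseg=fs * 2)  # Welch periodogram
    powers = {}
    for name, (lo, hi) in BANDS.items():
        mask = (freqs >= lo) & (freqs < hi)
        powers[name] = np.trapz(psd[mask], freqs[mask])  # integrate the PSD
    return powers

def theta_beta_ratio(eeg, fs=FS):
    """Theta/beta power ratio, one of the two study variables."""
    p = band_powers(eeg, fs)
    return p["theta"] / p["beta"]

# Demo on synthetic data standing in for one minute of EEG
rng = np.random.default_rng(0)
print(theta_beta_ratio(rng.standard_normal(FS * 60)))
```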

  3. Hydrothermal analysis in engineering using control volume finite element method

    CERN Document Server

    Sheikholeslami, Mohsen

    2015-01-01

    Control volume finite element methods (CVFEM) bridge the gap between finite difference and finite element methods, using the advantages of both methods for simulation of multi-physics problems in complex geometries. In Hydrothermal Analysis in Engineering Using Control Volume Finite Element Method, CVFEM is covered in detail and applied to key areas of thermal engineering. Examples, exercises, and extensive references are used to show the use of the technique to model key engineering problems such as heat transfer in nanofluids (to enhance performance and compactness of energy systems),

  4. Failure Analysis Seminar: Techniques and Teams. Seminar Notes. Volume I.

    Science.gov (United States)

    1981-01-01

    and Progress - Evaluate ... FAILURE ANALYSIS STRATEGY, Augustine E. Magistro. Introduction: A primary task of management and systems...by Augustine Magistro, Picatinny Arsenal, and Lawrence R. Seggel, U.S. Army Missile Command. The report is available from the National Technical...to emphasize techniques - Identification and improvement of your leadership styles. BIOGRAPHIC SKETCHES: A.E. "Gus" Magistro - Systems Evaluation

  5. Method for measuring anterior chamber volume by image analysis

    Science.gov (United States)

    Zhai, Gaoshou; Zhang, Junhong; Wang, Ruichang; Wang, Bingsong; Wang, Ningli

    2007-12-01

    Anterior chamber volume (ACV) is very important for an oculist to make a rational pathological diagnosis for patients who have optic diseases such as glaucoma, yet it is always difficult to measure accurately. In this paper, a method is devised to measure anterior chamber volumes based on JPEG-formatted image files that have been transformed from medical images using the anterior-chamber optical coherence tomographer (AC-OCT) and corresponding image-processing software. The corresponding algorithms for image analysis and ACV calculation are implemented in VC++, and a series of anterior chamber images of typical patients are analyzed, while anterior chamber volumes are calculated and verified to be in accord with clinical observation. It shows that the measurement method is effective and feasible and has the potential to improve the accuracy of ACV calculation. Meanwhile, some measures should be taken to simplify the manual preprocessing of the images.
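
    The abstract does not publish the VC++ pipeline, so the following is only a hedged sketch of one plausible volume-integration step, assuming a stack of already-segmented AC-OCT slices (binary masks of the chamber); all dimensions are invented.

```python
# Hypothetical slice-summation step for ACV: sum segmented cross-section
# areas over slices. Pixel size and slice spacing are illustrative.
import numpy as np

def chamber_volume(masks, pixel_area_mm2, slice_spacing_mm):
    """Volume in mm^3 from a (n_slices, h, w) stack of boolean masks."""
    masks = np.asarray(masks, bool)
    areas = masks.sum(axis=(1, 2)) * pixel_area_mm2  # area per slice
    return float(areas.sum() * slice_spacing_mm)      # integrate over depth

# Toy segmentation: a 24x24-pixel chamber across 20 slices
stack = np.zeros((20, 64, 64), bool)
stack[:, 20:44, 20:44] = True
print(chamber_volume(stack, pixel_area_mm2=0.01**2, slice_spacing_mm=0.1))
```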

  6. Micro analysis of fringe field formed inside LDA measuring volume

    Science.gov (United States)

    Ghosh, Abhijit; Nirala, A. K.

    2016-05-01

    In the present study we propose a technique for micro analysis of fringe field formed inside laser Doppler anemometry (LDA) measuring volume. Detailed knowledge of the fringe field obtained by this technique allows beam quality, alignment and fringe uniformity to be evaluated with greater precision and may be helpful for selection of an appropriate optical element for LDA system operation. A complete characterization of fringes formed at the measurement volume using conventional, as well as holographic optical elements, is presented. Results indicate the qualitative, as well as quantitative, improvement of fringes formed at the measurement volume by holographic optical elements. Hence, use of holographic optical elements in LDA systems may be advantageous for improving accuracy in the measurement.

  7. District heating and cooling systems for communities through power-plant retrofit and distribution network. Volume 2. Tasks 1-3. Final report. [Downtown Toledo steam system

    Energy Technology Data Exchange (ETDEWEB)

    Watt, J.R.; Sommerfield, G.A.

    1979-08-01

    Each of the tasks is described separately: Task 1 - Demonstration Team; Task 2 - Identify Thermal Energy Source(s) and Potential Service Area(s); and Task 3 - Energy Market Analysis. The purpose of the project is to establish and implement measures in the downtown Toledo steam system for conserving scarce fuel supplies through cogeneration, by retrofit of existing base- or intermediate-loaded electric-generating plants to provide for central heating and cooling systems, with the ultimate purpose of applying the results to other communities. For Task 1, Toledo Edison Company has organized a Demonstration Team (Battelle Columbus Laboratories; Stone and Webster; Ohio Dept. of Energy; Public Utilities Commission of Ohio; Toledo Metropolitan Area Council of Governments; and Toledo Edison) that it hopes has the expertise to evaluate the technical, legal, economic, and marketing issues related to the utilization of by-product heat from power generation to supply district heating and cooling services. Task 2 gives a complete technical description of the candidate plant(s), its thermodynamic cycle, role in load dispatch, ownership, and location. It is concluded that the Toledo steam distribution system can be the starting point for developing a new district-heating system to serve an expanding market. Battelle is a member of the team employed as a subcontractor to complete the energy market analysis. The work is summarized in Task 3. (MCW)

  8. Energy use in the marine transportation industry: Task III. Efficiency improvements; Task IV. Industry future. Final report, Volume IV. [Projections for year 2000

    Energy Technology Data Exchange (ETDEWEB)

    None

    1977-12-01

    Tasks III and IV measure the characteristics of potential research and development programs that could be applied to the maritime industry. It was necessary to identify potential operating scenarios for the maritime industry in the year 2000 and determine the energy consumption that would result given those scenarios. After the introductory chapter the operational, regulatory, and vessel-size scenarios for the year 2000 are developed in Chapter II. In Chapter III, future cargo flows and expected levels of energy use for the baseline 2000 projection are determined. In Chapter IV, the research and development programs are introduced into the future US flag fleet and the energy-savings potential associated with each is determined. The first four appendices (A through D) describe each of the generic technologies. The fifth appendix (E) contains the baseline operating and cost parameters against which 15 program areas were evaluated. (MCW)

  9. Task 11 - systems analysis of environmental management technologies

    Energy Technology Data Exchange (ETDEWEB)

    Musich, M.A.

    1997-06-01

    A review was conducted of three systems analysis (SA) studies performed by Lockheed Idaho Technologies Company (LITCO) on integrated thermal treatment systems (ITTs) and integrated nonthermal treatment systems (INTSs) for the remediation of mixed low-level waste (MLLW) stored throughout the U.S. Department of Energy (DOE) weapons complex. The review was performed by an independent team led by the Energy & Environment Research Center (EERC), including Science Applications International Corporation (SAIC), the Waste Policy Institute (WPI), and Virginia Tech.

  10. Trading Volume and Stock Indices: A Test of Technical Analysis

    Directory of Open Access Journals (Sweden)

    Paul Abbondante

    2010-01-01

    Full Text Available Problem statement: Technical analysis and its emphasis on trading volume has been used to analyze movements in individual stock prices and make investment recommendations to either buy or sell that stock. Little attention has been paid to investigating the relationship between trading volume and various stock indices. Approach: Since stock indices track overall stock market movements, trends in trading volume could be used to forecast future stock market trends. Instead of focusing only on individual stocks, this study will examine movements in major stock markets as a whole. Regression analysis was used to investigate the relationship between trading volume and five popular stock indices using daily data from January, 2000 to June, 2010. A lag of 5 days was used because this represents the prior week of trading volume. The total sample size ranges from 1,534 to 2,638 observations. Smaller samples were used to test the investment horizon that explains movements of the indices more completely. Results: The F statistics were significant for samples using 6 and 16 months of data. The F statistic was not significant using a sample of 1 month of data. This is surprising given the short term focus of technical analysis. The results indicate that above-average returns can be achieved using futures, options and exchange traded funds which track these indices. Conclusion: Future research efforts will include out-of-sample forecasting to determine if above-average returns can be achieved. Additional research can be conducted to determine the optimal number of lags for each index.
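
    The 5-day lag design described above translates directly into an OLS regression. The sketch below uses synthetic stand-in series; real inputs would be the daily closes and volumes of the five indices studied.

```python
# Regress the daily index change on the previous five days of trading
# volume, mirroring the record's lag design. Data are synthetic stand-ins.
import numpy as np
import statsmodels.api as sm

def lagged_volume_regression(close, volume, n_lags=5):
    """OLS of the daily index change on the prior n_lags days of volume."""
    close = np.asarray(close, float)
    volume = np.asarray(volume, float)
    T = len(close)
    y = close[n_lags:] - close[n_lags - 1 : -1]          # change on day t
    X = np.column_stack([volume[n_lags - k - 1 : T - k - 1]
                         for k in range(n_lags)])        # volume lags 1..5
    return sm.OLS(y, sm.add_constant(X)).fit()

rng = np.random.default_rng(0)
close = np.cumsum(rng.standard_normal(1500)) + 100.0     # fake index level
volume = rng.lognormal(mean=10.0, sigma=0.3, size=1500)  # fake volume
model = lagged_volume_regression(close, volume)
print(model.fvalue, model.f_pvalue)  # overall significance of the 5 lags
```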

  11. Cognitive Task Analysis (l’Analyse des taches cognitives)

    Science.gov (United States)

    2000-10-01

    (piloting, guidance, management of chemical products); today, CTA is mainly used for decision-making tasks such as...the CTA community to be able to demonstrate the added value of CTA. Thus, the analyst goes beyond simple observations and submits his ideas to...

  12. Latent human error analysis and efficient improvement strategies by fuzzy TOPSIS in aviation maintenance tasks.

    Science.gov (United States)

    Chiu, Ming-Chuan; Hsieh, Min-Chih

    2016-05-01

    The purposes of this study were to develop a latent human error analysis process, to explore the factors of latent human error in aviation maintenance tasks, and to provide an efficient improvement strategy for addressing those errors. First, we used HFACS and RCA to define the error factors related to aviation maintenance tasks. Fuzzy TOPSIS with four criteria was applied to evaluate the error factors. Results show that 1) adverse physiological states, 2) physical/mental limitations, and 3) coordination, communication, and planning are the factors related to airline maintenance tasks that could be addressed easily and efficiently. This research establishes a new analytic process for investigating latent human error and provides a strategy for analyzing human error using fuzzy TOPSIS. Our analysis process complements shortages in existing methodologies by incorporating improvement efficiency, and it enhances the depth and broadness of human error analysis methodology.
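
    As a concrete illustration of the ranking step named above, here is a compact triangular-fuzzy TOPSIS in the style of the standard method. The error factors, criteria weights, and ratings are invented placeholders, and the study's exact variant may differ in its normalization and aggregation choices.

```python
# Triangular-fuzzy TOPSIS sketch: rank alternatives by closeness to the
# fuzzy ideal. Ratings/weights are (l, m, u) triangular fuzzy numbers.
import numpy as np

def fuzzy_topsis(ratings, weights):
    """ratings: (n_alt, n_crit, 3); weights: (n_crit, 3). Returns scores."""
    r = np.asarray(ratings, float)
    w = np.asarray(weights, float)
    r = r / r[:, :, 2].max(axis=0)[None, :, None]   # normalize benefit criteria
    v = r * w[None, :, :]                           # weighted normalized matrix
    fpis, fnis = np.ones(3), np.zeros(3)            # fuzzy ideal / anti-ideal
    d = lambda a, b: np.sqrt(((a - b) ** 2).mean(axis=-1))  # vertex distance
    d_plus = d(v, fpis).sum(axis=1)
    d_minus = d(v, fnis).sum(axis=1)
    return d_minus / (d_plus + d_minus)             # closeness coefficient

# Three hypothetical latent-error factors rated on four criteria
ratings = [[[3, 5, 7]] * 4, [[5, 7, 9]] * 4, [[1, 3, 5]] * 4]
weights = [[0.5, 0.7, 0.9]] * 4
print(fuzzy_topsis(ratings, weights))  # higher = more attractive to address
```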

  13. Extending hierarchical task analysis to identify cognitive demands and information design requirements.

    Science.gov (United States)

    Phipps, Denham L; Meakin, George H; Beatty, Paul C W

    2011-07-01

    While hierarchical task analysis (HTA) is well established as a general task analysis method, there appears to be a need to make more explicit both the cognitive elements of a task and the design requirements that arise from an analysis. One way of achieving this is to make use of extensions to the standard HTA. The aim of the current study is to evaluate the use of two such extensions--the sub-goal template (SGT) and the skills-rules-knowledge (SRK) framework--to analyse the cognitive activity that takes place during the planning and delivery of anaesthesia. In quantitative terms, the two methods were found to have relatively poor inter-rater reliability; however, qualitative evidence suggests that the two methods were nevertheless of value in generating insights about anaesthetists' information handling and cognitive performance. Implications for the use of an extended HTA to analyse work systems are discussed.

  14. Workflow Modelling and Analysis Based on the Construction of Task Models

    Directory of Open Access Journals (Sweden)

    Glória Cravo

    2015-01-01

    Full Text Available In this paper, we describe the structure of a workflow as a graph whose vertices represent tasks and whose arcs are associated with workflow transitions. To each task an input/output logic operator is associated. Furthermore, we associate a Boolean term with each transition present in the workflow. We also identify the structure of workflows and describe their dynamism through the construction of new task models. This construction is very simple and intuitive, since it is based on the analysis of all tasks present in the workflow, which allows us to describe the dynamism of the workflow very easily. So, our approach has the advantage of being very intuitive, which is an important highlight of our work. We also introduce the concept of logical termination of workflows and provide conditions under which this property is valid. Finally, we provide a counter-example which shows that a conjecture presented in a previous article is false.
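
    A minimal data structure for this task-model view might look as follows. The termination check shown is only a plain reachability sketch, not the paper's full logical-termination condition, and all names are invented for the illustration.

```python
# Workflow as a graph: vertices are tasks with input/output logic
# operators; arcs carry Boolean terms. Reachability is a crude stand-in
# for the paper's logical-termination property.
from dataclasses import dataclass, field

@dataclass
class Task:
    name: str
    join: str = "AND"    # input logic operator: AND / OR / XOR
    split: str = "AND"   # output logic operator
    successors: list = field(default_factory=list)  # (task_name, bool_term)

def reaches_output(tasks, start, end):
    """True if the output task is reachable from the input task."""
    seen, stack = set(), [start]
    while stack:
        cur = stack.pop()
        if cur == end:
            return True
        if cur in seen:
            continue
        seen.add(cur)
        stack.extend(name for name, _ in tasks[cur].successors)
    return False

# Tiny example workflow: t1 -> (t2 XOR t3) -> t4
tasks = {
    "t1": Task("t1", split="XOR", successors=[("t2", "c"), ("t3", "not c")]),
    "t2": Task("t2", successors=[("t4", "true")]),
    "t3": Task("t3", successors=[("t4", "true")]),
    "t4": Task("t4"),
}
print(reaches_output(tasks, "t1", "t4"))  # True
```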

  15. Analysis of Member State RED implementation. Final Report (Task 2)

    Energy Technology Data Exchange (ETDEWEB)

    Peters, D.; Alberici, S.; Toop, G. [Ecofys, Utrecht (Netherlands); Kretschmer, B. [Institute for European Environmental Policy IEEP, London (United Kingdom)

    2012-12-15

    This report describes the way EU Member States have transposed the sustainability and chain-of-custody requirements for biofuels as laid down in the Renewable Energy Directive (RED) and Fuel Quality Directive (FQD). In the assessment of Member States' implementation, the report mainly focuses on effectiveness and administrative burden. Have Member States transposed the Directives in such a way that compliance with the sustainability criteria can be ensured as effectively as possible? To what extent does the Member States' implementation lead to unnecessary administrative burden for economic operators in the (bio)fuel supply chain? The report focuses specifically on the transposition of the sustainability and chain-of-custody requirements, not on the target for renewables in transport. This means that, for example, the double counting provision is not included as part of the scope of this report. This report starts with an introduction covering the implementation of the Renewable Energy (and Fuel Quality) Directive into national legislation, the methodology by which Member States were assessed against effectiveness and administrative burden, and the categorisation of Member States' national systems for RED implementation (Chapter 1). The report continues with a high-level description of each Member State system assessed (Chapter 2). Following this, the report includes analysis of the Member States on the effectiveness and administrative burden of a number of key ('major') measures (Chapter 3). The final chapter presents the conclusions and recommendations (Chapter 4).

  16. Performance monitoring and analysis of task-based OpenMP.

    Directory of Open Access Journals (Sweden)

    Yi Ding

    Full Text Available OpenMP, a typical shared memory programming paradigm, has been extensively applied in high performance computing community due to the popularity of multicore architectures in recent years. The most significant feature of the OpenMP 3.0 specification is the introduction of the task constructs to express parallelism at a much finer level of detail. This feature, however, has posed new challenges for performance monitoring and analysis. In particular, task creation is separated from its execution, causing the traditional monitoring methods to be ineffective. This paper presents a mechanism to monitor task-based OpenMP programs with interposition and proposes two demonstration graphs for performance analysis as well. The results of two experiments are discussed to evaluate the overhead of monitoring mechanism and to verify the effects of demonstration graphs using the BOTS benchmarks.

  17. Performance monitoring and analysis of task-based OpenMP.

    Science.gov (United States)

    Ding, Yi; Hu, Kai; Wu, Kai; Zhao, Zhenlong

    2013-01-01

    OpenMP, a typical shared memory programming paradigm, has been extensively applied in high performance computing community due to the popularity of multicore architectures in recent years. The most significant feature of the OpenMP 3.0 specification is the introduction of the task constructs to express parallelism at a much finer level of detail. This feature, however, has posed new challenges for performance monitoring and analysis. In particular, task creation is separated from its execution, causing the traditional monitoring methods to be ineffective. This paper presents a mechanism to monitor task-based OpenMP programs with interposition and proposes two demonstration graphs for performance analysis as well. The results of two experiments are discussed to evaluate the overhead of monitoring mechanism and to verify the effects of demonstration graphs using the BOTS benchmarks.
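
    The mechanism in this record targets OpenMP's task construct in C/C++. As a language-neutral sketch of the same idea (interposing separately on task creation and task execution, since the paper notes the two are decoupled), the following Python fragment logs queueing delay and run time per task; all names here are invented for the illustration.

```python
# Interposition sketch: record task creation and execution separately,
# then report per-task queueing delay and run time.
import time
from concurrent.futures import ThreadPoolExecutor

LOG = []

def monitored(fn):
    """Wrap a task function so creation and execution are timed apart."""
    def submit(pool, *args):
        created = time.perf_counter()          # interpose on task creation
        def run():
            started = time.perf_counter()      # interpose on task execution
            result = fn(*args)
            LOG.append((fn.__name__, created, started, time.perf_counter()))
            return result
        return pool.submit(run)
    return submit

def work(n):
    return sum(i * i for i in range(n))

with ThreadPoolExecutor(max_workers=4) as pool:
    futures = [monitored(work)(pool, 10_000) for _ in range(8)]
    [f.result() for f in futures]

for name, created, started, done in LOG:
    print(f"{name}: queued {started - created:.6f}s, ran {done - started:.6f}s")
```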

  18. Chaos and Fractal Analysis of Electroencephalogram Signals during Different Imaginary Motor Movement Tasks

    Science.gov (United States)

    Soe, Ni Ni; Nakagawa, Masahiro

    2008-04-01

    This paper presents a novel approach to evaluating the effects of different motor activation tasks on the human electroencephalogram (EEG). Applications of chaos and fractal properties, which are among the most important tools in nonlinear analysis, are presented for four EEG tasks during real and imaginary motor movement. Three subjects, aged 23-30 years, participated in the experiment. Correlation dimension (D2), Lyapunov spectrum (λi), and Lyapunov dimension (DL) were estimated to characterize the movement-related EEG signals. Experimental results show that these nonlinear measures are good discriminators of EEG signals. There are significant differences in all conditions of the subjective task. The fractal dimension appeared to be higher in movement conditions compared to the baseline condition. It is concluded that chaos and fractal analysis could be powerful methods in investigating brain activities during motor movements.
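
    A correlation-dimension (D2) estimate in the Grassberger-Procaccia style can be sketched as below. The embedding dimension and delay are illustrative assumptions; the study's parameter choices are not given in the record.

```python
# D2 sketch: time-delay embedding, correlation integral C(r), and the
# slope of log C(r) vs log r as the dimension estimate.
import numpy as np
from scipy.spatial.distance import pdist

def correlation_dimension(x, dim=10, tau=5, n_radii=12):
    """Estimate D2 from a scalar time series via delay embedding."""
    x = np.asarray(x, float)
    n = len(x) - (dim - 1) * tau
    emb = np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])
    d = pdist(emb)                                    # all pairwise distances
    radii = np.logspace(np.log10(d.min() + 1e-12), np.log10(d.max()), n_radii)
    C = np.array([(d < r).mean() for r in radii])     # correlation integral
    good = C > 0
    slope, _ = np.polyfit(np.log(radii[good]), np.log(C[good]), 1)
    return slope

rng = np.random.default_rng(1)
print(correlation_dimension(rng.standard_normal(600)))  # noise: D2 is high
```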

  19. An analysis of physiological signals as a measure of task engagement in a multi-limb-coordination motor-learning task.

    Science.gov (United States)

    Murray, Spencer A; Goldfarb, Michael

    2015-01-01

    There is widespread agreement in the physical rehabilitation community that task engagement is essential to effective neuromuscular recovery. Despite this, there are no clear measures of such task engagement. This paper assesses the extent to which certain physiological measurements might provide a measure of task engagement. In previous studies, correlations between mental focus and certain physiological measurements have been observed in subjects performing tasks requiring mental effort. In this study, the authors analyzed whether these signals showed similar correlation when subjects performed a multi-limb-coordination motor-learning task. Subjects played a video game which required the use of both arms and one leg to play a simplified electronic drum set with varying difficulty. Heart rate (HR), skin conductance level (SCL), and facial electromyogram (EMG) were recorded while the subjects played. Analysis of the recordings showed statistically significant correlations relating task difficulty to SCL, HR and EMG amplitude in corrugator supercilii. No statistically significant correlation was observed between task difficulty and EMG in frontalis.

  20. Task Analysis of Emergency Operating Procedures for Generating Quantitative HRA Data

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Yochan; Park, Jinkyun; Kim, Seunghwan; Choi, Sun Yeong; Jung, Wondea; Jang, Inseok [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-10-15

    In this paper, the analysis results of the emergency task in the procedures (EOPs; emergency operating procedures) that can be observed from the simulator data are introduced. The task type, component type, system type, and additional information related with the performance of the operators were described. In addition, a prospective application of the analyzed information to HEP quantification process was discussed. In the probabilistic safety analysis (PSA) field, various human reliability analyses (HRAs) have been performed to produce estimates of human error probabilities (HEPs) for significant tasks in complex socio-technical systems. To this end, Many HRA methods have provided basic or nominal HEPs for typical tasks and the quantitative relations describing how a certain performance context or performance shaping factors (PSFs) affects the HEPs. In the HRA community, however, the necessity of appropriate and sufficient human performance data has been recently indicated. This is because a wide range of quantitative estimates in the previous HRA methods are not supported by solid empirical bases. Hence, there have been attempts to collect HRA supporting data. For example, KAERI has started to collect information on both unsafe acts of operators and the relevant PSFs. A characteristic of the database that is being developed at KAERI is that human errors and related PSF surrogates that can be objectively observable are collected from full-scope simulator experiences. In this environment, to produce concretely grounded bases of the HEPs, the traits or attributes of tasks where significant human errors can be observed should be definitely determined. The determined traits should be applicable to compare the HEPs on the traits with the data in previous HRA methods or databases. In this study, task characteristics in a Westinghouse type of EOPs were analyzed with the defining task, component, and system taxonomies.

  1. Task Recognition and Person Identification in Cyclic Dance Sequences with Multi Factor Tensor Analysis

    Science.gov (United States)

    Perera, Manoj; Shiratori, Takaaki; Kudoh, Shunsuke; Nakazawa, Atsushi; Ikeuchi, Katsushi

    In this paper, we present a novel approach to recognize motion styles and identify people using the Multi Factor Tensor (MFT) model. We apply a musical information analysis method in segmenting the motion sequence relevant to the keyposes and the musical rhythm. We define a task model by considering the repeated motion segments, where the motion is decomposed into a person-invariant factor task and a person-dependent factor style. Given the motion data set, we formulate the MFT model, factorize it efficiently in different modes, and use it in recognizing the tasks and the identities of the persons performing the tasks. We capture the motion data of different people for a few cycles, segment it using the musical analysis approach, normalize the segments using a vectorization method, and realize our MFT model. In our experiments, Japanese traditional dance sequences performed by several people are used. Provided with an unknown motion segment which is to be probed and which was performed at a different time in the time space, we first normalize the motion segment and flatten our MFT model appropriately, then recognize the task and the identity of the person. We follow two approaches in conducting our experiments. In one experiment, we recognize the tasks and the styles by maximizing a function in the tensor subdomain, and in the next experiment, we use a function value in the tensorial subdomain with a threshold for recognition. Interestingly, unlike the first experiment, we are capable of recognizing tasks and human identities that were not known beforehand. We conducted various experiments to evaluate the potential of the recognition ability of our proposed approaches, and the results demonstrate the high accuracy of our model.
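
    The flattening step of the MFT model amounts to mode-n unfolding of a data tensor followed by factorization. The sketch below shows that step only, on invented dimensions (tasks x styles x features); the paper's full factorization and matching functions are not reproduced here.

```python
# Mode-n unfolding of a (task x style x feature) tensor, with a per-mode
# SVD giving the subspaces used for recognition. Data are synthetic.
import numpy as np

def unfold(tensor, mode):
    """Mode-n unfolding: move `mode` to the front, flatten the rest."""
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

rng = np.random.default_rng(2)
data = rng.standard_normal((4, 6, 120))   # 4 tasks, 6 people, 120 features
for mode, name in enumerate(["task", "style", "feature"]):
    U, s, _ = np.linalg.svd(unfold(data, mode), full_matrices=False)
    print(name, "basis:", U.shape)
```

    A probe segment would then be projected onto the task and style bases and assigned by maximizing the fit, in the spirit of the two recognition experiments described above.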

  2. CancellationTools: All-in-one software for administration and analysis of cancellation tasks.

    Science.gov (United States)

    Dalmaijer, Edwin S; Van der Stigchel, Stefan; Nijboer, Tanja C W; Cornelissen, Tim H W; Husain, Masud

    2015-12-01

    In a cancellation task, a participant is required to search for and cross out ("cancel") targets, which are usually embedded among distractor stimuli. The number of cancelled targets and their location can be used to diagnose the neglect syndrome after stroke. In addition, the organization of search provides a potentially useful way to measure executive control over multitarget search. Although many useful cancellation measures have been introduced, most fail to make their way into research studies and clinical practice due to the practical difficulty of acquiring such parameters from traditional pen-and-paper measures. Here we present new, open-source software that is freely available to all. It allows researchers and clinicians to flexibly administer computerized cancellation tasks using stimuli of their choice, and to directly analyze the data in a convenient manner. The automated analysis suite provides output that includes almost all of the currently existing measures, as well as several new ones introduced here. All tasks can be performed using either a computer mouse or a touchscreen as an input device, and an online version of the task runtime is available for tablet devices. A summary of the results is produced in a single A4-sized PDF document, including high quality data visualizations. For research purposes, batch analysis of large datasets is possible. In sum, CancellationTools allows users to employ a flexible, computerized cancellation task, which provides extensive benefits and ease of use.

  3. The Use of Cognitive Task Analysis to Capture Expertise for Tracheal Extubation Training in Anesthesiology

    Science.gov (United States)

    Embrey, Karen K.

    2012-01-01

    Cognitive task analysis (CTA) is a knowledge elicitation technique employed for acquiring expertise from domain specialists to support the effective instruction of novices. CTA guided instruction has proven effective in improving surgical skills training for medical students and surgical residents. The standard, current method of teaching clinical…

  4. Noticing in Task Performance and Learning Outcomes: A Qualitative Analysis of Instructional Effects in Interlanguage Pragmatics

    Science.gov (United States)

    Takahashi, Satomi

    2005-01-01

    This study aims to provide an in-depth qualitative analysis of instructional effects in L2 pragmatics by exploring the manner in which Japanese EFL learners' noticing of target English request forms is constrained by different types of treatment tasks and the subsequent effect of the learners' noticing on their learning outcomes. Following the…

  5. Formative Research on the Simplifying Conditions Method (SCM) for Task Analysis and Sequencing.

    Science.gov (United States)

    Kim, YoungHwan; Reigluth, Charles M.

    The Simplifying Conditions Method (SCM) is a set of guidelines for task analysis and sequencing of instructional content under the Elaboration Theory (ET). This article introduces the fundamentals of SCM and presents the findings from a formative research study on SCM. It was conducted in two distinct phases: design and instruction. In the first…

  6. Boundary error analysis and categorization in the TRECVID news story segmentation task

    NARCIS (Netherlands)

    Arlandis, J.; Over, P.; Kraaij, W.

    2005-01-01

    In this paper, an error analysis based on boundary error popularity (frequency) including semantic boundary categorization is applied in the context of the news story segmentation task from TRECVID. Clusters of systems were defined based on the input resources they used including video, audio and a

  7. Using task analysis to generate evidence for strengthening midwifery education, practice, and regulation in Ethiopia

    NARCIS (Netherlands)

    Yigzaw, Tegbar; Carr, Catherine; Stekelenburg, Jelle; van Roosmalen, Jos; Gibson, Hannah; Gelagay, Mintwab; Admassu, Azeb

    2016-01-01

    PURPOSE: Realizing aspirations for meeting the global reproductive, maternal, newborn, and child health goals depends not only on increasing the numbers but also on improving the capability of the midwifery workforce. We conducted a task analysis study to identify the needs for strengthening the midwife

  8. Analysis of Operators Comments on the PSF Questionnaire of the Task Complexity Experiment 2003/2004

    Energy Technology Data Exchange (ETDEWEB)

    Torralba, B.; Martinez-Arias, R.

    2007-07-01

    Human Reliability Analysis (HRA) methods usually take into account the effect of Performance Shaping Factors (PSF). Therefore, the adequate treatment of PSFs in HRA of Probabilistic Safety Assessment (PSA) models has crucial importance. There is an important need for collecting PSF data based on simulator experiments. During the task complexity experiment 2003-2004, carried out in the BWR simulator of the Halden Man-Machine Laboratory (HAMMLAB), data on PSFs were collected by means of a PSF Questionnaire. Seven crews (composed of shift supervisor, reactor operator and turbine operator) from Swedish Nuclear Power Plants participated in the experiment. The PSF Questionnaire collected data on the factors: procedures, training and experience, indications, controls, team management, team communication, individual work practice, available time for the tasks, number of tasks or information load, masking and seriousness. The main statistically significant results are presented in 'Performance Shaping Factors data collection and analysis of the task complexity experiment 2003/2004' (HWR-810). The analysis of the comments about PSFs, which were provided by operators on the PSF Questionnaire, is described. The comments provided for each PSF across the scenarios have been summarised using a content analysis technique. (Author)

  9. Low Probability Tail Event Analysis and Mitigation in BPA Control Area: Task 2 Report

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Shuai; Makarov, Yuri V.; McKinstry, Craig A.; Brothers, Alan J.; Jin, Shuangshuang

    2009-09-18

    Task report detailing low probability tail event analysis and mitigation in BPA control area. Tail event refers to the situation in a power system when unfavorable forecast errors of load and wind are superposed onto fast load and wind ramps, or non-wind generators falling short of scheduled output, causing the imbalance between generation and load to become very significant.

  10. Behavior of testosterone and cortisol during an intensity-controlled high-volume training period measured by a training task-specific test in men rowers.

    Science.gov (United States)

    Rämson, Raul; Jürimäe, Jaak; Jürimäe, Toivo; Mäestu, Jarek

    2009-03-01

    The aim of this study was to investigate changes in the stress hormones testosterone and cortisol after a task-specific exercise during a high-volume endurance training cycle in men rowers. Eight highly trained men rowers were investigated during a high-volume, low-intensity training period. A 2-hour, low-intensity, long-distance rowing (LDT) test was conducted at baseline, after a high-volume period, and after the recovery period. Training and performance intensities were obtained at the graded incremental test, were preset individually, and were the same during all LDTs. Fasting blood samples were taken during the same days as the LDTs. Exercise-induced blood samples were taken before, 5 minutes after, and 30 minutes after (post 30') the completion of each LDT. There were no significant changes in fasting cortisol and testosterone values during the whole study period, and there were no significant changes in cortisol and testosterone concentrations during the LDT. However, testosterone concentration was significantly decreased at post 30' compared with posttest values during the second LDT that was held after the 2-week high-training-volume period, and, during the second LDT, post 30' values of cortisol tended to be decreased compared with posttest values (p = 0.063). In conclusion, changes in the concentrations of testosterone and cortisol after long-distance rowing indicate decreased adaptivity after the training-specific performance test.

  11. Using task analysis to generate evidence for strengthening midwifery education, practice, and regulation in Ethiopia

    Directory of Open Access Journals (Sweden)

    Yigzaw T

    2016-05-01

    Full Text Available Tegbar Yigzaw,1 Catherine Carr,2 Jelle Stekelenburg,3,4 Jos van Roosmalen,5 Hannah Gibson,1 Mintwab Gelagay,1 Azeb Admassu6 1Jhpiego, Addis Ababa, Ethiopia; 2Jhpiego, Washington DC, USA; 3Department of Obstetrics and Gynecology, Leeuwarden Medical Centre, Leeuwarden, 4Department of Health Sciences, Global Health, University Medical Centre Groningen, University of Groningen, Groningen, 5Faculty of Earth and Life Sciences, Vrije Universiteit, Amsterdam, the Netherlands; 6Federal Ministry of Health, Addis Ababa, Ethiopia Purpose: Realizing aspirations for meeting the global reproductive, maternal, newborn, and child health goals depends not only on increasing the numbers but also on improving the capability of midwifery workforce. We conducted a task analysis study to identify the needs for strengthening the midwifery workforce in Ethiopia. Methods: We conducted a cross-sectional study of recently qualified midwives in Ethiopia. Purposively selected participants from representative geographic and practice settings completed a self-administered questionnaire, making judgments about the frequency of performance, criticality, competence, and location of training for a list of validated midwifery tasks. Using Statistical Package for the Social Sciences, Version 20, we computed the percentages and averages to describe participant and practice characteristics. We identified priority preservice education gaps by considering the tasks least frequently learned in preservice, most frequently mentioned for not being trained, and had the highest not capable response. Identification of top priorities for in-service training considered tasks with highest “not capable” and “never” done responses. We determined the licensing exam blueprint by weighing the composite mean scores for frequency and criticality variables and expert rating across practice categories. Results: One hundred and thirty-eight midwives participated in the study. The majority of

  12. Combat History Analysis Study Effort (CHASE) Data Enhancement Study (CDES). Volume 3. Task 2 and Task 3

    Science.gov (United States)

    1986-01-31

    [Garbled OCR fragment of a strength/casualty data table covering the battles of Talavera (#128), Bussaco (#129), and Fuentes de Onoro (#130); only scattered figures (e.g., 65,900 and 51,910 engaged) are recoverable.]

  13. Combat History Analysis Study Effort (CHASE) Data Enhancement Study (CDES). Volume 4. Task 4 and Task 5

    Science.gov (United States)

    1986-01-31

    [Garbled OCR fragment of timing and width-of-front data tables for Wagram (#127), Talavera (#128), and Bussaco (#129), with source citations: The Raab (#126), Chandler, Dictionary, p. 355; Wagram (#127), Chandler, Napoleon, pp. 719, 722-728; Talavera (#128), Napier, Vol. 11:171-172, 175, 178.]

  14. Grcarma: A fully automated task-oriented interface for the analysis of molecular dynamics trajectories.

    Science.gov (United States)

    Koukos, Panagiotis I; Glykos, Nicholas M

    2013-10-05

    We report the availability of grcarma, a program encoding for a fully automated set of tasks aiming to simplify the analysis of molecular dynamics trajectories of biological macromolecules. It is a cross-platform, Perl/Tk-based front-end to the program carma and is designed to facilitate the needs of the novice as well as those of the expert user, while at the same time maintaining a user-friendly and intuitive design. Particular emphasis was given to the automation of several tedious tasks, such as extraction of clusters of structures based on dihedral and Cartesian principal component analysis, secondary structure analysis, calculation and display of root-mean-square deviation (RMSD) matrices, calculation of entropy, calculation and analysis of variance-covariance matrices, calculation of the fraction of native contacts, etc. The program is free, open-source software available immediately for download.

  15. Development of an advanced, continuous mild gasification process for the production of co-products (Task 4. 7), Volume 3

    Energy Technology Data Exchange (ETDEWEB)

    Knight, R.A.; Gissy, J.L.; Onischak, M.; Babu, S.P.; Carty, R.H. (Institute of Gas Technology, Chicago, IL (United States)); Duthie, R.G. (Bechtel Group, Inc., San Francisco, CA (United States)); Wootten, J.M. (Peabody Holding Co., Inc., St. Louis, MO (United States))

    1991-09-01

    The focus of this task is the preparation of (1) preliminary piping and instrument diagrams (P&IDs) and single-line electrical diagrams for a site-specific conceptual design and (2) a factored cost estimate for a 24 ton/day (tpd) capacity mild gasification process development unit (PDU) and an associated form coke preparation PDU. The intended site for this facility is the Illinois Coal Development Park at Carterville, Illinois, which is operated by Southern Illinois University at Carbondale. (VC)

  16. The Defense Science Board 1998 Summer Study Task Force on DoD Logistics Transformation. Volume 1: Final Report

    Science.gov (United States)

    1998-12-01

    Summer Study was tasked to recommend actions to be taken that achieve a true transformation, not marginal improvements to the U.S. military logistics system...frequently constrains operations and drains scarce resources needed for force modernization; (4) Failure to seamlessly blend military logistics with...preserving its capability for early, then continuous, application of dominant control effects across the full spectrum of conflict; (2) The military

  17. Analysis of automated highway system risks and uncertainties. Volume 5

    Energy Technology Data Exchange (ETDEWEB)

    Sicherman, A.

    1994-10-01

    This volume describes a risk analysis performed to help identify important Automated Highway System (AHS) deployment uncertainties and quantify their effect on costs and benefits for a range of AHS deployment scenarios. The analysis identified a suite of key factors affecting vehicle and roadway costs, capacities and market penetrations for alternative AHS deployment scenarios. A systematic protocol was utilized for obtaining expert judgments of key factor uncertainties in the form of subjective probability percentile assessments. Based on these assessments, probability distributions on vehicle and roadway costs, capacity and market penetration were developed for the different scenarios. The cost/benefit risk methodology and analysis provide insights by showing how uncertainties in key factors translate into uncertainties in summary cost/benefit indices.
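
    The propagation step described above (expert percentile assessments turned into probability distributions on costs and benefits) can be sketched with a simple Monte Carlo pass. The percentile values, factor names, and the benefit-cost index below are invented for illustration, and fitting a triangular distribution to three percentiles is a deliberately crude stand-in for the study's protocol.

```python
# Monte Carlo propagation of subjective percentile assessments to a toy
# benefit-cost index. All numbers are illustrative placeholders.
import numpy as np

rng = np.random.default_rng(4)

def sample_from_percentiles(p10, p50, p90, size):
    """Crude triangular fit to three subjective percentile assessments."""
    return rng.triangular(p10, p50, p90, size)

n = 100_000
vehicle_cost = sample_from_percentiles(2.0, 3.5, 6.0, n)   # $k per vehicle
capacity_gain = sample_from_percentiles(1.5, 2.2, 3.0, n)  # x baseline flow
benefit_cost = capacity_gain * 1.8 / vehicle_cost          # toy index
print(np.percentile(benefit_cost, [10, 50, 90]))           # uncertainty band
```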

  18. Synfuel program analysis. Volume I. Procedures-capabilities

    Energy Technology Data Exchange (ETDEWEB)

    Muddiman, J. B.; Whelan, J. W.

    1980-07-01

    This is the first of the two volumes describing the analytic procedures and resulting capabilities developed by Resource Applications (RA) for examining the economic viability, public costs, and national benefits of alternative synfuel projects and integrated programs. This volume is intended for Department of Energy (DOE) and Synthetic Fuel Corporation (SFC) program management personnel and includes a general description of the costing, venture, and portfolio models with enough detail for the reader to be able to specifiy cases and interpret outputs. It also contains an explicit description (with examples) of the types of results which can be obtained when applied to: the analysis of individual projects; the analysis of input uncertainty, i.e., risk; and the analysis of portfolios of such projects, including varying technology mixes and buildup schedules. In all cases, the objective is to obtain, on the one hand, comparative measures of private investment requirements and expected returns (under differing public policies) as they affect the private decision to proceed, and, on the other, public costs and national benefits as they affect public decisions to participate (in what form, in what areas, and to what extent).

  19. Analysis of working postures in hammering tasks on building construction sites using the computerized OWAS method.

    Science.gov (United States)

    Mattila, M; Karwowski, W; Vilkki, M

    1993-12-01

    The main objectives of this study were to identify the most problematic postures in hammering tasks performed at building construction sites through application of the computerized OWAS method, and to develop recommendations for improvement of working method and workplaces. Eighteen construction workers, with mean age of 41.6, from three construction companies participated in the field study. The hammering tasks observed during the two-month period included roof boarding, concrete form preparation, clamping support braces, assembling roof frames, roof joisting, shelter form preparation, and fixing fork clamps. Three different types of hammer, including a small Fiskar's hammer, a Fiskar's construction hammer, and a Rocket hammer, were used by the workers. Of all the observations, poor working postures were observed most frequently in roof joisting (12.4% of all observations within the task), followed by concrete form preparation (8.6%), and construction of frames for the roof (7.5%). Overall, out of 593 different postures analysed, a total of 7.8% of postures adopted by the workers during various hammering tasks were classified into OWAS categories III or IV, indicating that these postures should be corrected either soon or immediately. The computerized OWAS method for postural data analysis proved to be a very useful way to reduce postural load of dynamic hammering tasks, and allowed for efficient application of the original OWAS method.

  20. Does Complexity Matter? Meta-Analysis of Learner Performance in Artificial Grammar Tasks

    Directory of Open Access Journals (Sweden)

    Rachel Schiff

    2014-09-01

    Full Text Available Complexity has been shown to affect performance on artificial grammar learning (AGL) tasks (categorization of test items as grammatical/ungrammatical according to the implicitly trained grammar rules). However, previously published AGL experiments did not utilize consistent measures to investigate the comprehensive effect of grammar complexity on task performance. The present study focused on computerizing Bollt and Jones's (2000) technique of calculating topological entropy (TE), a quantitative measure of AGL charts' complexity, with the aim of examining associations between grammar systems' TE and learners' AGL task performance. We surveyed the literature and identified 56 previous AGL experiments based on 10 different grammars that met the sampling criteria. Using the automated matrix-lift-action method, we assigned a TE value for each of these 10 previously used AGL systems and examined its correlation with learners' task performance. The meta-regression analysis showed a significant correlation, demonstrating that the complexity effect transcended the different settings and conditions in which the categorization task was performed. The results reinforced the importance of using this new automated tool to uniformly measure grammar systems' complexity when experimenting with and evaluating the findings of AGL studies.

  1. Does complexity matter? Meta-analysis of learner performance in artificial grammar tasks.

    Science.gov (United States)

    Schiff, Rachel; Katan, Pesia

    2014-01-01

    Complexity has been shown to affect performance on artificial grammar learning (AGL) tasks (categorization of test items as grammatical/ungrammatical according to the implicitly trained grammar rules). However, previously published AGL experiments did not utilize consistent measures to investigate the comprehensive effect of grammar complexity on task performance. The present study focused on computerizing Bollt and Jones's (2000) technique of calculating topological entropy (TE), a quantitative measure of AGL charts' complexity, with the aim of examining associations between grammar systems' TE and learners' AGL task performance. We surveyed the literature and identified 56 previous AGL experiments based on 10 different grammars that met the sampling criteria. Using the automated matrix-lift-action method, we assigned a TE value for each of these 10 previously used AGL systems and examined its correlation with learners' task performance. The meta-regression analysis showed a significant correlation, demonstrating that the complexity effect transcended the different settings and conditions in which the categorization task was performed. The results reinforced the importance of using this new automated tool to uniformly measure grammar systems' complexity when experimenting with and evaluating the findings of AGL studies.
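
    For a finite-state grammar chart, topological entropy can be computed as the natural log of the spectral radius of the state-transition adjacency matrix, which is the core of the Bollt and Jones style calculation referenced above. The toy chart below is invented, not one of the 10 grammars studied, and the record's "matrix-lift-action" preprocessing is not reproduced here.

```python
# TE of a finite-state chart: log of the Perron (largest-magnitude)
# eigenvalue of the transition adjacency matrix.
import numpy as np

def topological_entropy(adjacency):
    """Natural log of the spectral radius of a grammar's transition matrix."""
    eigvals = np.linalg.eigvals(np.asarray(adjacency, float))
    return float(np.log(max(abs(eigvals))))

# Toy 3-state AGL chart: counts of allowed transitions between states
A = [[0, 1, 1],
     [1, 0, 1],
     [1, 1, 0]]
print(topological_entropy(A))  # ~log 2 = 0.693 for this symmetric chart
```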

  2. Motion analysis of knee joint using dynamic volume images

    Science.gov (United States)

    Haneishi, Hideaki; Kohno, Takahiro; Suzuki, Masahiko; Moriya, Hideshige; Mori, Sin-ichiro; Endo, Masahiro

    2006-03-01

    Acquisition and analysis of the three-dimensional movement of the knee joint is desired in orthopedic surgery. We have developed two methods to obtain dynamic volume images of the knee joint. One is a 2D/3D registration method combining bi-plane dynamic X-ray fluoroscopy and a static three-dimensional CT; the other is a method using so-called 4D-CT that uses a cone beam and a wide 2D detector. In this paper, we present two analyses of knee joint movement obtained by these methods: (1) transition of the nearest points between femur and tibia, and (2) principal component analysis (PCA) of six parameters representing the three-dimensional movement of the knee. As preprocessing for the analysis, the femur and tibia regions are first extracted from the volume data at each time frame, and then registration of the tibia between different frames is performed by an affine transformation consisting of rotation and translation. The same transformation is applied to the femur as well. Using those image data, the movement of the femur relative to the tibia can be analyzed. Six movement parameters of the femur, consisting of three translation parameters and three rotation parameters, are obtained from those images. In analysis (1), the axis of each bone is first found and then the flexion angle of the knee joint is calculated. For each flexion angle, the minimum distance between femur and tibia and the location giving the minimum distance are found in both the lateral condyle and the medial condyle. As a result, it was observed that the movement of the lateral condyle is larger than that of the medial condyle. In analysis (2), it was found that the movement of the knee can be represented by the first three principal components with a precision of 99.58%, and those three components seem to strongly relate to three major movements of the femur in the knee bend known in orthopedic surgery.
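
    The PCA step in analysis (2) is a standard decomposition of the six pose parameters over time. The sketch below uses synthetic stand-ins for the registered 4D-CT output; frame count and data are invented.

```python
# PCA of six pose parameters (3 rotations, 3 translations) of the femur
# relative to the tibia, one row per time frame. Data are synthetic.
import numpy as np

rng = np.random.default_rng(3)
frames = rng.standard_normal((50, 6))        # 50 frames x 6 pose parameters
centered = frames - frames.mean(axis=0)
# Principal components via SVD of the centered data matrix
U, s, Vt = np.linalg.svd(centered, full_matrices=False)
explained = s**2 / (s**2).sum()
print("variance explained by first 3 PCs:", explained[:3].sum())
```

    On the study's data this cumulative ratio for the first three components is the reported 99.58%.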

  3. The Use Of Computational Human Performance Modeling As Task Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Jacques Hugo; David Gertman

    2012-07-01

    During a review of the Advanced Test Reactor safety basis at the Idaho National Laboratory, human factors engineers identified ergonomic and human reliability risks involving the inadvertent exposure of a fuel element to the air during manual fuel movement and inspection in the canal. There were clear indications that these risks increased the probability of human error and possible severe physical outcomes to the operator. In response to this concern, a detailed study was conducted to determine the probability of the inadvertent exposure of a fuel element. Due to practical and safety constraints, the task network analysis technique was employed to study the work procedures at the canal. Discrete-event simulation software was used to model the entire procedure as well as the salient physical attributes of the task environment, such as distances walked, the effect of dropped tools, the effect of hazardous body postures, and physical exertion due to strenuous tool handling. The model also allowed analysis of the effect of cognitive processes such as visual perception demands, auditory information and verbal communication. The model made it possible to obtain reliable predictions of operator performance and workload estimates. It was also found that operator workload as well as the probability of human error in the fuel inspection and transfer task were influenced by the concurrent nature of certain phases of the task and the associated demand on cognitive and physical resources. More importantly, it was possible to determine with reasonable accuracy the stages as well as physical locations in the fuel handling task where operators would be most at risk of losing their balance and falling into the canal. The model also provided sufficient information for a human reliability analysis that indicated that the postulated fuel exposure accident was less than credible.
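    The study used commercial discrete-event simulation software; as a much-simplified illustration of the underlying idea — sampling task durations and error probabilities over a task network — here is a toy Monte Carlo sketch (all task names and numbers hypothetical):

        import random

        # Hypothetical sequential task network:
        # (task, mean duration in s, std dev, probability of error)
        TASKS = [("walk to canal", 30.0, 5.0, 0.001),
                 ("position tool", 20.0, 4.0, 0.010),
                 ("grip fuel element", 15.0, 3.0, 0.020),
                 ("transfer element", 60.0, 10.0, 0.005)]

        def run_once():
            total, errors = 0.0, 0
            for _name, mu, sd, p_err in TASKS:
                total += max(0.0, random.gauss(mu, sd))  # sampled duration
                errors += random.random() < p_err        # sampled error event
            return total, errors

        N = 100_000
        runs = [run_once() for _ in range(N)]
        print("mean time:", sum(t for t, _ in runs) / N, "s")
        print("P(any error):", sum(e > 0 for _, e in runs) / N)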

  4. Multi-task linear programming discriminant analysis for the identification of progressive MCI individuals.

    Directory of Open Access Journals (Sweden)

    Guan Yu

    Full Text Available Accurately identifying mild cognitive impairment (MCI) individuals who will progress to Alzheimer's disease (AD) is very important for making early interventions. Many classification methods focus on integrating multiple imaging modalities such as magnetic resonance imaging (MRI) and fluorodeoxyglucose positron emission tomography (FDG-PET). However, the main challenge for MCI classification using multiple imaging modalities is the existence of a lot of missing data in many subjects. For example, in the Alzheimer's Disease Neuroimaging Initiative (ADNI) study, almost half of the subjects do not have PET images. In this paper, we propose a new and flexible binary classification method, namely Multi-task Linear Programming Discriminant (MLPD) analysis, for the incomplete multi-source feature learning. Specifically, we decompose the classification problem into different classification tasks, i.e., one for each combination of available data sources. To solve all different classification tasks jointly, our proposed MLPD method links them together by constraining them to achieve the similar estimated mean difference between the two classes (under classification) for those shared features. Compared with the state-of-the-art incomplete Multi-Source Feature (iMSF) learning method, instead of constraining different classification tasks to choose a common feature subset for those shared features, MLPD can flexibly and adaptively choose different feature subsets for different classification tasks. Furthermore, our proposed MLPD method can be efficiently implemented by linear programming. To validate our MLPD method, we perform experiments on the ADNI baseline dataset with the incomplete MRI and PET images from 167 progressive MCI (pMCI) subjects and 226 stable MCI (sMCI) subjects. We further compared our method with the iMSF method (using incomplete MRI and PET images) and also the single-task classification method (using only MRI or only subjects with both MRI and PET images).

  5. Multi-task linear programming discriminant analysis for the identification of progressive MCI individuals.

    Science.gov (United States)

    Yu, Guan; Liu, Yufeng; Thung, Kim-Han; Shen, Dinggang

    2014-01-01

    Accurately identifying mild cognitive impairment (MCI) individuals who will progress to Alzheimer's disease (AD) is very important for making early interventions. Many classification methods focus on integrating multiple imaging modalities such as magnetic resonance imaging (MRI) and fluorodeoxyglucose positron emission tomography (FDG-PET). However, the main challenge for MCI classification using multiple imaging modalities is the existence of a lot of missing data in many subjects. For example, in the Alzheimer's Disease Neuroimaging Initiative (ADNI) study, almost half of the subjects do not have PET images. In this paper, we propose a new and flexible binary classification method, namely Multi-task Linear Programming Discriminant (MLPD) analysis, for the incomplete multi-source feature learning. Specifically, we decompose the classification problem into different classification tasks, i.e., one for each combination of available data sources. To solve all different classification tasks jointly, our proposed MLPD method links them together by constraining them to achieve the similar estimated mean difference between the two classes (under classification) for those shared features. Compared with the state-of-the-art incomplete Multi-Source Feature (iMSF) learning method, instead of constraining different classification tasks to choose a common feature subset for those shared features, MLPD can flexibly and adaptively choose different feature subsets for different classification tasks. Furthermore, our proposed MLPD method can be efficiently implemented by linear programming. To validate our MLPD method, we perform experiments on the ADNI baseline dataset with the incomplete MRI and PET images from 167 progressive MCI (pMCI) subjects and 226 stable MCI (sMCI) subjects. We further compared our method with the iMSF method (using incomplete MRI and PET images) and also the single-task classification method (using only MRI or only subjects with both MRI and PET images).
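    MLPD couples one linear program per data-source combination through shared mean-difference constraints. As a hedged illustration of the linear-programming core only, here is a single-task 1-norm discriminant fit with scipy.optimize.linprog on synthetic data; the multi-task coupling rows described above would enter as additional constraints:

        import numpy as np
        from scipy.optimize import linprog

        rng = np.random.default_rng(0)
        n, d = 40, 5
        X = np.vstack([rng.normal(0.8, 1.0, (n // 2, d)),
                       rng.normal(-0.8, 1.0, (n // 2, d))])
        y = np.r_[np.ones(n // 2), -np.ones(n // 2)]

        # Variables z = [w+ (d), w- (d), b+, b-, slack (n)], all >= 0,
        # with w = w+ - w- and b = b+ - b-.
        C = 0.1
        c = np.r_[C * np.ones(2 * d), 0.0, 0.0, np.ones(n)]
        # Margin constraints y_i (w.x_i + b) >= 1 - s_i, as A_ub @ z <= -1.
        Yx = y[:, None] * X
        A_ub = np.hstack([-Yx, Yx, -y[:, None], y[:, None], -np.eye(n)])
        res = linprog(c, A_ub=A_ub, b_ub=-np.ones(n),
                      bounds=[(0, None)] * (2 * d + 2 + n), method="highs")
        w = res.x[:d] - res.x[d:2 * d]
        b = res.x[2 * d] - res.x[2 * d + 1]
        print("training accuracy:", np.mean(np.sign(X @ w + b) == y))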

  6. Concurrent multidisciplinary mechanical design based on design task analysis and knowledge sharing; Sekkei task bunseki to joho kyoyu ni yoru mechatronics kyocho sekkei

    Energy Technology Data Exchange (ETDEWEB)

    Kondo, K.; Ozawa, M.; Mori, T. [Toshiba Corp., Tokyo (Japan)

    1999-09-01

    We have developed a systematic design task planning method based on a design structure matrix (DSM), and a lumped-model-based framework for knowledge sharing in a concurrent design environment, as key techniques for developing higher-quality products in a shorter design time. The DSM facilitates systematic analysis of dependencies among design tasks and optimization of the design process. The framework, based on a lumped-model description of mechanical systems, enables concurrent and cooperative work among multidisciplinary designers at an early stage of the design process. In this paper, we also discuss the relationships between these techniques and the product development flow from product definition to detailed design. (author)
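    A DSM records which design tasks feed which others; when the dependencies are acyclic, sequencing the tasks reduces to a topological sort (coupled tasks form cycles and need clustering or iteration instead). A minimal sketch with a hypothetical mechatronics task set:

        from graphlib import TopologicalSorter  # Python 3.9+

        # Hypothetical DSM in adjacency form: task -> tasks it depends on.
        dsm = {
            "concept":     set(),
            "mechanics":   {"concept"},
            "electronics": {"concept"},
            "control_sw":  {"mechanics", "electronics"},
            "integration": {"control_sw"},
        }

        # static_order() raises graphlib.CycleError for coupled (cyclic) tasks.
        print(" -> ".join(TopologicalSorter(dsm).static_order()))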

  7. Application of probabilistic and decision analysis methods to structural mechanics and materials sciences problems. Volume 2. Resource document

    Energy Technology Data Exchange (ETDEWEB)

    Garrick, B.J.; Tagart, S.W. Jr. (eds.)

    1984-08-01

    This volume presents background resource material on the field of structural reliability assessment and its relationship to the discipline of probabilistic risk analysis and decision analysis. First, general background material is presented on the field of structural reliability assessment. Next, some sample applications of probabilistic and decision analysis methods are presented. A hypothetical example illustrates how a probabilistic approach could be used in structural design, and a brief description is given of how the results of structural reliability analyses can be used as input to a PRA. A case study is described on the use of decision analysis to select strategies for dealing with intergranular stress corrosion cracking. The use of decision analysis to evaluate the merits of different possible research tasks is also discussed. A discussion of decision analysis is then presented. Finally, the document presents a discussion of open issues in the area of structural reliability.

  8. Task parallel sensitivity analysis and parameter estimation of groundwater simulations through the SALSSA framework

    Energy Technology Data Exchange (ETDEWEB)

    Schuchardt, Karen L.; Agarwal, Khushbu; Chase, Jared M.; Rockhold, Mark L.; Freedman, Vicky L.; Elsethagen, Todd O.; Scheibe, Timothy D.; Chin, George; Sivaramakrishnan, Chandrika

    2010-07-15

    The Support Architecture for Large-Scale Subsurface Analysis (SALSSA) provides an extensible framework, sophisticated graphical user interface, and underlying data management system that simplifies the process of running subsurface models, tracking provenance information, and analyzing the model results. Initially, SALSSA supported two styles of job control: user directed execution and monitoring of individual jobs, and load balancing of jobs across multiple machines taking advantage of many available workstations. Recent efforts in subsurface modelling have been directed at advancing simulators to take advantage of leadership class supercomputers. We describe two approaches, current progress, and plans toward enabling efficient application of the subsurface simulator codes via the SALSSA framework: automating sensitivity analysis problems through task parallelism, and task parallel parameter estimation using the PEST framework.
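    Task-parallel sensitivity analysis of this kind boils down to launching many independent simulator runs across a parameter set. A generic sketch with Python's standard library (the run_simulation function is a stand-in, not SALSSA's or PEST's actual interface):

        from concurrent.futures import ProcessPoolExecutor
        from itertools import product

        def run_simulation(params):
            # Stand-in for configuring and launching one subsurface
            # simulator run, then extracting an output of interest.
            permeability, porosity = params
            return permeability * porosity  # placeholder metric

        if __name__ == "__main__":
            grid = list(product([1e-12, 1e-11, 1e-10], [0.1, 0.2, 0.3]))
            with ProcessPoolExecutor() as pool:  # one task per run
                for p, out in zip(grid, pool.map(run_simulation, grid)):
                    print(p, "->", out)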

  9. PC-PVT: A Platform for Psychomotor Vigilance Task Testing, Analysis, and Prediction

    Science.gov (United States)

    2014-01-01

    PC-PVT: A platform for psychomotor vigilance task testing, analysis, and prediction. Khitrov, Maxim Y.; Laxminarayan, Srinivas; Thorsley, David; ... Citing the relatively low hardware cost, user familiarity, and the relative ease of software development for specific neurobehavioral testing protocols, the authors developed and characterized a freely available system for PC-based simple visual reaction time testing that is analogous to the widely used psychomotor vigilance task.

  10. Three-dimensional volume analysis of vasculature in engineered tissues

    Science.gov (United States)

    YousefHussien, Mohammed; Garvin, Kelley; Dalecki, Diane; Saber, Eli; Helguera, María.

    2013-01-01

    Three-dimensional textural and volumetric image analysis holds great potential for understanding the image data produced by multi-photon microscopy. In this paper, an algorithm that quantitatively analyzes the texture and morphology of vasculature in engineered tissues is proposed. The investigated 3D artificial tissues consist of Human Umbilical Vein Endothelial Cells (HUVEC) embedded in collagen and exposed to two regimes of ultrasound standing wave fields under different pressure conditions. Textural features were evaluated using the normalized Gray-Level Co-occurrence Matrix (GLCM) combined with Gray-Level Run Length Matrix (GLRLM) analysis. To minimize error resulting from any possible volume rotation and to provide a comprehensive textural analysis, an averaged version of nine GLCM and GLRLM orientations is used. To evaluate volumetric features, an automatic threshold using the gray-level mean value is utilized. Results show that our analysis is able to differentiate among the exposed samples, owing to morphological changes induced by the standing wave fields. Furthermore, we demonstrate that providing more textural parameters than are currently reported in the literature enhances the quantitative understanding of the heterogeneity of artificial tissues.
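    scikit-image provides the 2-D building blocks for the GLCM part of such an analysis (GLRLM needs a third-party package, and the paper averages nine 3-D orientations rather than the four 2-D ones shown here). A minimal sketch on a synthetic slice, assuming scikit-image >= 0.19 naming:

        import numpy as np
        from skimage.feature import graycomatrix, graycoprops

        rng = np.random.default_rng(0)
        slice2d = rng.integers(0, 8, size=(64, 64), dtype=np.uint8)  # stand-in

        # Normalized, symmetric GLCM at distance 1 for four orientations.
        glcm = graycomatrix(slice2d, distances=[1],
                            angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                            levels=8, symmetric=True, normed=True)
        for prop in ("contrast", "homogeneity", "energy", "correlation"):
            # graycoprops returns one value per (distance, angle) pair;
            # average over orientations to reduce rotation sensitivity.
            print(prop, graycoprops(glcm, prop).mean())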

  11. Energy Consumption Analysis Procedure for Robotic Applications in different task motion

    Science.gov (United States)

    Ahmed, Iman; Aris, Ishak b.; Hamiruce Marhaban, Mohammad; Juraiza Ishak, Asnor

    2015-11-01

    This work proposes an energy analysis method for a humanoid robot, covering simple to complex motion tasks along the energy chain. The research developed a procedure suitable for the analysis, saving, and modelling of energy consumption, applicable not only to this type of robot but also to most robots that use electrical power as an energy source. The method has been validated by accurate integration, in Matlab, of the power consumption curve to calculate the energy of individual and multiple servo motors. This study can therefore be considered a procedure for energy analysis that uses laboratory instruments to measure the energy parameters. We performed various task motions at different angular speeds to find the speed limits with respect to robot stability and control strategy. Battery capacities were investigated for several battery types to extract a power modelling equation and an energy density parameter for each type, and Matlab software was built to implement the algorithm and to evaluate experimentally the amount of energy, represented by the area under the power curves. This provides a robust estimate of the energy required by different task motions, to be considered in energy saving (i.e., motion planning and real-time scheduling).
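    The energy figure described above is the area under the measured power curve, i.e. a numerical time integral. A minimal sketch (the power trace is synthetic; a real trace would come from current/voltage measurements):

        import numpy as np

        # Stand-in for a sampled servo power trace: time (s), power (W).
        t = np.arange(0.0, 2.0, 0.01)
        power = 5.0 + 3.0 * np.abs(np.sin(2 * np.pi * t))

        # Trapezoidal area under the power curve (np.trapezoid in NumPy 2.x).
        energy_j = np.trapz(power, t)
        print(f"energy = {energy_j:.2f} J = {energy_j / 3600:.6f} Wh")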

  12. Hawaii Energy Strategy Project 2: Fossil Energy Review. Task IV. Scenario development and analysis

    Energy Technology Data Exchange (ETDEWEB)

    Yamaguchi, N.D.; Breazeale, K. [ed.]

    1993-12-01

    The Hawaii Energy Strategy (HES) Program is a seven-project effort led by the State of Hawaii Department of Business, Economic Development & Tourism (DBEDT) to investigate a wide spectrum of Hawaii energy issues. The East-West Center's Program on Resources: Energy and Minerals, has been assigned HES Project 2, Fossil Energy Review, which focuses on fossil energy use in Hawaii and the greater regional and global markets. HES Project 2 has four parts: Task I (World and Regional Fossil Energy Dynamics) covers petroleum, natural gas, and coal in global and regional contexts, along with a discussion of energy and the environment. Task II (Fossil Energy in Hawaii) focuses more closely on fossil energy use in Hawaii: current utilization and trends, the structure of imports, possible future sources of supply, fuel substitutability, and energy security. Task III's emphasis is Greenfield Options; that is, fossil energy sources not yet used in Hawaii. This task is divided into two sections: first, an in-depth "Assessment of Coal Technology Options and Implications for the State of Hawaii," along with a spreadsheet analysis model, which was subcontracted to the Environmental Assessment and Information Sciences Division of Argonne National Laboratory; and second, a chapter on liquefied natural gas (LNG) in the Asia-Pacific market and the issues surrounding possible introduction of LNG into the Hawaii market.

  13. Analysis on Refinery System as a Complex Task-resource Network

    Institute of Scientific and Technical Information of China (English)

    LIU Suyu; RONG Gang

    2013-01-01

    A refinery system, a typical example of a process system, is presented as a complex network in this paper. The topology of this system is described by a task-resource network and modeled as a directed and weighted graph, in which nodes represent various tasks and edges denote the resources exchanged among tasks. Using the properties of the node degree distribution, strength distribution and other weighted quantities, we demonstrate the heterogeneity of the network and point out the relation between the structural characters of vertices and the functionality of the corresponding tasks. The above phenomena indicate that the design requirements and principles of the production process contribute to the heterogeneous features of the network. Besides, betweenness centrality of nodes can be used as an importance indicator to provide additional information for decision making. The correlations between structure and weighted properties are investigated to further address the influence of production schemes on system connectivity patterns. A cascading failures model is employed to analyze the robustness of the network under targeted attack. Two capacity assignment strategies are compared in order to improve the robustness of the network at a certain cost. The refinery system displays more reliable behavior when the protecting strategy considers heterogeneous properties. This phenomenon further implies the structure-activity relationship of the refinery system and provides insightful suggestions for process system design. The results also indicate that robustness analysis is a promising application of methodologies from complex networks to process system engineering.
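    The degree, strength and betweenness quantities used above are routine to compute once the task-resource graph is built; a minimal networkx sketch on a hypothetical five-task fragment (names and flows invented):

        import networkx as nx

        # Hypothetical task-resource network: nodes are tasks, edge weights
        # are resource flows exchanged between them.
        G = nx.DiGraph()
        G.add_weighted_edges_from([
            ("crude_unit", "reformer", 40.0),
            ("crude_unit", "cracker", 60.0),
            ("reformer", "blender", 25.0),
            ("cracker", "blender", 35.0),
            ("cracker", "treater", 20.0),
        ])

        strength = dict(G.degree(weight="weight"))  # weighted node strength
        # Unweighted betweenness as the importance indicator; with weights,
        # networkx would treat them as distances rather than flows.
        bc = nx.betweenness_centrality(G)
        for v in G:
            print(f"{v}: degree={G.degree(v)} strength={strength[v]:.0f} "
                  f"betweenness={bc[v]:.3f}")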

  14. Analysis of Partial Volume Effects on Accurate Measurement of the Hippocampus Volume

    Institute of Scientific and Technical Information of China (English)

    Maryam Hajiesmaeili; Jamshid Dehmeshki; Tim Ellis

    2014-01-01

    Hippocampal volume loss is an important biomarker in distinguishing subjects with Alzheimer's disease (AD), and its measurement in magnetic resonance images (MRI) is influenced by partial volume effects (PVE). This paper describes a post-processing approach to quantify PVE for correction of the hippocampal volume by using a spatial fuzzy C-means (SFCM) method. The algorithm is evaluated on a dataset of 20 T1-weighted MRI scans sampled at two different resolutions. The corrected volumes for the left and right hippocampus (HC), which are 23% and 18% for the low-resolution dataset and 6% and 5% for the high-resolution dataset, respectively, are lower than the hippocampal volumes obtained from manual segmentation. The results show the importance of applying this technique in AD detection with low-resolution datasets.

  15. Parametric analysis of architectural volumes through genetic algorithms

    Directory of Open Access Journals (Sweden)

    Pedro Salcedo Lagos

    2015-03-01

    Full Text Available In recent years, architectural design has developed partly thanks to new digital design techniques, which allow the generation of geometries based on the definition of initial parameters and the programming of formal relationships between them. Design processes based on these technologies make it possible to create shapes with the capacity to modify and adapt to multiple constraints or specific evaluation criteria, which raises the problem of identifying the best architectural solution. Several projects have employed genetic algorithms to face this problem. This paper demonstrates the possibility of implementing a parametric analysis of architectural volumes with genetic algorithms, in order to combine functional, environmental and structural requirements with an effective search method for selecting a variety of proper solutions through digital technologies.

  16. Coal gasification systems engineering and analysis. Volume 1: Executive summary

    Science.gov (United States)

    1980-01-01

    Feasibility analyses and systems engineering studies for a 20,000 tons per day medium Btu (MBG) coal gasification plant to be built by TVA in Northern Alabama were conducted. Major objectives were as follows: (1) provide design and cost data to support the selection of a gasifier technology and other major plant design parameters, (2) provide design and cost data to support alternate product evaluation, (3) prepare a technology development plan to address areas of high technical risk, and (4) develop schedules, PERT charts, and a work breakdown structure to aid in preliminary project planning. Volume one contains a summary of gasification system characterizations. Five gasification technologies were selected for evaluation: Koppers-Totzek, Texaco, Lurgi Dry Ash, Slagging Lurgi, and Babcock and Wilcox. A summary of the trade studies and cost sensitivity analysis is included.

  17. Thermal-Hydraulic Analysis Tasks for ANAV NPPs in Support of Plant Operation and Control

    OpenAIRE

    2007-01-01

    Thermal-hydraulic analysis tasks aimed at supporting plant operation and control of nuclear power plants are an important issue for the Asociación Nuclear Ascó-Vandellòs (ANAV). ANAV is the consortium that runs the Ascó power plants (2 units) and the Vandellòs-II power plant. The reactors are Westinghouse-design, 3-loop PWRs with an approximate electrical power of 1000 MW. The Technical University of Catalonia (UPC) thermal-hydraulic analysis team has jointly worked together with ANAV engineers at different levels in the analysis and improvement of these reactors.

  18. Aviation and programmatic analyses; Volume 1, Task 1: Aviation data base development and application. [for NASA OAST programs

    Science.gov (United States)

    1977-01-01

    A method was developed for using the NASA aviation data base and computer programs in conjunction with the GE management analysis and projection service to perform simple and complex economic analysis for planning, forecasting, and evaluating OAST programs. Capabilities of the system are discussed along with procedures for making basic data tabulations, updates and entries. The system is applied in an agricultural aviation study in order to assess its value for actual utility in the OAST working environment.

  19. Fuzzy logic approach to SWOT analysis for economics tasks and example of its computer realization

    Directory of Open Access Journals (Sweden)

    Vladimir CHERNOV

    2016-07-01

    Full Text Available The article discusses the widely used classical method of analysis, forecasting and decision-making for various economic problems: SWOT analysis. As is well known, it is a qualitative, multicriteria comparison of the degrees of Strength, Weakness, Opportunity and Threat, used for assessing different kinds of risks, forecasting market development, and evaluating the status and development prospects of enterprises, regions, economic sectors, territories, etc. It can also be successfully applied to the evaluation and analysis of various project management tasks: investment, innovation, marketing, development, design, bringing products to market, and so on. In practical competitive market and economic conditions, however, there are various uncertainties, ambiguities and vagueness, which make the use of SWOT analysis in the classical sense insufficiently justified and ineffective. For this case, the authors propose to use a fuzzy logic approach and the theory of fuzzy sets for a more adequate representation and post-processing of assessments in the SWOT analysis. In particular, the mathematical formulation of the corresponding task and the main approaches to its solution are briefly presented. Examples are also given of suitable computer calculations in the specialized software Fuzicalc for processing and operating on fuzzy input data. Finally, considerations for interpreting the results are presented.
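    One standard way to carry such fuzzy ratings through a SWOT comparison is to encode each expert judgment as a triangular fuzzy number and aggregate component-wise; the sketch below shows that arithmetic only and is not tied to Fuzicalc's internals (all ratings hypothetical):

        # Triangular fuzzy number (TFN) represented as (low, mode, high).
        def tfn_weighted_mean(tfns, weights):
            s = sum(weights)
            return tuple(sum(w * t[i] for t, w in zip(tfns, weights)) / s
                         for i in range(3))

        def centroid(tfn):
            # Centroid defuzzification of a triangular membership function.
            return sum(tfn) / 3.0

        # Three experts rate one "Strength" factor on a 0-10 scale.
        ratings = [(6, 7, 9), (5, 6, 8), (7, 8, 10)]
        weights = [0.5, 0.3, 0.2]
        agg = tfn_weighted_mean(ratings, weights)
        print("aggregate TFN:", agg, "crisp score:", round(centroid(agg), 2))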

  20. Thermal-Hydraulic Analysis Tasks for ANAV NPPs in Support of Plant Operation and Control

    Directory of Open Access Journals (Sweden)

    F. Reventós

    2008-01-01

    Full Text Available Thermal-hydraulic analysis tasks aimed at supporting plant operation and control of nuclear power plants are an important issue for the Asociación Nuclear Ascó-Vandellòs (ANAV. ANAV is the consortium that runs the Ascó power plants (2 units and the Vandellòs-II power plant. The reactors are Westinghouse-design, 3-loop PWRs with an approximate electrical power of 1000 MW. The Technical University of Catalonia (UPC thermal-hydraulic analysis team has jointly worked together with ANAV engineers at different levels in the analysis and improvement of these reactors. This article is an illustration of the usefulness of computational analysis for operational support. The contents presented were operational between 1985 and 2001 and subsequently changed slightly following various organizational adjustments. The paper has two different parts. In the first part, it describes the specific aspects of thermal-hydraulic analysis tasks related to operation and control and, in the second part, it briefly presents the results of three examples of analyses that were performed. All the presented examples are related to actual situations in which the scenarios were studied by analysts using thermal-hydraulic codes and prepared nodalizations. The paper also includes a qualitative evaluation of the benefits obtained by ANAV through thermal-hydraulic analyses aimed at supporting operation and plant control.

  1. Implementation of Hierarchical Task Analysis for User Interface Design in Drawing Application for Early Childhood Education

    Directory of Open Access Journals (Sweden)

    Mira Kania Sabariah

    2016-05-01

    Full Text Available Draw learning in early childhood is an important lesson and full of stimulation of the process of growth and development of children which could help to train the fine motor skills. We have had a lot of applications that can be used to perform learning, including interactive learning applications. Referring to the observations that have been conducted showed that the experiences given by the applications that exist today are very diverse and have not been able to represent the model of learning and characteristics of early childhood (4-6 years. Based on the results, Hierarchical Task Analysis method generated a list of tasks that must be done in designing an user interface that represents the user experience in draw learning. Then by using the Heuristic Evaluation method the usability of the model has fulfilled a very good level of understanding and also it can be enhanced and produce a better model.

  2. Control Volume Analysis, Entropy Balance and the Entropy Production in Flow Systems

    OpenAIRE

    Niven, Robert K.; Noack, Bernd R.

    2014-01-01

    This chapter concerns "control volume analysis", the standard engineering tool for the analysis of flow systems, and its application to entropy balance calculations. Firstly, the principles of control volume analysis are enunciated and applied to flows of conserved quantities (e.g. mass, momentum, energy) through a control volume, giving integral (Reynolds transport theorem) and differential forms of the conservation equations. Several definitions of steady state are discussed. The concept of...
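    For reference, the integral balance at the heart of control volume analysis is the Reynolds transport theorem; in standard notation (a generic statement, not a quotation from the chapter), for an extensive property B with intensive counterpart b = dB/dm:

        \frac{dB_{sys}}{dt} \;=\; \frac{d}{dt}\int_{CV} \rho\, b \, dV \;+\; \oint_{CS} \rho\, b \,(\mathbf{v}_r \cdot \mathbf{n})\, dA

    where v_r is the fluid velocity relative to the control surface and n the outward unit normal. Setting b = 1 recovers mass conservation; b = v gives the momentum balance.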

  3. Genre analysis and task-based course design for isiXhosa second language teaching in local government contexts

    Directory of Open Access Journals (Sweden)

    Edith Venter

    2011-08-01

    Full Text Available The successful implementation of a multilingual language policy in the public and private sectors in South Africa depends on vibrant research. This article explores the design and nature of isiXhosa communication tasks for specific-purposes second language teaching in local government contexts, within a framework of genre-based and task-based approaches to language teaching. These two approaches also form the theoretical basis for the analysis of the rhetorical move structure and the task types of selected communication tasks.

  4. One Size Does Not Fit All: Human Failure Event Decomposition and Task Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ronald Laurids Boring, PhD

    2014-09-01

    In the probabilistic safety assessments (PSAs) used in the nuclear industry, human failure events (HFEs) are determined as a subset of hardware failures, namely those hardware failures that could be triggered or exacerbated by human action or inaction. This approach is top-down, starting with hardware faults and deducing human contributions to those faults. Elsewhere, more traditionally human factors driven approaches would tend to look at opportunities for human errors first in a task analysis and then identify which of those errors is risk significant. The intersection of top-down and bottom-up approaches to defining HFEs has not been carefully studied. Ideally, both approaches should arrive at the same set of HFEs. This question remains central as human reliability analysis (HRA) methods are generalized to new domains like oil and gas. The HFEs used in nuclear PSAs tend to be top-down—defined as a subset of the PSA—whereas the HFEs used in petroleum quantitative risk assessments (QRAs) are more likely to be bottom-up—derived from a task analysis conducted by human factors experts. The marriage of these approaches is necessary in order to ensure that HRA methods developed for top-down HFEs are also sufficient for bottom-up applications. In this paper, I first review top-down and bottom-up approaches for defining HFEs and then present a seven-step guideline to ensure a task analysis completed as part of human error identification decomposes to a level suitable for use as HFEs. This guideline illustrates an effective way to bridge the bottom-up approach with top-down requirements.

  5. Force Management Methods Task II. Volume I. Summary and Analysis Considerations

    Science.gov (United States)

    1980-11-01

    [Scanned table-of-contents fragment, garbled by OCR; recoverable headings: Introduction; Force Management Overview; Force Management Definition; Force Management Elements. The Force Management Definition section references MIL-STD-1530A.]

  6. Blade loss transient dynamics analysis, volume 2. Task 2: TETRA 2 user's manual

    Science.gov (United States)

    Black, Gerald; Gallardo, Vincente C.

    1986-01-01

    This is the user's manual for the TETRA 2 computer code, a program developed in the NASA-Lewis Blade Loss Program. TETRA 2 calculates a turbine engine's dynamic structural response to applied stimuli. The calculation options are: (1) transient response, and (2) steady-state forced response. Based on the method of modal synthesis, the program allows the use of linear as well as nonlinear connecting elements. Both the transient and steady-state options can include a flexible bladed-disk module and nonlinear connecting elements (including deadband and hardening/softening springs). The transient option has the additional capability to calculate response with a squeeze-film bearing module. TETRA 2 output is summarized in a plotfile which permits post-processing, such as FFT or graphical animation, with the proper software and computer equipment.

  7. Analysis of Mexico wind tunnel measurements. Final report of IEA Task 29, Mexnext (Phase 1)

    Energy Technology Data Exchange (ETDEWEB)

    Schepers, J.G.; Boorsma, K. [Energy research Center of the Netherlands ECN, Petten (Netherlands); Cho, T. [Korea Aerospace Research Institute KARI, Daejeon (Korea, Republic of); Gomez-Iradi, S. [National Renewable Energy Center of Spain CENER, Sarriguren (Spain); Schaffarczyk, P.; Jeromin, A. [University of Applied Sciences, CEWind EG, Kiel (Germany); Shen, W.Z. [The Technical University of Denmark, Kongens Lyngby (Denmark); Lutz, T.; Meister, K. [University of Stuttgart, Stuttgart (Germany); Stoevesandt, B. [ForWind, Zentrum fuer Windenergieforschung, Oldenburg (Germany); Schreck, S. [National Renewable Energy Laboratory NREL, Golden, CO (United States); Micallef, D.; Pereira, R.; Sant, T. [Delft University of Technology TUD, Delft (Netherlands); Madsen, H.A.; Soerensen, N. [Risoe-DTU, Roskilde (Denmark)

    2012-02-15

    This report describes the work performed within the first phase of IEA Task 29 Mexnext. In this IEA Task 29 a total of 20 organisations from 11 different countries collaborated in analysing the measurements which have been performed in the EU project 'Mexico'. Within this Mexico project 9 European institutes carried out a wind tunnel experiment in the Large Low Speed Facility (LLF) of the German Dutch Wind Facilities DNW on a rotor with a diameter of 4.5 m. Pressure distributions were measured at five locations along the blade along with detailed flow field measurements around the rotor plane using stereo PIV. As a result of the international collaboration within this task a very thorough analysis of the data could be carried out and a large number of codes were validated not only in terms of loads but also in terms of underlying flow field. The detailed pressure measurements along the blade in combination with the detailed flow field measurements gave a unique opportunity to better understand the response of a wind turbine to the incoming flow field. Deficiencies in modelling have been established and directions for model improvement can be given.

  8. Using cognitive task analysis to create a teaching protocol for bovine dystocia.

    Science.gov (United States)

    Read, Emma K; Baillie, Sarah

    2013-01-01

    When learning skilled techniques and procedures, students face many challenges. Learning is easier when detailed instructions are available, but experts often find it difficult to articulate all of the steps involved in a task or relate to the learner as a novice. This problem is further compounded when the technique is internal and unsighted (e.g., obstetrical procedures). Using expert bovine practitioners and a life-size model cow and calf, the steps and decision making involved in performing correction of two different dystocia presentations (anterior leg back and breech) were deconstructed using cognitive task analysis (CTA). Video cameras were positioned to capture movement inside and outside the cow model while the experts were asked to first perform the technique as they would in a real situation and then perform the procedure again as if articulating the steps to a novice learner. The audio segments were transcribed and, together with the video components, analyzed to create a list of steps for each expert. Consensus was achieved between experts during individual interviews followed by a group discussion. A "gold standard" list or teaching protocol was created for each malpresentation. CTA was useful in defining the technical and cognitive steps required to both perform and teach the tasks effectively. Differences between experts highlight the need for consensus before teaching the skill. In addition, the study identified several different, yet effective, techniques and provided information that could allow experts to consider other approaches they might use when their own technique fails.

  9. Yucca Mountain transportation routes: Preliminary characterization and risk analysis; Volume 2, Figures [and] Volume 3, Technical Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Souleyrette, R.R. II; Sathisan, S.K.; di Bartolo, R. [Nevada Univ., Las Vegas, NV (United States). Transportation Research Center

    1991-05-31

    This report presents appendices related to the preliminary assessment and risk analysis for high-level radioactive waste transportation routes to the proposed Yucca Mountain Project repository. Information includes data on population density, traffic volume, ecologically sensitive areas, and accident history.

  10. Stereological analysis of nuclear volume in recurrent meningiomas

    DEFF Research Database (Denmark)

    Madsen, C; Schrøder, H D

    1994-01-01

    A stereological estimation of nuclear volume in recurrent and non-recurrent meningiomas was made. The aim was to investigate whether this method could discriminate between these two groups. We found that the mean nuclear volumes in recurrent meningiomas were all larger at debut than in any of the control tumors. The mean nuclear volume of the individual recurrent tumors appeared to change with time, showing a tendency to diminish. A relationship between large nuclear volume at presentation and the number of, or time interval between, recurrences was not found. We conclude that measurement of mean…

  11. Efficacy of bronchoscopic lung volume reduction: a meta-analysis

    Directory of Open Access Journals (Sweden)

    Iftikhar IH

    2014-05-01

    Full Text Available Imran H Iftikhar,1 Franklin R McGuire,1 Ali I Musani2; 1Department of Medicine, Division of Pulmonary, Critical Care and Sleep Medicine, University of South Carolina, Columbia, SC, USA; 2Department of Medicine, Division of Pulmonary, Critical Care and Sleep Medicine, National Jewish Health, Denver, CO, USA. Background: Over the last several years, the morbidity, mortality, and high costs associated with lung volume reduction (LVR) surgery have fuelled the development of different methods for bronchoscopic LVR (BLVR) in patients with emphysema. In this meta-analysis, we sought to study and compare the efficacy of most of these methods. Methods: Eligible studies were retrieved from PubMed and Embase for the following BLVR methods: one-way valves, sealants (BioLVR), LVR coils, airway bypass stents, and bronchial thermal vapor ablation. Primary study outcomes included the mean change post-intervention in the lung function tests, the 6-minute walk distance, and the St George's Respiratory Questionnaire. Secondary outcomes included treatment-related complications. Results: Except for the airway bypass stents, all other methods of BLVR showed efficacy in primary outcomes. However, in comparison, the BioLVR method showed the most significant findings and was the least associated with major treatment-related complications. For the BioLVR method, the mean change in forced expiratory volume in the first second was 0.18 L (95% confidence interval [CI]: 0.09 to 0.26; P<0.001); in 6-minute walk distance, 23.98 m (95% CI: 12.08 to 35.88; P<0.01); and in St George's Respiratory Questionnaire, -8.88 points (95% CI: -12.12 to -5.64; P<0.001). Conclusion: The preliminary findings of our meta-analysis signify the importance of most methods of BLVR. The magnitude of the effect on selected primary outcomes shows noninferiority, if not equivalence, when compared to what is known for surgical LVR. Keywords: emphysema, endobronchial valves, sealants, stents, coils

  12. Brain connectivity analysis from EEG signals using stable phase-synchronized states during face perception tasks

    Science.gov (United States)

    Jamal, Wasifa; Das, Saptarshi; Maharatna, Koushik; Pan, Indranil; Kuyucu, Doga

    2015-09-01

    Degree of phase synchronization between different electroencephalogram (EEG) channels is known to be a manifestation of the underlying mechanism of information coupling between different brain regions. In this paper, we apply a continuous wavelet transform (CWT) based analysis technique to EEG data, captured during face perception tasks, to explore the temporal evolution of phase synchronization from the onset of a stimulus. Our explorations show that there exists a small set (typically 3-5) of unique synchronized patterns or synchrostates, each of which is stable on the order of milliseconds. In particular, in the beta (β) band, which has been reported to be associated with visual processing tasks, the number of such stable states has consistently been found to be three. During processing of the stimulus, switching between these states occurs abruptly, but the switching characteristic follows a well-behaved and repeatable sequence. This is observed in single-subject analysis as well as in a multiple-subject group analysis in adults during face perception. We also show that although these patterns remain topographically similar for the general category of face perception tasks, the sequence of their occurrence and their temporal stability vary markedly between different face perception scenarios (stimuli), pointing toward different, stimulus-specific dynamical characteristics of information processing. Subsequently, we translated these stable states into brain complex networks and derived informative network measures for characterizing the degree of segregated processing and information integration in those synchrostates, leading to a new methodology for characterizing information processing in the human brain. The proposed methodology of modeling functional brain connectivity through synchrostates may be viewed as a new way of quantitative characterization of the cognitive ability of the subject, stimuli and information integration…
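    Pairwise phase synchronization of the kind tracked above is commonly summarized by a phase-locking value (PLV); a minimal sketch using the Hilbert analytic phase as a stand-in for the paper's wavelet phase (synthetic signals; real EEG would first be band-pass filtered to the band of interest):

        import numpy as np
        from scipy.signal import hilbert

        rng = np.random.default_rng(0)
        fs = 256
        t = np.arange(0, 2, 1 / fs)
        # Two stand-in "channels" sharing a 20 Hz (beta-band) rhythm.
        x = np.sin(2 * np.pi * 20 * t) + 0.5 * rng.normal(size=t.size)
        y = np.sin(2 * np.pi * 20 * t + 0.3) + 0.5 * rng.normal(size=t.size)

        dphi = np.angle(hilbert(x)) - np.angle(hilbert(y))
        plv = np.abs(np.mean(np.exp(1j * dphi)))
        print(f"PLV = {plv:.3f}")  # ~1 when phase-locked, ~0 when unrelated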

  13. Style, content and format guide for writing safety analysis documents. Volume 1, Safety analysis reports for DOE nuclear facilities

    Energy Technology Data Exchange (ETDEWEB)

    1994-06-01

    The purpose of Volume 1 of this 4-volume style guide is to furnish guidelines on writing and publishing Safety Analysis Reports (SARs) for DOE nuclear facilities at Sandia National Laboratories. The scope of Volume 1 encompasses not only the general guidelines for writing and publishing, but also the prescribed topics/appendices contents along with examples from typical SARs for DOE nuclear facilities.

  14. District heating and cooling systems for communities through power plant retrofit and distribution network. Volume 3. Tasks 4-6. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Watt, J.R.; Sommerfield, G.A.

    1979-08-01

    Stone and Webster Engineering Corporation is a member of the Demonstration Team to review and assess the technical aspects of cogeneration for district heating. Task 4 details the most practical retrofit schemes. Of the cogeneration schemes studied, a back-pressure turbine is considered the best source of steam for district heating. Battelle Columbus Laboratories is a member of the Demonstration Team employed to investigate several institutional issues affecting the success of district heating. The Toledo Edison legal staff reviewed the legal aspects of mandate to serve, easement and franchise requirements, and corporate charter requirements. The principal findings of both the Battelle investigations and the legal research are summarized in Task 5. A complete discussion of each issue is included in the two sections labeled Legal Issues and Institutional Issues. In Task 6, Battelle Columbus Laboratories completed a preliminary economic analysis, incorporating accurate input parameters applicable to utility ownership of the proposed district-heating system. The methodology used is summarized, the assumptions are listed, and the results are briefly reviewed.

  15. Performance Analysis Of A Upnp/Dhcompliant Robotic Adapter For Collaborative Tasks Development

    Directory of Open Access Journals (Sweden)

    Alejandro Alvarez Vazquez

    2012-02-01

    Full Text Available This paper describes the performance analysis of an adapter compliant with the UPnP DHCompliant (Digital Home Compliant) standard for a service robot. The DHCompliant adapter has been developed to overcome some limitations of the UPnP protocol and to develop new DHC concepts. Moreover, it showcases with a particular example how the open DHC protocol is useful for the development of collaborative tasks, localization, energy management and other fields altogether. That interoperability between devices yields a virtual device that holds the control-point logic and the device logic simultaneously.

  16. Task and error analysis balancing benefits over business of electronic medical records.

    Science.gov (United States)

    Carstens, Deborah Sater; Rodriguez, Walter; Wood, Michael B

    2014-01-01

    Task and error analysis research was performed to identify: a) the process by which healthcare organisations manage healthcare for patients with mental illness or substance abuse; b) how the process can be enhanced; and c) whether electronic medical records (EMRs) have a role in this process from a business and safety perspective. The research question is whether EMRs have a role in enhancing the healthcare for patients with mental illness or substance abuse. The business of EMRs is also discussed, to understand the balancing act between the safety and business aspects of an EMR.

  17. Analysis of brain activity and response to colour stimuli during learning tasks: an EEG study

    Science.gov (United States)

    Folgieri, Raffaella; Lucchiari, Claudio; Marini, Daniele

    2013-02-01

    The research project intends to demonstrate how EEG detection through a BCI device can improve the analysis and interpretation of colour-driven cognitive processes, through the combined approach of cognitive science and information technology methods. To this end, an experiment was first designed to compare the results of the traditional (qualitative and quantitative) cognitive analysis approach with EEG analysis of the evoked potentials. In our case, the sensory stimulus is represented by colours, while the cognitive task consists of remembering the words appearing on the screen, with different combinations of foreground (word) and background colours. In this work we analysed data collected from a sample of students involved in a learning process during which they received visual stimuli based on colour variation. The stimuli concerned both the background of the text to be learned and the colour of the characters. The experiment indicated some interesting results concerning the use of primary (RGB) and complementary (CMY) colours.

  18. A Content Analysis of General Chemistry Laboratory Manuals for Evidence of Higher-Order Cognitive Tasks

    Science.gov (United States)

    Domin, Daniel S.

    1999-01-01

    The science laboratory instructional environment is ideal for fostering the development of problem-solving, manipulative, and higher-order thinking skills: the skills needed by today's learner to compete in an ever increasing technology-based society. This paper reports the results of a content analysis of ten general chemistry laboratory manuals. Three experiments from each manual were examined for evidence of higher-order cognitive activities. Analysis was based upon the six major cognitive categories of Bloom's Taxonomy of Educational Objectives: knowledge, comprehension, application, analysis, synthesis, and evaluation. The results of this study show that the overwhelming majority of general chemistry laboratory manuals provide tasks that require the use of only the lower-order cognitive skills: knowledge, comprehension, and application. Two of the laboratory manuals were disparate in having activities that utilized higher-order cognition. I describe the instructional strategies used within these manuals to foster higher-order cognitive development.

  19. Demonstration project as a procedure for accelerating the application of new technology (Charpie Task Force report). Volume II

    Energy Technology Data Exchange (ETDEWEB)

    None

    1978-02-01

    This report examines the issues associated with government programs proposed for the ''commercialization'' of new energy technologies; these programs are intended to hasten the pace at which target technologies are adopted by the private sector. The ''commercial demonstration'' is the principal tool used in these programs. Most previous government interventions in support of technological change have focused on R and D and left to the private sector the decision as to adoption for commercial utilization; thus there is relatively little in the way of analysis or experience which bears direct application. The analysis is divided into four sections. First, the role of R, D, and D within the structure of the national energy goals and policies is examined. The issue of ''prices versus gaps'' is described as a crucial difference of viewpoint concerning the role of the government in the future of the energy system. Second, the process of technological change as it occurs with respect to energy technologies is then examined for possible sources of misalignment of social and private incentives. The process is described as a series of investments. Third, correction of these sources of misalignment then becomes the goal of commercial demonstration programs as this goal and the means for attaining it are explored. Government-supported commercialization may be viewed as a subsidy to the introduction stage of the process; the circumstances under which such subsidies are likely to affect the success of the subsequent diffusion stage are addressed. The discussion then turns to the political, legal, and institutional problems. Finally, methods for evaluation and planning of commercial demonstration programs are analyzed. The critical areas of ignorance are highlighted and comprise a research agenda for improved analytical techniques to support decisions in this area.

  20. A genetic analysis of brain volumes and IQ in children

    NARCIS (Netherlands)

    van Leeuwen, Marieke; Peper, Jiska S.; van den Berg, Stephanie M.; Brouwer, Rachel M.; Pol, Hilleke E. Hulshoff; Kahn, Rene S.; Boomsma, Dorret I.

    2009-01-01

    In a population-based sample of 112 nine-year-old twin pairs, we investigated the association among total brain volume, gray matter and white matter volume, intelligence as assessed by the Raven IQ test, verbal comprehension, perceptual organization and perceptual speed as assessed by the Wechsler Intelligence Scale for Children-III.

  1. A Genetic Analysis of Brain Volumes and IQ in Children

    Science.gov (United States)

    van Leeuwen, Marieke; Peper, Jiska S.; van den Berg, Stephanie M.; Brouwer, Rachel M.; Hulshoff Pol, Hilleke E.; Kahn, Rene S.; Boomsma, Dorret I.

    2009-01-01

    In a population-based sample of 112 nine-year old twin pairs, we investigated the association among total brain volume, gray matter and white matter volume, intelligence as assessed by the Raven IQ test, verbal comprehension, perceptual organization and perceptual speed as assessed by the Wechsler Intelligence Scale for Children-III. Phenotypic…

  2. A genetic analysis of brain volumes and IQ in children

    NARCIS (Netherlands)

    Leeuwen, van Marieke; Peper, Jiska S.; Berg, van den Stephanie M.; Brouwer, Rachel M.; Hulshoff Pol, Hilleke E.; Kahn, Rene S.; Boomsma, Dorret I.

    2009-01-01

    In a population-based sample of 112 nine-year-old twin pairs, we investigated the association among total brain volume, gray matter and white matter volume, intelligence as assessed by the Raven IQ test, verbal comprehension, perceptual organization and perceptual speed as assessed by the Wechsler Intelligence Scale for Children-III.

  3. An Empirical Analysis of Interspersal Research Evidence, Implications, and Applications of the Discrete Task Completion Hypothesis.

    Science.gov (United States)

    Skinner, Christopher H.

    2002-01-01

    Researchers have posited that when students work on assignments with many discrete tasks, that each completed discrete task may be a conditioned reinforcer. If the discrete task completion hypothesis is accurate, then relative task completion rates should influence choice behavior in the same manner as relative rates of reinforcement. Results of a…

  4. A task analysis of pier side ship-handling for virtual environment ship-handling simulator scenario development

    OpenAIRE

    Grassi, Charles R.

    2000-01-01

    Approved for public release; distribution is unlimited. Researchers at the Naval Air Warfare Center Training Systems Division (NAWCTSD) in Orlando, FL have developed a testbed for the Conning Officer Virtual Environment (COVE) ship-handling simulator. The purpose of this task analysis was to provide a workable document that they could use in developing pier side ship-handling scenarios for their simulator. The task analysis not only identified the general procedures and methodologies…

  5. Performance Task using Video Analysis and Modelling to promote K12 eight practices of science

    CERN Document Server

    Wee, Loo Kang

    2015-01-01

    We share the use of Tracker as a pedagogical tool for the effective learning and teaching of physics performance tasks, taking root in some Singapore Grade 9 (Secondary 3) schools. We discuss how the pedagogical use of Tracker helps students to be like scientists during these 6 to 10 weeks, in which all Grade 9 students conduct a personal video analysis and, where appropriate, apply the 8 practices of science (1. ask questions, 2. use models, 3. plan and carry out investigations, 4. analyse and interpret data, 5. use mathematical and computational thinking, 6. construct explanations, 7. discuss from evidence, and 8. communicate information). We situate our sharing in actual student work and discuss how Tracker can be an effective pedagogical tool. Initial research findings suggest that allowing learners to conduct performance tasks using Tracker, a free open-source video analysis and modelling tool, guided by the 8 practices of science and engineering, could be an innovative and effective way to mentor authentic…

  6. Common tasks in microscopic and ultrastructural image analysis using ImageJ.

    Science.gov (United States)

    Papadopulos, Francesca; Spinelli, Matthew; Valente, Sabrina; Foroni, Laura; Orrico, Catia; Alviano, Francesco; Pasquinelli, Gianandrea

    2007-01-01

    Cooperation between research communities and software-development teams has led to the creation of novel software. The purpose of this paper is to show an alternative work method based on the use of ImageJ (http://rsb.info.nih.gov/ij/), which can be effectively employed to solve common microscopic and ultrastructural image analysis tasks. As open-source software, ImageJ provides the possibility of working in a free development/sharing world. Its very "friendly" graphical user interface helps users manage and edit biomedical images. On-line material such as handbooks, wikis, and plugins leads users through the various functions, giving clues about potential new applications. ImageJ is not only morphometric analysis software; it is sufficiently flexible to be adapted to the numerous requirements encountered in laboratories, for routine work as well as research. Examples include area measurements on selectively stained tissue components, cell counts and area measurements at the single-cell level, immunohistochemical antigen quantification, and immunoelectron microscopy gold particle counts.

  7. Determination of fiber volume in graphite/epoxy materials using computer image analysis

    Science.gov (United States)

    Viens, Michael J.

    1990-01-01

    The fiber volume of graphite/epoxy specimens was determined by analyzing optical images of cross sectioned specimens using image analysis software. Test specimens were mounted and polished using standard metallographic techniques and examined at 1000 times magnification. Fiber volume determined using the optical imaging agreed well with values determined using the standard acid digestion technique. The results were found to agree within 5 percent over a fiber volume range of 45 to 70 percent. The error observed is believed to arise from fiber volume variations within the graphite/epoxy panels themselves. The determination of ply orientation using image analysis techniques is also addressed.
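    The core measurement — thresholding the micrograph and taking the bright-pixel area fraction as the fiber volume fraction — can be sketched with scikit-image; Otsu's threshold stands in here for whatever rule the original software applied (image synthetic):

        import numpy as np
        from skimage.filters import threshold_otsu

        rng = np.random.default_rng(0)
        # Stand-in micrograph: bright circular "fibers" on a darker matrix.
        img = rng.normal(80.0, 10.0, size=(256, 256))
        yy, xx = np.mgrid[0:256, 0:256]
        for cy, cx in rng.integers(10, 246, size=(120, 2)):
            img[(yy - cy) ** 2 + (xx - cx) ** 2 < 36] = 180.0

        mask = img > threshold_otsu(img)  # fiber pixels
        print(f"estimated fiber volume fraction: {mask.mean():.1%}")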

  8. STATE-OF-THE-ART TASKS AND ACHIEVEMENTS OF PARALINGUISTIC SPEECH ANALYSIS SYSTEMS

    Directory of Open Access Journals (Sweden)

    A. A. Karpov

    2016-07-01

    Full Text Available We present an analytical survey of state-of-the-art tasks in the area of computational paralinguistics, as well as recent achievements of automatic systems for the paralinguistic analysis of conversational speech. Paralinguistics studies non-verbal aspects of human communication and speech such as natural emotions, accents, psycho-physiological states, pronunciation features, speaker voice parameters, etc. We describe the architecture of a baseline computer system for acoustical paralinguistic analysis, its main components, and useful speech processing methods. We present some information on an international contest called the Computational Paralinguistics Challenge (ComParE), which has been held each year since 2009 in the framework of the International conference INTERSPEECH organized by the International Speech Communication Association. We present the sub-challenges (tasks) that were proposed at the ComParE Challenges in 2009-2016, and analyze the winning computer systems for each sub-challenge and the obtained results. The last completed Challenge, ComParE-2015, was organized in September 2015 in Germany and proposed 3 sub-challenges: 1) the Degree of Nativeness (DN) sub-challenge, determination of the nativeness degree of speakers based on acoustics; 2) the Parkinson's Condition (PC) sub-challenge, recognition of a degree of Parkinson's condition based on speech analysis; 3) the Eating Condition (EC) sub-challenge, determination of the eating condition state during speaking or a dialogue, and classification of the consumed food type (one of seven classes of food) by the speaker. In the last sub-challenge (EC), the winner was a joint Turkish-Russian team consisting of the authors of the given paper. We have developed the most efficient computer-based system for detection and classification of the corresponding (EC) acoustical paralinguistic events. The paper deals with the architecture of this system, its main modules and methods, as well as the description of the used training and evaluation…

  9. The application of independent component analysis with projection method to two-task fMRI data over multiple subjects

    Science.gov (United States)

    Li, Rui; Hui, Mingqi; Yao, Li; Chen, Kewei; Long, Zhiying

    2011-03-01

    Spatial Independent component analysis (sICA) has been successfully used to analyze functional magnetic resonance (fMRI) data. However, the application of ICA was limited in multi-task fMRI data due to the potential spatial dependence between task-related components. Long et al. (2009) proposed ICA with linear projection (ICAp) method and demonstrated its capacity to solve the interaction among task-related components in multi-task fMRI data of single subject. However, it's unclear that how to perform ICAp over a group of subjects. In this study, we proposed a group analysis framework on multi-task fMRI data by combining ICAp with the temporal concatenation method reported by Calhoun (2001). The results of real fMRI experiment containing multiple visual processing tasks demonstrated the feasibility and effectiveness of the group ICAp method. Moreover, compared to the GLM method, the group ICAp method is more sensitive to detect the regions specific to each task.
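    A minimal sketch of the temporal-concatenation step this group framework builds on (Calhoun, 2001), under stated assumptions: synthetic arrays stand in for preprocessed fMRI, scikit-learn's FastICA stands in for the ICA engine, and the ICAp projection step itself is not reproduced.

```python
import numpy as np
from sklearn.decomposition import FastICA

# Synthetic stand-ins for preprocessed fMRI: 12 subjects, 50 volumes, 500 voxels.
n_sub, n_time, n_vox = 12, 50, 500
rng = np.random.default_rng(0)
data = [rng.standard_normal((n_time, n_vox)) for _ in range(n_sub)]

group = np.vstack(data)                    # concatenate along time: (600, 500)
ica = FastICA(n_components=20, random_state=0)
spatial_maps = ica.fit_transform(group.T)  # (n_vox, 20): spatially independent maps
timecourses = ica.mixing_                  # (600, 20): stacked per-subject courses
print(spatial_maps.shape, timecourses.shape)
```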

  10. Measuring novices' field mapping abilities using an in-class exercise based on expert task analysis

    Science.gov (United States)

    Caulkins, J. L.

    2010-12-01

    We are interested in developing a model of expert-like behavior for improving the teaching methods of undergraduate field geology. Our aim is to assist students in mastering the process of field mapping more efficiently and effectively and to improve their ability to think creatively in the field. To examine expert mapping behavior, a cognitive task analysis was conducted with expert geologic mappers in an attempt to define the process of geologic mapping (i.e. to understand how experts carry out geological mapping). The task analysis indicates that expert mappers have a wealth of geologic scenarios at their disposal that they compare against examples seen in the field, experiences that most undergraduate mappers will not have had. While presenting students with many geological examples in class may increase their understanding of geologic processes, novices still struggle when presented with a novel field situation. Based on the task analysis, a short (45-minute) paper-map-based exercise was designed and tested with 14 pairs of 3rd year geology students. The exercise asks students to generate probable geologic models based on a series of four (4) data sets. Each data set represents a day’s worth of data; after the first “day,” new sheets simply include current and previously collected data (e.g. “Day 2” data set includes data from “Day 1” plus the new “Day 2” data). As the geologic complexity increases, students must adapt, reject or generate new geologic models in order to fit the growing data set. Preliminary results of the exercise indicate that students who produced more probable geologic models, and higher ratios of probable to improbable models, tended to go on to do better on the mapping exercises at the 3rd year field school. These results suggest that those students with more cognitively available geologic models may be more able to use these models in field settings than those who are unable to draw on these models for whatever reason.

  11. Three-dimensional knee kinematics by conventional gait analysis for eleven motor tasks of daily living: typical patterns and repeatability.

    Science.gov (United States)

    Scheys, Lennart; Leardini, Alberto; Wong, Pius D; Van Camp, Laurent; Callewaert, Barbara; Bellemans, Johan; Desloovere, Kaat

    2013-04-01

    The availability of detailed knee kinematic data during various activities can facilitate clinical studies of this joint. To describe in detail normal knee joint rotations in all three anatomical planes, 25 healthy subjects (aged 22-49 years) performed eleven motor tasks, including walking, step ascent and descent, each with and without sidestep or crossover turns, chair rise, mild and deep squats, and forward lunge. Kinematic data were obtained with a conventional lower-body gait analysis protocol over three trials per task. To assess the repeatability with standard indices, a representative subset of 10 subjects underwent three repetitions of the entire motion capture session. Extracted parameters with good repeatability included maximum and minimum axial rotation during turning, local extremes of the flexion curves during gait tasks, and stride times. These specific repeatable parameters can be used for task selection or power analysis when planning future clinical studies.

  12. Mission analysis of photovoltaic solar energy conversion. Volume I. Executive summary

    Energy Technology Data Exchange (ETDEWEB)

    Leonard, S.L.; Rattin, E.J.; Siegel, B.

    1977-03-01

    An investigation of terrestrial applications for the photovoltaic conversion of solar energy is summarized. The specific objectives of the study were: (a) to survey and evaluate near-term (1976--1985) civilian photovoltaic applications in the United States; (b) to evaluate the most promising major missions for the mid-term period (1986--2000) and to determine the conditions under which photovoltaic technology can compete in those applications at array prices consistent with ERDA goals; (c) to address critical external issues and identify the sensitivity of photovoltaic system technical requirements to such factors; and (d) to quantify the societal costs of alternative energy sources and identify equalizing incentives. The study was divided into six separate but interrelated tasks: Task 1, Analysis of Near-Term Applications; Task 2, Analysis of Major Mid-Term Missions; Task 3, Review and Updating of the ERDA Technology Implementation Plan; Task 4, Critical External Issues; Task 5, The Impact of Incentives; and Task 6, The Societal Costs of Conventional Power Generation. The emphasis of the study was on the first two of these tasks, the other four serving to provide supplementary information.

  13. Design and task analysis for a game-based shiphandling simulator using an open source game engine (DELTA3D)

    OpenAIRE

    de Moraes, Claudio Coreixas

    2011-01-01

    Approved for public release; distribution is unlimited. This thesis describes an application designed to reduce the knowledge gap between classroom instruction and hands-on training onboard naval academy training boats (YPs). The goal was to develop a proof-of-concept game-based simulator that uses 3D graphics to replicate basic tasks executed onboard the YPs. Two missions were selected for a brief task analysis study to determine the design of the respective game scenario and requirements. The design process involve...

  14. Respiratory Care/Inhalation Therapy Occupations: Task Analysis Data. UCLA Allied Health Professions Project.

    Science.gov (United States)

    Freeland, Thomas E.; Goldsmith, Katherine L.

    This study's objectives were to explore and analyze task interrelationships among department personnel; determine what specific tasks are currently performed in inhalation therapy/respiratory care departments; propose a series of appropriate tasks for occupational titles; and report future plans of the AHPP in the area of study. Contents include…

  15. Curriculum Construction: A Critical Analysis of Rich Tasks in the Recontextualisation Field

    Science.gov (United States)

    Macdonald, Doune; Hunter, Lisa; Tinning, Richard

    2007-01-01

    Within Education Queensland's recent "new basics" curriculum initiative, Education Queensland developed 20 transdisciplinary learning and assessment tasks for Years 1 to 9, called "rich tasks". This paper critiques two of the rich tasks that were most closely aligned to knowledge and skills within the health and physical education learning area.…

  16. Analysis of flexible aircraft longitudinal dynamics and handling qualities. Volume 1: Analysis methods

    Science.gov (United States)

    Waszak, M. R.; Schmidt, D. S.

    1985-01-01

    As aircraft become larger and lighter due to design requirements for increased payload and improved fuel efficiency, they will also become more flexible. For highly flexible vehicles, the handling qualities may not be accurately predicted by conventional methods. This study applies two analysis methods to a family of flexible aircraft in order to investigate how and when structural (especially dynamic aeroelastic) effects affect the dynamic characteristics of aircraft. The first is an open-loop model analysis technique that considers the effects of modal residue magnitudes on vehicle handling qualities. The second is a pilot-in-the-loop analysis procedure that considers several closed-loop system characteristics. Volume 1 consists of the development and application of the two analysis methods described above.

  17. Hydrogen Safety Project chemical analysis support task: Window "C" volatile organic analysis

    Energy Technology Data Exchange (ETDEWEB)

    Gillespie, B.M.; Stromatt, R.W.; Ross, G.A.; Hoope, E.A.

    1992-01-01

    This data package contains the results obtained by Pacific Northwest Laboratory (PNL) staff in the characterization of samples for the 101-SY Hydrogen Safety Project. The samples were submitted for analysis by Westinghouse Hanford Company (WHC) under the Technical Project Plan (TPP) 17667 and the Quality Assurance Plan MCS-027. They came from a core taken during Window "C" after the May 1991 gas release event. The analytical procedures required for analysis were defined in the Test Instructions (TI) prepared by the PNL 101-SY Analytical Chemistry Laboratory (ACL) Project Management Office in accordance with the TPP and the QA Plan. The requested analysis for these samples was volatile organic analysis. The quality control (QC) requirements for each sample are defined in the Test Instructions for each sample. The QC requirements outlined in the procedures and requested in the WHC statement of work were followed.

  18. An analysis of a partial task training strategy for profoundly retarded institutionalized clients.

    Science.gov (United States)

    Cipani, E

    1985-03-01

    This study investigated the effects of a partial task training strategy on productivity and on-task behavior in three profoundly retarded institutionalized clients in a pre-skills workshop classroom. Partial task training consisted of the presentation of "mini-tasks," with reinforcement for completion of those tasks. Additionally, behavior monitors were used to provide the clients with further positive comments and prompts. The results indicated that this strategy was effective in decreasing high rates of off-task behavior and in substantially increasing the number of pieces completed during the session. However, the effect on other inappropriate behaviors was minimal. This strategy demonstrated that profoundly retarded clients could be taught to increase on-task behavior and productivity in pre-skills workshop classes.

  19. Components of Task-Based Needs Analysis of the ESP Learners with the Specialization of Business and Tourism

    Science.gov (United States)

    Poghosyan, Naira

    2016-01-01

    In the following paper we shall thoroughly analyze the target learning needs of the learners within an ESP (English for Specific Purposes) context. The main concerns of ESP have always been and remain with the needs analysis, text analysis and preparing learners to communicate effectively in the tasks prescribed by their study or work situation.…

  1. Job task and functional analysis of the Division of Reactor Projects, office of Nuclear Reactor Regulation. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Morzinski, J.A.; Gilmore, W.; Hahn, H.A.

    1998-07-10

    A job task and functional analysis was recently completed for the positions that make up the regional Divisions of Reactor Projects. Among the conclusions of that analysis was a recommendation to clarify roles and responsibilities among site, regional, and headquarters personnel. As that analysis did not cover headquarters personnel, a similar analysis was undertaken of three headquarters positions within the Division of Reactor Projects: Licensing Assistants, Project Managers, and Project Directors. The goals of this analysis were to systematically evaluate the tasks performed by these headquarters personnel to determine job training requirements, to account for variations due to division/regional assignment or differences in several experience categories, and to determine how, and by which positions, certain functions are best performed. The results of this analysis include recommendations for training and for job design. Data to support this analysis was collected by a survey instrument and through several sets of focus group meetings with representatives from each position.

  2. Set-based Tasks within the Singularity-robust Multiple Task-priority Inverse Kinematics Framework: General Formulation, Stability Analysis and Experimental Results

    Directory of Open Access Journals (Sweden)

    Signe eMoe

    2016-04-01

    Full Text Available Inverse kinematics algorithms are commonly used in robotic systems to transform tasks to joint references, and several methods exist to ensure the achievement of several tasks simultaneously. The multiple task-priority inverse kinematics framework allows tasks to be considered in a prioritized order by projecting task velocities through the nullspaces of higher-priority tasks. This paper extends this framework to handle set-based tasks, i.e. tasks with a range of valid values, in addition to equality tasks, which have a specific desired value. Examples of set-based tasks are joint limit and obstacle avoidance. The proposed method is proven to ensure asymptotic convergence of the equality task errors and the satisfaction of all high-priority set-based tasks. The practical implementation of the proposed algorithm is discussed, and experimental results are presented where a number of both set-based and equality tasks have been implemented on a 6-degree-of-freedom UR5, an industrial robotic arm from Universal Robots. The experiments validate the theoretical results and confirm the effectiveness of the proposed approach.
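    The equality-task core of such a framework, prioritized task velocities projected through nullspaces, can be sketched as follows. This is the classical two-task priority solution with a damped pseudoinverse, under illustrative assumptions (random Jacobians instead of a real UR5 model), not the paper's set-based extension.

```python
import numpy as np

def damped_pinv(J, damp=1e-6):
    # Damped pseudoinverse; robust near singular configurations.
    return J.T @ np.linalg.inv(J @ J.T + damp * np.eye(J.shape[0]))

n = 6                                                    # joints (a UR5 has 6)
rng = np.random.default_rng(0)
J1, J2 = rng.standard_normal((3, n)), rng.standard_normal((3, n))  # task Jacobians
dx1, dx2 = rng.standard_normal(3), rng.standard_normal(3)          # task velocities

dq1 = damped_pinv(J1) @ dx1                # highest-priority task
N1 = np.eye(n) - damped_pinv(J1) @ J1      # nullspace projector of task 1
dq = dq1 + damped_pinv(J2 @ N1) @ (dx2 - J2 @ dq1)  # task 2 without disturbing task 1
print(dq)
```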

  3. Assessment of solar options for small power systems applications. Volume III. Analysis of concepts

    Energy Technology Data Exchange (ETDEWEB)

    Laity, W.W.; Aase, D.T.; Apley, W.J.; Bird, S.P.; Drost, M.K.; Garrett-Price, B.A.; Williams, T.A.

    1980-09-01

    A comparative analysis of solar thermal conversion concepts that are potentially suitable for development as small electric power systems (1 to 10 MWe) is given. Seven generic types of collectors, together with associated subsystems for electric power generation, were considered. The collectors can be classified into three categories: (1) two-axis tracking (with compound-curvature reflecting surfaces); (2) one-axis tracking (with single-curvature reflecting surfaces); and (3) nontracking (with low-concentration reflecting surfaces). All seven collectors were analyzed in conceptual system configurations with Rankine-cycle engines. In addition, two of the collectors (the Point Focus Central Receiver and the Point Focus Distributed Receiver) were analyzed with Brayton-cycle engines, and the latter of the two also was analyzed with Stirling-cycle engines. This volume describes the systems analyses performed on all the alternative configurations of the seven generic collector concepts and the results obtained. The SOLSTEP computer code used to determine each configuration's system cost and performance is briefly described. The collector and receiver performance calculations used are also presented. The capital investment and related costs that were obtained from the systems studies are presented, and the levelized energy costs are given as a function of capacity factor obtained from the systems studies. Included also are the values of the other attributes used in the concepts' final ranking. The comments, conclusions, and recommendations developed by the PNL study team during the concept characterization and systems analysis tasks of the study are presented. (WHK)

  4. Analysis of industrial tasks as a tool for the inclusion of people with disabilities in the work market.

    Science.gov (United States)

    Simonelli, Angela Paula; Camarotto, João Alberto

    2008-01-01

    This article describes the application of a model for analyzing industrial tasks that was developed to identify jobs that could potentially be filled by people with disabilities (DP) and to serve as a guideline for a company hiring policy. In Brazil, Law No. 8213/91 makes it obligatory to hire DP based on quotas that are established according to the number of employees in public and private companies. Using a set of methods and techniques based on ergonomic work analysis and on occupational therapy, we sought to build a model to indicate the skills required to perform industrial tasks. The model was applied at 19 workstations at a Brazilian aircraft manufacturer in 2002. The task supervisor and the operator performing the task were interviewed, the work activity was filmed, a kinesiological analysis was done, the task was observed, and a checklist was applied to help recognize and systematize the skills involved in performing the job task. The last step consisted of correlating the skills required to perform the task with the potential skills of the various types of disability. It was found that 100% of the jobs could be filled by workers with low-level paraplegia, 89% by workers with general paraplegia, 0% by workers with low-level tetraplegia, 47% by workers with auditory impairment, 42% by workers with hemiplegia, 68% by upper limb amputees wearing adequate prostheses, and 89% by wheelchair users. The company hired 14 DP based on the results of this model. The model proved adequate for analyzing industrial tasks with a view to the inclusion of DP, and it can be applied to other sectors of industrial production.

  5. Volume analysis of supercooled water under high pressure

    OpenAIRE

    Duki, Solomon F.; Tsige, Mesfin

    2016-01-01

    Motivated by recent experimental findings on the volume of supercooled water at high pressure [O. Mishima, J. Chem. Phys. 133, 144503 (2010)] we performed atomistic molecular dynamics simulations study of bulk water in the isothermal-isobaric ensemble. Cooling and heating cycles at different isobars and isothermal compression at different temperatures are performed on the water sample with pressures that range from 0 to 1.0 GPa. The cooling simulations are done at temperatures that range from...

  6. Analysis of airborne radiometric data. Volume 3. Topical reports

    Energy Technology Data Exchange (ETDEWEB)

    Reed, J.H.; Shreve, D.C.; Sperling, M.; Woolson, W.A.

    1978-05-01

    This volume consists of four topical reports: a general discussion of the philosophy of unfolding spectra with continuum and discrete components, a mathematical treatment of the effects of various physical parameters on the uncollided gamma-ray spectrum at aircraft elevations, a discussion of the application of the unfolding code MAZNAI to airborne data, and a discussion of the effects of the nonlinear relationship between energy deposited and pulse height in NaI(Tl) detectors.

  7. Method for Determining Language Objectives and Criteria. Volume II. Methodological Tools: Computer Analysis, Data Collection Instruments.

    Science.gov (United States)

    1979-05-25

    This volume presents (1) methods for computer and hand analysis of numerical language performance data (with examples), and (2) samples of interview, observation, and survey instruments used in collecting language data. (Author)

  8. Video Analysis and Modeling Performance Task to promote becoming like scientists in classrooms

    CERN Document Server

    Wee, Loo Kang

    2015-01-01

    This paper aims to share the use of Tracker, a free open-source video analysis and modeling tool that is increasingly used as a pedagogical tool for the effective learning and teaching of physics for Grade 9 (Secondary 3) students in Singapore schools, to make physics relevant to the real world. We discuss the pedagogical use of Tracker, guided by the Framework for K-12 Science Education by the National Research Council, USA, to help students become more like scientists. For a period of 6 to 10 weeks, students use video analysis coupled with the 8 practices of science such as 1. ask questions, 2. use models, 3. plan and carry out investigations, 4. analyse and interpret data, 5. use mathematical and computational thinking, 6. construct explanations, 7. argue from evidence and 8. communicate information. This paper focuses on discussing some of the performance task design ideas such as 3.1 flip video, 3.2 starting with simple classroom activities, 3.3 primer science activity, 3.4 integrative dynamics and kinematics l...

  9. Multifamily Building Operator Job/Task Analysis and Report: September 2013

    Energy Technology Data Exchange (ETDEWEB)

    Owens, C. M.

    2013-09-01

    The development of job/task analyses (JTAs) is one of three components of the Guidelines for Home Energy Professionals project and will allow industry to develop training resources, quality assurance protocols, accredited training programs, and professional certifications. The Multifamily Building Operator JTA identifies and catalogs all of the tasks performed by multifamily building operators, as well as the knowledge, skills, and abilities (KSAs) needed to perform the identified tasks.

  10. Multifamily Quality Control Inspector Job/Task Analysis and Report: September 2013

    Energy Technology Data Exchange (ETDEWEB)

    Owens, C. M.

    2013-09-01

    The development of job/task analyses (JTAs) is one of three components of the Guidelines for Home Energy Professionals project and will allow industry to develop training resources, quality assurance protocols, accredited training programs, and professional certifications. The Multifamily Quality Control Inspector JTA identifies and catalogs all of the tasks performed by multifamily quality control inspectors, as well as the knowledge, skills, and abilities (KSAs) needed to perform the identified tasks.

  11. Multifamily Energy Auditor Job/Task Analysis and Report: September 2013

    Energy Technology Data Exchange (ETDEWEB)

    Owens, C. M.

    2013-09-01

    The development of job/task analyses (JTAs) is one of three components of the Guidelines for Home Energy Professionals project and will allow industry to develop training resources, quality assurance protocols, accredited training programs, and professional certifications. The Multifamily Energy Auditor JTA identifies and catalogs all of the tasks performed by multifamily energy auditors, as well as the knowledge, skills, and abilities (KSAs) needed to perform the identified tasks.

  12. Multifamily Retrofit Project Manager Job/Task Analysis and Report: September 2013

    Energy Technology Data Exchange (ETDEWEB)

    Owens, C. M.

    2013-09-01

    The development of job/task analyses (JTAs) is one of three components of the Guidelines for Home Energy Professionals project and will allow industry to develop training resources, quality assurance protocols, accredited training programs, and professional certifications. The Multifamily Retrofit Project Manager JTA identifies and catalogs all of the tasks performed by multifamily retrofit project managers, as well as the knowledge, skills, and abilities (KSAs) needed to perform the identified tasks.

  13. Inferring biological tasks using Pareto analysis of high-dimensional data.

    Science.gov (United States)

    Hart, Yuval; Sheftel, Hila; Hausser, Jean; Szekely, Pablo; Ben-Moshe, Noa Bossel; Korem, Yael; Tendler, Avichai; Mayo, Avraham E; Alon, Uri

    2015-03-01

    We present the Pareto task inference method (ParTI; http://www.weizmann.ac.il/mcb/UriAlon/download/ParTI) for inferring biological tasks from high-dimensional biological data. Data are described as a polytope, and features maximally enriched closest to the vertices (or archetypes) allow identification of the tasks the vertices represent. We demonstrate that human breast tumors and mouse tissues are well described by tetrahedrons in gene expression space, with specific tumor types and biological functions enriched at each of the vertices, suggesting four key tasks.

  14. An Analysis of 'Distinctive' Utterances in Non-task-oriented Conversational Dialogue

    Science.gov (United States)

    Tokuhisa, Ryoko; Terashima, Ryuta

    In this paper, we investigate distinctive utterances in non-task-oriented conversational dialogue through a comparison between task-oriented dialogue and non-task-oriented conversational dialogue. We found that Indirect Responses (IRs) and Clarification Requests (CRs) are significant in non-task-oriented conversational dialogue. IRs are cooperative responses to another's question, while CRs are clarification questions. We analyzed the rhetorical relations involved in IRs and CRs, and found that IRs are generated by evidence and causal relations, while CRs are generated by elaboration and causal relations.

  15. Corticospinal activity during dual tasking: a systematic review and meta-analysis of TMS literature from 1995 to 2013.

    Science.gov (United States)

    Corp, Daniel T; Lum, Jarrad A G; Tooley, Gregory A; Pearce, Alan J

    2014-06-01

    This systematic review and meta-analysis was conducted across studies using transcranial magnetic stimulation to investigate corticospinal excitability and inhibition in response to a dual task (DT). Quantitative analysis was performed on eleven controlled studies that had included healthy participants over the age of 18 years. Results showed a small effect size for increased corticospinal excitability under DT conditions (SMD = 0.207; p = .217), and a small effect size (SMD = -0.253) demonstrating a significant decrease in corticospinal inhibition under DT conditions (p = .019). Meta-regression demonstrated that neither age, task type, nor task prioritisation accounted for the high variability in effect sizes between studies. A number of possible sources of within-study bias are identified, which reduced the level of evidence for study findings. The results show overall changes in corticospinal responses between single-task (ST) and DT conditions; however, further research is necessary to investigate variables that could account for differences in corticospinal responses between studies.
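    For readers unfamiliar with the effect-size units quoted above, a minimal sketch of a standardized mean difference (Cohen's d with pooled SD, plus the Hedges small-sample correction) follows; the numbers are invented for illustration, and a full meta-analysis would additionally weight and pool such values across studies.

```python
import math

def cohens_d(m1, sd1, n1, m2, sd2, n2):
    # Standardized mean difference with a pooled standard deviation.
    sp = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    return (m1 - m2) / sp

def hedges_g(d, n1, n2):
    # Small-sample correction commonly applied in meta-analyses.
    return d * (1 - 3 / (4 * (n1 + n2) - 9))

# Invented example: MEP amplitude under dual-task vs single-task conditions.
d = cohens_d(1.10, 0.40, 15, 1.02, 0.38, 15)
print(hedges_g(d, 15, 15))
```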

  16. Dose volume analysis in brachytherapy and stereotactic radiosurgery

    Energy Technology Data Exchange (ETDEWEB)

    Tozer-Loft, S.M

    2000-12-01

    A brief introduction to three branches of radiotherapy is given: interstitial brachytherapy, external beam megavoltage radiotherapy, and stereotactic radiosurgery. The current interest in issues around conformity, uniformity and optimisation is explained in the light of technical developments in these fields. A novel method of displaying dose-volume information, which mathematically suppresses the inverse-square law, as first suggested by L.L. Anderson for use in brachytherapy, is explained in detail, and some improvements proposed. These 'natural' histograms are extended to show the effects of real point sources which do not exactly follow the inverse-square law, and to demonstrate the in-target dose-volume distribution, previously unpublished. The histograms are used as a way of mathematically analysing the properties of theoretical mono-energetic radionuclides, and for demonstrating the dosimetric properties of a potential new brachytherapy source (Ytterbium-169). A new modification of the Anderson formalism is then described for producing Anderson Inverse-Square Shifted (AISS) histograms for the Gamma Knife, which are shown to be useful for demonstrating the quality of stereotactic radiosurgery dose distributions. A study is performed analysing the results of Gamma Knife treatments on 44 patients suffering from a benign brain tumour (acoustic neuroma). Follow-up data is used to estimate the volume shrinkage or growth of each tumour, and this measure of outcome is compared with a range of figures of merit which express different aspects of the quality of each dose distribution. The results are analysed in an attempt to answer the question: What are the important features of the dose distribution (conformality, uniformity, etc) which show a definite relationship with the outcome of the treatment? Initial results show positively that, when Gamma Knife radiosurgery is used to treat acoustic neuroma, some measures of conformality seem to have a surprising, but significant, association with outcome.

  17. Detecting Hidden Encrypted Volume Files via Statistical Analysis

    Directory of Open Access Journals (Sweden)

    Mario Piccinelli

    2015-05-01

    Full Text Available Nowadays various software tools have been developed for the purpose of creating encrypted volume files. Many of those tools are open source and freely available on the internet. Because of that, the probability of finding encrypted files which could contain forensically useful information has dramatically increased. While decoding these files without the key is still a major challenge, the simple fact of being able to recognize their existence is now a top priority for every digital forensics investigation. In this paper we will present a statistical approach to find elements of a seized filesystem which have a reasonable chance of containing encrypted data.
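    The statistical intuition behind such detection can be sketched simply: encrypted data is statistically close to a uniform byte stream, so near-maximal byte entropy flags candidate volumes. The cutoff below is an illustrative assumption, not the paper's calibrated test, and the file name is hypothetical.

```python
import math
from collections import Counter

def byte_entropy(path, chunk=1 << 20):
    # Shannon entropy of the byte histogram, in bits per byte (max 8.0).
    counts, total = Counter(), 0
    with open(path, "rb") as f:
        while block := f.read(chunk):
            counts.update(block)
            total += len(block)
    return -sum(c / total * math.log2(c / total) for c in counts.values())

h = byte_entropy("suspect.bin")   # hypothetical file from a seized filesystem
print(f"{h:.3f} bits/byte", "-> candidate encrypted volume" if h > 7.99 else "")
```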

  18. Synfuel program analysis. Volume 2: VENVAL users manual

    Science.gov (United States)

    Muddiman, J. B.; Whelan, J. W.

    1980-07-01

    This volume is intended for program analysts and is a users manual for the VENVAL model. It contains specific explanations as to input data requirements and programming procedures for the use of this model. VENVAL is a generalized computer program to aid in evaluation of prospective private sector production ventures. The program can project interrelated values of installed capacity, production, sales revenue, operating costs, depreciation, investment, debt, earnings, taxes, return on investment, depletion, and cash flow measures. It can also compute related public sector and other external costs and revenues if unit costs are furnished.

  19. Puzzle task ERP response: time-frequency and source localization analysis

    Science.gov (United States)

    Almurshedi, Ahmed; Ismail, Abd Khamim

    2015-01-01

    Perceptual decision making depends on the choices available for the presented task. Most event-related potential (ERP) experiments are designed with two options, such as YES or NO. In some cases, however, subjects may become confused about the presented task in such a way that they cannot provide a behavioral response. This study aims to put subjects into such a puzzled state in order to address the following questions: How does the brain respond during puzzling moments? And what is the brain’s response to a non-answerable task? To address these questions, ERPs were acquired during a scintillating grid illusion task. The subjects were required to count the number of illusory dots, a task that was impossible to perform. The results showed the presence of N130 over the parietal area during the puzzling task. Coherency among the brain hemispheres was enhanced with the complexity of the task. The neural generators’ source localizations were projected to a multimodal complex covering the left postcentral gyrus, supramarginal gyrus, and angular gyrus. This study concludes that the brain component N130 is strongly related to perception in a puzzling task network but not the visual processing network. PMID:28123804

  20. Feasibility Analysis of Developing a Formal Performance Model of Ada Tasking

    Science.gov (United States)

    1990-12-01

    end loop; end Host; C.1.10 task Philosopher: with Text_IO, Calendar, Random_Number; use Text_IO, Random_Number; separate (Dining) task body Philosopher ...

  1. A task-based analysis of machinery entanglement injuries among Western Canadian farmers.

    Science.gov (United States)

    Narasimhan, Gopinath; Crowe, Trever G; Peng, Yingwei; Hagel, Louise; Dosman, James; Pickett, William

    2011-10-01

    Machinery entanglements are a leading cause of hospitalized injury on Canadian farms. This study evaluates the role farm tasks play in the occurrence of machinery entanglement events. A retrospective case series of 41 entanglement injuries involving 35 farm-machinery types was assembled. Only a few limited tasks were implicated in the majority of entanglements. These tasks were as follows: (1) field adjustments of machinery; (2) product handling and conveyance; and (3) driveline attachments and servicing. Hazards inherent and common to these tasks affected the behavior of farmers, leading to entanglements. This study establishes a need to identify hazards and assess risks associated with different tasks involving the use of farm machinery under actual field situations. Systemic changes are required to improve existing machinery safety practices through engineering, work methods, and work practice modifications. In addition to design solutions, occupational health and safety strategies should consider activities associated with hazardous situations to inform the content of injury prevention efforts.

  2. Price-volume multifractal analysis and its application in Chinese stock markets

    Science.gov (United States)

    Yuan, Ying; Zhuang, Xin-tian; Liu, Zhi-ying

    2012-06-01

    An empirical study of Chinese stock markets is conducted using statistical tools. First, the multifractality of the stock price return series, r_t = ln(P_{t+1}) - ln(P_t), and of the trading volume variation series, v_t = ln(V_{t+1}) - ln(V_t), is confirmed using multifractal detrended fluctuation analysis. Furthermore, a multifractal detrended cross-correlation analysis between stock price return and trading volume variation in Chinese stock markets is also conducted; the cross relationship between them is found to be multifractal as well. Second, the cross-correlation between stock price P_t and trading volume V_t is empirically studied using the cross-correlation function and detrended cross-correlation analysis. It is found that both the Shanghai and Shenzhen stock markets show pronounced long-range cross-correlations between stock price and trading volume. Third, a composite index R based on price and trading volume is introduced. Compared with the stock price return series r_t and the trading volume variation series v_t, the R variation series not only retains the characteristics of the original series but also demonstrates the relative correlation between stock price and trading volume. Finally, we analyze the multifractal characteristics of the R variation series before and after three financial events in China (namely, Price Limits, the Reform of Non-tradable Shares, and the financial crisis in 2008) over the whole sample period to study the changes in stock market fluctuation and financial risk. The empirical results verify the validity of R.
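    A minimal sketch of the series construction named above, with synthetic prices and volumes standing in for Chinese market data; it builds the log-return and volume-variation series and the cumulative profile from which (MF-)DFA proceeds, but does not reproduce the full multifractal estimation.

```python
import numpy as np

rng = np.random.default_rng(0)
P = 100.0 * np.cumprod(1 + 0.01 * rng.standard_normal(1000))  # synthetic prices
V = np.exp(10.0 + 0.2 * rng.standard_normal(1000))            # synthetic volumes

r = np.diff(np.log(P))    # r_t = ln(P_{t+1}) - ln(P_t), price return series
v = np.diff(np.log(V))    # v_t = ln(V_{t+1}) - ln(V_t), volume variation series

profile = np.cumsum(r - r.mean())   # the profile from which (MF-)DFA proceeds
print(r[:3], v[:3], profile[-1])
```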

  3. The Emotional Stroop Task and Posttraumatic Stress Disorder: a Meta-Analysis

    Science.gov (United States)

    Cisler, Josh M.; Wolitzky-Taylor, Kate B.; Adams, Thomas G.; Babson, Kimberly A.; Badour, Christal L.; Willems, Jeffrey L.

    2011-01-01

    Posttraumatic stress disorder (PTSD) is associated with significant impairment and lowered quality of life. The emotional Stroop task (EST) has been one means of elucidating some of the core deficits in PTSD, but this literature has remained inconsistent. We conducted a meta-analysis of EST studies in PTSD populations in order to synthesize this body of research. Twenty-six studies were included with 538 PTSD participants, 254 non-trauma exposed control participants (NTC), and 276 trauma exposed control participants (TC). PTSD-relevant words impaired EST performance more among PTSD groups and TC groups compared to NTC groups. PTSD groups and TC groups did not differ. When examining within-subject effect sizes, PTSD-relevant words and generally threatening words impaired EST performance relative to neutral words among PTSD groups, and only PTSD-relevant words impaired performance among the TC groups. These patterns were not found among the NTC groups. Moderator analyses suggested that these effects were significantly greater in blocked designs compared to randomized designs, towards unmasked compared to masked stimuli, and among samples exposed to assaultive traumas compared to samples exposed to non-assaultive traumas. Theoretical and clinical implications are discussed. PMID:21545780

  4. Promoting best practice design intent in 3D CAD for engineers through a task analysis

    Directory of Open Access Journals (Sweden)

    Keelin Leahy

    2013-01-01

    Full Text Available Assessment encompasses a range of methods and techniques. At the University of Limerick, Ireland, it is an affirmed obligation to facilitate timely and useful feedback for both formative (for learning) and summative (of learning) assessment. However, the effectiveness of this feedback has raised concern and has been the subject of a wide-ranging review of research findings. This paper presents research findings that build a picture of the extent to which feedback, within a constructivist paradigm of teaching and learning, can promote best practice design intent in 3D CAD modelling. The resulting data set, comprising 114 higher education students, is used to discuss the impact of assessment and feedback, comparing the Spring 2011/12 and Spring 2012/13 semesters. The 2012/13 cohort received formative assessment feedback from a task analysis. This evidenced an upsurge in understanding of best practice design intent in 3D CAD parametric modelling, supported by an effect size of 0.534.

  5. Task-based optimization of flip angle for texture analysis in MRI

    Science.gov (United States)

    Brand, Jonathan F.; Furenlid, Lars R.; Altbach, Maria I.; Galons, Jean-Phillippe; Bhattacharyya, Achyut; Sharma, Puneet; Bhattacharyya, Tulshi; Bilgin, Ali; Martin, Diego R.

    2016-03-01

    Chronic liver disease is a worldwide health problem, and hepatic fibrosis (HF) is one of the hallmarks of the disease. The current reference standard for diagnosing HF is biopsy followed by pathologist examination; however, this is limited by sampling error and carries a risk of complications. Pathology diagnosis of HF is based on textural change in the liver as a lobular collagen network develops within portal triads. The scale of collagen lobules is characteristically on the order of 1-5 mm, which approximates the resolution limit of in vivo gadolinium-enhanced magnetic resonance imaging in the delayed phase. We have shown that MRI of formalin-fixed human ex vivo liver samples mimics the textural contrast of in vivo Gd-MRI and can be used as MRI phantoms. We have developed local texture analysis that is applied to the phantom images, and the results are used to train model observers. The performance of the observer is assessed with the area under the receiver-operating-characteristic curve (AUROC) as the figure of merit. To optimize the MRI pulse sequence, phantoms are scanned multiple times at a range of flip angles. The flip angle associated with the highest AUROC is chosen as optimal for the task of detecting HF.
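    The figure of merit is standard and easy to sketch: given observer scores for fibrosis-present and fibrosis-absent textures, compute the AUROC and prefer the acquisition setting that maximizes it. The scores below are synthetic stand-ins, not the study's model-observer outputs.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
labels = np.r_[np.zeros(50), np.ones(50)]                           # 0 = healthy, 1 = HF
scores = np.r_[rng.normal(0.0, 1.0, 50), rng.normal(1.0, 1.0, 50)]  # observer outputs

# The acquisition setting (e.g., flip angle) with the highest AUROC wins.
print("AUROC:", roc_auc_score(labels, scores))
```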

  6. Final report for confinement vessel analysis. Task 2, Safety vessel impact analyses

    Energy Technology Data Exchange (ETDEWEB)

    Murray, Y.D. [APTEK, Inc., Colorado Springs, CO (United States)

    1994-01-26

    This report describes two sets of finite element analyses performed under Task 2 of the Confinement Vessel Analysis Program. In each set of analyses, a charge is assumed to have detonated inside the confinement vessel, causing the confinement vessel to fail in either of two ways; locally around the weld line of a nozzle, or catastrophically into two hemispheres. High pressure gases from the internal detonation pressurize the inside of the safety vessel and accelerate the fractured nozzle or hemisphere into the safety vessel. The first set of analyses examines the structural integrity of the safety vessel when impacted by the fractured nozzle. The objective of these calculations is to determine if the high strength bolt heads attached to the nozzle penetrate or fracture the lower strength safety vessel, thus allowing gaseous detonation products to escape to the atmosphere. The two dimensional analyses predict partial penetration of the safety vessel beneath the tip of the penetrator. The analyses also predict maximum principal strains in the safety vessel which exceed the measured ultimate strain of steel. The second set of analyses examines the containment capability of the safety vessel closure when impacted by half a confinement vessel (hemisphere). The predicted response is the formation of a 0.6-inch gap, caused by relative sliding and separation between the two halves of the safety vessel. Additional analyses with closure designs that prevent the gap formation are recommended.

  7. An Asynchronous Many-Task Implementation of In-Situ Statistical Analysis using Legion.

    Energy Technology Data Exchange (ETDEWEB)

    Pebay, Philippe Pierre [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Bennett, Janine Camille [Sandia National Lab. (SNL-CA), Livermore, CA (United States)

    2015-11-01

    In this report, we propose a framework for the design and implementation of in-situ analyses using an asynchronous many-task (AMT) model, using the Legion programming model together with the MiniAero mini-application as a surrogate for full-scale parallel scientific computing applications. The bulk of this work consists of converting the Learn/Derive/Assess model, which we had initially developed for parallel statistical analysis using MPI [PTBM11], from a SPMD to an AMT model. Toward this goal, we propose an original use of the concept of Legion logical regions as a replacement for the parallel communication schemes used for the only operation of the statistics engines that requires explicit communication. We then evaluate this proposed scheme in a shared memory environment, using the Legion port of MiniAero as a proxy for a full-scale scientific application, as a means to provide input data sets of variable size for the in-situ statistical analyses in an AMT context. We demonstrate in particular that the approach has merit, and warrants further investigation, in collaboration with ongoing efforts to improve the overall parallel performance of the Legion system.

  8. Obtaining Content Weights for Test Specifications from Job Analysis Task Surveys: An Application of the Many-Facets Rasch Model

    Science.gov (United States)

    Wang, Ning; Stahl, John

    2012-01-01

    This article discusses the use of the Many-Facets Rasch Model, via the FACETS computer program (Linacre, 2006a), to scale job/practice analysis survey data as well as to combine multiple rating scales into single composite weights representing the tasks' relative importance. Results from the Many-Facets Rasch Model are compared with those…

  9. Enabling Dynamic Process Simulators to Perform Alternative Tasks: A Time-stepper Based Toolkit for Computer-Aided Analysis

    OpenAIRE

    Siettos, C. I.; Pantelides, C. C.; Kevrekidis, I.G.

    2003-01-01

    We discuss computational superstructures that, using repeated, appropriately initialized short calls, enable temporal process simulators to perform alternative tasks such as fixed point computation, stability analysis and projective integration. We illustrate these concepts through the acceleration of a gPROMS-based Rapid Pressure Swing Adsorption simulation, and discuss their scope and possible extensions.
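    The time-stepper idea can be sketched in a few lines: wrap a short burst of a black-box simulator as a map phi(u) and hand the residual u - phi(u) to a matrix-free Newton solver to compute a fixed point. The toy dynamics below stand in for calls into a real simulator such as gPROMS.

```python
import numpy as np
from scipy.optimize import newton_krylov

def phi(u, dt=0.1):
    # Toy "simulator call": one short integration burst of du/dt = -u**3 + 1.
    # In the paper's setting this would be a black-box dynamic simulator.
    return u + dt * (-u**3 + 1.0)

# A fixed point of phi is a steady state; solve u - phi(u) = 0 matrix-free.
steady = newton_krylov(lambda u: u - phi(u), 0.5 * np.ones(5))
print(steady)   # approaches 1.0, the steady state of the toy dynamics
```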

  10. A neural network approach to fMRI binocular visual rivalry task analysis.

    Directory of Open Access Journals (Sweden)

    Nicola Bertolino

    Full Text Available The purpose of this study was to investigate whether artificial neural networks (ANNs) are able to decode participants' conscious perceptual experience from brain activity alone, using complex and ecological stimuli. To reach this aim we conducted pattern recognition analysis on fMRI data acquired during the execution of a binocular rivalry (BR) paradigm. Twelve healthy participants underwent fMRI during the execution of a binocular non-rivalry (BNR) and a BR paradigm in which two classes of stimuli (faces and houses) were presented. During the binocular rivalry paradigm, behavioral responses related to the switching between consciously perceived stimuli were also collected. First, we used the BNR paradigm as a functional localizer to identify the brain areas involved in the processing of the stimuli. Second, we trained the ANN on the BNR fMRI data restricted to these regions of interest. Third, we applied the trained ANN to the BR data as a 'brain reading' tool to discriminate the pattern of neural activity between the two stimuli. Fourth, we verified the consistency of the ANN outputs with the collected behavioral indicators of which stimulus was consciously perceived by the participants. Our main results showed that the trained ANN was able to generalize across the two different tasks (i.e. BNR and BR) and to identify with high accuracy the cognitive state of the participants (i.e. which stimulus was consciously perceived) during the BR condition. The behavioral response, employed as a control parameter, was compared with the network output, and a statistically significant percentage of correspondences (p-value < 0.05) was obtained for all subjects. In conclusion, the present study provides a method based on multivariate pattern analysis to investigate the neural basis of visual consciousness during the BR phenomenon when behavioral indicators are lacking or inconsistent, as in disorders of consciousness or sedated patients.

  11. Analysis of Cloud Network Management Using Resource Allocation and Task Scheduling Services

    Directory of Open Access Journals (Sweden)

    K.C. Okafor

    2016-01-01

    Full Text Available Network failure in a cloud datacenter could result from inefficient resource allocation, scheduling, and logical segmentation of physical machines (network constraints). This is highly undesirable in Distributed Cloud Computing Networks (DCCNs) running mission critical services. Such failure has been identified in the University of Nigeria datacenter network situated in the south eastern part of Nigeria. In this paper, the architectural decomposition of a proposed DCCN was carried out while exploring its functionalities for grid performance. Virtualization services such as resource allocation and task scheduling were employed in heterogeneous server clusters. The validation of the DCCN performance was carried out using trace files from Riverbed Modeller 17.5 in order to ascertain the influence of virtualization on the server resource pool. The QoS metrics considered in the analysis are: service delay time, resource availability, throughput, and utilization. From the validation analysis of the DCCN, the following results were obtained: average throughput (bytes/sec) of the DCCN = 40.00%, DCell = 33.33%, and BCube = 26.67%; average resource availability response of the DCCN = 38.46%, DCell = 33.33%, and BCube = 28.21%; DCCN density on resource utilization = 40% (when logically isolated) and 60% (when not logically isolated). From the results, it was concluded that using virtualization in cloud datacenter servers will result in enhanced server performance offering lower average wait time even with a higher request rate and longer duration of resource use (service availability). By evaluating these recursive architectural designs for network operations, enterprises ready for the spine-and-leaf model could further develop their network resource management schemes for optimal performance.

  12. Measurement and analysis of grain boundary grooving by volume diffusion

    Science.gov (United States)

    Hardy, S. C.; Mcfadden, G. B.; Coriell, S. R.; Voorhees, P. W.; Sekerka, R. F.

    1991-01-01

    Experimental measurements of isothermal grain boundary grooving by volume diffusion are carried out for Sn bicrystals in the Sn-Pb system near the eutectic temperature. The dimensions of the groove increase with a temporal exponent of 1/3, and measurement of the associated rate constant allows the determination of the product of the liquid diffusion coefficient D and the capillarity length Gamma associated with the interfacial free energy of the crystal-melt interface. The small-slope theory of Mullins is generalized to the entire range of dihedral angles by using a boundary integral formulation of the associated free boundary problem, and excellent agreement with experimental groove shapes is obtained. By using the diffusivity measured by Jordon and Hunt, the present measured values of Gamma are found to agree to within 5 percent with the values obtained from experiments by Gunduz and Hunt on grain boundary grooving in a temperature gradient.
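    The temporal-exponent measurement described above amounts to a log-log fit; a minimal sketch with synthetic groove widths follows (the data points are invented, and the units and rate constant are illustrative).

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.array([1e3, 3e3, 1e4, 3e4, 1e5])                          # s, hypothetical times
w = 2.0e-6 * t ** (1 / 3) * (1 + 0.02 * rng.standard_normal(5))  # m, groove widths

# Volume-diffusion grooving predicts width ~ (rate constant) * t^(1/3).
slope, logk = np.polyfit(np.log(t), np.log(w), 1)
print(f"exponent = {slope:.3f} (theory: 1/3), rate constant = {np.exp(logk):.2e}")
```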

  13. Dose volume analysis in brachytherapy and stereotactic radiosurgery

    CERN Document Server

    Tozer-Loft, S M

    2000-01-01

    A brief introduction to three branches of radiotherapy is given: interstitial brachytherapy, external beam megavoltage radiotherapy, and stereotactic radiosurgery. The current interest in issues around conformity, uniformity and optimisation is explained in the light of technical developments in these fields. A novel method of displaying dose-volume information, which mathematically suppresses the inverse-square law, as first suggested by L.L. Anderson for use in brachytherapy, is explained in detail. In a study of Gamma Knife treatments of acoustic neuroma, the measured outcome is compared with a range of figures of merit which express different aspects of the quality of each dose distribution. The results are analysed in an attempt to answer the question: What are the important features of the dose distribution (conformality, uniformity, etc) which show a definite relationship with the outcome of the treatment? Initial results show positively that, when Gamma Knife radiosurgery is used to treat acoustic neuroma, some measures of conformality seem to have a surprising, but significant association with outcome.

  14. Aerodynamic analysis of flapping foils using volume grid deformation code

    Energy Technology Data Exchange (ETDEWEB)

    Ko, Jin Hwan [Seoul National University, Seoul (Korea, Republic of); Kim, Jee Woong; Park, Soo Hyung; Byun, Do Young [Konkuk University, Seoul (Korea, Republic of)

    2009-06-15

    Nature-inspired flapping foils have attracted interest for their high thrust efficiency, but the large motions of their boundaries need to be considered. It is challenging to develop robust, efficient grid deformation algorithms appropriate for such large motions in three dimensions. In this paper, a volume grid deformation code is developed based on finite macro-elements and transfinite interpolation, which successfully interfaces to a structured multi-block Navier-Stokes code. A suitable condition that generates the macro-elements efficiently and improves the robustness of grid regularity is presented as well. As demonstrated for an airfoil with various flapping-related motions, the aerodynamic forces computed by the developed method are shown to be in good agreement with experimental data or a previous numerical solution.
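    Transfinite interpolation, one ingredient of the method described above, blends interior grid points from block boundaries. Below is a two-dimensional sketch (the paper works in three dimensions, and the bumped bottom edge is an invented stand-in for a deformed surface).

```python
import numpy as np

n = 11
s = np.linspace(0.0, 1.0, n)
xi, eta = np.meshgrid(s, s, indexing="ij")   # xi varies with i, eta with j

# Boundary curves of one 2D "macro-element": a unit square whose bottom
# edge has been deformed (an invented stand-in for a moving surface).
bottom = np.stack([s, 0.1 * np.sin(np.pi * s)], axis=1)
top    = np.stack([s, np.ones(n)], axis=1)
left   = np.stack([np.zeros(n), s], axis=1)
right  = np.stack([np.ones(n), s], axis=1)
c00, c10, c01, c11 = bottom[0], bottom[-1], top[0], top[-1]

w = lambda a: a[..., None]   # append a component axis for broadcasting

# Transfinite interpolation: edge blending minus the doubly counted corners.
grid = (w(1 - eta) * bottom[:, None, :] + w(eta) * top[:, None, :]
        + w(1 - xi) * left[None, :, :] + w(xi) * right[None, :, :]
        - (w((1 - xi) * (1 - eta)) * c00 + w(xi * (1 - eta)) * c10
           + w((1 - xi) * eta) * c01 + w(xi * eta) * c11))
print(grid.shape)   # (11, 11, 2): interior follows the deformed boundary
```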

  15. Scram discharge volume break studies accident sequence analysis

    Energy Technology Data Exchange (ETDEWEB)

    Harrington, R.M.; Hodge, S.A.

    1982-01-01

    This paper is a summary of a report describing the predicted response of Unit 1 at the Tennessee Valley Authority (TVA) Browns Ferry Nuclear Plant to a hypothetical small break loss of coolant accident (SBLOCA) outside of containment. The accident studied would be initiated by a break in the scram discharge volume (SDV) piping when it is pressurized to full reactor vessel pressure as a normal consequence of a reactor scram. If the scram could be reset, the scram outlet valves would close to isolate the SDV and the piping break from the reactor vessel. However, reset is possible only if the conditions that caused the scram have cleared; it has been assumed in this study that the scram signal remains in effect over a long period of time.

  16. STICAP: A linear circuit analysis program with stiff systems capability. Volume 1: Theory manual. [network analysis

    Science.gov (United States)

    Cooke, C. H.

    1975-01-01

    STICAP (Stiff Circuit Analysis Program) is a FORTRAN 4 computer program written for the CDC-6400-6600 computer series and SCOPE 3.0 operating system. It provides the circuit analyst a tool for automatically computing the transient responses and frequency responses of large linear time invariant networks, both stiff and nonstiff (algorithms and numerical integration techniques are described). The circuit description and user's program input language is engineer-oriented, making simple the task of using the program. Engineering theories underlying STICAP are examined. A user's manual is included which explains user interaction with the program and gives results of typical circuit design applications. Also, the program structure from a systems programmer's viewpoint is depicted and flow charts and other software documentation are given.
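    Stiff networks of the sort STICAP targets have widely separated time constants; a modern sketch of the same capability uses an implicit (BDF) integrator, here SciPy's rather than STICAP's own FORTRAN algorithms, on a toy two-time-constant linear network.

```python
import numpy as np
from scipy.integrate import solve_ivp

# x' = A x + b*u(t): two RC branches with time constants 1 s and 1 µs,
# which makes the system stiff for explicit integrators.
A = np.array([[-1.0, 0.0],
              [0.0, -1.0e6]])
b = np.array([1.0, 1.0e6])

sol = solve_ivp(lambda t, x: A @ x + b,      # unit-step input u(t) = 1
                (0.0, 5.0), [0.0, 0.0], method="BDF", rtol=1e-8)
print(sol.y[:, -1])   # both states settle near the DC solution [1, 1]
```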

  17. Thermal characterization and analysis of microliter liquid volumes using the three-omega method.

    Science.gov (United States)

    Roy-Panzer, Shilpi; Kodama, Takashi; Lingamneni, Srilakshmi; Panzer, Matthew A; Asheghi, Mehdi; Goodson, Kenneth E

    2015-02-01

    Thermal phenomena in many biological systems offer an alternative detection opportunity for quantifying relevant sample properties. While there is substantial prior work on thermal characterization methods for fluids, the push in the biology and biomedical research communities towards analysis of reduced sample volumes drives a need to extend and scale these techniques to these volumes of interest, which can be below 100 pl. This work applies the 3ω technique to measure the temperature-dependent thermal conductivity and heat capacity of de-ionized water, silicone oil, and salt buffer solution droplets from 24 to 80 °C. Heater geometries range in length from 200 to 700 μm and in width from 2 to 5 μm to accommodate the size restrictions imposed by small volume droplets. We use these devices to measure droplet volumes of 2 μl and demonstrate the potential to extend this technique down to pl droplet volumes based on an analysis of the thermally probed volume. Sensitivity and uncertainty analyses provide guidance for relevant design variables for characterizing properties of interest by investigating the tradeoffs between measurement frequency regime, device geometry, and substrate material. Experimental results show that we can extract thermal conductivity and heat capacity with these sample volumes to within less than 1% of thermal properties reported in the literature.
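    A sketch of the standard 3-omega "slope method" reduction consistent with the measurement described: the AC temperature rise of a line heater is linear in ln(frequency) with slope -P/(2*pi*L*k), so a linear fit recovers the thermal conductivity k. The geometry and power values below are assumptions, and the data are synthesized rather than measured.

```python
import numpy as np

# Assumed heater geometry and power (illustrative values only).
P = 5e-3        # heater power [W]
L = 500e-6      # heater length [m]

f = np.logspace(2, 4, 8)     # drive frequencies [Hz]
k_true = 0.6                 # W/(m K), water-like; used to synthesize data
dT = -P / (2 * np.pi * L * k_true) * np.log(f) + 3.0   # synthetic amplitudes

# Slope method: fit dT vs ln(f) and invert the slope for conductivity.
slope = np.polyfit(np.log(f), dT, 1)[0]
k = -P / (2 * np.pi * L * slope)
print(f"extracted k = {k:.3f} W/(m K)")   # recovers k_true
```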

  18. Do skeletal cephalometric characteristics correlate with condylar volume, surface and shape? A 3D analysis

    Directory of Open Access Journals (Sweden)

    Saccucci Matteo

    2012-05-01

    Objective: The purpose of this study was to determine the condylar volume in subjects with different mandibular divergence and skeletal class using cone-beam computed tomography (CBCT) and analysis software. Materials and methods: For 94 patients (46 females and 48 males; mean age 24.3 ± 6.5 years), rendering reconstructions of the left and right temporomandibular joints (TMJs) were obtained. Subjects were then classified on the basis of the ANB angle and the GoGn-SN angle into three classes (I, II, III). The data of the different classes were compared. Results: No significant difference was observed in the whole sample between the right and the left sides in condylar volume. The analysis of mean volume among low, normal, and high mandibular plane angles revealed a significantly higher volume and surface in low angle subjects. Class III subjects also tended to show a higher condylar volume and surface than class I and class II subjects, although the difference was not significant. Conclusions: Higher condylar volume was a common characteristic of low angle subjects compared to normal and high mandibular plane angle subjects. Skeletal class also appears to be associated with condylar volume and surface.

  19. Upper Extremity Motor Learning among Individuals with Parkinson's Disease: A Meta-Analysis Evaluating Movement Time in Simple Tasks

    Directory of Open Access Journals (Sweden)

    K. Felix

    2012-01-01

    Motor learning has been found to occur in the rehabilitation of individuals with Parkinson's disease (PD). Through repetitive structured practice of motor tasks, individuals show improved performance, confirming that motor learning has probably taken place. Although a number of studies have been completed evaluating motor learning in people with PD, the sample sizes were small and the improvements were variable. The purpose of this meta-analysis was to determine the ability of people with PD to learn motor tasks. Studies which measured movement time in upper extremity reaching tasks and met the inclusion criteria were included in the analysis. Results of the meta-analysis indicated that people with PD and neurologically healthy controls both demonstrated motor learning, characterized by a decrease in movement time during upper extremity movements. Movement time improvements were greater in the control group than in individuals with PD. These results support the findings that the practice of upper extremity reaching tasks is beneficial in reducing movement time in persons with PD and has important implications for rehabilitation.

  20. Impact of Dual Task on Parkinson's Disease, Stroke and Ataxia Patients' Gait: A Comparative Analysis

    Directory of Open Access Journals (Sweden)

    Michelly Arjona Maciel

    2014-01-01

    Introduction: Performing a dual task is complex for neurological patients and can be influenced by the localization of the neurological lesion. Objective: To compare the impact of dual task on gait in patients with Parkinson's disease, stroke, and ataxia. Method: Subjects with Parkinson's disease (PD) in the initial phase, stroke, and ataxia, all with independent gait, were evaluated while performing simple gait and gait with cognitive, motor, and cognitive-motor demands; average speed and number of steps were assessed. Results: Ataxia and stroke patients, compared with PD patients, showed an increase in the number of steps and a decrease in average speed in gait with cognitive demand. Subjects with PD performed better on the tasks than the others. Conclusion: In this study the impact of dual task was lower in Parkinson's disease patients.

  1. Left ventricular pressure and volume data acquisition and analysis using LabVIEW.

    Science.gov (United States)

    Cassidy, S C; Teitel, D F

    1997-03-01

    To automate analysis of left ventricular pressure-volume data, we used LabVIEW to create applications that digitize and display data recorded from conductance and manometric catheters. Applications separate data into cardiac cycles, calculate parallel conductance, and calculate indices of left ventricular function, including end-systolic elastance, preload-recruitable stroke work, stroke volume, ejection fraction, stroke work, maximum and minimum derivative of ventricular pressure, heart rate, indices of relaxation, peak filling rate, and ventricular chamber stiffness. Pressure-volume loops can be graphically displayed. These analyses are exported to a text-file. These applications have simplified and automated the process of evaluating ventricular function.
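
    Most of the listed indices are direct computations on sampled pressure-volume arrays. The sketch below is a Python stand-in for the LabVIEW applications, deriving a few of them from a toy single-cycle loop; end-systolic elastance, parallel conductance, and the relaxation indices need multi-beat or calibration data and are omitted.

```python
# Hypothetical re-implementation sketch of a few of the listed indices from
# sampled LV pressure (mmHg) and volume (ml) arrays covering one cardiac cycle.
import numpy as np

def pv_indices(p, v, dt):
    """p, v: pressure/volume samples over one cycle; dt: sample interval (s)."""
    edv, esv = v.max(), v.min()
    sv = edv - esv                         # stroke volume (ml)
    ef = sv / edv                          # ejection fraction
    # stroke work = loop area (shoelace formula), mmHg*ml
    sw = 0.5 * abs(np.dot(p, np.roll(v, -1)) - np.dot(v, np.roll(p, -1)))
    dpdt = np.gradient(p, dt)
    return dict(SV=sv, EF=ef, SW=sw, dPdt_max=dpdt.max(), dPdt_min=dpdt.min())

# toy loop: ellipse-like P-V cycle, illustrative values only
t = np.linspace(0, 2 * np.pi, 200, endpoint=False)
v = 90 + 30 * np.cos(t)        # 60-120 ml
p = 60 + 55 * np.sin(t)        # 5-115 mmHg
print(pv_indices(p, v, dt=0.8 / 200))
```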

  2. An analysis of the processing requirements of a complex perceptual-motor task

    Science.gov (United States)

    Kramer, A. F.; Wickens, C. D.; Donchin, E.

    1983-01-01

    Current concerns in the assessment of mental workload are discussed, and the event-related brain potential (ERP) is introduced as a promising mental-workload index. Subjects participated in a series of studies in which they were required to perform a target acquisition task while also covertly counting either auditory or visual probes. The effects of several task-difficulty manipulations on the P300 component of the ERP elicited by the counted stimulus probes were investigated. With sufficiently practiced subjects the amplitude of the P300 was found to decrease with increases in task difficulty. The second experiment also provided evidence that the P300 is selectively sensitive to task-relevant attributes. A third experiment demonstrated a convergence in the amplitude of the P300s elicited in the simple and difficult versions of the tracking task. The amplitude of the P300 was also found to covary with the measures of tracking performance. The results of the series of three experiments illustrate the sensitivity of the P300 to the processing requirements of a complex target acquisition task. The findings are discussed in terms of the multidimensional nature of processing resources.

  3. Recalling academic tasks

    Science.gov (United States)

    Draper, Franklin Gno

    This study was focused on what students remembered about five middle school science tasks when they were juniors and seniors in high school. Descriptions of the five tasks were reconstructed from available artifacts and teachers' records, notes and recollections. Three of the five tasks were "authentic" in the sense that students were asked to duplicate the decisions practitioners make in the adult world. The other two tasks were more typical school tasks involving note taking and preparation for a quiz. All five tasks, however, involved use of computers. Students were interviewed to examine what and how well they recalled the tasks and what forms or patterns of recall existed. Analysis of their responses indicated that different kinds of tasks produced different levels of recall. Authentically situated tasks were remembered much better than routine school tasks. Further, authentic tasks centered on design elements were recalled better than those for which design was not as pivotal. Patterns of recall indicated that participants most often recalled the decisions they made, the scenarios of the authentically situated tasks, the consequences of their tasks and the social contexts of the classroom. Task events, in other words, appeared to form a framework upon which students constructed stories of the tasks. The more salient the events, the richer the story, the deeper and more detailed the recall of the task. Thus, authentic tasks appeared to lend themselves to creating stories better than regular school tasks and therefore such tasks were recalled better. Implications of these patterns of recall are discussed with respect to issues of school learning and assessment.

  4. Lessons from a pilot project in cognitive task analysis: the potential role of intermediates in preclinical teaching in dental education.

    Science.gov (United States)

    Walker, Judith; von Bergmann, HsingChi

    2015-03-01

    The purpose of this study was to explore the use of cognitive task analysis to inform the teaching of psychomotor skills and cognitive strategies in clinical tasks in dental education. Methods used were observing and videotaping an expert at one dental school thinking aloud while performing a specific preclinical task (in a simulated environment), interviewing the expert to probe deeper into his thinking processes, and applying the same procedures to analyze the performance of three second-year dental students who had recently learned the analyzed task and who represented a spectrum of their cohort's ability to undertake the procedure. The investigators sought to understand how experts (clinical educators) and intermediates (trained students) overlapped and differed at points in the procedure that represented the highest cognitive load, known as "critical incidents." Findings from this study and previous research identified possible limitations of current clinical teaching as a result of expert blind spots. These findings coupled with the growing evidence of the effectiveness of peer teaching suggest the potential role of intermediates in helping novices learn preclinical dentistry tasks.

  5. Photovoltaic venture analysis. Final report. Volume III. Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Costello, D.; Posner, D.; Schiffel, D.; Doane, J.; Bishop, C.

    1978-07-01

    This appendix contains a brief summary of a detailed description of alternative future energy scenarios which provide an overall backdrop for the photovoltaic venture analysis. Also included is a summary of a photovoltaic market/demand workshop, a summary of a photovoltaic supply workshop which used cross-impact analysis, and a report on photovoltaic array and system prices in 1982 and 1986. The results of a sectorial demand analysis for photovoltaic power systems used in the residential sector (single family homes), the service, commercial, and institutional sector (schools), and in the central power sector are presented. An analysis of photovoltaics in the electric utility market is given, and a report on the industrialization of photovoltaic systems is included. A DOE information memorandum regarding "A Strategy for a Multi-Year Procurement Initiative on Photovoltaics (ACTS No. ET-002)" is also included. (WHK)

  6. Time course of information representation of macaque AIP neurons in hand manipulation task revealed by information analysis.

    Science.gov (United States)

    Sakaguchi, Yutaka; Ishida, Fumihiko; Shimizu, Takashi; Murata, Akira

    2010-12-01

    We used mutual information analysis of neuronal activity in the macaque anterior intraparietal area (AIP) to examine information processing during a hand manipulation task. The task was to reach-to-grasp a three-dimensional (3D) object after presentation of a go signal. Mutual information was calculated between the spike counts of individual neurons in 50-ms-wide time bins and six unique shape classifications or 15 one-versus-one classifications of these shapes. The spatiotemporal distribution of mutual information was visualized as a two-dimensional image ("information map") to better observe global profiles of information representation. In addition, a nonnegative matrix factorization technique was applied for extracting its structure. Our major finding was that the time course of mutual information differed significantly according to different classes of task-related neurons. This strongly suggests that different classes of neurons were engaged in different information processing stages in executing the hand manipulation task. On the other hand, our analysis revealed the heterogeneous nature of information representation of AIP neurons. For example, "information latency" (or information onset) varied among individual neurons even in the same neuron class and the same shape classification. Further, some neurons changed "information preference" (i.e., shape classification with the largest amount of information) across different task periods. These suggest that neurons encode different information in the different task periods. Taking the present result together with previous findings, we used a Gantt chart to propose a hypothetical scheme of the dynamic interactions between different types of AIP neurons.
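
    As an illustration of the record's core computation, the sketch below forms a plug-in estimate of the mutual information between a neuron's spike count in one 50-ms bin and the six shape labels. The data are simulated and the estimator is a generic one, not necessarily the authors' exact procedure.

```python
# Plug-in estimate of mutual information (in bits) between a neuron's spike
# count in one 50 ms bin and the shape category, from trial-label pairs.
import numpy as np

def mutual_information(counts, labels):
    counts, labels = np.asarray(counts), np.asarray(labels)
    mi = 0.0
    for c in np.unique(counts):
        for s in np.unique(labels):
            pxy = np.mean((counts == c) & (labels == s))
            if pxy > 0:
                px, py = np.mean(counts == c), np.mean(labels == s)
                mi += pxy * np.log2(pxy / (px * py))
    return mi

# toy data: 90 trials of 6 shapes; spike counts loosely tuned to shape
rng = np.random.default_rng(0)
labels = rng.integers(0, 6, 90)
counts = rng.poisson(2 + labels)     # firing rate grows with shape index
print(f"MI ~ {mutual_information(counts, labels):.2f} bits")
```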

  7. Defense Science Board 1996 Summer Study Task Force On Tactics and Technology for 21st Century Military Superiority. Volume 2, Part 1. Supporting Materials

    Science.gov (United States)

    1996-10-01


  8. Oak Ridge Health Studies Phase 1 report, Volume 2: Part D, Dose Reconstruction Feasibility Study. Tasks 6, Hazard summaries for important materials at the Oak Ridge Reservation

    Energy Technology Data Exchange (ETDEWEB)

    Bruce, G.M.; Walker, L.B.; Widner, T.E.

    1993-09-01

    The purpose of Task 6 of Oak Ridge Phase I Health Studies is to provide summaries of current knowledge of toxic and hazardous properties of materials that are important for the Oak Ridge Reservation. The information gathered in the course of Task 6 investigations will support the task of focusing any future health studies efforts on those operations and emissions which have likely been most significant in terms of off-site health risk. The information gathered in Task 6 efforts will likely also be of value to individuals evaluating the feasibility of additional health study efforts (such as epidemiological investigations) in the Oak Ridge area and as a resource for citizens seeking information on historical emissions.

  9. Ceramic component development analysis -- Volume 1. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Boss, D.E.

    1998-06-09

    The development of advanced filtration media for advanced fossil-fueled power generating systems is a critical step in meeting the performance and emissions requirements for these systems. While porous metal and ceramic candle filters have been available for some time, the next generation of filters will include ceramic-matrix composites (CMCs) (Techniweave/Westinghouse, Babcock and Wilcox (B and W), DuPont Lanxide Composites), intermetallic alloys (Pall Corporation), and alternate filter geometries (CeraMem Separations). The goal of this effort was to perform a cursory review of the manufacturing processes used by 5 companies developing advanced filters, from the perspective of process repeatability and the ability of their processes to be scaled up to production volumes. Given the brief nature of the on-site reviews, only an overview of the processes and systems could be obtained. Each of the 5 companies had developed some level of manufacturing and quality assurance documentation, with most of the companies leveraging the procedures from other products they manufacture. It was found that all of the filter manufacturers had a solid understanding of the product development path. Given that these filters are largely developmental, significant additional work is necessary to understand the process-performance relationships and to project manufacturing costs.

  10. Grid-connected ICES: preliminary feasibility analysis and evaluation. Volume 2. Final report

    Energy Technology Data Exchange (ETDEWEB)

    1977-06-30

    The HEAL Complex in New Orleans will serve as a Demonstration Community for which the ICES Demonstration System will be designed. The complex is a group of hospitals, clinics, research facilities, and medical educational facilities. The five tasks reported on are: preliminary energy analysis; preliminary institutional assessment; conceptual design; firming-up of commitments; and detailed work management plan.

  11. Design and analysis of self-adapted task scheduling strategies in wireless sensor networks.

    Science.gov (United States)

    Guo, Wenzhong; Xiong, Naixue; Chao, Han-Chieh; Hussain, Sajid; Chen, Guolong

    2011-01-01

    In a wireless sensor network (WSN), the usage of resources is usually highly related to the execution of tasks which consume a certain amount of computing and communication bandwidth. Parallel processing among sensors is a promising solution to provide the demanded computation capacity in WSNs. Task allocation and scheduling is a typical problem in the area of high performance computing. Although task allocation and scheduling in wired processor networks has been well studied in the past, their counterparts for WSNs remain largely unexplored. Existing traditional high performance computing solutions cannot be directly implemented in WSNs due to the limitations of WSNs such as limited resource availability and the shared communication medium. In this paper, a self-adapted task scheduling strategy for WSNs is presented. First, a multi-agent-based architecture for WSNs is proposed and a mathematical model of dynamic alliance is constructed for the task allocation problem. Then an effective discrete particle swarm optimization (PSO) algorithm for the dynamic alliance (DPSO-DA) with a well-designed particle position code and fitness function is proposed. A mutation operator which can effectively improve the algorithm's ability of global search and population diversity is also introduced in this algorithm. Finally, the simulation results show that the proposed solution can achieve significantly better performance than other algorithms.
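
    The abstract describes a discrete PSO with a mutation operator. The sketch below shows a generic binary PSO of that kind, using the common sigmoid position-update rule; the particle coding, parameters, and the stand-in fitness function are invented for illustration and do not reproduce the paper's dynamic-alliance model.

```python
# Minimal discrete (binary) PSO with a mutation operator, in the spirit of
# DPSO-DA; the fitness below is a stand-in, not the paper's alliance model.
import numpy as np

rng = np.random.default_rng(1)
N_PART, N_BITS, ITERS, PM = 20, 12, 100, 0.02
cost = rng.uniform(1, 5, N_BITS)         # per-sensor cost (illustrative)

def fitness(x):                          # reward task coverage at low cost
    return 3.0 * x.sum() - cost @ x

x = rng.integers(0, 2, (N_PART, N_BITS))
v = np.zeros((N_PART, N_BITS))
pbest = x.copy()
pval = np.array([fitness(p) for p in x])
g = pbest[pval.argmax()].copy()

for _ in range(ITERS):
    r1, r2 = rng.random(x.shape), rng.random(x.shape)
    v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (g - x)
    x = (rng.random(x.shape) < 1 / (1 + np.exp(-v))).astype(int)  # sigmoid rule
    x ^= (rng.random(x.shape) < PM).astype(int)   # mutation: flip random bits
    f = np.array([fitness(p) for p in x])
    better = f > pval
    pbest[better], pval[better] = x[better], f[better]
    g = pbest[pval.argmax()].copy()

print("best fitness:", pval.max())
```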

  12. Design and Analysis of Self-Adapted Task Scheduling Strategies in Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Sajid Hussain

    2011-06-01

    In a wireless sensor network (WSN), the usage of resources is usually highly related to the execution of tasks which consume a certain amount of computing and communication bandwidth. Parallel processing among sensors is a promising solution to provide the demanded computation capacity in WSNs. Task allocation and scheduling is a typical problem in the area of high performance computing. Although task allocation and scheduling in wired processor networks has been well studied in the past, their counterparts for WSNs remain largely unexplored. Existing traditional high performance computing solutions cannot be directly implemented in WSNs due to the limitations of WSNs such as limited resource availability and the shared communication medium. In this paper, a self-adapted task scheduling strategy for WSNs is presented. First, a multi-agent-based architecture for WSNs is proposed and a mathematical model of dynamic alliance is constructed for the task allocation problem. Then an effective discrete particle swarm optimization (PSO) algorithm for the dynamic alliance (DPSO-DA) with a well-designed particle position code and fitness function is proposed. A mutation operator which can effectively improve the algorithm's ability of global search and population diversity is also introduced in this algorithm. Finally, the simulation results show that the proposed solution can achieve significantly better performance than other algorithms.

  13. Re: Madsen et al. "Unnecessary work tasks and mental health: a prospective analysis of Danish human service workers".

    Science.gov (United States)

    Durand-Moreau, Quentin; Loddé, Brice; Dewitte, Jean-Dominique

    2015-03-01

    Madsen et al (1) recently published a secondary analysis of data provided by the Project on Burnout, Motivation and Job Satisfaction (PUMA). The aim of their study, published in the Scandinavian Journal of Work, Environment & Health, was to examine the associations between unnecessary work tasks and a decreased level of mental health. Though the topic was quite novel, reading this work proved disturbing and raised issues. Based on the results of this study, the authors stated that there is an association between unnecessary work tasks (assessed by a single question) and a decreased level of mental health [assessed by the Mental Health Inventory (MHI-5)] in the specific population included in this PUMA survey. The authors point out a limitation of the study, namely that unnecessary work tasks were evaluated using one single question: "Do you sometimes have to do things in your job which appear to be unnecessary?". Semmer defines unnecessary work tasks as "tasks that should not be carried out at all because they do not make sense or because they could have been avoided, or could be carried out with less effort if things were organized more efficiently" (2). De facto, qualifying what an unnecessary task is requires stating or explaining whether the task makes sense. Making sense or not is not an objective notion. It is very difficult for either a manager or an employee to say if a task is necessary or not. Most important is that it makes sense from the worker's point of view. Making sense and being necessary are not synonyms. Some tasks do not make sense but are economically necessary (e.g., when, as physicians, we are reporting our activity using ICD-10 on computers instead of being at patients' bedsides or reading this journal). Thus, there is a wide gap between Semmer's definition and the question used by the authors to evaluate his concept. A secondary analysis based on a single question is not adequate to evaluate unnecessary tasks. Nowadays, the general trend

  14. Analysis of Petri net model and task planning heuristic algorithms for product reconfiguration

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Reconfiguration planning is recognized as an important factor for reducing the cost of manufacturing reconfigurable products, and the associated main task is to generate a set of optimal or near-optimal reconfiguration sequences using effective algorithms. A method is developed to generate a Petri net as the reconfiguration tree to represent the two-state transitions of a product, which solves the problem of representing the replacement of reconfiguring interfaces. Building on this method, two heuristic algorithms are proposed that generate task sequences with economic considerations so as to search reconfiguration paths effectively. Finally, an objective evaluation is applied to compare these two heuristic algorithms with other ones. The developed reconfiguration task planning heuristic algorithms can generate better strategies and plans for reconfiguration. The research findings are exemplified with strut reconfiguration of a reconfigurable parallel kinematics machine (RPKM).

  15. Empirical Analysis of EEG and ERPs for Psychophysiological Adaptive Task Allocation

    Science.gov (United States)

    Prinzel, Lawrence J., III; Pope, Alan T.; Freeman, Frederick G.; Scerbo, Mark W.; Mikulka, Peter J.

    2001-01-01

    The present study was designed to test the efficacy of using Electroencephalogram (EEG) and Event-Related Potentials (ERPs) for making task allocation decisions. Thirty-six participants were randomly assigned to an experimental, yoked, or control group condition. Under the experimental condition, a tracking task was switched between task modes based upon the participant's EEG. The results showed that the use of adaptive aiding improved performance and lowered subjective workload under negative feedback as predicted. Additionally, participants in the adaptive group had significantly lower RMSE and NASA-TLX ratings than participants in either the yoked or control group conditions. Furthermore, the amplitudes of the N1 and P3 ERP components were significantly larger under the experimental group condition than under either the yoked or control group conditions. These results are discussed in terms of the implications for adaptive automation design.
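
    The record does not give the switching rule itself. The sketch below illustrates one plausible form based on the beta/(alpha+theta) engagement index from Pope and colleagues' earlier biocybernetic work, which this study builds on; the band powers, baseline threshold, and mode names are simulated assumptions, not the experiment's actual parameters.

```python
# Sketch of an engagement-driven task-allocation rule. The beta/(alpha+theta)
# index follows Pope et al.'s biocybernetic approach; all values are simulated.
import numpy as np

def engagement_index(alpha, theta, beta):
    return beta / (alpha + theta)

def choose_mode(index, baseline, negative_feedback=True):
    # negative feedback: hand control back to the operator (manual) when
    # engagement drops below baseline, automate when engagement is high
    if negative_feedback:
        return "manual" if index < baseline else "automatic"
    return "automatic" if index < baseline else "manual"

rng = np.random.default_rng(2)
baseline = 0.8          # assumed calibration value
for epoch in range(5):
    a, t, b = rng.uniform(4, 8), rng.uniform(3, 6), rng.uniform(5, 12)
    idx = engagement_index(a, t, b)
    print(f"epoch {epoch}: index={idx:.2f} -> {choose_mode(idx, baseline)}")
```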

  16. Passive solar design handbook. Volume 3: Passive solar design analysis

    Science.gov (United States)

    Jones, R. W.; Bascomb, J. D.; Kosiewicz, C. E.; Lazarus, G. S.; McFarland, R. D.; Wray, W. O.

    1982-07-01

    Simple analytical methods concerning the design of passive solar heating systems are presented with an emphasis on the average annual heating energy consumption. Key terminology and methods are reviewed. The solar load ratio (SLR) is defined, and its relationship to analysis methods is reviewed. The annual calculation, or Load Collector Ratio (LCR) method, is outlined. Sensitivity data are discussed. Information is presented on balancing conservation and passive solar strategies in building design. Detailed analysis data are presented for direct gain and sunspace systems, and details of the systems are described. Key design parameters are discussed in terms of their impact on annual heating performance of the building. These are the sensitivity data. The SLR correlations for the respective system types are described. The monthly calculation, or SLR method, based on the SLR correlations, is reviewed. Performance data are given for 9 direct gain systems and 15 water wall and 42 Trombe wall systems.

  17. Passive solar design handbook. Volume III. Passive solar design analysis

    Energy Technology Data Exchange (ETDEWEB)

    Jones, R.W.; Balcomb, J.D.; Kosiewicz, C.E.; Lazarus, G.S.; McFarland, R.D.; Wray, W.O.

    1982-07-01

    Simple analytical methods concerning the design of passive solar heating systems are presented with an emphasis on the average annual heating energy consumption. Key terminology and methods are reviewed. The solar load ratio (SLR) is defined, and its relationship to analysis methods is reviewed. The annual calculation, or Load Collector Ratio (LCR) method, is outlined. Sensitivity data are discussed. Information is presented on balancing conservation and passive solar strategies in building design. Detailed analysis data are presented for direct gain and sunspace systems, and details of the systems are described. Key design parameters are discussed in terms of their impact on annual heating performance of the building. These are the sensitivity data. The SLR correlations for the respective system types are described. The monthly calculation, or SLR method, based on the SLR correlations, is reviewed. Performance data are given for 9 direct gain systems and 15 water wall and 42 Trombe wall systems. (LEW)

  18. Quantitative Indicators for Defense Analysis. Volume II. Technical Report

    Science.gov (United States)

    1975-06-01

    Recoverable citations from OCR-damaged front matter: "The Political Analysis of Negotiations," World Politics 26, 3 (April); The Politics of Trade Negotiations Between Africa and the EEC (1971).

  19. Development of an advanced, continuous mild gasification process for the production of co-products (Tasks 2, 3, and 4. 1 to 4. 6), Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    Knight, R.A.; Gissy, J.L.; Onischak, M.; Babu, S.P.; Carty, R.H. (Institute of Gas Technology, Chicago, IL (United States)); Duthie, R.G. (Bechtel Group, Inc., San Francisco, CA (United States)); Wootten, J.M. (Peabody Holding Co., Inc., St. Louis, MO (United States))

    1991-09-01

    Volume 2 contains information on the following topics: (1) Mild Gasification Technology Development: Process Research Unit Tests Using Slipstream Sampling; (2) Bench-Scale Char Upgrading Study; (3) Mild Gasification Technology Development: System Integration Studies. (VC)

  20. [Environmental investigation of ground water contamination at Wright- Patterson Air Force Base, Ohio]. Volume 4, Health and Safety Plan (HSP); Phase 1, Task 4 Field Investigation report: Draft

    Energy Technology Data Exchange (ETDEWEB)

    1991-10-01

    This Health and Safety Plan (HSP) was developed for the Environmental Investigation of Ground-water Contamination Investigation at Wright-Patterson Air Force Base near Dayton, Ohio, based on the projected scope of work for the Phase 1, Task 4 Field Investigation. The HSP describes hazards that may be encountered during the investigation, assesses the hazards, and indicates what type of personal protective equipment is to be used for each task performed. The HSP also addresses the medical monitoring program, decontamination procedures, air monitoring, training, site control, accident prevention, and emergency response.

  1. Automatic UPDRS Evaluation in the Sit-to-Stand Task of Parkinsonians: Kinematic Analysis and Comparative Outlook on the Leg Agility Task.

    Science.gov (United States)

    Giuberti, Matteo; Ferrari, Gianluigi; Contin, Laura; Cimolin, Veronica; Azzaro, Corrado; Albani, Giovanni; Mauro, Alessandro

    2015-05-01

    In this study, we first characterize the sit-to-stand (S2S) task, which contributes to the evaluation of the degree of severity of Parkinson's disease (PD), through kinematic features, which are then linked to Unified Parkinson's Disease Rating Scale (UPDRS) scores. We propose to use a single body-worn wireless inertial node placed on the chest of a patient. The experimental investigation is carried out on 24 PD patients, comparing the obtained results directly with the kinematic characterization of the leg agility (LA) task performed by the same set of patients. We show that (i) the S2S and LA tasks are rather unrelated and (ii) the UPDRS distributions (for both S2S and LA tasks) across the patients have a direct impact on the observed system performance.

  2. Investigation of advanced counterrotation blade configuration concepts for high speed turboprop systems. Task 5: Unsteady counterrotation ducted propfan analysis. Computer program user's manual

    Science.gov (United States)

    Hall, Edward J.; Delaney, Robert A.; Adamczyk, John J.; Miller, Christopher J.; Arnone, Andrea; Swanson, Charles

    1993-01-01

    The primary objective of this study was the development of a time-marching three-dimensional Euler/Navier-Stokes aerodynamic analysis to predict steady and unsteady compressible transonic flows about ducted and unducted propfan propulsion systems employing multiple blade rows. The computer codes resulting from this study are referred to as ADPAC-AOACR (Advanced Ducted Propfan Analysis Codes-Angle of Attack Coupled Row). This report is intended to serve as a computer program user's manual for the ADPAC-AOACR codes developed under Task 5 of NASA Contract NAS3-25270, Unsteady Counterrotating Ducted Propfan Analysis. The ADPAC-AOACR program is based on a flexible multiple blocked grid discretization scheme permitting coupled 2-D/3-D mesh block solutions with application to a wide variety of geometries. For convenience, several standard mesh block structures are described for turbomachinery applications. Aerodynamic calculations are based on a four-stage Runge-Kutta time-marching finite volume solution technique with added numerical dissipation. Steady flow predictions are accelerated by a multigrid procedure. Numerical calculations are compared with experimental data for several test cases to demonstrate the utility of this approach for predicting the aerodynamics of modern turbomachinery configurations employing multiple blade rows.
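
    The four-stage Runge-Kutta time march of the kind described can be sketched on a scalar model equation du/dt = -R(u). The stage coefficients (1/4, 1/3, 1/2, 1) are a common choice for such schemes; the residual, grid, and CFL number below are illustrative, and the production code's multigrid acceleration and added numerical dissipation are omitted.

```python
# Generic four-stage Runge-Kutta time-marching update of the kind used in
# finite-volume solvers such as ADPAC, applied here to a model problem.
import numpy as np

ALPHAS = (0.25, 1.0 / 3.0, 0.5, 1.0)   # common four-stage coefficients

def rk4_stage_march(u, residual, dt, steps):
    for _ in range(steps):
        u0 = u.copy()
        for a in ALPHAS:               # each stage restarts from u0
            u = u0 - a * dt * residual(u)
    return u

# model problem: linear advection residual on a periodic grid (illustrative)
n, dx, c = 64, 1.0 / 64, 1.0
def residual(u):                       # central difference of c * du/dx
    return c * (np.roll(u, -1) - np.roll(u, 1)) / (2 * dx)

u = np.sin(2 * np.pi * np.linspace(0, 1, n, endpoint=False))
u = rk4_stage_march(u, residual, dt=0.4 * dx / c, steps=100)
print("max |u| after marching:", np.abs(u).max())
```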

  3. Eye Movement Analysis of Task and Content Commonalities in Information Processing.

    Science.gov (United States)

    Dillon, Ronna F.; Stevenson-Hicks, Randy

    A study examined the extent to which common knowledge structures and the presence of common performance components are important factors in reasoning. The eye fixations of 37 college students were recorded as they solved four types of complex inductive reasoning tasks: (1) verbal analogies, (2) figural analogies, (3) verbal series completions, and…

  4. Difficulties in solving context-based PISA mathematics tasks : An analysis of students' errors

    NARCIS (Netherlands)

    Wijaya, Ariyadi; van den Heuvel-Panhuizen, Marja; Doorman, Michiel; Robitzsch, Alexander

    2014-01-01

    The intention of this study was to clarify students' difficulties in solving context-based mathematics tasks as used in the Programme for International Student Assessment (PISA). The study was carried out with 362 Indonesian ninth- and tenth-grade students. In the study we used 34 released PISA mathematics tasks.

  5. Development and Confirmatory Factory Analysis of the Achievement Task Value Scale for University Students

    Science.gov (United States)

    Lou, Yu-Chiung; Lin, Hsiao-Fang; Lin, Chin-Wen

    2013-01-01

    The aims of the study were (a) to develop a scale to measure university students' task value and (b) to use confirmatory factor analytic techniques to investigate the construct validity of the scale. The questionnaire items were developed based on theoretical considerations and the final version contained 38 items divided into 4 subscales.

  6. Analysis of Air Traffic Controller Workload Reduction Based on the Solution Space for the Merging Task

    NARCIS (Netherlands)

    Mercado Velasco, G.A.; Mulder, M.; Van Paassen, M.M.

    2010-01-01

    Air traffic controller workload is considered to be an important limiting factor to the growth of air traffic. The difficulty of an air traffic control task can be analyzed through examining the problem’s solution space, that is, all possible vector commands that satisfy the constraints of safety, p

  7. Path Analysis Examining Self-Efficacy and Decision-Making Performance on a Simulated Baseball Task

    Science.gov (United States)

    Hepler, Teri J.; Feltz, Deborah L.

    2012-01-01

    The purpose of this study was to examine the relationship between decision-making self-efficacy and decision-making performance in sport. Undergraduate students (N = 78) performed 10 trials of a decision-making task in baseball. Self-efficacy was measured before performing each trial. Decision-making performance was assessed by decision speed and…

  8. A meta-analysis of the impact of situationally induced achievement goals on task performance

    NARCIS (Netherlands)

    Van Yperen, Nico W.; Blaga, Monica; Postmes, Thomas

    2015-01-01

    The purpose of this research was to meta-analyze studies which experimentally induced an achievement goal state to examine its causal effect on the individual's performance at the task at hand, and to investigate the moderator effects of feedback anticipation and time pressure. The data set comprised…

  9. Task analysis of information technology-mediated medication management in outpatient care

    NARCIS (Netherlands)

    Stiphout, F. van; Zwart-van Rijkom, J.E.F.; Maggio, L.A.; Aarts, J.E.C.M.; Bates, D.W.; Gelder, T. van; Jansen, P.A.F.; Schraagen, J.M.C.; Egberts, A.C.G.; Braak, E.W.M.T. ter

    2015-01-01

    Aims: Educating physicians in the procedural as well as cognitive skills of information technology (IT)-mediated medication management could be one of the missing links for the improvement of patient safety. We aimed to compose a framework of tasks that need to be addressed to optimize medication management in outpatient care.

  10. Task analysis of IT-mediated medication management in outpatient care

    NARCIS (Netherlands)

    Stiphout, van F.; Zwart-van Rijkom, J.E.F.; Maggio, L.A.; Aarts, J.E.C.M.; Bates, D.W.; Gelder, van T.; Jansen, P.A.F.; Schraagen, J.M.C.; Egberts, A.C.G.; Braak, ter E.W.M.T.

    2015-01-01

    Aim: Educating physicians in the procedural as well as cognitive skills of IT-mediated medication management could be one of the missing links for the improvement of patient safety. We aimed to compose a framework of tasks that need to be addressed to optimize medication management in outpatient care.

  11. Task and person-focused leadership behaviors and team performance: A meta-analysis.

    NARCIS (Netherlands)

    Ceri-Booms, Meltem; Curseu, P.L.; Oerlemans, L.A.G.

    2017-01-01

    This paper reports the results of a meta-analytic review of the relationship between person- and task-oriented leader behaviors, on the one hand, and team performance, on the other hand. The results, based on 89 independent samples, show a moderate positive (ρ = .33) association between both types of leader behaviors and team performance.

  12. An Analysis of Image Retrieval Tasks in the Field of Art History.

    Science.gov (United States)

    Chen, Hsin-liang

    2001-01-01

    Investigated undergraduate art history majors' image retrieval tasks and image query modes. Discusses gender differences; prior information retrieval experience; significant differences between the number of search terms users planned to use and the number they actually used; and implications for image indexing tools, image retrieval system…

  13. Design and classification of roads from the viewpoint of driving task analysis.

    NARCIS (Netherlands)

    Janssen, S.T.M.C.

    1976-01-01

    Together with traffic characteristics, road characteristics largely influence the effort the road user will have to make in performing his tasks. Traffic characteristics are not only linked closely to vehicle characteristics but also determined by road users' relevant characteristics. In analysing tr

  14. Molecular modeling and structural analysis of two-pore domain potassium channels TASK1 interactions with the blocker A1899

    Directory of Open Access Journals (Sweden)

    David Mauricio Ramirez

    2015-03-01

    A1899 is a potent and highly selective blocker of the two-pore domain potassium (K2P) channel TASK-1. It acts as an antagonist, blocking the K+ flux; it binds to TASK-1 in the inner cavity and shows activity in the nanomolar range. The drug travels through the central cavity and finally binds at the bottom of the selectivity filter, forming an H-bond network with several threonines and water molecules along with several hydrophobic interactions. Using alanine mutagenesis screens, the binding site was identified as involving residues in the P1 and P2 pore loops, the M2 and M4 transmembrane segments, and the halothane response element; mutations were introduced in the human TASK-1 (KCNK3, NM_002246) expressed in oocytes from anesthetized Xenopus laevis frogs. Based on molecular modeling and structural analysis, including molecular docking and binding free energy calculations, a pose was suggested using TASK-1 homology models. Recently, various K2P crystal structures have been obtained. We wanted to redefine, from a structural point of view, the binding mode of A1899 in TASK-1 homology models built using the K2P crystal structures as templates. By computational structural analysis we describe the molecular basis of the A1899 binding mode and how A1899 travels to its binding site, and we suggest an interacting pose (Figure 1). After 100 ns of molecular dynamics simulation (MDs) we found an intramolecular H-bond (80% of the total MDs), an H-bond with Thr93 (42% of the total MDs), a pi-pi stacking interaction between a ring and Phe125 (88% of the total MDs), and several water bridges. Our experimental and computational results allow a molecular understanding of the structural binding mechanism of the selective blocker A1899 to TASK-1 channels. We identified the common and divergent structural features of the TASK-1 channel through our theoretical and experimental studies of A1899 drug action.

  15. Do depressive symptoms "blunt" effort? An analysis of cardiac engagement and withdrawal for an increasingly difficult task.

    Science.gov (United States)

    Silvia, Paul J; Mironovová, Zuzana; McHone, Ashley N; Sperry, Sarah H; Harper, Kelly L; Kwapil, Thomas R; Eddington, Kari M

    2016-07-01

    Research on depression and effort has suggested "depressive blunting," that is, lower cardiovascular reactivity in response to challenges and stressors. Many studies, however, find null effects or higher reactivity. The present research draws upon motivational intensity theory, a broad model of effort that predicts cases in which depressive symptoms should increase or decrease effort. Because depressive symptoms can influence task-difficulty appraisals, so that people see tasks as subjectively harder, people high in depressive symptoms should engage higher effort at objectively easier levels of difficulty but also quit sooner. A sample of adults completed a mental effort challenge with four levels of difficulty, from very easy to difficult-but-feasible. Depressive symptoms were assessed with the CESD and DASS; effort-related cardiac activity was assessed via markers of contractility (e.g., the cardiac pre-ejection period [PEP]) obtained with impedance cardiography. The findings supported the theory's predictions. When the task was relatively easier, people high in depressive symptoms showed higher contractility (shorter PEP), consistent with greater effort. When the task was relatively harder, people high in depressive symptoms showed diminished contractility, consistent with quitting. The results suggest that past research has been observing a small part of a larger trajectory of trying and quitting, and they illustrate the value of a theoretically grounded analysis of depressive symptoms and effort-related cardiac activity.

  16. Content Analysis of the "Journal of Counseling & Development": Volumes 74 to 84

    Science.gov (United States)

    Blancher, Adam T.; Buboltz, Walter C.; Soper, Barlow

    2010-01-01

    A content analysis of the research published in the "Journal of Counseling & Development" ("JCD") was conducted for Volumes 74 (1996) through 84 (2006). Frequency distributions were used to identify the most published authors and their institutional affiliations, as well as some basic characteristics (type of sample, gender, and ethnicity) of the…

  17. Introduction to Subject Indexing; A Programmed Text. Volume One: Subject Analysis and Practical Classification.

    Science.gov (United States)

    Brown, Alan George

    This programmed text presents the basic principles and practices of subject indexing, limited to the area of precoordinate indexing. This first of two volumes deals with the subject analysis of documents, primarily at the level of summarization, and the basic elements of translation into classification schemes. The text includes regular self-tests…

  18. Geotechnical Analysis Report for July 2004 - June 2005, Volume 2, Supporting Data

    Energy Technology Data Exchange (ETDEWEB)

    Washington TRU Solutions LLC

    2006-03-20

    This report is a compilation of geotechnical data presented as plots for each active instrument installed in the underground at the Waste Isolation Pilot Plant (WIPP) through June 30, 2005. A summary of the geotechnical analyses that were performed using the enclosed data is provided in Volume 1 of the Geotechnical Analysis Report (GAR).

  19. Waste Isolation Pilot Plant Geotechnical Analysis Report for July 2005 - June 2006, Volume 2, Supporting Data

    Energy Technology Data Exchange (ETDEWEB)

    Washington TRU Solutions LLC

    2007-03-25

    This report is a compilation of geotechnical data presented as plots for each active instrument installed in the underground at the Waste Isolation Pilot Plant (WIPP) through June 30, 2006. A summary of the geotechnical analyses that were performed using the enclosed data is provided in Volume 1 of the Geotechnical Analysis Report (GAR).

  20. SOLVENT-BASED TO WATERBASED ADHESIVE-COATED SUBSTRATE RETROFIT - VOLUME I: COMPARATIVE ANALYSIS

    Science.gov (United States)

    This volume represents the analysis of case study facilities' experience with waterbased adhesive use and retrofit requirements. (NOTE: The coated and laminated substrate manufacturing industry was selected as part of NRMRL's support of the 33/50 Program because of its significant...

  1. Comparison of gray matter volume and thickness for analysis of cortical changes in Alzheimer's disease

    Science.gov (United States)

    Liu, Jiachao; Li, Ziyi; Chen, Kewei; Yao, Li; Wang, Zhiqun; Li, Kunchen; Guo, Xiaojuan

    2011-03-01

    Gray matter volume and cortical thickness are two indices of concern in brain structural magnetic resonance imaging research. Gray matter volume reflects mixed measurement information about the cerebral cortex, while cortical thickness reflects only the distance between the inner and outer surfaces of the cerebral cortex. Using Scaled Subprofile Modeling based on Principal Component Analysis (SSM_PCA) and Pearson's correlation analysis, this study provided quantitative comparisons and depicted both global and local relevance to comprehensively investigate morphometric abnormalities of the cerebral cortex in Alzheimer's disease (AD). Thirteen patients with AD and thirteen age- and gender-matched healthy controls were included in this study. Results showed that factor scores from the first 8 principal components accounted for ~53.38% of the total variance for gray matter volume, and ~50.18% for cortical thickness. Factor scores from the fifth principal component showed significant correlation. In addition, gray matter voxel-based volume was closely related to cortical thickness alterations across most of the cortex, especially in brain regions typically abnormal in AD such as the insula and the parahippocampal gyrus. These findings suggest that these two measurements are effective indices for understanding the neuropathology of AD. Studies using both gray matter volume and cortical thickness can separate the causes of any discrepancy, provide complementary information, and give a comprehensive description of the morphological changes of brain structure.
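
    The comparison step can be sketched as follows: compute component scores for subjects-by-regions matrices of volume and thickness, then correlate the scores component by component. The data below are simulated placeholders, and the SVD-based scoring is a generic stand-in for the SSM_PCA toolchain.

```python
# Sketch of the correlation step: PCA factor scores from a subjects-by-regions
# matrix, then Pearson correlation between volume- and thickness-derived
# scores for one component. All data below are simulated placeholders.
import numpy as np

rng = np.random.default_rng(3)
n_subj, n_regions = 26, 100                      # 13 AD + 13 controls
shared = rng.normal(size=(n_subj, 1))            # common morphometric factor
gm_vol = shared @ rng.normal(size=(1, n_regions)) + rng.normal(size=(n_subj, n_regions))
thick  = shared @ rng.normal(size=(1, n_regions)) + rng.normal(size=(n_subj, n_regions))

def factor_scores(X, n_comp):
    Xc = X - X.mean(axis=0)                      # center each region
    _, _, vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ vt[:n_comp].T                    # projections on components

s_vol, s_thk = factor_scores(gm_vol, 8), factor_scores(thick, 8)
r = np.corrcoef(s_vol[:, 0], s_thk[:, 0])[0, 1]
print(f"Pearson r between first-component scores: {r:.2f}")
```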

  2. Doppler sonography of diabetic feet: Quantitative analysis of blood flow volume

    Energy Technology Data Exchange (ETDEWEB)

    Seo, Young Lan; Kim, Ho Chul; Choi, Chul Soon; Yoon, Dae Young; Han, Dae Hee; Moon, Jeung Hee; Bae, Sang Hoon [Hallym University College of Medicine, Seoul (Korea, Republic of)

    2002-09-15

    To analyze Doppler sonographic findings of diabetic feet by estimating quantitative blood flow volume and by analyzing the Doppler waveform. Doppler sonography was performed in thirty-four subjects (10 diabetic patients with foot ulceration, 14 diabetic patients without ulceration, and 10 normal subjects as the control group) to measure the flow volume of the arteries of the lower extremities (posterior and anterior tibial arteries, and distal femoral artery). Analysis of Doppler waveforms was also done to evaluate the nature of the changed blood flow volume of diabetic patients, and the waveforms were classified into triphasic, biphasic-1, biphasic-2, and monophasic patterns. Flow volume of arteries in diabetic patients with foot ulceration was increased with statistical significance when compared to that of diabetic patients without foot ulceration or that of the normal control group (p<0.05). Analysis of Doppler waveforms revealed that the frequency of the biphasic-2 pattern was significantly higher in diabetic patients than in the normal control group (p<0.05). Doppler sonography in diabetic feet showed increased flow volume and biphasic Doppler waveform, and these findings suggest neuropathy rather than ischemic changes in diabetic feet.
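
    Doppler flow volume is conventionally the time-averaged mean velocity multiplied by the vessel's cross-sectional area, assuming a circular lumen. A worked example with an assumed lumen diameter and velocity:

```python
# Flow volume from Doppler measurements: Q = TAMV * cross-sectional area,
# assuming a circular vessel lumen. Values below are illustrative only.
import math

def flow_volume_ml_min(tamv_cm_s, diameter_mm):
    radius_cm = diameter_mm / 10 / 2
    area_cm2 = math.pi * radius_cm ** 2
    return tamv_cm_s * area_cm2 * 60          # cm^3/min == ml/min

# e.g., a posterior tibial artery: 2.5 mm lumen, 12 cm/s time-averaged velocity
print(f"{flow_volume_ml_min(12, 2.5):.1f} ml/min")   # ~35 ml/min
```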

  3. Towards a Multi-criteria Development Distribution Model: An Analysis of Existing Task Distribution Approaches

    OpenAIRE

    2014-01-01

    Distributing development tasks in the context of global software development bears both many risks and many opportunities. Nowadays, distributed development is often driven by only a few factors or even just a single factor such as workforce costs. Risks and other relevant factors such as workforce capabilities, the innovation potential of different regions, or cultural factors are often not recognized sufficiently. This could be improved by using empirically-based multi-criteria distribution...

  4. Analysis of Load-Carrying Capacity for Redundant Free-Floating Space Manipulators in Trajectory Tracking Task

    Directory of Open Access Journals (Sweden)

    Qingxuan Jia

    2014-01-01

    The aim of this paper is to analyze the load-carrying capacity of redundant free-floating space manipulators (FFSM) in a trajectory tracking task. Combined with the analysis of influential factors in the load-carrying process, evaluation of the maximum load-carrying capacity (MLCC) is described as a multiconstrained nonlinear programming problem. An efficient algorithm based on repeated line search within a discontinuous feasible region is presented to determine MLCC for a given trajectory of the end-effector and the corresponding joint path. Then, considering the influence on MLCC of different initial configurations for the starting point of the given trajectory, a maximum-payload initial configuration planning method is proposed using a PSO algorithm. Simulations are performed for a particular trajectory tracking task of a 7-DOF space manipulator, whose MLCC is evaluated quantitatively. Through in-depth study of the simulation results, the significant gap between the values of MLCC obtained with different initial configurations is analyzed, and the discontinuity of allowable load-carrying capacity is illustrated. The proposed analytical method can be taken as a theoretical foundation for feasibility analysis, trajectory optimization, and optimal control of trajectory tracking tasks in on-orbit load-carrying operations.

  5. Development of an advanced, continuous mild gasification process for the production of co-products (Task 4.7), Volume 3. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Knight, R.A.; Gissy, J.L.; Onischak, M.; Babu, S.P.; Carty, R.H. [Institute of Gas Technology, Chicago, IL (United States); Duthie, R.G. [Bechtel Group, Inc., San Francisco, CA (United States); Wootten, J.M. [Peabody Holding Co., Inc., St. Louis, MO (United States)

    1991-09-01

    The focus of this task is the preparation of (1) preliminary piping and instrument diagrams (P&IDs) and single line electrical diagrams for a site-specific conceptual design and (2) a factored cost estimate for a 24 ton/day (tpd) capacity mild gasification process development unit (PDU) and an associated form coke preparation PDU. The intended site for this facility is the Illinois Coal Development Park at Carterville, Illinois, which is operated by Southern Illinois University at Carbondale. (VC)

  6. Oak Ridge Reservation volume I. Y-12 mercury task force files: A guide to record series of the Department of Energy and its contractors

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-02-17

    The purpose of this guide is to describe each of the series of records identified in the documents of the Y-12 Mercury Task Force Files that pertain to the use of mercury in the separation and enrichment of lithium isotopes at the Department of Energy's (DOE) Y-12 Plant in Oak Ridge, Tennessee. History Associates Incorporated (HAI) prepared this guide as part of DOE's Epidemiologic Records Inventory Project, which seeks to verify and conduct inventories of epidemiologic and health-related records at various DOE and DOE contractor sites. This introduction briefly describes the Epidemiologic Records Inventory Project and HAI's role in the project. Specific attention will be given to the history of the DOE-Oak Ridge Reservation, the development of the Y-12 Plant, and the use of mercury in the production of nuclear weapons during the 1950s and early 1960s. This introduction provides background information on the Y-12 Mercury Task Force Files, an assembly of documents resulting from the 1983 investigation of the Mercury Task Force into the effects of mercury toxicity upon workplace hygiene and worker health, the unaccountable loss of mercury, and the impact of those losses upon the environment. This introduction also explains the methodology used in the selection and inventory of these record series. Other topics include the methodology used to produce this guide, the arrangement of the detailed record series descriptions, and information concerning access to the collection.

  7. Leukocyte telomere length and hippocampus volume: a meta-analysis [version 1; referees: 2 approved

    Directory of Open Access Journals (Sweden)

    Gustav Nilsonne

    2015-10-01

    Leukocyte telomere length has been shown to correlate to hippocampus volume, but effect estimates differ in magnitude and are not uniformly positive. This study aimed primarily to investigate the relationship between leukocyte telomere length and hippocampus gray matter volume by meta-analysis and secondarily to investigate possible effect moderators. Five studies were included with a total of 2107 participants, of which 1960 were contributed by one single influential study. A random-effects meta-analysis estimated the effect at r = 0.12 [95% CI -0.13, 0.37] in the presence of heterogeneity and a subjectively estimated moderate to high risk of bias. There was no evidence that apolipoprotein E (APOE) genotype was an effect moderator, nor that the ratio of leukocyte telomerase activity to telomere length was a better predictor than leukocyte telomere length for hippocampus volume. This meta-analysis, while not proving a positive relationship, is also not able to disprove the earlier finding of a positive correlation in the one large study included in the analyses. We propose that a relationship between leukocyte telomere length and hippocampus volume may be mediated by transmigrating monocytes which differentiate into microglia in the brain parenchyma.
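
    A standard way to pool correlations of this kind is a DerSimonian-Laird random-effects model on Fisher-z transformed values. The sketch below shows that computation; the five (r, n) pairs are placeholders echoing the study's structure (one large study plus four small ones), not its actual data.

```python
# DerSimonian-Laird random-effects pooling of correlations via Fisher's z.
import numpy as np

r = np.array([0.20, -0.05, 0.15, 0.30, 0.12])   # per-study correlations (assumed)
n = np.array([1960, 40, 35, 30, 42])            # per-study sample sizes (assumed)

z = np.arctanh(r)                 # Fisher z transform
v = 1.0 / (n - 3)                 # within-study variance of z
w = 1.0 / v
z_fixed = np.sum(w * z) / np.sum(w)
Q = np.sum(w * (z - z_fixed) ** 2)               # heterogeneity statistic
tau2 = max(0.0, (Q - (len(r) - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
w_re = 1.0 / (v + tau2)           # random-effects weights
z_re = np.sum(w_re * z) / np.sum(w_re)
se = np.sqrt(1.0 / np.sum(w_re))
lo, hi = np.tanh(z_re - 1.96 * se), np.tanh(z_re + 1.96 * se)
print(f"pooled r = {np.tanh(z_re):.2f} [95% CI {lo:.2f}, {hi:.2f}]")
```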

  8. Pulsed Direct Current Electrospray: Enabling Systematic Analysis of Small Volume Sample by Boosting Sample Economy.

    Science.gov (United States)

    Wei, Zhenwei; Xiong, Xingchuang; Guo, Chengan; Si, Xingyu; Zhao, Yaoyao; He, Muyi; Yang, Chengdui; Xu, Wei; Tang, Fei; Fang, Xiang; Zhang, Sichun; Zhang, Xinrong

    2015-11-17

    We developed pulsed direct current electrospray ionization mass spectrometry (pulsed-dc-ESI-MS) for systematically profiling and determining components in small-volume samples. Pulsed-dc-ESI utilizes a constant high voltage to remotely induce the generation of single-polarity pulsed electrospray. This method significantly boosts sample economy, obtaining several minutes of MS signal duration from merely picoliter-volume samples. The elongated MS signal duration enables us to collect abundant MS(2) information on components of interest in a small-volume sample for systematic analysis. This method has been successfully applied to single-cell metabolomics analysis. We obtained 2-D profiles of metabolites (including exact mass and MS(2) data) from single plant and mammalian cells, covering 1034 components and 656 components for Allium cepa and HeLa cells, respectively. Further identification found 162 compounds and 28 different modification groups of 141 saccharides in a single Allium cepa cell, indicating that pulsed-dc-ESI is a powerful tool for the systematic analysis of small-volume samples.

  9. Estimation of cell volume and biomass of penicillium chrysogenum using image analysis.

    Science.gov (United States)

    Packer, H L; Keshavarz-Moore, E; Lilly, M D; Thomas, C R

    1992-02-20

    A methodology for the estimation of biomass for the penicillin fermentation using image analysis is presented. Two regions of hyphae are defined to describe the growth of mycelia during fermentation: (1) the cytoplasmic region, and (2) the degenerated region including large vacuoles. The volume occupied by each of these regions in a fixed volume of sample is estimated from area measurements using image analysis. Areas are converted to volumes by treating the hyphae as solid cylinders with the hyphal diameter as the cylinder diameter. The volumes of the cytoplasmic and degenerated regions are converted into dry weight estimates using hyphal density values available from the literature. The image analysis technique is able to estimate biomass even in the presence of non-dissolved solids at concentrations of up to 30 g l(-1). It is shown to successfully estimate concentrations of mycelia from 0.03 to 38 g l(-1). Although the technique has been developed for the penicillin fermentation, it should be applicable to other (non-pelleted) fungal fermentations.
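
    The cylinder model reduces to a simple formula: a hypha of projected area A and diameter d has length L = A/d and therefore volume V = pi*(d/2)^2*L = pi*d*A/4. The sketch below applies it end to end; the density, dry-weight fraction, and sample figures are assumed values for illustration, not those from the literature the authors cite.

```python
# Cylinder model for converting measured hyphal area to biomass:
# V = pi * (d/2)^2 * (A/d) = pi * d * A / 4. Values are illustrative.
import math

def biomass_g_per_l(total_area_um2, diameter_um, sample_vol_ul,
                    density_g_ml=1.1, dry_fraction=0.25):   # assumed values
    vol_um3 = math.pi * diameter_um * total_area_um2 / 4    # cylinder model
    vol_ml = vol_um3 * 1e-12                                # um^3 -> ml
    dry_g = vol_ml * density_g_ml * dry_fraction            # wet -> dry weight
    return dry_g / (sample_vol_ul * 1e-6)                   # per liter of broth

# e.g., 5e6 um^2 of hyphal area at 3 um diameter in a 10 ul sample
print(f"{biomass_g_per_l(5e6, 3.0, 10.0):.2f} g/l")
```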

  10. Improving the clinical correlation of multiple sclerosis black hole volume change by paired-scan analysis.

    Science.gov (United States)

    Tam, Roger C; Traboulsee, Anthony; Riddehough, Andrew; Li, David K B

    2012-01-01

    The change in T1-hypointense lesion ("black hole") volume is an important marker of pathological progression in multiple sclerosis (MS). Black hole boundaries often have low contrast and are difficult to determine accurately, and most (semi-)automated segmentation methods first compute the T2-hyperintense lesions, which are a superset of the black holes and are typically more distinct, to form a search space for the T1w lesions. Two main potential sources of measurement noise in longitudinal black hole volume computation are partial volume and variability in the T2w lesion segmentation. A paired analysis approach is proposed herein that uses registration to equalize partial volume and lesion mask processing to combine T2w lesion segmentations across time. The scans of 247 MS patients are used to compare a selected black hole computation method with an enhanced version incorporating paired analysis, using rank correlation to a clinical variable (MS functional composite) as the primary outcome measure. The comparison is done at nine different levels of intensity as a previous study suggests that darker black holes may yield stronger correlations. The results demonstrate that paired analysis can strongly improve longitudinal correlation (from -0.148 to -0.303 in this sample) and may produce segmentations that are more sensitive to clinically relevant changes.
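
    Two ingredients of the paired approach can be sketched: forming one common T2w search space by combining the two time points' lesion masks (after registration), and rank-correlating volume change against a clinical variable. The arrays below are simulated, and the union rule is an assumption about how the mask-combination step might work, not the paper's exact procedure.

```python
# Sketched paired analysis: a common T2w search space across time points,
# then Spearman rank correlation of volume change vs a clinical score.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(4)

# union of baseline and follow-up T2w lesion masks (after registration),
# so the same search space is applied at both time points
t2_base = rng.random((32, 32)) > 0.9
t2_follow = rng.random((32, 32)) > 0.9
search_space = t2_base | t2_follow
print("search-space voxels:", int(search_space.sum()))

# rank correlation of black hole volume change vs clinical change (e.g., MSFC)
dvol = rng.normal(size=50)
clinical = -0.3 * dvol + rng.normal(scale=1.0, size=50)
rho, p = spearmanr(dvol, clinical)
print(f"Spearman rho = {rho:.2f} (p = {p:.3f})")
```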

  11. Analysis of flexible aircraft longitudinal dynamics and handling qualities. Volume 2: Data

    Science.gov (United States)

    Waszak, M. R.; Schmidt, D. K.

    1985-01-01

    Two analysis methods are applied to a family of flexible aircraft in order to investigate how and when structural (especially dynamic aeroelastic) effects affect the dynamic characteristics of aircraft. The first is an open-loop modal analysis technique, which considers the effect of modal residue magnitudes on vehicle handling qualities. The second is a pilot-in-the-loop analysis procedure that considers several closed-loop system characteristics. Based on the evaluation of simulation results, both analyses indicated that dynamic aeroelastic effects caused a degradation in vehicle tracking performance. Volume 2 consists of the state-variable models of the flexible aircraft configurations used in the analysis applications, mode shape plots for the structural modes, numerical results from the modal analysis, frequency response plots from the pilot-in-the-loop analysis, and a listing of the modal analysis computer program.

  12. Using cognitive complexity analysis for the grading and sequencing of isiXhosa tasks in the curriculum design of a communication course for education students

    Directory of Open Access Journals (Sweden)

    Marianna Visser

    2011-09-01

    Full Text Available This article investigates the use of cognitive complexity analysis to inform the grading and sequencing of tasks in the curriculum design of a specific-purposes isiXhosa course for student teachers. Two frameworks of cognitive complexity, those of Skehan and Robinson, are discussed, after which two communication tasks are analysed in terms of Robinson's framework.

  13. Human Factors Process Task Analysis Liquid Oxygen Pump Acceptance Test Procedure for the Advanced Technology Development Center

    Science.gov (United States)

    Diorio, Kimberly A.

    2002-01-01

    A process task analysis effort was undertaken by Dynacs Inc. commencing in June 2002 under contract from NASA YA-D6. Funding was provided through NASA's Ames Research Center (ARC), Code M/HQ, and Industrial Engineering and Safety (IES). The John F. Kennedy Space Center (KSC) Engineering Development Contract (EDC) Task Order was 5SMA768. The scope of the effort was to conduct a Human Factors Process Failure Modes and Effects Analysis (HF PFMEA) of a hazardous activity and provide recommendations to eliminate or reduce the effects of errors caused by human factors. The Liquid Oxygen (LOX) Pump Acceptance Test Procedure (ATP) was selected for this analysis. The HF PFMEA table (see appendix A) provides an analysis of six major categories evaluated for this study. These categories include Personnel Certification, Test Procedure Format, Test Procedure Safety Controls, Test Article Data, Instrumentation, and Voice Communication. For each specific requirement listed in appendix A, the following topics were addressed: Requirement, Potential Human Error, Performance-Shaping Factors, Potential Effects of the Error, Barriers and Controls, Risk Priority Numbers, and Recommended Actions. This report summarizes findings and gives recommendations as determined by the data contained in appendix A. It also includes a discussion of technology barriers and challenges to performing task analyses, as well as lessons learned. The HF PFMEA table in appendix A recommends the use of accepted and required safety criteria in order to reduce the risk of human error. The items with the highest risk priority numbers should receive the greatest amount of consideration. Implementation of the recommendations will result in a safer operation for all personnel.
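
    The ranking step mentioned above is plain arithmetic: in conventional FMEA usage, RPN = severity x occurrence x detection, each scored 1-10. A generic sketch with invented entries (not the contents of appendix A):

        # Generic HF-PFMEA-style ranking: RPN = severity * occurrence * detection.
        # The items and scores below are illustrative only.
        items = [
            ("Skipped procedure step", 8, 4, 6),
            ("Misread instrumentation", 6, 5, 3),
            ("Garbled voice communication", 7, 3, 4),
        ]
        ranked = sorted(((s * o * d, name) for name, s, o, d in items), reverse=True)
        for rpn, name in ranked:
            print(f"RPN {rpn:4d}  {name}")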

  14. Laser Raman spectroscopic analysis of polymorphic forms in microliter fluid volumes.

    Science.gov (United States)

    Anquetil, Patrick A; Brenan, Colin J H; Marcolli, Claudia; Hunter, Ian W

    2003-01-01

    Knowledge and control of the polymorphic phase of chemical compounds are important aspects of drug development in the pharmaceutical industry. We report herein in situ and real-time Raman spectroscopic polymorphic analysis of optically trapped microcrystals in a microliter volume format. The system studied in particular was the recrystallization of carbamazepine (CBZ) in methanol. Raman spectrometry enabled noninvasive measurement of the amount of dissolved CBZ in a sample as well as polymorphic characterization, whereas exclusive recrystallization of either CBZ form I or CBZ form III from saturated solutions was achieved by specific selection of sample cell cooling profiles. Additionally, using a microcell versus a macroscopic volume gives the advantage of reaching equilibrium much faster while using little compound quantity. We demonstrate that laser Raman spectral polymorphic analysis in a microliter cell is a potentially viable screening platform for polymorphic analysis and could lead to a new high throughput method for polymorph screening.

  15. Forward Air Controller: Task Analysis and Development of Team Training Measures for Close Air Support

    Science.gov (United States)

    2007-12-01

    ...of AAR, as well as to determine adequate measures for evaluating the team's performance. To that end, a hierarchical analysis was carried out of the... a number of recommendations aimed at improving the instrument before its application to future distributed-simulation exercises. The... assessing the team's performance as a whole. To that end, the contractor constructed hierarchical task analyses for the principal members of this team, the

  16. Second generation pressurized fluidized-bed combustion (PFBC) research and development, Phase 2 --- Task 4, carbonizer testing. Volume 2, Data reconciliation

    Energy Technology Data Exchange (ETDEWEB)

    Froehlich, R.; Robertson, A.; Vanhook, J.; Goyal, A.; Rehmat, A.; Newby, R.

    1994-11-01

    During the period from November 1991 to September 1992, a series of tests was conducted at Foster Wheeler Development Corporation in a fluidized-bed coal carbonizer to determine its performance characteristics. The carbonizer was operated for 533 hours in a jetting fluidized-bed configuration, during which 36 set points (steady-state periods) were achieved. Extensive data were collected on the feed and product stream compositions, heating values, temperatures, and flow rates. With these data, elemental and energy balances were computed to evaluate and confirm the accuracy of the data. The carbonizer data were not as self-consistent as could be desired (imperfect balance closure). A software package called BALAID, developed by Science Ventures, Inc., of California, was used to reconcile the carbonizer data. The reconciled data were rigorously analyzed, correlations were developed, and the model was updated accordingly; the model was then used to simulate each of the 36 steady-state periods achieved in the pilot plant, as detailed in Volume 1. This Volume 2 provides the details of the carbonizer data reconciliation.

  17. Correcting Working Postures in Water Pump Assembly Tasks using the OVAKO Work Analysis System (OWAS)

    Directory of Open Access Journals (Sweden)

    Atiya Kadhim Al-Zuheri

    2008-01-01

    Full Text Available The Ovako Working Posture Analyzing System (OWAS) is a widely used method for studying awkward working postures in workplaces. This OWAS study analyzed working postures for the manual material handling of laminations at the stacking workstation of a water pump assembly line in the Electrical Industrial Company (EICO), Baghdad. A computer program, WinOWAS, was used for the study. In the real-life workstation, more than 26% of the working postures observed were classified as either AC2 (slightly harmful) or AC3 (distinctly harmful). Postures that needed to be corrected soon (AC3) and the corresponding tasks were identified. The most stressful tasks observed were workers grasping, handling, and positioning the laminations. The real-life workstation was then modified according to redesign suggestions for the location (positioning) factors of the stacking workstation, and the modified workstation was simulated by means of parametric CAD software. These modifications led to an improvement in the percentage of harmful postures. It is nevertheless recommended that supplementary methods be used to identify ergonomic risk factors for handling work or other hand-intensive activities at industrial sites.
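
    The core of an OWAS evaluation is a tally of observed postures by action category (AC1 = no action needed through AC4 = corrective action needed immediately). A toy version of the kind of summary WinOWAS produces, with invented counts:

        from collections import Counter

        # Hypothetical stream of OWAS action categories, one per observed posture.
        observations = ["AC1"] * 52 + ["AC2"] * 12 + ["AC3"] * 7 + ["AC4"] * 1
        counts = Counter(observations)
        n = len(observations)
        for ac in ("AC1", "AC2", "AC3", "AC4"):
            print(f"{ac}: {counts[ac]:3d} ({100 * counts[ac] / n:.1f}%)")
        harmful = counts["AC2"] + counts["AC3"] + counts["AC4"]
        print(f"harmful postures (AC2-AC4): {100 * harmful / n:.1f}%")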

  18. Mathematical tasks, study approaches, and course grades in undergraduate mathematics: a year-by-year analysis

    Science.gov (United States)

    Maciejewski, Wes; Merchant, Sandra

    2016-04-01

    Students approach learning in different ways, depending on the experienced learning situation. A deep approach is geared toward long-term retention and conceptual change while a surface approach focuses on quickly acquiring knowledge for immediate use. These approaches ultimately affect the students' academic outcomes. This study takes a cross-sectional look at the approaches to learning used by students in courses across all four years of undergraduate mathematics and analyses how these relate to the students' grades. We find that deep learning correlates with grade in the first year but not in the upper years. Surface learning has no correlation with grades in the first year and a strong negative correlation with grades in the upper years. Using Bloom's taxonomy, we argue that the nature of the tasks given to students is fundamentally different in lower- and upper-year courses. We find that first-year courses emphasize tasks that require only low-level cognitive processes. Upper-year courses require higher-level processes but, surprisingly, place a simultaneously greater emphasis on recall and understanding. These observations explain the differences in correlations between approaches to learning and course grades. We conclude with some concerns about the disconnect between first-year and upper-year mathematics courses and the effect this may have on students.

  19. Accuracy Maximization Analysis for Sensory-Perceptual Tasks: Computational Improvements, Filter Robustness, and Coding Advantages for Scaled Additive Noise

    Science.gov (United States)

    Burge, Johannes

    2017-01-01

    Accuracy Maximization Analysis (AMA) is a recently developed Bayesian ideal observer method for task-specific dimensionality reduction. Given a training set of proximal stimuli (e.g. retinal images), a response noise model, and a cost function, AMA returns the filters (i.e. receptive fields) that extract the most useful stimulus features for estimating a user-specified latent variable from those stimuli. Here, we first contribute two technical advances that significantly reduce AMA’s compute time: we derive gradients of cost functions for which two popular estimators are appropriate, and we implement a stochastic gradient descent (AMA-SGD) routine for filter learning. Next, we show how the method can be used to simultaneously probe the impact on neural encoding of natural stimulus variability, the prior over the latent variable, noise power, and the choice of cost function. Then, we examine the geometry of AMA’s unique combination of properties that distinguish it from better-known statistical methods. Using binocular disparity estimation as a concrete test case, we develop insights that have general implications for understanding neural encoding and decoding in a broad class of fundamental sensory-perceptual tasks connected to the energy model. Specifically, we find that non-orthogonal (partially redundant) filters with scaled additive noise tend to outperform orthogonal filters with constant additive noise; non-orthogonal filters and scaled additive noise can interact to sculpt noise-induced stimulus encoding uncertainty to match task-irrelevant stimulus variability. Thus, we show that some properties of neural response thought to be biophysical nuisances can confer coding advantages to neural systems. Finally, we speculate that, if repurposed for the problem of neural systems identification, AMA may be able to overcome a fundamental limitation of standard subunit model estimation. As natural stimuli become more widely used in the study of psychophysical and

  20. Perfusion analysis using a wide coverage flat-panel volume CT: feasibility study

    Science.gov (United States)

    Grasruck, M.; Gupta, R.; Reichardt, B.; Klotz, E.; Schmidt, B.; Flohr, T.

    2007-03-01

    We developed a flat-panel detector based Volume CT (VCT) prototype scanner with large z-coverage. In this prototype a Varian 4030CB a-Si flat-panel detector was mounted in a multi-slice CT gantry (Siemens Medical Solutions), providing a 25 cm field of view with 18 cm z-coverage at isocenter. The large volume covered in one rotation can be used for visualization of complete organs of small animals, e.g., rabbits. By implementing a continuous-scanning mode, we are able to reconstruct the complete volume at any point in time during the propagation of a contrast bolus. Multiple volumetric reconstructions over time elucidate the first-pass dynamics of a contrast bolus, resulting in 4-D angiography and potentially allowing whole-organ perfusion analysis. We studied to what extent pixel-based permeability and blood volume calculation with a modified Patlak approach was possible. Experimental validation was performed by imaging the evolution of a contrast bolus in New Zealand rabbits. Despite the short circulation time of the rabbit, the temporal resolution was sufficient to visually resolve the various phases of the first pass of the contrast bolus. Perfusion imaging required substantial spatial smoothing but allowed a qualitative discrimination of different types of parenchyma in brain and liver. Whether a truly quantitative analysis is possible requires further study.
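
    The Patlak step can be sketched compactly: plotting the tissue-to-arterial enhancement ratio against the time-normalized integral of the arterial curve turns the two-compartment model into a straight line whose slope and intercept read off permeability and blood volume. A toy version on synthetic curves (all values invented):

        import numpy as np

        t = np.linspace(0, 60, 121)                      # s
        dt = t[1] - t[0]
        ca = 5.0 * t * np.exp(-t / 12.0)                 # synthetic arterial input
        K_true, vb_true = 0.02, 0.05                     # invented "ground truth"
        ct = K_true * np.cumsum(ca) * dt + vb_true * ca  # synthetic tissue curve

        mask = ca > 0.1 * ca.max()                       # avoid dividing by ~0
        x = np.cumsum(ca)[mask] * dt / ca[mask]          # Patlak abscissa
        y = ct[mask] / ca[mask]                          # Patlak ordinate
        K_fit, vb_fit = np.polyfit(x, y, 1)              # slope, intercept
        print(f"permeability ~ {K_fit:.4f} 1/s, blood volume ~ {vb_fit:.3f}")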

  1. Bayesian network analysis revealed the connectivity difference of the default mode network from the resting-state to task-state.

    Science.gov (United States)

    Wu, Xia; Yu, Xinyu; Yao, Li; Li, Rui

    2014-01-01

    Functional magnetic resonance imaging (fMRI) studies have converged to reveal the default mode network (DMN), a constellation of regions that display co-activation during resting-state but co-deactivation during attention-demanding tasks in the brain. Here, we employed a Bayesian network (BN) analysis method to construct a directed effective connectivity model of the DMN and compared the organizational architecture and interregional directed connections under both resting-state and task-state. The analysis results indicated that the DMN was consistently organized into two closely interacting subsystems in both resting-state and task-state. The directed connections between DMN regions, however, changed significantly from the resting-state to task-state condition. The results suggest that the DMN intrinsically maintains a relatively stable structure whether at rest or performing tasks but has different information processing mechanisms under varied states.

  2. Analysis of postural load during tasks related to milking cows-a case study.

    Science.gov (United States)

    Groborz, Anna; Tokarski, Tomasz; Roman-Liu, Danuta

    2011-01-01

    The aim of this study was to analyse the postural load during tasks related to milking cows for two farmers on two different farms (one with a manual milk transport system, the other with a fully automated milk transport system) as a case study. The participants were full-time farmers, both healthy and experienced in their job. The Ovako Working Posture Analyzing System (OWAS) was used to evaluate postural load and postural risk. Postural load was medium for the farmer on the farm with the manual milk transport system and high for the farmer on the farm with the fully automated milk transport system. It can thus be concluded that a higher level of farm mechanization does not always mean that the farmer's postural load is lower; the limitations of OWAS should, however, be considered.

  3. Multi-Task Learning for Food Identification and Analysis with Deep Convolutional Neural Networks

    Institute of Scientific and Technical Information of China (English)

    Xi-Jin Zhang; Yi-Fan Lu; Song-Hai Zhang

    2016-01-01

    In this paper, we propose a multi-task system that can identify dish types, food ingredients, and cooking methods from food images with deep convolutional neural networks. We built a dataset of 360 classes of different foods with at least 500 images per class. To reduce noise in the data, which was collected from the Internet, outlier images were detected and eliminated with a one-class SVM trained on deep convolutional features. We simultaneously trained a dish identifier, a cooking-method recognizer, and a multi-label ingredient detector, which share a few low-level layers in the deep network architecture. The proposed framework shows higher accuracy than traditional methods with handcrafted features, and the cooking-method recognizer and ingredient detector can be applied to dishes not included in the training dataset to provide reference information for users.
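
    The head-sharing architecture is easy to sketch. Below is a schematic PyTorch version, not the authors' network: layer sizes, the 10-method head, and the 500-ingredient head are invented; only the 360 dish classes come from the abstract.

        import torch
        import torch.nn as nn

        class MultiTaskFoodNet(nn.Module):
            def __init__(self, n_dishes=360, n_methods=10, n_ingredients=500):
                super().__init__()
                self.shared = nn.Sequential(            # shared low-level layers
                    nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
                    nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
                    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
                )
                self.dish = nn.Linear(64, n_dishes)             # single-label head
                self.method = nn.Linear(64, n_methods)          # single-label head
                self.ingredient = nn.Linear(64, n_ingredients)  # multi-label head

            def forward(self, x):
                h = self.shared(x)
                return (self.dish(h), self.method(h),
                        torch.sigmoid(self.ingredient(h)))

        d, m, i = MultiTaskFoodNet()(torch.randn(2, 3, 224, 224))
        print(d.shape, m.shape, i.shape)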

  4. Development of Field Methodology and Processes for Task Analysis and Training Feedback

    Science.gov (United States)

    1978-10-31

    [Abstract unavailable: the scanned source yields only OCR residue of task-analysis charts; the recoverable fragments refer to manual task listings organized by duty position, skill level, and technical interface.]

  5. Predicting Nonauditory Adverse Radiation Effects Following Radiosurgery for Vestibular Schwannoma: A Volume and Dosimetric Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Hayhurst, Caroline; Monsalves, Eric; Bernstein, Mark; Gentili, Fred [Gamma Knife Unit, Division of Neurosurgery, University Health Network, Toronto (Canada); Heydarian, Mostafa; Tsao, May [Radiation Medicine Program, Princess Margaret Hospital, Toronto (Canada); Schwartz, Michael [Radiation Oncology Program and Division of Neurosurgery, Sunnybrook Hospital, Toronto (Canada); Prooijen, Monique van [Radiation Medicine Program, Princess Margaret Hospital, Toronto (Canada); Millar, Barbara-Ann; Menard, Cynthia [Radiation Oncology Program, Princess Margaret Hospital, Toronto (Canada); Kulkarni, Abhaya V. [Division of Neurosurgery, Hospital for Sick Children, University of Toronto (Canada); Laperriere, Norm [Radiation Oncology Program, Princess Margaret Hospital, Toronto (Canada); Zadeh, Gelareh, E-mail: Gelareh.Zadeh@uhn.on.ca [Gamma Knife Unit, Division of Neurosurgery, University Health Network, Toronto (Canada)

    2012-04-01

    Purpose: To define clinical and dosimetric predictors of nonauditory adverse radiation effects after radiosurgery for vestibular schwannoma treated with a 12 Gy prescription dose. Methods: We retrospectively reviewed our experience of vestibular schwannoma patients treated between September 2005 and December 2009. Two hundred patients were treated at a 12 Gy prescription dose; 80 had complete clinical and radiological follow-up for at least 24 months (median, 28.5 months). All treatment plans were reviewed for target volume and dosimetry characteristics; gradient index; homogeneity index, defined as the maximum dose in the treatment volume divided by the prescription dose; conformity index; brainstem; and trigeminal nerve dose. All adverse radiation effects (ARE) were recorded. Because the intent of our study was to focus on the nonauditory adverse effects, hearing outcome was not evaluated in this study. Results: Twenty-seven (33.8%) patients developed ARE, 5 (6%) developed hydrocephalus, 10 (12.5%) reported new ataxia, 17 (21%) developed trigeminal dysfunction, 3 (3.75%) had facial weakness, and 1 patient developed hemifacial spasm. The development of edema within the pons was significantly associated with ARE (p = 0.001). On multivariate analysis, only target volume is a significant predictor of ARE (p = 0.001). There is a target volume threshold of 5 cm3, above which ARE are more likely. The treatment plan dosimetric characteristics are not associated with ARE, although the maximum dose to the 5th nerve is a significant predictor of trigeminal dysfunction, with a threshold of 9 Gy. The overall 2-year tumor control rate was 96%. Conclusions: Target volume is the most important predictor of adverse radiation effects, and we identified the significant treatment volume threshold to be 5 cm3. We also established through our series that the maximum tolerable dose to the 5th nerve is 9 Gy.

  6. Effects of Intraday Patterns on Analysis of STOCK Market Index and Trading Volume

    Science.gov (United States)

    Choi, Hyung Wooc; Maeng, Seong Eun; Lee, Jae Woo

    We review the stylized properties of the stock market and consider the effects of intraday patterns on the analysis of the time series of the stock index and trading volume in the Korean stock market. In this market, the probability distribution function (pdf) of the return and of the change in trading volume followed a power law, and the volatility of the stock index showed long-time memory, with an autocorrelation function that followed a power law. We applied two methods of eliminating intraday patterns: scaling by the intraday pattern of the time series itself, and scaling by the intraday pattern of the absolute return (for the index) or the absolute volume change. We then considered the probability distribution function and the autocorrelation function (ACF) of the time series scaled by each type of intraday pattern. The cumulative probability distribution function of the scaled returns followed a power law, P>(r) ~ r^(-α±), where α± are the exponents of the positive and negative fat tails. The pdf of the return scaled by the intraday pattern of the absolute return decayed much more steeply than that of the return scaled by the intraday pattern of the index itself. The pdf of the volume change also followed a power law for both methods of eliminating intraday patterns; however, the exponents of the power law at the fat tails did not depend on the intraday patterns. The ACF of the absolute return showed long-time correlation and followed a power law for both the scaled index and the scaled volume, and the daily periodicity of the ACF was removed in the time series scaled by the intraday pattern of the absolute return or the absolute volume change.
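
    The deseasonalizing step can be sketched in a few lines, assuming the intraday pattern is estimated as the time-of-day average of absolute returns (synthetic data below; real tick data would replace it):

        import numpy as np

        rng = np.random.default_rng(0)
        n_days, bins_per_day = 250, 78                # e.g. 5-minute bins
        r = rng.standard_normal((n_days, bins_per_day))
        u_shape = 1.0 + 0.8 * np.cos(np.linspace(0, 2 * np.pi, bins_per_day)) ** 2
        r *= u_shape                                  # inject a U-shaped pattern

        pattern = np.abs(r).mean(axis=0)              # intraday pattern of |r|
        r_scaled = r / pattern                        # deseasonalized returns
        print(np.abs(r_scaled).mean(axis=0)[:8])      # ~flat across the day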

  7. HYDRA-II: A hydrothermal analysis computer code: Volume 3, Verification/validation assessments

    Energy Technology Data Exchange (ETDEWEB)

    McCann, R.A.; Lowery, P.S.

    1987-10-01

    HYDRA-II is a hydrothermal computer code capable of three-dimensional analysis of coupled conduction, convection, and thermal radiation problems. This code is especially appropriate for simulating the steady-state performance of spent fuel storage systems. The code has been evaluated for this application for the US Department of Energy's Commercial Spent Fuel Management Program. HYDRA-II provides a finite difference solution in cartesian coordinates to the equations governing the conservation of mass, momentum, and energy. A cylindrical coordinate system may also be used to enclose the cartesian coordinate system. This exterior coordinate system is useful for modeling cylindrical cask bodies. The difference equations for conservation of momentum are enhanced by the incorporation of directional porosities and permeabilities that aid in modeling solid structures whose dimensions may be smaller than the computational mesh. The equation for conservation of energy permits modeling of orthotropic physical properties and film resistances. Several automated procedures are available to model radiation transfer within enclosures and from fuel rod to fuel rod. The documentation of HYDRA-II is presented in three separate volumes. Volume I - Equations and Numerics describes the basic differential equations, illustrates how the difference equations are formulated, and gives the solution procedures employed. Volume II - User's Manual contains code flow charts, discusses the code structure, provides detailed instructions for preparing an input file, and illustrates the operation of the code by means of a model problem. This volume, Volume III - Verification/Validation Assessments, provides a comparison between the analytical solution and the numerical simulation for problems with a known solution. This volume also documents comparisons between the results of simulations of single- and multiassembly storage systems and actual experimental data. 11 refs., 55 figs., 13 tabs.

  8. HYDRA-II: A hydrothermal analysis computer code: Volume 2, User's manual

    Energy Technology Data Exchange (ETDEWEB)

    McCann, R.A.; Lowery, P.S.; Lessor, D.L.

    1987-09-01

    HYDRA-II is a hydrothermal computer code capable of three-dimensional analysis of coupled conduction, convection, and thermal radiation problems. This code is especially appropriate for simulating the steady-state performance of spent fuel storage systems. The code has been evaluated for this application for the US Department of Energy's Commercial Spent Fuel Management Program. HYDRA-II provides a finite-difference solution in cartesian coordinates to the equations governing the conservation of mass, momentum, and energy. A cylindrical coordinate system may also be used to enclose the cartesian coordinate system. This exterior coordinate system is useful for modeling cylindrical cask bodies. The difference equations for conservation of momentum incorporate directional porosities and permeabilities that are available to model solid structures whose dimensions may be smaller than the computational mesh. The equation for conservation of energy permits modeling of orthotropic physical properties and film resistances. Several automated methods are available to model radiation transfer within enclosures and from fuel rod to fuel rod. The documentation of HYDRA-II is presented in three separate volumes. Volume 1 - Equations and Numerics describes the basic differential equations, illustrates how the difference equations are formulated, and gives the solution procedures employed. This volume, Volume 2 - User's Manual, contains code flow charts, discusses the code structure, provides detailed instructions for preparing an input file, and illustrates the operation of the code by means of a sample problem. The final volume, Volume 3 - Verification/Validation Assessments, provides a comparison between the analytical solution and the numerical simulation for problems with a known solution. 6 refs.

  9. GGO nodule volume-preserving nonrigid lung registration using GLCM texture analysis.

    Science.gov (United States)

    Park, Seongjin; Kim, Bohyoung; Lee, Jeongjin; Goo, Jin Mo; Shin, Yeong-Gil

    2011-10-01

    In lung cancer screening, benign and malignant nodules can be classified through nodule growth assessment by the registration and, then, subtraction between follow-up computed tomography scans. During the registration, the volume of nodule regions in the floating image should be preserved, whereas the volume of other regions in the floating image should be aligned to that in the reference image. However, ground glass opacity (GGO) nodules are very elusive to automatically segment due to their inhomogeneous interior. In other words, it is difficult to automatically define the volume-preserving regions of GGO nodules. In this paper, we propose an accurate and fast nonrigid registration method. It applies the volume-preserving constraint to candidate regions of GGO nodules, which are automatically detected by gray-level cooccurrence matrix (GLCM) texture analysis. Considering that GGO nodules can be characterized by their inner inhomogeneity and high intensity, we identify the candidate regions of GGO nodules based on the homogeneity values calculated by the GLCM and the intensity values. Furthermore, we accelerate our nonrigid registration by using Compute Unified Device Architecture (CUDA). In the nonrigid registration process, the computationally expensive procedures of the floating-image transformation and the cost-function calculation are accelerated by using CUDA. The experimental results demonstrated that our method almost perfectly preserves the volume of GGO nodules in the floating image as well as effectively aligns the lung between the reference and floating images. Regarding the computational performance, our CUDA-based method delivers about 20× faster registration than the conventional method. Our method can be successfully applied to a GGO nodule follow-up study and can be extended to the volume-preserving registration and subtraction of specific diseases in other organs (e.g., liver cancer).
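
    The candidate-detection idea (low GLCM homogeneity plus high intensity) can be sketched with scikit-image; the function names below follow skimage >= 0.19, and the thresholds are invented:

        import numpy as np
        from skimage.feature import graycomatrix, graycoprops

        # A noisy, bright synthetic patch standing in for a CT sub-region.
        rng = np.random.default_rng(0)
        patch = rng.integers(120, 200, size=(32, 32), dtype=np.uint8)

        glcm = graycomatrix(patch, distances=[1], angles=[0, np.pi / 2],
                            levels=256, symmetric=True, normed=True)
        homogeneity = graycoprops(glcm, "homogeneity").mean()
        mean_intensity = patch.mean()

        # GGO-like: inhomogeneous interior (low homogeneity) but high intensity.
        is_candidate = homogeneity < 0.3 and mean_intensity > 100
        print(f"homogeneity={homogeneity:.3f}, mean={mean_intensity:.1f}, "
              f"candidate={is_candidate}")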

  10. FINITE VOLUME NUMERICAL ANALYSIS FOR PARABOLIC EQUATION WITH ROBIN BOUNDARY CONDITION

    Institute of Scientific and Technical Information of China (English)

    Xia Cui

    2005-01-01

    In this paper, the finite volume method on unstructured meshes is studied for a parabolic convection-diffusion problem on an open bounded set of R^d (d = 2 or 3) with a Robin boundary condition. Upwind approximations are adopted to treat both the convection term and the Robin boundary condition. Starting directly from the formulation of the finite volume scheme, a numerical analysis is carried out. Using several discrete functional analysis techniques, such as summation by parts and discrete norm inequalities, the stability of and error estimates for the approximate solution are established; the existence and uniqueness of the approximate solution are obtained, along with first-order convergence in the temporal norm and in the L2 and H1 spatial norms.
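
    For flavor, a toy 1-D explicit analogue of the scheme's two ingredients, upwinded convection and a Robin-type boundary flux, is given below. It is an illustrative sketch only, not the paper's unstructured-mesh scheme; all parameters are invented.

        import numpy as np

        # u_t + a u_x = D u_xx on (0, 1); Robin (third-type) boundary treated
        # as a transfer flux kappa * (u - u_env) through each boundary face.
        N, a, D, kappa, u_env = 100, 1.0, 0.01, 0.5, 0.0
        dx = 1.0 / N
        dt = 0.4 * min(dx / a, dx * dx / (2 * D))     # explicit stability limit
        x = (np.arange(N) + 0.5) * dx
        u = np.exp(-200 * (x - 0.3) ** 2)             # initial bump

        for _ in range(int(0.3 / dt)):
            flux = np.empty(N + 1)
            # interior faces: upwind convection (a > 0) + central diffusion
            flux[1:-1] = a * u[:-1] - D * (u[1:] - u[:-1]) / dx
            flux[0] = a * u_env - kappa * (u[0] - u_env)    # left boundary face
            flux[-1] = a * u[-1] + kappa * (u[-1] - u_env)  # right boundary face
            u -= dt * (flux[1:] - flux[:-1]) / dx

        print(f"mass remaining: {u.sum() * dx:.4f}")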

  11. Human factors evaluation of remote afterloading brachytherapy: Human error and critical tasks in remote afterloading brachytherapy and approaches for improved system performance. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Callan, J.R.; Kelly, R.T.; Quinn, M.L. [Pacific Science and Engineering Group, San Diego, CA (United States)] [and others

    1995-05-01

    Remote Afterloading Brachytherapy (RAB) is a medical process used in the treatment of cancer. RAB uses a computer-controlled device to remotely insert and remove radioactive sources close to a target (or tumor) in the body. Some RAB problems affecting the radiation dose to the patient have been reported and attributed to human error. To determine the root cause of human error in the RAB system, a human factors team visited 23 RAB treatment sites in the US. The team observed RAB treatment planning and delivery, interviewed RAB personnel, and performed walk-throughs, during which staff demonstrated the procedures and practices used in performing RAB tasks. Factors leading to human error in the RAB system were identified. The impact of those factors on the performance of RAB was then evaluated and prioritized in terms of safety significance. Finally, the project identified and evaluated alternative approaches for resolving the safety-significant problems related to human error.

  12. Effect of varicocelectomy on testis volume and semen parameters in adolescents: a meta-analysis

    Directory of Open Access Journals (Sweden)

    Tie Zhou

    2015-01-01

    Full Text Available Varicocele repair in adolescents remains controversial. Our aim is to identify and combine the clinical trial results published thus far to ascertain the efficacy of varicocelectomy in improving testis volume and semen parameters compared with nontreatment controls. A literature search was performed using Medline, Embase and Web of Science, which included results obtained from meta-analysis, randomized and nonrandomized controlled studies. The study population was adolescents with clinically palpable varicocele, with or without testicular asymmetry or abnormal semen parameters. Cases were allocated to treatment and observation groups, and testis volume or semen parameters were adopted as outcome measures. As a result, seven randomized controlled trials (RCTs) and nonrandomized controlled trials studying bilateral testis volume or semen parameters in both treatment and observation groups were identified. Using a random-effects model, the mean difference in testis volume between the treatment group and the observation group was 2.9 ml (95% confidence interval [CI]: 0.6, 5.2; P < 0.05) for the varicocele side and 1.5 ml (95% CI: 0.3, 2.7; P < 0.05) for the healthy side. The random-effects model analysis demonstrated that the mean differences in semen concentration, total semen motility, and normal morphology between the two groups were 13.7 × 10^6 ml^−1 (95% CI: −1.4, 28.8; P = 0.075), 2.5% (95% CI: −3.6, 8.6; P = 0.424), and 2.9% (95% CI: −3.0, 8.7; P = 0.336), respectively. In conclusion, although varicocelectomy significantly improved bilateral testis volume in adolescents with varicocele compared with observation, semen parameters did not show any statistically significant difference between the two groups. Well-planned, properly conducted RCTs are needed to confirm the above conclusion further and to explore whether varicocele repair in adolescents could subsequently improve spontaneous pregnancy rates.
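
    The pooling arithmetic behind such a random-effects mean difference is standard DerSimonian-Laird weighting; a minimal sketch with invented study numbers (not the trials analysed above):

        import numpy as np

        # Per-study mean differences in testis volume (ml) and standard errors;
        # the values are illustrative only.
        md = np.array([3.5, 2.1, 4.0, 1.8, 2.6])
        se = np.array([1.2, 0.9, 1.5, 0.8, 1.0])

        w = 1.0 / se**2                               # fixed-effect weights
        md_fe = np.sum(w * md) / np.sum(w)
        Q = np.sum(w * (md - md_fe) ** 2)             # Cochran's Q
        k = len(md)
        tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

        w_re = 1.0 / (se**2 + tau2)                   # random-effects weights
        md_re = np.sum(w_re * md) / np.sum(w_re)
        se_re = np.sqrt(1.0 / np.sum(w_re))
        print(f"pooled MD = {md_re:.2f} ml "
              f"(95% CI {md_re - 1.96 * se_re:.2f}, {md_re + 1.96 * se_re:.2f})")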

  13. Task Management in the New ATLAS Production System

    CERN Document Server

    De, K; The ATLAS collaboration; Klimentov, A; Potekhin, M; Vaniachine, A

    2014-01-01

    The ATLAS Production System is the top-level workflow manager which translates physicists' needs for production-level processing into actual workflows executed across about a hundred processing sites used globally by ATLAS. As the production workload increased in volume and complexity in recent years (the ATLAS production task count is above one million, with each task containing hundreds or thousands of jobs), there is a need to upgrade the Production System to meet the challenging requirements of the next LHC run while minimizing the operating costs. Providing a front-end and a management layer for petascale data processing and analysis, the new Production System contains generic subsystems that can be used in a wider range of applications. The main subsystems are the Database Engine for Tasks (DEFT) and the Job Execution and Definition Interface (JEDI). Based on users' requests, the DEFT subsystem manages inter-dependent groups of tasks (Meta-Tasks) and generates corresponding data processing workflows. Th...

  15. Magnetic resonance velocity imaging derived pressure differential using control volume analysis

    Directory of Open Access Journals (Sweden)

    Cohen Benjamin

    2011-03-01

    Full Text Available Abstract Background Diagnosis and treatment of hydrocephalus is hindered by a lack of systemic understanding of the interrelationships between pressures and flow of cerebrospinal fluid in the brain. Control volume analysis provides a fluid-physics approach to quantify and relate pressure and flow information. The objective of this study was to use control volume analysis and magnetic resonance velocity imaging to non-invasively estimate pressure differentials in vitro. Method A flow phantom was constructed, with water as the experimental fluid. The phantom was connected to a high-resolution differential pressure sensor and a computer-controlled pump producing sinusoidal flow. Magnetic resonance velocity measurements were taken and subsequently analyzed to derive pressure differential waveforms using momentum conservation principles. Independent sensor measurements were obtained for comparison. Results Using the magnetic resonance data, the momentum balance in the phantom was computed. The measured differential pressure force had an amplitude of 14.4 dynes (pressure gradient amplitude 0.30 Pa/cm). A 12.5% normalized root-mean-square deviation between the derived and directly measured pressure differentials was obtained. These experiments demonstrate one example of the potential utility of control volume analysis and the concepts involved in its application. Conclusions This study validates a non-invasive measurement technique for relating velocity measurements to pressure differentials. These methods may be applied to clinical measurements to estimate pressure differentials in vivo that could not be obtained with current clinical sensors.
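
    A toy version of the momentum-balance idea: for rigid plug flow in a straight tube, if convective and viscous terms are neglected, the control-volume balance reduces to dp(t) = -rho * L * du/dt. The sketch below applies this to a synthetic sinusoidal waveform (the study's analysis retains the full momentum terms; all values here are invented):

        import numpy as np

        rho, L = 1000.0, 0.10                   # kg/m^3, m (water, 10 cm tube)
        t = np.linspace(0.0, 2.0, 512)          # s
        u = 0.05 * np.sin(2 * np.pi * 1.0 * t)  # m/s, "measured" velocity

        dudt = np.gradient(u, t)
        dp = -rho * L * dudt                    # Pa, differential over length L
        print(f"peak |dp| ~ {np.abs(dp).max():.1f} Pa, "
              f"peak gradient ~ {np.abs(dp).max() / (100 * L):.2f} Pa/cm")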

  16. Knowledge and skills of the lamaze certified childbirth educator: results of a job task analysis.

    Science.gov (United States)

    Budin, Wendy C; Gross, Leon; Lothian, Judith A; Mendelson, Jeanne

    2014-01-01

    Content validity of certification examinations is demonstrated over time with comprehensive job analyses conducted and analyzed by experts, with data gathered from stakeholders. In November 2011, the Lamaze International Certification Council conducted a job analysis update of the 2002 job analysis survey. This article presents the background, methodology, and findings of the job analysis. Changes in the test blueprint based on these findings are presented.

  17. Crucial role of detailed function, task, timeline, link and human vulnerability analyses in HRA. [Human Reliability Analysis (HRA)

    Energy Technology Data Exchange (ETDEWEB)

    Ryan, T.G.; Haney, L.N.; Ostrom, L.T.

    1992-01-01

    This paper addresses one major cause of large uncertainties in human reliability analysis (HRA) results: the absence of detailed function, task, timeline, link and human vulnerability analyses. All too often this crucial step in the HRA process is done in a cursory fashion, using word of mouth or written procedures that may themselves incompletely or inaccurately represent the human action sequences and human error vulnerabilities being analyzed. The paper examines the potential contributions these detailed analyses can make in achieving quantitative and qualitative HRA results that are: (1) credible, that is, minimize uncertainty; (2) auditable, that is, systematically link the quantitative results to the qualitative information from which they are derived; (3) capable of supporting root cause analyses of human reliability factors determined to be major contributors to risk; and (4) capable of repeated measures and of being combined with similar results from other analyses to examine HRA issues transcending individual systems and facilities. Based on experience analyzing test and commercial nuclear reactors and medical applications of nuclear technology, an iterative process is suggested for doing detailed function, task, timeline, link and human vulnerability analyses using documentation reviews, open-ended and structured interviews, direct observations, and group techniques. Finally, the paper concludes that detailed analyses done in this manner by knowledgeable human factors practitioners can contribute significantly to the credibility, auditability, causal factor analysis, and combining goals of HRA.

  18. Man-machine Integration Design and Analysis System (MIDAS) Task Loading Model (TLM) experimental and software detailed design report

    Science.gov (United States)

    Staveland, Lowell

    1994-01-01

    This is the experimental and software detailed design report for the prototype task loading model (TLM) developed as part of the man-machine integration design and analysis system (MIDAS), as implemented and tested in phase 6 of the Army-NASA Aircrew/Aircraft Integration (A3I) Program. The A3I program is an exploratory development effort to advance the capabilities and use of computational representations of human performance and behavior in the design, synthesis, and analysis of manned systems. The MIDAS TLM computationally models the demands designs impose on operators to aid engineers in the conceptual design of aircraft crewstations. This report describes the TLM and the results of a series of experiments run during this phase to test its capabilities as a predictive task demand modeling tool. Specifically, it includes discussions of: the inputs and outputs of the TLM, the theories underlying it, the results of the test experiments, the use of the TLM both as a stand-alone tool and as part of a complete human operator simulation, and a brief introduction to the TLM software design.

  19. Big Cat Coalitions: A Comparative Analysis of Regional Brain Volumes in Felidae

    Science.gov (United States)

    Sakai, Sharleen T.; Arsznov, Bradley M.; Hristova, Ani E.; Yoon, Elise J.; Lundrigan, Barbara L.

    2016-01-01

    Broad-based species comparisons across mammalian orders suggest a number of factors that might influence the evolution of large brains. However, the relationship between these factors and total and regional brain size remains unclear. This study investigated the relationship between relative brain size and regional brain volumes and sociality in 13 felid species in hopes of revealing relationships that are not detected in more inclusive comparative studies. In addition, a more detailed analysis was conducted of four focal species: lions (Panthera leo), leopards (Panthera pardus), cougars (Puma concolor), and cheetahs (Acinonyx jubatus). These species differ markedly in sociality and behavioral flexibility, factors hypothesized to contribute to increased relative brain size and/or frontal cortex size. Lions are the only truly social species, living in prides. Although cheetahs are largely solitary, males often form small groups. Both leopards and cougars are solitary. Of the four species, leopards exhibit the most behavioral flexibility, readily adapting to changing circumstances. Regional brain volumes were analyzed using computed tomography. Skulls (n = 75) were scanned to create three-dimensional virtual endocasts, and regional brain volumes were measured using either sulcal or bony landmarks obtained from the endocasts or skulls. Phylogenetic least squares regression analyses found that sociality does not correspond with larger relative brain size in these species. However, the sociality/solitary variable significantly predicted anterior cerebrum (AC) volume, a region that includes frontal cortex. This latter finding is despite the fact that the two social species in our sample, lions and cheetahs, possess the largest and smallest relative AC volumes, respectively. Additionally, an ANOVA comparing regional brain volumes in four focal species revealed that lions and leopards, while not significantly different from one another, have relatively larger AC volumes

  20. Big Cat Coalitions: A comparative analysis of regional brain volumes in Felidae

    Directory of Open Access Journals (Sweden)

    Sharleen T Sakai

    2016-10-01

    Full Text Available Broad-based species comparisons across mammalian orders suggest a number of factors that might influence the evolution of large brains. However, the relationship between these factors and total and regional brain size remains unclear. This study investigated the relationship between relative brain size and regional brain volumes and sociality in 13 felid species in hopes of revealing relationships that are not detected in more inclusive comparative studies. In addition, a more detailed analysis was conducted of 4 focal species: lions (Panthera leo), leopards (Panthera pardus), cougars (Puma concolor), and cheetahs (Acinonyx jubatus). These species differ markedly in sociality and behavioral flexibility, factors hypothesized to contribute to increased relative brain size and/or frontal cortex size. Lions are the only truly social species, living in prides. Although cheetahs are largely solitary, males often form small groups. Both leopards and cougars are solitary. Of the four species, leopards exhibit the most behavioral flexibility, readily adapting to changing circumstances. Regional brain volumes were analyzed using computed tomography (CT). Skulls (n=75) were scanned to create three-dimensional virtual endocasts, and regional brain volumes were measured using either sulcal or bony landmarks obtained from the endocasts or skulls. Phylogenetic least squares (PGLS) regression analyses found that sociality does not correspond with larger relative brain size in these species. However, the sociality/solitary variable significantly predicted anterior cerebrum (AC) volume, a region that includes frontal cortex. This latter finding is despite the fact that the two social species in our sample, lions and cheetahs, possess the largest and smallest relative AC volumes, respectively. Additionally, an ANOVA comparing regional brain volumes in 4 focal species revealed that lions and leopards, while not significantly different from one another, have relatively

  1. Fabrication, testing, and analysis of anisotropic carbon/glass hybrid composites: volume 1: technical report.

    Energy Technology Data Exchange (ETDEWEB)

    Wetzel, Kyle K. (Wetzel Engineering, Inc. Lawrence, Kansas); Hermann, Thomas M. (Wichita state University, Wichita, Kansas); Locke, James (Wichita state University, Wichita, Kansas)

    2005-11-01

    ...° from the long axis for approximately two-thirds of the laminate volume (discounting skin layers), with reinforcing carbon fibers oriented axially comprising the remaining one-third of the volume. Finite element analysis of each laminate has been performed to examine first ply failure. Three failure criteria--maximum stress, maximum strain, and Tsai-Wu--have been compared. Failure predicted by all three criteria proves generally conservative, with the stress-based criteria the most conservative. For laminates that respond nonlinearly to loading, large error is observed in the prediction of failure using maximum strain as the criterion. This report documents the methods and results in two volumes. Volume 1 contains descriptions of the laminates, their fabrication and testing, the methods of analysis, the results, and the conclusions and recommendations. Volume 2 contains a comprehensive summary of the individual test results for all laminates.

  2. Performance Prediction Modelling for Flexible Pavement on Low Volume Roads Using Multiple Linear Regression Analysis

    Directory of Open Access Journals (Sweden)

    C. Makendran

    2015-01-01

    Full Text Available Prediction models for low volume village roads in India are developed to evaluate the progression of different types of distress such as roughness, cracking, and potholes. Even though the Government of India invests a huge amount of money in road construction every year, poor control over the quality of road construction and its subsequent maintenance leads to faster road deterioration. In this regard, it is essential that scientific maintenance procedures be evolved on the basis of the performance of low volume flexible pavements. Considering the above, an attempt has been made in this research endeavor to develop prediction models to understand the progression of roughness, cracking, and potholes in flexible pavements exposed to little or no routine maintenance. Distress data were collected from low volume rural roads covering about 173 stretches spread across the state of Tamil Nadu in India. Based on the collected data, distress prediction models have been developed using multiple linear regression analysis. Further, the models have been validated using independent field data. It can be concluded that the models developed in this study can serve as useful tools for practicing engineers maintaining flexible pavements on low volume roads.
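
    A distress model of this form is an ordinary multiple linear regression. A minimal sketch with invented observations (the study fit its models to data from 173 surveyed stretches):

        import numpy as np
        from sklearn.linear_model import LinearRegression

        age = np.array([1, 2, 3, 4, 5, 6, 7, 8])                  # years
        traffic = np.array([20, 45, 60, 90, 120, 140, 170, 200])  # veh/day
        iri = np.array([3.2, 3.8, 4.1, 4.9, 5.6, 6.0, 6.8, 7.5])  # roughness, m/km

        X = np.column_stack([age, traffic])
        model = LinearRegression().fit(X, iri)
        print("coefficients:", model.coef_.round(3))
        print("intercept:", round(model.intercept_, 3))
        print("R^2:", round(model.score(X, iri), 3))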

  3. Coupled Structural, Thermal, Phase-Change and Electromagnetic Analysis for Superconductors. Volume 1

    Science.gov (United States)

    Felippa, C. A.; Farhat, C.; Park, K. C.; Militello, C.; Schuler, J. J.

    1996-01-01

    Described are the theoretical development and computer implementation of reliable and efficient methods for the analysis of coupled mechanical problems that involve the interaction of mechanical, thermal, phase-change and electromagnetic subproblems. The focus application has been the modeling of superconductivity and associated quantum-state phase-change phenomena. In support of this objective the work has addressed the following issues: (1) development of variational principles for finite elements, (2) finite element modeling of the electromagnetic problem, (3) coupling of thermal and mechanical effects, and (4) computer implementation and solution of the superconductivity transition problem. The main accomplishments have been: (1) the development of the theory of parametrized and gauged variational principles, (2) the application of those principles to the construction of electromagnetic, thermal and mechanical finite elements, (3) the coupling of electromagnetic finite elements with thermal and superconducting effects, and (4) the first detailed finite element simulations of bulk superconductors, in particular the Meissner effect and the nature of the normal conducting boundary layer. The theoretical development is described in two volumes. This volume, Volume 1, describes mostly formulations for specific problems. Volume 2 describes generalizations of those formulations.

  4. Independent Orbiter Assessment (IOA): Analysis of the Electrical Power Distribution and Control Subsystem, Volume 2

    Science.gov (United States)

    Schmeckpeper, K. R.

    1987-01-01

    The results of the Independent Orbiter Assessment (IOA) of the Failure Modes and Effects Analysis (FMEA) and Critical Items List (CIL) are presented. The IOA approach features a top-down analysis of the hardware to determine failure modes, criticality, and potential critical items. To preserve independence, this analysis was accomplished without reliance upon the results contained within the NASA FMEA/CIL documentation. This report documents the independent analysis results corresponding to the Orbiter Electrical Power Distribution and Control (EPD and C) hardware. The EPD and C hardware performs the functions of distributing, sensing, and controlling 28 volt DC power and of inverting, distributing, sensing, and controlling 117 volt 400 Hz AC power to all Orbiter subsystems from the three fuel cells in the Electrical Power Generation (EPG) subsystem. Volume 2 continues the presentation of IOA analysis worksheets and contains the potential critical items list.

  5. Scale-4 Analysis of Pressurized Water Reactor Critical Configurations: Volume 1-Summary

    Energy Technology Data Exchange (ETDEWEB)

    DeHart, M.D.

    1995-01-01

    The requirements of ANSI/ANS 8.1 specify that calculational methods for away-from-reactor criticality safety analyses be validated against experimental measurements. If credit is to be taken for the reduced reactivity of burned or spent fuel relative to its original "fresh" composition, it is necessary to benchmark computational methods used in determining such reactivity worth against spent fuel reactivity measurements. This report summarizes a portion of the ongoing effort to benchmark away-from-reactor criticality analysis methods using critical configurations from commercial pressurized-water reactors (PWR). The analysis methodology utilized for all calculations in this report is based on the modules and data associated with the SCALE-4 code system. Isotopic densities for spent fuel assemblies in the core were calculated using the SAS2H analytical sequence in SCALE-4. The sources of data and the procedures for deriving SAS2H input parameters are described in detail. The SNIKR code sequence was used to extract the necessary isotopic densities from SAS2H results and to provide the data in the format required for SCALE-4 criticality analysis modules. The CSASN analytical sequence in SCALE-4 was used to perform resonance processing of cross sections. The KENO V.a module of SCALE-4 was used to calculate the effective multiplication factor (k_eff) for the critical configuration. The SCALE-4 27-group burnup library containing ENDF/B-IV (actinides) and ENDF/B-V (fission products) data was used for analysis of each critical configuration. Each of the five volumes comprising this report provides an overview of the methodology applied. Subsequent volumes also describe in detail the approach taken in performing criticality calculations for these PWR configurations: Volume 2 describes criticality calculations for the Tennessee Valley Authority's Sequoyah Unit 2 reactor for Cycle 3; Volume 3 documents the analysis of Virginia Power

  6. Leveraging Data Analysis for Domain Experts: An Embeddable Framework for Basic Data Science Tasks

    Science.gov (United States)

    Lohrer, Johannes-Y.; Kaltenthaler, Daniel; Kröger, Peer

    2016-01-01

    In this paper, we describe a framework for data analysis that can be embedded into a base application. Since it is important to analyze the data directly inside the application where the data is entered, a tool that allows the scientists to easily work with their data, supports and motivates the execution of further analysis of their data, which…

  7. Pressurized fluidized-bed hydroretorting of eastern oil shales. Volume 2, Task 3, Testing of process improvement concepts: Final report, September 1987--May 1991

    Energy Technology Data Exchange (ETDEWEB)

    1992-03-01

    This final report, Volume 2, on ``Process Improvement Concepts`` presents the results of work conducted by the Institute of Gas Technology (IGT), the Illinois Institute of Technology (IIT), and the Ohio State University (OSU) to develop three novel approaches for desulfurization that have shown good potential with coal and could be cost-effective for oil shales. These are (1) In-Bed Sulfur Capture using different sorbents (IGT), (2) Electrostatic Desulfurization (IIT), and (3) Microbial Desulfurization and Denitrification (OSU and IGT). Results of work on electroseparation of shale oil and fines conducted by IIT is included in this report, as well as work conducted by IGT to evaluate the restricted pipe discharge system. The work was conducted as part of the overall program on ``Pressurized Fluidized-Bed Hydroretorting of Eastern Oil Shales.``

  8. Operator function modeling: Cognitive task analysis, modeling and intelligent aiding in supervisory control systems

    Science.gov (United States)

    Mitchell, Christine M.

    1990-01-01

    The design, implementation, and empirical evaluation of task-analytic models and intelligent aids for operators in the control of complex dynamic systems, specifically aerospace systems, are studied. Three related activities are included: (1) models of operator decision making in complex and predominantly automated space systems were developed and used; (2) the Operator Function Model (OFM) was used to represent operator activities; and (3) the Operator Function Model Expert System (OFMspert), a stand-alone knowledge-based system that interacts with a human operator in a manner similar to a human assistant in the control of aerospace systems, was developed. OFMspert is an architecture for an operator's assistant that uses the OFM as its system and operator knowledge base and a blackboard paradigm of problem solving to dynamically generate expectations about upcoming operator activities and to interpret actual operator actions. An experiment validated OFMspert's intent-inferencing capability and showed that it inferred the intentions of operators in ways comparable to both a human expert and the operators themselves. OFMspert was also augmented with control capabilities. An interface allowed the operator to interact with OFMspert, delegating as much or as little control responsibility as the operator chose. With its design based on the OFM, OFMspert's control capabilities were available at multiple levels of abstraction and allowed the operator a great deal of discretion over the amount and level of delegated control. An experiment showed that overall system performance was comparable for teams consisting of two human operators versus a human operator and OFMspert.

  9. Analysis of Heart Rate Variability in Chinese PLA Navy Global Visiting Task Group

    Institute of Scientific and Technical Information of China (English)

    ZHANG Yong-sheng; CHU Li-yun; GONG Ting

    2014-01-01

    Objective: To analyze heart rate variability in the Chinese PLA navy global visiting task group. Methods: We analyzed heart rate variability by Holter monitoring in 77 men and 4 women at 5-15 days before the voyage and at 65-75 and 115-125 days after the voyage began, and in 29 men and 3 women at 5-15 days after the voyage ended. Results: NN50 and VLF were lower in the 77 men and 4 women at 65-75 days after the voyage began than at 5-15 days before the voyage (P<0.01). SDANN was lower at 115-125 days than at 65-75 days (P<0.01). SDNN, SDANN, SDNN index, RMSSD, NN50, PNN50, triangular index, VLF, HLF, VAI, and VLI were lower at 65-75 days after the voyage began than at 5-15 days before the voyage (P<0.01). Conclusion: These findings suggest that a voyage may reduce heart rate variability for a long time.
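
    For reference, the time-domain indices reported here (SDNN, RMSSD, NN50, pNN50) have standard definitions over a series of normal-to-normal (NN) intervals. A minimal sketch, assuming a cleaned RR-interval series in milliseconds (the synthetic series is hypothetical, not the study's Holter data):

```python
import numpy as np

def hrv_time_domain(rr_ms):
    """Basic time-domain HRV indices from RR intervals (ms)."""
    rr = np.asarray(rr_ms, float)
    diff = np.diff(rr)
    sdnn = rr.std(ddof=1)                # overall variability
    rmssd = np.sqrt(np.mean(diff ** 2))  # beat-to-beat variability
    nn50 = np.sum(np.abs(diff) > 50.0)   # successive changes > 50 ms
    pnn50 = 100.0 * nn50 / diff.size
    return {"SDNN": sdnn, "RMSSD": rmssd, "NN50": int(nn50), "pNN50": pnn50}

# hypothetical RR series drifting around 800 ms
rng = np.random.default_rng(0)
rr = 800 + np.cumsum(rng.normal(0, 15, 300)) * 0.1 + rng.normal(0, 30, 300)
print(hrv_time_domain(rr))
```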

  10. Complex network analysis of brain functional connectivity under a multi-step cognitive task

    Science.gov (United States)

    Cai, Shi-Min; Chen, Wei; Liu, Dong-Bai; Tang, Ming; Chen, Xun

    2017-01-01

    Functional brain networks have been widely studied to understand the relationship between brain organization and behavior. In this paper, we aim to explore the functional connectivity of the brain network under a multi-step cognitive task involving consecutive behaviors, and further understand the effect of behaviors on brain organization. The functional brain networks are constructed from a high spatial and temporal resolution fMRI dataset and analyzed via a complex-network-based approach. We find that at the voxel level the functional brain network shows robust small-worldness and scale-free characteristics, while its assortativity and rich-club organization are slightly restricted by the order of the behaviors performed. More interestingly, the functional connectivity of the brain network in activated ROIs strongly correlates with behaviors and is clearly restricted by the order of the behaviors performed. These empirical results suggest that brain organization has the generic properties of small-worldness and scale-free characteristics, and that its diverse functional connectivity emerging from activated ROIs is strongly driven by these behavioral activities via the plasticity of the brain.
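
    As a rough illustration of the small-world characterization used here, one can compare the clustering coefficient C and characteristic path length L of the measured network against a size- and density-matched random graph, forming sigma = (C/C_rand)/(L/L_rand). The sketch below uses networkx with a Watts-Strogatz graph standing in for a thresholded voxel-level correlation network (all parameters hypothetical):

```python
import networkx as nx

def avg_path_length(g):
    # use the largest connected component so the metric is defined
    giant = g.subgraph(max(nx.connected_components(g), key=len))
    return nx.average_shortest_path_length(giant)

# stand-in for a thresholded functional network
G = nx.watts_strogatz_graph(n=500, k=10, p=0.1, seed=0)
R = nx.gnm_random_graph(n=500, m=G.number_of_edges(), seed=0)

C, C_rand = nx.average_clustering(G), nx.average_clustering(R)
L, L_rand = avg_path_length(G), avg_path_length(R)
sigma = (C / C_rand) / (L / L_rand)  # sigma >> 1 suggests small-worldness
print(f"sigma = {sigma:.2f}")
```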

  11. Effectiveness of the random sequential absorption algorithm in the analysis of volume elements with nanoplatelets

    DEFF Research Database (Denmark)

    Pontefisso, Alessandro; Zappalorto, Michele; Quaresimin, Marino

    2016-01-01

    In this work, a study of the Random Sequential Absorption (RSA) algorithm in the generation of nanoplatelet Volume Elements (VEs) is carried out. The effect of the algorithm input parameters on the reinforcement distribution is studied through the implementation of statistical tools, showing that the platelet distribution is systematically affected by these parameters. The consequence is that a parametric analysis of the VE input parameters may be biased by hidden differences in the filler distribution. The same statistical tools used in the analysis are implemented in a modified RSA algorithm…
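
    For reference, random sequential adsorption in its simplest form places fillers one at a time at uniformly random positions, accepting a trial only if it does not overlap any previously placed filler. A minimal 2D disk sketch (the paper's VEs use nanoplatelets in 3D; all parameters here are hypothetical):

```python
import math
import random

def rsa_disks(box_size, radius, max_attempts=10000):
    """Random sequential adsorption: accept each trial position
    only if the new disk clears all disks placed so far."""
    centers = []
    for _ in range(max_attempts):
        x = random.uniform(radius, box_size - radius)
        y = random.uniform(radius, box_size - radius)
        if all(math.hypot(x - cx, y - cy) >= 2 * radius for cx, cy in centers):
            centers.append((x, y))
    return centers

fillers = rsa_disks(box_size=100.0, radius=2.0)
print(f"Placed {len(fillers)} non-overlapping disks")
```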

  12. A control-volume method for analysis of unsteady thrust augmenting ejector flows

    Science.gov (United States)

    Drummond, Colin K.

    1988-01-01

    A method for predicting transient thrust-augmenting ejector characteristics is presented. The analysis blends classic self-similar turbulent jet descriptions with a control-volume discretization of the mixing region to capture transient effects in a new way. Division of the ejector into an inlet, diffuser, and mixing region corresponds with the assumption that viscous-dominated phenomena occur in the latter. The inlet and diffuser analyses are simplified by a quasi-steady treatment, justified by the assumption that pressure is the forcing function in those regions. Details of the theoretical foundation, the solution algorithm, and sample calculations are given.

  13. A regional comparative analysis of empirical and theoretical flood peak-volume relationships

    Directory of Open Access Journals (Sweden)

    Szolgay Ján

    2016-12-01

    Full Text Available This paper analyses the bivariate relationship between flood peaks and corresponding flood event volumes modelled by empirical and theoretical copulas in a regional context, with a focus on flood generation processes in general, their regional differentiation, and the effect of the sample size on reliable discrimination among models. A total of 72 catchments in the North-West of Austria are analysed for the period 1976-2007. From the hourly runoff data set, 25 697 flood events were isolated and assigned to one of three flood process types: synoptic floods (including long- and short-rain floods), flash floods, or snowmelt floods (both rain-on-snow and snowmelt floods). The first step of the analysis examines whether the empirical peak-volume copulas of different flood process types are regionally statistically distinguishable, separately for each catchment, and what role the sample size plays in the strength of the statements. The results indicate that the empirical copulas of flash floods tend to be different from those of the synoptic and snowmelt floods. The second step examines how similar the empirical flood peak-volume copulas are between catchments for a given flood type across the region. Empirical copulas of synoptic floods are the least similar between the catchments; however, as the sample size decreases, the difference between the performances of the process types becomes small. The third step examines the goodness-of-fit of different commonly used copula types to the data samples that represent the annual maxima of flood peaks and the respective volumes, both regardless of flood generating processes (the traditional engineering approach) and considering the three process-based classes. Extreme-value copulas (Galambos, Gumbel and Hüsler-Reiss) show the best performance for both synoptic and flash floods, while the Frank copula shows the best performance for snowmelt floods. It is concluded that there is merit in treating flood
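
    One standard moment-based route to fitting the Gumbel copula named here goes through Kendall's tau, since for the Gumbel family tau = 1 - 1/theta and hence theta = 1/(1 - tau). A minimal sketch with synthetic peak-volume pairs (the data are hypothetical, not the Austrian series):

```python
import numpy as np
from scipy.stats import kendalltau

rng = np.random.default_rng(0)
# hypothetical flood peaks and volumes with positive dependence
peaks = rng.gamma(shape=2.0, scale=50.0, size=500)
volumes = peaks * rng.lognormal(mean=0.0, sigma=0.3, size=500)

tau, _ = kendalltau(peaks, volumes)
theta = 1.0 / (1.0 - tau)  # Gumbel copula: tau = 1 - 1/theta
print(f"Kendall's tau = {tau:.3f}, Gumbel theta = {theta:.3f}")
```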

  14. Linear and volume measurements of pulmonary nodules at different CT dose levels. Intrascan and interscan analysis

    Energy Technology Data Exchange (ETDEWEB)

    Hein, P.A.; Romano, V.C.; Rogalla, P.; Klessen, C.; Lembcke, A.; Bauknecht, H.C. [Charite-Universitaetsmedizin Berlin (Germany). Inst. fuer Radiologie]; Dicken, V.; Bornemann, L. [MeVis Research, Bremen (Germany)]

    2009-01-15

    Purpose: To compare the interobserver variability of unidimensional diameter and volume measurements of pulmonary nodules in an intrascan and interscan analysis using semi-automated segmentation software on ultra-low-dose computed tomography (ULD-CT) and standard-dose CT (SD-CT) data. Materials and Methods: In 33 patients with pulmonary nodules, two chest multi-slice CT (MSCT) datasets (1 mm slice thickness; 20% reconstruction overlap) had been consecutively acquired with an ultra-low-dose (120 kV, 5 mAs) and a standard-dose technique (120 kV, 75 mAs). The MSCT data were retrospectively analyzed using the segmentation software OncoTREAT (MeVis, Bremen, Germany, version 1.3). The volume of the 229 solid pulmonary nodules included in the analysis, as well as the largest diameter according to RECIST (Response Evaluation Criteria in Solid Tumors), was measured by two radiologists. Interobserver variability was calculated, and SD-CT and ULD-CT data were compared in an intrascan and interscan analysis. Results: The median nodule diameter (n = 229 nodules) was 8.2 mm (range: 2.8 to 43.6 mm, mean: 10.8 mm). The nodule volume ranged between 0.01 and 49.1 ml (median 0.1 ml, mean 1.5 ml). With respect to interobserver variability, the intrascan analysis did not reveal statistically significant differences (p > 0.05) between ULD-CT and SD-CT, with broader limits of agreement for relative differences of RECIST measurements (-31.0% to +27.0%, mean -2.0% for SD-CT; -27.0% to +38.6%, mean 5.8% for ULD-CT) than for volume measurements (-9.4% to +8.0%, mean 0.7% for SD-CT; -13% to +13%, mean 0.0% for ULD-CT). The interscan analysis showed broadened 95% confidence intervals for volume measurements (-26.5% to +29.1%, mean 1.3%, and -25.2% to +29.6%, mean 2.2%) but yielded comparable limits of agreement for RECIST measurements. Conclusion: The variability of nodule volumetry assessed by semi-automated segmentation software as well as nodule size determination by RECIST appears to be
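
    The limits of agreement quoted in such analyses are conventionally Bland-Altman style bounds, mean ± 1.96 SD of the paired relative differences. A minimal sketch, assuming relative differences are expressed against the pair mean (the paired readings below are hypothetical):

```python
import numpy as np

def limits_of_agreement(a, b):
    """Bland-Altman style limits for relative differences (%)
    between two observers' measurements of the same nodules."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    rel_diff = 200.0 * (a - b) / (a + b)  # % of the pair mean
    mean = rel_diff.mean()
    sd = rel_diff.std(ddof=1)
    return mean, mean - 1.96 * sd, mean + 1.96 * sd

# hypothetical paired volume readings (ml) from two readers
obs1 = [0.11, 0.52, 1.40, 3.10, 0.09]
obs2 = [0.10, 0.55, 1.35, 3.30, 0.10]
print(limits_of_agreement(obs1, obs2))  # (mean, lower, upper)
```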

  15. ABOUT THE SYSTEM ANALYSIS OF UNEMPLOYMENT OF YOUTH: GENERAL TASKS AND PRIVATE MODELS OF MARKET INDICATORS

    Directory of Open Access Journals (Sweden)

    Natalia V. Kontsevaya

    2016-01-01

    Full Text Available This paper attempts a systems approach to the analysis of the youth labor market: the place and role of the youth labor exchange are defined, opportunities and methods of state regulation are described, and contradictions in the analysis of the main market indicators are identified. Within the systems approach to the analysis of the dynamics of market processes, the modeling of the main labor market indicators at the regional scale is demonstrated. This approach can be useful for developing effective and economically sound mechanisms for youth employment, both at the level of regional employment services and at the national scale.

  16. The report of Task Group 100 of the AAPM: Application of risk analysis methods to radiation therapy quality management.

    Science.gov (United States)

    Huq, M Saiful; Fraass, Benedick A; Dunscombe, Peter B; Gibbons, John P; Ibbott, Geoffrey S; Mundt, Arno J; Mutic, Sasa; Palta, Jatinder R; Rath, Frank; Thomadsen, Bruce R; Williamson, Jeffrey F; Yorke, Ellen D

    2016-07-01

    The increasing complexity of modern radiation therapy planning and delivery challenges traditional prescriptive quality management (QM) methods, such as many of those included in guidelines published by organizations such as the AAPM, ASTRO, ACR, ESTRO, and IAEA. These prescriptive guidelines have traditionally focused on monitoring all aspects of the functional performance of radiotherapy (RT) equipment by comparing parameters against tolerances set at strict but achievable values. Many errors that occur in radiation oncology are not due to failures in devices and software; rather, they are failures in workflow and process. A systematic understanding of the likelihood and clinical impact of possible failures throughout a course of radiotherapy is needed to direct limited QM resources efficiently to produce maximum safety and quality of patient care. Task Group 100 of the AAPM has taken a broad view of these issues and has developed a framework for designing QM activities based on estimates of the probability of identified failures and their clinical outcome through the RT planning and delivery process. The Task Group has chosen a specific radiotherapy process required for "intensity modulated radiation therapy (IMRT)" as a case study. The goal of this work is to apply modern risk-based analysis techniques to this complex RT process in order to demonstrate to the RT community that such techniques may help identify more effective and efficient ways to enhance the safety and quality of our treatment processes. The task group generated by consensus an example quality management program strategy for the IMRT process performed at the institution of one of the authors. This report describes the methodology and nomenclature developed, presents the process maps, FMEAs, fault trees, and QM programs developed, and makes suggestions on how this information could be used in the clinic. The development and implementation of risk-assessment techniques will make radiation therapy
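
    The FMEA step named in this report conventionally scores each failure mode for occurrence (O), severity (S), and lack of detectability (D), typically on 1-10 scales, and ranks modes by the risk priority number RPN = O × S × D. A minimal sketch with hypothetical process steps and scores (not TG-100's actual tables):

```python
from dataclasses import dataclass

@dataclass
class FailureMode:
    step: str
    occurrence: int     # O: 1 (rare) .. 10 (frequent)
    severity: int       # S: 1 (negligible) .. 10 (catastrophic)
    detectability: int  # D: 1 (always caught) .. 10 (never caught)

    @property
    def rpn(self) -> int:
        return self.occurrence * self.severity * self.detectability

modes = [
    FailureMode("contour transfer", 4, 7, 6),
    FailureMode("MU calculation", 2, 9, 3),
    FailureMode("patient setup", 6, 5, 5),
]
# rank failure modes so QM effort goes to the highest-risk steps first
for fm in sorted(modes, key=lambda m: m.rpn, reverse=True):
    print(f"{fm.step:18s} RPN = {fm.rpn}")
```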

  17. PRESSURE-VOLUME ANALYSIS OF THE LUNG WITH AN EXPONENTIAL AND LINEAR-EXPONENTIAL MODEL IN ASTHMA AND COPD

    NARCIS (Netherlands)

    BOGAARD, JM; OVERBEEK, SE; VERBRAAK, AFM; VONS, C; FOLGERING, HTM; VANDERMARK, TW; ROOS, CM; STERK, PJ

    1995-01-01

    The prevalence of abnormalities in lung elasticity in patients with asthma or chronic obstructive pulmonary disease (COPD) is still unclear. This might be due to uncertainties concerning the method of analysis of quasistatic deflation lung pressure-volume curves. Pressure-volume curves were obtained

  18. Fast implementation of kernel simplex volume analysis based on modified Cholesky factorization for endmember extraction

    Institute of Scientific and Technical Information of China (English)

    Jing LI; Xiao-run LI; Li-jiao WANG; Liao-ying ZHAO

    2016-01-01

    Endmember extraction is a key step in the hyperspectral image analysis process. The kernel new simplex growing algorithm (KNSGA), recently developed as a nonlinear alternative to the simplex growing algorithm (SGA), has proven a promising endmember extraction technique. However, KNSGA still suffers from two issues limiting its application. First, its random initialization leads to inconsistency in final results; second, excessive computation is caused by the iterations of a simplex volume calculation. To solve the first issue, the spatial pixel purity index (SPPI) method is used in this study to extract the first endmember, eliminating the initialization dependence. A novel approach tackles the second issue by initially using a modified Cholesky factorization to decompose the volume matrix into triangular matrices, in order to avoid directly computing the determinant tautologically in the simplex volume formula. Theoretical analysis and experiments on both simulated and real spectral data demonstrate that the proposed algorithm significantly reduces computational complexity, and runs faster than the original algorithm.
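
    The speedup claimed here rests on a standard linear-algebra identity: for a simplex whose edge vectors have Gram matrix G = E Eᵀ with Cholesky factor L (G = L Lᵀ), det(G) is the squared product of the diagonal of L, so the simplex volume V = sqrt(det(G)) / n! can be read off the Cholesky diagonal without a separate determinant evaluation. A minimal sketch of that identity (not the KNSGA code itself; the function name is ours):

```python
from math import factorial

import numpy as np

def simplex_volume(vertices):
    """Volume of an n-simplex from its (n+1) vertices, via the
    Cholesky factor of the Gram matrix of edge vectors:
    V = sqrt(det(G)) / n!, with det(G) = prod(diag(L))**2."""
    v = np.asarray(vertices, float)
    edges = v[1:] - v[0]    # n edge vectors
    gram = edges @ edges.T  # n x n Gram matrix
    L = np.linalg.cholesky(gram)
    n = edges.shape[0]
    return np.prod(np.diag(L)) / factorial(n)

# unit 3-simplex (tetrahedron) -> volume 1/6
print(simplex_volume([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]]))
```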

  19. Final safety analysis report for the Galileo Mission: Volume 1, Reference design document

    Energy Technology Data Exchange (ETDEWEB)

    1988-05-01

    The Galileo mission uses nuclear power sources called Radioisotope Thermoelectric Generators (RTGs) to provide the spacecraft's primary electrical power. Because these generators contain nuclear material, a Safety Analysis Report (SAR) is required. A preliminary SAR and an updated SAR were previously issued that provided an evolving status report on the safety analysis. As a result of the Challenger accident, the launch dates for both the Galileo and Ulysses missions were rescheduled for November 1989 and October 1990, respectively. The decision was made by agreement between DOE and NASA to have a revised safety evaluation and report (FSAR) prepared on the basis of these revised vehicle accidents and environments. The results of this latest revised safety evaluation are presented in this document (Galileo FSAR). Volume I, this document, provides the background design information required to understand the analyses presented in Volumes II and III. It contains descriptions of the RTGs, the Galileo spacecraft, the Space Shuttle, the Inertial Upper Stage (IUS), the trajectory and flight characteristics including flight contingency modes, and the launch site. Two appendices in Volume I provide detailed material properties for the RTG.

  20. An ERP analysis of recognition and categorization decisions in a prototype-distortion task.

    Directory of Open Access Journals (Sweden)

    Richard J Tunney

    Full Text Available BACKGROUND: Theories of categorization make different predictions about the underlying processes used to represent categories. Episodic theories suggest that categories are represented in memory by storing previously encountered exemplars in memory. Prototype theories suggest that categories are represented in the form of a prototype independently of memory. A number of studies that show dissociations between categorization and recognition are often cited as evidence for the prototype account. These dissociations have compared recognition judgements made to one set of items to categorization judgements to a different set of items making a clear interpretation difficult. Instead of using different stimuli for different tests this experiment compares the processes by which participants make decisions about category membership in a prototype-distortion task and with recognition decisions about the same set of stimuli by examining the Event Related Potentials (ERPs associated with them. METHOD: Sixty-three participants were asked to make categorization or recognition decisions about stimuli that either formed an artificial category or that were category non-members. We examined the ERP components associated with both kinds of decision for pre-exposed and control participants. CONCLUSION: In contrast to studies using different items we observed no behavioural differences between the two kinds of decision; participants were equally able to distinguish category members from non-members, regardless of whether they were performing a recognition or categorisation judgement. Interestingly, this did not interact with prior-exposure. However, the ERP data demonstrated that the early visual evoked response that discriminated category members from non-members was modulated by which judgement participants performed and whether they had been pre-exposed to category members. We conclude from this that any differences between categorization and recognition reflect

  1. Effects of Physical Exercise Interventions on Gait-Related Dual-Task Interference in Older Adults: A Systematic Review and Meta-Analysis.

    Science.gov (United States)

    Plummer, Prudence; Zukowski, Lisa A; Giuliani, Carol; Hall, Amber M; Zurakowski, David

    2015-01-01

    Dual-task interference during walking can substantially limit mobility and increase the risk of falls among community-dwelling older adults. Previous systematic reviews examining intervention effects on dual-task gait and mobility have not assessed relative dual-task costs (DTC) or investigated whether there are differences in treatment-related changes based on the type of dual task or the type of control group. The purpose of this systematic review was to examine the effects of physical exercise interventions on dual-task performance during walking in older adults. A meta-analysis of randomized controlled trials (RCTs) compared treatment effects between physical exercise intervention and control groups on single- and dual-task gait speed and relative DTC on gait speed. A systematic search of the literature was conducted using the electronic databases PubMed, CINAHL, EMBASE, Web of Science, and PsycINFO, searched up to September 19, 2014. Randomized, nonrandomized, and uncontrolled studies published in English and involving older adults were selected. Studies had to include a physical exercise intervention protocol and measure gait parameters during continuous, unobstructed walking in single- and dual-task conditions before and after the intervention. Of 614 abstracts, 21 studies met the inclusion criteria and were included in the systematic review. Fourteen RCTs were included in the meta-analysis. The mean difference between the intervention and control groups significantly favored the intervention for single-task gait speed (mean difference: 0.06 m/s, 95% CI: 0.03, 0.10), dual-task gait speed (mean difference: 0.11 m/s, 95% CI: 0.07, 0.15), and relative DTC on gait speed (mean difference: 5.23%, 95% CI: 1.40, 9.05, p = 0.007). Evidence from subgroup comparisons showed no difference in treatment-related changes between cognitive-motor and motor-motor dual tasks, or when interventions were compared to active or inactive controls. In summary, physical exercise interventions can improve dual-task
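
    The relative DTC on gait speed analyzed here is conventionally computed as the percentage loss of single-task speed under the dual task. A minimal sketch (the speeds are hypothetical):

```python
def dual_task_cost(single_speed, dual_speed):
    """Relative dual-task cost (%) on gait speed: positive values
    mean walking slows under the dual task."""
    return 100.0 * (single_speed - dual_speed) / single_speed

# hypothetical pre- and post-intervention gait speeds (m/s)
print(dual_task_cost(1.20, 1.02))  # pre:  15.0 % cost
print(dual_task_cost(1.26, 1.13))  # post: ~10.3 % cost
```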

  2. Does bioimpedance analysis or measurement of natriuretic peptides aid volume assessment in peritoneal dialysis patients?

    Science.gov (United States)

    Davenport, Andrew

    2013-01-01

    Cardiovascular mortality remains the commonest cause of death for peritoneal dialysis patients. As such, preventing persistent hypervolemia is important. On the other hand, hypovolemia may potentially risk episodes of acute kidney injury and loss of residual renal function, a major determinant of peritoneal dialysis technique survival. Bioimpedance has developed from a single-frequency research tool to a multi-frequency bioelectrical impedance analysis readily available in the clinic and capable of measuring extracellular, intracellular, and total body water. Similarly, natriuretic peptides released from the heart because of myocardial stretch and increased intracardiac volume have also been variously reported to be helpful in assessing volume status in peritoneal dialysis patients. The question then arises whether these newer technologies and biomarkers have supplanted the time-honored clinical assessment of hydration status or whether they are merely adjuncts that aid the experienced clinician.

  3. Georgetown University Integrated Community Energy System (GU-ICES). Phase III, Stage I. Feasibility analysis. Final report. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    None

    1980-10-01

    This Feasibility Analysis covers a wide range of studies and evaluations. The Report is divided into five parts. Section 1 contains all material relating to the Institutional Assessment including consideration of the requirements and position of the Potomac Electric Co. as they relate to cogeneration at Georgetown in parallel with the utility (Task 1). Sections 2 through 7 contain all technical information relating to the Alternative Subsystems Analysis (Task 4). This includes the energy demand profiles upon which the evaluations were based (Task 3). It further includes the results of the Life-Cycle-Cost Analyses (Task 5) which are developed in detail in the Appendix for evaluation in the Technical Report. Also included is the material relating to Incremental Savings and Optimization (Task 6) and the Conceptual Design for candidate alternate subsystems (Task 7). Section 8 contains all material relating to the Environmental Impact Assessment (Task 2). The Appendix contains supplementary material including the budget cost estimates used in the life-cycle-cost analyses, the basic assumptions upon which the life-cycle analyses were developed, and the detailed life-cycle-cost analysis for each subsystem considered in detail.

  4. A Meta-Analysis of the Wisconsin Card Sort Task in Autism

    Science.gov (United States)

    Landry, Oriane; Al-Taie, Shems

    2016-01-01

    We conducted a meta-analysis of 31 studies, spanning 30 years, utilizing the WCST in participants with autism. We calculated Cohen's d effect sizes for four measures of performance: sets completed, perseveration, failure-to-maintain-set, and non-perseverative errors. The average weighted effect size ranged from 0.30 to 0.74 for each measure, all…

  5. Cognitive Task Analysis of Experts in Designing Multimedia Learning Object Guideline (M-LOG)

    Science.gov (United States)

    Razak, Rafiza Abdul; Palanisamy, Punithavathy

    2013-01-01

    The purpose of this study was to design and develop a set of guidelines for multimedia learning objects to inform instructional designers (IDs) about the procedures involved in the process of content analysis. This study was motivated by the absence of standardized procedures in the beginning phase of the multimedia learning object design which is…

  6. Impaired executive control and reward circuit in Internet gaming addicts under a delay discounting task: independent component analysis.

    Science.gov (United States)

    Wang, Yifan; Wu, Lingdan; Zhou, Hongli; Lin, Xiao; Zhang, Yifen; Du, Xiaoxia; Dong, Guangheng

    2017-04-01

    This study utilized independent component analysis to explore abnormal functional connectivity (FC) in male participants with Internet gaming disorder (IGD). Functional magnetic resonance imaging and behavioral data were collected from 21 healthy controls (HC) and 18 IGD patients while they were performing a delay discounting task. Behavioral results revealed that the IGD patients showed higher delay discounting rates than HC. Two networks were found to be associated with IGD: (1) the executive control network containing the anterior cingulate cortex and the medial and superior frontal gyrus, and (2) the basal ganglia network containing the lentiform nucleus. Compared with HC, the IGD group exhibited stronger FC when selecting "small and now" options. In addition, the delay discounting rates were positively correlated with the modulation of the two networks and with the reaction time. The results suggest that IGD patients have enhanced sensitivity to reward and a decreased ability to control their impulsivity effectively, which leads to myopic decision making.
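
    As a rough sketch of the decomposition step (not the study's fMRI pipeline), FastICA recovers statistically independent time courses from linear mixtures; here hypothetical synthetic sources stand in for task-related and noise components:

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.linspace(0, 8, 2000)
# two hypothetical source time courses (e.g., task and noise)
s1 = np.sin(2 * np.pi * 0.8 * t)
s2 = np.sign(np.sin(2 * np.pi * 0.3 * t))
S = np.c_[s1, s2] + 0.1 * rng.standard_normal((t.size, 2))

A = np.array([[1.0, 0.5], [0.4, 1.0]])  # mixing matrix
X = S @ A.T                             # observed mixtures

ica = FastICA(n_components=2, random_state=0)
components = ica.fit_transform(X)       # recovered independent time courses
print(components.shape)                 # (2000, 2)
```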

  7. JV Task 99-Integrated Risk Analysis and Contaminant Reduction, Watford City, North Dakota

    Energy Technology Data Exchange (ETDEWEB)

    Jaroslav Solc; Barry W. Botnen

    2007-05-31

    The Energy & Environmental Research Center (EERC) conducted a limited site investigation and risk analyses for hydrocarbon-contaminated soils and groundwater at a Construction Services, Inc., site in Watford City, North Dakota. Site investigation confirmed the presence of free product and high concentrations of residual gasoline-based contaminants in several wells, the presence of 1,2-dichloroethane, and extremely high levels of electrical conductivity indicative of brine residuals in the tank area south of the facility. The risk analysis was based on compilation of information from the site-specific geotechnical investigation, including multiphase extraction pilot test, laser induced fluorescence probing, evaluation of contaminant properties, receptor survey, capture zone analysis and evaluation of well head protection area for municipal well field. The project results indicate that the risks associated with contaminant occurrence at the Construction Services, Inc. site are low and, under current conditions, there is no direct or indirect exposure pathway between the contaminated groundwater and soils and potential receptors.

  8. Improved Duct Systems Task Report with StageGate 2 Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Moyer, Neil [Florida Solar Energy Center, Cocoa, FL (United States)]; Stroer, Dennis [Calcs-Plus, Venice, FL (United States)]

    2007-12-31

    This report is about Building America Industrialized Housing Partnership's work with two industry partners, Davalier Homes and Southern Energy Homes, in constructing and evaluating prototype interior duct systems. Issues of energy performance, comfort, DAPIA approval, manufacturability, and cost are addressed. A Stage Gate 2 analysis addresses the current status of the project, showing that refinements are still needed in the process of incorporating all of the ducts within the air and thermal boundaries of the envelope.

  9. Guidelines on routine cerebrospinal fluid analysis. Report from an EFNS task force

    DEFF Research Database (Denmark)

    Deisenhammer, F; Bartos, A; Egg, R

    2006-01-01

    A great variety of neurological diseases require investigation of cerebrospinal fluid (CSF) to prove the diagnosis or to rule out relevant differential diagnoses. The objectives were to evaluate the theoretical background and provide guidelines for clinical use in routine CSF analysis including… be evaluated whenever pleocytosis is found or leptomeningeal metastases or pathological bleeding is suspected. Computed tomography-negative intrathecal bleeding should be investigated by bilirubin detection.

  10. Reliability Information Analysis Center 1st Quarter 2007, Technical Area Task (TAT) Report

    Science.gov (United States)

    2007-02-05

    Activities reported for the quarter included an FY07 planning conference (14 Dec 06), a II Marine Expeditionary Force (MEF) meeting with Major Smith (14 Dec 06), and Gulf of Mexico Tyndall Air Force Base missile… The action item spreadsheet was restructured, and the following storyboards were reviewed (functional flow, graphics, and text): 050101 Main Rotor System components.

  11. Fatigue monitoring and analysis of orthotropic steel deck considering traffic volume and ambient temperature

    Institute of Scientific and Technical Information of China (English)

    SONG YongSheng; DING YouLiang

    2013-01-01

    Fatigue has gradually become a serious issue for the orthotropic steel decks used in long-span bridges. Two fatigue effects, namely the number of stress cycles and the equivalent stress amplitude, were introduced as the investigated parameters in this paper. The investigation focused on their relationships with traffic volume and ambient temperature, using 7 months of fatigue monitoring data from an actual bridge. A fatigue analytical model considering temperature-induced changes in the material properties of the asphalt pavement was established to verify these relationships. The analysis results revealed that the number of stress cycles and the equivalent stress amplitude show a linear correlation with traffic volume and ambient temperature, respectively, and that the rib-to-deck welded joint is much more sensitive to traffic volume and ambient temperature than the rib-to-rib welded joint. The applicability of the code-recommended model for fatigue vehicle loading was also discussed, revealing that the deterministic vehicle loading model requires improvement to account for the significant randomness of actual traffic conditions.
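
    The equivalent stress amplitude used as a fatigue effect here is commonly defined through Miner's rule as a damage-equivalent constant amplitude; the record itself does not give the formula, so the sketch below is an assumption. The S-N slope m = 3 is the usual value for welded steel details, and the stress-range histogram is hypothetical:

```python
import numpy as np

def equivalent_stress_amplitude(amplitudes, counts, m=3.0):
    """Damage-equivalent constant stress amplitude under Miner's rule:
    S_eq = (sum(n_i * S_i**m) / sum(n_i)) ** (1/m)."""
    s = np.asarray(amplitudes, float)
    n = np.asarray(counts, float)
    return (np.sum(n * s ** m) / np.sum(n)) ** (1.0 / m)

# hypothetical daily stress-range histogram from rainflow counting (MPa)
print(equivalent_stress_amplitude([10, 20, 40], [5000, 800, 50]))
```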

  12. Coupled Structural, Thermal, Phase-change and Electromagnetic Analysis for Superconductors, Volume 2

    Science.gov (United States)

    Felippa, C. A.; Farhat, C.; Park, K. C.; Militello, C.; Schuler, J. J.

    1996-01-01

    Described are the theoretical development and computer implementation of reliable and efficient methods for the analysis of coupled mechanical problems that involve the interaction of mechanical, thermal, phase-change and electromagnetic subproblems. The focus application has been the modeling of superconductivity and associated quantum-state phase-change phenomena. In support of this objective the work has addressed the following issues: (1) development of variational principles for finite elements, (2) finite element modeling of the electromagnetic problem, (3) coupling of thermal and mechanical effects, and (4) computer implementation and solution of the superconductivity transition problem. The main accomplishments have been: (1) the development of the theory of parametrized and gauged variational principles, (2) the application of those principles to the construction of electromagnetic, thermal and mechanical finite elements, (3) the coupling of electromagnetic finite elements with thermal and superconducting effects, and (4) the first detailed finite element simulations of bulk superconductors, in particular the Meissner effect and the nature of the normal conducting boundary layer. The theoretical development is described in two volumes. Volume 1 describes mostly formulation-specific problems. Volume 2 describes generalizations of those formulations.

  13. Update of Part 61 Impacts Analysis Methodology. Methodology report. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Oztunali, O.I.; Roles, G.W.

    1986-01-01

    Under contract to the US Nuclear Regulatory Commission, the Envirosphere Company has expanded and updated the impacts analysis methodology used during the development of the 10 CFR Part 61 rule to allow improved consideration of the costs and impacts of treatment and disposal of low-level waste that is close to or exceeds Class C concentrations. The modifications described in this report principally include: (1) an update of the low-level radioactive waste source term, (2) consideration of additional alternative disposal technologies, (3) expansion of the methodology used to calculate disposal costs, (4) consideration of an additional exposure pathway involving direct human contact with disposed waste due to a hypothetical drilling scenario, and (5) use of updated health physics analysis procedures (ICRP-30). Volume 1 of this report describes the calculational algorithms of the updated analysis methodology.

  14. Flight Technical Error Analysis of the SATS Higher Volume Operations Simulation and Flight Experiments

    Science.gov (United States)

    Williams, Daniel M.; Consiglio, Maria C.; Murdoch, Jennifer L.; Adams, Catherine H.

    2005-01-01

    This paper provides an analysis of Flight Technical Error (FTE) from recent SATS experiments, called the Higher Volume Operations (HVO) Simulation and Flight experiments, which NASA conducted to determine pilot acceptability of the HVO concept for normal operating conditions. Reported are FTE results from simulation and flight experiment data indicating the SATS HVO concept is viable and acceptable to low-time instrument-rated pilots when compared with today's system (baseline). Described is the comparative FTE analysis of lateral, vertical, and airspeed deviations from the baseline and SATS HVO experimental flight procedures. Based on the FTE analysis, all evaluation subjects, low-time instrument-rated pilots, flew the HVO procedures safely and proficiently in comparison to today's system. In all cases, the results of the flight experiment validated the results of the simulation experiment and confirm the utility of the simulation platform for comparative Human-in-the-Loop (HITL) studies of SATS HVO and baseline operations.

  15. Low-volume multiplexed proteolytic activity assay and inhibitor analysis through a pico-injector array.

    Science.gov (United States)

    Ng, Ee Xien; Miller, Miles A; Jing, Tengyang; Lauffenburger, Doug A; Chen, Chia-Hung

    2015-02-21

    Secreted active proteases, from families of enzymes such as matrix metalloproteinases (MMPs) and ADAMs (a disintegrin and metalloproteinases), participate in diverse pathological processes. To simultaneously measure multiple specific protease activities, a series of parallel enzyme reactions combined with a series of inhibitor analyses for proteolytic activity matrix analysis (PrAMA) are essential but limited due to the sample quantity requirements and the complexity of performing multiple reactions. To address these issues, we developed a pico-injector array to generate 72 different reactions in picoliter-volume droplets by controlling the sequence of combinational injections, which allowed simultaneous recording of a wide range of multiple enzyme reactions and measurement of inhibitor effects using small sample volumes (~10 μL). Multiple MMP activities were simultaneously determined by 9 different substrates and 2 inhibitors using injections from a pico-injector array. Due to the advantages of inhibitor analysis, the MMP/ADAM activities of MDA-MB-231, a breast cancer cell line, were characterized with high MMP-2, MMP-3 and ADAM-10 activity. This platform could be customized for a wide range of applications that also require multiple reactions with inhibitor analysis to enhance the sensitivity by encapsulating different chemical sensors.

  16. HTGR accident initiation and progression analysis status report. Volume V. AIPA fission product source terms

    Energy Technology Data Exchange (ETDEWEB)

    Alberstein, D.; Apperson, C.E. Jr.; Hanson, D.L.; Myers, B.F.; Pfeiffer, W.W.

    1976-02-01

    The primary objective of the Accident Initiation and Progression Analysis (AIPA) Program is to provide guidance for high-temperature gas-cooled reactor (HTGR) safety research and development. Among the parameters considered in estimating the uncertainties in site boundary doses are uncertainties in fission product source terms generated under normal operating conditions, i.e., fuel body inventories, circulating coolant activity, total plateout activity in the primary circuit, and plateout distributions. The present volume documents the analyses of these source term uncertainties. The results are used for the detailed consequence evaluations, and they provide the basis for evaluation of fission products important for HTGR maintenance and shielding.

  17. Do skeletal cephalometric characteristics correlate with condylar volume, surface and shape? A 3D analysis

    OpenAIRE

    Saccucci Matteo; Polimeni Antonella; Festa Felice; Tecco Simona

    2012-01-01

    Abstract. Objective: The purpose of this study was to determine the condylar volume in subjects with different mandibular divergence and skeletal class using cone-beam computed tomography (CBCT) and analysis software. Materials and methods: For 94 patients (46 females and 48 males; mean age 24.3 ± 6.5 years), resultant rendering reconstructions of the left and right temporomandibular joints (TMJs) were obtained. Subjects were then classified on the basis of the ANB angle and the GoGn-SN angle in three …

  18. Study of the free volume fraction in polylactic acid (PLA) by thermal analysis

    Science.gov (United States)

    Abdallah, A.; Benrekaa, N.

    2015-10-01

    Poly(lactic acid), or polylactide (PLA), is a biodegradable polymer with high modulus, strength, and thermoplastic properties. In this work, the evolution of various properties of PLA, such as the glass transition temperature, mechanical moduli, and elongation percentage, is studied with the aim of investigating the free volume fraction. To do so, two thermal techniques have been used: dynamic mechanical analysis (DMA) and dilatometry. The results obtained by these techniques are combined to recover the structural properties of the studied material.

  19. VOLUME STUDY WITH HIGH DENSITY OF PARTICLES BASED ON CONTOUR AND CORRELATION IMAGE ANALYSIS

    Directory of Open Access Journals (Sweden)

    Tatyana Yu. Nikolaeva

    2014-11-01

    Full Text Available The subject of this study is techniques for evaluating particle statistics, in particular, methods for processing particle images obtained under coherent illumination. This paper considers the problem of recognizing and statistically accounting for individual images of small scattering particles in an arbitrary section of the volume in the case of high concentrations. For automatic recognition of focused particle images, a special algorithm for statistical analysis based on contouring and thresholding was used. By means of the mathematical formalism of scalar diffraction theory, coherent images of the particles formed by an optical system with high numerical aperture were simulated. Numerical testing of the proposed method for the cases of different concentrations and distributions of particles in the volume was performed. As a result, distributions of the density and mass fraction of the particles were obtained, and the efficiency of the method at different concentrations of particles was evaluated. At high concentrations, the effect of coherent superposition of particles from adjacent planes strengthens, which makes it difficult to recognize particle images using the algorithm considered in the paper. In this case, we propose to supplement the method by calculating the cross-correlation function of particle images from adjacent segments of the volume and evaluating the ratio between the height of the correlation peak and the height of the function pedestal for different distribution characters. The method of statistical accounting of particles considered in this paper is of practical importance in the study of volumes with particles of different nature, for example, in problems of biology and oceanography. Effective work in the regime of high concentrations expands the limits of applicability of these methods for practically important cases and helps to optimize determination time of the distribution character and
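
    As a sketch of the contouring-and-thresholding step (not the authors' implementation), a global threshold followed by connected-component labeling yields the count and size of focused particle images; the image below is a hypothetical stand-in for a coherently illuminated frame:

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(1)
image = rng.random((256, 256))   # stand-in for a coherent particle image
image[100:110, 50:60] += 2.0     # two bright "particles"
image[200:206, 180:186] += 2.0

binary = image > 1.5             # global threshold
labels, n_particles = ndimage.label(binary)
sizes = ndimage.sum(binary, labels, range(1, n_particles + 1))
print(n_particles, sizes)        # particle count and the area of each
```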

  20. Ocean thermal energy conversion cold water pipe preliminary design project. Task 2. Analysis for concept selection

    Energy Technology Data Exchange (ETDEWEB)

    None

    1979-04-01

    The successful performance of the CWP is of crucial importance to the overall OTEC system; the pipe itself is considered the most critical part of the entire operation. Because of the importance of the CWP, a project for the analysis and design of CWPs was begun in the fall of 1978. The goals of this project were to study a variety of concepts for delivering cold water to an OTEC plant, to analyze and rank these concepts based on their relative cost and risk, and to develop preliminary designs for those concepts which seemed most promising. Two representative platforms and sites were chosen: a spar buoy of a Gibbs and Cox design to be moored at a site off Punta Tuna, Puerto Rico, and a barge designed by APL/Johns Hopkins University, grazing about a site approximately 200 miles east of the coast of Brazil. The approach was to concentrate on the most promising concepts and on those which were either of general interest or espoused by others (e.g., steel and concrete concepts). Much of the overall attention, therefore, focused on analyzing rigid and compliant wall designs, while stockade (except for the special case of the FRP stockade) and bottom-mounted concepts received less attention. A total of 67 CWP concepts were initially generated and subjected to a screening process. Of these, 16 were carried through design analysis, costing, and ranking. Study results are presented in detail. (WHK)

  1. Image structural analysis in the tasks of automatic navigation of unmanned vehicles and inspection of Earth surface

    Science.gov (United States)

    Lutsiv, Vadim; Malyshev, Igor

    2013-10-01

    The automatic analysis of terrain images has been an urgent problem for several decades. On the one hand, such analysis is a basis for the automatic navigation of unmanned vehicles. On the other hand, the amount of information transferred to the Earth by modern video sensors keeps increasing, so preliminary classification of such data by an onboard computer becomes urgent as well. We developed an object-independent approach to the structural analysis of images. While creating the methods of image structural description, we did our best to abstract away from the particular peculiarities of scenes. Only the most general limitations were taken into account, derived from the laws of organization of the observable environment and from the properties of image formation systems. The practical application of this theoretical approach enables reliable matching of aerospace photographs acquired from differing aspect angles, at different times of day and in different seasons, by sensors of differing types. The aerospace photographs can even be matched with geographic maps. The developed approach enabled solving the tasks of automatic navigation of unmanned vehicles. Signs of changes and catastrophes can be detected by matching and comparing aerospace photographs acquired at different times. We present the theoretical proofs of the chosen strategy of structural description and matching of images. Several examples of matching acquired images with template pictures and maps of terrain are shown within the frameworks of navigation of unmanned vehicles and detection of signs of disasters.

  2. Vigilance task-related change in brain functional connectivity as revealed by wavelet phase coherence analysis of near-infrared spectroscopy signals

    Directory of Open Access Journals (Sweden)

    Wang Wei

    2016-08-01

    Full Text Available This study aims to assess vigilance task-related changes in connectivity in healthy adults using wavelet phase coherence (WPCO) analysis of near-infrared spectroscopy (NIRS) signals. NIRS is a non-invasive neuroimaging technique for assessing brain activity. Continuous recordings of the NIRS signals were obtained from the prefrontal cortex (PFC) and sensorimotor cortical areas of 20 young healthy adults (24.9±3.3 years) during a 10-min resting state and a 20-min vigilance task state. The vigilance task was used to simulate driving mental load by judging three random numbers (i.e., whether they were odd numbers). The task was divided into two sessions: the first 10 minutes (Task t1) and the second 10 minutes (Task t2). The WPCO of six channel pairs were calculated in five frequency intervals: 0.6–2 Hz (I), 0.145–0.6 Hz (II), 0.052–0.145 Hz (III), 0.021–0.052 Hz (IV), and 0.0095–0.021 Hz (V). The significant WPCO formed global connectivity (GC) maps in intervals I and II and functional connectivity (FC) maps in intervals III to V. Results show that the GC levels in interval I and the FC levels in interval III were significantly lower in Task t2 than in the resting state (p < 0.05), particularly between the left PFC and bilateral sensorimotor regions. Also, the reaction time shows an increase in Task t2 compared with that in Task t1. However, no significant difference in WPCO was found between Task t1 and the resting state. The results showed that the change in coherence in the 0.6–2 Hz range was not attributed to the vigilance task per se, but to the interaction effect of vigilance task and time factors. The findings suggest that the decreased attention level might be partly attributed to the reduced GC levels between the left prefrontal region and the sensorimotor area. The present results provide a new insight into vigilance task-related brain activity.
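
    At a single frequency, wavelet phase coherence can be written |⟨exp(i(φ₁−φ₂))⟩|, where φ₁ and φ₂ are the instantaneous wavelet phases of the two signals; it approaches 1 when the phase difference is stable over time. A simplified single-frequency sketch using a complex Morlet wavelet (the sampling rate, frequency, and signals are hypothetical, not the study's):

```python
import numpy as np

def morlet_phase(signal, fs, freq, width=6.0):
    """Instantaneous phase at one frequency via convolution
    with a complex Morlet wavelet."""
    t = np.arange(-width / freq, width / freq, 1.0 / fs)
    sigma = width / (2 * np.pi * freq)
    wavelet = np.exp(2j * np.pi * freq * t) * np.exp(-t**2 / (2 * sigma**2))
    analytic = np.convolve(signal, wavelet, mode="same")
    return np.angle(analytic)

def wpco(x, y, fs, freq):
    """Wavelet phase coherence |<exp(i(phi_x - phi_y))>|, in [0, 1]."""
    dphi = morlet_phase(x, fs, freq) - morlet_phase(y, fs, freq)
    return np.abs(np.mean(np.exp(1j * dphi)))

fs = 10.0  # hypothetical NIRS sampling rate (Hz)
t = np.arange(0, 600, 1 / fs)
rng = np.random.default_rng(0)
x = np.sin(2 * np.pi * 0.03 * t) + 0.5 * rng.standard_normal(t.size)
y = np.sin(2 * np.pi * 0.03 * t + 0.8) + 0.5 * rng.standard_normal(t.size)
print(wpco(x, y, fs, freq=0.03))  # near 1 for phase-locked signals
```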

  3. The 1999-2000 ACC task analysis of nurse-midwifery/midwifery practice: a consideration of the concept of professional issues.

    Science.gov (United States)

    Johnson, P G; Oshio, S; Fisher, M C; Fullerton, J T

    2001-01-01

    The American College of Nurse-Midwives (ACNM) Certification Council periodically conducts a task analysis study as evidence supporting the content validity of the national certification examination in nurse-midwifery and midwifery. The purpose of this article is to report findings related to the examination of the relationship between professional issues and safe beginning-level midwifery as measured by the 1999-2000 Task Analysis of American Nurse Midwifery and Midwifery Practice. Study findings suggest that newly certified midwives place strong emphasis on the importance of tasks related to the ACNM "Hallmarks of Midwifery," which characterize the art and science of the profession: these include tasks dealing with health promotion and cultural competency. The beginning midwives, however, gave consistently low ratings to tasks related to ACNM "Core Competencies" that mirror the professional responsibilities of midwives; these include tasks related to the history of midwifery, research, or health policy. The study has implications for nurse-midwifery/midwifery educators, experienced midwifery mentors, and other persons interested in reinforcing the relevance of these important professional issues to the new midwife.

  4. Principal components analysis of reward prediction errors in a reinforcement learning task.

    Science.gov (United States)

    Sambrook, Thomas D; Goslin, Jeremy

    2016-01-01

    Models of reinforcement learning represent reward and punishment in terms of reward prediction errors (RPEs), quantitative signed terms describing the degree to which outcomes are better than expected (positive RPEs) or worse (negative RPEs). An electrophysiological component known as the feedback-related negativity (FRN) occurs at frontocentral sites 240-340 ms after feedback on whether a reward or punishment is obtained, and has been claimed to neurally encode an RPE. An outstanding question, however, is whether the FRN is sensitive to the size of both positive RPEs and negative RPEs. Previous attempts to answer this question have examined the simple effects of RPE size for positive RPEs and negative RPEs separately. However, this methodology can be compromised by overlap from components coding for unsigned prediction error size, or "salience", which are sensitive to the absolute size of a prediction error but not its valence. In our study, positive and negative RPEs were parametrically modulated using both reward likelihood and magnitude, with principal components analysis used to separate out overlying components. This revealed a single RPE-encoding component responsive to the size of positive RPEs, peaking at ~330 ms and occupying the delta frequency band. Other components responsive to unsigned prediction error size were shown, but no component sensitive to negative RPE size was found.
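
    A minimal sketch of the separation step described here: treat the trials × timepoints ERP epochs as a data matrix, let PCA pull out an overlying component, and relate its per-trial score to RPE size. The simulated FRN-like waveform and all parameters below are hypothetical, not the study's data:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
n_trials, n_times = 200, 300              # hypothetical ERP epochs
time = np.linspace(0, 0.6, n_times)       # 0-600 ms after feedback
frn = np.exp(-((time - 0.33) ** 2) / (2 * 0.03 ** 2))  # ~330 ms bump

rpe_size = rng.uniform(0, 1, n_trials)    # parametric RPE modulation
epochs = np.outer(rpe_size, frn) + 0.3 * rng.standard_normal((n_trials, n_times))

pca = PCA(n_components=5)
scores = pca.fit_transform(epochs)        # per-trial component amplitudes
peak = time[np.argmax(np.abs(pca.components_[0]))]
print(f"first component peaks near {peak * 1000:.0f} ms")
print(abs(np.corrcoef(scores[:, 0], rpe_size)[0, 1]))  # sensitivity to RPE size
```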

  5. Composite materials. Volume 3 - Engineering applications of composites. Volume 4 - Metallic matrix composites. Volume 8 - Structural design and analysis, Part 2

    Science.gov (United States)

    Noton, B. R. (Editor); Kreider, K. G.; Chamis, C. C.

    1974-01-01

    This volume discusses a vaety of applications of both low- and high-cost composite materials in a number of selected engineering fields. The text stresses the use of fiber-reinforced composites, along with interesting material systems used in the electrical and nuclear industries. As to technology transfer, a similarity is noted between many of the reasons responsible for the utilization of composites and those problems requiring urgent solution, such as mechanized fabrication processes and design for production. Features topics include road transportation, rail transportation, civil aircraft, space vehicles, builing industry, chemical plants, and appliances and equipment. The laminate orientation code devised by Air Force materials laboratory is included. Individual items are announced in this issue.

  6. Dissociated multi-unit activity and local field potentials: a theory inspired analysis of a motor decision task.

    Science.gov (United States)

    Mattia, Maurizio; Ferraina, Stefano; Del Giudice, Paolo

    2010-09-01

    Local field potentials (LFP) and multi-unit activity (MUA) recorded in vivo are known to convey different information about the underlying neural activity. Here we extend and support the idea that single-electrode LFP-MUA task-related modulations can shed light on the underlying large-scale, multi-modular neural dynamics. We first illustrate a theoretical scheme and associated simulation evidence, proposing that in a multi-modular neural architecture local and distributed dynamic properties can be extracted from the local spiking activity of one pool of neurons in the network. From this new perspective, the spectral features of the field potentials reflect the time structure of the ongoing fluctuations of the probed local neuronal pool over a wide frequency range. We then report results obtained recording from the dorsal premotor (PMd) cortex of monkeys performing a countermanding task, in which a reaching movement is performed unless a visual stop signal is presented. We find that the LFP and MUA spectral components over a wide frequency band (3-2000 Hz) are very differently modulated in time for successful reaching, successful stop, and wrong stop trials, suggesting an interplay of local and distributed components of the underlying neural activity in different periods of the trials and for different behavioural outcomes. In addition, the MUA spectral power is shown to possess a time-dependent structure, which we suggest could help in understanding the successive involvement of different local neuronal populations. Finally, we compare signals recorded from PMd and dorsolateral prefrontal (PFCd) cortex in the same experiment, and speculate that the comparative time-dependent spectral analysis of LFP and MUA can help reveal patterns of functional connectivity in the brain.

  7. Network analysis of returns and volume trading in stock markets: The Euro Stoxx case

    Science.gov (United States)

    Brida, Juan Gabriel; Matesanz, David; Seijas, Maria Nela

    2016-02-01

    This study applies network analysis to analyze the structure of the Euro Stoxx market during the long period from 2002 up to 2014. The paper generalizes previous research on stock market networks by including asset returns and trading volume as the main variables used to study the financial market. A multidimensional generalization of the minimal spanning tree (MST) concept is introduced by adding the role of trading volume to the traditional approach, which only includes price returns. Additionally, we apply symbolization methods to the raw data to study the behavior of the market structure in different, normal and critical, situations. The hierarchical organization of the network is derived, and the MST for different sub-periods of 2002-2014 is created to illustrate how the structure of the market evolves over time. From the structural topologies of these trees, different clusters of companies are identified and analyzed according to their geographical and economic links. Two important results are achieved. Firstly, as other studies have highlighted, at the time of the financial crisis after 2008 the network becomes more centralized. Secondly, and most important, during our second period of analysis, 2008-2014, we observe that the hierarchy becomes more country-specific, with different sub-clusters of stocks belonging to France, Germany, Spain or Italy found apart from their business sector groups. This result may suggest that during this period financial investors seem to have been worried most about country-specific economic circumstances.
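
    The classic construction behind such stock-market MSTs (due to Mantegna) converts return correlations ρ into distances d = sqrt(2(1 − ρ)) and takes the minimum spanning tree of the resulting complete graph. A minimal returns-only sketch with hypothetical tickers and synthetic returns (the volume dimension of the paper's generalization is omitted):

```python
import networkx as nx
import numpy as np

rng = np.random.default_rng(0)
tickers = ["SAN", "BNP", "DAI", "ENI", "IBE"]  # hypothetical constituents
market = rng.standard_normal(500)              # common market factor
returns = {t: 0.6 * market + 0.8 * rng.standard_normal(500) for t in tickers}

G = nx.Graph()
for i, a in enumerate(tickers):
    for b in tickers[i + 1:]:
        rho = np.corrcoef(returns[a], returns[b])[0, 1]
        G.add_edge(a, b, weight=np.sqrt(2 * (1 - rho)))  # Mantegna distance

mst = nx.minimum_spanning_tree(G)
print(sorted(mst.edges(data="weight")))
```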

  8. Analysis of volume expansion data for periclase, lime, corundum and spinel at high temperatures

    Indian Academy of Sciences (India)

    B P Singh; H Chandra; R Shyam; A Singh

    2012-08-01

    We have presented an analysis of the volume expansion data for periclase (MgO), lime (CaO), corundum (Al2O3) and spinel (MgAl2O4) determined experimentally by Fiquet et al (1999) from 300 K up to 3000 K. The thermal equations of state due to Suzuki et al (1979) and Shanker et al (1997) are used to study the relationships between thermal pressure and volume expansion for the entire range of temperatures, starting from room temperature up to the melting temperatures of the solids under study. Comparison of the results obtained in the present study with the corresponding experimental data reveals that the thermal pressure changes with temperature almost linearly up to quite high temperatures. At extremely high temperatures close to the melting temperatures, thermal pressure deviates significantly from linearity. This prediction is consistent with other recent investigations. A quantitative analysis based on the theory of anharmonic effects has been presented to account for the nonlinear variation of the thermal pressure at high temperatures.

  9. Volume and structural analysis of super-cooled water under high pressure

    Science.gov (United States)

    Duki, Solomon F.; Tsige, Mesfin

    2012-02-01

    Motivated by a recent experimental study of super-cooled water at high pressure [1], we performed an atomistic molecular dynamics simulation study of bulk water molecules in the isothermal-isobaric ensemble. These simulations were performed at temperatures ranging from 40 K to 380 K, using two different cooling rates, 10 K/ns and 10 K/5 ns, and pressures ranging from 1 atm to 10000 atm. Our analysis of the variation of the volume of the bulk sample with temperature indicates a downward concave shape for pressures above certain values, as reported in [1]. The same downward concave behavior is observed at high pressure in the mean-squared displacements (MSD) of the water molecules when the MSD is plotted against time. To gain further insight into the effect of pressure on the sample, we have also performed a structural analysis of the sample. [1] O. Mishima, J. Chem. Phys. 133, 144503 (2010)

  10. Effects of elevated vacuum on in-socket residual limb fluid volume: Case study results using bioimpedance analysis

    Directory of Open Access Journals (Sweden)

    Joan E. Sanders, PhD

    2011-12-01

    Full Text Available Bioimpedance analysis was used to measure the residual limb fluid volume of seven transtibial amputee subjects using elevated-vacuum sockets and nonelevated-vacuum sockets. Fluid volume changes were assessed during sessions with the subjects sitting, standing, and walking. In general, fluid volume losses during 3- or 5-min walks and losses over the course of the 30-min test session were less for elevated vacuum than for suction. Numerous variables, including the time of day that data were collected, soft tissue consistency, socket-to-limb size and shape differences, and subject health, may have affected the results and had an equivalent or greater effect on limb fluid volume compared with elevated vacuum. Researchers should carefully consider these variables in the study design of future investigations on the effects of elevated vacuum on residual limb volume.

  11. The history of NATO TNF policy: The role of studies, analysis and exercises conference proceedings. Volume 1, Introduction and summary

    Energy Technology Data Exchange (ETDEWEB)

    Rinne, R.L. [ed.

    1994-02-01

    This conference was organized to study and analyze the role of simulation, analysis, modeling, and exercises in the history of NATO policy. The premise was not that the results of past studies would apply to future policy, but rather that understanding what influenced the decision process -- and how -- would be of value. The structure of the conference was built around discussion panels, augmented by a series of papers and presentations focusing on particular TNF events, issues, studies, or exercises. The conference proceedings consist of three volumes. This volume, Volume 1, contains the conference introduction, agenda, biographical sketches of principal participants, and an analytical summary of the presentations and discussion panels. Volume 2 contains a short introduction and the papers and presentations from the conference. Volume 3 contains selected papers by Brig. Gen. Robert C. Richardson III (Ret.).

  12. The history of NATO TNF policy: The role of studies, analysis and exercises conference proceedings. Volume 2: Papers and presentations

    Energy Technology Data Exchange (ETDEWEB)

    Rinne, R.L.

    1994-02-01

    This conference was organized to study and analyze the role of simulation, analysis, modeling, and exercises in the history of NATO policy. The premise was not that the results of past studies will apply to future policy, but rather that understanding what influenced the decision process -- and how -- would be of value. The structure of the conference was built around discussion panels. The panels were augmented by a series of papers and presentations focusing on particular TNF events, issues, studies, or exercises. The conference proceedings consist of three volumes. Volume 1 contains the conference introduction, agenda, biographical sketches of principal participants, and analytical summary of the presentations and panels. This volume contains a short introduction and the papers and presentations from the conference. Volume 3 contains selected papers by Brig. Gen. Robert C. Richardson III (Ret.). Individual papers in this volume were abstracted and indexed for the database.

  13. Open Educational Resources from Performance Task using Video Analysis and Modeling - Tracker and K12 science education framework

    CERN Document Server

    Wee, Loo Kang

    2014-01-01

    This invited paper discusses why the physics performance task undertaken by grade 9 students in Singapore is worth participating in, for two reasons: 1) the video analysis and modeling tools are open access, licensed under Creative Commons Attribution, advancing open educational resources in the world; and 2) the task allows students to work like physicists, adopting the K12 science education framework. We offer personal reflections on how physics education can be made more meaningful, in particular through Practice 1: Ask Questions, Practice 2: Use Models, and Practice 5: Mathematical and Computational Thinking, using video modeling supported by evidence-based data from video analysis. This paper hopes to spur fellow colleagues to look into open education initiatives such as our Singapore Tracker community open educational resources, curated at http://weelookang.blogspot.sg/p/physics-applets-virtual-lab.html, as well as the digital libraries at http://iwant2study.org/lookangejss/, directly accessible through Tracker 4.86, the EJSS reader app on Android and iOS, and EJS 5....

  14. Human factors evaluation of teletherapy: Training and organizational analysis. Volume 4

    Energy Technology Data Exchange (ETDEWEB)

    Henriksen, K.; Kaye, R.D.; Jones, R. [Hughes Training, Inc., Falls Church, VA (United States); Morisseau, D.S.; Serig, D.I. [Nuclear Regulatory Commission, Washington, DC (United States). Div. of Systems Technology

    1995-07-01

    A series of human factors evaluations were undertaken to better understand the contributing factors to human error in the teletherapy environment. Teletherapy is a multidisciplinary methodology for treating cancerous tissue through selective exposure to an external beam of ionizing radiation. A team of human factors specialists, assisted by a panel of radiation oncologists, medical physicists, and radiation therapists, conducted site visits to radiation oncology departments at community hospitals, university centers, and free-standing clinics. A function and task analysis was initially performed to guide subsequent evaluations in the areas of system-user interfaces, procedures, training and qualifications, and organizational policies and practices. The present work focuses solely on training and qualifications of personnel (e.g., training received before and during employment), and the potential impact of organizational factors on the performance of teletherapy. Organizational factors include such topics as adequacy of staffing, performance evaluations, commonly occurring errors, implementation of quality assurance programs, and organizational climate.

  15. Measurement of fluid viscosity at microliter volumes using quartz impedance analysis.

    Science.gov (United States)

    Saluja, Atul; Kalonia, Devendra S

    2004-08-05

    The purpose of this work was to measure the viscosity of fluids at low microliter volumes by means of quartz crystal impedance analysis. To achieve this, a novel setup was designed that allowed measurement of viscosity at volumes of 8 to 10 microliters. The technique was based on the principle of electromechanical coupling of piezoelectric quartz crystals. The arrangement was simple, with measurement times ranging from 2 to 3 minutes, and the crystal setup assembly did not impose any unwanted initial stress on the unloaded quartz crystal. Quartz crystals of 5- and 10-MHz fundamental frequency were calibrated with glycerol-water mixtures of known density and viscosity prior to the viscosity measurements. True frequency shifts were determined for the purpose of this work, followed by viscosity measurements of aqueous solutions of sucrose, urea, PEG-400, glucose, and ethylene glycol at 25 degrees C +/- 0.5 degrees C. The measured viscosities were found to be reproducible and consistent with the values reported in the literature. Minor inconsistencies in the measured resistance and frequency shifts did not affect the results significantly and were found to be experimental in origin rather than due to electrode surface roughness. In addition, for PEG 8000 solutions, as expected for a viscoelastic fluid, the calculated viscosities were less than the reported values because of the frequency dependence of the storage and loss modulus components of the complex viscosity. From the results, it can be concluded that the present setup can provide accurate assessment of the viscosity of Newtonian fluids and also shows potential for analyzing non-Newtonian fluids at low microliter volumes.

  16. Development of a Quality Assurance Procedure for Dose Volume Histogram Analysis

    Science.gov (United States)

    Davenport, David A.

    The role of the dose-volume histogram (DVH) is rapidly expanding in radiation oncology treatment planning. DVHs are already relied upon to differentiate between two similar plans and to evaluate organ-at-risk dosage. Their role will become even more important as progress continues toward implementing biologically based treatment planning systems. It is therefore imperative that the accuracy of DVHs be evaluated and reappraised after any major software or hardware upgrade affecting a treatment planning system (TPS). The purpose of this work is to create and implement a comprehensive quality assurance procedure evaluating dose-volume histograms to ensure their accuracy while satisfying American College of Radiology guidelines. Virtual phantoms of known volumes were created in the Pinnacle TPS and exposed to different beam arrangements. Variables including grid size and slice thickness were varied and their effects analyzed. The resulting DVHs were evaluated by comparison to the commissioned percent depth dose values using a custom Excel spreadsheet. After determining the uncertainty of the DVH based on these variables, multiple second-check calculations were performed using the MIM Maestro and Matlab software packages. The uncertainties of the DVHs were shown to be less than +/- 3%, and the average uncertainty was less than +/- 1%. The second-check procedures resulted in mean percent differences of less than 1%, which confirms the accuracy of DVH calculation in Pinnacle and the effectiveness of the quality assurance template. The importance of knowing the limits of accuracy of the DVHs, which are routinely used to assess the quality of clinical treatment plans, cannot be overstated. The developed comprehensive QA procedure evaluating the accuracy of the DVH statistical analysis will become a part of our clinical arsenal for periodic tests of the treatment planning system. It will also be performed at the time of commissioning and after any major software
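
    For reference, a cumulative DVH can be computed generically from a dose grid and a structure mask. The sketch below is not Pinnacle's algorithm, just a minimal illustration assuming uniform voxel volume and an illustrative toy phantom.

        import numpy as np

        def cumulative_dvh(dose, mask, n_bins=200):
            """Cumulative DVH: fraction of structure volume receiving >= each dose level.

            dose: 3D array of dose values (Gy); mask: boolean array, True inside the ROI.
            With uniform voxel volume, volume fractions reduce to voxel counts.
            """
            d = dose[mask]
            levels = np.linspace(0.0, d.max(), n_bins)
            volume_frac = [(d >= lv).mean() for lv in levels]
            return levels, np.asarray(volume_frac)

        # Example: toy 20x20x20 dose grid with a spherical ROI
        z, y, x = np.mgrid[:20, :20, :20]
        roi = (x - 10)**2 + (y - 10)**2 + (z - 10)**2 < 64
        dose = 60.0 * np.exp(-((x - 10)**2 + (y - 10)**2) / 50.0)  # fake 60 Gy peak
        lv, vf = cumulative_dvh(dose, roi)
        print(lv[0], vf[0])   # vf[0] == 1.0: all ROI volume receives >= 0 Gy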

  17. Preoperative determination of prostate cancer tumor volume: analysis through biopsy fragments

    Directory of Open Access Journals (Sweden)

    Alberto A. Antunes

    2007-08-01

    Full Text Available OBJECTIVE: Preoperative determination of prostate cancer (PCa) tumor volume (TV) is still a big challenge. We assessed variables obtained from prostatic biopsy, aiming to determine which is the best method to predict the TV in radical prostatectomy (RP) specimens. MATERIALS AND METHODS: The biopsy findings of 162 men with PCa submitted to radical prostatectomy were reviewed. Preoperative characteristics, such as PSA, the percentage of positive fragments (PPF), the total percentage of cancer in the biopsy (TPC), the maximum percentage of cancer in a fragment (MPC), the presence of perineural invasion (PNI), and the Gleason score were correlated with postoperative surgical findings through a univariate analysis of a linear regression model. RESULTS: The TV correlated significantly with the PPF, TPC, MPC, PSA, and the presence of PNI (p < 0.001). However, the Pearson correlation analysis showed an R2 of only 24%, 12%, 17% and 9% for the PPF, TPC, MPC, and PSA, respectively. The combination of the PPF with the PSA and the PNI analysis proved to be a better model to predict the TV (R2 of 32.3%). The TV could be determined through the formula: Volume = 1.108 + 0.203 x PSA + 0.066 x PPF + 2.193 x PNI. CONCLUSIONS: The PPF seems to be better than the TPC and the MPC for predicting the TV in the surgical specimen. Because of the weak correlation between those variables and the TV, the PSA and the presence of PNI should be used together.
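
    The published model can be applied directly. In the sketch below, coding PNI as 1 when present and 0 when absent is our reading of the text, and the example inputs are illustrative.

        def predicted_tumor_volume(psa, ppf, pni):
            """Tumor volume (cc) from the paper's linear model:
            Volume = 1.108 + 0.203*PSA + 0.066*PPF + 2.193*PNI
            (PSA in ng/mL, PPF in percent, PNI = 1 if present else 0).
            """
            return 1.108 + 0.203 * psa + 0.066 * ppf + 2.193 * pni

        print(predicted_tumor_volume(psa=8.0, ppf=30.0, pni=1))  # ~6.9 cc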

  18. Calcium isolation from large-volume human urine samples for 41Ca analysis by accelerator mass spectrometry.

    Science.gov (United States)

    Miller, James J; Hui, Susanta K; Jackson, George S; Clark, Sara P; Einstein, Jane; Weaver, Connie M; Bhattacharyya, Maryka H

    2013-08-01

    Calcium oxalate precipitation is the first step in preparation of biological samples for 41Ca analysis by accelerator mass spectrometry. A simplified protocol for large-volume human urine samples was characterized, with statistically significant increases in ion current and decreases in interference. This large-volume assay minimizes cost and effort and maximizes time after 41Ca administration during which human samples, collected over a lifetime, provide 41Ca:Ca ratios that are significantly above background.

  19. Calcium Isolation from Large-Volume Human Urine Samples for 41Ca Analysis by Accelerator Mass Spectrometry

    Science.gov (United States)

    Miller, James J; Hui, Susanta K; Jackson, George S; Clark, Sara P; Einstein, Jane; Weaver, Connie M; Bhattacharyya, Maryka H

    2013-01-01

    Calcium oxalate precipitation is the first step in preparation of biological samples for 41Ca analysis by accelerator mass spectrometry. A simplified protocol for large-volume human urine samples was characterized, with statistically significant increases in ion current and decreases in interference. This large-volume assay minimizes cost and effort and maximizes time after 41Ca administration during which human samples, collected over a lifetime, provide 41Ca:Ca ratios that are significantly above background. PMID:23672965

  20. A Typology of Tasks for Mobile-Assisted Language Learning: Recommendations from a Small-Scale Needs Analysis

    Science.gov (United States)

    Park, Moonyoung; Slater, Tammy

    2014-01-01

    In response to the research priorities of members of TESOL (Teachers of English to Speakers of Other Languages), this study investigated language learners' real-world tasks in mobile-assisted language learning (MALL) to inform the future development of pedagogic tasks for academic English as a second language (ESL) courses. The data included…

  1. Reflective Analysis as a Tool for Task Redesign: The Case of Prospective Elementary Teachers Solving and Posing Fraction Comparison Problems

    Science.gov (United States)

    Thanheiser, Eva; Olanoff, Dana; Hillen, Amy; Feldman, Ziv; Tobias, Jennifer M.; Welder, Rachael M.

    2016-01-01

    Mathematical task design has been a central focus of the mathematics education research community over the last few years. In this study, six university teacher educators from six different US institutions formed a community of practice to explore key aspects of task design (planning, implementing, reflecting, and modifying) in the context of…

  2. Precise measurement of liquid petroleum tank volume based on data cloud analysis

    Science.gov (United States)

    Wang, Jintao; Liu, Ziyong; Zhang, Long; Guo, Ligong; Bao, Xuesong; Tong, Lin

    2010-08-01

    Metal tanks are generally used for the measurement of liquid petroleum products in fiscal or custody-transfer applications. A precise tank-volume measurement method based on the analysis of point-cloud data acquired by laser scanning was studied. Distance measurement by laser phase shift and angular measurement by optical grating were applied to acquire the coordinates of points on the tank shell under the control of a servo system. The Direct Iterative Method (DIM) and the Section Area Method (SAM) were used to process the measured data for vertical and horizontal tanks, respectively. In a comparison experiment, one 1000 m3 vertical tank and one 30 m3 horizontal tank were used as test objects. In the vertical tank experiment, the largest measured radius difference between the new laser method and the strapping method (the international arbitration standard) is 2.8 mm. In the horizontal tank experiment, the calibration result from the laser scanning method is closer to the reference than that of the manual geometric method; the mean deviations over the full-scale range of the former and the latter are 75 L and 141 L, respectively. With increasing liquid level, the relative errors of the laser scanning method and the manual geometric method become smaller, with mean relative errors of 0.6% and 1.5%, respectively. Using the method discussed, the efficiency of tank volume calibration can be improved.
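
    The Section Area Method reduces to integrating measured cross-sectional areas along the tank axis. A minimal sketch using the trapezoidal rule follows; it assumes the per-station areas have already been extracted from the point cloud, and the cylinder example is illustrative.

        import numpy as np

        def tank_volume_sam(stations, areas):
            """Section Area Method: volume = integral of cross-sectional area
            along the tank axis, here by the trapezoidal rule.

            stations: axial positions (m) of the measured sections;
            areas: cross-sectional areas (m^2) at those stations.
            """
            return np.trapz(areas, stations)

        # Example: a 10 m long cylinder of radius 1 m -> volume ~ pi * 10
        xs = np.linspace(0.0, 10.0, 51)
        a = np.full_like(xs, np.pi * 1.0**2)
        print(tank_volume_sam(xs, a))   # ~31.42 m^3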

  3. Performance Analysis of Fractured Wells with Stimulated Reservoir Volume in Coal Seam Reservoirs

    Directory of Open Access Journals (Sweden)

    Yu-long Zhao

    2016-01-01

    Full Text Available Coalbed methane (CBM), one kind of unconventional gas, is an important energy resource that attracts industry interest in research and development. Using the Langmuir adsorption isotherm, Fick's law in the matrix, and Darcy flow in cleat fractures, and treating the stimulated reservoir volume (SRV) induced by hydraulic fracturing as a radial composite model, the continuous linear source function at constant production is derived by means of the Laplace transform and Duhamel's theorem. Based on the linear source function, semi-analytical solutions are obtained for a fractured vertical well producing at a constant rate or at constant bottom-hole pressure. With the help of the Stehfest numerical algorithm and computer programming, well-test and rate-decline type curves are obtained; the key flow regimes of fractured CBM wells are wellbore storage, linear flow in the SRV region, diffusion flow, and later pseudo-radial flow. Finally, we analyze the effect of various parameters, such as the Langmuir volume and the radius and permeability of the SRV region, on production performance. The research results presented in this paper are significant for the development, well-test interpretation, and production performance analysis of unconventional gas.
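
    The Stehfest algorithm used to invert Laplace-space solutions like these is standard and compact. The sketch below is the generic algorithm applied to a known transform pair for verification, not the paper's specific CBM solution.

        import math

        def stehfest_coefficients(n):
            """Stehfest weights V_i for even n."""
            half = n // 2
            v = []
            for i in range(1, n + 1):
                s = 0.0
                for k in range((i + 1) // 2, min(i, half) + 1):
                    s += (k**half * math.factorial(2 * k)) / (
                        math.factorial(half - k) * math.factorial(k)
                        * math.factorial(k - 1) * math.factorial(i - k)
                        * math.factorial(2 * k - i))
                v.append((-1) ** (half + i) * s)
            return v

        def stehfest_invert(F, t, n=12):
            """Approximate f(t) from its Laplace transform F(s)."""
            ln2_t = math.log(2.0) / t
            v = stehfest_coefficients(n)
            return ln2_t * sum(v[i - 1] * F(i * ln2_t) for i in range(1, n + 1))

        # Check against a known pair: F(s) = 1/(s + 1)  <->  f(t) = exp(-t)
        print(stehfest_invert(lambda s: 1.0 / (s + 1.0), t=1.0))  # ~0.3679 = exp(-1)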

  4. Wind-electric icemaking project: Analysis and dynamometer testing. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Holz, R; Gervorgian, V; Drouilhet, S; Muljadi, E

    1998-07-01

    The wind/hybrid systems group at the National Renewable Energy Laboratory has been researching the most practical and cost-effective methods for producing ice from off-grid wind-electric power systems. The first phase of the project, conducted in 1993--1994, included full-scale dynamometer and field testing of two different electric ice makers directly connected to a permanent magnet alternator. The results of that phase were encouraging and the second phase of the project was launched in which steady-state and dynamic numerical models of these systems were developed and experimentally validated. The third phase of the project was the dynamometer testing of the North Star ice maker, which is powered by a 12-kilowatt Bergey Windpower Company, Inc., alternator. This report describes both the second and third project phases. Also included are detailed economic analyses and a discussion of the future prospects of wind-electric ice-making systems. The main report is contained in Volume 1. Volume 2 consists of the report appendices, which include the actual computer programs used in the analysis and the detailed test results.

  5. Analysis of triangular C-grid finite volume scheme for shallow water flows

    Science.gov (United States)

    Shirkhani, Hamidreza; Mohammadian, Abdolmajid; Seidou, Ousmane; Qiblawey, Hazim

    2015-08-01

    In this paper, a dispersion relation analysis is employed to investigate the finite volume triangular C-grid formulation for the two-dimensional shallow-water equations. In addition, two proposed combinations of time-stepping methods with the C-grid spatial discretization are investigated. In the first part of this study, the C-grid spatial discretization scheme is assessed, and in the second part, fully discrete schemes are analyzed. Analysis of the semi-discretized scheme (i.e., spatial discretization only) shows that there is no damping associated with the spatial C-grid scheme, and its phase speed behavior is also acceptable for long and intermediate waves. The analytical dispersion analysis including the effect of time discretization shows that the leap-frog time-stepping technique can improve the phase speed behavior of the numerical method; however, it cannot damp the shorter decelerated waves. The Adams-Bashforth technique leads to slower propagation of short and intermediate waves, and it damps those waves with the slower propagating speed. The numerical solutions of various test problems conform to and are in good agreement with the analytical dispersion analysis. They also indicate that the Adams-Bashforth scheme exhibits faster convergence and more accurate results as the spatial and temporal step sizes decrease. However, the leap-frog scheme is more stable at higher CFL numbers.
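
    For du/dt = f(u), the two schemes compared are leap-frog, u_{n+1} = u_{n-1} + 2*dt*f(u_n), and second-order Adams-Bashforth, u_{n+1} = u_n + dt*(3*f(u_n) - f(u_{n-1}))/2. The sketch below exercises both on a single oscillatory mode; the test problem is our illustrative choice, not the shallow-water system itself.

        import numpy as np

        def leapfrog(f, u0, dt, n):
            u = [u0, u0 + dt * f(u0)]          # first step by forward Euler
            for _ in range(n - 1):
                u.append(u[-2] + 2.0 * dt * f(u[-1]))
            return np.array(u)

        def adams_bashforth2(f, u0, dt, n):
            u = [u0, u0 + dt * f(u0)]          # first step by forward Euler
            for _ in range(n - 1):
                u.append(u[-1] + dt * (3.0 * f(u[-1]) - f(u[-2])) / 2.0)
            return np.array(u)

        # du/dt = i*omega*u mimics a single wave mode; |u| should stay at 1
        omega = 1.0
        f = lambda u: 1j * omega * u
        ulf = leapfrog(f, 1.0 + 0j, 0.01, 1000)
        uab = adams_bashforth2(f, 1.0 + 0j, 0.01, 1000)
        # Both magnitudes stay close to 1 at this step size; the small
        # departures reveal each scheme's amplitude error.
        print(abs(ulf[-1]), abs(uab[-1]))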

  6. Use of Human Modeling Simulation Software in the Task Analysis of the Environmental Control and Life Support System Component Installation Procedures

    Science.gov (United States)

    Estes, Samantha; Parker, Nelson C. (Technical Monitor)

    2001-01-01

    Virtual reality and simulation applications are becoming widespread in human task analysis. These programs have many benefits for the Human Factors Engineering field. Not only do creating and using virtual environments for human engineering analyses save money and time, this approach also promotes user experimentation and provides increased quality of analyses. This paper explains the human engineering task analysis performed on the Environmental Control and Life Support System (ECLSS) space station rack and its Distillation Assembly (DA) subsystem using EAI's human modeling simulation software, Jack. When installed on the International Space Station (ISS), ECLSS will provide the life and environment support needed to adequately sustain crew life. The DA is an Orbital Replaceable Unit (ORU) that provides means of wastewater (primarily urine from flight crew and experimental animals) reclamation. Jack was used to create a model of the weightless environment of the ISS Node 3, where the ECLSS is housed. Computer aided drawings of the ECLSS rack and DA system were also brought into the environment. Anthropometric models of a 95th percentile male and 5th percentile female were used to examine the human interfaces encountered during various ECLSS and DA tasks. The results of the task analyses were used in suggesting modifications to hardware and crew task procedures to improve accessibility, conserve crew time, and add convenience for the crew. This paper will address some of those suggested modifications and the method of presenting final analyses for requirements verification.

  7. Latent growth curve analysis of fear during a speech task before and after treatment for social phobia.

    Science.gov (United States)

    Price, Matthew; Anderson, Page L

    2011-11-01

    Models of social phobia highlight the importance of anticipatory anxiety in the experience of fear during a social situation. Anticipatory anxiety has been shown to be highly correlated with performance anxiety for a variety of social situations. A few studies show that average ratings of anxiety during the anticipation and performance phases of a social situation decline following treatment. Evidence also suggests that the point of confrontation with the feared stimulus is the peak level of fear. No study to date has evaluated the pattern of anxious responding across the anticipation, confrontation, and performance phases before and after treatment, which is the focus of the current study. Socially phobic individuals (N = 51) completed a behavioral avoidance task before and after two types of manualized cognitive behavioral therapy, and gave ratings of fear during the anticipation and performance phases. Results from latent growth curve analysis were the same for the two treatments and suggested that before treatment, anxiety sharply increased during the anticipation phase, was highly elevated at the confrontation, and gradually increased during the performance phase. After treatment, anxiety increased during the anticipation phase, although at a much slower rate than at pretreatment, peaked at confrontation, and declined during the performance phase. The findings suggest that anticipatory experiences are critical to the experience of fear for public speaking and should be incorporated into exposures.

  8. A Review of Models of Cost and Training Effectiveness Analysis (CTEA). Volume 1. Training Effectiveness Analysis

    Science.gov (United States)

    1987-10-01

    ...use of motion, effective graphics, audiovisual presentation, and use of hardware as means of gaining attention and maintaining interest in the material... design guide; TDDSS: training development decision support system; TD/S: training device/simulator; TEA: training effectiveness analysis; TEC: training...

  9. Space shuttle/payload interface analysis (study 2.4). Volume 2: Space shuttle traffic analysis

    Science.gov (United States)

    Plough, J. A.

    1973-01-01

    This report documents the transfer to MSFC of the capability to perform capture/cost analyses. Space shuttle performance and direct costs, tug characteristics, reliability, and cost data were provided by NASA. The launch vehicle, mission models, payloads, and computer programs are discussed, along with the capture/cost analysis and cost estimates. For Vol. 1, see N74-12493.

  10. The analysis of deviations on measured volumes between OSBRA pipeline tank farms and its customers

    Energy Technology Data Exchange (ETDEWEB)

    Kotchetkoff Neto, Andre Paulo; Kawamoto, Fabio Yoshikazu [Petrobras Transporte S.A. - Transpetro (Brazil)

    2010-07-01

    The Sao Paulo - Brasilia pipeline (OSBRA) is a very long multi-product pipeline in Brazil. It has a network of several tank farms, pump stations and truck loading facilities to deliver oil, gas and LPG to customers. The volume of these fuels is usually measured before delivery at two measurement points. This paper reports the use of statistical tools to analyze the measurement data, with the purpose of understanding the computed differences between the OSBRA tank farm installations and the customer tank farm installations. The use of statistical tools brought new insights, such as the detection of systematic error and the variability of each individual system. These tools were also used to verify the accuracy of operational measurement devices. An analysis based on in-field data was carried out between two OSBRA tank farms. This paper shows that the use of statistical tools, rather than fixed limits, can provide more precise information about the behavior of measurement systems.
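
    A paired test on batch-by-batch differences is one statistical tool suited to separating a systematic offset from random scatter. The sketch below uses simulated volumes with a built-in bias, not the paper's field data.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(42)
        # Simulated batch volumes (m^3) at the shipping and receiving tank farms:
        shipped = rng.normal(loc=5000.0, scale=20.0, size=60)
        received = shipped - 8.0 + rng.normal(scale=15.0, size=60)  # 8 m^3 bias built in

        diff = shipped - received
        t, p = stats.ttest_rel(shipped, received)   # paired t-test on the differences
        print(f"mean difference = {diff.mean():.1f} m^3, t = {t:.2f}, p = {p:.3g}")
        # A small p-value points to a systematic error rather than random scatter.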

  11. Final safety analysis report for the Galileo Mission: Volume 2, Book 2: Accident model document: Appendices

    Energy Technology Data Exchange (ETDEWEB)

    1988-12-15

    This section of the Accident Model Document (AMD) presents the appendices which describe the various analyses that have been conducted for use in the Galileo Final Safety Analysis Report II, Volume II. Included in these appendices are the approaches, techniques, conditions and assumptions used in the development of the analytical models plus the detailed results of the analyses. Also included in these appendices are summaries of the accidents and their associated probabilities and environment models taken from the Shuttle Data Book (NSTS-08116), plus summaries of the several segments of the recent GPHS safety test program. The information presented in these appendices is used in Section 3.0 of the AMD to develop the Failure/Abort Sequence Trees (FASTs) and to determine the fuel releases (source terms) resulting from the potential Space Shuttle/IUS accidents throughout the missions.

  12. Multiplexed Volume Bragg Gratings in Narrow- and Broad-band Spectral Systems: Analysis and Application

    Science.gov (United States)

    Ingersoll, Gregory B.

    Volume Bragg gratings (VBGs) are important holographic optical elements in many spectral systems. Using multiple volume gratings, whether multiplexed or arranged sequentially, provides advantages to many types of systems in overall efficiency, dispersion performance, flexibility of design, etc. However, the use of multiple gratings---particularly when the gratings are multiplexed in a single holographic optical element (HOE)---is subject to inter-grating coupling effects that ultimately limit system performance. Analyzing these coupling effects requires a more complex mathematical model than the straightforward analysis of a single volume grating. We present a matrix-based algorithm for determining diffraction efficiencies of significant coupled waves in these multiplexed grating holographic optical elements (HOEs). Several carefully constructed experiments with spectrally multiplexed gratings in dichromated gelatin verify our conclusions. Applications of this theory to broad- and narrow-band systems are explored in detailed simulations. Broadband systems include spectrum splitters for diverse-bandgap photovoltaic (PV) cells. Volume Bragg gratings can serve as effective spectrum splitters, but the inherent dispersion of a VBG can be detrimental given a broad-spectrum input. The performance of a holographic spectrum splitter element can be improved by utilizing multiple volume gratings, each operating in a slightly different spectral band. However, care must be taken to avoid inter-grating coupling effects that limit ultimate performance. We explore broadband multi-grating holographic optical elements (HOEs) in sandwiched arrangements where individual single-grating HOEs are placed in series, and in multiplexed arrangements where multiple gratings are recorded in a single HOE. Particle swarm optimization (PSO) is used to tailor these systems to the solar spectrum taking into account both efficiency and dispersion. Both multiplexed and sandwiched two-grating systems

  13. Jumping to Conclusions About the Beads Task? A Meta-analysis of Delusional Ideation and Data-Gathering

    Science.gov (United States)

    Ross, Robert Malcolm; McKay, Ryan; Coltheart, Max; Langdon, Robyn

    2015-01-01

    It has been claimed that delusional and delusion-prone individuals have a tendency to gather less data before forming beliefs. Most of the evidence for this “jumping to conclusions” (JTC) bias comes from studies using the “beads task” data-gathering paradigm. However, the evidence for the JTC bias is mixed. We conducted a random-effects meta-analysis of individual participant data from 38 clinical and nonclinical samples (n = 2,237) to investigate the relationship between data gathering in the beads task (using the “draws to decision” measure) and delusional ideation (as indexed by the “Peters et al Delusions Inventory”; PDI). We found that delusional ideation is negatively associated with data gathering (r_s = −0.10, 95% CI [−0.17, −0.03]) and that there is heterogeneity in the estimated effect sizes (Q-stat P = .03, I² = 33). Subgroup analysis revealed that the negative association is present when considering the 23 samples (n = 1,754) from the large general population subgroup alone (r_s = −0.10, 95% CI [−0.18, −0.02]) but not when considering the 8 samples (n = 262) from the small current delusions subgroup alone (r_s = −0.12, 95% CI [−0.31, 0.07]). These results provide some provisional support for continuum theories of psychosis and cognitive models that implicate the JTC bias in the formation and maintenance of delusions. PMID:25616503
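
    Random-effects pooling of correlations of this kind can be reproduced with the standard Fisher-z / DerSimonian-Laird method. The sketch below is that generic method with made-up study values, not the authors' dataset.

        import numpy as np

        def random_effects_pooled_r(r, n):
            """DerSimonian-Laird random-effects pooling of correlations.

            r, n: arrays of per-study correlations and sample sizes.
            Works on Fisher-z transformed values, then back-transforms.
            """
            r, n = np.asarray(r, float), np.asarray(n, float)
            z = np.arctanh(r)                   # Fisher z-transform
            v = 1.0 / (n - 3.0)                 # within-study variance of z
            w = 1.0 / v
            z_fixed = np.sum(w * z) / np.sum(w)
            q = np.sum(w * (z - z_fixed) ** 2)  # Cochran's Q
            df = len(r) - 1
            c = np.sum(w) - np.sum(w**2) / np.sum(w)
            tau2 = max(0.0, (q - df) / c)       # between-study variance
            w_star = 1.0 / (v + tau2)
            z_re = np.sum(w_star * z) / np.sum(w_star)
            se = np.sqrt(1.0 / np.sum(w_star))
            ci = np.tanh([z_re - 1.96 * se, z_re + 1.96 * se])
            return np.tanh(z_re), ci

        print(random_effects_pooled_r([-0.05, -0.15, -0.10], [200, 60, 120]))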

  14. Temporal discrimination thresholds in adult-onset primary torsion dystonia: an analysis by task type and by dystonia phenotype.

    LENUS (Irish Health Repository)

    Bradley, D

    2012-01-01

    Adult-onset primary torsion dystonia (AOPTD) is an autosomal dominant disorder with markedly reduced penetrance. Sensory abnormalities are present in AOPTD and also in unaffected relatives, possibly indicating non-manifesting gene carriage (acting as an endophenotype). The temporal discrimination threshold (TDT) is the shortest time interval at which two stimuli are detected to be asynchronous. We aimed to compare the sensitivity and specificity of three different TDT tasks (visual, tactile and mixed/visual-tactile). We also aimed to examine the sensitivity of TDTs in different AOPTD phenotypes. To examine tasks, we tested TDT in 41 patients and 51 controls using visual (2 lights), tactile (non-painful electrical stimulation) and mixed (1 light, 1 electrical) stimuli. To investigate phenotypes, we examined 71 AOPTD patients (37 cervical dystonia, 14 writer's cramp, 9 blepharospasm, 11 spasmodic dysphonia) and 8 musician's dystonia patients. The upper limit of normal was defined as the control mean +2.5 SD. In dystonia patients, the visual task detected abnormalities in 35/41 (85%), the tactile task in 35/41 (85%) and the mixed task in 26/41 (63%); the mixed task was less sensitive than the other two (p = 0.04). Specificity was 100% for the visual and tactile tasks. Abnormal TDTs were found in 36 of 37 (97.3%) cervical dystonia, 12 of 14 (85.7%) writer's cramp, 8 of 9 (88.8%) blepharospasm, and 10 of 11 (90.9%) spasmodic dysphonia patients, and in 5 of 8 (62.5%) musicians. The visual and tactile tasks were found to be more sensitive than the mixed task. Temporal discrimination threshold results were comparable across common adult-onset primary torsion dystonia phenotypes, with lower sensitivity in the musicians.

  15. Quantitative Analysis of Variability and Uncertainty in Environmental Data and Models. Volume 1. Theory and Methodology Based Upon Bootstrap Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Frey, H. Christopher [North Carolina State University, Raleigh, NC (United States); Rhodes, David S. [North Carolina State University, Raleigh, NC (United States)

    1999-04-30

    This is Volume 1 of a two-volume set of reports describing work conducted at North Carolina State University sponsored by Grant Number DE-FG05-95ER30250 by the U.S. Department of Energy. The title of the project is “Quantitative Analysis of Variability and Uncertainty in Acid Rain Assessments.” The work conducted under sponsorship of this grant pertains primarily to two main topics: (1) development of new methods for quantitative analysis of variability and uncertainty applicable to any type of model; and (2) analysis of variability and uncertainty in the performance, emissions, and cost of electric power plant combustion-based NOx control technologies. These two main topics are reported separately in Volumes 1 and 2.

  16. Specimen Preparation for Metal Matrix Composites with a High Volume Fraction of Reinforcing Particles for EBSD Analysis

    Science.gov (United States)

    Smirnov, A. S.; Belozerov, G. A.; Smirnova, E. O.; Konovalov, A. V.; Shveikin, V. P.; Muizemnek, O. Yu.

    2016-07-01

    The paper deals with a procedure for preparing a specimen surface for EBSD analysis of a metal matrix composite (MMC) with a high volume fraction of reinforcing particles. Unlike standard procedures of specimen surface preparation for EBSD analysis, the proposed procedure is iterative, with consecutive application of mechanical and electrochemical polishing. This procedure significantly improves the indexing of the MMC matrix in comparison with the standard preparation procedure. The procedure was verified on an MMC with pure aluminum (99.8% Al) as the matrix and SiC particles as the reinforcing elements. The average size of the SiC particles is 14 μm, and their volume fraction amounts to 50% of the total volume of the composite. It has been experimentally found that, for EBSD analysis of the matrix near reinforcing particles, the difference in height between the particles and the matrix should not exceed 2 μm.

  17. Thermodynamic analysis of energy density in pressure retarded osmosis: The impact of solution volumes and costs

    Energy Technology Data Exchange (ETDEWEB)

    Reimund, Kevin K. [Univ. of Connecticut, Storrs, CT (United States). Dept. of Chemical and Biomolecular Engineering; McCutcheon, Jeffrey R. [Univ. of Connecticut, Storrs, CT (United States). Dept. of Chemical and Biomolecular Engineering; Wilson, Aaron D. [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-08-01

    A general method was developed for estimating the volumetric energy efficiency of pressure retarded osmosis via pressure-volume analysis of a membrane process. The resulting model requires only the osmotic pressure, π, and mass fraction, w, of water in the concentrated and dilute feed solutions to estimate the maximum achievable specific energy density, u, as a function of operating pressure. The model is independent of any membrane or module properties. This method uses equilibrium analysis to specify the volumetric mixing fraction of concentrated and dilute solution as a function of operating pressure, and provides results for the total volumetric energy density of similar order to more complex models for the mixing of seawater and river water. Within the framework of this analysis, the total volumetric energy density is maximized, for an idealized case, when the operating pressure is π/(1+√w⁻¹), which is lower than the maximum-power-density operating pressure, Δπ/2, derived elsewhere, and is a function of the solute osmotic pressure at a given mass fraction. It was also found that a minimum of 1.45 kmol of ideal solute is required to produce 1 kWh of energy, while a system operating at the maximum-power-density operating pressure requires at least 2.9 kmol. Using this methodology, it is possible to examine the effects of volumetric solution cost, operation of a module at various pressures, and operation of a constant-pressure module with various feeds.
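
    The stated optimum is easy to evaluate. The sketch below compares π/(1+√(1/w)) with Δπ/2 using illustrative seawater-like numbers; the osmotic pressure and mass fraction shown are our assumptions, not values from the paper.

        import math

        def energy_optimal_pressure(pi_conc, w_conc):
            """Operating pressure maximizing total volumetric energy density
            in the idealized case: p* = pi / (1 + sqrt(1/w)), where w is the
            water mass fraction of the concentrated feed."""
            return pi_conc / (1.0 + math.sqrt(1.0 / w_conc))

        pi_sw = 2.7e6   # ~27 bar osmotic pressure of seawater, Pa (illustrative)
        w_sw = 0.965    # water mass fraction of seawater (illustrative)
        print(energy_optimal_pressure(pi_sw, w_sw) / 1e5)  # ~13.4 bar
        print(pi_sw / 2 / 1e5)  # Delta-pi/2 ~ 13.5 bar for a near-fresh dilute feed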

  18. Conceptual design and systems analysis of photovoltaic power systems. Final report. Volume V. Additional studies

    Energy Technology Data Exchange (ETDEWEB)

    Pittman, P.F.

    1977-03-01

    In the first of four tasks, the performance of autonomous (stand-alone) residences was determined at seven locations throughout the country. A non-autonomous residence must obtain its supplemental energy from a utility; the second task dealt with the rate to be charged by the utility for this energy, in an effort to define the pertinent issues at this utility/residence interface. In the third task, the configuration of a fixed linear Fresnel lens provided with a tracking absorber was analyzed optically. The fourth task explored utility loss-of-load probability methodology.

  19. A Comparative Analysis of 2D and 3D Tasks for Virtual Reality Therapies Based on Robotic-Assisted Neurorehabilitation for Post-stroke Patients.

    Science.gov (United States)

    Lledó, Luis D; Díez, Jorge A; Bertomeu-Motos, Arturo; Ezquerro, Santiago; Badesa, Francisco J; Sabater-Navarro, José M; García-Aracil, Nicolás

    2016-01-01

    Post-stroke neurorehabilitation based on virtual therapies is performed by completing repetitive exercises shown on visual electronic devices, whose content represents imaginary or daily-life tasks. Currently, there are two ways of visualizing these tasks. 3D virtual environments are used to obtain a three-dimensional space that represents the real world with a high level of detail, whose realism is determined by the resolution and fidelity of the objects in the task. By contrast, 2D virtual environments represent the tasks with a low degree of realism, using bidimensional graphics techniques. However, the type of visualization can influence the quality of perception of the task, affecting the patient's sensorimotor performance. The purpose of this paper was to evaluate whether there were differences in the patterns of kinematic movements when post-stroke patients performed a reaching task while viewing a virtual therapeutic game with two different types of visualization of the virtual environment: 2D and 3D. Nine post-stroke patients participated in the study, receiving virtual therapy assisted by the PUPArm rehabilitation robot. Horizontal movements of the upper limb were performed to complete the aims of the tasks, which consist in reaching peripheral or perspective targets depending on the virtual environment shown. Various parameters, such as the maximum speed, reaction time, path length, and initial movement, were analyzed from the data acquired objectively by the robotic device to evaluate the influence of the task visualization. At the end of the study, a usability survey was provided to each patient to analyze his/her satisfaction level. For all patients, the movement trajectories were enhanced when they completed the therapy. This fact suggests that the patients' motor recovery was increased. Despite the similarity in the majority of the kinematic parameters, differences in reaction time and path length were higher using the 3D task. Regarding the success rates

  20. A comparative analysis of 2D and 3D tasks for virtual reality therapies based on robotic-assisted neurorehabilitation for post-stroke patients

    Directory of Open Access Journals (Sweden)

    Luis Daniel Lledó

    2016-08-01

    Full Text Available Post-stroke neurorehabilitation based on virtual therapies is performed by completing repetitive exercises shown on visual electronic devices, whose content represents imaginary or daily-life tasks. Currently, there are two ways of visualizing these tasks. 3D virtual environments are used to obtain a three-dimensional space that represents the real world with a high level of detail, whose realism is determined by the resolution and fidelity of the objects in the task. By contrast, 2D virtual environments represent the tasks with a low degree of realism, using bidimensional graphics techniques. However, the type of visualization can influence the quality of perception of the task, affecting the patient's sensorimotor performance. The purpose of this paper was to evaluate whether there were differences in the patterns of kinematic movements when post-stroke patients performed a reaching task while viewing a virtual therapeutic game with two different types of visualization of the virtual environment: 2D and 3D. Nine post-stroke patients participated in the study, receiving virtual therapy assisted by the PUPArm rehabilitation robot. Horizontal movements of the upper limb were performed to complete the aims of the tasks, which consist in reaching peripheral or perspective targets depending on the virtual environment shown. Various parameters, such as the maximum speed, reaction time, path length, and initial movement, were analyzed from the data acquired objectively by the robotic device to evaluate the influence of the task visualization. At the end of the study, a usability survey was provided to each patient to analyze his/her satisfaction level. For all patients, the movement trajectories were enhanced when they completed the therapy. This fact suggests that the patients' motor recovery was increased. Despite the similarity in the majority of the kinematic parameters, differences in reaction time and path length were higher using the 3D task. Regarding

  1. Industrial Fuel Gas Demonstration Plant Program. Volume 1. Demonstration plant environmental analysis (Deliverable No. 27)

    Energy Technology Data Exchange (ETDEWEB)

    Gray, Robert W.; Swift, Richard J.; Krause, Arthur J.; Berkey, Edgar

    1979-08-01

    This environmental report describes the proposed action to construct, test and operate a coal gasification demonstration plant in Memphis, Tennessee, under the co-sponsorship of the Memphis Light, Gas and Water Division (MLGW) and the US Department of Energy (DOE). This document is Volume I of a three-volume Environmental Report. Volume I consists of the Summary, Introduction and the Description of the Proposed Action. Volume II consists of the Description of the Existing Environment. Volume III contains the Environmental Impacts of the Proposed Action, Mitigating Measures and Alternatives to the Proposed Action.

  2. Industrial Fuel Gas Demonstration Plant Program. Volume III. Demonstration plant environmental analysis (Deliverable No. 27)

    Energy Technology Data Exchange (ETDEWEB)

    1979-08-01

    An Environmental Report on the Memphis Light, Gas and Water Division Industrial Fuel Demonstration Plant was prepared for submission to the US Department of Energy under Contract ET-77-C-01-2582. This document is Volume III of a three-volume Environmental Report. Volume I consists of the Summary, Introduction and the Description of the Proposed Action. Volume II consists of the Description of the Existing Environment. Volume III contains the Environmental Impacts of the Proposed Action, Mitigating Measures and Alternatives to the Proposed Action.

  3. Solar Pilot Plant, Phase I. Preliminary design report. Volume II. System description and system analysis. CDRL item 2

    Energy Technology Data Exchange (ETDEWEB)

    None

    1977-05-01

    Honeywell conducted a parametric analysis of the 10-MW(e) solar pilot plant requirements and expected performance and established an optimum system design. The main analytical simulation tools were the optical (ray trace) and the dynamic simulation models. These are described in detail in Books 2 and 3 of this volume under separate cover. In making design decisions, available performance and cost data were used to provide a design reflecting the overall requirements and economics of a commercial-scale plant. This volume contains a description of this analysis/design process and resultant system/subsystem design and performance.

  4. Quantitative gait analysis under dual-task in older people with mild cognitive impairment: a reliability study

    Directory of Open Access Journals (Sweden)

    Gutmanis Iris

    2009-09-01

    Full Text Available Abstract Background: Reliability of quantitative gait assessment while dual-tasking (walking while doing a secondary task such as talking) in people with cognitive impairment is unknown. Dual-tasking gait assessment is becoming highly important for mobility research with older adults, since it better reflects their performance in the basic activities of daily living. Our purpose was to establish the test-retest reliability of assessing quantitative gait variables using an electronic walkway in older adults with mild cognitive impairment (MCI) under single- and dual-task conditions. Methods: The gait performance of 11 elderly individuals with MCI was evaluated using an electronic walkway (GAITRite® System) in two sessions, one week apart. Six gait parameters (gait velocity, step length, stride length, step time, stride time, and double support time) were assessed under two conditions: single-task (sG: usual walking) and dual-task (dG: counting backwards from 100 while walking). Test-retest reliability was determined using the intra-class correlation coefficient (ICC). Gait variability was measured using the coefficient of variation (CoV). Results: Eleven participants (average age = 76.6 years, SD = 7.3) were assessed. They were high functioning (Clinical Dementia Rating Score = 0.5), with a mean Mini-Mental Status Exam (MMSE) score of 28 (SD = 1.56) and a mean Montreal Cognitive Assessment (MoCA) score of 22.8 (SD = 1.23). Under dual-task conditions, mean gait velocity (GV) decreased significantly (sGV = 119.11 ± 20.20 cm/s; dGV = 110.88 ± 19.76 cm/s; p = 0.005). Additionally, under dual-task conditions, higher gait variability was found in stride time, step time, and double support time. Test-retest reliability was high (ICC > 0.85) for the six parameters evaluated under both conditions. Conclusion: In older people with MCI, variability of time-related gait parameters increased with dual-tasking, suggesting cognitive control of gait performance. Assessment of quantitative gait
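
    Test-retest ICCs of this kind can be computed from a two-way ANOVA decomposition. The sketch below implements ICC(3,1) (consistency, single measurement) on simulated session data; the specific ICC form is our assumption, since the abstract does not state which model was used.

        import numpy as np

        def icc_3_1(data):
            """ICC(3,1): two-way mixed effects, consistency, single measurement.

            data: (n_subjects, k_sessions) array of a gait parameter.
            """
            data = np.asarray(data, float)
            n, k = data.shape
            grand = data.mean()
            ms_rows = k * np.sum((data.mean(axis=1) - grand) ** 2) / (n - 1)
            resid = (data - data.mean(axis=1, keepdims=True)
                     - data.mean(axis=0, keepdims=True) + grand)
            ms_err = np.sum(resid**2) / ((n - 1) * (k - 1))
            return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)

        # Example: 11 subjects, 2 sessions of gait velocity (cm/s), strongly correlated
        rng = np.random.default_rng(3)
        base = rng.normal(119.0, 20.0, size=(11, 1))
        sessions = base + rng.normal(0.0, 4.0, size=(11, 2))
        print(icc_3_1(sessions))   # close to 1 => high test-retest reliability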

  5. SemEval-2013 Task 7: The Joint Student Response Analysis and 8th Recognizing Textual Entailment Challenge

    Science.gov (United States)

    2013-06-01

    to bring together researchers in educational NLP technology and textual entailment. The task of giving feedback on student answers requires... from 9 participating teams, and discuss future directions. 1 Introduction One of the tasks in educational NLP systems is providing feedback to... level (Petersen and Ostendorf, 2009; Sheehan et al., 2010; Nelson et al., 2012). In these applications, NLP methods based on shallow features and

  6. Density-viscosity product of small-volume ionic liquid samples using quartz crystal impedance analysis.

    Science.gov (United States)

    McHale, Glen; Hardacre, Chris; Ge, Rile; Doy, Nicola; Allen, Ray W K; MacInnes, Jordan M; Bown, Mark R; Newton, Michael I

    2008-08-01

    Quartz crystal impedance analysis has been developed as a technique to assess whether room-temperature ionic liquids are Newtonian fluids and as a small-volume method for determining the values of their viscosity-density product, ρη. Changes in the impedance spectrum of a 5-MHz fundamental frequency quartz crystal induced by a water-miscible room-temperature ionic liquid, 1-butyl-3-methylimidazolium trifluoromethylsulfonate ([C4mim][OTf]), were measured. From coupled frequency shift and bandwidth changes as the concentration was varied from 0 to 100% ionic liquid, it was determined that this liquid provided a Newtonian response. A second, water-immiscible ionic liquid, 1-butyl-3-methylimidazolium bis(trifluoromethanesulfonyl)imide [C4mim][NTf2], with concentration varied using methanol, was tested and also found to provide a Newtonian response. In both cases, the values of the square root of the viscosity-density product deduced from the small-volume quartz crystal technique were consistent with those measured using a viscometer and density meter. The third harmonic of the crystal was found to provide the closest agreement between the two measurement methods; the pure ionic liquids had the largest difference, of approximately 10%. In addition, 18 pure ionic liquids were tested, and for 11 of these, good-quality frequency shift and bandwidth data were obtained; these all had a Newtonian response. The frequency shift of the third harmonic was found to vary linearly with the square root of the viscosity-density product of the pure ionic liquids up to a value of √(ρη) ≈ 18 kg m^-2 s^-1/2, but with a slope 10% smaller than that predicted by the Kanazawa and Gordon equation. It is envisaged that the quartz crystal technique could be used in a high-throughput microfluidic system for characterizing ionic liquids.
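
    For a crystal face in contact with a Newtonian liquid, the Kanazawa-Gordon equation relates the frequency shift to √(ρη): Δf = -f0^(3/2) · √(ρη / (π ρq μq)), with ρq ≈ 2648 kg/m³ and μq ≈ 2.947 × 10^10 Pa for AT-cut quartz. A minimal sketch inverting it to recover √(ρη) from a measured shift follows; the water example is illustrative.

        import math

        RHO_Q = 2648.0     # density of quartz, kg/m^3
        MU_Q = 2.947e10    # shear modulus of AT-cut quartz, Pa

        def sqrt_rho_eta_from_shift(delta_f, f0):
            """Invert the Kanazawa-Gordon equation
            delta_f = -f0**1.5 * sqrt(rho*eta / (pi * rho_q * mu_q))
            to get sqrt(rho*eta) in kg m^-2 s^-1/2 from a measured shift (Hz)."""
            return -delta_f * math.sqrt(math.pi * RHO_Q * MU_Q) / f0**1.5

        # Example: 5 MHz crystal in water (rho ~ 1000 kg/m^3, eta ~ 1e-3 Pa s,
        # so rho*eta ~ 1.0 kg^2 m^-4 s^-1)
        f0 = 5e6
        expected_shift = -f0**1.5 * math.sqrt(1.0 / (math.pi * RHO_Q * MU_Q))
        print(expected_shift)                               # ~ -714 Hz
        print(sqrt_rho_eta_from_shift(expected_shift, f0))  # ~ 1.0 kg m^-2 s^-1/2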

  7. Cosmetology: Task Analyses. Competency-Based Education.

    Science.gov (United States)

    Henrico County Public Schools, Glen Allen, VA. Virginia Vocational Curriculum Center.

    These task analyses are designed to be used in combination with the "Trade and Industrial Education Service Area Resource" in order to implement competency-based education in the cosmetology program in Virginia. The task analysis document contains the task inventory, suggested task sequence lists, and content outlines for the secondary…

  8. Whole-brain, time-locked activation with simple tasks revealed using massive averaging and model-free analysis

    Science.gov (United States)

    Gonzalez-Castillo, Javier; Saad, Ziad S.; Handwerker, Daniel A.; Inati, Souheil J.; Brenowitz, Noah; Bandettini, Peter A.

    2012-01-01

    The brain is the body's largest energy consumer, even in the absence of demanding tasks. Electrophysiologists report ongoing neuronal firing during stimulation or task in regions beyond those of primary relationship to the perturbation. Although the biological origin of consciousness remains elusive, it is argued that it emerges from complex, continuous whole-brain neuronal collaboration. Despite converging evidence suggesting the whole brain is continuously working and adapting to anticipate and actuate in response to the environment, over the last 20 years task-based functional MRI (fMRI) studies have emphasized a localizationist view of brain function, with fMRI showing only a handful of activated regions in response to task/stimulation. Here, we challenge that view with evidence that under optimal noise conditions, fMRI activations extend well beyond areas of primary relationship to the task, and blood-oxygen-level-dependent signal changes correlated with task timing appear in over 95% of the brain for a simple visual stimulation plus attention control task. Moreover, we show that response shape varies substantially across regions, and that whole-brain parcellations based on those differences produce distributed clusters that are anatomically and functionally meaningful, symmetrical across hemispheres, and reproducible across subjects. These findings highlight the exquisite detail lying in fMRI signals beyond what is normally examined, and emphasize both the pervasiveness of false negatives and how the sparseness of fMRI maps is not a result of localized brain function, but a consequence of high noise and overly strict predictive response models. PMID:22431587

  9. Predictability and Market Efficiency in Agricultural Futures Markets: a Perspective from Price-Volume Correlation Based on Wavelet Coherency Analysis

    Science.gov (United States)

    He, Ling-Yun; Wen, Xing-Chun

    2015-12-01

    In this paper, we use a time-frequency domain technique, namely wavelet squared coherency, to examine the associations between the trading volumes of three agricultural futures and three different forms of these futures' daily closing prices, i.e. prices, returns and volatilities, over the past several years. These agricultural futures markets are selected from China, as a typical case of an emerging economy, and from the US, as a representative of the developed economies. We investigate correlations and lead-lag relationships between the trading volumes and the prices to detect the predictability and efficiency of these futures markets. The results suggest that the information contained in the trading volumes of the three agricultural futures markets in China can be applied to predict the prices or returns, while that in the US has extremely weak predictive power for prices or returns. We also conduct the wavelet analysis on the relationships between the volumes and returns or volatilities to examine the existence of the two "stylized facts" proposed by Karpoff [J. M. Karpoff, The relation between price changes and trading volume: A survey, J. Financ. Quant. Anal. 22(1) (1987) 109-126]. Different markets in the two countries perform differently in reproducing the two stylized facts. As wavelet tools can decode nonlinear regularities and hidden patterns behind the price-volume relationship in time-frequency space, unlike the conventional econometric framework, this paper offers a new perspective on market predictability and efficiency.

  10. Age-related changes in task related functional network connectivity.

    Directory of Open Access Journals (Sweden)

    Jason Steffener

    Full Text Available Aging has a multi-faceted impact on brain structure, brain function and cognitive task performance, but the interaction of these different age-related changes is largely unexplored. We hypothesize that age-related structural changes alter the functional connectivity within the brain, resulting in altered task performance during cognitive challenges. In this neuroimaging study, we used independent components analysis to identify spatial patterns of coordinated functional activity involved in the performance of a verbal delayed item recognition task from 75 healthy young and 37 healthy old adults. Strength of functional connectivity between spatial components was assessed for age group differences and related to speeded task performance. We then assessed whether age-related differences in global brain volume were associated with age-related differences in functional network connectivity. Both age groups used a series of spatial components during the verbal working memory task and the strength and distribution of functional network connectivity between these components differed across the age groups. Poorer task performance, i.e. slower speed with increasing memory load, in the old adults was associated with decreases in functional network connectivity between components comprised of the supplementary motor area and the middle cingulate and between the precuneus and the middle/superior frontal cortex. Advancing age also led to decreased brain volume; however, there was no evidence to support the hypothesis that age-related alterations in functional network connectivity were the result of global brain volume changes. These results suggest that age-related differences in the coordination of neural activity between brain regions partially underlie differences in cognitive performance.

  11. ‘Do I like this person?’ A network analysis of midline cortex during a social preference task

    Science.gov (United States)

    Chen, Ashley C.; Welsh, Robert C.; Liberzon, Israel; Taylor, Stephan F.

    2010-01-01

    Human communication and survival depend on effective social information processing. Abundant behavioral evidence has shown that humans efficiently judge preferences for other individuals, a critical task in social interaction, yet the neural mechanism of this basic social evaluation remains unclear. Using a social-emotional preference task and connectivity analyses (psycho-physiological interaction) of fMRI data, we first demonstrated that cortical midline structures (medial prefrontal and posterior cingulate cortices) and the task-positive network typically implicated in carrying out goal-directed tasks (pre-supplementary motor area, dorsal anterior cingulate and bilateral frontoparietal cortices) were both recruited when subjects made a preference judgment, relative to gender identification, of human faces. Connectivity analyses further showed network interactions among these cortical midline structures, and with the task-positive network, both of which vary as a function of social preference. Overall, the data demonstrate the involvement of cortical midline structures in forming social preference, and provide evidence of network interactions that might reflect a mechanism by which an individual regularly forms and expresses this fundamental decision. PMID:20188190

  12. Molar Volume Analysis of Molten Ni-Al-Co Alloy by Measuring the Density

    Institute of Scientific and Technical Information of China (English)

    XIAO Feng; FANG Liang; FU Yuechao; YANG Lingchuan

    2004-01-01

    The density of molten Ni-Al-Co alloys was measured in the temperature range of 1714-1873 K using a modified pycnometric method, and the molar volume of molten alloys was analyzed. The density of molten Ni-Al-Co alloys was found to decrease with increasing temperature and Co concentration in alloys. The molar volume of molten Ni-Al-Co alloys increases with increasing Co concentration in alloys. The molar volume of molten Ni-Al-Co alloys shows a negative deviation from the linear molar volume.
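
    The quantity analyzed here follows directly from the measured density: the molar volume is the mean molar mass divided by the density, and the reported deviation is measured against the linear (ideal) mix of pure-component molar volumes. The sketch below illustrates the arithmetic with made-up composition, density, and pure-metal values, not the paper's data.

      # Molar volume of a molten alloy from measured density, and its deviation from
      # the ideal (linear) mixing value. All numbers are illustrative.
      M = {"Ni": 58.693, "Al": 26.982, "Co": 58.933}            # g/mol

      def molar_volume(x, density):
          """V_m = (sum_i x_i * M_i) / rho, in cm^3/mol for density in g/cm^3."""
          return sum(x[el] * M[el] for el in x) / density

      x = {"Ni": 0.70, "Al": 0.20, "Co": 0.10}                  # mole fractions
      rho_measured = 7.4                                        # g/cm^3 (hypothetical)
      v_pure = {"Ni": 7.4, "Al": 11.3, "Co": 7.6}               # cm^3/mol (rough liquid values)

      v_m = molar_volume(x, rho_measured)
      v_linear = sum(x[el] * v_pure[el] for el in x)
      print(f"measured {v_m:.2f} cm^3/mol, linear mix {v_linear:.2f}, deviation {v_m - v_linear:+.2f}")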

  13. EFFORTS Sub-task report on task 4.2: Cold forming

    DEFF Research Database (Denmark)

    Andreasen, Jan Lasson; Christensen, Thomas Vennick; Bay, Niels

    1999-01-01

    Task 4.2 is a sub-task of Task 4: Physical modelling validation. In sub-task 4.2, experimental analyses of cold forming with respect to form filling, interface stresses, and forces and moments were carried out using soft model materials.

  14. Study on Task Analysis of Human Factors Engineering in Nuclear Power Plants

    Institute of Scientific and Technical Information of China (English)

    褚雪芹; 陈洁; 崔泽朋; 齐旭

    2016-01-01

    Task analysis, as one of the elements of Human Factors Engineering design, determines the operations the operator must perform to accomplish system functions, together with the control and status-feedback requirements associated with those operations, and thereby supports optimization of the human-machine interface. Based on the Human Factors Engineering review guideline NUREG-0711, this paper studies a method for implementing task analysis, illustrated with the equipment cooling water room ventilation (DVI) system of the Hainan Changjiang nuclear power plant. Explicit task analysis integrates human factors more effectively into system design and operation; with such systematic analysis, the human-system interface design becomes more scientific and rational.

  15. Effects of chronic neck-shoulder pain on normalized mutual information analysis of surface electromyography during functional tasks

    DEFF Research Database (Denmark)

    Madeleine, Pascal; Xie, Yanfei; Szeto, Grace P. Y.;

    2016-01-01

    OBJECTIVE: To investigate the effects of neck-shoulder pain on the connectivity of surface electromyography (SEMG) signals during functional tasks. METHODS: Twenty adults suffering from chronic neck-shoulder pain and 20 healthy controls were recruited. The SEMG signals from the left and right proximal.... Moreover, NMI values in homonymous proximal muscles were higher during texting compared with computer typing with both hands. CONCLUSIONS: Our results show for the first time that chronic neck-shoulder pain affects the functional connectivity of muscle pairs. SIGNIFICANCE: The study furnishes novel...... information about the effects of chronic neck-shoulder pain on the interplay of muscle pairs during functional tasks....
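
    As a hedged sketch of the metric named in the title, the code below estimates normalized mutual information between two synthetic EMG envelope signals by quantile binning their amplitudes; the study's exact NMI estimator and preprocessing may differ.

      import numpy as np
      from sklearn.metrics import normalized_mutual_info_score

      rng = np.random.default_rng(2)
      n = 5000
      # Synthetic rectified EMG envelopes for a muscle pair sharing a common drive
      drive = np.abs(rng.normal(size=n))
      emg_left = drive + 0.5 * np.abs(rng.normal(size=n))
      emg_right = drive + 0.5 * np.abs(rng.normal(size=n))

      def quantile_bins(signal, n_bins=16):
          """Discretize amplitudes into equal-occupancy bins."""
          edges = np.quantile(signal, np.linspace(0, 1, n_bins + 1)[1:-1])
          return np.digitize(signal, edges)

      nmi = normalized_mutual_info_score(quantile_bins(emg_left), quantile_bins(emg_right))
      print(f"NMI between muscle pair: {nmi:.3f}")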

  16. Scale-4 Analysis of Pressurized Water Reactor Critical Configurations: Volume 3-Surry Unit 1 Cycle 2

    Energy Technology Data Exchange (ETDEWEB)

    Bowman, S.M.

    1995-01-01

    The requirements of ANSI/ANS 8.1 specify that calculational methods for away-from-reactor criticality safety analyses be validated against experimental measurements. If credit for the negative reactivity of the depleted (or spent) fuel isotopics is desired, it is necessary to benchmark computational methods against spent fuel critical configurations. This report summarizes a portion of the ongoing effort to benchmark away-from-reactor criticality analysis methods using selected critical configurations from commercial pressurized-water reactors. The analysis methodology selected for all the calculations in this report is based on the codes and data provided in the SCALE-4 code system. The isotopic densities for the spent fuel assemblies in the critical configurations were calculated using the SAS2H analytical sequence of the SCALE-4 system. The sources of data and the procedures for deriving SAS2H input parameters are described in detail. The SNIKR code module was used to extract the necessary isotopic densities from the SAS2H results and to provide the data in the format required by the SCALE criticality analysis modules. The CSASN analytical sequence in SCALE-4 was used to perform resonance processing of the cross sections. The KENO V.a module of SCALE-4 was used to calculate the effective multiplication factor (k{sub eff}) of each case. The SCALE-4 27-group burnup library containing ENDF/B-IV (actinides) and ENDF/B-V (fission products) data was used for all the calculations. This volume of the report documents the SCALE system analysis of two reactor critical configurations for Surry Unit 1 Cycle 2. This unit and cycle were chosen for a previous analysis using a different methodology because detailed isotopics from multidimensional reactor calculations were available from the Virginia Power Company. These data permitted a direct comparison of criticality calculations using the utility-calculated isotopics with those using the isotopics generated by the SCALE-4

  17. Development of a novel high volume band compression injector for the analysis of complex samples like toxaphene pesticide.

    Science.gov (United States)

    Gagné, Jean-Pierre; Gouteux, Bruno; Bertrand, Michel J

    2009-01-16

    A new type of injector has been developed for gas chromatographic analysis. The injector has high volume and band compression (HVBC) capabilities useful for the analysis of complex samples. The injector consists essentially of a packed liner operated at room temperature while a narrow heated zone is used to axially scan the liner, selectively desorbing the compounds of interest. The scanning speed, distance and temperature of the zone are precisely controlled. The liner is connected to an interface which can vent the solvent or any undesirable compounds, and transfer the analytes to an analytical column for separation and quantification. The injector is designed to be compatible with injection volumes from 1 µL to more than 250 µL. At a low sample volume of 1 µL, the injector has competitive performance compared to that of the "on-column" and "split/splitless" injectors for the fatty acid methyl esters and toxaphene compounds tested. For higher volumes, the system produces a linear response according to the injected volume. In this explorative study, the maximum volume injected seems to be limited by the saturation of the chromatographic system instead of being defined by the design of the injector. The HVBC injector can also be used to conduct "in situ" pretreatment of the sample before its transfer to the analytical column. For instance, a toxaphene sample was successively fractionated, using the HVBC injector, into six sub-fractions characterized by simpler chromatograms than the chromatogram of the original mixture. Finally, the ability of the HVBC injector to "freeze" the separation in time, allowing the analyst to complete the analysis at a later time, is also discussed.

  18. Case Study: Using Task Analysis to Determine the Status of Education and Practice of Medical Licentiates for the Provision of Anesthesia in Zambia.

    Science.gov (United States)

    Lwatula, Lastina Tembo; Johnson, Peter; Bowa, Anel; Lusale, David; Nikisi, Joseph; Ndhlovu, Martha; Carr, Catherine

    2015-01-01

    Task analysis methodology was used to identify gaps in the education and practice of Medical Licentiates, a cadre of primary care health providers in Zambia, related to the provision of anesthesia. Findings of the analysis indicate that Medical Licentiates who work in facilities where there are no fully qualified anesthesiologists or physicians often feel obligated to provide these services in order to save lives although they lack sufficient formal education or clinical practice opportunities. The government translated the findings into immediate modifications to the education, training and practice of anesthetic tasks by Medical Licentiates by developing an elective course within the pre-service education program and upgrading the certification of Medical Licentiates to a bachelor's degree.

  19. Cost Analysis of Correctional Standards: Institutional-Based Programs and Parole. Volume II.

    Science.gov (United States)

    Singer, Neil M.; Wright, Virginia B.

    This second of two volumes provides cost guidelines and cost estimation techniques for use by jurisdictions in assessing costs of their own ongoing or contemplated correctional program activities. (Volume I is a companion summary published as a separate document for use by criminal justice policy-makers in need of a reference to the policy issues…

  20. Office for Analysis and Evaluation of Operational Data 1996 annual report. Volume 10, Number 1: Reactors

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-12-01

    This annual report of the US Nuclear Regulatory Commission's Office for Analysis and Evaluation of Operational Data (AEOD) describes activities conducted during 1996. The report is published in three parts. NUREG-1272, Vol. 10, No. 1, covers power reactors and presents an overview of the operating experience of the nuclear power industry from the NRC perspective, including comments about trends of some key performance measures. The report also includes the principal findings and issues identified in AEOD studies over the past year and summarizes information from such sources as licensee event reports and reports to the NRC's Operations Center. NUREG-1272, Vol. 10, No. 2, covers nuclear materials and presents a review of the events and concerns during 1996 associated with the use of licensed material in nonreactor applications, such as personnel overexposures and medical misadministrations. Both reports also contain a discussion of the Incident Investigation Team program and summarize both the Incident Investigation Team and Augmented Inspection Team reports. Each volume contains a list of the AEOD reports issued from CY 1980 through 1996. NUREG-1272, Vol. 10, No. 3, covers technical training and presents the activities of the Technical Training Center in support of the NRC's mission in 1996.

  1. Probabilistic accident consequence uncertainty analysis -- Late health effects uncertain assessment. Volume 2: Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Little, M.P.; Muirhead, C.R. [National Radiological Protection Board (United Kingdom); Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA late health effects models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the expert panel on late health effects, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  2. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for deposited material and external doses. Volume 2: Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Boardman, J. [AEA Technology (United Kingdom); Jones, J.A. [National Radiological Protection Board (United Kingdom); Harper, F.T.; Young, M.L. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA deposited material and external dose models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on deposited material and external doses, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  3. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for internal dosimetry. Volume 2: Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Harrison, J.D. [National Radiological Protection Board (United Kingdom); Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)

    1998-04-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA internal dosimetry models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on internal dosimetry, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  4. Office for Analysis and Evaluation of Operational Data. 1992 annual report: Nonreactors: Volume 7, No. 2

    Energy Technology Data Exchange (ETDEWEB)

    1993-10-01

    The annual report of the US Nuclear Regulatory Commission's Office for Analysis and Evaluation of Operational Data (AEOD) is devoted to the activities performed during 1992. The report is published in two separate parts. NUREG-1272, Vol. 7, No. 1, covers power reactors and presents an overview of the operating experience of the nuclear power industry from the NRC perspective, including comments about the trends of some key performance measures. The report also includes the principal findings and issues identified in AEOD studies over the past year and summarizes information from such sources as licensee event reports, diagnostic evaluations, and reports to the NRC's Operations Center. NUREG-1272, Vol. 7, No. 2, covers nonreactors and presents a review of the events and concerns during 1992 associated with the use of licensed material in nonreactor applications, such as personnel overexposures and medical misadministrations. Both reports also contain a discussion of the Incident Investigation Team program and summarize both the Incident Investigation Team and Augmented Inspection Team reports. Each volume contains a list of the AEOD reports issued for 1981--1992.

  5. A statistical approach to the initial volume problem in Single Particle Analysis by Electron Microscopy.

    Science.gov (United States)

    Sorzano, C O S; Vargas, J; de la Rosa-Trevín, J M; Otón, J; Álvarez-Cabrera, A L; Abrishami, V; Sesmero, E; Marabini, R; Carazo, J M

    2015-03-01

    Cryo Electron Microscopy is a powerful Structural Biology technique, allowing the elucidation of the three-dimensional structure of biological macromolecules. In particular, the structural study of purified macromolecules, often referred to as Single Particle Analysis (SPA), is normally performed through an iterative process that needs a first estimation of the three-dimensional structure, which is progressively refined using experimental data. The local optimisation nature of this refinement is well known, so the initial choice of this first structure may substantially change the final result. Computational algorithms aiming to provide this first structure already exist. However, the question is far from settled and more robust algorithms are still needed so that the refinement process can be performed with sufficient guarantees. In this article we present a new algorithm that addresses the initial volume problem in SPA by setting it in a Weighted Least Squares framework and calculating the weights through a statistical approach based on the cumulative density function of different image similarity measures. We show that the new algorithm is significantly more robust than other state-of-the-art algorithms currently in use in the field. The algorithm is available as part of the software suites Xmipp (http://xmipp.cnb.csic.es) and Scipion (http://scipion.cnb.csic.es) under the name "Significant".
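
    The core idea, weighting each observation by the empirical cumulative density of its image similarity score inside a Weighted Least Squares fit, can be illustrated on a toy regression. This is purely schematic and not the Xmipp implementation: the data, the similarity scores, and the linear model are all stand-ins.

      import numpy as np

      rng = np.random.default_rng(3)
      n = 200
      # Toy problem: fit beta in y = X @ beta, weighting each observation by the
      # empirical CDF of its similarity score (better-matching images count more)
      X = rng.normal(size=(n, 3))
      beta_true = np.array([1.0, -2.0, 0.5])
      y = X @ beta_true + rng.normal(scale=0.5, size=n)

      similarity = rng.uniform(size=n)                 # stand-in image similarity measures
      ranks = np.argsort(np.argsort(similarity))
      weights = (ranks + 1) / n                        # empirical CDF values in (0, 1]

      W = np.diag(weights)
      beta_wls = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
      print("WLS estimate:", np.round(beta_wls, 3))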

  6. Two-dimensional thermal analysis of a fuel rod by finite volume method

    Energy Technology Data Exchange (ETDEWEB)

    Costa, Rhayanne Y.N.; Silva, Mario A.B. da; Lira, Carlos A.B. de O., E-mail: ryncosta@gmail.com, E-mail: mabs500@gmail.com, E-mail: cabol@ufpe.br [Universidade Federal de Pernambuco (UFPE), Recife, PE (Brazil). Departamaento de Energia Nuclear

    2015-07-01

    In a nuclear reactor, the amount of power generation is limited by thermal and physical limitations rather than by nuclear parameters. The operation of a reactor core, even with the best heat removal system, must take into account the fact that the temperatures of fuel and cladding shall not exceed safety limits anywhere in the core. If these limits are not respected, damage to the fuel element may release large quantities of radioactive material into the coolant, or even lead to core meltdown. Thermal analyses of fuel rods are often accomplished by considering the one-dimensional heat diffusion equation. The aim of this study is to determine the temperature distribution for a two-dimensional heat transfer problem in an advanced reactor. The methodology is based on the Finite Volume Method (FVM), which considers a balance for the property of interest. The methodology is validated by comparing numerical and analytical solutions. For the two-dimensional analysis, the results indicate that the temperature profile agrees with the expected physical behaviour, providing quantitative information for the development of advanced reactors. (author)
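
    A minimal sketch of a finite volume balance of this kind: steady two-dimensional conduction with a uniform volumetric source on a square grid, solved by Jacobi iteration with the boundary held at a fixed temperature. Grid size, conductivity, and source strength are placeholders, not the fuel rod model of the paper.

      import numpy as np

      # Steady 2D heat conduction with a uniform volumetric source; the finite volume
      # balance on a uniform grid reduces to T_P = (sum of 4 neighbours + q*d^2/k) / 4
      nx, ny, d = 40, 40, 1e-3          # cells and cell size [m]
      k = 3.0                           # thermal conductivity [W/(m K)]
      q = 3e6                           # volumetric source [W/m^3]
      T = np.full((nx, ny), 600.0)      # initial guess; boundary held at 600 K

      for _ in range(20000):
          Tn = T.copy()
          Tn[1:-1, 1:-1] = 0.25 * (T[2:, 1:-1] + T[:-2, 1:-1] +
                                   T[1:-1, 2:] + T[1:-1, :-2] + q * d * d / k)
          if np.max(np.abs(Tn - T)) < 1e-8:
              T = Tn
              break
          T = Tn

      print(f"peak temperature {T.max():.1f} K at the centre of the domain")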

  7. Probabilistic accident consequence uncertainty analysis -- Early health effects uncertainty assessment. Volume 2: Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Haskin, F.E. [Univ. of New Mexico, Albuquerque, NM (United States); Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Goossens, L.H.J.; Kraan, B.C.P. [Delft Univ. of Technology (Netherlands)

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA early health effects models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on early health effects, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  8. Organizational analysis and safety for utilities with nuclear power plants: an organizational overview. Volume 1. [PWR; BWR

    Energy Technology Data Exchange (ETDEWEB)

    Osborn, R.N.; Olson, J.; Sommers, P.E.; McLaughlin, S.D.; Jackson, M.S.; Scott, W.G.; Connor, P.E.

    1983-08-01

    This two-volume report presents the results of initial research on the feasibility of applying organizational factors in nuclear power plant (NPP) safety assessment. A model is introduced for the purposes of organizing the literature review and showing key relationships among identified organizational factors and nuclear power plant safety. Volume I of this report contains an overview of the literature, a discussion of available safety indicators, and a series of recommendations for more systematically incorporating organizational analysis into investigations of nuclear power plant safety.

  9. Task modeling for collaborative authoring

    NARCIS (Netherlands)

    Veer, van der Gerrit; Kulyk, Olga; Vyas, Dhaval; Kubbe, Onno; Ebert, Achim; Dittmar, A.; Forbrig, P.

    2011-01-01

    Motivation – Task analysis for designing modern collaborative work needs a more fine-grained approach, especially in a complex task domain like collaborative scientific authoring, where there is a single overall goal that can be accomplished only by collaboration between multiple roles, each req

  10. Volume analysis of heat-induced cracks in human molars: A preliminary study

    Directory of Open Access Journals (Sweden)

    Michael A. Sandholzer

    2014-01-01

    Full Text Available Context: Only a few methods have been published dealing with the visualization of heat-induced cracks inside bones and teeth. Aims: As a novel approach this study used nondestructive X-ray microtomography (micro-CT) for volume analysis of heat-induced cracks to observe the reaction of human molars to various levels of thermal stress. Materials and Methods: Eighteen clinically extracted third molars were rehydrated and burned under controlled temperatures (400, 650, and 800°C) using an electric furnace adjusted with a 25°C increase/min. The subsequent high-resolution scans (voxel size 17.7 μm) were made with a compact micro-CT scanner (SkyScan 1174). In total, 14 scans were automatically segmented with Definiens XD Developer 1.2 and three-dimensional (3D) models were computed with Visage Imaging Amira 5.2.2. The results of the automated segmentation were analyzed with an analysis of variance (ANOVA) and uncorrected post hoc least significant difference (LSD) tests using the Statistical Package for Social Sciences (SPSS 17). A probability level of P < 0.05 was used as an index of statistical significance. Results: A temperature-dependent increase of heat-induced cracks was observed between the three temperature groups (P < 0.05, ANOVA post hoc LSD). In addition, the distributions and shape of the heat-induced changes could be classified using the computed 3D models. Conclusion: The macroscopic heat-induced changes observed in this preliminary study correspond with previous observations of unrestored human teeth, yet the current observations also take into account the entire microscopic 3D expansions of heat-induced cracks within the dental hard tissues. Using the same experimental conditions proposed in the literature, this study confirms previous results, adds new observations, and offers new perspectives in the investigation of forensic evidence.
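
    The statistical step described (one-way ANOVA across the three temperature groups followed by uncorrected pairwise tests in the spirit of Fisher's LSD) can be sketched with scipy on synthetic crack-volume data; group means and sample sizes below are invented.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(4)
      # Synthetic crack volumes (mm^3) for the three burning temperatures
      groups = {
          "400C": rng.normal(2.0, 0.6, 6),
          "650C": rng.normal(4.5, 0.8, 6),
          "800C": rng.normal(7.0, 1.0, 6),
      }

      F, p = stats.f_oneway(*groups.values())
      print(f"one-way ANOVA: F={F:.2f}, p={p:.4f}")

      # Uncorrected pairwise t-tests (LSD-style; a strict LSD would pool the ANOVA error term)
      names = list(groups)
      for i in range(len(names)):
          for j in range(i + 1, len(names)):
              t, pt = stats.ttest_ind(groups[names[i]], groups[names[j]])
              print(f"{names[i]} vs {names[j]}: t={t:.2f}, p={pt:.4f}")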

  11. Parameter Analysis of the VPIN (Volume synchronized Probability of Informed Trading) Metric

    Energy Technology Data Exchange (ETDEWEB)

    Song, Jung Heon; Wu, Kesheng; Simon, Horst D.

    2014-03-01

    VPIN (Volume-synchronized Probability of Informed Trading) is a leading indicator of liquidity-induced volatility. It is best known for having produced a signal more than an hour before the Flash Crash of 2010. On that day, the market saw the biggest one-day point decline in the Dow Jones Industrial Average, which culminated in $1 trillion of market value disappearing, only to recover those losses twenty minutes later (Lauricella 2010). The computation of VPIN requires the user to set up a handful of free parameters. The values of these parameters significantly affect the effectiveness of VPIN as measured by the false positive rate (FPR). An earlier publication reported that a brute-force search of simple parameter combinations yielded a number of parameter combinations with an FPR of 7%. This work is a systematic attempt to find an optimal parameter set using an optimization package, NOMAD (Nonlinear Optimization by Mesh Adaptive Direct Search) by Audet, Le Digabel, and Tribes (2009) and Le Digabel (2011). We have implemented a number of techniques to reduce the computation time with NOMAD. Tests show that we can reduce the FPR to only 2%. To better understand the parameter choices, we have conducted a series of sensitivity analyses via uncertainty quantification on the parameter spaces using UQTK (Uncertainty Quantification Toolkit). The results have shown the dominance of two parameters in the computation of FPR. Using the outputs from the NOMAD optimization and the sensitivity analysis, we recommend a range of values for each of the free parameters that perform well on a large set of futures trading records.
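
    The search strategy can be caricatured as follows: score every combination of the free parameters by its false positive rate and keep the best. The sketch below uses a plain grid sweep with a stubbed scoring function standing in for the full VPIN backtest; NOMAD's mesh adaptive direct search plays this role far more efficiently in the actual study, and the parameter names and ranges here are illustrative only.

      import itertools
      import numpy as np

      rng = np.random.default_rng(5)

      def false_positive_rate(bucket_size, window, threshold):
          """Stub: a real study would run the VPIN computation over historical
          trades for these parameters and count false alarms."""
          return float(np.clip(0.10 - 0.01 * window + 0.02 * abs(threshold - 2.0)
                               + 0.005 * rng.normal(), 0.0, 1.0))

      grid = {
          "bucket_size": [50, 100, 200],   # volume per bucket (illustrative)
          "window": [1, 2, 4, 8],          # sample-window length (illustrative)
          "threshold": [1.5, 2.0, 2.5],    # event-trigger level (illustrative)
      }
      scores = {combo: false_positive_rate(*combo)
                for combo in itertools.product(*grid.values())}
      best = min(scores, key=scores.get)
      print("best (bucket_size, window, threshold):", best, "FPR:", round(scores[best], 3))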

  12. Microconcentric ring electrode/injector assembly for sensitive voltammetric analysis in single droplets of ultrasmall volumes.

    Science.gov (United States)

    Kai, Tianhan; Chen, Shu; Monterroso, Estuardo; Hailu, Amanuel; Zhou, Feimeng

    2014-08-19

    This paper describes the construction of a microring electrode concentric to an inner injection capillary for voltammetric determination of trace analytes in nanoliter- to picoliter-sized droplets. The gold microring is sandwiched between a pulled fused-silica capillary and borosilicate glass tubing. Compared to polymer-coated microring electrodes, the glass-encapsulated electrode is more robust and does not swell in organic solvents. Consequently, the microring electrode is suitable for voltammetric studies of redox-active species and their accompanying ion transfers between two immiscible solvents. Droplets of variable sizes can be conveniently dispensed from front-loaded sample plugs into an immiscible liquid, greatly simplifying the experimental procedure and facilitating analysis of samples of limited availability. The size of the microring and the volume of the droplet deduced from well-defined voltammograms correlate well with those estimated from their geometric dimensions. The thin-layer cell behavior can be attained with well-defined voltammetric peaks and small capacitive current. Exhaustive electrolysis in single droplets can be accomplished in short times (e.g., ∼85 s in a droplet of 1.42 nL at a microring of 11.4 μm in radius). Anodic stripping voltammetry of Ag deposited onto the microring electrode resulted in a detection limit of 0.13 fmol (14 fg) of Ag(+). The microring electrode/injector assembly can be polished repeatedly and is versatile for various applications (e.g., sample plugs can also be back-loaded via a rotary injection valve and an HPLC pump for flow injection analysis).

  13. Integrating Information: An Analysis of the Processes Involved and the Products Generated in a Written Synthesis Task

    Science.gov (United States)

    Sole, Isabel; Miras, Mariana; Castells, Nuria; Espino, Sandra; Minguela, Marta

    2013-01-01

    The case study reported here explores the processes involved in producing a written synthesis of three history texts and their possible relation to the characteristics of the texts produced and the degree of comprehension achieved following the task. The processes carried out by 10 final-year compulsory education students (15 and 16 years old) to…

  14. Cognitive Task Analysis and Intelligent Computer-Based Training Systems: Lessons Learned from Coached Practice Environments in Air Force Avionics.

    Science.gov (United States)

    Katz, Sandra N.; Hall, Ellen; Lesgold, Alan

    This paper describes some results of a collaborative effort between the University of Pittsburgh and the Air Force to develop advanced troubleshooting training for F-15 maintenance technicians. The focus is on the cognitive task methodology used in the development of three intelligent tutoring systems to inform their instructional content and…

  15. Exploring General versus Task-Specific Assessments of Metacognition in University Chemistry Students: A Multitrait-Multimethod Analysis

    Science.gov (United States)

    Wang, Chia-Yu

    2015-01-01

    The purpose of this study was to use multiple assessments to investigate the general versus task-specific characteristics of metacognition in dissimilar chemistry topics. This mixed-method approach investigated the nature of undergraduate general chemistry students' metacognition using four assessments: a self-report questionnaire, assessment of…

  16. When Negotiation of Meaning is Also Negotiation of Task: Analysis of the Communication in an Applied Mathematics High School Course.

    Science.gov (United States)

    Christiansen, Iben Maj

    1997-01-01

    The negotiation of meaning presupposes a taken-to-be-shared understanding of a situation. Uses an example to illustrate how negotiation of meaning and task may be linked in ways inappropriate to the learning of modeling and critical reflections. Contains 16 references. (Author/ASK)

  17. Investigating the Validity of an Integrated Listening-Speaking Task: A Discourse-Based Analysis of Test Takers' Oral Performances

    Science.gov (United States)

    Frost, Kellie; Elder, Catherine; Wigglesworth, Gillian

    2012-01-01

    Performance on integrated tasks requires candidates to engage skills and strategies beyond language proficiency alone, in ways that can be difficult to define and measure for testing purposes. While it has been widely recognized that stimulus materials impact test performance, our understanding of the way in which test takers make use of these…

  18. Analysis of High-Frequency Electroencephalographic-Electromyographic Coherence Elicited by Speech and Oral Nonspeech Tasks in Parkinson's Disease

    Science.gov (United States)

    Caviness, John N.; Liss, Julie M.; Adler, Charles; Evidente, Virgilio

    2006-01-01

    Purpose: Corticomuscular electroencephalographic-electromyographic (EEG-EMG) coherence elicited by speech and nonspeech oromotor tasks in healthy participants and those with Parkinson's disease (PD) was examined. Hypotheses were the following: (a) corticomuscular coherence is demonstrable between orbicularis oris (OO) muscles' EMG and scalp EEG…

  19. A Field-Tested Task Analysis for Creating Single-Subject Graphs Using Microsoft[R] Office Excel

    Science.gov (United States)

    Lo, Ya-yu; Konrad, Moira

    2007-01-01

    Creating single-subject (SS) graphs is challenging for many researchers and practitioners because it is a complex task with many steps. Although several authors have introduced guidelines for creating SS graphs, many users continue to experience frustration. The purpose of this article is to minimize these frustrations by providing a field-tested…

  20. Scale-4 Analysis of Pressurized Water Reactor Critical Configurations: Volume 2-Sequoyah Unit 2 Cycle 3

    Energy Technology Data Exchange (ETDEWEB)

    Bowman, S.M.

    1995-01-01

    The requirements of ANSI/ANS 8.1 specify that calculational methods for away-from-reactor criticality safety analyses be validated against experimental measurements. If credit for the negative reactivity of the depleted (or spent) fuel isotopics is desired, it is necessary to benchmark computational methods against spent fuel critical configurations. This report summarizes a portion of the ongoing effort to benchmark away-from-reactor criticality analysis methods using critical configurations from commercial pressurized-water reactors. The analysis methodology selected for all the calculations reported herein is based on the codes and data provided in the SCALE-4 code system. The isotopic densities for the spent fuel assemblies in the critical configurations were calculated using the SAS2H analytical sequence of the SCALE-4 system. The sources of data and the procedures for deriving SAS2H input parameters are described in detail. The SNIKR code module was used to extract the necessary isotopic densities from the SAS2H results and provide the data in the format required by the SCALE criticality analysis modules. The CSASN analytical sequence in SCALE-4 was used to perform resonance processing of the cross sections. The KENO V.a module of SCALE-4 was used to calculate the effective multiplication factor (k{sub eff}) of each case. The SCALE-4 27-group burnup library containing ENDF/B-IV (actinides) and ENDF/B-V (fission products) data was used for all the calculations. This volume of the report documents the SCALE system analysis of three reactor critical configurations for Sequoyah Unit 2 Cycle 3. This unit and cycle were chosen because of their relevance to spent fuel benchmark applications: (1) the unit had a significantly long downtime of 2.7 years during the middle of cycle (MOC) 3, and (2) the core consisted entirely of burned fuel at the MOC restart. The first benchmark critical calculation was the MOC restart at hot, full-power (HFP) critical conditions. The

  1. Organizational analysis and safety for utilities with nuclear power plants: perspectives for organizational assessment. Volume 2. [PWR; BWR

    Energy Technology Data Exchange (ETDEWEB)

    Osborn, R.N.; Olson, J.; Sommers, P.E.; McLaughlin, S.D.; Jackson, M.S.; Nadel, M.V.; Scott, W.G.; Connor, P.E.; Kerwin, N.; Kennedy, J.K. Jr.

    1983-08-01

    This two-volume report presents the results of initial research on the feasibility of applying organizational factors in nuclear power plant (NPP) safety assessment. Volume 1 of this report contains an overview of the literature, a discussion of available safety indicators, and a series of recommendations for more systematically incorporating organizational analysis into investigations of nuclear power plant safety. The six chapters of this volume discuss the major elements in our general approach to safety in the nuclear industry. The chapters include information on organizational design and safety; organizational governance; utility environment and safety related outcomes; assessments by selected federal agencies; review of data sources in the nuclear power industry; and existing safety indicators.

  2. Theoretical and experimental analysis of a multiphase screw pump, handling gas-liquid mixtures with very high gas volume fractions

    Energy Technology Data Exchange (ETDEWEB)

    Raebiger, K. [LEISTRITZ Pumpen GmbH, Nuremberg (Germany); Faculty of Advanced Technology, University of Glamorgan, Pontypridd, Wales (United Kingdom); Maksoud, T.M.A.; Ward, J. [Faculty of Advanced Technology, University of Glamorgan, Pontypridd, Wales (United Kingdom); Hausmann, G. [Department of Mechanical Engineering and Building Services Engineering, University of Applied Sciences, Nuremberg (Germany)

    2008-09-15

    In the investigation of the pumping behaviour of multiphase screw pumps handling gas-liquid mixtures with very high gas volume fractions, theoretical and experimental analyses were performed. A new theoretical screw pump model was developed, which calculates the time-dependent conditions inside the individual chambers of a screw pump as well as the exchange of mass and energy between these chambers. By means of the experimental analysis, the screw pump model was verified, especially at very high gas volume fractions from 90% to 99%. The experiments, which were conducted with the reference fluids water and air, can be divided mainly into the determination of the steady-state pumping behaviour on the one hand and the analysis of selected transient operating conditions on the other, while visualisation of the leakage flows through the circumferential gaps rounded off the experimental analysis. (author)

  3. Impact of Non-Gaussian Error Volumes on Conjunction Assessment Risk Analysis

    Science.gov (United States)

    Ghrist, Richard W.; Plakalovic, Dragan

    2012-01-01

    An understanding of how an initially Gaussian error volume becomes non-Gaussian over time is an important consideration for space-vehicle conjunction assessment. Traditional assumptions applied to the error volume artificially suppress the true non-Gaussian nature of the space-vehicle position uncertainties. For typical conjunction assessment objects, representation of the error volume by a state error covariance matrix in a Cartesian reference frame is a more significant limitation than is the assumption of linearized dynamics for propagating the error volume. In this study, the impact of each assumption is examined and isolated for each point in the volume. Limitations arising from representing the error volume in a Cartesian reference frame are corrected by employing a Monte Carlo approach to the probability of collision (Pc), using equinoctial samples from the Cartesian position covariance at the time of closest approach (TCA) between the pair of space objects. A set of actual, higher-risk (Pc >= 1E-4) conjunction events in various low-Earth orbits is analyzed using Monte Carlo methods. The impact of non-Gaussian error volumes on Pc for these cases is minimal, even when the deviation from a Gaussian distribution is significant.
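
    The Monte Carlo step referenced here reduces, in its simplest Gaussian form, to sampling relative positions at TCA and counting the fraction that fall inside the combined hard-body radius. The sketch below uses an illustrative mean and covariance; the study's refinement is to draw the samples from the propagated (equinoctial) distribution instead of the Cartesian Gaussian.

      import numpy as np

      rng = np.random.default_rng(6)
      # Relative position mean and covariance at TCA (metres; illustrative values)
      mean = np.array([120.0, -40.0, 15.0])
      cov = np.diag([80.0, 200.0, 30.0]) ** 2
      hard_body_radius = 20.0              # combined object radius [m]

      # Gaussian sampling shown; sampling the propagated non-Gaussian distribution
      # instead is what captures the true shape of the error volume
      samples = rng.multivariate_normal(mean, cov, size=2_000_000)
      miss_distance = np.linalg.norm(samples, axis=1)
      pc = np.mean(miss_distance < hard_body_radius)
      print(f"Monte Carlo Pc ~ {pc:.2e}")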

  4. Analysis of the situation of the vacuum in FTU; Analisi della situazione del vuoto di FTU (Resoconto del lavoro svolto dalla task force)

    Energy Technology Data Exchange (ETDEWEB)

    Alessandrini, C.; Angelini, B.; Apicella, M.L.; Mazzitelli, G.; Pirani, S.; Zanza, V. [ENEA, Centro Ricerche Frascati, Rome (Italy). Dipt. Energia

    1999-01-01

    To analyze the vacuum situation in the FTU tokamak, a task force was set up on 22 May 1996 to identify the problem(s) and to settle the operating and cleaning procedures. The main actions of the task force were: leak tests, an automatic procedure to monitor the state of the machine vacuum on line, and an exhaustive analysis of the work done previously. The task force reviewed the outgassing measurements of the plastic materials inserted into the machine, and it was decided to repeat the test on the thermocouples. The results pointed out that the thermocouple cylinders are a practically infinite reservoir of water, replenished at every reopening of the machine. The outcome of the task force was a set of new procedures and recommendations, to be followed before every reopening of the machine and during experimental campaigns with the machine cold; these procedures take into account the fact that the quantity of plastic materials in the vacuum chamber, above all in the ports, has grown over the years. FTU is now operating at more acceptable plasma purity.

  5. Conceptual design and systems analysis of photovoltaic systems. Volume II. Study results. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Kirpich, A.

    1977-03-19

    This investigation of terrestrial PV systems considered the technical and economic feasibility for systems in three size categories: a small system of about 12 kW peak output for on-site residential use; a large 1500 MW central power plant contributing to the bulk energy of a utility system power grid; and an intermediate size system of about 250 kW for use on public or commercial buildings. In each category, conceptual designs were developed, performance was analyzed for a range of climatic regions, economic analyses were performed, and assessments were made of pertinent institutional issues. The report consists of three volumes. Volume I contains a Study Summary of the major study results. This volume contains the detailed results pertaining to on-site residential photovoltaic systems, central power plant photovoltaic systems, and intermediate size systems applied to commercial and public buildings. Volume III contains supporting appendix material. (WHK)

  6. Initial Northwest Power Act Power Sales Contracts : Final Environmental Impact Statement. Volume 1, Environmental Analysis.

    Energy Technology Data Exchange (ETDEWEB)

    United States. Bonneville Power Administration.

    1992-01-01

    This is volume 1 of the final environmental impact statement of the Bonneville Power Administration Information is included on the following: Purpose of and need for action; alternatives including the proposed action; affected environment; and environmental consequences.

  7. [Modeling and analysis of volume conduction based on field-circuit coupling].

    Science.gov (United States)

    Tang, Zhide; Liu, Hailong; Xie, Xiaohui; Chen, Xiufa; Hou, Deming

    2012-08-01

    Numerical simulations of volume conduction can be used to analyze the process of energy transfer and explore the effects of some physical factors on energy transfer efficiency. We analyzed the 3D quasi-static electric field by the finite element method, and developed a 3D coupled field-circuit model of volume conduction based on the coupling between the circuit and the electric field. The model includes a circuit simulation of the volume conduction to provide direct theoretical guidance for energy transfer optimization design. A field-circuit coupling model with circular cylinder electrodes was established on the platform of the software FEM3.5. Based on this, the effects of electrode cross-section area, electrode distance and circuit parameters on the performance of the volume conduction system were obtained, which provided a basis for the optimized design of energy transfer efficiency.

  8. Limb volume measurement: from the past methods to optoelectronic technologies, bioimpedance analysis and laser based devices.

    Science.gov (United States)

    Cavezzi, A; Schingale, F; Elio, C

    2010-10-01

    Accurate measurement of limb volume is considered crucial to lymphedema management. Various non-invasive methods may be used and have been validated in recent years, though suboptimal standardisation has been highlighted in different publications.

  9. A CFD Analysis of Hydrogen Leakage During On-Pad Purge in the ORION/ARES I Shared Volume

    Science.gov (United States)

    Ajmani, Kumud; Edwards, Daryl A.

    2011-01-01

    A common open volume is created by the stacking of the Orion vehicle onto the Ares I Upper Stage. Called the Shared Volume, it receives contributions from both vehicles to its gas, fluid, and thermal environment. One of these environments is related to hazardous hydrogen gas. While both vehicles use inert purge gas to mitigate any hazardous gas buildup, there are concerns that hydrogen gas may still accumulate and that the Ares I Hazardous Gas Detection System will not be sufficient for monitoring the integrated volume. This Computational Fluid Dynamics (CFD) analysis has been performed to examine these topics. Results of the analysis conclude that the Ares I Hazardous Gas Detection System will be able to sample the vent effluent containing the highest hydrogen concentrations. A second conclusion is that hydrogen does not accumulate under the Orion Service Module (SM) avionics ring, as diffusion and purge flow mixing sufficiently dilute the hydrogen to safe concentrations. Finally, the hydrogen concentrations within the Orion SM engine nozzle may slightly exceed the 1 percent volume fraction when the entire worst-case maximum full leak is directed vertically into the engine nozzle.

  10. Development of automated extraction method of biliary tract from abdominal CT volumes based on local intensity structure analysis

    Science.gov (United States)

    Koga, Kusuto; Hayashi, Yuichiro; Hirose, Tomoaki; Oda, Masahiro; Kitasaka, Takayuki; Igami, Tsuyoshi; Nagino, Masato; Mori, Kensaku

    2014-03-01

    In this paper, we propose an automated biliary tract extraction method for abdominal CT volumes. The biliary tract is the path by which bile is transported from the liver to the duodenum. No method has been reported for the automated extraction of the biliary tract from common contrast CT volumes. Our method consists of three steps: (1) extraction of extrahepatic bile duct (EHBD) candidate regions, (2) extraction of intrahepatic bile duct (IHBD) candidate regions, and (3) combination of these candidate regions. The IHBD has linear structures, and intensities of the IHBD are low in CT volumes. We use a dark linear structure enhancement (DLSE) filter based on a local intensity structure analysis method using the eigenvalues of the Hessian matrix for the IHBD candidate region extraction. The EHBD region is extracted using a thresholding process and a connected component analysis. In the combination process, we connect the IHBD candidate regions to each EHBD candidate region and select a bile duct region from the connected candidate regions. We applied the proposed method to 22 cases of CT volumes. The average Dice coefficient of the extraction results was 66.7%.
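
    A crude stand-in for a dark-linear-structure measure of the kind described: build the Hessian of the smoothed volume from repeated gradients, take its eigenvalues, and respond where two eigenvalues are large and positive (a dark tube). This is not the authors' DLSE filter, just a single-scale sketch of the underlying local intensity structure analysis.

      import numpy as np
      from scipy import ndimage

      def dark_line_measure(volume, sigma=1.5):
          """Single-scale dark-tube measure from Hessian eigenvalues: a dark line
          has two large positive eigenvalues and one near zero."""
          smoothed = ndimage.gaussian_filter(volume.astype(float), sigma)
          grads = np.gradient(smoothed)
          H = np.empty(volume.shape + (3, 3))
          for i in range(3):
              gi = np.gradient(grads[i])
              for j in range(3):
                  H[..., i, j] = gi[j]
          lam = np.linalg.eigvalsh(H)          # ascending eigenvalues per voxel
          l1, l2, l3 = lam[..., 0], lam[..., 1], lam[..., 2]
          return np.where((l2 > 0) & (l3 > 0),
                          l2 * np.exp(-np.abs(l1) / (np.abs(l3) + 1e-9)), 0.0)

      # Toy volume: a dark tube along z inside a bright background
      vol = np.full((32, 32, 32), 200.0)
      vol[14:18, 14:18, :] = 50.0
      resp = dark_line_measure(vol)
      print("tube voxel responds more strongly than background:",
            resp[16, 16, 16] > resp[2, 2, 2])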

  11. Quantitative analysis of the corpus callosum in children with cerebral palsy and developmental delay: correlation with cerebral white matter volume

    Energy Technology Data Exchange (ETDEWEB)

    Panigrahy, Ashok [Childrens Hospital Los Angeles, Department of Radiology, Los Angeles, CA (United States); Barnes, Patrick D. [Stanford University Medical Center, Department of Radiology, Lucile Salter Packard Children' s Hospital, Palo Alto, CA (United States); Robertson, Robert L. [Children' s Hospital Boston, Department of Radiology, Boston, MA (United States); Sleeper, Lynn A. [New England Research Institute, Watertown, MA (United States); Sayre, James W. [UCLA Medical Center, Departments of Radiology and Biostatistics, Los Angeles, CA (United States)

    2005-12-01

    This study was conducted to quantitatively correlate the thickness of the corpus callosum with the volume of cerebral white matter in children with cerebral palsy and developmental delay. Material and methods: A clinical database of 70 children with cerebral palsy and developmental delay was established with children between the ages of 1 and 5 years. These children also demonstrated abnormal periventricular T2 hyperintensities associated with and without ventriculomegaly. Mid-sagittal T1-weighted images were used to measure the thickness (genu, mid-body, and splenium) and length of the corpus callosum. Volumes of interest were digitized based on gray-scale densities to define the hemispheric cerebral white matter on axial T2-weighted and FLAIR images. The thickness of the mid-body of the corpus callosum was correlated with cerebral white matter volume. Subgroup analysis was also performed to examine the relationship of this correlation with both gestational age and neuromotor outcome. Statistical analysis was performed using analysis of variance and Pearson correlation coefficients. There was a positive correlation between the thickness of the mid-body of the corpus callosum and the volume of cerebral white matter across all children studied (R=0.665, P=0.0001). This correlation was not dependent on gestational age. The thickness of the mid-body of the corpus callosum was decreased in the spastic diplegia group compared to the two other groups (hypotonia and developmental delay only; P<0.0001). Within each neuromotor subgroup, there was a positive correlation between thickness of the mid-body of the corpus callosum and volume of the cerebral white matter. (orig.)
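
    The reported association is a Pearson correlation between mid-body callosal thickness and cerebral white matter volume; the short scipy sketch below reproduces the computation on synthetic measurements (the effect size is invented, not the paper's R=0.665).

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(7)
      # Synthetic measurements for 70 children (units as in the record)
      wm_volume = rng.normal(350.0, 60.0, 70)                    # white matter volume, cm^3
      cc_thickness = 0.004 * wm_volume + rng.normal(0, 0.4, 70)  # mid-body thickness, mm

      r, p = stats.pearsonr(wm_volume, cc_thickness)
      print(f"Pearson R={r:.3f}, P={p:.2e}")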

  12. Analysis, Evaluation and Improvement of Sequential Single-Item Auctions for the Cooperative Real-Time Allocation of Tasks

    Science.gov (United States)

    2013-03-30

  13. The analysis of subsidence associated with geothermal development. Volume 1. Handbook

    Energy Technology Data Exchange (ETDEWEB)

    Atherton, R.W.; Finnemore, E.J.; Gillam, M.L.

    1976-09-01

    This study evaluates the state of knowledge of subsidence associated with geothermal development, and provides preliminary methods to assess the potential of land subsidence for any specific geothermal site. The results of this study are presented in three volumes. Volume 1 is designed to serve as a concise reference, a handbook, for the evaluation of the potential for land subsidence from the development of geothermal resources.

  14. An Analysis of the Effects of Smartphone Push Notifications on Task Performance with regard to Smartphone Overuse Using ERP

    Directory of Open Access Journals (Sweden)

    Seul-Kee Kim

    2016-01-01

    Full Text Available Smartphones are used ubiquitously worldwide and are essential tools in modern society. However, smartphone overuse is an emerging social issue, and limited studies have objectively assessed this matter. The majority of previous studies have included surveys or behavioral observation studies. Since a previous study demonstrated an association between increased push notifications and smartphone overuse, we investigated the effects of push notifications on task performance. We detected changes in brainwaves generated by smartphone push notifications using the N200 and P300 components of event-related potential (ERP) to investigate both concentration and cognitive ability. ERP assessment indicated that, in both risk and nonrisk groups, the lowest N200 amplitude and the longest latency during task performance were found when push notifications were delivered. Compared to the nonrisk group, the risk group demonstrated lower P300 amplitudes and longer latencies. In addition, the risk group featured a higher rate of error in the Go-Nogo task, due to the negative influence of smartphone push notifications on performance in both risk and nonrisk groups. Furthermore, push notifications affected subsequent performance in the risk group.

  15. Computational modelling and analysis of hippocampal-prefrontal information coding during a spatial decision-making task

    Directory of Open Access Journals (Sweden)

    Thomas eJahans-Price

    2014-03-01

    Full Text Available We introduce a computational model describing rat behaviour and the interactions of neural populations processing spatial and mnemonic information during a maze-based, decision-making task. The model integrates sensory input and implements a working memory to inform decisions at a choice point, reproducing rat behavioural data and predicting the occurrence of turn- and memory-dependent activity in neuronal networks supporting task performance. We tested these model predictions using a new software toolbox (Maze Query Language, MQL) to analyse activity of medial prefrontal cortical (mPFC) and dorsal hippocampal (dCA1) neurons recorded from 6 adult rats during task performance. The firing rates of dCA1 neurons discriminated context (i.e. the direction of the previous turn), whilst a subset of mPFC neurons was selective for current turn direction or context, with some conjunctively encoding both. mPFC turn-selective neurons displayed a ramping of activity on approach to the decision turn, and turn-selectivity in mPFC was significantly reduced during error trials. These analyses complement data from neurophysiological recordings in non-human primates indicating that firing rates of cortical neurons correlate with integration of sensory evidence used to inform decision-making.

  16. Nuclear Dynamics Consequence Analysis (NDCA) for the Disposal of Spent Nuclear Fuel in an Underground Geologic Repository - Volume 3: Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Taylor, L.L.; Wilson, J.R. (INEEL); Sanchez, L.C.; Aguilar, R.; Trellue, H.R.; Cochrane, K. (SNL); Rath, J.S. (New Mexico Engineering Research Institute)

    1998-10-01

    The United States Department of Energy Office of Environmental Management's (DOE/EM's) National Spent Nuclear Fuel Program (NSNFP), through a collaboration between Sandia National Laboratories (SNL) and Idaho National Engineering and Environmental Laboratory (INEEL), is conducting a systematic Nuclear Dynamics Consequence Analysis (NDCA) of the disposal of SNFs in an underground geologic repository sited in unsaturated tuff. This analysis is intended to provide interim guidance to the DOE for the management of the SNF while they prepare for final compliance evaluation. This report presents results from a Nuclear Dynamics Consequence Analysis (NDCA) that examined the potential consequences and risks of criticality during the long-term disposal of spent nuclear fuel owned by DOE-EM. This analysis investigated the potential of post-closure criticality, the consequences of a criticality excursion, and the probability frequency for post-closure criticality. The results of the NDCA are intended to provide the DOE-EM with a technical basis for measuring risk which can be used for screening arguments to eliminate post-closure criticality FEPs (features, events and processes) from consideration in the compliance assessment because of either low probability or low consequences. This report is composed of an executive summary (Volume 1), the methodology and results of the NDCA (Volume 2), and the applicable appendices (Volume 3).

  17. Finite volume analysis of temperature effects induced by active MRI implants: 2. Defects on active MRI implants causing hot spots

    Directory of Open Access Journals (Sweden)

    Grönemeyer Dietrich HW

    2006-05-01

    investigations. The finite volume analysis calculates time-developing temperature maps for the model of a broken linear metallic wire embedded in tissue. Half of the total hot-spot power loss is assumed to diffuse into both wire parts at the location of a defect, and the energy is distributed from there by heat conduction. The effects of blood perfusion and blood flow are additionally taken into account in some simulations, because the simultaneous appearance of all worst-case conditions, especially the absence of blood perfusion and blood flow near the hot spot, is very unlikely for vessel implants. Results The analytical solution for the worst-case scenario, as well as the finite volume analysis for near-worst-case situations, shows non-negligible volumes with critical temperature increases for some of the modeled hot-spot situations. MR investigations with a high RF-pulse density lasting under a minute can establish volumes of several cubic millimeters with temperature increases high enough to initiate cell destruction, and longer exposure times can involve volumes larger than 100 mm³. Temperature increases in the range of thermal ablation are even reached for substantial volumes. MR sequence exposure time and hot-spot power loss are the primary factors influencing the volume with critical temperature increases. Wire radius, wire material, and the physiological parameters blood perfusion and blood flow inside larger vessels reduce the volume with critical temperature increases, but do not exclude a volume with critical tissue heating for resonators with a large product of resonator volume and quality factor. Conclusion The worst-case scenario assumes thermal equilibrium for a hot spot embedded in homogeneous tissue without any cooling due to blood perfusion or flow. The finite volume analysis can calculate the results for conditions near to, but not at, the worst case. For both cases a substantial volume can reach a critical temperature increase in a short time. The analytical solution, as absolute
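
    The transient heat-conduction picture described above can be sketched with an explicit one-dimensional finite-volume update that includes a Pennes-type perfusion sink and a hot-spot source at the defect cell. All material values and the source strength below are illustrative assumptions, not the paper's parameters:

      # Sketch: explicit 1-D finite-volume update for tissue heating around a hot spot,
      # with a Pennes-type blood-perfusion sink. All values are illustrative.
      # Set w_b = 0.0 to mimic the no-perfusion worst case discussed above.
      import numpy as np

      k, rho, c = 0.5, 1050.0, 3600.0   # tissue conductivity (W/m/K), density, heat capacity
      w_b, c_b = 0.5, 3600.0            # perfusion rate (kg/m^3/s), blood heat capacity
      T_a = 37.0                        # arterial temperature (deg C)
      n, dx = 200, 1e-4                 # 200 cells of 0.1 mm
      dt = 0.2 * rho * c * dx**2 / k    # respects the explicit stability bound
      q = np.zeros(n)
      q[n // 2] = 5e7                   # hot-spot power density at the defect (W/m^3)

      T = np.full(n, 37.0)
      for _ in range(int(30.0 / dt)):   # simulate 30 s of RF exposure
          flux = k * np.diff(T) / dx                        # heat flux across faces
          dTdt = (np.diff(flux, prepend=0.0, append=0.0) / dx   # flux divergence
                  + q - w_b * c_b * (T - T_a)) / (rho * c)
          T += dt * dTdt
      print(f"peak temperature after 30 s: {T.max():.1f} deg C")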

  18. Functional Data Analysis in NTCP Modeling: A New Method to Explore the Radiation Dose-Volume Effects

    Energy Technology Data Exchange (ETDEWEB)

    Benadjaoud, Mohamed Amine, E-mail: mohamedamine.benadjaoud@gustaveroussy.fr [Center for Research in Epidemiology and Population Health (CESP) INSERM 1018 Radiation, Epidemiology Group, Villejuif (France); Université Paris sud, Le Kremlin-Bicêtre (France); Institut Gustave Roussy, Villejuif (France); Blanchard, Pierre [Université Paris sud, Le Kremlin-Bicêtre (France); Department of Radiation Oncology, Institut Gustave Roussy, Villejuif (France); Schwartz, Boris [Center for Research in Epidemiology and Population Health (CESP) INSERM 1018 Radiation, Epidemiology Group, Villejuif (France); Université Paris sud, Le Kremlin-Bicêtre (France); Institut Gustave Roussy, Villejuif (France); Champoudry, Jérôme [Department of Radiation Oncology, CHU de la Timone, Marseille (France); Bouaita, Ryan [Department of Radiation Oncology, CHU Henri Mondor, Creteil (France); Lefkopoulos, Dimitri [Department of Radiation Physics, Institut Gustave Roussy, Villejuif (France); Deutsch, Eric [Université Paris sud, Le Kremlin-Bicêtre (France); Department of Radiation Oncology, Institut Gustave Roussy, Villejuif (France); INSERM 1030, Molecular Radiotherapy, Villejuif (France); Diallo, Ibrahima [Center for Research in Epidemiology and Population Health (CESP) INSERM 1018 Radiation, Epidemiology Group, Villejuif (France); Université Paris sud, Le Kremlin-Bicêtre (France); Institut Gustave Roussy, Villejuif (France); Cardot, Hervé [Institut de Mathématiques de Bourgogne, Université de Bourgogne, Dijon (France); and others

    2014-11-01

    Purpose/Objective(s): To describe a novel method to explore radiation dose-volume effects. Functional data analysis is used to investigate the information contained in differential dose-volume histograms. The method is applied to the normal tissue complication probability modeling of rectal bleeding (RB) for patients irradiated in the prostatic bed by 3-dimensional conformal radiation therapy. Methods and Materials: Kernel density estimation was used to estimate the individual probability density functions from each of the 141 rectum differential dose-volume histograms. Functional principal component analysis was performed on the estimated probability density functions to explore the variation modes in the dose distribution. The functional principal components were then tested for association with RB using logistic regression adapted to functional covariates (FLR). For comparison, 3 other normal tissue complication probability models were considered: the Lyman-Kutcher-Burman model, a logistic model based on standard dosimetric parameters (LM), and a logistic model based on multivariate principal component analysis (PCA). Results: The incidence rate of grade ≥2 RB was 14%. V₆₅Gy was the most predictive factor for the LM (P=.058). The best fit for the Lyman-Kutcher-Burman model was obtained with n = 0.12, m = 0.17, and TD₅₀ = 72.6 Gy. In PCA and FLR, the components that describe the interdependence between the relative volumes exposed at intermediate and high doses were the most correlated to the complication. The FLR parameter function leads to a better understanding of the volume effect by including the treatment specificity in the delivered mechanistic information. For RB grade ≥2, patients with advanced age are significantly at risk (odds ratio, 1.123; 95% confidence interval, 1.03-1.22), and the fits of the LM, PCA, and functional principal component analysis models are significantly improved by including this clinical factor. Conclusion: Functional
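
    The analysis chain lends itself to a compact sketch: estimate each patient's differential DVH as a density, extract principal components of the densities, and regress toxicity on the leading scores. Everything below is synthetic, and ordinary multivariate PCA stands in for the functional PCA used in the study:

      # Sketch of the chain: per-patient dose samples -> kernel density estimate of
      # the differential DVH -> PCA over the estimated densities -> logistic
      # regression of toxicity on the leading components. Synthetic data only.
      import numpy as np
      from scipy.stats import gaussian_kde
      from sklearn.decomposition import PCA
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(1)
      n_pat = 141
      grid = np.linspace(0, 80, 200)                      # dose grid in Gy
      densities = np.empty((n_pat, grid.size))
      mean_dose = rng.uniform(30, 70, n_pat)              # per-patient dose level
      for i, m in enumerate(mean_dose):
          doses = rng.normal(m, 8.0, 500)                 # stand-in voxel doses
          densities[i] = gaussian_kde(doses)(grid)        # estimated dDVH density

      # Toy toxicity labels whose risk rises with dose (illustrative only).
      tox = (rng.random(n_pat) < 1 / (1 + np.exp(-(mean_dose - 60) / 4))).astype(int)

      pca = PCA(n_components=3).fit(densities)
      scores = pca.transform(densities)                   # variation modes of the dDVHs
      model = LogisticRegression().fit(scores, tox)
      print("component weights:", model.coef_.round(3))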

  19. Fukushima Daiichi Unit 1 Uncertainty Analysis-Exploration of Core Melt Progression Uncertain Parameters-Volume II.

    Energy Technology Data Exchange (ETDEWEB)

    Denman, Matthew R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Brooks, Dusty Marie [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-08-01

    Sandia National Laboratories (SNL) has conducted an uncertainty analysis (UA) on the Fukushima Daiichi Unit 1 (1F1) accident progression with the MELCOR code. Volume I of the 1F1 UA discusses the physical modeling details and time-history results of the UA. Volume II of the 1F1 UA discusses the statistical viewpoint. The model used was developed for a previous accident-reconstruction investigation jointly sponsored by the US Department of Energy (DOE) and the Nuclear Regulatory Commission (NRC). The goal of this work was to perform a focused evaluation of uncertainty in core damage progression behavior and its effect on key figures of merit (e.g., hydrogen production, fraction of intact fuel, vessel lower head failure), and in doing so assess the applicability of traditional sensitivity analysis techniques.
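
    The statistical side of such an uncertainty analysis can be illustrated generically: sample the uncertain inputs, evaluate the model, and rank-correlate each input with a figure of merit. The toy algebraic response below merely stands in for the MELCOR model; all input names and distributions are invented:

      # Generic sampling-based uncertainty analysis: sample inputs, run the model,
      # rank-correlate inputs with a figure of merit. Toy response, invented inputs.
      import numpy as np
      from scipy.stats import spearmanr

      rng = np.random.default_rng(2)
      n = 500
      zr_ox_rate = rng.lognormal(0.0, 0.3, n)    # uncertain oxidation-rate multiplier
      melt_temp = rng.uniform(2300, 2600, n)     # uncertain melt temperature (K)
      debris_por = rng.uniform(0.3, 0.5, n)      # uncertain debris porosity

      # Toy figure of merit ("hydrogen production") responding to the inputs.
      h2 = (300 * zr_ox_rate - 0.1 * (melt_temp - 2300)
            + 50 * debris_por + rng.normal(0, 20, n))

      for name, x in [("oxidation rate", zr_ox_rate),
                      ("melt temperature", melt_temp),
                      ("debris porosity", debris_por)]:
          rho, p = spearmanr(x, h2)
          print(f"{name:17s} Spearman rho = {rho:+.2f} (p = {p:.1e})")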

  20. High Statistics Analysis using Anisotropic Clover Lattices: (IV) The Volume Dependence of the Light Hadron Masses

    Energy Technology Data Exchange (ETDEWEB)

    Beane, S R; Detmold, W; Lin, H W; Luu, T C; Orginos, K; Parreno, A; Savage, M J; Torok, A; Walker-Loud, A

    2011-07-01

    The volume dependence of the octet baryon masses and relations among them are explored with Lattice QCD. Calculations are performed with an n_f = 2+1 clover fermion discretization in four lattice volumes, with spatial extents L ≈ 2.0, 2.5, 3.0 and 4.0 fm, with an anisotropic lattice spacing of b_s ≈ 0.123 fm in the spatial direction and b_t = b_s/3.5 in the time direction, and at a pion mass of m_π ≈ 390 MeV. The typical precision of the ground-state baryon mass determination is sufficient for a detailed study of the volume dependence of the masses, the Gell-Mann-Okubo mass relation, and other mass combinations. A comparison with the predictions of heavy baryon chiral perturbation theory is performed in both the SU(2)_L ⊗ SU(2)_R and SU(3)_L ⊗ SU(3)_R expansions. Predictions of the three-flavor expansion for the hadron masses are found to describe the observed volume dependences reasonably well. Further, the πNΔ axial coupling constant is extracted from the volume dependence of the nucleon mass in the two-flavor expansion, with only small modifications in the three-flavor expansion from the inclusion of kaons and etas. At a given value of m_π L, the finite-volume contributions to the nucleon mass are predicted to be significantly smaller at m_π ≈ 140 MeV than at m_π ≈ 390 MeV due to a coefficient that scales as ∼ m_π³. This is relevant for the design of future ensembles of lattice gauge-field configurations. Finally, the volume dependence of the pion and kaon masses is analyzed with two-flavor and three-flavor chiral perturbation theory.
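
    The quoted m_π³ scaling corresponds to the schematic leading large-volume behavior of the nucleon mass, written here in a generic form with c an undetermined coefficient (not the paper's fitted value):

      \delta M_N(L) \;\equiv\; M_N(L) - M_N(\infty)
      \;\sim\; c\, m_\pi^{3}\, \frac{e^{-m_\pi L}}{m_\pi L},
      \qquad m_\pi L \gg 1 .

    At fixed m_π L the finite-volume shift is therefore suppressed by the cube of the pion mass, which is the statement made above for m_π ≈ 140 MeV versus m_π ≈ 390 MeV.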

  1. A macroecological analysis of SERA derived forest heights and implications for forest volume remote sensing.

    Science.gov (United States)

    Brolly, Matthew; Woodhouse, Iain H; Niklas, Karl J; Hammond, Sean T

    2012-01-01

    Individual trees have been shown to exhibit strong relationships between DBH, height, and volume. Such studies are often cited as justification for estimating forest volume or standing biomass through remote sensing. With the resolution of common satellite remote sensing systems generally too low to resolve individuals, and a need for larger coverage, these systems rely on descriptive heights, which account for collections of trees in forests. For remote sensing and allometric applications, this height is not entirely understood in terms of its location. Here, a forest growth model (SERA) is used to analyze the relationship between forest canopy height and forest wood volume. Maximum height, mean height, H₁₀₀, and Lorey's height are examined for variability under plant number density, resources, and species. Our findings, shown to be allometrically consistent with empirical measurements for forested communities worldwide, are analyzed for their implications for forest remote sensing techniques such as LiDAR and RADAR. Traditional forestry measures of maximum height, and to a lesser extent H₁₀₀ and Lorey's height, exhibit little consistent correlation with forest volume across the modeled conditions. The implication is that using forest height to infer volume or biomass from remote sensing requires species and community behavioral information to obtain accurate estimates from height alone. Of the height classifications studied, SERA predicts mean height to provide the most consistent relationship with volume, overall and across forest variations. This prediction agrees with empirical data collected from conifer and angiosperm forests with plant densities ranging between 10²-10⁶ plants/hectare and heights of 6-49 m. The height classifications investigated are potentially linked to radar scattering centers, with implications for allometry. These findings may be used to advance the accuracy of forest biomass estimation through remote sensing. Furthermore, Lorey's height with its specific relationship to remote sensing
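
    Height-volume relationships of the kind discussed here are commonly summarized as power-law allometries, which can be fit by linear regression in log-log space. A minimal sketch on synthetic stand data (all coefficients invented for illustration):

      # Sketch: fitting a power-law allometry V = a * H^b between a stand height
      # metric and wood volume by linear regression in log-log space. Synthetic data.
      import numpy as np

      rng = np.random.default_rng(3)
      height = rng.uniform(6, 49, 120)                            # stand heights (m)
      volume = 0.05 * height ** 2.4 * rng.lognormal(0, 0.2, 120)  # toy volumes (m^3/ha)

      b, log_a = np.polyfit(np.log(height), np.log(volume), 1)    # slope = exponent
      print(f"V ~ {np.exp(log_a):.3f} * H^{b:.2f}")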

  2. A macroecological analysis of SERA derived forest heights and implications for forest volume remote sensing.

    Directory of Open Access Journals (Sweden)

    Matthew Brolly

    Full Text Available Individual trees have been shown to exhibit strong relationships between DBH, height, and volume. Such studies are often cited as justification for estimating forest volume or standing biomass through remote sensing. With the resolution of common satellite remote sensing systems generally too low to resolve individuals, and a need for larger coverage, these systems rely on descriptive heights, which account for collections of trees in forests. For remote sensing and allometric applications, this height is not entirely understood in terms of its location. Here, a forest growth model (SERA) is used to analyze the relationship between forest canopy height and forest wood volume. Maximum height, mean height, H₁₀₀, and Lorey's height are examined for variability under plant number density, resources, and species. Our findings, shown to be allometrically consistent with empirical measurements for forested communities worldwide, are analyzed for their implications for forest remote sensing techniques such as LiDAR and RADAR. Traditional forestry measures of maximum height, and to a lesser extent H₁₀₀ and Lorey's height, exhibit little consistent correlation with forest volume across the modeled conditions. The implication is that using forest height to infer volume or biomass from remote sensing requires species and community behavioral information to obtain accurate estimates from height alone. Of the height classifications studied, SERA predicts mean height to provide the most consistent relationship with volume, overall and across forest variations. This prediction agrees with empirical data collected from conifer and angiosperm forests with plant densities ranging between 10²-10⁶ plants/hectare and heights of 6-49 m. The height classifications investigated are potentially linked to radar scattering centers, with implications for allometry. These findings may be used to advance the accuracy of forest biomass estimation through remote sensing. Furthermore, Lorey's height with its specific relationship to

  3. Analysis of airborne radiometric data. Volume 2. Description, listing, and operating instructions for the code DELPHI/MAZAS. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Sperling, M.; Shreve, D.C.

    1978-12-01

    The computer code DELPHI is an interactive English-language command system for the analysis of airborne radiometric data. The code includes modules for data reduction, data simulation, time filtering, data adjustment, and graphical presentation of the results. DELPHI is implemented in FORTRAN on a DEC-10 computer. This volume gives a brief set of operating instructions, samples of the output obtained from hard copies of the display on a Tektronix terminal, and finally a listing of the code.

  4. The interpretation of X-ray computed microtomography images of rocks as an application of volume image processing and analysis

    OpenAIRE

    Kaczmarczyk, J.; Dohnalik, M; Zalewska, J; Cnudde, Veerle

    2010-01-01

    X-ray computed microtomography (CMT) is a non-destructive method of investigating the internal structure of examined objects. During the reconstruction of CMT measurement data, large volume images are generated; image processing and analysis are therefore very important steps in CMT data interpretation. The first step in analyzing the rocks is image segmentation. Differences in density are shown on the reconstructed image as differences in voxel gray level, so the proper threshold...
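
    A common choice for the global threshold in two-phase rock images is Otsu's method; the record does not name its thresholding scheme, so the sketch below is a generic stand-in applied to a synthetic pore/matrix slice:

      # Sketch: global Otsu thresholding as a segmentation step for a CT slice,
      # applied to a synthetic two-phase "rock" image (pores vs. matrix).
      import numpy as np
      from skimage.filters import threshold_otsu

      rng = np.random.default_rng(4)
      slice_ = np.where(rng.random((256, 256)) < 0.3,     # ~30% pore space
                        rng.normal(60, 10, (256, 256)),   # pore gray levels
                        rng.normal(160, 15, (256, 256)))  # matrix gray levels

      t = threshold_otsu(slice_)
      pores = slice_ < t                                  # binary pore mask
      print(f"threshold = {t:.1f}, porosity = {pores.mean():.2%}")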

  5. Analysis of nuclear waste disposal in space, phase 3. Volume 1: Executive summary of technical report

    Science.gov (United States)

    Rice, E. E.; Miller, N. E.; Yates, K. R.; Martin, W. E.; Friedlander, A. L.

    1980-01-01

    The objectives, approach, assumptions, and limitations of a study of nuclear waste disposal in space are discussed with emphasis on the following: (1) payload characterization; (2) safety assessment; (3) health effects assessment; (4) long-term risk assessment; and (5) program planning support to NASA and DOE. Conclusions are presented for each task.

  6. Close Combat Tactical Trainer (CCTT). Cost and Training Effectiveness Analysis (CTEA). Volume 2. Main Report

    Science.gov (United States)

    1991-05-01

    ... (2) develop training strategies for armor and mechanized ... The analysis was conducted in support of a milestone ... the point of impact on the target, and whether the round impacts the ground or an object other than the intended target. The hit detection computations

  7. Sensitivity Analysis of Wavelet Neural Network Model for Short-Term Traffic Volume Prediction

    Directory of Open Access Journals (Sweden)

    Jinxing Shen

    2013-01-01

    Full Text Available In order to achieve a more accurate and robust traffic volume prediction model, the sensitivity of the wavelet neural network model (WNNM) is analyzed in this study. Based on real loop-detector data provided by the traffic police detachment of Maanshan, the WNNM is examined with different numbers of input neurons, different numbers of hidden neurons, and traffic volumes for different time intervals. The test results show that the performance of the WNNM depends heavily on the network parameters and the time interval of the traffic volume. The WNNM with 4 input neurons and 6 hidden neurons is the optimal predictor, with better accuracy, stability, and adaptability, and a much better prediction record is achieved when the time interval of the traffic volume is 15 minutes. Finally, the optimized WNNM is compared with the widely used back-propagation neural network (BPNN). The comparison results indicate that the WNNM produces much lower values of MAE, MAPE, and VAPE than the BPNN, which shows that the WNNM performs better on short-term traffic volume prediction.
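
    A compact stand-in for this model class: discrete-wavelet features of a sliding window feeding a small neural network. A true wavelet neural network embeds wavelets in the activation functions; here DWT features plus an MLP approximate the idea, and the series, lag, and network sizes are illustrative only:

      # Sketch: DWT features of past traffic volumes feeding an MLP regressor,
      # a simplified stand-in for a wavelet neural network. Synthetic 15-min data.
      import numpy as np
      import pywt
      from sklearn.neural_network import MLPRegressor
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(5)
      t = np.arange(2000)                       # 15-min intervals (96 per day)
      series = 300 + 150 * np.sin(2 * np.pi * t / 96) + rng.normal(0, 20, t.size)

      def features(window):
          # Multi-scale decomposition of the window, flattened into one vector.
          return np.concatenate(pywt.wavedec(window, "db2", level=2))

      lag = 32                                  # past 8 hours of 15-min counts
      X = np.array([features(series[i:i + lag]) for i in range(series.size - lag)])
      y = series[lag:]
      split = 1600
      model = make_pipeline(StandardScaler(),
                            MLPRegressor(hidden_layer_sizes=(6,), max_iter=3000,
                                         random_state=0))
      model.fit(X[:split], y[:split])
      mae = np.abs(model.predict(X[split:]) - y[split:]).mean()
      print(f"test MAE: {mae:.1f} vehicles per 15 min")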

  8. Probabilistic accident consequence uncertainty analysis: Food chain uncertainty assessment. Volume 1: Main report

    Energy Technology Data Exchange (ETDEWEB)

    Brown, J. [National Radiological Protection Board (United Kingdom)]; Goossens, L.H.J.; Kraan, B.C.P. [Delft Univ. of Technology (Netherlands)]; and others

    1997-06-01

    This volume is the first of a two-volume document that summarizes a joint project conducted by the US Nuclear Regulatory Commission and the European Commission to assess uncertainties in the MACCS and COSYMA probabilistic accident consequence codes. These codes were developed primarily for estimating the risks presented by nuclear reactors based on postulated frequencies and magnitudes of potential accidents. This document reports on an ongoing project to assess uncertainty in the MACCS and COSYMA calculations for the offsite consequences of radionuclide releases by hypothetical nuclear power plant accidents. A panel of sixteen experts was formed to compile credible and traceable uncertainty distributions for food chain variables that affect calculations of offsite consequences. The expert judgment elicitation procedure and its outcomes are described in these volumes. Other panels were formed to consider uncertainty in other aspects of the codes. Their results are described in companion reports. Volume 1 contains background information and a complete description of the joint consequence uncertainty study. Volume 2 contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures for both panels, (3) the rationales and results for the panels on soil and plant transfer and animal transfer, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  9. Multi-temporal MRI carpal bone volumes analysis by principal axes registration

    Science.gov (United States)

    Ferretti, Roberta; Dellepiane, Silvana

    2016-03-01

    In this paper, a principal axes registration technique is presented and applied to segmented volumes. The purpose of the proposed registration is to compare multi-temporal volumes of carpal bones from Magnetic Resonance Imaging (MRI) acquisitions. Starting from the second-order moment matrix, the eigenvectors are calculated to allow the rotation of volumes with respect to reference axes; the volumes are then spatially translated so that they overlap. A quantitative evaluation of the results is carried out by computing classical indices from the confusion matrix, which provide similarity measures between volumes of the same organ extracted from MRI acquisitions performed at different times. Within the medical field, using registration to compare multi-temporal images is of great interest, since it provides the physician with a tool for visual monitoring of disease evolution. The segmentation method used herein is based on graph theory and is a robust, unsupervised, and parameter-independent method. Patients affected by rheumatic diseases have been considered.
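
    The registration itself is a few lines of linear algebra: the eigenvectors of the second-order moment (covariance) matrix of the voxel coordinates define each volume's principal frame, and expressing both volumes in their own frames aligns them. A sketch on synthetic ellipsoids, with Dice overlap as one possible similarity index of the kind mentioned above:

      # Sketch of principal-axes registration for two binary volumes: align
      # centroids, rotate into the eigenbasis of the second-order moment matrix,
      # then compare overlap. Synthetic blobs stand in for segmented carpal bones.
      import numpy as np

      def principal_frame(vol):
          """Centroid and eigenvectors of the voxel-coordinate covariance matrix."""
          pts = np.argwhere(vol)
          centroid = pts.mean(axis=0)
          _, vecs = np.linalg.eigh(np.cov((pts - centroid).T))
          return centroid, vecs

      def to_canonical(vol):
          """Voxel coordinates expressed in the volume's own principal axes."""
          centroid, vecs = principal_frame(vol)
          return (np.argwhere(vol) - centroid) @ vecs

      def dice(a, b):
          """Dice overlap of two point sets rounded onto the integer grid."""
          sa = {tuple(p) for p in np.round(a).astype(int)}
          sb = {tuple(p) for p in np.round(b).astype(int)}
          return 2 * len(sa & sb) / (len(sa) + len(sb))

      # Two copies of the same ellipsoid, the second rotated and shifted.
      x, y, z = np.mgrid[:64, :64, :64]
      vol1 = ((x - 32) / 18) ** 2 + ((y - 32) / 10) ** 2 + ((z - 32) / 6) ** 2 <= 1
      vol2 = ((y - 40) / 18) ** 2 + ((x - 24) / 10) ** 2 + ((z - 30) / 6) ** 2 <= 1

      d = dice(to_canonical(vol1), to_canonical(vol2))
      print(f"Dice overlap after principal-axes registration: {d:.3f}")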

  10. Stereological analysis of the mediodorsal thalamic nucleus in schizophrenia: volume, neuron number, and cell types

    DEFF Research Database (Denmark)

    Dorph-Petersen, Karl-Anton; Pierri, Joseph N; Sun, Zhuoxin

    2004-01-01

    The mediodorsal thalamic nucleus (MD) is the principal relay nucleus for the prefrontal cortex, a brain region thought to be dysfunctional in schizophrenia. Several, but not all, postmortem studies of the MD in schizophrenia have reported decreased volume and total neuronal number. However, it is not clear whether the findings are specific for schizophrenia, nor is it known which subtypes of thalamic neurons are affected. We studied the left MD in 11 subjects with schizophrenia, 9 control subjects, and 12 subjects with mood disorders. Based on morphological criteria, we divided the neurons into two subclasses, presumably corresponding to projection neurons and local circuit neurons. We estimated MD volume and the neuron number of each subclass using methods based on modern unbiased stereological principles. We also estimated the somal volumes of each subclass using a robust, but biased, approach...

  11. Analysis on volume invariability of metal circular shaft in torsion deformation

    Science.gov (United States)

    Yang, Li-Hong; Zou, Guang-Ping; He, Yun-Zeng; Wang, Hui

    2010-03-01

    Volume invariability of a metal circular shaft was discussed experimentally and theoretically for both small-strain and large-strain torsion deformation. In accordance with elastoplastic theory, it is shown that shear stress does not cause a change of volume even in the large-strain range. Torsion experiments on a solid low-carbon-steel shaft proved that the metal satisfies volume invariability in torsion deformation as long as the cumulative damage is not severe. Volumetric deformation in torsion of a circular shaft was also analyzed from the perspective of micromechanics. Finally, the Swift effect in solid and tubular brass shafts was interpreted using the formulae for the elastoplastic critical load obtained from the double-limb bar model test presented by Shanley.
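
    The small-strain version of the volume-invariance claim follows in one line: the relative volume change is the trace of the strain tensor, and pure shear is traceless.

      \frac{\Delta V}{V} \;=\; \operatorname{tr}\,\boldsymbol{\varepsilon}
      \;=\; \varepsilon_{xx} + \varepsilon_{yy} + \varepsilon_{zz},
      \qquad
      \boldsymbol{\varepsilon}_{\mathrm{shear}} =
      \begin{pmatrix} 0 & \gamma/2 & 0 \\ \gamma/2 & 0 & 0 \\ 0 & 0 & 0 \end{pmatrix}
      \;\Rightarrow\; \frac{\Delta V}{V} = 0 .

    At large strains the analogous statement is plastic incompressibility (the trace of the plastic strain rate vanishes), which is the condition the experiments above examine.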

  12. Image analysis based quantification of bacterial volume change with high hydrostatic pressure.

    Science.gov (United States)

    Pilavtepe-Celik, M; Balaban, M O; Alpas, H; Yousef, A E

    2008-11-01

    Scanning electron microscopy (SEM) images of Staphylococcus aureus 485 and Escherichia coli O157:H7 933 were taken after pressure treatments at 200 to 400 MPa. Software developed for this purpose was used to analyze the SEM images and to calculate the change in view area and volume of the cells. A significant increase in average cell view area and volume for S. aureus 485 was observed in response to pressure treatment at 400 MPa. Cell view area for E. coli O157:H7 933 increased significantly at 325 MPa, the maximum pressure treatment tested against this pathogen. In contrast to S. aureus, cells of E. coli O157:H7 exhibited a significant increase in average view area and volume already at 200 MPa. The pressure-induced increase in these parameters may be attributed to modifications in membrane properties, for example, denaturation of membrane-bound proteins and pressure-induced phase transition of the membrane lipid bilayer.
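
    View area and volume estimates of this kind typically start from a binary cell mask: the area is a pixel count, and the volume can be approximated by treating the rod-shaped cell as a solid of revolution. A sketch on a synthetic mask; the pixel size and the revolution assumption are illustrative choices, not the paper's software:

      # Sketch: estimating cell view area and volume from a binary SEM mask. The
      # volume uses a solid-of-revolution assumption (each column of the mask is
      # taken as the diameter of a circular cross-section), plausible for rods.
      import numpy as np

      px = 0.02                                 # pixel size in micrometers, assumed

      # Synthetic rod-shaped cell mask (an elongated ellipse) as a stand-in.
      y, x = np.mgrid[:200, :200]
      mask = ((x - 100) / 80) ** 2 + ((y - 100) / 25) ** 2 <= 1

      area = mask.sum() * px ** 2               # view area (um^2): pixel count
      widths = mask.sum(axis=0) * px            # cross-sectional width per column (um)
      volume = np.sum(np.pi * (widths / 2) ** 2 * px)   # stack of circular discs

      print(f"view area = {area:.3f} um^2, volume = {volume:.4f} um^3")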

  13. Finite volume element method for analysis of unsteady reaction-diffusion problems

    Institute of Scientific and Technical Information of China (English)

    Sutthisak Phongthanapanich; Pramote Dechaumphai

    2009-01-01

    A finite volume element method is developed for analyzing unsteady scalar reaction-diffusion problems in two dimensions. The method combines concepts employed in the finite volume and finite element methods: the finite volume method is used to discretize the unsteady reaction-diffusion equation, while the finite element method is applied to estimate the gradient quantities at cell faces. The robustness and efficiency of the combined method have been evaluated on uniform rectangular grids using available numerical solutions of two-dimensional reaction-diffusion problems. The numerical solutions demonstrate that the combined method is stable and can provide accurate solutions without spurious oscillation along high-gradient boundary layers.
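
    A one-dimensional analogue shows the finite-volume half of the scheme; the paper's contribution is the finite-element estimation of face gradients in 2-D, for which the sketch below substitutes simple two-point face gradients:

      # Sketch: explicit finite-volume solution of an unsteady reaction-diffusion
      # equation u_t = D u_xx - k u on a uniform 1-D grid, zero-flux boundaries.
      import numpy as np

      D, k = 1e-3, 0.1          # diffusivity and first-order reaction rate
      n, dx = 100, 0.01
      dt = 0.4 * dx ** 2 / D    # respects the explicit diffusion stability bound
      u = np.zeros(n)
      u[45:55] = 1.0            # initial blob of reactant

      for _ in range(500):
          flux = D * np.diff(u) / dx                          # two-point face gradients
          div = np.diff(flux, prepend=0.0, append=0.0) / dx   # zero-flux walls
          u += dt * (div - k * u)                             # cell-average update

      print(f"total mass remaining: {u.sum() * dx:.4f}")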

  14. Analysis of the Microstructure and Permeability of the Laminates with Different Fiber Volume Fraction

    Institute of Scientific and Technical Information of China (English)

    MA Yue; LI Wei; LIANG Zi-qing

    2008-01-01

    Microstructures of laminates produced from epoxy/carbon fibers with different fiber volume fractions were studied by analyzing composite cross-sections. The main results of the compaction of the reinforcement are the flattening of bundle shape, the reduction of gaps, and the embedment of bundles among the layers. The void content outside the bundles decreased sharply during compaction until it was less than that inside the bundles once the fiber volume fraction exceeded 60%. The resin flow velocity in the fiber tow is 10²-10⁴ times greater than the flow velocity outside the fiber tow, whether or not the capillary pressure is taken into account.

  15. A phenomenological analysis of sintering kinetics from the viewpoint of activated volume

    Directory of Open Access Journals (Sweden)

    Nikolić M.V.

    2005-01-01

    Full Text Available The sintering kinetics of real systems has been viewed as a process of transport of activated volume. Activated volume is a parameter that can be used to describe mass transport during the sintering process. It defines the movement of point defects and dislocations during the sintering process. A phenomenological equation has been defined using this parameter, which can be applied to analyze kinetics of the sintering process. It has been applied to analyze the sintering process of several disperse systems. Values obtained for parameters of the equation have also been analyzed.

  16. Multiple Criteria Decision Analysis for Health Care Decision Making--Emerging Good Practices: Report 2 of the ISPOR MCDA Emerging Good Practices Task Force.

    Science.gov (United States)

    Marsh, Kevin; IJzerman, Maarten; Thokala, Praveen; Baltussen, Rob; Boysen, Meindert; Kaló, Zoltán; Lönngren, Thomas; Mussen, Filip; Peacock, Stuart; Watkins, John; Devlin, Nancy

    2016-01-01

    Health care decisions are complex and involve confronting trade-offs between multiple, often conflicting objectives. Using structured, explicit approaches to decisions involving multiple criteria can improve the quality of decision making. A set of techniques, known under the collective heading multiple criteria decision analysis (MCDA), are useful for this purpose. In 2014, ISPOR established an Emerging Good Practices Task Force. The task force's first report defined MCDA, provided examples of its use in health care, described the key steps, and provided an overview of the principal methods of MCDA. This second task force report provides emerging good-practice guidance on the implementation of MCDA to support health care decisions. The report includes: a checklist to support the design, implementation, and review of an MCDA; guidance to support the implementation of the checklist; the order in which the steps should be implemented; an illustration of how to incorporate budget constraints into an MCDA; an overview of the skills and resources, including available software, required to implement MCDA; and future research directions.
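
    The simplest MCDA value model covered by such guidance is the weighted sum: normalize each criterion onto 0-1 with a partial value function, weight, and add. Alternatives, criteria, and weights below are invented for illustration:

      # Sketch: a minimal additive MCDA value model. Normalize each criterion to
      # 0-1, invert criteria where less is better, weight, and sum. Toy data.
      import numpy as np

      alternatives = ["Drug A", "Drug B", "Drug C"]
      criteria = ["efficacy", "safety", "cost"]          # cost: lower is better
      weights = np.array([0.5, 0.3, 0.2])                # elicited importance weights
      scores = np.array([[0.70, 0.60, 12000.0],
                         [0.55, 0.90,  8000.0],
                         [0.80, 0.40, 20000.0]])

      lo, hi = scores.min(axis=0), scores.max(axis=0)
      norm = (scores - lo) / (hi - lo)                   # linear partial value functions
      norm[:, 2] = 1 - norm[:, 2]                        # invert cost (less is better)

      overall = norm @ weights
      for alt, v in sorted(zip(alternatives, overall), key=lambda p: -p[1]):
          print(f"{alt}: {v:.2f}")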

  17. Perception adapts via top-down regulation to task repetition: A Lotka-Volterra-Haken modeling analysis of experimental data.

    Science.gov (United States)

    Frank, T D

    2016-03-01

    Two experiments are reported in which participants perceived different physical quantities: size and speed. The perceptual tasks were performed in the context of motor performance problems. Participants perceived the size of objects in order to grasp the objects single handed or with both hands. Likewise, participants perceived the speed of a moving treadmill in order to control walking or running at that speed. In both experiments, the perceptual tasks were repeatedly performed by the participants while the to-be-perceived quantity was gradually varied from small to large objects (Experiment 1) and from low to high speeds (Experiment 2). Hysteresis with negative sign was found when participants were not allowed to execute the motor component, that is, when the execution stage was decoupled from the planning stage. No such effect was found in the control condition, when participants were allowed to execute the motor action. Using a Lotka-Volterra-Haken model for two competing neural populations, it is argued that the observations are consistent with the notion that the repetitions induce an adaptation effect of the perceptual system via top-down regulation. Moreover, the amount of synaptic modulation involved in the adaptation is estimated from participant data.
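
    The competition dynamics behind the model can be sketched with a generic two-population Lotka-Volterra system in which strong cross-inhibition yields winner-take-all behavior; the parameters below are illustrative, not fitted to the participant data:

      # Sketch: two competing neural populations in a generic Lotka-Volterra-style
      # winner-take-all model. Growth rates and coupling are illustrative only.
      import numpy as np
      from scipy.integrate import solve_ivp

      lam1, lam2, g = 1.0, 0.9, 1.5     # growth rates and competitive coupling (g > 1)

      def rhs(t, a):
          a1, a2 = a
          return [a1 * (lam1 - a1 - g * a2),
                  a2 * (lam2 - a2 - g * a1)]

      sol = solve_ivp(rhs, (0, 50), [0.01, 0.012])
      a1, a2 = sol.y[:, -1]
      print(f"winner: population {'1' if a1 > a2 else '2'} (a1={a1:.2f}, a2={a2:.2f})")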

  18. Selection of Mother Wavelet Functions for Multi-Channel EEG Signal Analysis during a Working Memory Task

    Directory of Open Access Journals (Sweden)

    Noor Kamal Al-Qazzaz

    2015-11-01

    Full Text Available We performed a comparative study to select the efficient mother wavelet (MWT) basis functions that optimally represent the signal characteristics of the electrical activity of the human brain during a working memory (WM) task recorded through electro-encephalography (EEG). Nineteen EEG electrodes were placed on the scalp following the 10-20 system. These electrodes were then grouped into five recording regions corresponding to the scalp area of the cerebral cortex. Sixty-second WM task data were recorded from ten control subjects. Forty-five MWT basis functions from orthogonal families were investigated, including Daubechies (db1-db20), Symlets (sym1-sym20), and Coiflets (coif1-coif5). Using ANOVA, we determined the MWT basis functions with the most significant differences in the ability of the five scalp regions to maximize their cross-correlation with the EEG signals. The best results were obtained using "sym9" across the five scalp regions. Therefore, the MWT most compatible with the EEG signals should be selected to achieve wavelet denoising, decomposition, reconstruction, and sub-band feature extraction. This study provides a reference for the selection of efficient MWT basis functions.
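
    One concrete way to run such a comparison is to denoise the signal with each candidate mother wavelet and keep the one whose reconstruction correlates best with the recording. The sketch below uses a universal soft threshold and a synthetic signal in place of task EEG; this selection criterion is a simplified stand-in for the study's cross-correlation/ANOVA procedure:

      # Sketch: pick the mother wavelet whose thresholded reconstruction correlates
      # best with the signal. Synthetic "EEG" stands in for the recordings.
      import numpy as np
      import pywt

      rng = np.random.default_rng(6)
      t = np.linspace(0, 4, 1024)
      eeg = (np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 4 * t)
             + rng.normal(0, 0.8, t.size))            # stand-in WM-task EEG

      candidates = ([f"db{i}" for i in range(1, 21)]
                    + [f"sym{i}" for i in range(2, 21)]   # sym1 is not defined in PyWavelets
                    + [f"coif{i}" for i in range(1, 6)])

      best = None
      for name in candidates:
          coeffs = pywt.wavedec(eeg, name, level=4)
          sigma = np.median(np.abs(coeffs[-1])) / 0.6745     # noise from finest details
          thr = sigma * np.sqrt(2 * np.log(eeg.size))        # universal threshold
          den = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
          rec = pywt.waverec(den, name)[: eeg.size]
          r = np.corrcoef(eeg, rec)[0, 1]
          if best is None or r > best[1]:
              best = (name, r)
      print(f"best-matching mother wavelet: {best[0]} (r = {best[1]:.3f})")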

  19. Scale-4 analysis of pressurized water reactor critical configurations: Volume 5, North Anna Unit 1 Cycle 5

    Energy Technology Data Exchange (ETDEWEB)

    Bowman, S.M. [Oak Ridge National Lab., TN (United States)]; Suto, T. [Power Reactor and Nuclear Fuel Development Corp., Tokyo (Japan); Oak Ridge National Lab., TN (United States)]

    1996-10-01

    ANSI/ANS 8.1 requires that calculational methods for away-from-reactor (AFR) criticality safety analyses be validated against experiment. This report summarizes part of the ongoing effort to benchmark AFR criticality analysis methods using selected critical configurations from commercial PWRs. Codes and data in the SCALE-4 code system were used. This volume documents the SCALE system analysis of one reactor critical configuration for North Anna Unit 1 Cycle 5. The KENO V.a criticality calculations for the North Anna 1 Cycle 5 beginning-of-cycle model yielded a value for k_eff of 1.0040 ± 0.0005.

  20. Analysis of air-toxics emissions, exposures, cancer risks and controllability in five urban areas. Volume 2. Controllability analysis and results

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, J.; Coleman, B.; Laich, E.; Powell, R.

    1990-04-01

    The report (Volume 2) documents the second phase of a study to define the urban air toxics problem and to discern what combination of control measures can best be employed to mitigate it. Volume 1 of the study documented the base-year analysis (nominally the year 1980), involving dispersion modeling of emissions data for 25 carcinogenic air toxics in five U.S. urban areas and a subsequent assessment of estimated aggregate cancer incidence. The Volume 2 report applies various control strategies and analyzes the resulting reduction in aggregate cancer incidence that would occur between 1980 and 1995. Control scenarios consisted of (1) efforts to reduce air toxics emissions that were underway at the time of the study, (2) efforts expected to occur by 1995, mainly national standards then under development, and (3) a series of selected, more rigorous controls.