WorldWideScience

Sample records for analysis task volume

  1. Underground Test Area Subproject Phase I Data Analysis Task. Volume VII - Tritium Transport Model Documentation Package

    Energy Technology Data Exchange (ETDEWEB)

    None

    1996-12-01

    Volume VII of the documentation for the Phase I Data Analysis Task performed in support of the current Regional Flow Model, Transport Model, and Risk Assessment for the Nevada Test Site Underground Test Area Subproject contains the tritium transport model documentation. Because of the size and complexity of the model area, a considerable quantity of data was collected and analyzed in support of the modeling efforts. The data analysis task was consequently broken into eight subtasks, and descriptions of each subtask's activities are contained in one of the eight volumes that comprise the Phase I Data Analysis Documentation.

  2. Underground Test Area Subproject Phase I Data Analysis Task. Volume VIII - Risk Assessment Documentation Package

    Energy Technology Data Exchange (ETDEWEB)

    None

    1996-12-01

    Volume VIII of the documentation for the Phase I Data Analysis Task performed in support of the current Regional Flow Model, Transport Model, and Risk Assessment for the Nevada Test Site Underground Test Area Subproject contains the risk assessment documentation. Because of the size and complexity of the model area, a considerable quantity of data was collected and analyzed in support of the modeling efforts. The data analysis task was consequently broken into eight subtasks, and descriptions of each subtask's activities are contained in one of the eight volumes that comprise the Phase I Data Analysis Documentation.

  3. Underground Test Area Subproject Phase I Data Analysis Task. Volume VI - Groundwater Flow Model Documentation Package

    Energy Technology Data Exchange (ETDEWEB)

    None

    1996-11-01

    Volume VI of the documentation for the Phase I Data Analysis Task performed in support of the current Regional Flow Model, Transport Model, and Risk Assessment for the Nevada Test Site Underground Test Area Subproject contains the groundwater flow model data. Because of the size and complexity of the model area, a considerable quantity of data was collected and analyzed in support of the modeling efforts. The data analysis task was consequently broken into eight subtasks, and descriptions of each subtask's activities are contained in one of the eight volumes that comprise the Phase I Data Analysis Documentation.

  4. Human factors evaluation of teletherapy: Function and task analysis. Volume 2

    International Nuclear Information System (INIS)

    As a treatment methodology, teletherapy selectively destroys cancerous and other tissue by exposure to an external beam of ionizing radiation. Sources of radiation are either a radioactive isotope, typically Cobalt-60 (Co-60), or a linear accelerator. Records maintained by the NRC have identified instances of teletherapy misadministration where the delivered radiation dose has differed from the radiation prescription (e.g., instances where fractions were delivered to the wrong patient, to the wrong body part, or were too great or too little with respect to the defined treatment volume). Both human error and machine malfunction have led to misadministrations. Effective and safe treatment requires a concern for precision and consistency of human-human and human-machine interactions throughout the course of therapy. The present study is the first part of a series of human factors evaluations for identifying the root causes that lead to human error in the teletherapy environment. The human factors evaluations included: (1) a function and task analysis of teletherapy activities, (2) an evaluation of the human-system interfaces, (3) an evaluation of procedures used by teletherapy staff, (4) an evaluation of the training and qualifications of treatment staff (excluding the oncologists), (5) an evaluation of organizational practices and policies, and (6) an identification of problems and alternative approaches for NRC and industry attention. The present report addresses the function and task analysis of teletherapy activities and provides the foundation for the conduct of the subsequent evaluations. The report includes sections on background, methodology, a description of the function and task analysis, and use of the task analysis findings for the subsequent tasks. The function and task analysis data base also is included.

  5. Human factors evaluation of teletherapy: Function and task analysis. Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    Kaye, R.D.; Henriksen, K.; Jones, R. [Hughes Training, Inc., Falls Church, VA (United States); Morisseau, D.S.; Serig, D.I. [Nuclear Regulatory Commission, Washington, DC (United States). Div. of Systems Technology

    1995-07-01

    As a treatment methodology, teletherapy selectively destroys cancerous and other tissue by exposure to an external beam of ionizing radiation. Sources of radiation are either a radioactive isotope, typically Cobalt-60 (Co-60), or a linear accelerator. Records maintained by the NRC have identified instances of teletherapy misadministration where the delivered radiation dose has differed from the radiation prescription (e.g., instances where fractions were delivered to the wrong patient, to the wrong body part, or were too great or too little with respect to the defined treatment volume). Both human error and machine malfunction have led to misadministrations. Effective and safe treatment requires a concern for precision and consistency of human-human and human-machine interactions throughout the course of therapy. The present study is the first part of a series of human factors evaluations for identifying the root causes that lead to human error in the teletherapy environment. The human factors evaluations included: (1) a function and task analysis of teletherapy activities, (2) an evaluation of the human-system interfaces, (3) an evaluation of procedures used by teletherapy staff, (4) an evaluation of the training and qualifications of treatment staff (excluding the oncologists), (5) an evaluation of organizational practices and policies, and (6) an identification of problems and alternative approaches for NRC and industry attention. The present report addresses the function and task analysis of teletherapy activities and provides the foundation for the conduct of the subsequent evaluations. The report includes sections on background, methodology, a description of the function and task analysis, and use of the task analysis findings for the subsequent tasks. The function and task analysis data base also is included.

  6. Human factors evaluation of remote afterloading brachytherapy. Volume 2, Function and task analysis

    International Nuclear Information System (INIS)

    A human factors project on the use of nuclear by-product material to treat cancer using remotely operated afterloaders was undertaken by the Nuclear Regulatory Commission. The purpose of the project was to identify factors that contribute to human error in the system for remote afterloading brachytherapy (RAB). This report documents the findings from the first phase of the project, which involved an extensive function and task analysis of RAB. This analysis identified the functions and tasks in RAB, made preliminary estimates of the likelihood of human error in each task, and determined the skills needed to perform each RAB task. The findings of the function and task analysis served as the foundation for the remainder of the project, which evaluated four major aspects of the RAB system linked to human error: human-system interfaces; procedures and practices; training and qualifications of RAB staff; and organizational practices and policies. At its completion, the project identified and prioritized areas for recommended NRC and industry attention based on all of the evaluations and analyses.

  7. Human factors evaluation of remote afterloading brachytherapy. Volume 2, Function and task analysis

    Energy Technology Data Exchange (ETDEWEB)

    Callan, J.R.; Gwynne, J.W. III; Kelly, T.T.; Muckler, F.A. [Pacific Science and Engineering Group, San Diego, CA (United States); Saunders, W.M.; Lepage, R.P.; Chin, E. [University of California San Diego Medical Center, CA (United States). Div. of Radiation Oncology; Schoenfeld, I.; Serig, D.I. [Nuclear Regulatory Commission, Washington, DC (United States). Div. of Systems Technology

    1995-05-01

    A human factors project on the use of nuclear by-product material to treat cancer using remotely operated afterloaders was undertaken by the Nuclear Regulatory Commission. The purpose of the project was to identify factors that contribute to human error in the system for remote afterloading brachytherapy (RAB). This report documents the findings from the first phase of the project, which involved an extensive function and task analysis of RAB. This analysis identified the functions and tasks in RAB, made preliminary estimates of the likelihood of human error in each task, and determined the skills needed to perform each RAB task. The findings of the function and task analysis served as the foundation for the remainder of the project, which evaluated four major aspects of the RAB system linked to human error: human-system interfaces; procedures and practices; training and qualifications of RAB staff; and organizational practices and policies. At its completion, the project identified and prioritized areas for recommended NRC and industry attention based on all of the evaluations and analyses.

  8. Subcortical volume analysis in traumatic brain injury: the importance of the fronto-striato-thalamic circuit in task switching.

    Science.gov (United States)

    Leunissen, Inge; Coxon, James P; Caeyenberghs, Karen; Michiels, Karla; Sunaert, Stefan; Swinnen, Stephan P

    2014-02-01

    Traumatic brain injury (TBI) is associated with neuronal loss, diffuse axonal injury and executive dysfunction. Whereas executive dysfunction has traditionally been associated with prefrontal lesions, ample evidence suggests that those functions requiring behavioral flexibility critically depend on the interaction between frontal cortex, basal ganglia and thalamus. To test whether structural integrity of this fronto-striato-thalamic circuit can account for executive impairments in TBI we automatically segmented the thalamus, putamen and caudate of 25 patients and 21 healthy controls and obtained diffusion weighted images. We assessed components of executive function using the local-global task, which requires inhibition, updating and switching between actions. Shape analysis revealed localized atrophy of the limbic, executive and rostral-motor zones of the basal ganglia, whereas atrophy of the thalami was more global in TBI. This subcortical atrophy was related to white matter microstructural organization in TBI, suggesting that axonal injuries possibly contribute to subcortical volume loss. Global volume of the nuclei showed no clear relationship with task performance. However, the shape analysis revealed that participants with smaller volume of those subregions that have connections with the prefrontal cortex and rostral motor areas showed higher switch costs and mixing costs, and made more errors while switching. These results support the idea that flexible cognitive control over action depends on interactions within the fronto-striato-thalamic circuit.
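
    For reference, the switch and mixing costs mentioned above are conventionally defined as reaction-time (RT) differences; these are the standard task-switching definitions, not formulas given in the abstract:

        \[
        \text{switch cost} = \overline{\mathrm{RT}}_{\text{switch trials}} - \overline{\mathrm{RT}}_{\text{repeat trials}},
        \qquad
        \text{mixing cost} = \overline{\mathrm{RT}}_{\text{repeat trials, mixed block}} - \overline{\mathrm{RT}}_{\text{single-task block}}
        \]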

  9. Blade loss transient dynamics analysis, volume 1. Task 1: Survey and perspective. [aircraft gas turbine engines

    Science.gov (United States)

    Gallardo, V. C.; Gaffney, E. F.; Bach, L. J.; Stallone, M. J.

    1981-01-01

    An analytical technique was developed to predict the behavior of a rotor system subjected to sudden unbalance. The technique is implemented in the Turbine Engine Transient Rotor Analysis (TETRA) computer program using the component element method. The analysis was particularly aimed toward blade-loss phenomena in gas turbine engines. A dual-rotor, casing, and pylon structure can be modeled by the computer program. Blade tip rubs, Coriolis forces, and mechanical clearances are included. The analytical system was verified by modeling and simulating actual test conditions for a rig test as well as a full-engine, blade-release demonstration.
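
    As illustrative background (not taken from the report itself), the sudden unbalance driving such a transient is the rotating centrifugal force of the released blade mass; assuming a blade of mass m released at effective radius r on a rotor spinning at angular speed \omega,

        \[
        F_{\text{unbalance}} = m\, r\, \omega^{2},
        \]

    so a hypothetical 2 kg blade at r = 0.5 m and \omega = 1000 rad/s produces a rotating force of about 10^6 N, which is why casing and pylon response, blade tip rubs, and mechanical clearances must be modeled.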

  10. Skill Components of Task Analysis

    Science.gov (United States)

    Adams, Anne E.; Rogers, Wendy A.; Fisk, Arthur D.

    2013-01-01

    Some task analysis methods break down a task into a hierarchy of subgoals. Although task analysis is an important tool in many fields of study, learning to create such a hierarchy (redescription) is not trivial. To further the understanding of what makes task analysis a skill, the present research examined novices' problems with learning Hierarchical Task…

  11. Genetic Inventory Task Final Report. Volume 2

    Science.gov (United States)

    Venkateswaran, Kasthuri; LaDuc, Myron T.; Vaishampayan, Parag

    2012-01-01

    Contaminant terrestrial microbiota could profoundly impact the scientific integrity of extraterrestrial life-detection experiments. It is therefore important to know what organisms persist on spacecraft surfaces so that their presence can be eliminated or discriminated from authentic extraterrestrial biosignatures. Although there is a growing understanding of the biodiversity associated with spacecraft and cleanroom surfaces, it remains challenging to assess the risk of these microbes confounding life-detection or sample-return experiments. A key challenge is to provide a comprehensive inventory of microbes present on spacecraft surfaces. To assess the phylogenetic breadth of microorganisms on spacecraft and associated surfaces, the Genetic Inventory team used three technologies: conventional cloning techniques, PhyloChip DNA microarrays, and 454 tag-encoded pyrosequencing, together with a methodology to systematically collect, process, and archive nucleic acids. These three analysis methods yielded considerably different results: Traditional approaches provided the least comprehensive assessment of microbial diversity, while PhyloChip and pyrosequencing illuminated more diverse microbial populations. The overall results stress the importance of selecting sample collection and processing approaches based on the desired target and required level of detection. The DNA archive generated in this study can be made available to future researchers as genetic-inventory-oriented technologies further mature.

  12. Task analysis and support for problem solving tasks

    International Nuclear Information System (INIS)

    This paper is concerned with Task Analysis as the basis for ergonomic design to reduce human error rates, rather than for predicting human error rates. Task Analysis techniques usually provide a set of categories for describing sub-tasks, and a framework describing the relations between sub-tasks. Both the task type categories and their organisation have implications for optimum interface and training design. In this paper, the framework needed for considering the most complex tasks faced by operators in process industries is discussed, such as fault management in unexpected situations, and what is likely to minimise human error in these circumstances. (author)

  13. [Environmental investigation of ground water contamination at Wright-Patterson Air Force Base, Ohio]. Volume 3, Sampling and analysis plan (SAP): Phase 1, Task 4, Field Investigation: Draft

    Energy Technology Data Exchange (ETDEWEB)

    1991-10-01

    In April 1990, Wright-Patterson Air Force Base (WPAFB) initiated an investigation to evaluate a potential Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) removal action to prevent, to the extent practicable, the offsite migration of contaminated ground water from WPAFB. WPAFB retained the services of the Environmental Management Operations (EMO) and its principal subcontractor, International Technology Corporation (IT), to complete Phase 1 of the environmental investigation of ground-water contamination at WPAFB. Phase 1 of the investigation involves the short-term evaluation and potential design for a program to remove ground-water contamination that appears to be migrating across the western boundary of Area C, and across the northern boundary of Area B along Springfield Pike. Primarily, Task 4 of Phase 1 focuses on collection of information at the Area C and Springfield Pike boundaries of WPAFB. This Sampling and Analysis Plan (SAP) has been prepared to assist in completion of the Task 4 field investigation and comprises the Quality Assurance Project Plan (QAPP) and the Field Sampling Plan (FSP).

  14. Analysis of the structural parameters that influence gas production from the Devonian shale. Volume 1. Executive Summary and Task Reports. Annual progress report

    Energy Technology Data Exchange (ETDEWEB)

    Shumaker, R.C.; de Wys, J.N.; Dixon, J.M.

    1978-10-01

    The first portion of the report, from the Executive Summary (page 1) through the Schedule of Milestones (page 10), gives a general overview which highlights our progress and problems for the second year. The Task report portion of the text, written by individual task investigators, is designed primarily for scientists interested in technical details of the second year's work. The second portion of the report consists of appendices of data compiled by the principal investigators.

  15. A Brief Analysis of Communication Tasks in Task-based Teaching

    Institute of Scientific and Technical Information of China (English)

    Xu Xiaoying

    2011-01-01

    Task-Based Language Teaching (TBLT) aims at providing opportunities for the learners to experiment with and explore both spoken and written language through learning activities. This passage further examines whether the following four communicative tasks can assist classroom learning: jigsaw tasks, role-play tasks, problem-solving tasks, and information-gap tasks.

  16. An ergonomic task analysis of spinal anaesthesia.

    LENUS (Irish Health Repository)

    Ajmal, Muhammad

    2009-12-01

    Ergonomics is the study of physical interaction between humans and their working environment. The objective of this study was to characterize the performance of spinal anaesthesia in an acute hospital setting, applying ergonomic task analysis.

  17. Task Descriptions in Diagnostic Radiology. Research Report No. 7. Volume 1, Medical Tasks: What the Radiologist Does.

    Science.gov (United States)

    Gilpatrick, Eleanor

    The first of four volumes in Research Report No. 7 of the Health Services Mobility Study (HSMS), this book contains 143 task descriptions covering most of the medical activities carried out by diagnostic radiologists. (The work carried out by radiologic technologists, and administrative, machine-related, and nursing-type functions are found in…

  18. Job and task analysis for technical staff

    International Nuclear Information System (INIS)

    In September of 1989, Cooper Nuclear Station began a project to upgrade the Technical Staff Training Program. The project's roots lay in performing a Job and Task Analysis for Technical Staff. While the industry has long been committed to Job and Task Analysis to target performance-based instruction for single job positions, this approach was unique in that such analysis was not originally considered appropriate for a group as diverse as Technical Staff. Much to the author's satisfaction, the Job and Task Analysis Project proved much less complicated for Technical Staff than he had imagined. The benefits of performing the Job and Task Analysis for Technical Staff have become increasingly obvious as he pursues lesson plan development and course revisions. The outline for this presentation is as follows: philosophy adopted; preparation of the job survey document; performing the job analysis; performing task analysis for technical staff and associated pitfalls; clustering objectives for training and comparison to the existing program; benefits now and in the future; final phase (comparison to INPO guides and meeting the needs of non-degreed engineering professionals); and conclusion. By focusing on performance-based needs for engineers rather than traditional academics for training, the author is confident that the future Technical Staff Program will meet the challenges ahead and will exceed requirements for accreditation.

  19. Task force on compliance and enforcement. Final report. Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    1978-03-01

    Recommendations for measures to strengthen the FEA enforcement program in the area of petroleum price regulation are presented. Results of task force efforts are presented in report and recommendations sections concerned with pending cases, compliance program organization, enforcement powers, compliance strategy, and audit staffing and techniques. (JRD)

  20. A Cognitive Task Analysis for Dental Hygiene.

    Science.gov (United States)

    Cameron, Cheryl A.; Beemsterboer, Phyllis L.; Johnson, Lynn A.; Mislevy, Robert J.; Steinberg, Linda S.; Breyer, F. Jay

    2000-01-01

    As part of the development of a scoring algorithm for a simulation-based dental hygiene initial licensure examination, this effort conducted a task analysis of the dental hygiene domain. Broad classes of behaviors that distinguish along the dental hygiene expert-novice continuum were identified and applied to the design of nine paper-based cases…

  1. Radiation protection technician job task analysis manual

    International Nuclear Information System (INIS)

    This manual was developed to assist all DOE contractors in the design and conduct of job task analysis (JTA) for the radiation protection technician. Experience throughout the nuclear industry and the DOE system has indicated that the quality and efficiency in conducting a JTA at most sites is greatly enhanced by using a generic task list for the position, and clearly written guidelines on the JTA process. This manual is designed to provide this information for personnel to use in developing and conducting site-specific JTAs. (VC)

  2. 49 CFR 236.1043 - Task analysis and basic requirements.

    Science.gov (United States)

    2010-10-01

    ... of work, etc.), task(s), and desired success rate; (2) Based on a formal task analysis, identify the... performance necessary to perform each task; (5) Develop a training and evaluation curriculum that includes... curriculum and pass an examination that covers the PTC system and appropriate rules and tasks for which...

  3. Final report on the Pathway Analysis Task

    International Nuclear Information System (INIS)

    The Pathway Analysis Task constituted one of several multi-laboratory efforts to estimate radiation doses to people, considering all important pathways of exposure, from the testing of nuclear devices at the Nevada Test Site (NTS). The primary goal of the Pathway Analysis Task was to predict radionuclide ingestion by residents of Utah, Nevada, and portions of seven other adjoining western states following radioactive fallout deposition from individual events at the NTS. This report provides comprehensive documentation of the activities and accomplishments of Colorado State University's Pathway Analysis Task during the entire period of support (1979--91). The history of the project will be summarized, indicating the principal dates and milestones, personnel involved, subcontractors, and budget information. Accomplishments, both primary and auxiliary, will be summarized with general results rather than technical details being emphasized. This will also serve as a guide to the reports and open literature publications produced, where the methodological details and specific results are documented. Selected examples of results on internal dose estimates are provided in this report because the data have not been published elsewhere.

  4. Final report on the Pathway Analysis Task

    Energy Technology Data Exchange (ETDEWEB)

    Whicker, F.W.; Kirchner, T.B. [Colorado State Univ., Fort Collins, CO (United States)

    1993-04-01

    The Pathway Analysis Task constituted one of several multi-laboratory efforts to estimate radiation doses to people, considering all important pathways of exposure, from the testing of nuclear devices at the Nevada Test Site (NTS). The primary goal of the Pathway Analysis Task was to predict radionuclide ingestion by residents of Utah, Nevada, and portions of seven other adjoining western states following radioactive fallout deposition from individual events at the NTS. This report provides comprehensive documentation of the activities and accomplishments of Colorado State University's Pathway Analysis Task during the entire period of support (1979--91). The history of the project will be summarized, indicating the principal dates and milestones, personnel involved, subcontractors, and budget information. Accomplishments, both primary and auxiliary, will be summarized with general results rather than technical details being emphasized. This will also serve as a guide to the reports and open literature publications produced, where the methodological details and specific results are documented. Selected examples of results on internal dose estimates are provided in this report because the data have not been published elsewhere.

  5. Research and development of a heat-pump water heater. Volume 2. R and D task reports

    Energy Technology Data Exchange (ETDEWEB)

    Dunning, R.L.; Amthor, F.R.; Doyle, E.J.

    1978-08-01

    The heat pump water heater is a device that works much like a window air conditioner except that heat from the home is pumped into a water tank rather than to the outdoors. The objective established for the device is to operate with a Coefficient of Performance (COP) of 3; that is, an input of one unit of electric energy would create three units of heat energy in the form of hot water. With such a COP, the device would use only one-third the energy, and run at one-third the cost, of a standard resistance water heater. This Volume 2 contains the final reports of the three major tasks performed in Phase I. In Task 2, a market study identifies the future market and selects an initial target market and channel of distribution, all based on an analysis of the parameters affecting feasibility of the device and the factors that will affect its market acceptance. In the Task 3 report, the results of a design and test program to arrive at final designs of heat pumps for both new water heaters and for retrofitting existing water heaters are presented. In the Task 4 report, a plan for an extensive field demonstration involving use in actual homes is presented. Volume 1 contains a final summary report of the information in Volume 2.
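
    To make the COP arithmetic explicit (a worked restatement of the claim above, not additional data from the report):

        \[
        \mathrm{COP} = \frac{Q_{\text{hot water}}}{W_{\text{electric}}} = 3
        \quad\Longrightarrow\quad
        W_{\text{electric}} = \frac{Q_{\text{hot water}}}{3},
        \]

    whereas a standard resistance heater (COP of about 1) must supply W = Q; hence the one-third energy use and one-third operating cost.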

  6. Data analysis & probability task & drill sheets

    CERN Document Server

    Cook, Tanya

    2011-01-01

    For grades 3-5, our State Standards-based combined resource meets the data analysis & probability concepts addressed by the NCTM standards and encourages your students to review the concepts in unique ways. The task sheets introduce the mathematical concepts to the students around a central problem taken from real-life experiences, while the drill sheets provide warm-up and timed practice questions for the students to strengthen their procedural proficiency skills. Included in our resource are activities to help students learn how to collect, organize, analyze, interpret, and predict data pro

  7. Task Decomposition in Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald Laurids [Idaho National Laboratory; Joe, Jeffrey Clark [Idaho National Laboratory

    2014-06-01

    In the probabilistic safety assessments (PSAs) used in the nuclear industry, human failure events (HFEs) are determined as a subset of hardware failures, namely those hardware failures that could be triggered by human action or inaction. This approach is top-down, starting with hardware faults and deducing human contributions to those faults. Elsewhere, more traditionally human factors driven approaches would tend to look at opportunities for human errors first in a task analysis and then identify which of those errors is risk significant. The intersection of top-down and bottom-up approaches to defining HFEs has not been carefully studied. Ideally, both approaches should arrive at the same set of HFEs. This question remains central as human reliability analysis (HRA) methods are generalized to new domains like oil and gas. The HFEs used in nuclear PSAs tend to be top-down—defined as a subset of the PSA—whereas the HFEs used in petroleum quantitative risk assessments (QRAs) are more likely to be bottom-up—derived from a task analysis conducted by human factors experts. The marriage of these approaches is necessary in order to ensure that HRA methods developed for top-down HFEs are also sufficient for bottom-up applications.

  8. A cognitive task analysis of the SGTR scenario

    International Nuclear Information System (INIS)

    This report constitutes a contribution to the NKS/RAK-1:3 project on Integrated Sequence Analysis. Following the meeting at Ringhals, the work was proposed to be performed by the following three steps: Task 1. Cognitive Task Analysis of the E-3 procedure. Task 2. Evaluation and revision of task analysis with Ringhals/KSU experts. Task 3. Integration with simulator data. The Cognitive Task Analysis (CTA) of Task 1 uses the Goals-Means Task Analysis (GMTA) method to identify the sequence of tasks and task steps necessary to achieve the goals of the procedure. It is based on material supplied by Ringhals, which describes the E-3 procedure, including the relevant ES and ECA procedures. The analysis further outlines the cognitive demands profile associated with individual task steps as well as with the task as a whole, as an indication of the nominal task load. The outcome of the cognitive task analysis provides a basis for proposing an adequate event tree. This report describes the results from Task 1. The work has included a two-day meeting between the three contributors, as well as the exchange of intermediate results and comments throughout the period. After the initial draft of the report was prepared, an opportunity was given to observe the SGTR scenario in a full-scope training simulator, and to discuss the details with the instructors. This led to several improvements from the initial draft. (EG)

  9. Cognitive task analysis: Techniques applied to airborne weapons training

    Energy Technology Data Exchange (ETDEWEB)

    Terranova, M.; Seamster, T.L.; Snyder, C.E.; Treitler, I.E. (Oak Ridge National Lab., TN (USA); Carlow Associates, Inc., Fairfax, VA (USA); Martin Marietta Energy Systems, Inc., Oak Ridge, TN (USA); Tennessee Univ., Knoxville, TN (USA))

    1989-01-01

    This is an introduction to cognitive task analysis as it may be used in Naval Air Systems Command (NAVAIR) training development. The focus of a cognitive task analysis is human knowledge, and its methods of analysis are those developed by cognitive psychologists. This paper explains the role of cognitive task analysis in that development and presents the findings from a preliminary cognitive task analysis of airborne weapons operators. Cognitive task analysis is a collection of powerful techniques that are quantitative, computational, and rigorous. The techniques are currently not in wide use in the training community, so examples of this methodology are presented along with the results. 6 refs., 2 figs., 4 tabs.

  10. Task analysis methods applicable to control room design review (CDR)

    International Nuclear Information System (INIS)

    This report presents the results of a research study conducted in support of the human factors engineering program of the Atomic Energy Control Board in Canada. It contains five products which may be used by the Atomic Energy Control Board in relation to Task Analysis of jobs in CANDU nuclear power plants: 1. a detailed method for preparing for a task analysis; 2. a Task Data Form for recording task analysis data; 3. a detailed method for carrying out task analyses; 4. a guide to assessing alternative methods for performing task analyses, if such are proposed by utilities or consultants; and 5. an annotated bibliography on task analysis. In addition, a short explanation of the origins, nature and uses of task analysis is provided, with some examples of its cost effectiveness. 35 refs

  11. Cue Representation and Situational Awareness in Task Analysis

    Science.gov (United States)

    Carl, Diana R.

    2009-01-01

    Task analysis in human performance technology is used to determine how human performance can be well supported with training, job aids, environmental changes, and other interventions. Early work by Miller (1953) and Gilbert (1969, 1974) addressed cue processing in task execution and recommended cue descriptions in task analysis. Modern task…

  12. IEA Wind Task 24 Integration of Wind and Hydropower Systems; Volume 2: Participant Case Studies

    Energy Technology Data Exchange (ETDEWEB)

    Acker, T.

    2011-12-01

    This report describes the background, concepts, issues and conclusions related to the feasibility of integrating wind and hydropower, as investigated by the members of IEA Wind Task 24. It is the result of a four-year effort involving seven IEA member countries and thirteen participating organizations. The companion report, Volume 2, describes in detail the study methodologies and participant case studies, and exists as a reference for this report.

  13. Analysis of Task-based Syllabus

    Institute of Scientific and Technical Information of China (English)

    马进胜

    2011-01-01

    Task-based language teaching is very popular in modern English teaching. It is based on the task-based syllabus, which focuses on the learners' communicative competence and stresses learning by doing. Starting from the theoretical assumptions and definitions of the task, the paper analyzes the components of a task and then points out the merits and demerits of the syllabus. In this way, the paper may give some tips to teachers and students when they use task-based language teaching.

  14. Workplace for analysis of task performance

    NARCIS (Netherlands)

    Bos, J; Mulder, LJM; van Ouwerkerk, RJ; Maarse, FJ; Akkerman, AE; Brand, AN; Mulder, LJM

    2003-01-01

    In current research on mental workload and task performance, a large gap exists between laboratory-based studies and research projects in real-life working practice. Tasks conducted within a laboratory environment often lack a strong resemblance to real-life working situations. This paper presents

  15. Job and task analysis: a view from the inside

    International Nuclear Information System (INIS)

    This paper is not intended to describe how to perform a Job and Task Analysis. There are a wide variety of approaches to conducting a Job and Task Analysis, many of which have been developed by highly seasoned and skilled professionals in this field. This paper is intended to discuss the internal support, in terms of money, time, and people, required for the Job and Task Analysis Project.

  16. Guidelines for job and task analysis for DOE nuclear facilities

    International Nuclear Information System (INIS)

    The guidelines are intended to be responsive to the need for information on methodology, procedures, content, and use of job and task analysis since the establishment of a requirement for position task analysis for Category A reactors in DOE 5480.1A, Chapter VI. The guide describes the general approach and methods currently being utilized in the nuclear industry and by several DOE contractors for the conduct of job and task analysis and applications to the development of training programs or evaluation of existing programs. In addition, other applications for job and task analysis are described, including: operating procedures development, personnel management, system design, communications, and human performance predictions.

  17. Task Analysis as a Resource for Strengthening Health Systems.

    Science.gov (United States)

    Hart, Leah J; Carr, Catherine; Fullerton, Judith T

    2016-01-01

    Task analysis is a descriptive study methodology that has wide application in the health professions. Task analysis is particularly useful in assessment and definition of the knowledge, skills, and behaviors that define the scope of practice of a health profession or occupation. Jhpiego, a US-based nongovernmental organization, has adapted traditional task analysis methods in several countries in assessment of workforce education and practice issues. Four case studies are presented to describe the utility and adaptability of the task analysis approach. Traditional task analysis field survey methods were used in assessment of the general and maternal-child health nursing workforce in Mozambique that led to curriculum redesign, reducing the number of education pathways from 4 to 2. The process of health system strengthening in Liberia, following a long history of civil war conflict, included a traditional task analysis study conducted among 119 registered nurses and 46 certified midwives who had graduated in the last 6 months to 2 years to determine gaps in education and preparation. An innovative approach for data collection that involves "playing cards" to document participant opinions (Task Master, Mining for Data) was developed by Jhpiego for application in other countries. Results of a task analysis involving 54 nurses and 100 nurse-midwives conducted in Lesotho were used to verify the newly drafted scope and standards of practice for nurses and to inform planning for a competency-based preservice curriculum for nursing. The Nursing and Midwifery Council developed a 100-question licensing examination for new graduates following a task analysis in Botswana. The task analysis process in each country resulted in recommendations that were action oriented and were implemented by the country governments. For maximal utility and ongoing impact, a task analysis study should be repeated on a periodic basis and more frequently in countries undergoing rapid change in

  18. The development of a task analysis method applicable to the tasks of nuclear power plant

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, Wan Chul; Park, Ji Soo; Baek, Dong Hyeon; Ham, Dong Han; Kim, Huhn [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1996-07-01

    While task analysis is one of the essential processes for human factors studies, traditional methods reveal weaknesses in dealing with the cognitive aspects, which become more critical in modern complex systems. This report proposes a cognitive task analysis (CTA) method for identifying cognitive requirements of operators' tasks in nuclear power plants. The proposed CTA method is characterized by an information-oriented concept and a procedure-based approach. The task prescription identifies the information requirements and traces the information flow to reveal the cognitive organization of the task procedure, with emphasis on the relations among the information requirements. The cognitive requirements are then analyzed in terms of the cognitive span of task information, the cognitive envelope and working memory relief point of the procedures, and working memory load. The proposed method is relatively simple and, possibly being incorporated in a full task analysis scheme, directly applicable to the design/evaluation of human-machine interfaces and operating procedures. A prototype of a computerized support system was developed to support the practicality of the proposed method. (Author) 104 refs., 8 tabs., 7 figs.

  19. Swept Volume Parameterization for Isogeometric Analysis

    Science.gov (United States)

    Aigner, M.; Heinrich, C.; Jüttler, B.; Pilgerstorfer, E.; Simeon, B.; Vuong, A.-V.

    Isogeometric Analysis uses NURBS representations of the domain for performing numerical simulations. The first part of this paper presents a variational framework for generating NURBS parameterizations of swept volumes. The class of these volumes covers a number of interesting free-form shapes, such as blades of turbines and propellers, ship hulls or wings of airplanes. The second part of the paper reports the results of isogeometric analysis which were obtained with the help of the generated NURBS volume parameterizations. In particular we discuss the influence of the chosen parameterization and the incorporation of boundary conditions.
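
    For context, a trivariate NURBS volume of the kind such a parameterization produces has the standard form (standard NURBS notation, not an excerpt from the paper):

        \[
        \mathbf{V}(u,v,w) =
        \frac{\sum_{i,j,k} w_{ijk}\, N_i^p(u)\, N_j^q(v)\, N_k^r(w)\, \mathbf{P}_{ijk}}
             {\sum_{i,j,k} w_{ijk}\, N_i^p(u)\, N_j^q(v)\, N_k^r(w)},
        \]

    where the \mathbf{P}_{ijk} are control points, the w_{ijk} weights, and the N's B-spline basis functions of degrees p, q, r; in isogeometric analysis the same basis is then used to represent the numerical solution on the volume.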

  20. A Task-Content Analysis of an Introductory Entomology Curriculum.

    Science.gov (United States)

    Brandenburg, R.

    Described is an analysis of the content, tasks, and strategies needed by students to enable them to identify insects to order by sight and to family by use of a standard dichotomous taxonomic key. Tasks and strategies are broken down and arranged progressively in the approximate order in which students should progress. Included are listings of…

  1. A Task that Elicits Reasoning: A Dual Analysis

    Science.gov (United States)

    Yankelewitz, Dina; Mueller, Mary; Maher, Carolyn A.

    2010-01-01

    This paper reports on the forms of reasoning elicited as fourth grade students in a suburban district and sixth grade students in an urban district worked on similar tasks involving reasoning with the use of Cuisenaire rods. Analysis of the two data sets shows similarities in the reasoning used by both groups of students on specific tasks, and the…

  2. Toward a cognitive task analysis for biomedical query mediation.

    Science.gov (United States)

    Hruby, Gregory W; Cimino, James J; Patel, Vimla; Weng, Chunhua

    2014-01-01

    In many institutions, data analysts use a Biomedical Query Mediation (BQM) process to facilitate data access for medical researchers. However, understanding of the BQM process is limited in the literature. To bridge this gap, we performed the initial steps of a cognitive task analysis using 31 BQM instances conducted between one analyst and 22 researchers in one academic department. We identified five top-level tasks, i.e., clarify research statement, explain clinical process, identify related data elements, locate EHR data element, and end BQM with either a database query or unmet, infeasible information needs, and 10 sub-tasks. We evaluated the BQM task model with seven data analysts from different clinical research institutions. Evaluators found all the tasks completely or semi-valid. This study contributes initial knowledge towards the development of a generalizable cognitive task representation for BQM. PMID:25954589

  3. Sentiment Analysis of Suicide Notes: A Shared Task.

    Science.gov (United States)

    Pestian, John P; Matykiewicz, Pawel; Linn-Gust, Michelle; South, Brett; Uzuner, Ozlem; Wiebe, Jan; Cohen, K Bretonnel; Hurdle, John; Brew, Christopher

    2012-01-30

    This paper reports on a shared task involving the assignment of emotions to suicide notes. Two features distinguished this task from previous shared tasks in the biomedical domain. One is that it resulted in the corpus of fully anonymized clinical text and annotated suicide notes. This resource is permanently available and will (we hope) facilitate future research. The other key feature of the task is that it required categorization with respect to a large set of labels. The number of participants was larger than in any previous biomedical challenge task. We describe the data production process and the evaluation measures, and give a preliminary analysis of the results. Many systems performed at levels approaching the inter-coder agreement, suggesting that human-like performance on this task is within the reach of currently available technologies.
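
    As a concrete illustration of the kind of scoring such a shared task requires (a sketch under assumed conventions, not the task's official scorer), micro-averaged F1 over multi-label emotion assignments can be computed as follows; the emotion labels shown are hypothetical examples:

        # Micro-averaged F1 for multi-label assignments: pool true positives,
        # false positives, and false negatives over all annotated units.
        def micro_f1(gold, pred):
            """gold, pred: lists of label sets, one set per annotated unit."""
            tp = sum(len(g & p) for g, p in zip(gold, pred))
            fp = sum(len(p - g) for g, p in zip(gold, pred))
            fn = sum(len(g - p) for g, p in zip(gold, pred))
            precision = tp / (tp + fp) if tp + fp else 0.0
            recall = tp / (tp + fn) if tp + fn else 0.0
            denom = precision + recall
            return 2 * precision * recall / denom if denom else 0.0

        gold = [{"hopelessness"}, {"love", "guilt"}]
        pred = [{"hopelessness"}, {"love"}]
        print(round(micro_f1(gold, pred), 3))  # 0.8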

  4. 49 CFR 236.923 - Task analysis and basic requirements.

    Science.gov (United States)

    2010-10-01

    ... for Processor-Based Signal and Train Control Systems § 236.923 Task analysis and basic requirements... structured training designed to impart the knowledge, skills, and abilities identified as necessary...

  5. International data collection and analysis. Task 1

    Energy Technology Data Exchange (ETDEWEB)

    Fox, J.B.; Stobbs, J.J.; Collier, D.M.; Hobbs, J.S.

    1979-04-01

    Commercial nuclear power has grown to the point where 13 nations now operate commercial nuclear power plants. Another four countries should join this list before the end of 1980. In the Nonproliferation Alternative Systems Assessment Program (NASAP), the US DOE is evaluating a series of alternate possible power systems. The objective is to determine practical nuclear systems which could reduce proliferation risk while still maintaining the benefits of nuclear power. Part of that effort is the development of a data base denoting the energy needs, resources, technical capabilities, commitment to nuclear power, and projected future trends for various non-US countries. The data are presented by country for each of 28 non-US countries. Data are compiled in this volume on Canada, Egypt, Federal Republic of Germany, Finland, and France.

  6. International data collection and analysis. Task 1

    Energy Technology Data Exchange (ETDEWEB)

    1979-04-01

    Commercial nuclear power has grown to the point where 13 nations now operate commercial nuclear power plants. Another four countries should join this list before the end of 1980. In the Nonproliferation Alternative Systems Assessment Program (NASAP), the US DOE is evaluating a series of alternate possible power systems. The objective is to determine practical nuclear systems which could reduce proliferation risk while still maintaining the benefits of nuclear power. Part of that effort is the development of a data base denoting the energy needs, resources, technical capabilities, commitment to nuclear power, and projected future trends for various non-US countries. The data are presented by country for each of 28 non-US countries. This volume contains compiled data on Mexico, Netherlands, Pakistan, Philippines, South Africa, South Korea, and Spain.

  7. Units of analysis in task-analytic research.

    Science.gov (United States)

    Haring, T G; Kennedy, C H

    1988-01-01

    We develop and discuss four criteria for evaluating the appropriateness of units of analysis for task-analytic research and suggest potential alternatives to the units of analysis currently used. Of the six solutions discussed, the most commonly used unit of analysis in current behavior analytic work, percentage correct, meets only one of the four criteria. Five alternative units of analysis are presented and evaluated: (a) percentage of opportunities to perform meeting criterion, (b) trials to criteria, (c) cumulative competent performances, (d) percentage correct with competent performance coded, and (e) percentage correct with competent performance coded and a grid showing performance on individual steps of the task analysis. Of the solutions evaluated, only one--percentage correct with competent performance coded and a task analysis grid--met all four criteria.
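
    To make two of these alternatives concrete, here is a minimal Python sketch (illustrative only; the step-level data format is hypothetical) computing percentage correct per session and trials to criterion:

        # Each session is a list of per-step scores from the task analysis:
        # 1 = step performed correctly, 0 = incorrect.
        def percentage_correct(session):
            """Percentage of task-analysis steps performed correctly."""
            return 100.0 * sum(session) / len(session)

        def trials_to_criterion(sessions, criterion=100.0):
            """Number of sessions until performance first meets criterion."""
            for n, session in enumerate(sessions, start=1):
                if percentage_correct(session) >= criterion:
                    return n
            return None  # criterion never met

        sessions = [[1, 0, 1, 1], [1, 1, 0, 1], [1, 1, 1, 1]]
        print([percentage_correct(s) for s in sessions])  # [75.0, 75.0, 100.0]
        print(trials_to_criterion(sessions))              # 3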

  8. User's operating procedures. Volume 2: Scout project financial analysis program

    Science.gov (United States)

    Harris, C. G.; Haris, D. K.

    1985-01-01

    A review is presented of the user's operating procedures for the Scout Project Automatic Data system, called SPADS. SPADS is the result of the past seven years of software development on a Prime mini-computer located at the Scout Project Office, NASA Langley Research Center, Hampton, Virginia. SPADS was developed as a single entry, multiple cross-reference data management and information retrieval system for the automation of Project office tasks, including engineering, financial, managerial, and clerical support. This volume, two (2) of three (3), provides the instructions to operate the Scout Project Financial Analysis program in data retrieval and file maintenance via the user-friendly menu drivers.

  9. A framework for cognitive task analysis in systems design

    International Nuclear Information System (INIS)

    The present rapid development of advanced information technology and its use for support of operators of complex technical systems are changing the content of task analysis towards the analysis of mental activities in decision making. Automation removes the humans from routine tasks, and operators are left with disturbance control and critical diagnostic tasks, for which computers are suitable for support, if it is possible to match the computer strategies and interface formats dynamically to the requirements of the current task by means of an analysis of the cognitive task. Such a cognitive task analysis will not aim at a description of the information processes suited for particular control situations. It will rather aim at an analysis in order to identify the requirements to be considered along various dimensions of the decision tasks, in order to give the user - i.e. a decision maker - the freedom to adapt his performance to system requirements in a way which matches his process resources and subjective preferences. To serve this purpose, a number of analyses at various levels are needed to relate the control requirements of the system to the information processes and to the processing resources offered by computers and humans. The paper discusses the cognitive task analysis in terms of the following domains: The problem domain, which is a representation of the functional properties of the system giving a consistent framework for identification of the control requirements of the system; the decision sequences required for typical situations; the mental strategies and heuristics which are effective and acceptable for the different decision functions; and the cognitive control mechanisms used, depending upon the level of skill which can/will be applied. Finally, the end-users' criteria for choice of mental strategies in the actual situation are considered, and the need for development of criteria for judging the ultimate user acceptance of computer support is

  10. Finite Volume Methods: Foundation and Analysis

    Science.gov (United States)

    Barth, Timothy; Ohlberger, Mario

    2003-01-01

    Finite volume methods are a class of discretization schemes that have proven highly successful in approximating the solution of a wide variety of conservation law systems. They are extensively used in fluid mechanics, porous media flow, meteorology, electromagnetics, models of biological processes, semi-conductor device simulation and many other engineering areas governed by conservative systems that can be written in integral control volume form. This article reviews elements of the foundation and analysis of modern finite volume methods. The primary advantages of these methods are numerical robustness through the obtention of discrete maximum (minimum) principles, applicability on very general unstructured meshes, and the intrinsic local conservation properties of the resulting schemes. Throughout this article, specific attention is given to scalar nonlinear hyperbolic conservation laws and the development of high order accurate schemes for discretizing them. A key tool in the design and analysis of finite volume schemes suitable for non-oscillatory discontinuity capturing is discrete maximum principle analysis. A number of building blocks used in the development of numerical schemes possessing local discrete maximum principles are reviewed in one and several space dimensions, e.g. monotone fluxes, E-fluxes, TVD discretization, non-oscillatory reconstruction, slope limiters, positive coefficient schemes, etc. When available, theoretical results concerning a priori and a posteriori error estimates are given. Further advanced topics are then considered such as high order time integration, discretization of diffusion terms and the extension to systems of nonlinear conservation laws.
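
    As a concrete instance of the schemes reviewed (a standard textbook form consistent with the article's setting, not an excerpt from it), a semi-discrete finite volume method for a scalar conservation law u_t + \nabla \cdot f(u) = 0 evolves cell averages \bar{u}_i by

        \[
        \frac{d\bar{u}_i}{dt}
        + \frac{1}{|\Omega_i|} \sum_{j \in \mathcal{N}(i)} |\sigma_{ij}|\,
          g\!\left(\bar{u}_i, \bar{u}_j; \mathbf{n}_{ij}\right) = 0,
        \]

    where \Omega_i is a control volume with neighbors \mathcal{N}(i), \sigma_{ij} the face shared with neighbor j, \mathbf{n}_{ij} its unit normal, and g a conservative numerical flux consistent with f (that is, g(u, u; \mathbf{n}) = f(u) \cdot \mathbf{n}); choosing g monotone, e.g. an E-flux, is what yields the discrete maximum principles discussed above.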

  11. Sensometrics methods for descriptive analysis and sorting task. Two applications

    OpenAIRE

    Miranda, Karen

    2013-01-01

    We present the results obtained through two sensory methods: descriptive analysis and categorization (sorting task). Several statistical methods have been applied to analyze the results: ANOVA, PCA, MCA, and MFA. Sensometric methods for descriptive analysis and the sorting task are applied in two cases related to the food and beverage industries.

  12. Feasibility of developing a portable driver performance data acquisition system for human factors research: Technical tasks. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Carter, R.J.; Barickman, F.S.; Spelt, P.F.; Schmoyer, R.L.; Kirkpatrick, J.R.

    1998-01-01

    A two-phase, multi-year research program entitled "development of a portable driver performance data acquisition system for human factors research" was recently completed. The primary objective of the project was to develop a portable data acquisition system for crash avoidance research (DASCAR) that will allow driver performance data to be collected using a large variety of vehicle types and that would be capable of being installed on a given vehicle type within a relatively short time frame. During phase 1 a feasibility study for designing and fabricating DASCAR was conducted. In phase 2 of the research DASCAR was actually developed and validated. This technical memorandum documents the results from the feasibility study. It is subdivided into three volumes. Volume one (this report) addresses the last five items in the phase 1 research and the first issue in the second phase of the project. Volumes two and three present the related appendices and the design specifications developed for DASCAR, respectively. The six tasks were oriented toward: identifying parameters and measures; identifying analysis tools and methods; identifying measurement techniques and state-of-the-art hardware and software; developing design requirements and specifications; determining the cost of one or more copies of the proposed data acquisition system; and designing a development plan and constructing DASCAR. This report also covers: the background to the program; the requirements for the project; micro camera testing; heat load calculations for the DASCAR instrumentation package in automobile trunks; phase 2 of the research; the DASCAR hardware and software delivered to the National Highway Traffic Safety Administration; and crash avoidance problems that can be addressed by DASCAR.

  13. Memory systems, processes, and tasks: taxonomic clarification via factor analysis.

    Science.gov (United States)

    Bruss, Peter J; Mitchell, David B

    2009-01-01

    The nature of various memory systems was examined using factor analysis. We reanalyzed data from 11 memory tasks previously reported in Mitchell and Bruss (2003). Four well-defined factors emerged, closely resembling episodic and semantic memory and conceptual and perceptual implicit memory, in line with both memory systems and transfer-appropriate processing accounts. To explore taxonomic issues, we ran separate analyses on the implicit tasks. Using a cross-format manipulation (pictures vs. words), we identified 3 prototypical tasks. Word fragment completion and picture fragment identification tasks were "factor pure," tapping perceptual processes uniquely. Category exemplar generation revealed its conceptual nature, yielding both cross-format priming and a picture superiority effect. In contrast, word stem completion and picture naming were more complex, revealing attributes of both processes. PMID:19507425
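
    A minimal sketch of the kind of exploratory factor analysis described above (not the authors' code; the score matrix here is random placeholder data standing in for per-participant task scores):

        import numpy as np
        from sklearn.decomposition import FactorAnalysis

        rng = np.random.default_rng(0)
        n_subjects, n_tasks = 120, 11                    # 11 memory tasks, as in the reanalysis
        scores = rng.normal(size=(n_subjects, n_tasks))  # placeholder score matrix

        fa = FactorAnalysis(n_components=4)              # four factors, matching the reported solution
        fa.fit(scores)
        loadings = fa.components_.T                      # (tasks x factors) loading matrix
        print(np.round(loadings, 2))                     # inspect which tasks load on which factor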

  14. Evaluation of the Trajectory Operations Applications Software Task (TOAST). Volume 2: Interview transcripts

    Science.gov (United States)

    Perkins, Sharon; Martin, Andrea; Bavinger, Bill

    1990-01-01

    The Trajectory Operations Applications Software Task (TOAST) is a software development project whose purpose is to provide trajectory operations pre-mission and real-time support for the Space Shuttle. The purpose of the evaluation was to assess TOAST as an Application Manager: to review its current and planned capabilities, compare those capabilities to commercially available off-the-shelf (COTS) software, and analyze the requirements of the MCC and the Flight Analysis Design System (FADS) for TOAST implementation. As a major part of the data gathering for the evaluation, interviews were conducted with NASA and contractor personnel. Real-time and flight design users, orbit navigation users, the TOAST developers, and management were interviewed. Code reviews and demonstrations were also held. Each of these interviews was videotaped and transcribed as appropriate. Transcripts were edited and are presented chronologically.

  15. A Validated Task Analysis of the Single Pilot Operations Concept

    Science.gov (United States)

    Wolter, Cynthia A.; Gore, Brian F.

    2015-01-01

    The current day flight deck operational environment consists of a two-person Captain/First Officer crew. A concept of operations (ConOps) to reduce the commercial cockpit from the current two-pilot crew to a single pilot is termed Single Pilot Operations (SPO). This concept has been under study by researchers in the Flight Deck Display Research Laboratory (FDDRL) at the National Aeronautics and Space Administration's (NASA) Ames Research Center (Johnson, Comerford, Lachter, Battiste, Feary, and Mogford, 2012) and by researchers from Langley Research Center (Schutte et al., 2007). Transitioning from a two-pilot crew to a single-pilot crew will undoubtedly require changes in operational procedures, crew coordination, use of automation, and in how the roles and responsibilities of the flight deck and ATC are conceptualized, in order to maintain the high levels of safety expected of the US National Airspace System. These modifications will affect the roles and the subsequent tasks that are required of the various operators in the NextGen environment. The current report outlines the process taken to identify and document the tasks required of the crew according to a number of operational scenarios studied by the FDDRL between 2012 and 2014. A baseline task decomposition has been refined to represent the tasks consistent with a new set of entities, tasks, roles, and responsibilities being explored by the FDDRL as the move is made towards SPO. Information from Subject Matter Expert interviews, participation in FDDRL experimental design meetings, and study observation was used to populate and refine task sets that were developed as part of the SPO task analyses. The task analysis is based upon the proposed ConOps for the third FDDRL SPO study. This experiment possessed nine different entities operating in six scenarios using a variety of SPO-related automation and procedural activities required to guide safe and efficient aircraft operations. The task analysis presents the roles and

  16. Task Analysis Exemplified: The Process of Resolving Unfinished Business.

    Science.gov (United States)

    Greenberg, Leslie S.; Foerster, Florence S.

    1996-01-01

    The steps of a task analysis research program designed to identify the in-session performances involved in resolving lingering bad feelings toward a significant other are described. A rational-empirical methodology of repeatedly cycling between rational conjecture and empirical observations is demonstrated as a method of developing an intervention…

  17. Partial Derivative Games in Thermodynamics: A Cognitive Task Analysis

    Science.gov (United States)

    Kustusch, Mary Bridget; Roundy, David; Dray, Tevian; Manogue, Corinne A.

    2014-01-01

    Several studies in recent years have demonstrated that upper-division students struggle with the mathematics of thermodynamics. This paper presents a task analysis based on several expert attempts to solve a challenging mathematics problem in thermodynamics. The purpose of this paper is twofold. First, we highlight the importance of cognitive task…
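
    The kind of partial-derivative manipulation such expert protocols revolve around can be conveyed by a standard identity; the LaTeX fragment below shows the cyclic (triple-product) rule for P, V, and T, a typical ingredient of these problems (the paper's specific problem is not reproduced here).

      % Cyclic relation among P, V, T for an equation of state f(P,V,T) = 0:
      \left(\frac{\partial P}{\partial V}\right)_{T}
      \left(\frac{\partial V}{\partial T}\right)_{P}
      \left(\frac{\partial T}{\partial P}\right)_{V} = -1
      % Any one partial derivative can thus be rewritten in terms of the
      % other two, which is the "game" experts play when a needed
      % derivative is not directly measurable.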

  18. Data analysis & probability task sheets : grades pk-2

    CERN Document Server

    Cook, Tanya

    2009-01-01

    For grades PK-2, our Common Core State Standards-based resource meets the data analysis & probability concepts addressed by the NCTM standards and encourages your students to learn and review the concepts in unique ways. Each task sheet is organized around a central problem taken from real-life experiences of the students.

  19. Analysis and Modeling of Control Tasks in Dynamic Systems

    DEFF Research Database (Denmark)

    Ursem, Rasmus Kjær; Krink, Thiemo; Jensen, Mikkel Thomas;

    2002-01-01

    …test-case generators (TCGs), which requires a systematic analysis of dynamic optimization tasks. So far, only a few TCGs have been suggested. Our investigation leads to the conclusion that these TCGs are not capable of generating realistic dynamic benchmark tests. The result of our research is the design of a new TCG…

  20. An Analysis of Tasks Performed in the Ornamental Horticulture Industry.

    Science.gov (United States)

    Berkey, Arthur L.; Drake, William E.

    This publication is the result of a detailed task analysis study of the ornamental horticulture industry in New York State. Nine types of horticulture businesses identified were: (1) retail florists, (2) farm and garden supply store, (3) landscape services, (4) greenhouse production, (5) nursery production, (6) turf production, (7) arborist…

  1. Inpo/industry job and task analysis efforts

    International Nuclear Information System (INIS)

    One of the goals of INPO is to develop and coordinate industrywide programs to improve the education, training and qualification of nuclear utility personnel. To accomplish this goal, INPO's Training and Education Division: conducts periodic evaluations of industry training programs; provides assistance to the industry in developing training programs; manages the accreditation of utility training programs. These efforts are aimed at satisfying the need for training programs for nuclear utility personnel to be performance-based. Performance-based means that training programs provide an incumbent with the skills and knowledge required to safely perform the job. One of the ways that INPO has provided assistance to the industry is through the industrywide job and task analysis effort. I will discuss the job analysis and task analysis processes, the current status of JTA efforts, JTA products and JTA lessons learned

  2. Relationship between stroke volume and pulse pressure during blood volume perturbation: a mathematical analysis.

    Science.gov (United States)

    Bighamian, Ramin; Hahn, Jin-Oh

    2014-01-01

    Arterial pulse pressure has been widely used as a surrogate for stroke volume, for example, in the guidance of fluid therapy. However, recent experimental investigations suggest that arterial pulse pressure is not linearly proportional to stroke volume. Moreover, the mechanisms underlying the relation between the two have not been clearly understood. The goal of this study was to elucidate how arterial pulse pressure and stroke volume respond to a perturbation in the left ventricular blood volume, based on a systematic mathematical analysis. Both our mathematical analysis and experimental data showed that the relative change in arterial pulse pressure due to a left ventricular blood volume perturbation was consistently smaller than the corresponding relative change in stroke volume, due to the nonlinear left ventricular pressure-volume relation during diastole, which reduces the sensitivity of arterial pulse pressure to perturbations in the left ventricular blood volume. Therefore, arterial pulse pressure must be used with care as a surrogate for stroke volume in guiding fluid therapy.

  3. Pressurized fluidized-bed hydroretorting of eastern oil shales. Volume 4, Task 5, Operation of PFH on beneficiated shale, Task 6, Environmental data and mitigation analyses and Task 7, Sample procurement, preparation, and characterization: Final report, September 1987--May 1991

    Energy Technology Data Exchange (ETDEWEB)

    1992-03-01

    The objective of Task 5 (Operation of Pressurized Fluidized-Bed Hydroretorting (PFH) on Beneficiated Shale) was to modify the PFH process to facilitate its use for fine-sized, beneficiated Eastern shales. This task was divided into three subtasks: Non-Reactive Testing, Reactive Testing, and Data Analysis and Correlations. The potential environmental impacts of PFH processing of oil shale must be assessed throughout the development program to ensure that the appropriate technologies are in place to mitigate any adverse effects. The overall objectives of Task 6 (Environmental Data and Mitigation Analyses) were to obtain environmental data relating to PFH and shale beneficiation and to analyze the potential environmental impacts of the integrated PFH process. The task was divided into four subtasks: 6.1, Characterization of Processed Shales (IGT); 6.2, Water Availability and Treatment Studies; 6.3, Heavy Metals Removal; and 6.4, PFH Systems Analysis. The objective of Task 7 (Sample Procurement, Preparation, and Characterization) was to procure, prepare, and characterize raw and beneficiated bulk samples of Eastern oil shale for all of the experimental tasks in the program. Accomplishments for these tasks are presented.

  4. Analysis of the Discontinuities in Prioritized Tasks-Space Control Under Discrete Task Scheduling Operations

    OpenAIRE

    Keith, François; Wieber, Pierre-Brice; Mansard, Nicolas; Kheddar, Abderrahmane

    2011-01-01

    This paper examines the control continuity in hierarchical task-space controllers. While continuity is ensured for any a priori fixed number of tasks, even in ill-conditioned configurations, the control resulting from a hierarchical stack-of-tasks computation may not be continuous under some discrete events. In particular, we study how the continuity of the stack-of-tasks control computation is affected under discrete scheduling operations such as on-the-fly prior...
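
    A common way to compute such a hierarchical stack-of-tasks control is the nullspace-projection recursion. The sketch below, with made-up Jacobians and targets (not taken from the paper), shows the computation and the discontinuity that appears when a task is removed from the stack.

      # Minimal sketch: two-level prioritized task-space control via
      # nullspace projection (toy Jacobians; illustration only).
      import numpy as np

      def prioritized_dq(J1, dx1, J2, dx2):
          """Realize task 1 exactly; realize task 2 inside task 1's nullspace."""
          J1_pinv = np.linalg.pinv(J1)
          dq1 = J1_pinv @ dx1                      # top-priority task
          N1 = np.eye(J1.shape[1]) - J1_pinv @ J1  # nullspace projector of task 1
          return dq1 + np.linalg.pinv(J2 @ N1) @ (dx2 - J2 @ dq1)

      J1 = np.array([[1.0, 0.0, 0.0]])             # task 1: first joint coordinate
      J2 = np.array([[0.0, 1.0, 1.0]])             # task 2: sum of the other two
      dx1, dx2 = np.array([0.5]), np.array([1.0])

      dq_two_tasks = prioritized_dq(J1, dx1, J2, dx2)
      dq_one_task = np.linalg.pinv(J1) @ dx1       # task 2 dropped from the stack
      # The jump between the two solutions is the kind of discontinuity under
      # discrete scheduling operations that the paper analyzes.
      print(dq_two_tasks, dq_one_task)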

  5. Sandia-Power Surety Task Force Hawaii foam analysis.

    Energy Technology Data Exchange (ETDEWEB)

    McIntyre, Annie

    2010-11-01

    The Office of Secretary of Defense (OSD) Power Surety Task Force was officially created in early 2008, after nearly two years of work in demand reduction and renewable energy technologies to support the Warfighter in Theater. The OSD Power Surety Task Force is tasked with identifying efficient energy solutions that support mission requirements. Spray foam insulation demonstrations were recently expanded beyond field structures to include military housing at Ft. Belvoir. Initial results of using the foam in both applications are favorable. This project will address the remaining key questions: (1) Can this technology help to reduce utility costs for the Installation Commander? (2) Is the foam cost effective? (3) What application differences in housing affect those key metrics? The critical need for energy solutions in Hawaii and the existing relationships among Sandia, the Department of Defense (DOD), the Department of Energy (DOE), and Forest City make this location a logical choice for a foam demonstration. This project includes application and analysis of foam applied to a residential duplex at the Waikulu military community on Oahu, Hawaii, as well as reference to spray foam applied to a PACOM facility and additional foamed units on Maui, conducted during this project phase. This report concludes the analysis and describes the utilization of foam insulation at military housing in Hawaii and the subsequent data gathering and analysis.

  6. The Indirect Effect of Age Group on Switch Costs via Gray Matter Volume and Task-Related Brain Activity

    Science.gov (United States)

    Steffener, Jason; Gazes, Yunglin; Habeck, Christian; Stern, Yaakov

    2016-01-01

    Healthy aging simultaneously affects brain structure, brain function, and cognition. These effects are often investigated in isolation, ignoring any relationships between them. It is plausible that age-related declines in cognitive performance are the result of age-related structural and functional changes. This straightforward idea is tested within a conceptual research model of cognitive aging. The current study tested whether age-related declines in task performance were explained by age-related differences in brain structure and brain function, using a task-switching paradigm in 175 participants. Sixty-three young and 112 old participants underwent MRI scanning of brain structure and brain activation. The experimental task was an executive context dual task, with switch costs in response time as the behavioral measure. A serial mediation model was applied voxel-wise throughout the brain, testing all pathways between age group, gray matter volume, brain activation, and increased switch costs (worsening performance). There were widespread age group differences in gray matter volume and brain activation. Switch costs also significantly differed by age group. There were brain regions demonstrating significant indirect effects of age group on switch costs via the pathway through gray matter volume and brain activation. These were in the bilateral precuneus, bilateral parietal cortex, the left precentral gyrus, cerebellum, fusiform, and occipital cortices. There were also significant indirect effects via the brain activation pathway after controlling for gray matter volume. These effects were in the cerebellum, occipital cortex, left precentral gyrus, bilateral supramarginal, bilateral parietal, precuneus, middle cingulate extending to medial superior frontal gyri, and the left middle frontal gyri. There were no significant effects through the gray matter volume alone pathway. These results demonstrate that a large proportion of the age group effect on switch costs can

  7. Sensitivity analysis of volume scattering phase functions.

    Science.gov (United States)

    Tuchow, Noah; Broughton, Jennifer; Kudela, Raphael

    2016-08-01

    To solve the radiative transfer equation and relate inherent optical properties (IOPs) to apparent optical properties (AOPs), knowledge of the volume scattering phase function is required. Due to the difficulty of measuring the phase function, it is frequently approximated. We explore the sensitivity of derived AOPs to the phase function parameterization, and compare measured and modeled values of both the AOPs and estimated phase functions using data from Monterey Bay, California during an extreme "red tide" bloom event. Using in situ measurements of absorption and attenuation coefficients, as well as two sets of measurements of the volume scattering function (VSF), we compared output from the Hydrolight radiative transfer model to direct measurements. We found that several common assumptions used in parameterizing the radiative transfer model consistently introduced overestimates of modeled versus measured remote-sensing reflectance values. Phase functions from VSF data derived from measurements at multiple wavelengths and a single scattering angle significantly overestimated reflectances when using the manufacturer-supplied corrections, but were substantially improved using newly published corrections; phase functions calculated from VSF measurements using three angles and three wavelengths and processed using manufacturer-supplied corrections were comparable, demonstrating that reasonable predictions can be made using two commercially available instruments. While other studies have reached similar conclusions, our work extends the analysis to coastal waters dominated by an extreme algal bloom with surface chlorophyll concentrations in excess of 100 mg m-3. PMID:27505819
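
    When a measured VSF is unavailable, the phase function is often approximated by an analytic form. A minimal sketch using the one-parameter Henyey-Greenstein function, a common stand-in (not the specific corrections discussed above), is shown below.

      # Minimal sketch: Henyey-Greenstein scattering phase function, a common
      # analytic approximation when the measured VSF is unavailable.
      import numpy as np

      def henyey_greenstein(theta, g=0.9):
          """Phase function p(theta) [1/sr]; g is the asymmetry parameter."""
          return (1.0 - g**2) / (4.0 * np.pi
                                 * (1.0 + g**2 - 2.0 * g * np.cos(theta))**1.5)

      # Check normalization: the integral over the full sphere should be ~1.
      theta = np.linspace(0.0, np.pi, 100_000)
      dtheta = theta[1] - theta[0]
      integral = np.sum(henyey_greenstein(theta)
                        * 2.0 * np.pi * np.sin(theta)) * dtheta
      print(f"sphere integral = {integral:.4f}")   # ~1.0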

  8. Computational stress analysis using finite volume methods

    OpenAIRE

    Fallah, Nosrat Allah

    2000-01-01

    There is a growing interest in applying finite volume methods to model solid mechanics problems and multi-physics phenomena. During the last ten years an increasing amount of activity has taken place in this area. Unlike the finite element formulation, which generally involves volume integrals, the finite volume formulation transfers volume integrals to surface integrals using the divergence theorem. This transformation, for the convection and diffusion terms in the governing equations, ensures...
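
    The surface-integral character of the finite volume formulation is easy to see in one dimension: for an elastic bar governed by EA*u'' + f = 0, each control volume balances the tractions EA*du/dx on its two faces. The sketch below is a toy discretization with unit properties, assumed here for illustration and not drawn from the thesis.

      # Minimal 1D finite-volume sketch: elastic bar, EA*u'' + f = 0,
      # fixed at both ends; each cell balances its two face tractions.
      import numpy as np

      n, L, EA, f = 50, 1.0, 1.0, 1.0        # cells, length, stiffness, body load
      dx = L / n
      A = np.zeros((n, n))
      b = np.full(n, -f * dx)                # volume integral of the source term

      for i in range(n):                     # interior face fluxes EA*(u_j - u_i)/dx
          if i > 0:
              A[i, i - 1] += EA / dx
              A[i, i] -= EA / dx
          if i < n - 1:
              A[i, i + 1] += EA / dx
              A[i, i] -= EA / dx
      # Wall faces: u = 0 at the boundary, a half cell (dx/2) from the center.
      A[0, 0] -= 2 * EA / dx
      A[-1, -1] -= 2 * EA / dx

      u = np.linalg.solve(A, b)
      print(u.max(), f * L**2 / (8 * EA))    # FV peak vs. analytic f*L^2/(8*EA)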

  9. Consensus statement of the ESICM task force on colloid volume therapy in critically ill patients

    DEFF Research Database (Denmark)

    Reinhart, Konrad; Perner, Anders; Sprung, Charles L;

    2012-01-01

    PURPOSE: Colloids are administered to more patients than crystalloids, although recent evidence suggests that colloids may possibly be harmful in some patients. The European Society of Intensive Care Medicine therefore assembled a task force to compile consensus recommendations based on the current...

  10. Physical and cognitive task analysis in interventional radiology

    International Nuclear Information System (INIS)

    AIM: To identify, describe and detail the cognitive thought processes, decision-making, and physical actions involved in the preparation and successful performance of core interventional radiology procedures. MATERIALS AND METHODS: Five commonly performed core interventional radiology procedures were selected for cognitive task analysis. Several examples of each procedure being performed by consultant interventional radiologists were videoed. The videos of those procedures, and the steps required for successful outcome, were analysed by a psychologist and an interventional radiologist. Once a skeleton algorithm of the procedures was defined, further refinement was achieved using individual interview techniques with consultant interventional radiologists. Additionally, a critique of each iteration of the established algorithm was sought from non-participating independent consultant interventional radiologists. RESULTS: Detailed task descriptions and decision protocols were developed for five interventional radiology procedures (arterial puncture, nephrostomy, venous access, biopsy (using both ultrasound and computed tomography), and percutaneous transhepatic cholangiogram). Identical tasks performed within these procedures were identified and standardized within the protocols. CONCLUSIONS: Complex procedures were broken down and their constituent processes identified. This might be suitable for use as a training protocol to provide a universally acceptable safe practice at the most fundamental level. It is envisaged that data collected in this way can be used as an educational resource for trainees and could provide the basis for a training curriculum in interventional radiology. It will direct trainees towards safe practice of the highest standard. It will also provide performance objectives of a simulator model

  11. Job task analysis: lessons learned from application in course development

    International Nuclear Information System (INIS)

    Public Service Electric and Gas Company is committed to a systematic approach to training known as Instructional System Design (ISD). Our performance-based training emphasizes having trainees do or perform the task, whenever and wherever possible, for the jobs for which they are being trained. A brief description of our process for conducting and validating job analyses is included. The major thrust of this paper is on the lessons we have learned in the design and development of training programs based upon job analysis results.

  12. Using Goal Setting and Task Analysis to Enhance Task-Based Language Learning and Teaching

    Science.gov (United States)

    Rubin, Joan

    2015-01-01

    Task-Based Language Learning and Teaching has received sustained attention from teachers and researchers for over thirty years. It is a well-established pedagogy that includes the following characteristics: major focus on authentic and real-world tasks, choice of linguistic resources by learners, and a clearly defined non-linguistic outcome. This…

  13. Theoretical Analysis of Task-Based Language Teaching Pedagogy

    Institute of Scientific and Technical Information of China (English)

    HUANG Li-na

    2013-01-01

    Since the implementation of the new English curriculum (XinCheng), English teachers have been actively studying the task-based language teaching approach and trying to use it in classroom teaching. Drawing on that implementation experience, this article discusses the application of task-based language teaching in English teaching.

  14. Task-positive and task-negative networks and their relation to depression: EEG beamformer analysis.

    Science.gov (United States)

    Knyazev, Gennady G; Savostyanov, Alexander N; Bocharov, Andrey V; Tamozhnikov, Sergey S; Saprigyn, Alexander E

    2016-06-01

    Major Depressive Disorder (MDD) has been associated with predominance of the default-mode network (DMN) over the task-positive network (TPN), which is considered a neurobiological basis for ruminative responding. It is not known whether this predominance is a signature of full-blown MDD or whether it already exists at preclinical stages. Besides, all relevant evidence has been obtained using fMRI, which allows for a precise spatial characterization of resting state networks (RSNs), but their neural correlates remain elusive. Here we show that after leakage correction of beamformer-projected resting EEG time series, seed-based oscillatory-power envelope correlation analysis allows revealing RSNs with significant similarity to the respective fMRI RSNs. In a non-clinical sample, depressive symptoms, as measured by the Beck Depression Inventory, are associated with predominance of DMN over TPN connectivity in the right insula and the right temporal lobe in the delta frequency band. These findings imply that in individuals with a heightened level of depressive symptoms, emotional circuits are more strongly connected with the DMN than the TPN and should be more easily engaged in self-referential rumination than in responding to environmental challenges. The study's findings are in agreement with fMRI evidence, thus confirming the neural basis of the effects observed in fMRI research and showing that the neural mechanism implicated in depression may already be in action even at preclinical stages. PMID:27001453
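
    The seed-based oscillatory-power envelope correlation described above can be sketched compactly. The example below uses synthetic signals, and plain time-domain regression stands in for the beamformer leakage correction; none of the data or parameters are from the study.

      # Minimal sketch: delta-band power-envelope correlation between a seed
      # and a target series, with orthogonalization to suppress zero-lag
      # leakage (synthetic data, not real EEG).
      import numpy as np
      from scipy.signal import butter, filtfilt, hilbert

      fs = 250.0
      t = np.arange(0, 60, 1 / fs)
      rng = np.random.default_rng(1)
      seed = rng.normal(size=t.size)
      target = 0.4 * seed + rng.normal(size=t.size)   # partly "leaked" copy

      b, a = butter(4, [1.0 / (fs / 2), 4.0 / (fs / 2)], btype="band")  # delta
      seed_d, target_d = filtfilt(b, a, seed), filtfilt(b, a, target)

      # Regress the seed out of the target to remove zero-lag leakage.
      target_o = target_d - (target_d @ seed_d) / (seed_d @ seed_d) * seed_d

      envelope = lambda x: np.abs(hilbert(x))         # oscillatory power envelope
      r = np.corrcoef(envelope(seed_d), envelope(target_o))[0, 1]
      print(f"delta-band envelope correlation: {r:.3f}")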

  15. Spatio-temporal analysis reveals active control of both task-relevant and task-irrelevant variables

    Directory of Open Access Journals (Sweden)

    Kornelius Rácz

    2013-11-01

    The Uncontrolled Manifold hypothesis and the Minimal Intervention principle propose that the observed differential variability across task-relevant (i.e., task goals) vs. task-irrelevant (i.e., in the null space of those goals) variables is evidence of a separation of task variables for efficient neural control, ranked by their respective variabilities (sometimes referred to as a hierarchy of control). Support for this comes from spatial-domain analyses (i.e., the structure of kinematic, kinetic and EMG variability). While proponents admit the possibility of "preferential" as opposed to strictly "uncontrolled" variables, such distinctions have only begun to be quantified or considered in the temporal domain when inferring control action. Here we extend the study of task variability during tripod static grasp to the temporal domain by applying diffusion analysis. We show that both task-relevant and task-irrelevant parameters show corrective action at some time scales; and conversely, that task-relevant parameters do not show corrective action at other time scales. That is, the spatial fluctuations of fingertip forces show, as expected, greater ranges of variability in task-irrelevant variables (> 98% associated with changes in total grasp force; vs. only
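
    Diffusion analysis of the kind applied here characterizes corrective action by how the mean squared displacement (MSD) of a signal grows with time lag: persistent drift grows roughly linearly, while corrective, anti-persistent control flattens the curve at longer lags. A minimal sketch on synthetic data follows.

      # Minimal sketch of diffusion analysis: MSD of a fluctuating signal as
      # a function of time lag (toy mean-reverting "force" signal).
      import numpy as np

      rng = np.random.default_rng(2)
      x = np.zeros(20_000)
      for i in range(1, x.size):             # corrected (mean-reverting) walk
          x[i] = 0.99 * x[i - 1] + rng.normal(scale=0.1)

      def msd(signal, lags):
          return np.array([np.mean((signal[lag:] - signal[:-lag]) ** 2)
                           for lag in lags])

      lags = np.unique(np.logspace(0, 3, 20).astype(int))
      m = msd(x, lags)
      slopes = np.diff(np.log(m)) / np.diff(np.log(lags))
      # Slopes near 1 indicate uncorrected drift; slopes falling toward 0 at
      # longer lags indicate corrective action at those time scales.
      print(np.round(slopes, 2))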

  16. Rectal cancer surgery: volume-outcome analysis.

    LENUS (Irish Health Repository)

    Nugent, Emmeline

    2010-12-01

    There is strong evidence supporting the importance of the volume-outcome relationship with respect to lung and pancreatic cancers. This relationship for rectal cancer surgery however remains unclear. We review the currently available literature to assess the evidence base for volume outcome in relation to rectal cancer surgery.

  17. Effects of immersion on visual analysis of volume data.

    Science.gov (United States)

    Laha, Bireswar; Sensharma, Kriti; Schiffbauer, James D; Bowman, Doug A

    2012-04-01

    Volume visualization has been widely used for decades for analyzing datasets ranging from 3D medical images to seismic data to paleontological data. Many have proposed using immersive virtual reality (VR) systems to view volume visualizations, and there is anecdotal evidence of the benefits of VR for this purpose. However, there has been very little empirical research exploring the effects of higher levels of immersion for volume visualization, and it is not known how various components of immersion influence the effectiveness of visualization in VR. We conducted a controlled experiment in which we studied the independent and combined effects of three components of immersion (head tracking, field of regard, and stereoscopic rendering) on the effectiveness of visualization tasks with two x-ray microscopic computed tomography datasets. We report significant benefits of analyzing volume data in an environment involving those components of immersion. We find that the benefits do not necessarily require all three components simultaneously, and that the components have variable influence on different task categories. The results of our study improve our understanding of the effects of immersion on perceived and actual task performance, and provide guidance on the choice of display systems to designers seeking to maximize the effectiveness of volume visualization applications. PMID:22402687

  18. Structured information analysis for human reliability analysis of emergency tasks in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Won Dea; Kim, Jae Whan; Park, Jin Kyun; Ha, Jae Joo [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2000-02-01

    More than twenty HRA (Human Reliability Analysis) methodologies have been developed and used for safety analysis in the nuclear field during the past two decades. However, no methodology appears to have been universally accepted, as various limitations have been raised for the more widely used ones. One of the most important limitations of conventional HRA is insufficient analysis of the task structure and problem space. To resolve this problem, we suggest SIA (Structured Information Analysis) for HRA. The proposed SIA consists of three parts. The first part is the scenario analysis, which investigates the contextual information related to the given task on the basis of selected scenarios. The second is the goals-means analysis, which defines the relations between the cognitive goal and task steps. The third is the cognitive function analysis module, which identifies the cognitive patterns and information flows involved in the task. Through the three-part analysis, systematic investigation is made possible, from macroscopic information on the tasks to microscopic information on the specific cognitive processes. It is expected that analysts can attain a structured set of information that helps to predict the types and possibility of human error in the given task. 48 refs., 12 figs., 11 tabs. (Author)

  19. Task analysis and computer aid development for human reliability analysis in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, W. C.; Kim, H.; Park, H. S.; Choi, H. H.; Moon, J. M.; Heo, J. Y.; Ham, D. H.; Lee, K. K.; Han, B. T. [Korea Advanced Institute of Science and Technology, Taejeon (Korea)

    2001-04-01

    The importance of human reliability analysis (HRA), which predicts the possibility of human error occurrence in quantitative and qualitative manners, is gradually increasing because of the effects of human error on system safety. HRA needs a task analysis as a prerequisite step, but extant task analysis techniques have the problem that the collection of information about the situation in which the human error occurs depends entirely on the HRA analysts. This problem makes the results of the task analysis inconsistent and unreliable. To complement this problem, KAERI developed the structured information analysis (SIA) method, which helps to analyze a task's structure and situations systematically. In this study, the SIA method was evaluated by HRA experts, and a prototype computerized supporting system named CASIA (Computer Aid for SIA) was developed for the purpose of supporting the performance of HRA using the SIA method. Additionally, through applying the SIA method to emergency operating procedures, we derived generic task types used in emergencies and accumulated the analysis results in the database of the CASIA. The CASIA is expected to help HRA analysts perform the analysis more easily and consistently. As more analyses are performed and more data are accumulated in the CASIA's database, HRA analysts will be able to share and disseminate their analysis experiences freely, and thereby the quality of HRA will be improved. 35 refs., 38 figs., 25 tabs. (Author)

  20. Analysis of Brain Cognitive State for Arithmetic Task and Motor Task Using Electroencephalography Signal

    Directory of Open Access Journals (Sweden)

    R Kalpana

    2013-08-01

    To localize the brain dynamics of cognitive processes from EEG signatures has been a challenging task over the last two decades. In this paper we explore the spatial-temporal correlations of brain electrical neuronal activity for cognitive tasks such as an arithmetic task and a motor task using a 3D cortical distribution method. Ten healthy right-handed volunteers participated in the experiment. EEG signals were acquired during the resting state with eyes open and eyes closed, and while performing a motor task and arithmetic calculations. The signal was then computed for three-dimensional cortical distributions on a realistic head model with the MNI152 template using standardized low resolution brain electromagnetic tomography (sLORETA). This was followed by an appropriate standardization of the current density, producing images of electric neuronal activity without localization bias. Neuronal generators responsible for cognitive states such as the arithmetic task and the motor task were localized. The results were correlated with previous neuroimaging (fMRI) investigations. Hence our results indicate that neuronal activity from EEG signals can be demonstrated at the cortical level with good spatial resolution. The 3D cortical distribution method may thus be used to obtain both spatial and temporal information from EEG signals and may prove to be a significant technique for investigating cognitive functions in mental health and brain dysfunction. Also, it may be helpful for brain/human computer interfacing.

  1. Life science payload definition and integration study, task C and D. Volume 3: Appendices

    Science.gov (United States)

    1973-01-01

    Research equipment requirements were based on the Mini-7 and Mini-30 laboratory concepts defined in Tasks A and B of the initial LSPD contract. Modified versions of these laboratories and the research equipment within them were to be used in three missions of Shuttle/Sortie Module. These were designated (1) the shared 7-day laboratory (a mission with the life sciences laboratory sharing the sortie module with another scientific laboratory), (2) the dedicated 7-day laboratory (full use of the sortie module), and (3) the dedicated 30-day laboratory (full sortie module use with a 30-day mission duration). In defining the research equipment requirements of these laboratories, the equipment was grouped according to its function, and equipment unit data packages were prepared.

  2. Taste perception analysis using a semantic verbal fluency task

    Directory of Open Access Journals (Sweden)

    Ghemulet M

    2014-09-01

    Maria Ghemulet,1,2 Maria Baskini,3 Lambros Messinis,2,4 Eirini Mouza,1 Hariklia Proios1,5. 1Department of Speech Therapy, Anagennisis (Revival) Physical Recovery and Rehabilitation Centre, Nea Raidestos, Filothei, Thessaloniki, Greece; 2Department of Speech and Language Therapy, Technological Institute of Western Greece, Patra, Greece; 3Department of Neurosurgery, Interbalkan European Medical Centre, Thessaloniki, Greece; 4Neuropsychology Section, Department of Neurology, University of Patras, Medical School, Patras, Greece; 5Department of Education and Social Policy, University of Macedonia, Thessaloniki, Greece. Abstract: A verbal fluency (VF) task is a test used to examine cognitive perception. The main aim of this study was to explore a possible relationship between taste perception in the basic taste categories (sweet, salty, sour, and bitter) and subjects' taste preferences, using a VF task in healthy and dysphagic subjects. In addition, we correlated the results of the VF task with body mass index (BMI). The hypothesis is that categorical preferences would be consistent with the number of verbal responses. We also hypothesized that higher BMI (>30 kg/m2) would correlate with more responses in either some or all four categories. VF tasks were randomly administered. Analysis criteria included the number of verbally produced responses, number of clusters, number of switches, number and type of errors, and VF consistency with taste preferences. Sixty Greek-speaking individuals participated in this study. Forty-three healthy subjects were selected with a wide range of ages, sexes, and education levels. Seventeen dysphagic patients were then matched with 17 healthy subjects according to age, sex, and BMI. Quantitative one-way analyses of variance (between groups as well as repeated measures), post hoc and chi-square tests, and qualitative analyses were performed. In the healthy subjects' group, the differences among the mean number of responses for the four

  3. The assessment of risky decision making: a factor analysis of performance on the Iowa Gambling Task, Balloon Analogue Risk Task, and Columbia Card Task.

    Science.gov (United States)

    Buelow, Melissa T; Blaine, Amber L

    2015-09-01

    Researchers and clinicians frequently use behavioral measures to assess decision making. The most common task that is marketed to clinicians is the Iowa Gambling Task (IGT), thought to assess risky decision making. How does performance on the IGT relate to performance on other common measures of decision making? The present study sought to examine relationships between the IGT, the Balloon Analogue Risk Task (BART), and the Columbia Card Task (CCT). Participants were 390 undergraduate students who completed the IGT, BART, and either the "hot" or "cold" CCT. Principal components factor analysis on the IGT, BART, and CCT-cold (n = 112) indicated that the IGT measures a different component of decision making than the BART, and the CCT-cold weakly correlated with early IGT trials. Results of the exploratory factor analysis on the IGT, BART, and CCT-hot (n = 108) revealed a similar picture: the IGT and BART assessed different types of decision making, and the BART and CCT-hot were weakly correlated. A confirmatory factor analysis (n = 170) indicated that a 3-factor model without the CCT-cold (Factor 1: later IGT trials; Factor 2: BART; and Factor 3: early IGT trials) was a better fitting model than one that included the CCT-cold and early IGT trials on the same factor. Collectively, the present results suggest that the IGT, BART, and CCT all measure unique, nonoverlapping decision making processes. Further research is needed to more fully understand the neuropsychological construct of decision making. PMID:25580611

  4. Task analysis for the single-shell Tank Waste Retrieval Manipulator System

    Energy Technology Data Exchange (ETDEWEB)

    Draper, J.V.

    1993-03-01

    This document describes a task analysis for the Tank Waste Retrieval Manipulator System. A task analysis is a formal method of examining work that must be done by the operators of human-machine systems. The starting point for a task analysis is the mission that a human-machine system must perform, and the ending point is a list of requirements for human actions and the displays and controls that must be provided to support them. The task analysis began with a top-down definition of the steps in a tank retrieval campaign: a waste retrieval campaign for one single-shell tank was divided into the largest logical components (mission phases), these were subdivided into secondary components (subfunctions), the secondary components were further subdivided into tertiary units (tasks), and finally the tertiary units were divided into potentially observable operator behaviors (task elements). In the next stage of the task analysis, the task elements were evaluated by completing an electronic task analysis form patterned after one developed by the Nuclear Regulatory Commission for task analysis of nuclear power plant control rooms. In the final stage, the task analysis database was used in a bottom-up approach to develop clusters of controls and displays called panel groups and to prioritize these groups for each subfunction. Panel groups are clusters of functionally related controls and displays. Actual control panels will be designed from panel groups, and panel groups will be organized within workstations to promote efficient operations during retrieval campaigns.

  5. Life sciences payload definition and integration study, task C and D. Volume 1: Management summary

    Science.gov (United States)

    1973-01-01

    The findings of a study to define the required payloads for conducting life science experiments in space are presented. The primary objectives of the study are: (1) identify research functions to be performed aboard life sciences spacecraft laboratories and necessary equipment, (2) develop conceptual designs of potential payloads, (3) integrate selected laboratory designs with space shuttle configurations, and (4) establish cost analysis of preliminary program planning.

  6. A Longitudinal Behavioral Genetic Analysis of Task Persistence

    Science.gov (United States)

    Deater-Deckard, Kirby; Petrill, Stephen A.; Thompson, Lee A.; DeThorne, Laura S.

    2006-01-01

    Change in task persistence was assessed in two annual assessments using teachers', testers', and observers' ratings. Participants included 79 monozygotic and 116 same-sex dizygotic twin pairs who were in Kindergarten or 1st grade (4.3 to 7.9 years old) at the initial assessment. Task persistence was widely distributed and higher among older…

  7. ALE Meta-Analysis of Schizophrenics Performing the N-Back Task

    Science.gov (United States)

    Harrell, Zachary

    2010-10-01

    MRI/fMRI has already proven itself as a valuable tool in the diagnosis and treatment of many illnesses of the brain, including cognitive problems. By exploiting the differences in magnetic susceptibility between oxygenated and deoxygenated hemoglobin, fMRI can measure blood flow in various regions of interest within the brain. This can determine the level of brain activity in relation to motor or cognitive functions and provide a metric for tissue damage or illness symptoms. Structural imaging techniques have shown lesions or deficiencies in tissue volumes in schizophrenics corresponding to areas primarily in the frontal and temporal lobes. These areas are currently known to be involved in working memory and attention, which many schizophrenics have trouble with. The ALE (Activation Likelihood Estimation) Meta-Analysis is able to statistically determine the significance of brain area activations based on the post-hoc combination of multiple studies. This process is useful for giving a general model of brain function in relation to a particular task designed to engage the affected areas (such as working memory for the n-back task). The advantages of the ALE Meta-Analysis include elimination of single subject anomalies, elimination of false/extremely weak activations, and verification of function/location hypotheses.
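
    The core of an ALE computation is simple: each study's reported activation foci are blurred into a modeled activation (MA) map with a Gaussian kernel, and the per-study maps are combined as a probabilistic union. The toy one-dimensional sketch below illustrates only that core; real ALE operates on 3D brain volumes with sample-size-dependent kernels and permutation-based significance thresholds, and the foci here are invented.

      # Toy 1D sketch of Activation Likelihood Estimation (ALE).
      import numpy as np

      grid = np.linspace(-50, 50, 501)                 # 1D stand-in for voxels
      studies = [[-10.0, 12.0], [-8.0], [11.0, 30.0]]  # made-up peak coordinates
      sigma = 5.0                                      # kernel width, fixed here

      def ma_map(foci):
          """Modeled activation: per-voxel max over the study's Gaussian kernels."""
          kernels = [np.exp(-(grid - f) ** 2 / (2 * sigma ** 2)) for f in foci]
          return np.max(kernels, axis=0)

      mas = [ma_map(f) for f in studies]
      ale = 1.0 - np.prod([1.0 - m for m in mas], axis=0)  # union across studies
      print(f"peak ALE = {ale.max():.3f} at x = {grid[ale.argmax()]:.1f}")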

  8. Bringing Reading-to-Write and Writing-Only Assessment Tasks Together: A Generalizability Analysis

    Science.gov (United States)

    Gebril, Atta

    2010-01-01

    Integrated tasks are currently employed in a number of L2 exams since they are perceived as an addition to the writing-only task type. Given this trend, the current study investigates composite score generalizability of both reading-to-write and writing-only tasks. For this purpose, a multivariate generalizability analysis is used to investigate…

  9. Task analysis method for procedural training curriculum development.

    Science.gov (United States)

    Riggle, Jakeb D; Wadman, Michael C; McCrory, Bernadette; Lowndes, Bethany R; Heald, Elizabeth A; Carstens, Patricia K; Hallbeck, M Susan

    2014-06-01

    A central venous catheter (CVC) is an important medical tool used in critical care and emergent situations. Integral to proper care in many circumstances, insertion of a CVC introduces the risk of central line-associated blood stream infections and mechanical adverse events; proper training is important for safe CVC insertion. Cognitive task analysis (CTA) methods have been successfully implemented in the medical field to improve the training of postgraduate medical trainees, but can be very time-consuming to complete and require a significant time commitment from many subject matter experts (SMEs). Many medical procedures such as CVC insertion are linear processes with well-documented procedural steps. These linear procedures may not require a traditional CTA to gather the information necessary to create a training curriculum. Accordingly, a novel, streamlined CTA method designed primarily to collect cognitive cues for linear procedures was developed to be used by medical professionals with minimal CTA training. This new CTA methodology required fewer trained personnel, fewer interview sessions, and less time commitment from SMEs than a traditional CTA. Based on this study, a streamlined CTA methodology can be used to efficiently gather cognitive information on linear medical procedures for the creation of resident training curricula and procedural skills assessments. PMID:24366759

  10. Comparative analysis of operational forecasts versus actual weather conditions in airline flight planning, volume 1

    Science.gov (United States)

    Keitz, J. F.

    1982-01-01

    The impact of more timely and accurate weather data on airline flight planning, with emphasis on fuel savings, is studied. This volume of the report discusses the results of Task 1 of the four major tasks included in the study. Task 1 compares flight plans based on forecasts with plans based on the verifying analysis from 33 days during the summer and fall of 1979. The comparisons show that: (1) potential fuel savings conservatively estimated to be between 1.2 and 2.5 percent could result from using more timely and accurate weather data in flight planning and route selection; (2) the Suitland forecast generally underestimates wind speeds; and (3) the track selection methodology of many airlines operating on the North Atlantic may not be optimum, resulting in their selecting other than the optimum North Atlantic Organized Track about 50 percent of the time.

  11. Darwinian algorithms and the Wason selection task: a factorial analysis of social contract selection task problems.

    Science.gov (United States)

    Platt, R D; Griggs, R A

    1993-08-01

    In four experiments with 760 subjects, the present study examined Cosmides' Darwinian algorithm theory of reasoning: specifically, its explanation of facilitation on the Wason selection task. The first experiment replicated Cosmides' finding of facilitation for social contract versions of the selection task, using both her multiple-problem format and a single-problem format. Experiment 2 examined performance on Cosmides' three main social contract problems while manipulating the perspective of the subject and the presence and absence of cost-benefit information. The presence of cost-benefit information improved performance in two of the three problems while the perspective manipulation had no effect. In Experiment 3, the cost-benefit effect was replicated; and performance on one of the three problems was enhanced by the presence of explicit negatives on the NOT-P and NOT-Q cards. Experiment 4 examined the role of the deontic term "must" in the facilitation observed for two of the social contract problems. The presence of "must" led to a significant improvement in performance. The results of these experiments are strongly supportive of social contract theory in that cost-benefit information is necessary for substantial facilitation to be observed in Cosmides' problems. These findings also suggest the presence of other cues that can help guide subjects to a deontic social contract interpretation when the social contract nature of the problem is not clear.

  12. Task Descriptions in Diagnostic Radiology. Research Report No. 7. Volume 3, Machine-Related, Patient Care and Administrative Tasks: What Radiologists, Technologists, Nurses and Physicists Do to Run Things and Look After Patients and Equipment.

    Science.gov (United States)

    Gilpatrick, Eleanor

    The third of four volumes in Research Report No. 7 of the Health Services Mobility Study (HSMS), this book contains 149 diagnostic radiologist task descriptions that cover activities in the area of nursing (patient care), film processing, quality assurance, radiation protection, machine maintenance, housekeeping, and administration at the…

  13. A survey on the task analysis methods and techniques for nuclear power plant operators

    International Nuclear Information System (INIS)

    We have surveyed techniques and methods of task analysis, from very traditional ones to recently developed ones being applied in various industrial fields. We compare them with one another and analyse their fundamental characteristics and methodological specifications in order to find one suitable for application to the tasks of nuclear power plant operators. Generally, the fundamental process of task analysis is well understood, but its application in practice is not simple, owing to the wide and varying range of applications in specific domains. Operators' tasks in NPPs are performed strictly according to operational procedures written as text, and operators are well trained in them, so a method of task analysis for operators' tasks in NPPs can be established with unique characteristics based on the operational procedures. 8 figs., 10 tabs., 18 refs. (Author)

  14. A survey on the task analysis methods and techniques for nuclear power plant operators

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Yong Heui; Chun, Se Woo; Suh, Sang Moon; Lee, Jung Woon [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1994-04-01

    We have surveyed techniques and methods of task analysis, from very traditional ones to recently developed ones being applied in various industrial fields. We compare them with one another and analyse their fundamental characteristics and methodological specifications in order to find one suitable for application to the tasks of nuclear power plant operators. Generally, the fundamental process of task analysis is well understood, but its application in practice is not simple, owing to the wide and varying range of applications in specific domains. Operators' tasks in NPPs are performed strictly according to operational procedures written as text, and operators are well trained in them, so a method of task analysis for operators' tasks in NPPs can be established with unique characteristics based on the operational procedures. 8 figs., 10 tabs., 18 refs. (Author).

  15. Cognitive tasks in information analysis: Use of event dwell time to characterize component activities

    Energy Technology Data Exchange (ETDEWEB)

    Sanquist, Thomas F.; Greitzer, Frank L.; Slavich, Antoinette L.; Littlefield, Rik J.; Littlefield, Janis S.; Cowley, Paula J.

    2004-09-28

    Technology-based enhancement of information analysis requires a detailed understanding of the cognitive tasks involved in the process. The information search and report production tasks of the information analysis process were investigated through evaluation of time-stamped workstation data gathered with custom software. Model tasks simulated the search and production activities, and a sample of actual analyst data was also evaluated. Task event durations were calculated on the basis of millisecond-level time stamps, and distributions were plotted for analysis. The data indicate that task event time shows a cyclic pattern of variation, with shorter event durations (< 2 sec) reflecting information search and filtering, and longer event durations (> 10 sec) reflecting information evaluation. Application of cognitive principles to the interpretation of task event time data provides a basis for developing “cognitive signatures” of complex activities, and can facilitate the development of technology aids for information-intensive tasks.
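
    The dwell-time measure here is simply the spacing between consecutive time-stamped workstation events, bucketed to separate fast search/filter actions from slower evaluation. A small sketch, assuming a hypothetical list of millisecond timestamps, follows.

      # Minimal sketch: event dwell times from millisecond timestamps,
      # bucketed into the short (<2 s) and long (>10 s) bands noted above.
      import numpy as np

      timestamps_ms = np.array([0, 350, 900, 1_200, 14_800, 15_300, 29_000])
      dwell_s = np.diff(timestamps_ms) / 1000.0   # gaps between events

      search_like = dwell_s[dwell_s < 2.0]        # rapid search / filtering
      evaluation_like = dwell_s[dwell_s > 10.0]   # sustained evaluation
      print("dwell times (s):", np.round(dwell_s, 2))
      print(f"{search_like.size} short events, {evaluation_like.size} long events")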

  16. Screening Analysis : Volume 1, Description and Conclusions.

    Energy Technology Data Exchange (ETDEWEB)

    Bonneville Power Administration; Corps of Engineers; Bureau of Reclamation

    1992-08-01

    The SOR consists of three analytical phases leading to a Draft EIS. The first phase, Pilot Analysis, was performed for the purpose of testing the decision analysis methodology being used in the SOR. The Pilot Analysis is described later in this chapter. The second phase, Screening Analysis, examines all possible operating alternatives using a simplified analytical approach. It is described in detail in this and the next chapter. This document also presents the results of screening. The final phase, Full-Scale Analysis, will be documented in the Draft EIS and is intended to evaluate comprehensively the few, best alternatives arising from the screening analysis. The purpose of screening is to analyze a wide variety of differing ways of operating the Columbia River system to test the reaction of the system to change. The many alternatives considered reflect the range of needs and requirements of the various river users and interests in the Columbia River Basin. While some of the alternatives might be viewed as extreme, the information gained from the analysis is useful in highlighting issues and conflicts in meeting operating objectives. Screening is also intended to develop a broad technical basis for evaluation, including regional experts, and to begin developing an evaluation capability for each river use that will support full-scale analysis. Finally, screening provides a logical method for examining all possible options and reaching a decision on a few alternatives worthy of full-scale analysis. An organizational structure was developed and staffed to manage and execute the SOR, specifically during the screening phase and the upcoming full-scale analysis phase. The organization involves ten technical work groups, each representing a particular river use. Several other groups exist to oversee or support the efforts of the work groups.

  17. Information architecture. Volume 2, Part 1: Baseline analysis summary

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-12-01

    The Department of Energy (DOE) Information Architecture, Volume 2, Baseline Analysis, is a collaborative and logical next-step effort in the processes required to produce a Departmentwide information architecture. The baseline analysis serves a diverse audience of program management and technical personnel and provides an organized way to examine the Department's existing or de facto information architecture. A companion document to Volume 1, The Foundations, it furnishes the rationale for establishing a Departmentwide information architecture. This volume, consisting of the Baseline Analysis Summary (part 1), Baseline Analysis (part 2), and Reference Data (part 3), is of interest to readers who wish to understand how the Department's current information architecture technologies are employed. The analysis identifies how and where current technologies support business areas, programs, sites, and corporate systems.

  18. A comparative critical analysis of modern task-parallel runtimes.

    Energy Technology Data Exchange (ETDEWEB)

    Wheeler, Kyle Bruce; Stark, Dylan; Murphy, Richard C.

    2012-12-01

    The rise in node-level parallelism has increased interest in task-based parallel runtimes for a wide array of application areas. Applications have a wide variety of task spawning patterns, which frequently change during the course of application execution based on the algorithm or solver kernel in use. Task scheduling and load balance regimes, however, are often highly optimized for specific patterns. This paper uses four basic task spawning patterns to quantify the impact of specific scheduling policy decisions on execution time. We compare the behavior of six publicly available tasking runtimes: Intel Cilk, Intel Threading Building Blocks (TBB), Intel OpenMP, GCC OpenMP, Qthreads, and High Performance ParalleX (HPX). With the exception of Qthreads, the runtimes prove to have schedulers that are highly sensitive to application structure. No runtime is able to provide the best performance in all cases, and those that do provide the best performance in some cases, unfortunately, provide extremely poor performance when the application structure does not match the scheduler's assumptions.
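
    The notion of a task spawning pattern can be made concrete with a small fork-join example. The sketch below uses Python's concurrent.futures purely as an illustration, not one of the six runtimes compared in the paper: a divide-and-conquer sum recursively splits its range and spawns one task per leaf chunk.

      # Illustrative recursive-spawn, flat-join task pattern (plain Python
      # stand-in for the spawning shapes the paper benchmarks).
      from concurrent.futures import ThreadPoolExecutor

      def spawn_leaves(pool, data, cutoff=10_000):
          """Recursively split the range, spawning one task per leaf chunk."""
          if len(data) <= cutoff:
              return [pool.submit(sum, data)]     # fork a leaf task
          mid = len(data) // 2
          return (spawn_leaves(pool, data[:mid], cutoff)
                  + spawn_leaves(pool, data[mid:], cutoff))

      with ThreadPoolExecutor(max_workers=4) as pool:
          futures = spawn_leaves(pool, list(range(1_000_000)))
          print(sum(f.result() for f in futures)) # join all spawned tasks

    How well a scheduler handles such a burst of short-lived tasks, versus other spawning shapes, is exactly the sensitivity the paper quantifies.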

  19. Designing Preclinical Instruction for Psychomotor Skills (II)--Instructional Engineering: Task Analysis.

    Science.gov (United States)

    Knight, G. William; And Others

    1994-01-01

    The first step in engineering the instruction of dental psychomotor skills, task analysis, is explained. A chart details the procedural, cognitive, desired-criteria, and desired-performance analysis of a single task, occlusal preparation for amalgam restoration with carious lesion. (MSE)

  20. Using Cost-Volume-Profit Analysis in Decision Making

    OpenAIRE

    IONELA-CLAUDIA DINA; GABRIELA BUŞAN

    2009-01-01

    Cost-volume-profit analysis studies the manner in which total revenues, total costs, and operating profit evolve as changes occur in production volume, selling price, unit variable cost, and/or the fixed costs of a product. Managers use this analysis to answer questions like: How will revenues and costs be affected if we sell 1,000 more units? What if we raise or lower selling prices? What if we expand our business into foreign markets?
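
    The underlying relations are compact: operating profit = (price - unit variable cost) x volume - fixed costs, and the breakeven volume is fixed costs divided by the unit contribution margin. A worked sketch with illustrative figures follows.

      # Cost-volume-profit sketch for a single product (figures illustrative).
      def operating_profit(volume, price, unit_var_cost, fixed_costs):
          contribution_margin = price - unit_var_cost   # per-unit contribution
          return contribution_margin * volume - fixed_costs

      def breakeven_volume(price, unit_var_cost, fixed_costs):
          return fixed_costs / (price - unit_var_cost)

      price, uvc, fixed = 25.0, 15.0, 40_000.0
      print(operating_profit(5_000, price, uvc, fixed))  # 10,000 profit
      print(operating_profit(6_000, price, uvc, fixed))  # 1,000 more units: 20,000
      print(breakeven_volume(price, uvc, fixed))         # 4,000 units to break even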

  1. Life sciences payload definition and integration study, task C and D. Volume 2: Payload definition, integration, and planning studies

    Science.gov (United States)

    1973-01-01

    The Life Sciences Payload Definition and Integration Study was composed of four major tasks. Tasks A and B, the laboratory definition phase, were the subject of prior NASA study. The laboratory definition phase included the establishment of research functions, equipment definitions, and conceptual baseline laboratory designs. These baseline laboratories were designated as Maxi-Nom, Mini-30, and Mini-7. The outputs of Tasks A and B were used by the NASA Life Sciences Payload Integration Team to establish guidelines for Tasks C and D, the laboratory integration phase of the study. A brief review of Tasks A and B is presented provide background continuity. The tasks C and D effort is the subject of this report. The Task C effort stressed the integration of the NASA selected laboratory designs with the shuttle sortie module. The Task D effort updated and developed costs that could be used by NASA for preliminary program planning.

  2. Fluid Vessel Quantity Using Non-invasive PZT Technology Flight Volume Measurements Under Zero G Analysis

    Science.gov (United States)

    Garofalo, Anthony A

    2013-01-01

    The purpose of the project is to perform analysis of data from the Systems Engineering Educational Discovery (SEED) program's 2011 and 2012 Fluid Vessel Quantity using Non-Invasive PZT Technology flight volume measurements under zero-g conditions (parabolic plane flight data), along with experimental planning and lab work for future sub-orbital experiments using the NASA PZT technology for fluid volume measurement. Along with conducting data analysis of flight data, I also did a variety of other tasks. I provided the lab with detailed technical drawings, experimented with 3D printers, made changes to the liquid nitrogen skid schematics, and learned how to weld. I also programmed microcontrollers to interact with various sensors and helped with other things going on around the lab.

  3. Fluid Vessel Quantity using Non-Invasive PZT Technology Flight Volume Measurements Under Zero G Analysis

    Science.gov (United States)

    Garofalo, Anthony A.

    2013-01-01

    The purpose of the project is to perform analysis of data from the Systems Engineering Educational Discovery (SEED) program's 2011 and 2012 Fluid Vessel Quantity using Non-Invasive PZT Technology flight volume measurements under zero-g conditions (parabolic plane flight data), along with experimental planning and lab work for future sub-orbital experiments using the NASA PZT technology for fluid volume measurement. Along with conducting data analysis of flight data, I also did a variety of other tasks. I provided the lab with detailed technical drawings, experimented with 3D printers, made changes to the liquid nitrogen skid schematics, and learned how to weld. I also programmed microcontrollers to interact with various sensors and helped with other things going on around the lab.

  3. Using Heuristic Task Analysis to Create Web-Based Instructional Design Theory

    Science.gov (United States)

    Fiester, Herbert R.

    2010-01-01

    The first purpose of this study was to identify procedural and heuristic knowledge used when creating web-based instruction. The second purpose of this study was to develop suggestions for improving the Heuristic Task Analysis process, a technique for eliciting, analyzing, and representing expertise in cognitively complex tasks. Three expert…

  4. A Cross-Sectional Behavioral Genetic Analysis of Task Persistence in the Transition to Middle Childhood

    Science.gov (United States)

    Deater-Deckard, Kirby; Petrill, Stephen A.; Thompson, Lee A.; DeThorne, Laura S.

    2005-01-01

    Task persistence, measured by a composite score of independent teacher, tester and observer reports, was examined using behavioral genetic analysis. Participants included 92 monozygotic and 137 same-sex dizygotic twin pairs in Kindergarten or 1st grade (4.3 to 7.9 years old). Task persistence was widely distributed, higher among older children,…

  5. Job and Task Analysis project at Brookhaven National Laboratory's High Flux Beam Reactor

    International Nuclear Information System (INIS)

    The presenter discussed the Job and Task Analysis (JTA) project conducted at Brookhaven National Laboratory's High Flux Beam Reactor (HFBR). The project's goal was to provide JTA guidelines for use by DOE contractors and then, using the guidelines, to conduct a JTA for the reactor operator and supervisor positions at the HFBR. Details of the job analysis and job description preparation, as well as details of the task selection and task analysis, were given. Post-JTA improvements to the HFBR training programs were covered. The presentation concluded with a listing of the costs and impacts of the project.

  6. Space and Earth Sciences, Computer Systems, and Scientific Data Analysis Support, Volume 1

    Science.gov (United States)

    Estes, Ronald H. (Editor)

    1993-01-01

    This Final Progress Report covers the specific technical activities of Hughes STX Corporation for the last contract triannual period of 1 June through 30 Sep. 1993, in support of assigned task activities at Goddard Space Flight Center (GSFC). It also provides a brief summary of work throughout the contract period of performance on each active task. Technical activity is presented in Volume 1, while financial and level-of-effort data are presented in Volume 2. Technical support was provided to all Divisions and Laboratories of Goddard's Space Sciences and Earth Sciences Directorates. Types of support include: scientific programming, systems programming, computer management, mission planning, scientific investigation, data analysis, data processing, data base creation and maintenance, instrumentation development, and management services. Missions and instruments supported include: ROSAT, Astro-D, BBXRT, XTE, AXAF, GRO, COBE, WIND, UIT, SMM, STIS, HEIDI, DE, URAP, CRRES, Voyagers, ISEE, San Marco, LAGEOS, TOPEX/Poseidon, Pioneer-Venus, Galileo, Cassini, Nimbus-7/TOMS, Meteor-3/TOMS, FIFE, BOREAS, TRMM, AVHRR, and Landsat. Accomplishments include: development of computing programs for mission science and data analysis, supercomputer applications support, computer network support, computational upgrades for data archival and analysis centers, end-to-end management for mission data flow, scientific modeling and results in the fields of space and Earth physics, planning and design of GSFC VO DAAC and VO IMS, fabrication, assembly, and testing of mission instrumentation, and design of mission operations center.

  7. [Ocra Method: development of a new procedure for analysis of multiple tasks subject to infrequent rotation].

    Science.gov (United States)

    Occhipinti, E; Colombini, Daniela; Occhipinti, M

    2008-01-01

    In the Ocra methods (Ocra index and Ocra Checklist), when computing the final indices (Ocra index or checklist score), in the case of more than one repetitive task a "traditional" procedure was already proposed, the results of which could be defined as "time-weighted average". This approach appears to be appropriate when considering rotations among tasks that are performed very frequently, for instance almost once every hour (or for shorter periods). However, when rotation among repetitive tasks is less frequent (i.e. once every 1 1/2 or more hours), the "time-weighted average" approach could result in an underestimation of the exposure level (as it practically flattens peaks of high exposures). For those scenarios an alternative approach based on the "most stressful task as minimum" might be more realistic. This latter approach has already been included in the NIOSH approach for multiple sequential lifting tasks and, given the recent availability in the Ocra method of more detailed duration multipliers (practically one different Du(M) for each different step of one hour of duration of the repetitive task), it is now possible to define a particular procedure to compute the complex Ocra Multitask Index (cOCRA) and the complex Checklist Score (cCHESCO) for the analysis of two or more repetitive tasks when rotations are infrequent (rotations every 1 1/2 hours or more). The result of this approach will be at least equal to the index of the most stressful task considered for its individual daily duration and at the most equal to the index of the most stressful task when it is (only theoretically) considered as lasting for the overall daily duration of all examined repetitive tasks. The procedure is based on the following formula: Complex Ocra Multitask Index = ocra1(Dum1) + (Delta ocra1 x K), where 1, 2, 3, ..., N = repetitive tasks ordered by ocra index values (1 = highest; N = lowest) computed considering respective real duration multipliers (Dum(i)); ocra1 = ocra index of
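
    The formula is garbled and truncated in this record. Assuming, consistently with the bounds stated in the abstract, that task 1 is the most stressful task, that Delta ocra1 is the increment obtained when task 1 is (only theoretically) extended to the overall daily duration of all repetitive tasks, and that K is a weighting factor in [0, 1], a plausible reconstruction reads:

```latex
% Plausible reconstruction of the cOCRA formula (the source record is truncated)
\mathrm{cOCRA} = \mathrm{ocra}_1(\mathrm{Dum}_1) + \Delta\mathrm{ocra}_1 \times K,
\qquad
\Delta\mathrm{ocra}_1 = \mathrm{ocra}_1(\mathrm{Dum}_{\mathrm{tot}}) - \mathrm{ocra}_1(\mathrm{Dum}_1)
```

    With K = 0 this reduces to the "most stressful task as minimum" lower bound, and with K = 1 to the theoretical upper bound described in the abstract.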

  8. Comparative analysis of operational forecasts versus actual weather conditions in airline flight planning, volume 4

    Science.gov (United States)

    Keitz, J. F.

    1982-01-01

    The impact of more timely and accurate weather data on airline flight planning, with the emphasis on fuel savings, is studied. This volume of the report discusses the results of Task 4 of the four major tasks included in the study. Task 4 uses flight plan segment wind and temperature differences as indicators of dates and geographic areas for which significant forecast errors may have occurred. An in-depth analysis is then conducted for the days identified. The analysis shows that significant errors occurred in the operational forecast on 15 of the 33 arbitrarily selected days included in the study. Wind speeds in an area of maximum winds were underestimated by at least 20 to 25 kts. on 14 of these days. The analysis also shows a tendency to repeat the same forecast errors from prog to prog. Also, some perceived forecast errors from the flight plan comparisons could not be verified by visual inspection of the corresponding National Meteorological Center forecast and analysis charts, and it is likely that they are the result of weather data interpolation techniques or some other data processing procedure in the airlines' flight planning systems.

  9. Comparative analysis of operational forecasts versus actual weather conditions in airline flight planning, volume 2

    Science.gov (United States)

    Keitz, J. F.

    1982-01-01

    The impact of more timely and accurate weather data on airline flight planning, with the emphasis on fuel savings, is studied. This volume of the report discusses the results of Task 2 of the four major tasks included in the study. Task 2 compares various categories of flight plans and flight tracking data produced by a simulation system developed for the Federal Aviation Administration by SRI International. (Flight tracking data simulate actual flight tracks of all aircraft operating at a given time and provide for rerouting of flights as necessary to resolve traffic conflicts.) The comparisons of flight plans based on the forecast with flight plans based on the verifying analysis confirm the Task 1 finding that wind speeds are generally underestimated. Comparisons involving flight tracking data indicate that actual fuel burn is always higher than planned, for flights in either direction, even when the same weather data set is used. Since the flight tracking model output results in more diversions than is known to be the case, it was concluded that there is an error in the flight tracking algorithm.

  10. Comparative analysis of operational forecasts versus actual weather conditions in airline flight planning, volume 3

    Science.gov (United States)

    Keitz, J. F.

    1982-01-01

    The impact of more timely and accurate weather data on airline flight planning, with the emphasis on fuel savings, is studied. This volume of the report discusses the results of Task 3 of the four major tasks included in the study. Task 3 compares flight plans developed on the Suitland forecast with actual data observed by the aircraft (and averaged over 10-degree segments). The results show that the average difference between the forecast and observed wind speed is 9 kts. without considering direction, and the average difference in the component of the forecast wind parallel to the direction of the observed wind is 13 kts., both indicating that the Suitland forecast underestimates the wind speeds. The Root Mean Square (RMS) vector error is 30.1 kts. The average absolute difference in direction between the forecast and observed wind is 26 degrees, and the temperature difference is 3 degrees Centigrade. These results indicate that the forecast model, as well as the verifying analysis used to develop comparison flight plans in Tasks 1 and 2, is a limiting factor, and that the average potential fuel savings or penalties are up to 3.6 percent, depending on the direction of flight.

  11. Hydrothermal analysis in engineering using control volume finite element method

    CERN Document Server

    Sheikholeslami, Mohsen

    2015-01-01

    Control volume finite element methods (CVFEM) bridge the gap between finite difference and finite element methods, using the advantages of both methods for simulation of multi-physics problems in complex geometries. In Hydrothermal Analysis in Engineering Using Control Volume Finite Element Method, CVFEM is covered in detail and applied to key areas of thermal engineering. Examples, exercises, and extensive references are used to show the use of the technique to model key engineering problems such as heat transfer in nanofluids (to enhance performance and compactness of energy systems),

  12. Performance Analysis of Software to Hardware Task Migration in Codesign

    CERN Document Server

    Sebai, Dorsaf; Bennour, Imed

    2010-01-01

    The complexity of multimedia applications, in terms of intensity of computation and heterogeneity of treated data, has led designers to deploy them on multiprocessor systems-on-chip. The complexity of these systems on the one hand, and the expectations of consumers on the other, complicate the designers' job of conceiving and delivering robust, successful systems within the shortest deadlines. They have to explore the different solutions of the design space and estimate their performance in order to identify the solution that respects their design constraints. In this context, we propose the modeling of one of the design space's possible solutions: the software to hardware task migration. This modeling exploits synchronous dataflow graphs to take into account the different migration impacts and estimate their performance in terms of throughput.
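
    To make the kind of estimate the abstract refers to concrete, here is a minimal synchronous-dataflow sketch in Python. The three-actor graph, token rates, execution times, and the fully sequential schedule are invented assumptions for illustration, not the authors' model:

```python
from fractions import Fraction
from math import lcm

# Toy SDF graph: each edge (src, dst, prod, cons) produces `prod` tokens per
# firing of src and consumes `cons` tokens per firing of dst.
actors = ["decode", "filter", "render"]
edges = [("decode", "filter", 2, 1), ("filter", "render", 1, 3)]

# Solve the balance equations q[src] * prod == q[dst] * cons along the chain.
q = {actors[0]: Fraction(1)}
for src, dst, prod, cons in edges:
    q[dst] = q[src] * prod / cons

# Scale to the smallest integer repetition vector.
scale = lcm(*[f.denominator for f in q.values()])
q = {a: int(f * scale) for a, f in q.items()}

# Per-firing execution times (ms): all-software vs. "filter" moved to hardware.
t_sw = {"decode": 4.0, "filter": 6.0, "render": 2.0}
t_hw = dict(t_sw, filter=1.5)  # hypothetical hardware speedup

def iteration_period(times):
    # Fully sequential schedule: one graph iteration fires actor a q[a] times.
    return sum(q[a] * times[a] for a in actors)

for label, times in [("all software", t_sw), ("filter in hardware", t_hw)]:
    period = iteration_period(times)
    print(f"{label:>20}: period {period:.1f} ms, throughput {1000 / period:.1f} it/s")
```

    Under these toy numbers, migrating the filter actor to hardware shortens the iteration period from 52 ms to 25 ms, roughly doubling throughput; a design-space exploration would repeat such evaluations across candidate migrations.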

  13. IEA Wind Task 24 Integration of Wind and Hydropower Systems; Volume 1: Issues, Impacts, and Economics of Wind and Hydropower Integration

    Energy Technology Data Exchange (ETDEWEB)

    Acker, T.

    2011-12-01

    This report describes the background, concepts, issues and conclusions related to the feasibility of integrating wind and hydropower, as investigated by the members of IEA Wind Task 24. It is the result of a four-year effort involving seven IEA member countries and thirteen participating organizations. The companion report, Volume 2, describes in detail the study methodologies and participant case studies, and exists as a reference for this report.

  14. Micro analysis of fringe field formed inside LDA measuring volume

    Science.gov (United States)

    Ghosh, Abhijit; Nirala, A. K.

    2016-05-01

    In the present study we propose a technique for micro analysis of fringe field formed inside laser Doppler anemometry (LDA) measuring volume. Detailed knowledge of the fringe field obtained by this technique allows beam quality, alignment and fringe uniformity to be evaluated with greater precision and may be helpful for selection of an appropriate optical element for LDA system operation. A complete characterization of fringes formed at the measurement volume using conventional, as well as holographic optical elements, is presented. Results indicate the qualitative, as well as quantitative, improvement of fringes formed at the measurement volume by holographic optical elements. Hence, use of holographic optical elements in LDA systems may be advantageous for improving accuracy in the measurement.

  15. Method for measuring anterior chamber volume by image analysis

    Science.gov (United States)

    Zhai, Gaoshou; Zhang, Junhong; Wang, Ruichang; Wang, Bingsong; Wang, Ningli

    2007-12-01

    Anterior chamber volume (ACV) is very important for an oculist making a rational pathological diagnosis in patients who have optic diseases such as glaucoma, yet it is always difficult to measure accurately. In this paper, a method is devised to measure anterior chamber volumes based on JPEG-formatted image files that have been transformed from medical images using the anterior-chamber optical coherence tomographer (AC-OCT) and corresponding image-processing software. The corresponding algorithms for image analysis and ACV calculation are implemented in VC++, and a series of anterior chamber images of typical patients are analyzed; the calculated anterior chamber volumes are verified to be in accord with clinical observation. The results show that the measurement method is effective and feasible and has the potential to improve the accuracy of ACV calculation. Meanwhile, some measures should be taken to simplify the manual preprocessing of the images.
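
    A minimal numerical sketch of the slice-integration idea follows; the per-slice areas and spacing are invented placeholders (the paper's actual algorithm operates on segmented AC-OCT images in VC++):

```python
import numpy as np

# Hypothetical anterior-chamber cross-sectional areas (mm^2) measured on
# successive AC-OCT slices, with an assumed slice spacing of dz millimetres.
areas_mm2 = np.array([0.0, 4.1, 9.8, 12.5, 11.9, 8.7, 3.6, 0.0])
dz_mm = 0.25

# Trapezoidal integration of area over depth approximates the chamber volume;
# 1 mm^3 equals 1 microlitre.
volume_mm3 = float(np.sum((areas_mm2[:-1] + areas_mm2[1:]) / 2) * dz_mm)
print(f"estimated ACV: {volume_mm3:.2f} mm^3")
```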

  16. Dashboard Task Monitor for Managing ATLAS User Analysis on the Grid

    Science.gov (United States)

    Sargsyan, L.; Andreeva, J.; Jha, M.; Karavakis, E.; Kokoszkiewicz, L.; Saiz, P.; Schovancova, J.; Tuckett, D.; Atlas Collaboration

    2014-06-01

    The organization of the distributed user analysis on the Worldwide LHC Computing Grid (WLCG) infrastructure is one of the most challenging tasks among the computing activities at the Large Hadron Collider. The Experiment Dashboard offers a solution that not only monitors but also manages (kill, resubmit) user tasks and jobs via a web interface. The ATLAS Dashboard Task Monitor provides analysis users with a tool that is independent of the operating system and Grid environment. This contribution describes the functionality of the application and its implementation details, in particular authentication, authorization and audit of the management operations.

  17. Dashboard task monitor for managing ATLAS user analysis on the grid

    International Nuclear Information System (INIS)

    The organization of the distributed user analysis on the Worldwide LHC Computing Grid (WLCG) infrastructure is one of the most challenging tasks among the computing activities at the Large Hadron Collider. The Experiment Dashboard offers a solution that not only monitors but also manages (kill, resubmit) user tasks and jobs via a web interface. The ATLAS Dashboard Task Monitor provides analysis users with a tool that is independent of the operating system and Grid environment. This contribution describes the functionality of the application and its implementation details, in particular authentication, authorization and audit of the management operations.

  18. Considerations for Task Analysis Methods and Rapid E-Learning Development Techniques

    Directory of Open Access Journals (Sweden)

    Dr. Ismail Ipek

    2014-02-01

    The purpose of this paper is to provide basic dimensions for rapid training development in e-learning courses in education and business. Principally, it starts by defining task analysis, how to select tasks for analysis, and task analysis methods for instructional design. To do this, first, learning and instructional technologies as visions of the future were discussed. Second, the importance of task analysis methods in rapid e-learning was considered, along with learning technologies for asynchronous and synchronous e-learning development. Finally, rapid instructional design concepts and e-learning design strategies were defined and clarified with examples; that is, all the steps for effective task analysis and rapid training development techniques based on learning and instructional design approaches, such as m-learning and other delivery systems, were discussed. As a result, the concept of task analysis, rapid e-learning development strategies, and the essentials of online course design were discussed, alongside learner interface design features for learners and designers.

  19. Energy use in the marine transportation industry: Task III. Efficiency improvements; Task IV. Industry future. Final report, Volume IV. [Projections for year 2000

    Energy Technology Data Exchange (ETDEWEB)

    None

    1977-12-01

    Tasks III and IV measure the characteristics of potential research and development programs that could be applied to the maritime industry. It was necessary to identify potential operating scenarios for the maritime industry in the year 2000 and determine the energy consumption that would result given those scenarios. After the introductory chapter the operational, regulatory, and vessel-size scenarios for the year 2000 are developed in Chapter II. In Chapter III, future cargo flows and expected levels of energy use for the baseline 2000 projection are determined. In Chapter IV, the research and development programs are introduced into the future US flag fleet and the energy-savings potential associated with each is determined. The first four appendices (A through D) describe each of the generic technologies. The fifth appendix (E) contains the baseline operating and cost parameters against which 15 program areas were evaluated. (MCW)

  1. Trading Volume and Stock Indices: A Test of Technical Analysis

    Directory of Open Access Journals (Sweden)

    Paul Abbondante

    2010-01-01

    Problem statement: Technical analysis, with its emphasis on trading volume, has been used to analyze movements in individual stock prices and make investment recommendations to either buy or sell a stock. Little attention has been paid to investigating the relationship between trading volume and various stock indices. Approach: Since stock indices track overall stock market movements, trends in trading volume could be used to forecast future stock market trends. Instead of focusing only on individual stocks, this study examines movements in major stock markets as a whole. Regression analysis was used to investigate the relationship between trading volume and five popular stock indices using daily data from January 2000 to June 2010. A lag of 5 days was used because this represents the prior week of trading volume. The total sample size ranges from 1,534 to 2,638 observations. Smaller samples were used to test the investment horizon that explains movements of the indices more completely. Results: The F statistics were significant for samples using 6 and 16 months of data. The F statistic was not significant using a sample of 1 month of data. This is surprising given the short-term focus of technical analysis. The results indicate that above-average returns can be achieved using futures, options, and exchange-traded funds which track these indices. Conclusion: Future research efforts will include out-of-sample forecasting to determine if above-average returns can be achieved. Additional research can be conducted to determine the optimal number of lags for each index.
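
    The regression design described (an index regressed on the prior five days of trading volume) can be sketched as follows; the simulated series stand in for the study's actual index and volume data, and ordinary least squares via numpy is our own choice of estimator:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1500                                              # roughly the smallest sample size
volume = rng.lognormal(mean=20, sigma=0.3, size=n)    # simulated daily trading volume
index = 3000 + np.cumsum(rng.normal(0, 10, size=n))   # simulated index level

lags = 5  # the prior week of trading volume
# Design matrix: intercept plus volume at t-1 ... t-5.
X = np.column_stack([volume[lags - k:n - k] for k in range(1, lags + 1)])
X = np.column_stack([np.ones(len(X)), X])
y = index[lags:]

beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
r2 = 1 - resid.var() / y.var()
f_stat = (r2 / lags) / ((1 - r2) / (len(y) - lags - 1))  # overall F test
print(f"R^2 = {r2:.4f}, F = {f_stat:.2f}")
```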

  2. In Vivo Analysis of Trapeziometacarpal Joint Kinematics during Pinch Tasks

    Directory of Open Access Journals (Sweden)

    Li-Chieh Kuo

    2014-01-01

    This study investigated how the posture of the thumb while performing common pinch movements, and the levels of pinch force applied by the thumb, affect the arthrokinematics of the trapeziometacarpal joint in vivo. Fifteen subjects performed the pinch tasks at the distal phalanx (DP), proximal interphalangeal (PIP) joint, and metacarpophalangeal (MP) joint of the index finger with 0%, 50%, and 80% of maximal pinch force, measured by a single-axis load cell. 3D images of the thumb were obtained using computed tomography. The results show that the reference points moved from the central region to the dorsal-radial region when changing from pinching the DP to the MP joint without pinch force being applied. Pinching with 80% of the maximum pinch force resulted in reference points closest to the volar-ulnar direction. Significant differences were seen between 0% and 50% of maximum pinch force, as well as between 0% and 80%, when pinching the MP joint in the distal-proximal direction. The effects of thumb posture and applied pinch force on the arthrokinematics of the joint were investigated with a 3D model of the trapeziometacarpal joint. Pinching with more than 50% of maximum pinch force might subject this joint to extreme displacement.

  3. District heating and cooling systems for communities through power-plant retrofit and distribution network. Volume 2. Tasks 1-3. Final report. [Downtown Toledo steam system

    Energy Technology Data Exchange (ETDEWEB)

    Watt, J.R.; Sommerfield, G.A.

    1979-08-01

    Each of the tasks is described separately: Task 1 - Demonstration Team; Task 2 - Identify Thermal Energy Source(s) and Potential Service Area(s); and Task 3 - Energy Market Analysis. The purpose of the project is to establish and implement measures in the downtown Toledo steam system for conserving scarce fuel supplies through cogeneration, by retrofit of existing base- or intermediate-loaded electric-generating plants to provide for central heating and cooling systems, with the ultimate purpose of applying the results to other communities. For Task 1, Toledo Edison Company has organized a Demonstration Team (Battelle Columbus Laboratories; Stone and Webster; Ohio Dept. of Energy; Public Utilities Commission of Ohio; Toledo Metropolitan Area Council of Governments; and Toledo Edison) that it hopes has the expertise to evaluate the technical, legal, economic, and marketing issues related to the utilization of by-product heat from power generation to supply district heating and cooling services. Task 2 gives a complete technical description of the candidate plant(s), its thermodynamic cycle, role in load dispatch, ownership, and location. It is concluded that the Toledo steam distribution system can be the starting point for developing a new district-heating system to serve an expanding market. Battelle is a member of the team employed as a subcontractor to complete the energy market analysis. The work is summarized in Task 3. (MCW)

  4. Report of a CSNI workshop on uncertainty analysis methods. Volume 1 + 2

    International Nuclear Information System (INIS)

    The OECD NEA CSNI Principal Working Group 2 (PWG2) Task Group on Thermal Hydraulic System Behaviour (TGTHSB) has, in recent years, received presentations of a variety of different methods to analyze the uncertainty in the calculations of advanced unbiased (best estimate) codes. Proposals were also made for an International Standard Problem (ISP) to compare the uncertainty analysis methods. The objectives for the Workshop were to discuss and fully understand the principles of uncertainty analysis relevant to LOCA modelling and like problems, to examine the underlying issues from first principles, in preference to comparing and contrasting the currently proposed methods, to reach consensus on the issues identified as far as possible while not avoiding the controversial aspects, to identify as clearly as possible unreconciled differences, and to issue a Status Report. Eight uncertainty analysis methods were presented. A structured discussion of various aspects of uncertainty analysis followed - the need for uncertainty analysis, identification and ranking of uncertainties, characterisation, quantification and combination of uncertainties and applications, resources and future developments. As a result, the objectives set out above were, to a very large extent, achieved. Plans for the ISP were also discussed. Volume 1 contains a record of the discussions on uncertainty methods. Volume 2 is a compilation of descriptions of the eight uncertainty analysis methods presented at the workshop

  5. Analysis of Spectral Features of EEG during four different Cognitive Tasks

    Directory of Open Access Journals (Sweden)

    S.BAGYARAJ

    2014-05-01

    Cognition is a group of information-processing activities that involve visual attention, visual awareness, problem solving, and decision making. Finding the cognitive-task-related regional cerebral activations is of great interest among researchers in cognitive neuroscience. In this study four different types of cognitive tasks, namely tracking pendulum movement and counting, red flash counting, sequential subtraction, and spot the difference, were performed by 32 subjects, and the EEG signals were acquired using a 24-channel RMS EEG-32 Super Spec machine. The analyses of the EEG signal were done using well-known spectral methods. The band powers were calculated in the frequency domain using the Welch method. The task-relax relative band power values and the ratios of theta band power to beta band power are the two variables used to find the regional cerebral activations during the four different cognitive tasks. The statistical paired t-test was used to evaluate the significance of the difference between particular task-related cerebral activations and relaxation. The statistical significance level was set at p < 0.05. During the tracking pendulum movement and counting task, cerebral activations were found in the bilateral prefrontal, frontal, right central, and temporal regions. The red flash counting task had activations in the bilateral prefrontal, frontal, right central, right parietal, and right occipital lobes. The bilateral prefrontal regions were activated during the sequential subtraction task. The spot-the-difference task had activations in the left and right prefrontal cortex. The unique and common activation regions for the selected four cognitive tasks were found to be the left and right prefrontal cortex. The prefrontal lobe electrodes, namely Fp1 and Fp2, can be used as the recording electrodes for detailed cognitive task analysis, where cerebral activations are observed in comparison with the other cerebral regions.
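
    A short sketch of the spectral pipeline named above (Welch band powers, then a theta/beta ratio); the synthetic signal, sampling rate, and band edges are common EEG conventions assumed for illustration, not the study's recordings:

```python
import numpy as np
from scipy.signal import welch

fs = 256                       # assumed sampling rate (Hz)
t = np.arange(0, 30, 1 / fs)   # 30 s of one EEG channel
rng = np.random.default_rng(1)
eeg = (20e-6 * np.sin(2 * np.pi * 6 * t)     # theta-range component
       + 5e-6 * np.sin(2 * np.pi * 20 * t)   # beta-range component
       + rng.normal(0, 2e-6, t.size))        # background noise

freqs, psd = welch(eeg, fs=fs, nperseg=2 * fs)
df = freqs[1] - freqs[0]

def band_power(lo, hi):
    # Approximate band power by summing the PSD over [lo, hi) Hz.
    mask = (freqs >= lo) & (freqs < hi)
    return psd[mask].sum() * df

theta, beta = band_power(4, 8), band_power(13, 30)
print(f"theta/beta ratio: {theta / beta:.2f}")
```

    The task-relative band powers in the study would compare such values between task and relaxation epochs, for example with a paired t-test (scipy.stats.ttest_rel).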

  6. Task 11 - systems analysis of environmental management technologies

    Energy Technology Data Exchange (ETDEWEB)

    Musich, M.A.

    1997-06-01

    A review was conducted of three systems analysis (SA) studies performed by Lockheed Idaho Technologies Company (LITCO) on integrated thermal treatment systems (ITTs) and integrated nonthermal treatment systems (INTSs) for the remediation of mixed low-level waste (MLLW) stored throughout the U.S. Department of Energy (DOE) weapons complex. The review was performed by an independent team led by the Energy & Environment Research Center (EERC), including Science Applications International Corporation (SAIC), the Waste Policy Institute (WPI), and Virginia Tech.

  7. Task 11 - systems analysis of environmental management technologies. Topical report

    International Nuclear Information System (INIS)

    A review was conducted of three systems analysis (SA) studies performed by Lockheed Idaho Technologies Company (LITCO) on integrated thermal treatment systems (ITTs) and integrated nonthermal treatment systems (INTSs) for the remediation of mixed low-level waste (MLLW) stored throughout the U.S. Department of Energy (DOE) weapons complex. The review was performed by an independent team led by the Energy & Environment Research Center (EERC), including Science Applications International Corporation (SAIC), the Waste Policy Institute (WPI), and Virginia Tech.

  8. Task Analysis in Action: The Role of Information Systems in Communicable Disease Reporting

    OpenAIRE

    Pina, Jamie; Turner, Anne; Kwan-Gett, Tao; Duchin, Jeff

    2009-01-01

    In order to improve the design of information systems for notifiable conditions reporting, it is essential to understand the role of such systems in public health practice. Using qualitative techniques, we performed a task analysis of the activities associated with notifiable conditions reporting at a large urban health department. We identified seventeen primary tasks associated with the use of the department’s information system. The results of this investigation suggest that communicable d...

  9. Analysis of Member State RED implementation. Final Report (Task 2)

    Energy Technology Data Exchange (ETDEWEB)

    Peters, D.; Alberici, S.; Toop, G. [Ecofys, Utrecht (Netherlands); Kretschmer, B. [Institute for European Environmental Policy IEEP, London (United Kingdom)

    2012-12-15

    This report describes the way EU Member States have transposed the sustainability and chain-of-custody requirements for biofuels as laid down in the Renewable Energy Directive (RED) and Fuel Quality Directive (FQD). In the assessment of Member States' implementation, the report mainly focuses on effectiveness and administrative burden. Have Member States transposed the Directives in such a way that compliance with the sustainability criteria can be ensured as effectively as possible? To what extent does the Member States' implementation lead to unnecessary administrative burden for economic operators in the (bio)fuel supply chain? The report focuses specifically on the transposition of the sustainability and chain-of-custody requirements, not on the target for renewables in transport. This means that, for example, the double counting provision is not included in the scope of this report. This report starts with an introduction covering the implementation of the Renewable Energy (and Fuel Quality) Directive into national legislation, the methodology by which Member States were assessed against effectiveness and administrative burden, and the categorisation of Member States' national systems for RED implementation (Chapter 1). The report continues with a high-level description of each Member State system assessed (Chapter 2). Following this, the report includes analysis of the Member States on the effectiveness and administrative burden of a number of key ('major') measures (Chapter 3). The final chapter presents the conclusions and recommendations (Chapter 4).

  10. Development of contextual task analysis for NPP control room operators' work

    International Nuclear Information System (INIS)

    The paper introduces a contextual approach to task analysis concerning control room operators' tasks and task conditions in nuclear power plants. The approach is based on the ecological concept of the situational appropriateness of activity. The task demands depend on the ultimate task of the operators, which is to maintain the critical safety functions of the process. The context also sets boundary conditions on the fulfilment of these demands. The conceptualisation of the context affords possibilities to comprehend and make visible the core demands of the operators' work. Characteristic of the approach is that the conceptualisation is made both from the point of view of the operators, who are making interpretations of the situation, and from the point of view of the process to be controlled. The context is described as a world of operators' possibilities and constraints and, at the same time, in relation to the demands set by the nature of the process. The method is under development and has been applied in simulator training, in the evaluation of control room information, and in the integrated development of reliability analysis. The method emphasizes the role of explicit conceptualisation of the task situations. Explicitness enhances its role as a conceptual tool and therefore promotes common awareness in these domains. (orig.)

  11. Extending hierarchical task analysis to identify cognitive demands and information design requirements.

    Science.gov (United States)

    Phipps, Denham L; Meakin, George H; Beatty, Paul C W

    2011-07-01

    While hierarchical task analysis (HTA) is well established as a general task analysis method, there appears to be a need to make more explicit both the cognitive elements of a task and the design requirements that arise from an analysis. One way of achieving this is to make use of extensions to the standard HTA. The aim of the current study is to evaluate the use of two such extensions--the sub-goal template (SGT) and the skills-rules-knowledge (SRK) framework--to analyse the cognitive activity that takes place during the planning and delivery of anaesthesia. In quantitative terms, the two methods were found to have relatively poor inter-rater reliability; however, qualitative evidence suggests that the two methods were nevertheless of value in generating insights about anaesthetists' information handling and cognitive performance. Implications for the use of an extended HTA to analyse work systems are discussed.

  12. Using plant procedures as the basis for conducting a job and task analysis

    International Nuclear Information System (INIS)

    Plant procedures were selected by Northeast Utilities (NU) as the basis for conducting Job and Task Analyses (JTA). The resultant JTA was used to design procedure-based simulator training programs for Millstone 1, 2, and Connecticut Yankee. The task listings were both plant-specific and exhibited excellent correlation to INPO's generic PWR and BWR task analyses. Using the procedures-based method enabled us to perform the JTA using plant and training staff. This proved cost-effective in terms of both time and money. Learning objectives developed from the JTA were easily justified and correlated directly to job performance within the context of the plant procedures. In addition, the analysis generated a comprehensive review of plant procedures and, conversely, the plant's normal procedure revision process generated an automatic trigger for updating the task data.

  13. Workflow Modelling and Analysis Based on the Construction of Task Models

    Directory of Open Access Journals (Sweden)

    Glória Cravo

    2015-01-01

    In this paper we describe the structure of a workflow as a graph whose vertices represent tasks and whose arcs are associated with workflow transitions. To each task an input/output logic operator is associated. Furthermore, we associate a Boolean term with each transition present in the workflow. We further identify the structure of workflows and describe their dynamism through the construction of new task models. This construction is simple and intuitive, since it is based on the analysis of all tasks present in the workflow, which allows us to describe the dynamism of the workflow very easily. Our approach thus has the advantage of being very intuitive, which is an important highlight of our work. We also introduce the concept of logical termination of workflows and provide conditions under which this property is valid. Finally, we provide a counter-example which shows that a conjecture presented in a previous article is false.
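
    To make the formalism concrete, here is a minimal data-structure sketch based only on what the abstract states (tasks carrying input/output logic operators, transitions carrying Boolean terms). The operator names, the example workflow, and the reachability check, which is merely a necessary condition for logical termination rather than the paper's full criterion, are our own illustrative choices:

```python
from dataclasses import dataclass, field

@dataclass
class Task:
    name: str
    input_op: str = "AND"   # logic operator joining incoming transitions
    output_op: str = "AND"  # logic operator splitting outgoing transitions

@dataclass
class Workflow:
    tasks: dict = field(default_factory=dict)
    transitions: list = field(default_factory=list)  # (src, dst, boolean_term)

    def add(self, *tasks):
        for t in tasks:
            self.tasks[t.name] = t

    def connect(self, src, dst, term="True"):
        self.transitions.append((src, dst, term))

    def can_reach_sink(self, start):
        # Necessary condition for logical termination: from every task some
        # output task (a vertex with no outgoing transitions) is reachable.
        succ = {}
        for s, d, _ in self.transitions:
            succ.setdefault(s, []).append(d)
        sinks = set(self.tasks) - set(succ)
        seen, stack = set(), [start]
        while stack:
            node = stack.pop()
            if node in sinks:
                return True
            if node not in seen:
                seen.add(node)
                stack.extend(succ.get(node, []))
        return False

wf = Workflow()
wf.add(Task("receive"), Task("approve", output_op="XOR"), Task("ship"), Task("reject"))
wf.connect("receive", "approve")
wf.connect("approve", "ship", term="amount_ok")
wf.connect("approve", "reject", term="not amount_ok")
print(all(wf.can_reach_sink(t) for t in wf.tasks))  # True for this example
```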

  14. Advanced coal-using community systems. Task 1A. Technology characteristics. Volume 1. Fuel- and energy-production systems

    Energy Technology Data Exchange (ETDEWEB)

    Tison, R.R.; Blazek, C.F.; Biederman, N.P.; Malik, N.J.; Gamze, M.G.; Wetterstrom, D.; Diskant, W.; Malfitani, L.

    1979-03-01

    This report is presented in 2 volumes. It contains descriptions of engineering characterizations and equipment used in coal processing, fuel and energy distribution, storage, and end-use utilization. Volume 1 contains 4 chapters dealing with: coal conversion processes (high- and low-Btu gas from coal and coal-to-liquid fuels); coal cleaning and direct combustion (pretreating, direct combustion, and stack gas cleaning); electricity production (compression-ignition engines, turbines, combined-cycle, fuel cells, alternative Rankine cycles, Stirling cycles, and closed Brayton cycles); and thermal generating processes (steam plants, direct-contact steam-heated hot water systems, thermal liquid plants, absorption chillers, and centrifugal chillers). (DMC)

  15. Analysis of automated highway system risks and uncertainties. Volume 5

    Energy Technology Data Exchange (ETDEWEB)

    Sicherman, A.

    1994-10-01

    This volume describes a risk analysis performed to help identify important Automated Highway System (AHS) deployment uncertainties and quantify their effect on costs and benefits for a range of AHS deployment scenarios. The analysis identified a suite of key factors affecting vehicle and roadway costs, capacities and market penetrations for alternative AHS deployment scenarios. A systematic protocol was utilized for obtaining expert judgments of key factor uncertainties in the form of subjective probability percentile assessments. Based on these assessments, probability distributions on vehicle and roadway costs, capacity and market penetration were developed for the different scenarios. The cost/benefit risk methodology and analysis provide insights by showing how uncertainties in key factors translate into uncertainties in summary cost/benefit indices.
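
    The protocol described, expert percentile judgments propagated to cost/benefit indices, can be sketched with Monte Carlo sampling. The factor names, percentile values, and the form of the benefit index below are invented placeholders, and treating the 10th/50th/90th percentile judgments as the corners of a triangular distribution is a deliberate simplification:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

def sample(p10, p50, p90, size=n):
    # Crude stand-in: use the expert's 10th/50th/90th percentile judgments
    # as the left/mode/right parameters of a triangular distribution.
    return rng.triangular(p10, p50, p90, size)

vehicle_cost = sample(1500, 2500, 5000)   # $ per equipped vehicle (hypothetical)
roadway_cost = sample(2.0, 5.0, 12.0)     # $M per lane-mile (hypothetical)
capacity_gain = sample(1.5, 2.2, 3.0)     # multiple of baseline lane capacity

# Toy cost/benefit index: capacity gain per unit of normalized cost.
index = capacity_gain / (vehicle_cost / 2500 + roadway_cost / 5.0)
print("median index:", round(float(np.median(index)), 2))
print("10th-90th percentile band:", np.percentile(index, [10, 90]).round(2))
```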

  16. Analysis of volume holographic storage allowing large-angle illumination

    Science.gov (United States)

    Shamir, Joseph

    2005-05-01

    Advanced technological developments have stimulated renewed interest in volume holography for applications such as information storage and wavelength multiplexing for communications and laser beam shaping. In these and many other applications, the information-carrying wave fronts usually possess narrow spatial-frequency bands, although they may propagate at large angles with respect to each other or a preferred optical axis. Conventional analytic methods are not capable of properly analyzing the optical architectures involved. For mitigation of the analytic difficulties, a novel approximation is introduced to treat narrow spatial-frequency band wave fronts propagating at large angles. This approximation is incorporated into the analysis of volume holography based on a plane-wave decomposition and Fourier analysis. As a result of the analysis, the recently introduced generalized Bragg selectivity is rederived for this more general case and is shown to provide enhanced performance for the above indicated applications. The power of the new theoretical description is demonstrated with the help of specific examples and computer simulations. The simulations reveal some interesting effects, such as coherent motion blur, that were predicted in an earlier publication.
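
    For orientation, the plane-wave decomposition on which such an analysis rests can be written in its standard angular-spectrum form (generic optics notation, not the paper's specific derivation):

```latex
% Scalar field as a superposition of plane waves (standard angular spectrum)
u(x, y, z) = \iint A(k_x, k_y)\, e^{\,i\,(k_x x + k_y y + k_z z)}\; \mathrm{d}k_x\, \mathrm{d}k_y,
\qquad k_z = \sqrt{k^2 - k_x^2 - k_y^2}
```

    A narrow spatial-frequency band wave front propagating at a large angle corresponds to A(k_x, k_y) concentrated around a carrier frequency far from the axis, which is precisely the regime the approximation described in the abstract is designed to handle.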

  17. Synfuel program analysis. Volume I. Procedures-capabilities

    Energy Technology Data Exchange (ETDEWEB)

    Muddiman, J. B.; Whelan, J. W.

    1980-07-01

    This is the first of two volumes describing the analytic procedures and resulting capabilities developed by Resource Applications (RA) for examining the economic viability, public costs, and national benefits of alternative synfuel projects and integrated programs. This volume is intended for Department of Energy (DOE) and Synthetic Fuel Corporation (SFC) program management personnel and includes a general description of the costing, venture, and portfolio models with enough detail for the reader to be able to specify cases and interpret outputs. It also contains an explicit description (with examples) of the types of results which can be obtained when applied to: the analysis of individual projects; the analysis of input uncertainty, i.e., risk; and the analysis of portfolios of such projects, including varying technology mixes and buildup schedules. In all cases, the objective is to obtain, on the one hand, comparative measures of private investment requirements and expected returns (under differing public policies) as they affect the private decision to proceed, and, on the other, public costs and national benefits as they affect public decisions to participate (in what form, in what areas, and to what extent).

  18. Motion analysis of knee joint using dynamic volume images

    Science.gov (United States)

    Haneishi, Hideaki; Kohno, Takahiro; Suzuki, Masahiko; Moriya, Hideshige; Mori, Sin-ichiro; Endo, Masahiro

    2006-03-01

    Acquisition and analysis of the three-dimensional movement of the knee joint is desired in orthopedic surgery. We have developed two methods to obtain dynamic volume images of the knee joint. One is a 2D/3D registration method combining bi-plane dynamic X-ray fluoroscopy with static three-dimensional CT; the other is a method using so-called 4D-CT, which uses a cone beam and a wide 2D detector. In this paper, we present two analyses of knee joint movement obtained by these methods: (1) transition of the nearest points between the femur and tibia, and (2) principal component analysis (PCA) of six parameters representing the three-dimensional movement of the knee. As preprocessing for the analysis, the femur and tibia regions are first extracted from the volume data at each time frame, and then registration of the tibia between different frames is performed by an affine transformation consisting of rotation and translation. The same transformation is applied to the femur as well. Using those image data, the movement of the femur relative to the tibia can be analyzed. Six movement parameters of the femur, consisting of three translation parameters and three rotation parameters, are obtained from those images. In analysis (1), the axis of each bone is first found and then the flexion angle of the knee joint is calculated. For each flexion angle, the minimum distance between femur and tibia and the location giving the minimum distance are found in both the lateral condyle and the medial condyle. As a result, it was observed that the movement of the lateral condyle is larger than that of the medial condyle. In analysis (2), it was found that the movement of the knee can be represented by the first three principal components with a precision of 99.58%, and those three components seem to relate strongly to the three major movements of the femur in the knee bend known in orthopedic surgery.
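
    The PCA step can be sketched as follows; the six pose parameters per time frame match the abstract's description, while the number of frames and the motion data themselves are random placeholders:

```python
import numpy as np

rng = np.random.default_rng(7)
# 40 time frames x 6 motion parameters (3 translations + 3 rotations
# of the femur relative to the tibia); random stand-in data.
frames = rng.normal(size=(40, 6))

centered = frames - frames.mean(axis=0)
_, svals, _ = np.linalg.svd(centered, full_matrices=False)
explained = svals**2 / (svals**2).sum()

# The paper reports ~99.58% of the knee movement in the first three components.
print("cumulative explained variance:", np.cumsum(explained).round(4))
```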

  19. The Quantitative Overhead Analysis for Effective Task Migration in Biosensor Networks

    Directory of Open Access Journals (Sweden)

    Sung-Min Jung

    2013-01-01

    We present a quantitative overhead analysis for effective task migration in biosensor networks. A biosensor network is a key technology which can automatically provide accurate and specific parameters of a human in real time. Biosensor nodes are typically very small devices, so the use of computing resources is restricted. Due to this limitation, a biosensor network is vulnerable to external attacks that aim to exhaust system availability. Since biosensor nodes generally deal with sensitive and private data, their malfunction can bring unexpected damage to the system. Therefore, we have to use a task migration process to avoid the malfunction of particular biosensor nodes. It is also essential to analyze overhead accurately in order to apply a proper migration process. In this paper, we calculated the task processing time of nodes to analyze system overhead and compared the task processing time when a migration process is applied with that of a general method. We focused on the cluster ratio and differing processing times between biosensor nodes in our simulation environment. The results of the performance evaluation show that task execution time is greatly influenced by the cluster ratio and by differing processing times of biosensor nodes. In the results, the proposed algorithm reduces total task execution time in a migration process.

  1. Performance monitoring and analysis of task-based OpenMP.

    Directory of Open Access Journals (Sweden)

    Yi Ding

    OpenMP, a typical shared-memory programming paradigm, has been extensively applied in the high-performance computing community due to the popularity of multicore architectures in recent years. The most significant feature of the OpenMP 3.0 specification is the introduction of the task constructs to express parallelism at a much finer level of detail. This feature, however, has posed new challenges for performance monitoring and analysis. In particular, task creation is separated from its execution, causing the traditional monitoring methods to be ineffective. This paper presents a mechanism to monitor task-based OpenMP programs with interposition and proposes two demonstration graphs for performance analysis as well. The results of two experiments are discussed to evaluate the overhead of the monitoring mechanism and to verify the effects of the demonstration graphs using the BOTS benchmarks.

  2. Performance monitoring and analysis of task-based OpenMP.

    Science.gov (United States)

    Ding, Yi; Hu, Kai; Wu, Kai; Zhao, Zhenlong

    2013-01-01

    OpenMP, a typical shared-memory programming paradigm, has been extensively applied in the high-performance computing community due to the popularity of multicore architectures in recent years. The most significant feature of the OpenMP 3.0 specification is the introduction of the task constructs to express parallelism at a much finer level of detail. This feature, however, has posed new challenges for performance monitoring and analysis. In particular, task creation is separated from its execution, causing the traditional monitoring methods to be ineffective. This paper presents a mechanism to monitor task-based OpenMP programs with interposition and proposes two demonstration graphs for performance analysis as well. The results of two experiments are discussed to evaluate the overhead of the monitoring mechanism and to verify the effects of the demonstration graphs using the BOTS benchmarks.

  3. Parallel runway requirement analysis study. Volume 1: The analysis

    Science.gov (United States)

    Ebrahimi, Yaghoob S.

    1993-01-01

    The correlation of increased flight delays with the level of aviation activity is well recognized. A main contributor to these flight delays has been the capacity of airports. Though new airport and runway construction would significantly increase airport capacity, few programs of this type are currently underway, let alone planned, because of the high cost associated with such endeavors. Therefore, it is necessary to achieve the most efficient and cost effective use of existing fixed airport resources through better planning and control of traffic flows. In fact, during the past few years the FAA has initiated such an airport capacity program designed to provide additional capacity at existing airports. Some of the improvements that that program has generated thus far have been based on new Air Traffic Control procedures, terminal automation, additional Instrument Landing Systems, improved controller display aids, and improved utilization of multiple runways/Instrument Meteorological Conditions (IMC) approach procedures. A useful element to understanding potential operational capacity enhancements at high demand airports has been the development and use of an analysis tool called The PLAND_BLUNDER (PLB) Simulation Model. The objective for building this simulation was to develop a parametric model that could be used for analysis in determining the minimum safety level of parallel runway operations for various parameters representing the airplane, navigation, surveillance, and ATC system performance. This simulation is useful as: a quick and economical evaluation of existing environments that are experiencing IMC delays, an efficient way to study and validate proposed procedure modifications, an aid in evaluating requirements for new airports or new runways in old airports, a simple, parametric investigation of a wide range of issues and approaches, an ability to tradeoff air and ground technology and procedures contributions, and a way of considering probable

  4. An analysis of physiological signals as a measure of task engagement in a multi-limb-coordination motor-learning task.

    Science.gov (United States)

    Murray, Spencer A; Goldfarb, Michael

    2015-01-01

    There is widespread agreement in the physical rehabilitation community that task engagement is essential to effective neuromuscular recovery. Despite this, there are no clear measures of such task engagement. This paper assesses the extent to which certain physiological measurements might provide a measure of task engagement. In previous studies, correlations between mental focus and certain physiological measurements have been observed in subjects performing tasks requiring mental effort. In this study, the authors analyzed whether these signals showed similar correlations when subjects performed a multi-limb-coordination motor-learning task. Subjects played a video game which required the use of both arms and one leg to play a simplified electronic drum set with varying difficulty. Heart rate (HR), skin conductance level (SCL), and facial electromyogram (EMG) were recorded while the subjects played. Analysis of the recordings showed statistically significant correlations relating task difficulty to SCL, HR, and EMG amplitude in the corrugator supercilii. No statistically significant correlation was observed between task difficulty and EMG in the frontalis.
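
    The correlation analysis described can be sketched simply; the difficulty levels, trial counts, and simulated skin conductance values below are placeholders for the study's recorded signals:

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(3)
difficulty = np.repeat([1, 2, 3, 4], 15)               # four difficulty levels, 15 trials each
scl = 2.0 + 0.4 * difficulty + rng.normal(0, 0.5, 60)  # simulated SCL (microsiemens)

r, p = pearsonr(difficulty, scl)
print(f"difficulty vs SCL: r = {r:.2f}, p = {p:.4g}")
```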

  5. Task Analysis of Emergency Operating Procedures for Generating Quantitative HRA Data

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Yochan; Park, Jinkyun; Kim, Seunghwan; Choi, Sun Yeong; Jung, Wondea; Jang, Inseok [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-10-15

    In this paper, the analysis results for the emergency tasks in the procedures (EOPs: emergency operating procedures) that can be observed from simulator data are introduced. The task type, component type, system type, and additional information related to the performance of the operators are described. In addition, a prospective application of the analyzed information to the HEP quantification process is discussed. In the probabilistic safety assessment (PSA) field, various human reliability analyses (HRAs) have been performed to produce estimates of human error probabilities (HEPs) for significant tasks in complex socio-technical systems. To this end, many HRA methods have provided basic or nominal HEPs for typical tasks and the quantitative relations describing how a certain performance context or performance shaping factors (PSFs) affect the HEPs. In the HRA community, however, the necessity of appropriate and sufficient human performance data has recently been indicated. This is because a wide range of quantitative estimates in the previous HRA methods are not supported by solid empirical bases. Hence, there have been attempts to collect HRA supporting data. For example, KAERI has started to collect information on both unsafe acts of operators and the relevant PSFs. A characteristic of the database being developed at KAERI is that human errors and related PSF surrogates that are objectively observable are collected from full-scope simulator experiences. In this environment, to produce concretely grounded bases for the HEPs, the traits or attributes of tasks where significant human errors can be observed should be clearly determined. The determined traits should make it possible to compare the HEPs associated with those traits against the data in previous HRA methods or databases. In this study, task characteristics in a Westinghouse type of EOPs were analyzed by defining task, component, and system taxonomies.

  6. CancellationTools: All-in-one software for administration and analysis of cancellation tasks.

    Science.gov (United States)

    Dalmaijer, Edwin S; Van der Stigchel, Stefan; Nijboer, Tanja C W; Cornelissen, Tim H W; Husain, Masud

    2015-12-01

    In a cancellation task, a participant is required to search for and cross out ("cancel") targets, which are usually embedded among distractor stimuli. The number of cancelled targets and their location can be used to diagnose the neglect syndrome after stroke. In addition, the organization of search provides a potentially useful way to measure executive control over multitarget search. Although many useful cancellation measures have been introduced, most fail to make their way into research studies and clinical practice due to the practical difficulty of acquiring such parameters from traditional pen-and-paper measures. Here we present new, open-source software that is freely available to all. It allows researchers and clinicians to flexibly administer computerized cancellation tasks using stimuli of their choice, and to directly analyze the data in a convenient manner. The automated analysis suite provides output that includes almost all of the currently existing measures, as well as several new ones introduced here. All tasks can be performed using either a computer mouse or a touchscreen as an input device, and an online version of the task runtime is available for tablet devices. A summary of the results is produced in a single A4-sized PDF document, including high quality data visualizations. For research purposes, batch analysis of large datasets is possible. In sum, CancellationTools allows users to employ a flexible, computerized cancellation task, which provides extensive benefits and ease of use. PMID:25381020

  8. Low Probability Tail Event Analysis and Mitigation in BPA Control Area: Task 2 Report

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Shuai; Makarov, Yuri V.; McKinstry, Craig A.; Brothers, Alan J.; Jin, Shuangshuang

    2009-09-18

    Task report detailing low probability tail event analysis and mitigation in the BPA control area. A tail event refers to a situation in a power system where unfavorable load and wind forecast errors are superimposed on fast load and wind ramps, or where non-wind generators fall short of scheduled output, causing the imbalance between generation and load to become very significant.

  9. Noticing in Task Performance and Learning Outcomes: A Qualitative Analysis of Instructional Effects in Interlanguage Pragmatics

    Science.gov (United States)

    Takahashi, Satomi

    2005-01-01

    This study aims to provide an in-depth qualitative analysis of instructional effects in L2 pragmatics by exploring the manner in which Japanese EFL learners' noticing of target English request forms is constrained by different types of treatment tasks and the subsequent effect of the learners' noticing on their learning outcomes. Following the…

  10. Analysis of Operators Comments on the PSF Questionnaire of the Task Complexity Experiment 2003/2004

    Energy Technology Data Exchange (ETDEWEB)

    Torralba, B.; Martinez-Arias, R.

    2007-07-01

    Human Reliability Analysis (HRA) methods usually take into account the effect of Performance Shaping Factors (PSF). The adequate treatment of PSFs in the HRA of Probabilistic Safety Assessment (PSA) models is therefore of crucial importance, and there is a strong need for PSF data collected in simulator experiments. During the task complexity experiment 2003-2004, carried out in the BWR simulator of the Halden Man-Machine Laboratory (HAMMLAB), data on PSFs were collected by means of a PSF Questionnaire. Seven crews (each composed of a shift supervisor, a reactor operator and a turbine operator) from Swedish nuclear power plants participated in the experiment. The PSF Questionnaire collected data on the factors: procedures, training and experience, indications, controls, team management, team communication, individual work practice, available time for the tasks, number of tasks or information load, masking and seriousness. The main statistically significant results are presented in 'Performance Shaping Factors data collection and analysis of the task complexity experiment 2003/2004' (HWR-810). This report describes the analysis of the comments about PSFs that the operators provided on the PSF Questionnaire: the comments given for each PSF in the scenarios have been summarised using a content analysis technique. (Author)

  11. The Use of Cognitive Task Analysis to Capture Expertise for Tracheal Extubation Training in Anesthesiology

    Science.gov (United States)

    Embrey, Karen K.

    2012-01-01

    Cognitive task analysis (CTA) is a knowledge elicitation technique employed for acquiring expertise from domain specialists to support the effective instruction of novices. CTA guided instruction has proven effective in improving surgical skills training for medical students and surgical residents. The standard, current method of teaching clinical…

  12. A language independent task engine for incremental name and type analysis

    NARCIS (Netherlands)

    Wachsmuth, G.H.; Konat, G.D.P.; Vergu, V.A.; Groenewegen, D.M.; Visser, E.

    2013-01-01

    This paper is a pre-print of: Guido H. Wachsmuth, Gabriel D.P. Konat, Vlad A. Vergu, Danny M. Groenewegen, Eelco Visser. A Language Independent Task Engine for Incremental Name and Type Analysis. In: Martin Erwig, Richard F. Paige, Eric Van Wyk, editors, Software Language Engineering, Sixth Interna

  13. Analysis of target volumes for gliomas; Volumes-cibles anatomocliniques (GTV et CTV) des tumeurs gliales

    Energy Technology Data Exchange (ETDEWEB)

    Kantor, G. [Centre Regional de Lutte Contre le Cancer, Service de Radiotherapie, Institut Bergonie, 33 - Bordeaux (France); Bordeaux-2 Univ., 33 (France); Loiseau, H. [Hopital Pellegrin-Tripode, Service de Neurochirurgie, 33 - Bordeaux (France); Bordeaux-2 Univ., 33 (France)

    2005-06-15

    Gliomas are the most frequent tumors of the central nervous system in adults. These intra-parenchymal tumors are infiltrative, and the most important criterion for the definition of the GTV and CTV is the extent of infiltration. Delineation of the GTV and CTV for untreated and resected gliomas remains a controversial and difficult issue because of the discrepancy between real tumor invasion and that estimated by CT or MRI. A joint analysis of four different approaches is particularly helpful: histopathological correlation with CT and MRI, use of new imaging modalities, patterns of relapse after treatment, and interobserver studies. The presence of isolated tumor cells in intact brain, oedema or adjacent structures requires the definition of two different options for the CTV: i) a geometrical option, with the GTV defined as the tumor mass revealed by the contrast-enhanced zone on CT or MRI and a CTV with an expanded margin of 2 or 3 cm; ii) an anatomic option, including the entire zone of oedema or isolated tumor cell infiltration extending at least as far as the limits of the hyperintense zone on T2-weighted MRI. Inclusion of adjacent structures (such as white matter, corpus callosum, subarachnoid spaces) in the CTV mainly depends on the site of the tumor, and the resulting volume is generally enlarged. (authors)

  14. Analysis of Partial Volume Effects on Accurate Measurement of the Hippocampus Volume

    Institute of Scientific and Technical Information of China (English)

    Maryam Hajiesmaeili; Jamshid Dehmeshki; Tim Ellis

    2014-01-01

    Hippocampal volume loss is an important biomarker in distinguishing subjects with Alzheimer's disease (AD), and its measurement in magnetic resonance images (MRI) is influenced by partial volume effects (PVE). This paper describes a post-processing approach that quantifies PVE in order to correct the hippocampal volume, using a spatial fuzzy C-means (SFCM) method. The algorithm is evaluated on a dataset of 20 T1-weighted MRI scans sampled at two different resolutions. The corrected volumes for the left and right hippocampus (HC) are 23% and 18% lower than the manually segmented volumes for the low-resolution dataset, and 6% and 5% lower for the high-resolution dataset, respectively. The results show the importance of applying this technique in AD detection with low-resolution datasets.
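
    To make the correction idea concrete, the following is a minimal sketch of plain fuzzy C-means on voxel intensities (the paper's SFCM variant additionally uses spatial neighbourhood information, which this sketch omits; the two-class mixture data are invented). The soft memberships are what allow a partial-volume voxel to be apportioned between tissue classes rather than assigned wholly to one:

        # Standard fuzzy C-means on 1-D intensities: memberships u and centers.
        import numpy as np

        def fcm(x, c=2, m=2.0, iters=50):
            x = np.asarray(x, dtype=float)
            centers = np.linspace(x.min(), x.max(), c)
            for _ in range(iters):
                d = np.abs(x[:, None] - centers[None, :]) + 1e-9     # distances
                u = 1.0 / np.sum((d[:, :, None] / d[:, None, :]) ** (2 / (m - 1)), axis=2)
                centers = (u ** m).T @ x / np.sum(u ** m, axis=0)    # weighted update
            return u, centers

        # Two synthetic tissue classes with overlapping intensity ranges.
        voxels = np.r_[np.random.normal(30, 3, 200), np.random.normal(80, 5, 200)]
        u, centers = fcm(voxels)
        print(centers, u[:3])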

  15. Parametric analysis of architectural volumes through genetic algorithms

    Directory of Open Access Journals (Sweden)

    Pedro Salcedo Lagos

    2015-03-01

    In recent years, architectural design has developed partly thanks to new digital design techniques, which allow the generation of geometries based on the definition of initial parameters and the programming of formal relationships between them. Design processes based on these technologies can create shapes with the capacity to adapt to multiple constraints or specific evaluation criteria, which raises the problem of identifying the best architectural solution. Several studies have employed genetic algorithms to address this problem. This paper demonstrates the possibility of implementing a parametric analysis of architectural volumes with a genetic algorithm, in order to combine functional, environmental and structural requirements with an effective search method for selecting a variety of suitable solutions through digital technologies.
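
    A minimal sketch of the search loop such papers describe, assuming a toy fitness over three hypothetical volume parameters (the real method encodes full parametric volume definitions and multi-criteria evaluation):

        # Toy genetic algorithm over parametric volume dimensions.
        import random

        def fitness(p):                       # invented score: favour compact
            width, depth, height = p          # volumes with little facade area
            volume = width * depth * height
            facade = 2 * (width + depth) * height
            return volume / facade

        def evolve(pop_size=30, gens=40):
            pop = [[random.uniform(3, 30) for _ in range(3)] for _ in range(pop_size)]
            for _ in range(gens):
                pop.sort(key=fitness, reverse=True)
                parents = pop[:pop_size // 2]
                children = []
                while len(children) < pop_size - len(parents):
                    a, b = random.sample(parents, 2)
                    child = [random.choice(pair) for pair in zip(a, b)]  # crossover
                    if random.random() < 0.2:                            # mutation
                        i = random.randrange(3)
                        child[i] *= random.uniform(0.8, 1.2)
                    children.append(child)
                pop = parents + children
            return max(pop, key=fitness)

        print(evolve())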

  16. Factor analysis for imperfect maintenance planning at nuclear power plants by cognitive task analysis

    International Nuclear Information System (INIS)

    Imperfect maintenance planning was frequently identified in domestic nuclear power plants. To prevent such an event, we analyzed causal factors in maintenance planning stages and showed the directionality of countermeasures in this study. There is a pragmatic limit in finding the causal factors from the items based on report descriptions. Therefore, the idea of the systemic accident model, which is used to monitor the performance variability in normal circumstances, is taken as a new concept instead of investigating negative factors. As an actual method for analyzing usual activities, cognitive task analysis (CTA) was applied. Persons who experienced various maintenance activities at one electric power company were interviewed about sources related to decision making during maintenance planning, and then usual factors affecting planning were extracted as performance variability factors. The tendency of domestic events was analyzed using the classification item of those factors, and the directionality of countermeasures was shown. The following are critical for preventing imperfect maintenance planning: the persons in charge should fully understand the situation of the equipment for which they are responsible in the work planning and maintenance evaluation stages, and they should definitely understand, for example, the maintenance bases of that equipment. (author)

  17. The Use Of Computational Human Performance Modeling As Task Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Jacques Hugo; David Gertman

    2012-07-01

    During a review of the Advanced Test Reactor safety basis at the Idaho National Laboratory, human factors engineers identified ergonomic and human reliability risks involving the inadvertent exposure of a fuel element to the air during manual fuel movement and inspection in the canal. There were clear indications that these risks increased the probability of human error and possible severe physical outcomes to the operator. In response to this concern, a detailed study was conducted to determine the probability of the inadvertent exposure of a fuel element. Due to practical and safety constraints, the task network analysis technique was employed to study the work procedures at the canal. Discrete-event simulation software was used to model the entire procedure as well as the salient physical attributes of the task environment, such as distances walked, the effect of dropped tools, the effect of hazardous body postures, and physical exertion due to strenuous tool handling. The model also allowed analysis of the effect of cognitive processes such as visual perception demands, auditory information and verbal communication. The model made it possible to obtain reliable predictions of operator performance and workload estimates. It was also found that operator workload as well as the probability of human error in the fuel inspection and transfer task were influenced by the concurrent nature of certain phases of the task and the associated demand on cognitive and physical resources. More importantly, it was possible to determine with reasonable accuracy the stages as well as physical locations in the fuel handling task where operators would be most at risk of losing their balance and falling into the canal. The model also provided sufficient information for a human reliability analysis that indicated that the postulated fuel exposure accident was less than credible.

  18. Aviation and programmatic analyses; Volume 1, Task 1: Aviation data base development and application. [for NASA OAST programs

    Science.gov (United States)

    1977-01-01

    A method was developed for using the NASA aviation data base and computer programs in conjunction with the GE management analysis and projection service to perform simple and complex economic analysis for planning, forecasting, and evaluating OAST programs. Capabilities of the system are discussed along with procedures for making basic data tabulations, updates and entries. The system is applied in an agricultural aviation study in order to assess its value for actual utility in the OAST working environment.

  19. Multi-task linear programming discriminant analysis for the identification of progressive MCI individuals.

    Directory of Open Access Journals (Sweden)

    Guan Yu

    Accurately identifying mild cognitive impairment (MCI) individuals who will progress to Alzheimer's disease (AD) is very important for making early interventions. Many classification methods focus on integrating multiple imaging modalities such as magnetic resonance imaging (MRI) and fluorodeoxyglucose positron emission tomography (FDG-PET). However, the main challenge for MCI classification using multiple imaging modalities is the existence of a lot of missing data in many subjects. For example, in the Alzheimer's Disease Neuroimaging Initiative (ADNI) study, almost half of the subjects do not have PET images. In this paper, we propose a new and flexible binary classification method, namely Multi-task Linear Programming Discriminant (MLPD) analysis, for incomplete multi-source feature learning. Specifically, we decompose the classification problem into different classification tasks, i.e., one for each combination of available data sources. To solve all the different classification tasks jointly, our proposed MLPD method links them together by constraining them to achieve a similar estimated mean difference between the two classes (under classification) for those shared features. Compared with the state-of-the-art incomplete Multi-Source Feature (iMSF) learning method, instead of constraining different classification tasks to choose a common feature subset for those shared features, MLPD can flexibly and adaptively choose different feature subsets for different classification tasks. Furthermore, our proposed MLPD method can be efficiently implemented by linear programming. To validate our MLPD method, we perform experiments on the ADNI baseline dataset with the incomplete MRI and PET images from 167 progressive MCI (pMCI) subjects and 226 stable MCI (sMCI) subjects. We further compared our method with the iMSF method (using incomplete MRI and PET images) and also the single-task classification method (using only MRI or only subjects with both MRI and

  20. BWR stability analysis: methodology of the stability analysis and results of PSI for the NEA/NCR benchmark task

    International Nuclear Information System (INIS)

    The report describes the PSI stability analysis methodology and the validation of this methodology based on the international OECD/NEA BWR stability benchmark task. In the frame of this work, the stability properties of some operation points of the NPP Ringhals 1 have been analysed and compared with the experimental results. (author) figs., tabs., 45 refs

  1. Change Best: Task 2.3. Analysis of policy mix and development of Energy Efficiency Services

    International Nuclear Information System (INIS)

    The aim of the Change Best project is to promote the development of an energy efficiency service (EES) market and to give good practice examples of changes in energy service business, strategies, and supportive policies and measures in the course of the implementation of Directive 2006/32/EC on Energy End-Use Efficiency and Energy Services. This report addresses task 2.3: Analysis of policy mix and development of Energy Efficiency Services.

  2. The Quantitative Overhead Analysis for Effective Task Migration in Biosensor Networks

    OpenAIRE

    Sung-Min Jung; Tae-Kyung Kim; Jung-Ho Eom; Tai-Myoung Chung

    2013-01-01

    We present a quantitative overhead analysis for effective task migration in biosensor networks. A biosensor network is the key technology which can automatically provide accurate and specific parameters of a human in real time. Biosensor nodes are typically very small devices, so the use of computing resources is restricted. Due to the limitation of nodes, the biosensor network is vulnerable to an external attack against a system for exhausting system availability. Since biosensor nodes gener...

  3. Brief experimental analysis of stimulus prompts for accurate responding on academic tasks in an outpatient clinic.

    OpenAIRE

    McComas, J J; Wacker, D P; Cooper, L J; Asmus, J M; Richman, D; Stoner, B

    1996-01-01

    Brief multielement designs were used to examine the effects of specific instructional strategies on accuracy of academic performance during outpatient evaluations of 4 children with learning disorders. Instructional strategies that improved accuracy on academic tasks were identified for all participants. These results suggest that the application of experimental analysis methodologies to instructional variables may facilitate the identification of stimulus prompts that are associated with enh...

  4. Energy Consumption Analysis Procedure for Robotic Applications in different task motion

    Science.gov (United States)

    Ahmed, Iman; Aris, Ishak b.; Hamiruce Marhaban, Mohammad; Juraiza Ishak, Asnor

    2015-11-01

    This work proposes an energy analysis method for a humanoid robot, covering simple to complex task motions in the energy chain. The research developed a procedure suitable for the analysis, saving and modelling of energy consumption, not only in this type of robot but also in most robots that are based on electrical power as an energy source. The method was validated by accurately integrating the power consumption curve in Matlab to calculate the energy of individual and multiple servo motors. This study can therefore be considered a procedure for energy analysis that exploits the capabilities of laboratory instruments to measure the energy parameters. We performed various task motions with different angular speeds to find the speed limits in terms of robot stability and control strategy. A battery capacity investigation covering several types of batteries was carried out to extract the power modelling equation and the energy density parameter for each battery type. Matlab software was used to implement the algorithm and to evaluate the experimental amount of energy, represented by the area under the power curves. This provides a robust estimate of the energy required in different task motions, to be considered in energy saving (i.e., motion planning and real-time scheduling).
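
    The core computation, the area under the measured power curve, is a short numerical integration. A sketch under assumed sample data (the original analysis was done in Matlab; the power trace here is invented):

        # Energy = integral of power over time, via the trapezoidal rule.
        import numpy as np

        t = np.linspace(0.0, 5.0, 501)                 # s, sampling instants
        power = 12.0 + 8.0 * np.abs(np.sin(2 * t))     # W, hypothetical servo draw

        energy_J = np.trapz(power, t)                  # area under the power curve
        print(f"task energy: {energy_J:.1f} J = {energy_J / 3600:.4f} Wh")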

  5. Analysis on Refinery System as a Complex Task-resource Network

    Institute of Scientific and Technical Information of China (English)

    LIU Suyu; RONG Gang

    2013-01-01

    The refinery system, a typical example of a process system, is presented as a complex network in this paper. The topology of the system is described by a task-resource network and modeled as a directed and weighted graph, in which nodes represent various tasks and edges denote the resources exchanged among tasks. Using the properties of the node degree distribution, strength distribution and other weighted quantities, we demonstrate the heterogeneity of the network and point out the relation between the structural characteristics of vertices and the functionality of the corresponding tasks. These phenomena indicate that the design requirements and principles of the production process contribute to the heterogeneous features of the network. Besides, the betweenness centrality of nodes can be used as an importance indicator to provide additional information for decision making. The correlations between structural and weighted properties are investigated to further address the influence of production schemes on the system's connectivity patterns. A cascading-failure model is employed to analyze the robustness of the network under targeted attack. Two capacity assignment strategies are compared in order to improve the robustness of the network at a given cost. The refinery system displays more reliable behavior when the protection strategy takes the heterogeneous properties into account. This phenomenon further implies the structure-activity relationship of the refinery system and provides insightful suggestions for process system design. The results also indicate that robustness analysis is a promising application of complex-network methodologies to process system engineering.
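
    A minimal sketch of the representation described above, assuming a tiny invented flowsheet (node names and flow weights are illustrative, not taken from the paper): tasks are nodes, exchanged resources are weighted directed edges, and node strength and betweenness centrality are then read off the graph:

        # Task-resource network as a directed, weighted graph (networkx).
        import networkx as nx

        G = nx.DiGraph()
        flows = [("crude_dist", "reformer", 120.0), ("crude_dist", "cracker", 80.0),
                 ("reformer", "blending", 90.0), ("cracker", "blending", 60.0)]
        G.add_weighted_edges_from(flows)               # weights = resource flows

        strength = dict(G.degree(weight="weight"))     # node strength
        between = nx.betweenness_centrality(G)         # importance indicator
        print(strength)
        print(between)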

  6. Hawaii Energy Strategy Project 2: Fossil Energy Review. Task IV. Scenario development and analysis

    Energy Technology Data Exchange (ETDEWEB)

    Yamaguchi, N.D.; Breazeale, K. [ed.

    1993-12-01

    The Hawaii Energy Strategy (HES) Program is a seven-project effort led by the State of Hawaii Department of Business, Economic Development & Tourism (DBEDT) to investigate a wide spectrum of Hawaii energy issues. The East-West Center's Program on Resources: Energy and Minerals, has been assigned HES Project 2, Fossil Energy Review, which focuses on fossil energy use in Hawaii and the greater regional and global markets. HES Project 2 has four parts: Task I (World and Regional Fossil Energy Dynamics) covers petroleum, natural gas, and coal in global and regional contexts, along with a discussion of energy and the environment. Task II (Fossil Energy in Hawaii) focuses more closely on fossil energy use in Hawaii: current utilization and trends, the structure of imports, possible future sources of supply, fuel substitutability, and energy security. Task III's emphasis is Greenfield Options; that is, fossil energy sources not yet used in Hawaii. This task is divided into two sections: first, an in-depth "Assessment of Coal Technology Options and Implications for the State of Hawaii," along with a spreadsheet analysis model, which was subcontracted to the Environmental Assessment and Information Sciences Division of Argonne National Laboratory; and second, a chapter on liquefied natural gas (LNG) in the Asia-Pacific market and the issues surrounding possible introduction of LNG into the Hawaii market.

  7. Proposal of Constraints Analysis Method Based on Network Model for Task Planning

    Science.gov (United States)

    Tomiyama, Tomoe; Sato, Tatsuhiro; Morita, Toyohisa; Sasaki, Toshiro

    Deregulation has been accelerating several activities toward reengineering business processes, such as railway through-service and modal shift in logistics. To make those activities successful, business entities have to formulate new business rules or know-how (we call them 'constraints'). According to the new constraints, they need to manage business resources such as instruments, materials, workers and so on. In this paper, we propose a constraint analysis method to define constraints for task planning of new business processes. To visualize each constraint's influence on planning, we propose a network model which represents allocation relations between tasks and resources. The network can also represent task ordering relations and resource grouping relations. The proposed method formalizes the way constraints are defined manually as a process of repeatedly checking the network structure and finding conflicts between constraints. Application to crew scheduling problems shows that the method can adequately represent and define constraints of task planning problems with the following fundamental features: (1) specifying a work pattern for some resources, (2) restricting the number of resources for some works, (3) requiring multiple resources for some works, (4) prior allocation of some resources to some works and (5) considering the workload balance between resources.

  8. Investigation of advanced counterrotation blade configuration concepts for high speed turboprop systems. Task 5: Unsteady counterrotation ducted propfan analysis

    Science.gov (United States)

    Hall, Edward J.; Delaney, Robert A.

    1993-01-01

    The primary objective of this study was the development of a time-marching three-dimensional Euler/Navier-Stokes aerodynamic analysis to predict steady and unsteady compressible transonic flows about ducted and unducted propfan propulsion systems employing multiple blade rows. The computer codes resulting from this study are referred to as ADPAC-AOACR (Advanced Ducted Propfan Analysis Codes-Angle of Attack Coupled Row). This document is the final report describing the theoretical basis and analytical results from the ADPAC-AOACR codes developed under task 5 of NASA Contract NAS3-25270, Unsteady Counterrotating Ducted Propfan Analysis. The ADPAC-AOACR Program is based on a flexible multiple blocked grid discretization scheme permitting coupled 2-D/3-D mesh block solutions with application to a wide variety of geometries. For convenience, several standard mesh block structures are described for turbomachinery applications. Aerodynamic calculations are based on a four-stage Runge-Kutta time-marching finite volume solution technique with added numerical dissipation. Steady flow predictions are accelerated by a multigrid procedure. Numerical calculations are compared with experimental data for several test cases to demonstrate the utility of this approach for predicting the aerodynamics of modern turbomachinery configurations employing multiple blade rows.
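
    To illustrate the time-marching idea only (this is not the ADPAC code), the following sketch applies a Jameson-style four-stage Runge-Kutta scheme to a generic residual; the toy linear-decay residual stands in for the finite-volume flux balance:

        # Four-stage Runge-Kutta time marching: u_k = u_0 + a_k*dt*R(u_{k-1}).
        import numpy as np

        def residual(u):
            return -u                          # placeholder for the flux residual

        def rk4_step(u, dt, alphas=(0.25, 1.0 / 3.0, 0.5, 1.0)):
            u0, uk = u.copy(), u.copy()
            for a in alphas:                   # low-storage multistage update
                uk = u0 + a * dt * residual(uk)
            return uk

        u = np.ones(10)
        for _ in range(100):
            u = rk4_step(u, dt=0.05)
        print(u[0])                            # marches toward the steady state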

  9. Finite-Volume Analysis for the Cahn-Hilliard equation with Dynamic boundary conditions

    OpenAIRE

    Nabet, Flore

    2014-01-01

    This work is devoted to the convergence analysis of a finite-volume approximation of the 2D Cahn-Hilliard equation with dynamic boundary conditions. The method that we propose couples a 2d-finite-volume method in a bounded, smooth domain and a 1d-finite-volume method on its boundary. We prove convergence of the sequence of approximate solutions.

  10. Yucca Mountain transportation routes: Preliminary characterization and risk analysis; Volume 2, Figures [and] Volume 3, Technical Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Souleyrette, R.R. II; Sathisan, S.K.; di Bartolo, R. [Nevada Univ., Las Vegas, NV (United States). Transportation Research Center

    1991-05-31

    This report presents appendices related to the preliminary assessment and risk analysis for high-level radioactive waste transportation routes to the proposed Yucca Mountain Project repository. Information includes data on population density, traffic volume, ecologically sensitive areas, and accident history.

  11. A genetic analysis of brain volumes and IQ in children

    NARCIS (Netherlands)

    Leeuwen, van Marieke; Peper, Jiska S.; Berg, van den Stephanie M.; Brouwer, Rachel M.; Hulshoff Pol, Hilleke E.; Kahn, Rene S.; Boomsma, Dorret I.

    2009-01-01

    In a population-based sample of 112 nine-year old twin pairs, we investigated the association among total brain volume, gray matter and white matter volume, intelligence as assessed by the Raven IQ test, verbal comprehension, perceptual organization and perceptual speed as assessed by the Wechsler I

  13. A Genetic Analysis of Brain Volumes and IQ in Children

    Science.gov (United States)

    van Leeuwen, Marieke; Peper, Jiska S.; van den Berg, Stephanie M.; Brouwer, Rachel M.; Hulshoff Pol, Hilleke E.; Kahn, Rene S.; Boomsma, Dorret I.

    2009-01-01

    In a population-based sample of 112 nine-year old twin pairs, we investigated the association among total brain volume, gray matter and white matter volume, intelligence as assessed by the Raven IQ test, verbal comprehension, perceptual organization and perceptual speed as assessed by the Wechsler Intelligence Scale for Children-III. Phenotypic…

  14. Volume component analysis for classification of LiDAR data

    Science.gov (United States)

    Varney, Nina M.; Asari, Vijayan K.

    2015-03-01

    One of the most difficult challenges of working with LiDAR data is the large amount of data points that are produced. Analysing these large data sets is an extremely time-consuming process. For this reason, automatic perception of LiDAR scenes is a growing area of research. Currently, most LiDAR feature extraction relies on geometrical features specific to the point cloud of interest. These geometrical features are scene-specific, and often rely on the scale and orientation of the object for classification. This paper proposes a robust method for reduced dimensionality feature extraction of 3D objects using a volume component analysis (VCA) approach. This VCA approach is based on principal component analysis (PCA). PCA is a method of reduced feature extraction that computes a covariance matrix from the original input vector. The eigenvectors corresponding to the largest eigenvalues of the covariance matrix are used to describe an image. Block-based PCA is an adapted method for feature extraction in facial images because PCA, when performed in local areas of the image, can extract more significant features than can be extracted when the entire image is considered. The image space is split into several of these blocks, and PCA is computed individually for each block. This VCA proposes that a LiDAR point cloud can be represented as a series of voxels whose values correspond to the point density within that relative location. From this voxelized space, block-based PCA is used to analyze sections of the space where the sections, when combined, will represent features of the entire 3-D object. These features are then used as the input to a support vector machine which is trained to identify four classes of objects: vegetation, vehicles, buildings and barriers, with an overall accuracy of 93.8%.
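
    A sketch of the pipeline as described, under assumed sizes (the 16^3 grid, 4^3 blocks, random stand-in cloud and 3 retained components are illustrative choices, not the paper's settings): voxelise the point cloud into a density grid, split it into blocks, and extract leading principal components per block:

        # Voxelize a point cloud, then block-based PCA on the density grid.
        import numpy as np

        pts = np.random.rand(5000, 3)                    # stand-in point cloud
        grid = np.zeros((16, 16, 16))
        idx = np.minimum((pts * 16).astype(int), 15)
        np.add.at(grid, tuple(idx.T), 1)                 # voxel point densities

        blocks = grid.reshape(4, 4, 4, 4, 4, 4).transpose(0, 2, 4, 1, 3, 5)
        X = blocks.reshape(64, -1).astype(float)         # 64 blocks of 4x4x4 voxels
        X -= X.mean(axis=0)
        _, _, Vt = np.linalg.svd(X, full_matrices=False)
        features = X @ Vt[:3].T                          # leading components per block
        print(features.shape)                            # (64, 3) -> classifier input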

  15. A Novel Segmentation, Mutual Information Network Framework for EEG Analysis of Motor Tasks

    Directory of Open Access Journals (Sweden)

    Lee Pamela

    2009-05-01

    Background: Monitoring the functional connectivity between brain regions is becoming increasingly important in elucidating brain functionality in normal and disease states. Current methods of detecting networks in the recorded electroencephalogram (EEG), such as correlation and coherence, are limited by the fact that they assume stationarity of the relationship between channels and rely on linear dependencies. In contrast to diseases of the brain cortex (e.g. Alzheimer's disease), with motor disorders such as Parkinson's disease (PD) the EEG abnormalities are most apparent during performance of dynamic motor tasks, but this makes the stationarity assumption untenable. Methods: We therefore propose a novel EEG segmentation method based on the temporal dynamics of the cross-spectrogram of the computed Independent Components (ICs). We then utilize mutual information (MI) as the metric for determining also nonlinear statistical dependencies between EEG channels. Graph-theoretical analysis is then applied to the derived MI networks. The method was applied to EEG data recorded from six normal subjects and seven PD subjects off medication. One-way analysis of variance (ANOVA) tests demonstrated statistically significant differences in the connectivity patterns between groups. Results: The results suggested that PD subjects are unable to independently recruit different areas of the brain while performing simultaneous tasks compared to individual tasks, but instead attempt to recruit disparate clusters of synchronous activity to maintain behavioral performance. Conclusion: The proposed segmentation/MI network method appears to be a promising approach for analyzing EEG recorded during dynamic behaviors.
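
    The MI step can be sketched with a simple histogram estimator (the bin count and the synthetic coupled channels below are assumptions for illustration; the paper estimates MI between channels of segmented EEG):

        # Histogram estimate of mutual information between two channels.
        import numpy as np

        def mutual_info(x, y, bins=16):
            pxy, _, _ = np.histogram2d(x, y, bins=bins)
            pxy /= pxy.sum()
            px, py = pxy.sum(axis=1), pxy.sum(axis=0)
            nz = pxy > 0                                 # avoid log(0) terms
            return float(np.sum(pxy[nz] * np.log(pxy[nz] / np.outer(px, py)[nz])))

        rng = np.random.default_rng(0)
        a = rng.standard_normal(2048)
        b = 0.6 * a + 0.8 * rng.standard_normal(2048)    # partially coupled channel
        print(mutual_info(a, b))                         # > 0 for dependent channels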

  16. Task 7a: Dynamic analysis of Paks NPP structures reactor building

    International Nuclear Information System (INIS)

    This report describes the dynamic response calculation of the NPP Paks reactor building for full-scale blast testing. All calculations described in this report were elaborated within the scope of the IAEA coordinated research Benchmark Study for seismic analysis/testing of NPPs of the WWER type, Task 7a: Dynamic Analysis of Paks NPP structures, i.e. the reactor building. The input, in the form of time histories of velocities or accelerations on the free field caused by blast testing, was available only to the participants of task No. 7a. The aim of this task is to calculate the dynamic response to the blast load, in the form of floor response spectra at selected nodes of the structure, without knowing the measured data. The data measured in the full-scale blast test are published and the results of the different calculations compared. The following structures were taken into account: turbine hall, intermediate multi-storey building, lateral multi-storey building, reactor building, ventilation center and condenser towers.

  17. Demonstration project as a procedure for accelerating the application of new technology (Charpie Task Force report). Volume II

    Energy Technology Data Exchange (ETDEWEB)

    None

    1978-02-01

    This report examines the issues associated with government programs proposed for the "commercialization" of new energy technologies; these programs are intended to hasten the pace at which target technologies are adopted by the private sector. The "commercial demonstration" is the principal tool used in these programs. Most previous government interventions in support of technological change have focused on R and D and left to the private sector the decision as to adoption for commercial utilization; thus there is relatively little in the way of analysis or experience which bears direct application. The analysis is divided into four sections. First, the role of R, D, and D within the structure of the national energy goals and policies is examined. The issue of "prices versus gaps" is described as a crucial difference of viewpoint concerning the role of the government in the future of the energy system. Second, the process of technological change as it occurs with respect to energy technologies is then examined for possible sources of misalignment of social and private incentives. The process is described as a series of investments. Third, correction of these sources of misalignment then becomes the goal of commercial demonstration programs as this goal and the means for attaining it are explored. Government-supported commercialization may be viewed as a subsidy to the introduction stage of the process; the circumstances under which such subsidies are likely to affect the success of the subsequent diffusion stage are addressed. The discussion then turns to the political, legal, and institutional problems. Finally, methods for evaluation and planning of commercial demonstration programs are analyzed. The critical areas of ignorance are highlighted and comprise a research agenda for improved analytical techniques to support decisions in this area.

  18. Fuzzy logic approach to SWOT analysis for economics tasks and example of its computer realization

    Directory of Open Access Journals (Sweden)

    Vladimir CHERNOV

    2016-07-01

    The article discusses the widely used classical method of analysis, forecasting and decision-making for various economic problems known as SWOT analysis. As is well known, it is a qualitative, multicriteria comparison of the degrees of Strength, Weakness, Opportunity and Threat for different kinds of risks, for forecasting market development, and for assessing the status and development prospects of enterprises, regions, economic sectors, territories, etc. It can also be successfully applied to the evaluation and analysis of different project management tasks: investment, innovation, marketing, development, design, bringing products to market, and so on. In practical competitive market and economic conditions, however, there are various uncertainties, ambiguities and kinds of vagueness that make the use of SWOT analysis in its classical sense insufficiently justified and ineffective. For this case, the authors propose to use a fuzzy logic approach and the theory of fuzzy sets for a more adequate representation and post-processing of the assessments in the SWOT analysis. In particular, the mathematical formulation of the corresponding task and the main approaches to its solution are briefly presented. Examples of suitable computer calculations in the specialized software Fuzicalc, for processing and operating with fuzzy input data, are also given. Finally, considerations for interpreting the results are presented.
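
    One common way to realize the idea, sketched here under invented ratings (the article itself works in Fuzicalc; triangular fuzzy numbers with centroid defuzzification are one standard choice, not necessarily the authors' exact scheme):

        # SWOT ratings as triangular fuzzy numbers (low, modal, high),
        # averaged per category and defuzzified by centroid.
        import numpy as np

        ratings = {                      # three hypothetical experts per factor
            "Strength":    [(6, 7, 9), (5, 7, 8), (7, 8, 9)],
            "Weakness":    [(2, 4, 5), (3, 4, 6), (2, 3, 5)],
            "Opportunity": [(5, 6, 8), (4, 6, 7), (6, 7, 9)],
            "Threat":      [(3, 5, 6), (4, 5, 7), (3, 4, 6)],
        }
        for factor, triples in ratings.items():
            l, m, u = np.mean(triples, axis=0)
            crisp = (l + m + u) / 3.0            # centroid defuzzification
            print(f"{factor:12s} fuzzy=({l:.1f},{m:.1f},{u:.1f}) crisp={crisp:.2f}")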

  19. Thermal-Hydraulic Analysis Tasks for ANAV NPPs in Support of Plant Operation and Control

    Directory of Open Access Journals (Sweden)

    L. Batet

    2007-11-01

    Thermal-hydraulic analysis tasks aimed at supporting plant operation and control are an important issue for the Asociación Nuclear Ascó-Vandellòs (ANAV). ANAV is the consortium that runs the Ascó power plants (2 units) and the Vandellòs-II power plant. The reactors are Westinghouse-design, 3-loop PWRs with an approximate electrical power of 1000 MW. The Technical University of Catalonia (UPC) thermal-hydraulic analysis team has worked jointly with ANAV engineers at different levels in the analysis and improvement of these reactors. This article is an illustration of the usefulness of computational analysis for operational support. The arrangements presented were operational between 1985 and 2001 and subsequently changed slightly following various organizational adjustments. The paper has two parts. The first describes the specific aspects of thermal-hydraulic analysis tasks related to operation and control, and the second briefly presents the results of three examples of analyses that were performed. All the examples presented relate to actual situations in which the scenarios were studied by analysts using thermal-hydraulic codes and prepared nodalizations. The paper also includes a qualitative evaluation of the benefits ANAV obtained through thermal-hydraulic analyses aimed at supporting operation and plant control.

  20. District heating and cooling systems for communities through power plant retrofit and distribution network. Volume 3. Tasks 4-6. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Watt, J.R.; Sommerfield, G.A.

    1979-08-01

    Stone and Webster Engineering Corporation is a member of the Demonstration Team, responsible for reviewing and assessing the technical aspects of cogeneration for district heating. Task 4 details the most practical retrofit schemes. Of the cogeneration schemes studied, a back-pressure turbine is considered the best source of steam for district heating. Battelle Columbus Laboratories is a member of the Demonstration Team employed to investigate several institutional issues affecting the success of district heating. The Toledo Edison legal staff reviewed the legal aspects of the mandate to serve, easement and franchise requirements, and corporate charter requirements. The principal findings of both the Battelle investigations and the legal research are summarized in Task 5. A complete discussion of each issue is included in the two sections labeled Legal Issues and Institutional Issues. In Task 6, Battelle Columbus Laboratories completed a preliminary economic analysis, incorporating accurate input parameters applicable to utility ownership of the proposed district-heating system. The methodology used is summarized, the assumptions are listed, and the results are briefly reviewed.

  1. Implementation of Hierarchical Task Analysis for User Interface Design in Drawing Application for Early Childhood Education

    Directory of Open Access Journals (Sweden)

    Mira Kania Sabariah

    2016-05-01

    Drawing is an important learning activity in early childhood: it stimulates children's growth and development and helps train fine motor skills. Many applications, including interactive learning applications, can be used to support such learning. However, observations show that the experiences provided by existing applications are very diverse and have not been able to represent the learning model and characteristics of early childhood (4-6 years). Based on these results, the Hierarchical Task Analysis method generated a list of tasks that must be supported when designing a user interface that represents the user experience in drawing-based learning. Evaluation with the Heuristic Evaluation method then showed that the usability of the resulting model reached a very good level of understanding, and that the model can be further enhanced to produce a better design.

  2. Style, content and format guide for writing safety analysis documents. Volume 1, Safety analysis reports for DOE nuclear facilities

    Energy Technology Data Exchange (ETDEWEB)

    1994-06-01

    The purpose of Volume 1 of this 4-volume style guide is to furnish guidelines on writing and publishing Safety Analysis Reports (SARs) for DOE nuclear facilities at Sandia National Laboratories. The scope of Volume 1 encompasses not only the general guidelines for writing and publishing, but also the prescribed topics/appendices contents along with examples from typical SARs for DOE nuclear facilities.

  3. One Size Does Not Fit All: Human Failure Event Decomposition and Task Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ronald Laurids Boring, PhD

    2014-09-01

    In the probabilistic safety assessments (PSAs) used in the nuclear industry, human failure events (HFEs) are determined as a subset of hardware failures, namely those hardware failures that could be triggered or exacerbated by human action or inaction. This approach is top-down, starting with hardware faults and deducing human contributions to those faults. Elsewhere, more traditionally human factors driven approaches would tend to look at opportunities for human errors first in a task analysis and then identify which of those errors is risk significant. The intersection of top-down and bottom-up approaches to defining HFEs has not been carefully studied. Ideally, both approaches should arrive at the same set of HFEs. This question remains central as human reliability analysis (HRA) methods are generalized to new domains like oil and gas. The HFEs used in nuclear PSAs tend to be top-down—defined as a subset of the PSA—whereas the HFEs used in petroleum quantitative risk assessments (QRAs) are more likely to be bottom-up—derived from a task analysis conducted by human factors experts. The marriage of these approaches is necessary in order to ensure that HRA methods developed for top-down HFEs are also sufficient for bottom-up applications. In this paper, I first review top-down and bottom-up approaches for defining HFEs and then present a seven-step guideline to ensure a task analysis completed as part of human error identification decomposes to a level suitable for use as HFEs. This guideline illustrates an effective way to bridge the bottom-up approach with top-down requirements.

  4. Genre analysis and task-based course design for isiXhosa second language teaching in local government contexts

    Directory of Open Access Journals (Sweden)

    Edith Venter

    2011-08-01

    The successful implementation of a multilingual language policy in the public and private sectors in South Africa depends on vibrant research. This article explores the design and nature of isiXhosa communication tasks for specific-purposes second language teaching in local government contexts, within a framework of genre-based and task-based approaches to language teaching. These two approaches also form the theoretical basis of the analysis of the rhetorical move structure and the task types of selected communication tasks.

  5. Space Tug Docking Study. Volume 5: Cost Analysis

    Science.gov (United States)

    1976-01-01

    The cost methodology, summary cost data, resulting cost estimates by Work Breakdown Structure (WBS), technical characteristics data, program funding schedules and the WBS used for the costing are discussed. Cost estimates for two tasks of the study are reported. The first developed cost estimates for design, development, test and evaluation (DDT&E) and theoretical first unit (TFU) costs at the component level (Level 7) for all items reported in the data base. Task B developed total subsystem DDT&E costs and funding schedules for the three candidate Rendezvous and Docking Systems: manual, autonomous, and hybrid.

  6. Analysis of Mexico wind tunnel measurements. Final report of IEA Task 29, Mexnext (Phase 1)

    Energy Technology Data Exchange (ETDEWEB)

    Schepers, J.G.; Boorsma, K. [Energy research Center of the Netherlands ECN, Petten (Netherlands); Cho, T. [Korea Aerospace Research Institute KARI, Daejeon (Korea, Republic of); Gomez-Iradi, S. [National Renewable Energy Center of Spain CENER, Sarriguren (Spain); Schaffarczyk, P.; Jeromin, A. [University of Applied Sciences, CEWind EG, Kiel (Germany); Shen, W.Z. [The Technical University of Denmark, Kongens Lyngby (Denmark); Lutz, T.; Meister, K. [University of Stuttgart, Stuttgart (Germany); Stoevesandt, B. [ForWind, Zentrum fuer Windenergieforschung, Oldenburg (Germany); Schreck, S. [National Renewable Energy Laboratory NREL, Golden, CO (United States); Micallef, D.; Pereira, R.; Sant, T. [Delft University of Technology TUD, Delft (Netherlands); Madsen, H.A.; Soerensen, N. [Risoe-DTU, Roskilde (Denmark)

    2012-02-15

    This report describes the work performed within the first phase of IEA Task 29 Mexnext. In this IEA Task 29 a total of 20 organisations from 11 different countries collaborated in analysing the measurements which have been performed in the EU project 'Mexico'. Within this Mexico project 9 European institutes carried out a wind tunnel experiment in the Large Low Speed Facility (LLF) of the German Dutch Wind Facilities DNW on a rotor with a diameter of 4.5 m. Pressure distributions were measured at five locations along the blade along with detailed flow field measurements around the rotor plane using stereo PIV. As a result of the international collaboration within this task a very thorough analysis of the data could be carried out and a large number of codes were validated not only in terms of loads but also in terms of underlying flow field. The detailed pressure measurements along the blade in combination with the detailed flow field measurements gave a unique opportunity to better understand the response of a wind turbine to the incoming flow field. Deficiencies in modelling have been established and directions for model improvement can be given.

  7. Utilizing job/task analysis to establish content validity in the design of training programs

    International Nuclear Information System (INIS)

    The decade of the 1980's has been a turbulent time for the Department of Energy. With concern mounting about the terrorist threat, a wave of congressional inquiries and internal inspections crossed the nation and engulfed many of the nuclear laboratories and facilities operated by DOE contractors. A typical finding was the need to improve, and increase, the training of the protective force. The immediate reaction resulted in a wide variety of responses, with most contractors feeling safer with too much, rather than not enough training. As soon as the initial pressures to upgrade subsided, a task force was established to evaluate the overall training needs. Representatives from the contractor facilities worked together to conduct a job analysis of the protective force. A generic task inventory was established, and validated at the different sites. This list has been invaluable for determining the tasks, conditions, and standards needed to develop well stated learning objectives. The enhanced training programs are being refined to ensure job content validity based on the data collected

  8. Reliability of steam-turbine rotors. Task 1. Lifetime prediction analysis system. Final report

    International Nuclear Information System (INIS)

    Task 1 of RP 502, Reliability of Steam Turbine Rotors, resulted in the development of a computerized lifetime prediction analysis system (STRAP) for the automatic evaluation of rotor integrity based upon the results of a boresonic examination of near-bore defects. Concurrently an advanced boresonic examination system (TREES), designed to acquire data automatically for lifetime analysis, was developed and delivered to the maintenance shop of a major utility. This system and a semi-automated, state-of-the-art system (BUCS) were evaluated on two retired rotors as part of the Task 2 effort. A modified nonproprietary version of STRAP, called SAFER, is now available for rotor lifetime prediction analysis. STRAP and SAFER share a common fracture analysis postprocessor for rapid evaluation of either conventional boresonic amplitude data or TREES cell data. The final version of this postprocessor contains general stress intensity correlations for elliptical cracks in a radial stress gradient and provision for elastic-plastic instability of the ligament between an imbedded crack and the bore surface. Both linear elastic and ligament rupture models were developed for rapid analysis of linkup within three-dimensional clusters of defects. Bore stress-rupture criteria are included, but a creep-fatigue crack growth data base is not available. Physical and mechanical properties of air-melt 1CrMoV forgings are built into the program; however, only bounding values of fracture toughness versus temperature are available. Owing to the lack of data regarding the probability of flaw detection for the boresonic systems and of quantitative verification of the flaw linkup analysis, automatic evaluation of boresonic results is not recommended, and the lifetime prediction system is currently restricted to conservative, deterministic analysis of specified flaw geometries.

  9. Brain connectivity analysis from EEG signals using stable phase-synchronized states during face perception tasks

    Science.gov (United States)

    Jamal, Wasifa; Das, Saptarshi; Maharatna, Koushik; Pan, Indranil; Kuyucu, Doga

    2015-09-01

    Degree of phase synchronization between different electroencephalogram (EEG) channels is known to be a manifestation of the underlying mechanism of information coupling between different brain regions. In this paper, we apply a continuous wavelet transform (CWT) based analysis technique to EEG data, captured during face perception tasks, to explore the temporal evolution of phase synchronization from the onset of a stimulus. Our explorations show that there exists a small set (typically 3-5) of unique synchronized patterns or synchrostates, each of which is stable on the order of milliseconds. In particular, in the beta (β) band, which has been reported to be associated with visual processing tasks, the number of such stable states has consistently been found to be three. During processing of the stimulus, the switching between these states occurs abruptly, but the switching characteristic follows a well-behaved and repeatable sequence. This is observed both in a single-subject analysis and in a multiple-subject group analysis in adults during face perception. We also show that although these patterns remain topographically similar for the general category of face perception tasks, the sequence of their occurrence and their temporal stability vary markedly between different face perception scenarios (stimuli), pointing toward different, stimulus-specific dynamical characteristics of information processing. Subsequently, we translated these stable states into brain complex networks and derived informative network measures for characterizing the degree of segregated processing and information integration in those synchrostates, leading to a new methodology for characterizing information processing in the human brain. The proposed methodology of modeling functional brain connectivity through synchrostates may be viewed as a new way of quantitatively characterizing the cognitive ability of the subject, the stimuli, and information integration.
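
    A common numerical index behind such analyses is the phase-locking value between two channels. The sketch below derives instantaneous phase with the Hilbert transform for brevity, whereas the paper obtains phase from a continuous wavelet transform; the toy 20 Hz signals stand in for beta-band EEG:

        # Phase-locking value: |mean unit phasor of the phase difference|.
        import numpy as np
        from scipy.signal import hilbert

        def plv(x, y):
            dphi = np.angle(hilbert(x)) - np.angle(hilbert(y))
            return float(np.abs(np.mean(np.exp(1j * dphi))))

        t = np.linspace(0.0, 2.0, 512)
        x = np.sin(2 * np.pi * 20 * t) + 0.3 * np.random.randn(512)
        y = np.sin(2 * np.pi * 20 * t + 0.5) + 0.3 * np.random.randn(512)
        print(plv(x, y))          # near 1 for strongly phase-locked channels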

  10. Performance Analysis Of A Upnp/Dhcompliant Robotic Adapter For Collaborative Tasks Development

    Directory of Open Access Journals (Sweden)

    Alejandro Alvarez Vazquez

    2012-02-01

    This paper describes the performance analysis of an adapter compliant with the UPnP DHCompliant (Digital Home Compliant) standard for a service robot. The DHCompliant adapter was developed to overcome some limitations of the UPnP protocol and to develop new DHC concepts. Moreover, it showcases with a particular example how the open DHC protocol is useful for the development of collaborative tasks, localization, energy management and other fields altogether. This interoperability between devices yields a virtual device that holds the control-point logic and the device logic simultaneously.

  11. Determination of fiber volume in graphite/epoxy materials using computer image analysis

    Science.gov (United States)

    Viens, Michael J.

    1990-01-01

    The fiber volume of graphite/epoxy specimens was determined by analyzing optical images of cross sectioned specimens using image analysis software. Test specimens were mounted and polished using standard metallographic techniques and examined at 1000 times magnification. Fiber volume determined using the optical imaging agreed well with values determined using the standard acid digestion technique. The results were found to agree within 5 percent over a fiber volume range of 45 to 70 percent. The error observed is believed to arise from fiber volume variations within the graphite/epoxy panels themselves. The determination of ply orientation using image analysis techniques is also addressed.
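
    The underlying image-analysis idea reduces to an area-fraction computation on a thresholded cross-section. A sketch with a synthetic image (the threshold value and stand-in image are assumptions; real input would be the polished specimens imaged at 1000x):

        # Fiber volume fraction as the area fraction of bright (fiber) pixels.
        import numpy as np

        rng = np.random.default_rng(1)
        image = rng.uniform(0.0, 1.0, (512, 512))   # stand-in grey-level micrograph
        fibers = image > 0.42                       # threshold fiber vs. matrix

        fiber_fraction = fibers.mean()              # area fraction ~ volume fraction
        print(f"estimated fiber volume: {100 * fiber_fraction:.1f}%")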

  12. Nonnegative least-correlated component analysis for separation of dependent sources by volume maximization.

    Science.gov (United States)

    Wang, Fa-Yu; Chi, Chong-Yung; Chan, Tsung-Han; Wang, Yue

    2010-05-01

    Although significant efforts have been made in developing nonnegative blind source separation techniques, accurate separation of positive yet dependent sources remains a challenging task. In this paper, a joint correlation function of multiple signals is proposed to reveal and confirm that the observations after nonnegative mixing would have higher joint correlation than the original unknown sources. Accordingly, a new nonnegative least-correlated component analysis (n/LCA) method is proposed to design the unmixing matrix by minimizing the joint correlation function among the estimated nonnegative sources. In addition to a closed-form solution for unmixing two mixtures of two sources, the general algorithm of n/LCA for the multisource case is developed based on an iterative volume maximization (IVM) principle and linear programming. The source identifiability and required conditions are discussed and proven. The proposed n/LCA algorithm, denoted by n/LCA-IVM, is evaluated with both simulation data and real biomedical data to demonstrate its superior performance over several existing benchmark methods. PMID:20299711

  13. Mission analysis of photovoltaic solar energy conversion. Volume I. Executive summary

    Energy Technology Data Exchange (ETDEWEB)

    Leonard, S.L.; Rattin, E.J.; Siegel, B.

    1977-03-01

    An investigation of terrestrial applications for the photovoltaic conversion of solar energy is summarized. The specific objectives of the study were: (a) to survey and evaluate near-term (1976--1985) civilian photovoltaic applications in the United States; (b) to evaluate the most promising major missions for the mid-term period (1986--2000) and to determine the conditions under which photovoltaic technology can compete in those applications at array prices consistent with ERDA goals; (c) to address critical external issues and identify the sensitivity of photovoltaic system technical requirements to such factors; and (d) to quantify the societal costs of alternative energy sources and identify equalizing incentives. The study was divided into six separate but interrelated tasks: Task 1, Analysis of Near-Term Applications; Task 2, Analysis of Major Mid-Term Missions; Task 3, Review and Updating of the ERDA Technology Implementation Plan; Task 4, Critical External Issues; Task 5, The Impact of Incentives; and Task 6, The Societal Costs of Conventional Power Generation. The emphasis of the study was on the first two of these tasks, the other four serving to provide supplementary information.

  14. Performance Task using Video Analysis and Modelling to promote K12 eight practices of science

    CERN Document Server

    Wee, Loo Kang

    2015-01-01

    We share the use of Tracker as a pedagogical tool for the effective learning and teaching of physics performance tasks, which is taking root in some Singapore Grade 9 (Secondary 3) schools. We discuss how the pedagogical use of Tracker helps students to be like scientists during these 6 to 10 weeks in which all Grade 9 students conduct a personal video analysis applying, where appropriate, the 8 practices of science (1. ask questions, 2. use models, 3. plan and carry out investigations, 4. analyse and interpret data, 5. use mathematical and computational thinking, 6. construct explanations, 7. argue from evidence, and 8. communicate information). We situate our sharing in actual student work and discuss how Tracker could be an effective pedagogical tool. Initial research findings suggest that allowing learners to conduct performance tasks using Tracker, a free open source video analysis and modelling tool, guided by the 8 practices of science and engineering, could be an innovative and effective way to mentor authent...

  15. The human factors and job task analysis in nuclear power plant operation

    International Nuclear Information System (INIS)

    After a long period during the development of NPP technology in which the plant hardware was considered the main factor for safe, reliable and economic operation, the industry is now moving toward an adequate balance of responsibility between plant hardware and operation. Since human factors have not yet been treated methodically, there is still a lack of refined classification systems for human errors, as well as a lack of methods for a systematic approach to designing the operator's working system, for instance by using job task analysis (J.T.A.). The J.T.A. appears to be an adequate method for studying the human factor in nuclear power plant operation, enabling easy conversion into operational improvements. While the results of the analysis of human errors tell 'what' is to be improved, the J.T.A. shows 'how' to improve it, increasing the quality of the work and the safety of the operator's working system. The paper analyses the issue of setting the task and presents four criteria used to select aspects of NPP operation which require special consideration, such as personnel training, control room design, content and layout of the procedure manual, or organization of the operating personnel. The results are given in three tables: 1 - Evaluation of Deficiencies in the Working System; 2 - Evaluation of the Deficiencies of the Operator's Disposition; 3 - Evaluation of the Mental Structure of Operation.

  16. Automated segmentation and dose-volume analysis with DICOMautomaton

    Science.gov (United States)

    Clark, H.; Thomas, S.; Moiseenko, V.; Lee, R.; Gill, B.; Duzenli, C.; Wu, J.

    2014-03-01

    Purpose: Exploration of historical data for regional organ dose sensitivity is limited by the effort needed to (sub-)segment large numbers of contours. A system has been developed which can rapidly perform autonomous contour sub-segmentation and generic dose-volume computations, substantially reducing the effort required for exploratory analyses. Methods: A contour-centric approach is taken which enables lossless, reversible segmentation and dramatically reduces computation time compared with voxel-centric approaches. Segmentation can be specified on a per-contour, per-organ, or per-patient basis, and can be performed along either an embedded plane or in terms of the contour's bounds (e.g., splitting an organ into fractional-volume/dose pieces along any 3D unit vector). More complex segmentation techniques are available. Anonymized data from 60 head-and-neck cancer patients were used to compare dose-volume computations with Varian's Eclipse™ (Varian Medical Systems, Inc.). Results: Computed mean doses and dose-volume histograms agree strongly with those from Varian's Eclipse™. Contours which have been segmented can be injected back into patient data permanently and in a Digital Imaging and Communications in Medicine (DICOM)-conforming manner. Lossless segmentation persists across such injection, and remains fully reversible. Conclusions: DICOMautomaton allows researchers to rapidly, accurately, and autonomously segment large amounts of data into intricate structures suitable for analyses of regional organ dose sensitivity.
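
    For illustration, the generic dose-volume computation at the heart of such tools amounts to histogramming dose over a segmented structure. The following voxel-based numpy sketch is hypothetical (DICOMautomaton itself is contour-centric precisely to avoid this voxelization) and computes a cumulative DVH for a toy dose grid:

        # Minimal sketch of the generic dose-volume computation (voxel-based;
        # DICOMautomaton's contour-centric method avoids this voxelization).
        import numpy as np

        def cumulative_dvh(dose, mask, bins=100):
            """Fraction of structure volume receiving at least each dose level."""
            d = dose[mask]                            # doses inside the structure
            levels = np.linspace(0.0, d.max(), bins)
            volume_frac = np.array([(d >= L).mean() for L in levels])
            return levels, volume_frac

        # Toy 3D dose grid and a spherical "organ" mask
        z, y, x = np.mgrid[0:40, 0:40, 0:40]
        r2 = (x - 20) ** 2 + (y - 20) ** 2 + (z - 20) ** 2
        dose = 60.0 * np.exp(-r2 / 200.0)             # Gaussian fall-off, in Gy
        mask = r2 < 10 ** 2
        levels, vf = cumulative_dvh(dose, mask)
        d50 = levels[np.searchsorted(-vf, -0.5)]      # dose covering half the volume
        print(f"mean dose {dose[mask].mean():.1f} Gy, D50 {d50:.1f} Gy")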

  17. Job/task analysis for I ampersand C [Instrumentation and Controls] instrument technicians at the High Flux Isotope Reactor

    International Nuclear Information System (INIS)

    To comply with Department of Energy Order 5480.XX (Draft), a job/task analysis was initiated by the Maintenance Management Department at Oak Ridge National Laboratory (ORNL). The analysis was applicable to instrument technicians working at the ORNL High Flux Isotope Reactor (HFIR). This document presents the procedures and results of that analysis. 2 refs., 2 figs

  18. Analysis of some nuclear waste management options. Volume II. Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Berman, L.E.; Ensminger, D.A.; Giuffre, M.S.; Koplik, C.M.; Oston, S.G.; Pollak, G.D.; Ross, B.I.

    1978-10-10

    This report describes risk analyses performed on that portion of a nuclear fuel cycle which begins following solidification of high-level waste. Risks associated with handling, interim storage and transportation of the waste are assessed, as well as the long term implications of disposal in deep mined cavities. The risk is expressed in terms of expected dose to the general population and peak dose to individuals in the population. This volume consists of appendices which provide technical details of the work performed.

  19. Analysis of airborne radiometric data. Volume 3. Topical reports

    International Nuclear Information System (INIS)

    This volume consists of four topical reports: a general discussion of the philosophy of unfolding spectra with continuum and discrete components, a mathematical treatment of the effects of various physical parameters on the uncollided gamma-ray spectrum at aircraft elevations, a discussion of the application of the unfolding code MAZNAI to airborne data, and a discussion of the effects of the nonlinear relationship between energy deposited and pulse height in NaI(Tl) detectors.

  20. Measuring novices' field mapping abilities using an in-class exercise based on expert task analysis

    Science.gov (United States)

    Caulkins, J. L.

    2010-12-01

    We are interested in developing a model of expert-like behavior for improving the teaching methods of undergraduate field geology. Our aim is to assist students in mastering the process of field mapping more efficiently and effectively and to improve their ability to think creatively in the field. To examine expert mapping behavior, a cognitive task analysis was conducted with expert geologic mappers in an attempt to define the process of geologic mapping (i.e. to understand how experts carry out geological mapping). The task analysis indicates that expert mappers have a wealth of geologic scenarios at their disposal that they compare against examples seen in the field, experiences that most undergraduate mappers will not have had. While presenting students with many geological examples in class may increase their understanding of geologic processes, novices still struggle when presented with a novel field situation. Based on the task analysis, a short (45-minute) paper-map-based exercise was designed and tested with 14 pairs of 3rd year geology students. The exercise asks students to generate probable geologic models based on a series of four (4) data sets. Each data set represents a day’s worth of data; after the first “day,” new sheets simply include current and previously collected data (e.g. the “Day 2” data set includes data from “Day 1” plus the new “Day 2” data). As the geologic complexity increases, students must adapt, reject or generate new geologic models in order to fit the growing data set. Preliminary results of the exercise indicate that students who produced more probable geologic models, and higher ratios of probable to improbable models, tended to go on to do better on the mapping exercises at the 3rd year field school. These results suggest that those students with more cognitively available geologic models may be more able to use these models in field settings than those who are unable to draw on these models for whatever reason.

  1. Three-dimensional knee kinematics by conventional gait analysis for eleven motor tasks of daily living: typical patterns and repeatability.

    Science.gov (United States)

    Scheys, Lennart; Leardini, Alberto; Wong, Pius D; Van Camp, Laurent; Callewaert, Barbara; Bellemans, Johan; Desloovere, Kaat

    2013-04-01

    The availability of detailed knee kinematic data during various activities can facilitate clinical studies of this joint. To describe in detail normal knee joint rotations in all three anatomical planes, 25 healthy subjects (aged 22-49 years) performed eleven motor tasks, including walking, step ascent and descent, each with and without sidestep or crossover turns, chair rise, mild and deep squats, and forward lunge. Kinematic data were obtained with a conventional lower-body gait analysis protocol over three trials per task. To assess the repeatability with standard indices, a representative subset of 10 subjects underwent three repetitions of the entire motion capture session. Extracted parameters with good repeatability included maximum and minimum axial rotation during turning, local extremes of the flexion curves during gait tasks, and stride times. These specific repeatable parameters can be used for task selection or power analysis when planning future clinical studies.

  2. Assessment of solar options for small power systems applications. Volume III. Analysis of concepts

    Energy Technology Data Exchange (ETDEWEB)

    Laity, W.W.; Aase, D.T.; Apley, W.J.; Bird, S.P.; Drost, M.K.; Garrett-Price, B.A.; Williams, T.A.

    1980-09-01

    A comparative analysis of solar thermal conversion concepts that are potentially suitable for development as small electric power systems (1 to 10 MWe) is given. Seven generic types of collectors, together with associated subsystems for electric power generation, were considered. The collectors can be classified into three categories: (1) two-axis tracking (with compound-curvature reflecting surfaces); (2) one-axis tracking (with single-curvature reflecting surfaces); and (3) nontracking (with low-concentration reflecting surfaces). All seven collectors were analyzed in conceptual system configurations with Rankine-cycle engines. In addition, two of the collectors (the Point Focus Central Receiver and the Point Focus Distributed Receiver) were analyzed with Brayton-cycle engines, and the latter of the two was also analyzed with Stirling-cycle engines. This volume describes the systems analyses performed on all the alternative configurations of the seven generic collector concepts and the results obtained. The SOLSTEP computer code used to determine each configuration's system cost and performance is briefly described. The collector and receiver performance calculations used are also presented. The capital investment and related costs obtained from the systems studies are presented, and the levelized energy costs are given as a function of the capacity factor obtained from the systems studies. Also included are the values of the other attributes used in the concepts' final ranking. The comments, conclusions, and recommendations developed by the PNL study team during the concept characterization and systems analysis tasks of the study are presented. (WHK)

  3. Curriculum Construction: A Critical Analysis of Rich Tasks in the Recontextualisation Field

    Science.gov (United States)

    Macdonald, Doune; Hunter, Lisa; Tinning, Richard

    2007-01-01

    Within Education Queensland's recent "new basics" curriculum initiative, Education Queensland developed 20 transdisciplinary learning and assessment tasks for Years 1 to 9, called "rich tasks". This paper critiques two of the rich tasks that were most closely aligned to knowledge and skills within the health and physical education learning area.…

  4. Human factors assessment in PRA using Task Analysis Linked Evaluation Technique (TALENT)

    International Nuclear Information System (INIS)

    Thirty years ago the US military and US aviation industry, and more recently, in response to the US Three Mile Island and USSR Chernobyl accidents, the US commercial nuclear power industry, acknowledged that human error, both as an immediate precursor and as a latent or indirect influence (through training, maintainability, inservice test, and surveillance programs), is a primary contributor to unreliability and risk in complex high-reliability systems. A 1985 Nuclear Regulatory Commission (NRC) study of Licensee Event Reports (LERs) suggests that upwards of 65% of commercial nuclear system failures involve human error. Despite the magnitude and nature of the human error cited in that study, there has been limited attention to personnel-centered issues, especially person-to-person issues involving group processes, management and organizational environment. The paper discusses NRC integration and applications research with respect to the Task Analysis Linked Evaluation Technique (TALENT) in risk assessment applications.

  5. Performance-based training: from job and task analysis to training materials

    International Nuclear Information System (INIS)

    Historically, the smoke-filled-room approach has been used to revise training programs: instructors would sit down and design a program based on existing training materials and any federal requirements that applied. This failure to reflect a systematic definition of required job functions, responsibilities, and performance standards in training programs has resulted in generic program deficiencies: the programs do not provide complete training of the required skills and knowledge. Recognition of this need for change, coupled with a decrease in experienced industry personnel inputs and with long training pipelines, has heightened the need for efficient performance-based training programs which are derived from and referenced to job performance criteria. This paper presents the process for developing performance-based training materials based on job and task analysis products.

  6. Dose volume analysis in brachytherapy and stereotactic radiosurgery

    Energy Technology Data Exchange (ETDEWEB)

    Tozer-Loft, S.M

    2000-12-01

    A brief introduction to three branches of radiotherapy is given: interstitial brachytherapy, external beam megavoltage radiotherapy, and stereotactic radiosurgery. The current interest in issues around conformity, uniformity and optimisation is explained in the light of technical developments in these fields. A novel method of displaying dose-volume information, which mathematically suppresses the inverse-square law, as first suggested by L.L. Anderson for use in brachytherapy, is explained in detail, and some improvements are proposed. These 'natural' histograms are extended to show the effects of real point sources, which do not exactly follow the inverse-square law, and to demonstrate the in-target dose-volume distribution, previously unpublished. The histograms are used as a way of mathematically analysing the properties of theoretical mono-energetic radionuclides, and for demonstrating the dosimetric properties of a potential new brachytherapy source (Ytterbium-169). A new modification of the Anderson formalism is then described for producing Anderson Inverse-Square Shifted (AISS) histograms for the Gamma Knife, which are shown to be useful for demonstrating the quality of stereotactic radiosurgery dose distributions. A study is performed analysing the results of Gamma Knife treatments on 44 patients suffering from a benign brain tumour (acoustic neuroma). Follow-up data are used to estimate the volume shrinkage or growth of each tumour, and this measure of outcome is compared with a range of figures of merit which express different aspects of the quality of each dose distribution. The results are analysed in an attempt to answer the question: which features of the dose distribution (conformality, uniformity, etc.) show a definite relationship with the outcome of the treatment? Initial results show positively that, when Gamma Knife radiosurgery is used to treat acoustic neuroma, some measures of conformality seem to have a surprising, but significant, association with outcome.

  7. Detecting Hidden Encrypted Volume Files via Statistical Analysis

    Directory of Open Access Journals (Sweden)

    Mario Piccinelli

    2015-05-01

    Nowadays various software tools have been developed for the purpose of creating encrypted volume files. Many of those tools are open source and freely available on the internet. Because of that, the probability of finding encrypted files which could contain forensically useful information has dramatically increased. While decrypting these files without the key is still a major challenge, simply being able to recognize their existence is now a top priority in every digital forensics investigation. In this paper we present a statistical approach to find elements of a seized filesystem which have a reasonable chance of containing encrypted data.
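
    As a toy sketch of one such statistical test (an entropy and chi-square uniformity heuristic in the spirit of the paper's approach; the paper's actual statistics and thresholds may differ), a file whose byte distribution is nearly uniform, as encrypted volumes are, can be flagged like this:

        # Toy sketch: flag candidate encrypted files by near-maximal byte entropy
        # and a chi-square uniformity statistic (illustrative thresholds only).
        import math
        import os

        def byte_stats(path, chunk=1 << 20):
            counts = [0] * 256
            with open(path, "rb") as f:
                while block := f.read(chunk):
                    for b in block:
                        counts[b] += 1
            n = sum(counts)
            entropy = -sum(c / n * math.log2(c / n) for c in counts if c)
            expected = n / 256
            chi2 = sum((c - expected) ** 2 / expected for c in counts)
            return entropy, chi2

        def looks_encrypted(path, min_entropy=7.99, max_chi2=400.0):
            """Encrypted/random data: ~8 bits/byte entropy, chi2 near 255 dof."""
            h, chi2 = byte_stats(path)
            return h >= min_entropy and chi2 <= max_chi2

        # Demo: random bytes (encrypted-like) vs. text-like bytes
        with open("random.bin", "wb") as f:
            f.write(os.urandom(1 << 20))
        with open("text.bin", "wb") as f:
            f.write(b"the quick brown fox " * 50_000)
        print("random.bin:", looks_encrypted("random.bin"))   # True (usually)
        print("text.bin:  ", looks_encrypted("text.bin"))     # False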

  8. Synfuel program analysis. Volume 2: VENVAL users manual

    Science.gov (United States)

    Muddiman, J. B.; Whelan, J. W.

    1980-07-01

    This volume is intended for program analysts and is a users manual for the VENVAL model. It contains specific explanations as to input data requirements and programming procedures for the use of this model. VENVAL is a generalized computer program to aid in evaluation of prospective private sector production ventures. The program can project interrelated values of installed capacity, production, sales revenue, operating costs, depreciation, investment, debt, earnings, taxes, return on investment, depletion, and cash flow measures. It can also compute related public sector and other external costs and revenues if unit costs are furnished.

  9. Control Volume Analysis, Entropy Balance and the Entropy Production in Flow Systems

    CERN Document Server

    Niven, Robert K

    2014-01-01

    This chapter concerns "control volume analysis", the standard engineering tool for the analysis of flow systems, and its application to entropy balance calculations. Firstly, the principles of control volume analysis are enunciated and applied to flows of conserved quantities (e.g. mass, momentum, energy) through a control volume, giving integral (Reynolds transport theorem) and differential forms of the conservation equations. Several definitions of steady state are discussed. The concept of "entropy" is then established using Jaynes' maximum entropy method, both in general and in equilibrium thermodynamics. The thermodynamic entropy then gives the "entropy production" concept. Equations for the entropy production are then derived for simple, integral and infinitesimal flow systems. Some technical aspects are examined, including discrete and continuum representations of volume elements, the effect of radiation, and the analysis of systems subdivided into compartments. A Reynolds decomposition of the entropy ...
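
    For reference, the integral balance referred to here is the Reynolds transport theorem, which in a standard textbook form (not a quotation from the chapter) reads, for a conserved quantity with specific (per-unit-mass) density φ, with CV the control volume and CS its bounding control surface:

        \frac{d}{dt}\int_{\mathrm{sys}}\rho\,\phi\,dV
        \;=\;
        \frac{\partial}{\partial t}\int_{CV}\rho\,\phi\,dV
        \;+\;
        \oint_{CS}\rho\,\phi\,(\mathbf{v}\cdot\mathbf{n})\,dA

    Setting φ = 1, v, or e recovers the mass, momentum, and energy balances; applying the same bookkeeping to the thermodynamic entropy, with a source term for irreversibility, gives the entropy production discussed above.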

  10. Analysis and asynchronous detection of gradually unfolding errors during monitoring tasks.

    Science.gov (United States)

    Omedes, Jason; Iturrate, Iñaki; Minguez, Javier; Montesano, Luis

    2015-10-01

    Human studies on cognitive control processes rely on tasks involving sudden-onset stimuli, which allow the analysis of these neural imprints to be time-locked and relative to the stimuli onset. Human perceptual decisions, however, comprise continuous processes where evidence accumulates until reaching a boundary. Surpassing the boundary leads to a decision where measured brain responses are associated to an internal, unknown onset. The lack of this onset for gradual stimuli hinders both the analyses of brain activity and the training of detectors. This paper studies electroencephalographic (EEG)-measurable signatures of human processing for sudden and gradual cognitive processes represented as a trajectory mismatch under a monitoring task. Time-locked potentials and brain-source analysis of the EEG of sudden mismatches revealed the typical components of event-related potentials and the involvement of brain structures related to cognitive control processing. For gradual mismatch events, time-locked analyses did not show any discernible EEG scalp pattern, despite related brain areas being, to a lesser extent, activated. However, and thanks to the use of non-linear pattern recognition algorithms, it is possible to train an asynchronous detector on sudden events and use it to detect gradual mismatches, as well as obtaining an estimate of their unknown onset. Post-hoc time-locked scalp and brain-source analyses revealed that the EEG patterns of detected gradual mismatches originated in brain areas related to cognitive control processing. This indicates that gradual events induce latency in the evaluation process but that similar brain mechanisms are present in sudden and gradual mismatch events. Furthermore, the proposed asynchronous detection model widens the scope of applications of brain-machine interfaces to other gradual processes. PMID:26193332

  12. Hydrogen Safety Project chemical analysis support task: Window "C" volatile organic analysis

    Energy Technology Data Exchange (ETDEWEB)

    Gillespie, B.M.; Stromatt, R.W.; Ross, G.A.; Hoope, E.A.

    1992-01-01

    This data package contains the results obtained by Pacific Northwest Laboratory (PNL) staff in the characterization of samples for the 101-SY Hydrogen Safety Project. The samples were submitted for analysis by Westinghouse Hanford Company (WHC) under the Technical Project Plan (TPP) 17667 and the Quality Assurance Plan MCS-027. They came from a core taken during Window "C" after the May 1991 gas release event. The analytical procedures required for analysis were defined in the Test Instructions (TI) prepared by the PNL 101-SY Analytical Chemistry Laboratory (ACL) Project Management Office in accordance with the TPP and the QA Plan. The requested analysis for these samples was volatile organic analysis. The quality control (QC) requirements for each sample are defined in the Test Instructions for each sample. The QC requirements outlined in the procedures and requested in the WHC statement of work were followed.

  14. Turnaround operations analysis for OTV. Volume 2: Detailed technical report

    Science.gov (United States)

    1988-01-01

    The objectives and accomplishments were to adapt and apply the newly created database of Shuttle/Centaur ground operations. Previously defined turnaround operations analyses were to be updated for ground-based OTVs (GBOTVs) and space-based OTVs (SBOTVs), design requirements identified for both OTV and Space Station accommodations hardware, turnaround operations costs estimated, and a technology development plan generated to develop the required capabilities. Technical and programmatic data were provided for NASA pertinent to OTV ground and space operations requirements, turnaround operations, task descriptions, timelines and manpower requirements, OTV modular design and booster and Space Station interface requirements, the SBOTV accommodations development schedule, cost and turnaround operations requirements, and a technology development plan for ground and space operations and space-based accommodations facilities and support equipment. Significant conclusions are discussed.

  15. Unpacking High and Low Efficacy Teachers' Task Analysis and Competence Assessment in Teaching Low-Achieving Students in Secondary Schools

    Science.gov (United States)

    Wang, Li-Yi; Jen-Yi, Li; Tan, Liang-See; Tan, Irene; Lim, Xue-Fang; Wu, Bing Sheng

    2016-01-01

    This study adopted a pragmatic qualitative research design to unpack high and low efficacy teachers' task analysis and competence assessment in the context of teaching low-achieving students. Nine secondary school English and Science teachers were recruited and interviewed. Results of thematic analysis show that helping students perform well in…

  16. Discriminant analysis in schizophrenia and healthy subjects using prefrontal activation during frontal lobe tasks: a near-infrared spectroscopy.

    Science.gov (United States)

    Azechi, Michiyo; Iwase, Masao; Ikezawa, Koji; Takahashi, Hidetoshi; Canuet, Leonides; Kurimoto, Ryu; Nakahachi, Takayuki; Ishii, Ryouhei; Fukumoto, Motoyuki; Ohi, Kazutaka; Yasuda, Yuka; Kazui, Hiroaki; Hashimoto, Ryota; Takeda, Masatoshi

    2010-03-01

    While psychiatric disorders such as schizophrenia are largely diagnosed on symptomatology, several studies have attempted to determine which biomarkers can discriminate schizophrenia patients from individuals without schizophrenia. The objective of this study is to assess whether near-infrared spectroscopy (NIRS) measurement can distinguish schizophrenia patients from healthy subjects. Sixty patients with schizophrenia and sixty age- and gender-matched healthy controls were divided into two sequential groups. The concentration change in oxygenated hemoglobin (Δ[oxy-Hb]) was measured in the bilateral prefrontal areas (Fp1-F7 and Fp2-F8) during the letter and category versions of the Verbal Fluency Test (VFT), the Tower of Hanoi (TOH), Sternberg's Task (SBT) and the Stroop Task. In the first group, schizophrenia patients showed poorer task performance on all tasks and less prefrontal cortex activation during all but the Stroop Task compared to healthy subjects. In the second group, schizophrenia patients showed poorer task performance and less prefrontal cortex activation during the VFTs and the TOH task than healthy subjects. We then performed discriminant analysis by a stepwise method using Δ[oxy-Hb] and task performance measures as independent variables. The discriminant analysis in the first group included task performance on the TOH, VFT letter and VFT category, and Δ[oxy-Hb] during the VFT letter. As a result, 88.3% of the participants in the first analysis were correctly classified as schizophrenia patients or healthy subjects. The discriminant function derived from the first group correctly assigned 75% of the subjects in the second group. Our findings suggest that NIRS measurement could be applied to differentiate patients with schizophrenia from healthy subjects. PMID:19896332
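
    As a generic illustration of the classification step (a sketch of two-group linear discriminant analysis, not the authors' exact stepwise procedure; the feature names, values, and group sizes below are synthetic stand-ins for the Δ[oxy-Hb] and task-performance measures):

        # Generic sketch of two-group discriminant analysis on NIRS-style
        # features; all data here are synthetic, not the study's measurements.
        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        rng = np.random.default_rng(1)
        n = 60
        # columns: [VFT-letter score, TOH score, mean Δ[oxy-Hb] during VFT]
        patients = rng.normal([8.0, 4.0, 0.05], [3.0, 2.0, 0.04], size=(n, 3))
        controls = rng.normal([12.0, 6.0, 0.12], [3.0, 2.0, 0.04], size=(n, 3))

        # "First group": fit the discriminant function
        X1 = np.vstack([patients[:30], controls[:30]])
        y1 = np.array([0] * 30 + [1] * 30)        # 0 = patient, 1 = control
        lda = LinearDiscriminantAnalysis().fit(X1, y1)

        # "Second group": apply the function derived from the first group
        X2 = np.vstack([patients[30:], controls[30:]])
        y2 = np.array([0] * 30 + [1] * 30)
        print(f"first-group accuracy:  {lda.score(X1, y1):.1%}")
        print(f"second-group accuracy: {lda.score(X2, y2):.1%}")

    Deriving the discriminant function on one group and applying it to the second, as the study does with its two sequential groups, is the stronger test of generalization.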

  17. Measurement and analysis of grain boundary grooving by volume diffusion

    Science.gov (United States)

    Hardy, S. C.; Mcfadden, G. B.; Coriell, S. R.; Voorhees, P. W.; Sekerka, R. F.

    1991-01-01

    Experimental measurements of isothermal grain boundary grooving by volume diffusion are carried out for Sn bicrystals in the Sn-Pb system near the eutectic temperature. The dimensions of the groove increase with a temporal exponent of 1/3, and measurement of the associated rate constant allows the determination of the product of the liquid diffusion coefficient D and the capillarity length Gamma associated with the interfacial free energy of the crystal-melt interface. The small-slope theory of Mullins is generalized to the entire range of dihedral angles by using a boundary integral formulation of the associated free boundary problem, and excellent agreement with experimental groove shapes is obtained. By using the diffusivity measured by Jordon and Hunt, the present measured values of Gamma are found to agree to within 5 percent with the values obtained from experiments by Gunduz and Hunt on grain boundary grooving in a temperature gradient.
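
    For reference, the scaling exploited in the measurement can be written schematically (dimensionless prefactors from Mullins' analysis omitted): the groove dimension w grows as

        w(t) = k\,t^{1/3}, \qquad k^{3} \propto D\,\Gamma

    so fitting the rate constant k from the measured w(t) fixes the product DΓ, and an independently measured diffusivity D then yields the capillarity length Γ.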

  19. Job task and functional analysis of the Division of Reactor Projects, Office of Nuclear Reactor Regulation. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Morzinski, J.A.; Gilmore, W.; Hahn, H.A.

    1998-07-10

    A job task and functional analysis was recently completed for the positions that make up the regional Divisions of Reactor Projects. Among the conclusions of that analysis was a recommendation to clarify roles and responsibilities among site, regional, and headquarters personnel. As that analysis did not cover headquarters personnel, a similar analysis was undertaken of three headquarters positions within the Division of Reactor Projects: Licensing Assistants, Project Managers, and Project Directors. The goals of this analysis were to systematically evaluate the tasks performed by these headquarters personnel to determine job training requirements, to account for variations due to division/regional assignment or differences in several experience categories, and to determine how, and by which positions, certain functions are best performed. The results of this analysis include recommendations for training and for job design. Data to support this analysis were collected by a survey instrument and through several sets of focus group meetings with representatives from each position.

  20. Economic analysis of the space shuttle system, volume 1

    Science.gov (United States)

    1972-01-01

    An economic analysis of the space shuttle system is presented. The analysis is based on economic benefits, recurring costs, non-recurring costs, and economic tradeoff functions. The most economic space shuttle configuration is determined on the basis of: (1) the objectives of a reusable space transportation system, (2) the various space transportation systems considered, and (3) alternative space shuttle systems.

  1. Set-based Tasks within the Singularity-robust Multiple Task-priority Inverse Kinematics Framework: General Formulation, Stability Analysis and Experimental Results

    Directory of Open Access Journals (Sweden)

    Signe Moe

    2016-04-01

    Inverse kinematics algorithms are commonly used in robotic systems to transform tasks to joint references, and several methods exist to ensure the achievement of several tasks simultaneously. The multiple task-priority inverse kinematics framework allows tasks to be considered in a prioritized order by projecting task velocities through the nullspaces of higher-priority tasks. This paper extends this framework to handle set-based tasks, i.e. tasks with a range of valid values, in addition to equality tasks, which have a specific desired value. Examples of set-based tasks are joint limit and obstacle avoidance. The proposed method is proven to ensure asymptotic convergence of the equality task errors and the satisfaction of all high-priority set-based tasks. The practical implementation of the proposed algorithm is discussed, and experimental results are presented where a number of both set-based and equality tasks have been implemented on a 6-degree-of-freedom UR5, an industrial robotic arm from Universal Robots. The experiments validate the theoretical results and confirm the effectiveness of the proposed approach.
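
    As a sketch of the underlying mechanism (generic two-task prioritized inverse kinematics, not the paper's set-based extension; all matrices below are illustrative), lower-priority task velocities are projected through the nullspace of the higher-priority task's Jacobian:

        # Generic two-task prioritized inverse kinematics step (illustrative
        # only): the secondary task is resolved in the nullspace of the
        # primary task's Jacobian.
        import numpy as np

        def prioritized_qdot(J1, v1, J2, v2):
            """Joint velocities meeting task 1 exactly, task 2 as far as possible."""
            J1p = np.linalg.pinv(J1)
            N1 = np.eye(J1.shape[1]) - J1p @ J1    # nullspace projector of task 1
            qdot1 = J1p @ v1
            qdot2 = np.linalg.pinv(J2 @ N1) @ (v2 - J2 @ qdot1)
            return qdot1 + N1 @ qdot2

        # Toy 6-DOF arm: task 1 = 3D end-effector velocity, task 2 = posture
        rng = np.random.default_rng(2)
        J1 = rng.standard_normal((3, 6))           # hypothetical task Jacobians
        J2 = rng.standard_normal((1, 6))
        v1 = np.array([0.10, 0.00, -0.05])         # desired task velocities
        v2 = np.array([0.20])
        qdot = prioritized_qdot(J1, v1, J2, v2)
        print("task-1 velocity error:", np.linalg.norm(J1 @ qdot - v1))   # ~ 0

    Set-based tasks are handled in the paper by activating or deactivating such equality constraints as the set boundaries are approached; the sketch above covers only the underlying equality-task projection.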

  2. Study on the utilization of the cognitive architecture EPIC to the task analysis of a nuclear power plant operator

    International Nuclear Information System (INIS)

    This work presents a study of the use of the integrative cognitive architecture EPIC (Executive-Process Interactive-Control), designed to evaluate the performance of a person performing tasks in parallel at a man-machine interface, as a methodology for the cognitive task analysis of a nuclear power plant operator. The results obtained by simulation with EPIC are compared with the results obtained by applying the MHP model to the tasks performed by a shift operator during execution of procedure PO-E-3 (Steam Generator Tube Rupture) of the Angra 1 Nuclear Power Plant. To support that comparison, an experiment was performed at the Angra 2 Nuclear Power Plant full-scope simulator in which three operator tasks were executed, their completion times measured, and the measurements compared with the results of the MHP and EPIC modeling. (author)

  3. Analysis of industrial tasks as a tool for the inclusion of people with disabilities in the work market.

    Science.gov (United States)

    Simonelli, Angela Paula; Camarotto, João Alberto

    2008-01-01

    This article describes the application of a model for analyzing industrial tasks that was developed to identify jobs that could potentially be filled by people with disabilities (DP) and to serve as a guideline for a company hiring policy. In Brazil, Law No. 8213/91 makes it obligatory to hire DP based on quotas that are established according to the number of employees in public and private companies. Using a set of methods and techniques based on ergonomic work analysis and on occupational therapy, we sought to build a model to indicate the skills required to perform industrial tasks. The model was applied to 19 workstations at a Brazilian aircraft manufacturer in 2002. The task supervisor and the operator performing the task were interviewed, the work activity was filmed, a kinesiological analysis was done, the task was observed, and a checklist was applied to help recognize and systematize the skills involved in performing the job task. The last step consisted of correlating the skills required to perform the task with the potential skills of the various types of disability. It was found that 100% of the jobs could be filled by workers with low-level paraplegia, 89% by workers with general paraplegia, 0% by workers with low-level tetraplegia, 47% by workers with auditory impairment, 42% by workers with hemiplegia, 68% by upper-limb amputees wearing adequate prostheses, and 89% by wheelchair users. The company hired 14 DP based on the results of this model. The model proved adequate for analyzing industrial tasks with a view to the inclusion of DP, and it can be applied to other sectors of industrial production.

  4. STICAP: A linear circuit analysis program with stiff systems capability. Volume 1: Theory manual. [network analysis

    Science.gov (United States)

    Cooke, C. H.

    1975-01-01

    STICAP (Stiff Circuit Analysis Program) is a FORTRAN 4 computer program written for the CDC-6400-6600 computer series and the SCOPE 3.0 operating system. It provides the circuit analyst a tool for automatically computing the transient and frequency responses of large linear time-invariant networks, both stiff and nonstiff (algorithms and numerical integration techniques are described). The circuit description and user's program input language is engineer-oriented, making the task of using the program simple. The engineering theories underlying STICAP are examined. A user's manual is included which explains user interaction with the program and gives results of typical circuit design applications. Also, the program structure is depicted from a systems programmer's viewpoint, and flow charts and other software documentation are given.

  5. Video Analysis and Modeling Performance Task to promote becoming like scientists in classrooms

    CERN Document Server

    Wee, Loo Kang

    2015-01-01

    This paper aims to share the use of Tracker, a free open source video analysis and modeling tool that is increasingly used as a pedagogical tool for the effective learning and teaching of physics for Grade 9 (Secondary 3) students in Singapore schools, to make physics relevant to the real world. We discuss the pedagogical use of Tracker, guided by the Framework for K-12 Science Education by the National Research Council, USA, to help students become more like scientists. For a period of 6 to 10 weeks, students use video analysis coupled with the 8 practices of science: 1. ask questions, 2. use models, 3. plan and carry out investigations, 4. analyse and interpret data, 5. use mathematical and computational thinking, 6. construct explanations, 7. argue from evidence, and 8. communicate information. This paper focuses on discussing some of the performance task design ideas, such as 3.1 flip video, 3.2 starting with simple classroom activities, 3.3 primer science activity, 3.4 integrative dynamics and kinematics l...

  6. Human factors assessment in PRA using task analysis linked evaluation technique (TALENT)

    International Nuclear Information System (INIS)

    Human error is a primary contributor to risk in complex high-reliability systems. A 1985 U.S. Nuclear Regulatory Commission (USNRC) study of licensee event reports (LERs) suggests that upwards of 65% of commercial nuclear system failures involve human error. Since then, the USNRC has initiated research to fully and properly integrate human errors into the probabilistic risk assessment (PRA) process. The resulting implementation procedure is known as the Task Analysis Linked Evaluation Technique (TALENT). As indicated, TALENT is a broad-based method for integrating human factors expertise into the PRA process. This process achieves results which: (1) provide more realistic estimates of the impact of human performance on nuclear power safety, (2) can be fully audited, (3) provide a firm technical base for equipment-centered and personnel-centered retrofit/redesign of plants enabling them to meet internally and externally imposed safety standards, and (4) yield human and hardware data capable of supporting inquiries into human performance issues that transcend the individual plant. The TALENT procedure is being field-tested to verify its effectiveness and utility. The objectives of the field-test are to examine (1) the operability of the process, (2) its acceptability to the users, and (3) its usefulness for achieving measurable improvements in the credibility of the analysis. The field-test will provide the information needed to enhance the TALENT process

  7. Frequency analysis of a task-evoked pupillary response: Luminance-independent measure of mental effort.

    Science.gov (United States)

    Peysakhovich, Vsevolod; Causse, Mickaël; Scannella, Sébastien; Dehais, Frédéric

    2015-07-01

    Pupil diameter is a widely studied cognitive load measure which, despite its convenience for non-intrusive operator state monitoring in complex environments, is still not available for in situ measurements because of numerous methodological limitations. The most important of these limitations is the influence of the pupillary light reflex. Hence, there is a need for a pupil-based cognitive load measure that is independent of light conditions. In this paper, we present a promising technique of pupillary signal analysis resulting in a luminance-independent measure of mental effort that could be used in real time without a priori knowledge of the luminous conditions. Twenty-two participants performed a short-term memory task under different screen luminance conditions. Our results showed that the amplitude of pupillary dilation due to load on memory was luminance-dependent, with higher amplitude corresponding to the lower-luminance condition. Furthermore, our experiment showed that the load-on-memory and luminance factors express themselves differently according to frequency. Therefore, as our statistical analysis revealed, the ratio between the low-frequency (0-1.6 Hz) and high-frequency (1.6-4 Hz) bands (LF/HF ratio) of the power spectral density of the pupillary signal is sensitive to cognitive load but not to luminance. Our results are promising for the measurement of load on memory in ecological settings. PMID:25941013
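
    A minimal sketch of the proposed index (frequency bands taken from the abstract; the sampling rate, signal, and Welch settings are illustrative assumptions) using power spectral densities:

        # Sketch: luminance-independent mental-effort index as the ratio of
        # low- to high-frequency power of the pupil-diameter signal (bands
        # per the abstract; all other parameters are illustrative).
        import numpy as np
        from scipy.signal import welch

        def lf_hf_ratio(pupil, fs, lf=(0.0, 1.6), hf=(1.6, 4.0)):
            f, psd = welch(pupil, fs=fs, nperseg=min(len(pupil), 4 * int(fs)))
            def bandpower(lo, hi):
                m = (f >= lo) & (f < hi)
                return psd[m].sum() * (f[1] - f[0])
            return bandpower(*lf) / bandpower(*hf)

        # Toy pupil trace: a slow task-evoked component plus fast fluctuations
        fs = 60.0                                   # Hz, typical eye-tracker rate
        t = np.arange(0.0, 120.0, 1.0 / fs)
        pupil = (0.30 * np.sin(2 * np.pi * 0.4 * t)     # low-frequency band
                 + 0.05 * np.sin(2 * np.pi * 2.5 * t)   # high-frequency band
                 + 0.02 * np.random.randn(t.size))
        print(f"LF/HF ratio: {lf_hf_ratio(pupil, fs):.1f}")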

  8. Multifamily Quality Control Inspector Job/Task Analysis and Report: September 2013

    Energy Technology Data Exchange (ETDEWEB)

    Owens, C. M.

    2013-09-01

    The development of job/task analyses (JTAs) is one of three components of the Guidelines for Home Energy Professionals project and will allow industry to develop training resources, quality assurance protocols, accredited training programs, and professional certifications. The Multifamily Quality Control Inspector JTA identifies and catalogs all of the tasks performed by multifamily quality control inspectors, as well as the knowledge, skills, and abilities (KSAs) needed to perform the identified tasks.

  9. Multifamily Energy Auditor Job/Task Analysis and Report: September 2013

    Energy Technology Data Exchange (ETDEWEB)

    Owens, C. M.

    2013-09-01

    The development of job/task analyses (JTAs) is one of three components of the Guidelines for Home Energy Professionals project and will allow industry to develop training resources, quality assurance protocols, accredited training programs, and professional certifications. The Multifamily Energy Auditor JTA identifies and catalogs all of the tasks performed by multifamily energy auditors, as well as the knowledge, skills, and abilities (KSAs) needed to perform the identified tasks.

  10. Multifamily Retrofit Project Manager Job/Task Analysis and Report: September 2013

    Energy Technology Data Exchange (ETDEWEB)

    Owens, C. M.

    2013-09-01

    The development of job/task analyses (JTAs) is one of three components of the Guidelines for Home Energy Professionals project and will allow industry to develop training resources, quality assurance protocols, accredited training programs, and professional certifications. The Multifamily Retrofit Project Manager JTA identifies and catalogs all of the tasks performed by multifamily retrofit project managers, as well as the knowledge, skills, and abilities (KSAs) needed to perform the identified tasks.

  11. Multifamily Building Operator Job/Task Analysis and Report: September 2013

    Energy Technology Data Exchange (ETDEWEB)

    Owens, C. M.

    2013-09-01

    The development of job/task analyses (JTAs) is one of three components of the Guidelines for Home Energy Professionals project and will allow industry to develop training resources, quality assurance protocols, accredited training programs, and professional certifications. The Multifamily Building Operator JTA identifies and catalogs all of the tasks performed by multifamily building operators, as well as the knowledge, skills, and abilities (KSAs) needed to perform the identified tasks.

  12. Photovoltaic venture analysis. Final report. Volume III. Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Costello, D.; Posner, D.; Schiffel, D.; Doane, J.; Bishop, C.

    1978-07-01

    This appendix contains a brief summary of a detailed description of alternative future energy scenarios which provide an overall backdrop for the photovoltaic venture analysis. Also included is a summary of a photovoltaic market/demand workshop, a summary of a photovoltaic supply workshop which used cross-impact analysis, and a report on photovoltaic array and system prices in 1982 and 1986. The results of a sectorial demand analysis for photovoltaic power systems used in the residential sector (single family homes), the service, commercial, and institutional sector (schools), and in the central power sector are presented. An analysis of photovoltaics in the electric utility market is given, and a report on the industrialization of photovoltaic systems is included. A DOE information memorandum regarding "A Strategy for a Multi-Year Procurement Initiative on Photovoltaics (ACTS No. ET-002)" is also included. (WHK)

  13. Inferring biological tasks using Pareto analysis of high-dimensional data.

    Science.gov (United States)

    Hart, Yuval; Sheftel, Hila; Hausser, Jean; Szekely, Pablo; Ben-Moshe, Noa Bossel; Korem, Yael; Tendler, Avichai; Mayo, Avraham E; Alon, Uri

    2015-03-01

    We present the Pareto task inference method (ParTI; http://www.weizmann.ac.il/mcb/UriAlon/download/ParTI) for inferring biological tasks from high-dimensional biological data. Data are described as a polytope, and features maximally enriched closest to the vertices (or archetypes) allow identification of the tasks the vertices represent. We demonstrate that human breast tumors and mouse tissues are well described by tetrahedrons in gene expression space, with specific tumor types and biological functions enriched at each of the vertices, suggesting four key tasks.

  14. SLUDGE TREATMENT PROJECT ALTERNATIVES ANALYSIS SUMMARY REPORT (VOLUME 1)

    International Nuclear Information System (INIS)

    Highly radioactive sludge (containing up to 300,000 curies of actinides and fission products) resulting from the storage of degraded spent nuclear fuel is currently stored in temporary containers located in the 105-K West storage basin near the Columbia River. The background, history, and known characteristics of this sludge are discussed in Section 2 of this report. There are many compelling reasons to remove this sludge from the K-Basin. These reasons are discussed in detail in Section 1, and they include the following: (1) Reduce the risk to the public (from a potential release of highly radioactive material as fine respirable particles by airborne or waterborne pathways); (2) Reduce the overall risk to the Hanford worker; and (3) Reduce the risk to the environment (the K-Basin is situated above a hazardous chemical contaminant plume and hinders remediation of the plume until the sludge is removed). The DOE-RL has stated that a key DOE objective is to remove the sludge from the K-West Basin and River Corridor as soon as possible, which will reduce risks to the environment, allow for remediation of contaminated areas underlying the basins, and support closure of the 100-KR-4 operable unit. The environmental and nuclear safety risks associated with this sludge have resulted in multiple legal and regulatory remedial action decisions, plans, and commitments that are summarized in Table ES-1 and discussed in more detail in Volume 2, Section 9.

  15. Structural analysis of cylindrical thrust chambers, volume 1

    Science.gov (United States)

    Armstrong, W. H.

    1979-01-01

    Life predictions of regeneratively cooled rocket thrust chambers are normally derived from classical material fatigue principles. The failures observed in experimental thrust chambers do not appear to be due entirely to material fatigue. The chamber coolant walls in the failed areas exhibit progressive bulging and thinning during cyclic firings until the wall stress finally exceeds the material rupture stress and failure occurs. A preliminary analysis of an oxygen-free high-conductivity (OFHC) copper cylindrical thrust chamber demonstrated that the inclusion of cumulative cyclic plastic effects enables the observed coolant wall thinout to be predicted. The thinout curve constructed from the reference analysis of 10 firing cycles was extrapolated from the tenth cycle to the 200th cycle. The preliminary OFHC copper chamber 10-cycle analysis was extended so that the extrapolated thinout curve could be established by performing cyclic analyses of the deformed configurations at 100 and 200 cycles. Thus the original range of extrapolation was reduced and the thinout curve was adjusted by using calculated thinout rates at 100 and 200 cycles. An analysis of the same undeformed chamber model constructed of half-hard Amzirc, to study the effect of material properties on the thinout curve, is included.

  16. Photovoltaic venture analysis. Final report. Volume II. Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Costello, D.; Posner, D.; Schiffel, D.; Doane, J.; Bishop, C.

    1978-07-01

    A description of the integrating model for photovoltaic venture analysis is given; input assumptions for the model are described; and the integrating model program listing is given. The integrating model is an explicit representation of the interactions between photovoltaic markets and supply under alternative sets of assumptions. It provides a consistent way of assembling and integrating the various assumptions, data, and information that have been obtained on photovoltaic systems supply and demand factors. Secondly, it provides a mechanism for understanding the implications of all the interacting assumptions. By representing the assumptions in a common, explicit framework, much more complex interactions can be considered than are possible intuitively. The integrating model therefore provides a way of examining the relative importance of different assumptions, parameters, and inputs through sensitivity analysis. Also, detailed results of model sensitivity analysis and detailed market and systems information are presented. (WHK)

  17. Thermal characterization and analysis of microliter liquid volumes using the three-omega method.

    Science.gov (United States)

    Roy-Panzer, Shilpi; Kodama, Takashi; Lingamneni, Srilakshmi; Panzer, Matthew A; Asheghi, Mehdi; Goodson, Kenneth E

    2015-02-01

    Thermal phenomena in many biological systems offer an alternative detection opportunity for quantifying relevant sample properties. While there is substantial prior work on thermal characterization methods for fluids, the push in the biology and biomedical research communities towards analysis of reduced sample volumes drives a need to extend and scale these techniques to these volumes of interest, which can be below 100 pl. This work applies the 3ω technique to measure the temperature-dependent thermal conductivity and heat capacity of de-ionized water, silicone oil, and salt buffer solution droplets from 24 to 80 °C. Heater geometries range in length from 200 to 700 μm and in width from 2 to 5 μm to accommodate the size restrictions imposed by small volume droplets. We use these devices to measure droplet volumes of 2 μl and demonstrate the potential to extend this technique down to pl droplet volumes based on an analysis of the thermally probed volume. Sensitivity and uncertainty analyses provide guidance for relevant design variables for characterizing properties of interest by investigating the tradeoffs between measurement frequency regime, device geometry, and substrate material. Experimental results show that we can extract thermal conductivity and heat capacity with these sample volumes to within less than 1% of thermal properties reported in the literature.
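
    For background, the principle of the 3ω technique can be stated in its standard textbook form (a schematic, not this paper's specific analysis). Driving the metal heater line with a current I(t) = I_0 cos(ωt) produces Joule heating, and hence a temperature oscillation ΔT, at frequency 2ω; since the heater resistance varies with temperature, R(t) = R_0[1 + α ΔT cos(2ωt + φ)], the measured voltage V = IR acquires a third-harmonic component

        V_{3\omega} \;\approx\; \tfrac{1}{2}\,I_0 R_0\,\alpha\,\Delta T
        \qquad\Longrightarrow\qquad
        \Delta T \;=\; \frac{2\,V_{3\omega}}{\alpha\,V_{1\omega}}

    and the dependence of ΔT on the heating frequency is what encodes the thermal conductivity and heat capacity of the probed medium.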

  18. A task-based analysis of machinery entanglement injuries among Western Canadian farmers.

    Science.gov (United States)

    Narasimhan, Gopinath; Crowe, Trever G; Peng, Yingwei; Hagel, Louise; Dosman, James; Pickett, William

    2011-10-01

    Machinery entanglements are a leading cause of hospitalized injury on Canadian farms. This study evaluates the role farm tasks play in the occurrence of machinery entanglement events. A retrospective case series of 41 entanglement injuries involving 35 farm-machinery types was assembled. Only a few limited tasks were implicated in the majority of entanglements. These tasks were as follows: (1) field adjustments of machinery; (2) product handling and conveyance; and (3) driveline attachments and servicing. Hazards inherent and common to these tasks affected the behavior of farmers, leading to entanglements. This study establishes a need to identify hazards and assess risks associated with different tasks involving the use of farm machinery under actual field situations. Systemic changes are required to improve existing machinery safety practices through engineering, work methods, and work practice modifications. In addition to design solutions, occupational health and safety strategies should consider activities associated with hazardous situations to inform the content of injury prevention efforts.

  19. SCALE-4 analysis of pressurized water reactor critical configurations. Volume 1: Summary

    International Nuclear Information System (INIS)

    The requirements of ANSI/ANS 8.1 specify that calculational methods for away-from-reactor criticality safety analyses be validated against experimental measurements. If credit is to be taken for the reduced reactivity of burned or spent fuel relative to its original fresh composition, it is necessary to benchmark computational methods used in determining such reactivity worth against spent fuel reactivity measurements. This report summarizes a portion of the ongoing effort to benchmark away-from-reactor criticality analysis methods using critical configurations from commercial pressurized water reactors (PWR). The analysis methodology utilized for all calculations in this report is based on the modules and data associated with the SCALE-4 code system. Each of the five volumes comprising this report provides an overview of the methodology applied. Subsequent volumes also describe in detail the approach taken in performing criticality calculations for these PWR configurations: Volume 2 describes criticality calculations for the Tennessee Valley Authority's Sequoyah Unit 2 reactor for Cycle 3; Volume 3 documents the analysis of Virginia Power's Surry Unit 1 reactor for the Cycle 2 core; Volume 4 documents the calculations performed based on GPU Nuclear Corporation's Three Mile Island Unit 1 Cycle 5 core; and, lastly, Volume 5 describes the analysis of Virginia Power's North Anna Unit 1 Cycle 5 core. Each of the reactor-specific volumes provides the details of calculations performed to determine the effective multiplication factor for each reactor core for one or more critical configurations using the SCALE-4 system; these results are summarized in this volume. Differences between the core designs and their possible impact on the criticality calculations are also discussed. Finally, results are presented for additional analyses performed to verify that solutions were sufficiently converged

  20. Oak Ridge Health Studies Phase 1 report, Volume 2: Part D, Dose Reconstruction Feasibility Study. Tasks 6, Hazard summaries for important materials at the Oak Ridge Reservation

    Energy Technology Data Exchange (ETDEWEB)

    Bruce, G.M.; Walker, L.B.; Widner, T.E.

    1993-09-01

    The purpose of Task 6 of Oak Ridge Phase I Health Studies is to provide summaries of current knowledge of toxic and hazardous properties of materials that are important for the Oak Ridge Reservation. The information gathered in the course of Task 6 investigations will support the task of focusing any future health studies efforts on those operations and emissions which have likely been most significant in terms of off-site health risk. The information gathered in Task 6 efforts will likely also be of value to individuals evaluating the feasibility of additional health study efforts (such as epidemiological investigations) in the Oak Ridge area and as a resource for citizens seeking information on historical emissions.

  1. An Asynchronous Many-Task Implementation of In-Situ Statistical Analysis using Legion.

    Energy Technology Data Exchange (ETDEWEB)

    Pebay, Philippe Pierre [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Bennett, Janine Camille [Sandia National Lab. (SNL-CA), Livermore, CA (United States)

    2015-11-01

    In this report, we propose a framework for the design and implementation of in-situ analyses using an asynchronous many-task (AMT) model, using the Legion programming model together with the MiniAero mini-application as a surrogate for full-scale parallel scientific computing applications. The bulk of this work consists of converting the Learn/Derive/Assess model, which we had initially developed for parallel statistical analysis using MPI [PTBM11], from an SPMD to an AMT model. To this end, we propose an original use of the concept of Legion logical regions as a replacement for the parallel communication schemes used for the only operations of the statistics engines that require explicit communication. We then evaluate this proposed scheme in a shared memory environment, using the Legion port of MiniAero as a proxy for a full-scale scientific application, as a means to provide input data sets of variable size for the in-situ statistical analyses in an AMT context. We demonstrate in particular that the approach has merit, and warrants further investigation, in collaboration with ongoing efforts to improve the overall parallel performance of the Legion system.
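
    Legion itself is a C++/Regent runtime, so the following is only a language-neutral illustration of the AMT idea the report describes: per-region statistics tasks launched asynchronously, with the final assessment step depending on their futures. Python's concurrent.futures stands in for the Legion runtime, and all names are hypothetical:

        from concurrent.futures import ThreadPoolExecutor

        def learn(region):
            # Per-region "Learn" step: partial sums for streaming statistics.
            return (len(region), sum(region), sum(x * x for x in region))

        def assess(partials):
            # Reduction step; in an AMT runtime this task would be scheduled
            # automatically once all learn-task futures are satisfied.
            n = sum(p[0] for p in partials)
            s = sum(p[1] for p in partials)
            ss = sum(p[2] for p in partials)
            mean = s / n
            return mean, ss / n - mean * mean  # mean and (biased) variance

        regions = [[float(i + j) for i in range(1000)] for j in range(8)]
        with ThreadPoolExecutor() as pool:
            futures = [pool.submit(learn, r) for r in regions]  # many tasks
            mean, var = assess([f.result() for f in futures])   # dependency
        print(f"mean = {mean:.2f}, variance = {var:.2f}")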

  2. Promoting best practice design intent in 3D CAD for engineers through a task analysis

    Directory of Open Access Journals (Sweden)

    Keelin Leahy

    2013-01-01

    Full Text Available Assessment encompasses a range of methods and techniques. At the University of Limerick, Ireland, it is an affirmed obligation to facilitate timely and useful feedback for both formative (for learning) and summative (of learning) assessment. However, the effectiveness of this feedback has raised concern and has been the subject of a wide-ranging review of research findings. This paper presents research findings to build a picture of the extent to which the impact of feedback as a constructivist paradigm of teaching and learning can promote best practice design intent in 3D CAD modelling. The resulting data set, comprised of 114 higher education students, is used to discuss the impact of assessment and feedback, comparing semesters Spring 2011/12 and Spring 2012/13. The 2012/13 cohort received formative assessment feedback from a task analysis. This evidenced an upsurge in understanding of best practice design intent in 3D CAD parametric modelling, supported by an effect size of 0.534.

  3. The Herschel/SPIRE Spectrometer Deglitching/Clipping Tasks and Statistical Analysis

    Science.gov (United States)

    LeLeu, G.; Méalier, A.-L.; Benielli, D.; Baluteau, J.-P.; Surace, C.; Fulton, T. R.; Imhof, P.; Polehampton, E.; Vibert, D.

    2012-09-01

    The Herschel Space Observatory is a cornerstone mission of the European Space Agency for astronomical observations in the far-infrared and sub-millimeter wavelength range (60-670 microns) and was launched in May 2009 with three instruments onboard: SPIRE, PACS, and HIFI. This paper focuses on the development and the statistical analysis of two phenomena treated by the pipeline through the tasks ClippingCorrection and Deglitching. Glitches have to be removed to avoid the introduction of unwanted artefacts in the spectral energy distribution across all wavelengths. Clipped signals in the time-line are problematic as they represent missed samples. If left uncorrected, missed or erroneous samples can lead to further difficulties, in particular when the time-lines are converted into spectra. This paper describes the new reconstruction method based on an adaptive algorithm using wavelet decomposition parameters. We also present a statistical tool to analyze the clipping and deglitching phenomena. The tool shows ratios and indicators that are calculated and plotted, allowing visual monitoring. The various aggregated values give a synthetic graphical representation of the quality of the observation, including instrument and telescope temperature behaviours. This tool has been developed as a plug-in of the HIPE environment.

  4. Task-based optimization of flip angle for texture analysis in MRI

    Science.gov (United States)

    Brand, Jonathan F.; Furenlid, Lars R.; Altbach, Maria I.; Galons, Jean-Phillippe; Bhattacharyya, Achyut; Sharma, Puneet; Bhattacharyya, Tulshi; Bilgin, Ali; Martin, Diego R.

    2016-03-01

    Chronic liver disease is a worldwide health problem, and hepatic fibrosis (HF) is one of the hallmarks of the disease. The current reference standard for diagnosing HF is biopsy followed by pathologist examination; however, this is limited by sampling error and carries risk of complications. Pathology diagnosis of HF is based on textural change in the liver as a lobular collagen network develops within portal triads. The scale of collagen lobules is characteristically on the order of 1-5 mm, which approximates the resolution limit of in vivo gadolinium-enhanced magnetic resonance imaging in the delayed phase. We have shown that MRI of formalin-fixed human ex vivo liver samples mimics the textural contrast of in vivo Gd-MRI and can be used as MRI phantoms. We have developed local texture analysis that is applied to phantom images, and the results are used to train model observers. The performance of the observer is assessed with the area under the receiver-operator-characteristic curve (AUROC) as the figure of merit. To optimize the MRI pulse sequence, phantoms are scanned multiple times at a range of flip angles. The flip angle associated with the highest AUROC is chosen as optimal for the task of detecting HF.
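
    A sketch of the selection step only: sweep the flip angle, score the model-observer outputs with AUROC, and keep the argmax. The simulated observer scores, the peak location, and all numbers below are assumptions for illustration, not the study's data:

        import numpy as np
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(0)

        def observer_scores(flip_angle_deg: float, n: int = 200):
            # Hypothetical stand-in for the model-observer output: test
            # statistics for fibrotic (1) vs. healthy (0) texture patches,
            # with class separation that peaks near some intermediate angle.
            separation = np.exp(-((flip_angle_deg - 12.0) / 6.0) ** 2)  # assumed
            labels = rng.integers(0, 2, size=n)
            scores = rng.normal(loc=separation * labels, scale=1.0)
            return labels, scores

        # Scan a range of flip angles and keep the one with the highest
        # AUROC, mirroring the task-based optimization described above.
        angles = np.arange(2, 31, 2)
        aurocs = []
        for a in angles:
            y, s = observer_scores(a)
            aurocs.append(roc_auc_score(y, s))

        best = angles[int(np.argmax(aurocs))]
        print(f"optimal flip angle ~ {best} deg (AUROC = {max(aurocs):.3f})")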

  5. Space tug economic analysis study. Volume 1: Executive summary

    Science.gov (United States)

    1972-01-01

    An economic analysis of space tug operations is presented. The space tug is defined as any liquid propulsion stage under 100,000 pounds propellant loading that is flown from the space shuttle cargo bay. Two classes of vehicles are the orbit injection stages and reusable space tugs. The vehicle configurations, propellant combinations, and operating modes used for the study are reported. The summary contains data on the study approach, results, conclusions, and recommendations.

  6. Sealed source and device design safety testing. Volume 5: Technical report on the findings of Task 4, Investigation of failed radioactive stainless steel troxler gauges

    Energy Technology Data Exchange (ETDEWEB)

    Benac, D.J.; Schick, W.R. [Southwest Research Inst., San Antonio, TX (United States)

    1995-10-01

    This report covers the Task 4 activities for the Sealed Source and Device Safety testing program. SwRI was contracted to investigate failed radioactive stainless steel troxler gauges. SwRI's task was to determine the cause of failure of the rods and the extent of the problem. SwRI concluded that the broken rod failed in a brittle manner due to a hard zone in the heat affected zone.

  7. Engineering characterization of ground motion. Task II. Effects of ground motion characteristics on structural response considering localized structural nonlinearities and soil-structure interaction effects. Volume 2

    International Nuclear Information System (INIS)

    This report presents the results of part of a two-task study on the engineering characterization of earthquake ground motion for nuclear power plant design. Task I of the study, which is presented in NUREG/CR-3805, Vol. 1, developed a basis for selecting design response spectra taking into account the characteristics of free-field ground motion found to be significant in causing structural damage. Task II incorporates additional considerations of effects of spatial variations of ground motions and soil-structure interaction on foundation motions and structural response. The results of Task II are presented in four parts: (1) effects of ground motion characteristics on structural response of a typical PWR reactor building with localized nonlinearities and soil-structure interaction effects; (2) empirical data on spatial variations of earthquake ground motion; (3) soil-structure interaction effects on structural response; and (4) summary of conclusions and recommendations based on Tasks I and II studies. This report presents the results of the first part of Task II. The results of the other parts will be presented in NUREG/CR-3805, Vols. 3 to 5

  8. Low Probability Tail Event Analysis and Mitigation in BPA Control Area: Task One Report

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Shuai; Makarov, Yuri V.

    2009-04-01

    This is a report for task one of the tail event analysis project for BPA. A tail event refers to the situation in a power system when unfavorable forecast errors of load and wind are superposed onto fast load and wind ramps, or non-wind generators fall short of scheduled output, so that the imbalance between generation and load becomes very significant. This type of event occurs infrequently and appears on the tails of the distribution of system power imbalance; it is therefore referred to as a tail event. This report analyzes what happened during the Electric Reliability Council of Texas (ERCOT) reliability event on February 26, 2008, which was widely reported because of the involvement of wind generation. The objective is to identify sources of the problem, solutions to it, and potential improvements that can be made to the system. Lessons learned from the analysis include the following: (1) Large mismatch between generation and load can be caused by load forecast error, wind forecast error, and generation scheduling control error on traditional generators, or a combination of all of the above; (2) The capability of system balancing resources should be evaluated both in capacity (MW) and in ramp rate (MW/min), and be procured accordingly to meet both requirements. The resources need to be able to cover a range corresponding to the variability of load and wind in the system, in addition to other uncertainties; (3) Unexpected ramps caused by load and wind can both become the cause of serious issues; (4) A look-ahead tool that evaluates the system balancing requirement during real-time operations and compares it with available system resources should be very helpful to system operators in predicting the onset of similar events and planning ahead; and (5) Demand response (only load reduction in the ERCOT event) can effectively reduce load-generation mismatch and terminate frequency deviation in an emergency situation.
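
    Lesson (2) is straightforward to operationalize. A minimal sketch, assuming minute-resolution time series, that computes the capacity (MW) and ramp-rate (MW/min) envelopes of the generation-load imbalance; the synthetic series (a wind ramp-down superposed on a load ramp-up, the tail-event pattern described above) and all numbers are illustrative assumptions:

        import numpy as np

        def balancing_requirements(load_mw, wind_mw, dispatch_mw, dt_min=1.0):
            # Imbalance that the balancing resources must absorb, evaluated
            # both in capacity (MW) and in ramp rate (MW/min).
            imbalance = (np.asarray(load_mw) - np.asarray(wind_mw)
                         - np.asarray(dispatch_mw))
            ramp = np.diff(imbalance) / dt_min
            return {
                "capacity_up_mw": float(np.max(imbalance)),
                "capacity_down_mw": float(np.min(imbalance)),
                "ramp_up_mw_per_min": float(np.max(ramp)),
                "ramp_down_mw_per_min": float(np.min(ramp)),
            }

        # Synthetic 4-hour example (assumed numbers): a wind ramp-down
        # superposed on a morning load ramp-up.
        t = np.arange(240)
        load = 30000 + 25 * t
        wind = np.clip(2000 - 15 * (t - 120), 0, 2000)
        scheduled = np.full_like(t, 29500, dtype=float)
        print(balancing_requirements(load, wind, scheduled))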

  9. Analysis of Cloud Network Management Using Resource Allocation and Task Scheduling Services

    Directory of Open Access Journals (Sweden)

    K.C. Okafor

    2016-01-01

    Full Text Available Network failure in a cloud datacenter could result from inefficient resource allocation, scheduling, and logical segmentation of physical machines (network constraints). This is highly undesirable in Distributed Cloud Computing Networks (DCCNs) running mission-critical services. Such failure has been identified in the University of Nigeria datacenter network situated in the south eastern part of Nigeria. In this paper, the architectural decomposition of a proposed DCCN was carried out while exploring its functionalities for grid performance. Virtualization services such as resource allocation and task scheduling were employed in heterogeneous server clusters. The validation of the DCCN performance was carried out using trace files from Riverbed Modeller 17.5 in order to ascertain the influence of virtualization on the server resource pool. The QoS metrics considered in the analysis are: the service delay time, resource availability, throughput, and utilization. From the validation analysis of the DCCN, the following results were obtained: average throughput (bytes/sec) for DCCN = 40.00%, DCell = 33.33%, and BCube = 26.67%. Average resource availability response for DCCN = 38.46%, DCell = 33.33%, and BCube = 28.21%. DCCN density on resource utilization = 40% (when logically isolated) and 60% (when not logically isolated). From the results, it was concluded that using virtualization in cloud datacenter servers will result in enhanced server performance, offering lower average wait time even with a higher request rate and longer duration of resource use (service availability). By evaluating these recursive architectural designs for network operations, enterprises ready for the spine-and-leaf model could further develop their network resource management schemes for optimal performance.

  10. A neural network approach to fMRI binocular visual rivalry task analysis.

    Directory of Open Access Journals (Sweden)

    Nicola Bertolino

    Full Text Available The purpose of this study was to investigate whether artificial neural networks (ANNs) are able to decode participants' conscious experience perception from brain activity alone, using complex and ecological stimuli. To reach this aim we conducted pattern recognition data analysis on fMRI data acquired during the execution of a binocular visual rivalry paradigm (BR). Twelve healthy participants were submitted to fMRI during the execution of a binocular non-rivalry (BNR) and a BR paradigm in which two classes of stimuli (faces and houses) were presented. During the binocular rivalry paradigm, behavioral responses related to the switching between consciously perceived stimuli were also collected. First, we used the BNR paradigm as a functional localizer to identify the brain areas involved in the processing of the stimuli. Second, we trained the ANN on the BNR fMRI data restricted to these regions of interest. Third, we applied the trained ANN to the BR data as a 'brain reading' tool to discriminate the pattern of neural activity between the two stimuli. Fourth, we verified the consistency of the ANN outputs with the collected behavioral indicators of which stimulus was consciously perceived by the participants. Our main results showed that the trained ANN was able to generalize across the two different tasks (i.e., BNR and BR) and to identify with high accuracy the cognitive state of the participants (i.e., which stimulus was consciously perceived) during the BR condition. The behavioral response, employed as a control parameter, was compared with the network output, and a statistically significant percentage of correspondences (p < 0.05) were obtained for all subjects. In conclusion, the present study provides a method based on multivariate pattern analysis to investigate the neural basis of visual consciousness during the BR phenomenon when behavioral indicators lack or are inconsistent, as in disorders of consciousness or sedated patients.
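
    A minimal sketch of the train-on-BNR, decode-BR workflow, assuming simulated voxel patterns in place of real fMRI data; the network type, voxel count, and class separation are illustrative choices, not the study's:

        import numpy as np
        from sklearn.neural_network import MLPClassifier
        from sklearn.preprocessing import StandardScaler
        from sklearn.pipeline import make_pipeline

        rng = np.random.default_rng(1)
        n_voxels = 300  # hypothetical ROI size from the functional localizer

        def simulate(n_trials, separation):
            # Rows are fMRI volumes, columns are ROI voxels;
            # labels are 0 = face, 1 = house (assumed encoding).
            y = rng.integers(0, 2, size=n_trials)
            pattern = rng.normal(size=n_voxels)       # class-specific pattern
            X = rng.normal(size=(n_trials, n_voxels))
            X += separation * np.outer(2 * y - 1, pattern)
            return X, y

        # Train on the non-rivalry (BNR) runs ...
        X_bnr, y_bnr = simulate(120, separation=0.15)
        clf = make_pipeline(StandardScaler(),
                            MLPClassifier(hidden_layer_sizes=(20,),
                                          max_iter=2000, random_state=0))
        clf.fit(X_bnr, y_bnr)

        # ... then "read" the rivalry (BR) runs and compare with the
        # behavioral reports of which stimulus was consciously perceived.
        X_br, y_behavior = simulate(80, separation=0.15)
        predicted = clf.predict(X_br)
        print(f"agreement with behavior: {np.mean(predicted == y_behavior):.2%}")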

  11. Grid-connected ICES: preliminary feasibility analysis and evaluation. Volume 2. Final report

    Energy Technology Data Exchange (ETDEWEB)

    1977-06-30

    The HEAL Complex in New Orleans will serve as a Demonstration Community for which the ICES Demonstration System will be designed. The complex is a group of hospitals, clinics, research facilities, and medical educational facilities. The five tasks reported on are: preliminary energy analysis; preliminary institutional assessment; conceptual design; firming-up of commitments; and detailed work management plan.

  12. Multiple-task real-time PDP-15 operating system for data acquisition and analysis

    International Nuclear Information System (INIS)

    The RAMOS operating system is capable of handling up to 72 simultaneous tasks in an interrupt-driven environment. The minimum viable hardware configuration includes a Digital Equipment Corporation PDP-15 computer with 16384 words of memory, extended arithmetic element, automatic priority interrupt, a 256K-word RS09 DECdisk, two DECtape transports, and an alphanumeric keyboard/typer. The monitor executes major tasks by loading disk-resident modules to memory for execution; modules are written in a format that allows page-relocation by the monitor, and can be loaded into any available page. All requests for monitor service by tasks, including input/output, floating point arithmetic, request for additional memory, task initiation, etc., are implemented by privileged monitor calls (CAL). All IO device handlers are capable of queuing requests for service, allowing several tasks "simultaneous" use of all resources. All alphanumeric IO (including the PC05) is completely buffered and handled by a single multiplexing routine. The floating point arithmetic software is re-entrant to all operating modules and includes matrix arithmetic functions. One of the system tasks can be a "batch" job, controlled by simulating an alphanumeric command terminal through cooperative functions of the disk handler and alphanumeric device software. An alphanumeric control sequence may be executed, automatically accessing disk-resident tasks in any prescribed order; a library of control sequences is maintained on bulk storage for access by the monitor. (auth)

  13. Waste Isolation Pilot Plant Safety Analysis Report. Volume 5

    International Nuclear Information System (INIS)

    This Safety Analysis Report (SAR) has been prepared by the US Department of Energy (DOE) to support the construction and operation of the Waste Isolation Pilot Plant (WIPP) in southeastern New Mexico. The WIPP facility is designed to receive, inspect, emplace, and store unclassified defense-generated transuranic wastes in a retrievable fashion in an underground salt medium and to conduct studies and perform experiments in salt with high-level wastes. Upon the successful completion of these studies and experiments, WIPP is designed to serve as a permanent facility. The first chapter of this report provides a summary of the location and major design features of WIPP. Chapters 2 through 5 describe the site characteristics, design criteria, and design bases used in the design of the plant and the plant operations. Chapter 6 discusses radiation protection; Chapters 7 and 8 present an accident analysis of the plant and an assessment of the long-term waste isolation at WIPP. The conduct of operations and operating controls and limits are discussed in Chapters 9 and 10. The quality assurance programs are described in Chapter 11

  14. Waste Isolation Pilot Plant Safety Analysis Report. Volume 1

    International Nuclear Information System (INIS)

    This Safety Analysis Report (SAR) has been prepared by the US Department of Energy (DOE) to support the construction and operation of the Waste Isolation Pilot Plant (WIPP) in southeastern New Mexico. The WIPP facility is designed to receive, inspect, emplace, and store unclassified defense-generated transuranic wastes in a retrievable fashion in an underground salt medium and to conduct studies and perform experiments in salt with high-level wastes. Upon the successful completion of these studies and experiments, WIPP is designed to serve as a permanent facility. The first chapter of this report provides a summary of the location and major design features of WIPP. Chapters 2 through 5 describe the site characteristics, design criteria, and design bases used in the design of the plant and the plant operations. Chapter 6 discusses radiation protection; Chapters 7 and 8 present an accident analysis of the plant and an assessment of the long-term waste isolation at WIPP. The conduct of operations and operating controls and limits are discussed in Chapters 9 and 10. The quality assurance programs are described in Chapter 11

  15. Waste Isolation Pilot Plant Safety Analysis Report. Volume 2

    International Nuclear Information System (INIS)

    This Safety Analysis Report (SAR) has been prepared by the US Department of Energy (DOE) to support the construction and operation of the Waste Isolation Pilot Plant (WIPP) in southeastern New Mexico. The WIPP facility is designed to receive, inspect, emplace, and store unclassified defense-generated transuranic wastes in a retrievable fashion in an underground salt medium and to conduct studies and perform experiments in salt with high-level wastes. Upon the successful completion of these studies and experiments, WIPP is designed to serve as a permanent facility. The first chapter of this report provides a summary of the location and major design features of WIPP. Chapters 2 through 5 describe the site characteristics, design criteria, and design bases used in the design of the plant and the plant operations. Chapter 6 discusses radiation protection; Chapters 7 and 8 present an accident analysis of the plant and an assessment of the long-term waste isolation at WIPP. The conduct of operations and operating controls and limits are discussed in Chapters 9 and 10. The quality assurance programs are described in Chapter 11

  16. Waste Isolation Pilot Plant Safety Analysis Report. Volume 4

    International Nuclear Information System (INIS)

    This Safety Analysis Report (SAR) has been prepared by the US Department of Energy (DOE) to support the construction and operation of the Waste Isolation Pilot Plant (WIPP) in southeastern New Mexico. The WIPP facility is designed to receive, inspect, emplace, and store unclassified defense-generated transuranic wastes in a retrievable fashion in an underground salt medium and to conduct studies and perform experiments in salt with high-level wastes. Upon the successful completion of these studies and experiments, WIPP is designed to serve as a permanent facility. The first chapter of this report provides a summary of the location and major design features of WIPP. Chapters 2 through 5 describe the site characteristics, design criteria, and design bases used in the design of the plant and the plant operations. Chapter 6 discusses radiation protection; Chapters 7 and 8 present an accident analysis of the plant and an assessment of the long-term waste isolation at WIPP. The conduct of operations and operating controls and limits are discussed in Chapters 9 and 10. The quality assurance programs are described in Chapter 11

  17. Waste Isolation Pilot Plant Safety Analysis Report. Volume 3

    International Nuclear Information System (INIS)

    This Safety Analysis Report (SAR) has been prepared by the US Department of Energy (DOE) to support the construction and operation of the Waste Isolation Pilot Plant (WIPP) in southeastern New Mexico. The WIPP facility is designed to receive, inspect, emplace, and store unclassified defense-generated transuranic wastes in a retrievable fashion in an underground salt medium and to conduct studies and perform experiments in salt with high-level wastes. Upon the successful completion of these studies and experiments, WIPP is designed to serve as a permanent facility. The first chapter of this report provides a summary of the location and major design features of WIPP. Chapters 2 through 5 describe the site characteristics, design criteria, and design bases used in the design of the plant and the plant operations. Chapter 6 discusses radiation protection; Chapters 7 and 8 present an accident analysis of the plant and an assessment of the long-term waste isolation at WIPP. The conduct of operations and operating controls and limits are discussed in Chapters 9 and 10. The quality assurance programs are described in Chapter 11

  18. Re: Madsen et al. "Unnecessary work tasks and mental health: a prospective analysis of Danish human service workers".

    Science.gov (United States)

    Durand-Moreau, Quentin; Loddé, Brice; Dewitte, Jean-Dominique

    2015-03-01

    Madsen et al (1) recently published a secondary analysis on data provided by the Project on Burnout, Motivation and Job Satisfaction (PUMA). The aim of their study, published in the Scandinavian Journal of Work, Environment & Health was to examine the associations between unnecessary work tasks and a decreased level of mental health. Though the topic was quite novel, reading this work proved disturbing and raised issues. Based on the results of this study, the authors stated that there is an association between unnecessary work tasks (assessed by a single question) and a decreased level of mental health, idem [assessed by the Mental Health Inventory (MHI-5)], in the specific population included in this PUMA survey. The authors point out a limitation of the study, namely that unnecessary work tasks were evaluated using one single question: "Do you sometimes have to do things in your job which appear to be unnecessary?". Semmer defines unnecessary work task as "tasks that should not be carried out at all because they do not make sense or because they could have been avoided, or could be carried out with less effort if things were organized more efficiently" (2). De facto, qualifying what an unnecessary task is requires stating or explaining whether the task makes sense. Making sense or not is not an objective notion. It is very difficult for either a manager or an employee to say if a task is necessary or not. Most important is that it makes sense from the worker's point of view. Making sense and being necessary are not synonyms. Some tasks do not make sense but are economically necessary (eg, when, as physicians, we are reporting our activity using ICD-10 on computers instead of being at patients' bedsides or reading this journal). Thus, there is a wide gap between Semmer's definition and the question used by the authors to evaluate his concept. A secondary analysis based on a single question is not adequate to evaluate unnecessary tasks. Nowadays, the general trend

  19. Space tug economic analysis study. Volume 2: Tug concepts analysis. Part 2: Economic analysis

    Science.gov (United States)

    1972-01-01

    An economic analysis of space tug operations is presented. The subjects discussed are: (1) cost uncertainties, (2) scenario analysis, (3) economic sensitivities, (4) mixed integer programming formulation of the space tug problem, and (5) critical parameters in the evaluation of a public expenditure.

  20. Determination of bone mineral volume fraction using impedance analysis and Bruggeman model

    Energy Technology Data Exchange (ETDEWEB)

    Ciuchi, Ioana Veronica; Olariu, Cristina Stefania, E-mail: oocristina@yahoo.com; Mitoseriu, Liliana, E-mail: lmtsr@uaic.ro

    2013-11-20

    Highlights: • Mineral volume fraction of a bone sample was determined. • Dielectric properties of the bone sample and of collagen type I were determined by impedance spectroscopy. • The Bruggeman effective medium approximation was applied in order to evaluate the mineral volume fraction of the sample. • The computed values were compared with ones derived from a histogram test performed on SEM micrographs. -- Abstract: Measurements by impedance spectroscopy and the Bruggeman effective medium approximation model were employed in order to determine the mineral volume fraction of dry bone. This approach assumes that two or more phases are present in the composite: the matrix (environment) and one or more inclusion phases. A fragment of femur diaphysis dense bone from a young pig was investigated in its dehydrated state. By measuring the dielectric properties of bone and its main components (hydroxyapatite and collagen) and using the Bruggeman approach, the mineral volume filling factor was determined. The computed mineral volume fraction was confirmed by a histogram test analysis based on the SEM microstructures. In spite of its simplicity, the method provides a good approximation for the bone mineral volume fraction. The method, which uses impedance spectroscopy and EMA modeling, can be further developed by considering the conductive components of the bone tissue as a non-invasive in situ impedance technique for bone composition evaluation and monitoring.
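
    For the symmetric two-phase case, the Bruggeman condition is linear in the inclusion volume fraction and can be solved in closed form. A minimal sketch; the permittivity values are assumed for illustration and are not the paper's measurements:

        def bruggeman_volume_fraction(eps_incl: float, eps_matrix: float,
                                      eps_eff: float) -> float:
            # Symmetric two-phase Bruggeman EMA:
            #   f*(e_i - e_eff)/(e_i + 2*e_eff)
            #     + (1 - f)*(e_m - e_eff)/(e_m + 2*e_eff) = 0
            # which is linear in f, so f = A_m / (A_m - A_i).
            a_incl = (eps_incl - eps_eff) / (eps_incl + 2.0 * eps_eff)
            a_matr = (eps_matrix - eps_eff) / (eps_matrix + 2.0 * eps_eff)
            return a_matr / (a_matr - a_incl)

        # Illustrative (assumed) permittivities: hydroxyapatite inclusions
        # in a collagen matrix, with a measured effective value in between.
        f = bruggeman_volume_fraction(eps_incl=15.0, eps_matrix=5.0, eps_eff=9.0)
        print(f"mineral volume fraction ~ {f:.2f}")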

  1. Impact of Dual Task on Parkinson's Disease, Stroke and Ataxia Patients' Gait: A Comparative Analysis

    Directory of Open Access Journals (Sweden)

    Michelly Arjona Maciel

    2014-01-01

    Full Text Available Introduction: Performing dual tasks is complex for neurological patients, and performance can be influenced by the localization of the neurological lesion. Objective: To compare the impact of dual task on gait in patients with Parkinson's disease, stroke, and ataxia. Method: Subjects with Parkinson's disease (PD) in the initial phase, stroke, and ataxia, all with independent gait, were evaluated during simple gait and gait with cognitive, motor, and cognitive-motor demands, assessing average speed and number of steps. Results: Ataxia and stroke patients, compared with PD patients, showed an increase in the number of steps and a decrease in average speed during gait with cognitive demand. Subjects with PD performed better on the tasks when compared to the others. Conclusion: In this study the impact of dual task was lower in Parkinson's disease patients.

  2. Uniprocessor Schedulability and Sensitivity Analysis of Multiple Criticality Tasks with Fixed-Priorities

    OpenAIRE

    Dorin, François; Richard, Pascal; Richard, Michael; Goossens, Joël

    2009-01-01

    Safety-critical real-time standards define several criticality levels for the tasks (e.g., DO-178B - Software Considerations in Airborne Systems and Equipment Certification). Classical models do not take these levels into account. Vestal introduced a new multiple-criticality model, to model more precisely existing real-time systems, and algorithms to schedule such systems. Such a task model represents a potentially very significant advance in the modeling of safety-critical real-time systems. B...

  3. Scaling analysis for mixing in large stratified volumes of passive containment

    International Nuclear Information System (INIS)

    Integral tests play a key role in assessing the feasibility of the passive containment cooling system (PCCS) and the accuracy of the calculation model. The scaling analysis for mixing in large stratified volumes of the PCCS provides the primary theoretical basis for determining the key size of the integral test facility. Based on the mixing in large stratified volumes, the key parameters were obtained by scaling analysis using the hierarchical two-tiered scaling method. The similarity criteria that ensure the integral test facility can well simulate mixing in the passive containment were obtained. (authors)

  4. Advanced coal-using community systems. Task 1A. Technology characteristics. Volume 2. Fuel- and energy-distribution and end-use systems

    Energy Technology Data Exchange (ETDEWEB)

    Tison, R.R.; Blazek, C.F.; Biederman, N.P.; Malik, N.J.; Gamze, M.G.; Wetterstrom, D.; Diskant, W.; Malfitani, L.

    1979-03-01

    This report is presented in 2 volumes. It contains descriptions of engineering characterizations and equipment used in coal processing, fuel and energy distribution, storage, and end-use utilization. Volume 2 contains 4 chapters dealing with: distribution systems for solid fuels, liquid fuels, gaseous fuels, steam, and electric power; storage systems for solid fuels, liquid fuels, gaseous fuels, electricity, and thermal energy; energy management systems; and energy-end use systems. (DMC)

  5. Lessons from a pilot project in cognitive task analysis: the potential role of intermediates in preclinical teaching in dental education.

    Science.gov (United States)

    Walker, Judith; von Bergmann, HsingChi

    2015-03-01

    The purpose of this study was to explore the use of cognitive task analysis to inform the teaching of psychomotor skills and cognitive strategies in clinical tasks in dental education. Methods used were observing and videotaping an expert at one dental school thinking aloud while performing a specific preclinical task (in a simulated environment), interviewing the expert to probe deeper into his thinking processes, and applying the same procedures to analyze the performance of three second-year dental students who had recently learned the analyzed task and who represented a spectrum of their cohort's ability to undertake the procedure. The investigators sought to understand how experts (clinical educators) and intermediates (trained students) overlapped and differed at points in the procedure that represented the highest cognitive load, known as "critical incidents." Findings from this study and previous research identified possible limitations of current clinical teaching as a result of expert blind spots. These findings coupled with the growing evidence of the effectiveness of peer teaching suggest the potential role of intermediates in helping novices learn preclinical dentistry tasks. PMID:25729022

  6. Time course of information representation of macaque AIP neurons in hand manipulation task revealed by information analysis.

    Science.gov (United States)

    Sakaguchi, Yutaka; Ishida, Fumihiko; Shimizu, Takashi; Murata, Akira

    2010-12-01

    We used mutual information analysis of neuronal activity in the macaque anterior intraparietal area (AIP) to examine information processing during a hand manipulation task. The task was to reach-to-grasp a three-dimensional (3D) object after presentation of a go signal. Mutual information was calculated between the spike counts of individual neurons in 50-ms-wide time bins and six unique shape classifications or 15 one-versus-one classifications of these shapes. The spatiotemporal distribution of mutual information was visualized as a two-dimensional image ("information map") to better observe global profiles of information representation. In addition, a nonnegative matrix factorization technique was applied for extracting its structure. Our major finding was that the time course of mutual information differed significantly according to different classes of task-related neurons. This strongly suggests that different classes of neurons were engaged in different information processing stages in executing the hand manipulation task. On the other hand, our analysis revealed the heterogeneous nature of information representation of AIP neurons. For example, "information latency" (or information onset) varied among individual neurons even in the same neuron class and the same shape classification. Further, some neurons changed "information preference" (i.e., shape classification with the largest amount of information) across different task periods. These suggest that neurons encode different information in the different task periods. Taking the present result together with previous findings, we used a Gantt chart to propose a hypothetical scheme of the dynamic interactions between different types of AIP neurons.
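
    A toy version of the per-bin information-map computation, under stated assumptions: simulated Poisson spike counts with an invented tuning profile stand in for the recorded AIP data, and scikit-learn's mutual_info_score (which returns nats, converted here to bits) plays the role of the paper's estimator:

        import numpy as np
        from sklearn.metrics import mutual_info_score

        rng = np.random.default_rng(2)

        # Hypothetical recording: spike counts of one neuron in 50-ms bins
        # across trials, with one of 6 object shapes presented per trial.
        n_trials, n_bins, n_shapes = 180, 40, 6
        shapes = rng.integers(0, n_shapes, size=n_trials)

        # Assumed tuning: firing rate rises for some shapes after bin 15.
        rates = np.full((n_trials, n_bins), 3.0)
        rates[:, 15:] += 2.0 * (shapes[:, None] % 3)
        counts = rng.poisson(rates)

        # Mutual information between spike count and shape, per time bin --
        # one row of the "information map" described in the abstract.
        info = np.array([
            mutual_info_score(shapes, counts[:, b]) / np.log(2)  # -> bits
            for b in range(n_bins)
        ])
        onset = int(np.argmax(info > 0.1))  # crude "information latency"
        print(f"peak MI = {info.max():.2f} bits, latency ~ bin {onset} (x50 ms)")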

  7. [Environmental investigation of ground water contamination at Wright- Patterson Air Force Base, Ohio]. Volume 4, Health and Safety Plan (HSP); Phase 1, Task 4 Field Investigation report: Draft

    Energy Technology Data Exchange (ETDEWEB)

    1991-10-01

    This Health and Safety Plan (HSP) was developed for the Environmental Investigation of Ground-water Contamination Investigation at Wright-Patterson Air Force Base near Dayton, Ohio, based on the projected scope of work for the Phase 1, Task 4 Field Investigation. The HSP describes hazards that may be encountered during the investigation, assesses the hazards, and indicates what type of personal protective equipment is to be used for each task performed. The HSP also addresses the medical monitoring program, decontamination procedures, air monitoring, training, site control, accident prevention, and emergency response.

  8. Recalling academic tasks

    Science.gov (United States)

    Draper, Franklin Gno

    This study was focused on what students remembered about five middle school science tasks when they were juniors and seniors in high school. Descriptions of the five tasks were reconstructed from available artifacts and teachers' records, notes and recollections. Three of the five tasks were "authentic" in the sense that students were asked to duplicate the decisions practitioners make in the adult world. The other two tasks were more typical school tasks involving note taking and preparation for a quiz. All five tasks, however, involved use of computers. Students were interviewed to examine what and how well they recalled the tasks and what forms or patterns of recall existed. Analysis of their responses indicated that different kinds of tasks produced different levels of recall. Authentically situated tasks were remembered much better than routine school tasks. Further, authentic tasks centered on design elements were recalled better than those for which design was not as pivotal. Patterns of recall indicated that participants most often recalled the decisions they made, the scenarios of the authentically situated tasks, the consequences of their tasks and the social contexts of the classroom. Task events, in other words, appeared to form a framework upon which students constructed stories of the tasks. The more salient the events, the richer the story, the deeper and more detailed the recall of the task. Thus, authentic tasks appeared to lend themselves to creating stories better than regular school tasks and therefore such tasks were recalled better. Implications of these patterns of recall are discussed with respect to issues of school learning and assessment.

  9. A task analysis-linked approach for integrating the human factor in reliability assessments of nuclear power plants

    International Nuclear Information System (INIS)

    This paper describes an emerging Task Analysis-Linked Evaluation Technique (TALENT) for assessing the contributions of human error to nuclear power plant systems unreliability and risk. Techniques such as TALENT are emerging from the recognition that human error is a primary contributor to plant unreliability and risk, yet it has been a peripheral consideration in plant reliability evaluations. TALENT also recognizes that the involvement of persons with behavioral science expertise is required to support plant reliability and risk analyses. A number of state-of-knowledge human reliability analysis tools are also discussed which support the TALENT process. The core of TALENT is comprised of task, timeline and interface analysis data which provide the technology base for event and fault tree development, serve as criteria for selecting and evaluating performance shaping factors, and provide a basis for auditing TALENT results. Finally, programs and case studies used to refine the TALENT process are described along with future research needs in the area. (author)

  10. Design and Analysis of Self-Adapted Task Scheduling Strategies in Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Sajid Hussain

    2011-06-01

    Full Text Available In a wireless sensor network (WSN), the usage of resources is usually highly related to the execution of tasks which consume a certain amount of computing and communication bandwidth. Parallel processing among sensors is a promising solution to provide the demanded computation capacity in WSNs. Task allocation and scheduling is a typical problem in the area of high performance computing. Although task allocation and scheduling in wired processor networks has been well studied in the past, their counterparts for WSNs remain largely unexplored. Existing traditional high performance computing solutions cannot be directly implemented in WSNs due to limitations such as limited resource availability and the shared communication medium. In this paper, a self-adapted task scheduling strategy for WSNs is presented. First, a multi-agent-based architecture for WSNs is proposed and a mathematical model of dynamic alliance is constructed for the task allocation problem. Then an effective discrete particle swarm optimization (PSO) algorithm for the dynamic alliance (DPSO-DA), with a well-designed particle position code and fitness function, is proposed. A mutation operator which can effectively improve the algorithm's global search ability and population diversity is also introduced in this algorithm. Finally, the simulation results show that the proposed solution can achieve significantly better performance than other algorithms.
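
    The paper's DPSO-DA encoding is not reproduced in the abstract, so the following is only a generic discrete-PSO sketch of the same idea: particle positions are task-to-node assignment vectors, the "velocity" update probabilistically copies assignments from personal and global bests, a mutation operator preserves diversity, and the fitness is the makespan of the busiest node. All constants are illustrative assumptions:

        import numpy as np

        rng = np.random.default_rng(3)
        N_TASKS, N_NODES, SWARM, ITERS = 20, 5, 30, 200
        cost = rng.uniform(1.0, 5.0, size=N_TASKS)  # assumed task costs

        def fitness(assign):
            # Makespan: load of the busiest node (smaller is better).
            loads = np.zeros(N_NODES)
            np.add.at(loads, assign, cost)
            return loads.max()

        # Particle position = integer vector mapping each task to a node.
        pos = rng.integers(0, N_NODES, size=(SWARM, N_TASKS))
        pbest = pos.copy()
        pbest_fit = np.array([fitness(p) for p in pos])
        gbest = pbest[pbest_fit.argmin()].copy()

        for _ in range(ITERS):
            for i in range(SWARM):
                # Discrete "velocity": each task copies its node from pbest
                # or gbest with some probability, else keeps its current one.
                r = rng.random(N_TASKS)
                pos[i] = np.where(r < 0.4, pbest[i],
                                  np.where(r < 0.7, gbest, pos[i]))
                # Mutation: reassign one random task to keep diversity.
                pos[i, rng.integers(N_TASKS)] = rng.integers(N_NODES)
                f = fitness(pos[i])
                if f < pbest_fit[i]:
                    pbest[i], pbest_fit[i] = pos[i].copy(), f
            gbest = pbest[pbest_fit.argmin()].copy()

        print(f"best makespan: {fitness(gbest):.2f} "
              f"(lower bound {cost.sum() / N_NODES:.2f})")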

  11. Procedural learning deficits in specific language impairment (SLI): a meta-analysis of serial reaction time task performance.

    Science.gov (United States)

    Lum, Jarrad A G; Conti-Ramsden, Gina; Morgan, Angela T; Ullman, Michael T

    2014-02-01

    Meta-analysis and meta-regression were used to evaluate whether evidence to date demonstrates deficits in procedural memory in individuals with specific language impairment (SLI), and to examine reasons for inconsistencies of findings across studies. The Procedural Deficit Hypothesis (PDH) proposes that SLI is largely explained by abnormal functioning of the frontal-basal ganglia circuits that support procedural memory. It has also been suggested that declarative memory can compensate for at least some of the problems observed in individuals with SLI. A number of studies have used Serial Reaction Time (SRT) tasks to investigate procedural learning in SLI. In this report, results from eight studies that collectively examined 186 participants with SLI and 203 typically-developing peers were submitted to a meta-analysis. The mean effect size was .328 (CI95: .071, .584) and was significant. This suggests SLI is associated with impairments of procedural learning as measured by the SRT task. Differences among individual study effect sizes, examined with meta-regression, indicated that smaller effect sizes were found in studies with older participants, and in studies that had a larger number of trials on the SRT task. The contributions of age and SRT task characteristics to learning are discussed with respect to impaired and compensatory neural mechanisms in SLI. PMID:24315731
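
    The pooling step behind a summary effect like .328 (CI95: .071, .584) is typically inverse-variance weighting. A minimal fixed-effect sketch; the per-study effect sizes and variances below are invented for illustration and are not the eight SRT studies' actual values:

        import math

        def pool_fixed_effect(effects, variances):
            # Inverse-variance (fixed-effect) pooling of study effect sizes,
            # returning the pooled effect and its 95% confidence interval.
            weights = [1.0 / v for v in variances]
            mean = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
            se = math.sqrt(1.0 / sum(weights))
            return mean, (mean - 1.96 * se, mean + 1.96 * se)

        # Hypothetical standardized mean differences and their variances.
        effects = [0.45, 0.20, 0.60, 0.15, 0.35, 0.30, 0.50, 0.10]
        variances = [0.04, 0.03, 0.06, 0.02, 0.05, 0.03, 0.07, 0.02]
        mean, ci = pool_fixed_effect(effects, variances)
        print(f"pooled effect = {mean:.3f}, 95% CI ({ci[0]:.3f}, {ci[1]:.3f})")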

  12. Space station Simulation Computer System (SCS) study for NASA/MSFC. Volume 5: Study analysis report

    Science.gov (United States)

    1989-01-01

    The Simulation Computer System (SCS) is the computer hardware, software, and workstations that will support the Payload Training Complex (PTC) at the Marshall Space Flight Center (MSFC). The PTC will train the space station payload scientists, station scientists, and ground controllers to operate the wide variety of experiments that will be on-board the Freedom Space Station. The further analysis performed on the SCS study as part of Task 2 (Perform Studies and Parametric Analysis) of the SCS study contract is summarized. These analyses were performed to resolve open issues remaining after the completion of Task 1 and the publication of the SCS study issues report. The results of these studies provide inputs into SCS Task 3 (Develop and Present SCS Requirements) and SCS Task 4 (Develop SCS Conceptual Designs). The purpose of these studies is to resolve the issues into usable requirements given the best available information at the time of the study. A list of all the SCS study issues is given.

  13. Re: Madsen et al. "Unnecessary work tasks and mental health: a prospective analysis of Danish human service workers".

    Science.gov (United States)

    Durand-Moreau, Quentin; Loddé, Brice; Dewitte, Jean-Dominique

    2015-03-01

    Madsen et al (1) recently published a secondary analysis on data provided by the Project on Burnout, Motivation and Job Satisfaction (PUMA). The aim of their study, published in the Scandinavian Journal of Work, Environment & Health was to examine the associations between unnecessary work tasks and a decreased level of mental health. Though the topic was quite novel, reading this work proved disturbing and raised issues. Based on the results of this study, the authors stated that there is an association between unnecessary work tasks (assessed by a single question) and a decreased level of mental health, idem [assessed by the Mental Health Inventory (MHI-5)], in the specific population included in this PUMA survey. The authors point out a limitation of the study, namely that unnecessary work tasks were evaluated using one single question: "Do you sometimes have to do things in your job which appear to be unnecessary?". Semmer defines unnecessary work task as "tasks that should not be carried out at all because they do not make sense or because they could have been avoided, or could be carried out with less effort if things were organized more efficiently" (2). De facto, qualifying what an unnecessary task is requires stating or explaining whether the task makes sense. Making sense or not is not an objective notion. It is very difficult for either a manager or an employee to say if a task is necessary or not. Most important is that it makes sense from the worker's point of view. Making sense and being necessary are not synonyms. Some tasks do not make sense but are economically necessary (eg, when, as physicians, we are reporting our activity using ICD-10 on computers instead of being at patients' bedsides or reading this journal). Thus, there is a wide gap between Semmer's definition and the question used by the authors to evaluate his concept. A secondary analysis based on a single question is not adequate to evaluate unnecessary tasks. Nowadays, the general trend

  14. Analysis of Petri net model and task planning heuristic algorithms for product reconfiguration

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Reconfiguration planning is recognized as an important factor for reducing the cost of manufacturing reconfigurable products, and the associated main task is to generate a set of optimal or near-optimal reconfiguration sequences using effective algorithms. A method is developed to generate a Petri net as the reconfiguration tree to represent the two-state transitions of a product, which solves the problem of representing the replacement of reconfigurable interfaces. Building on this method, two heuristic algorithms are proposed to generate task sequences that take economics into account in order to search reconfiguration paths effectively. Finally, an objective evaluation is applied to compare these two heuristic algorithms with other ones. The developed reconfiguration task planning heuristic algorithms can generate better strategies and plans for reconfiguration. The research findings are exemplified with strut reconfiguration of a reconfigurable parallel kinematics machine (RPKM).
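
    The abstract does not give the net itself, so here is only a generic illustration of the Petri net semantics it relies on: places hold tokens, a transition is enabled when every input place holds enough tokens, and firing moves tokens along the arcs. The two-configuration example is hypothetical:

        # Minimal Petri net: places map to token counts; a transition has
        # input (pre) and output (post) arcs with arc weights.
        def enabled(marking, pre):
            return all(marking[p] >= n for p, n in pre.items())

        def fire(marking, pre, post):
            m = dict(marking)
            for p, n in pre.items():
                m[p] -= n        # consume tokens from input places
            for p, n in post.items():
                m[p] = m.get(p, 0) + n  # produce tokens in output places
            return m

        # Two-state transition of a product configuration A -> B (assumed).
        marking = {"config_A": 1, "config_B": 0, "tool_free": 1}
        pre = {"config_A": 1, "tool_free": 1}
        post = {"config_B": 1, "tool_free": 1}
        if enabled(marking, pre):
            marking = fire(marking, pre, post)
        print(marking)  # {'config_A': 0, 'config_B': 1, 'tool_free': 1}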

  15. CITY OF TAMPA MANAGEMENT ANALYSIS AND REPORT SYSTEM (MARS). VOLUME 2. OPERATIONS MANUAL

    Science.gov (United States)

    This three-volume report describes the development and implementation of a management analysis and report system (MARS) in the Tampa, Florida, Water and Sanitary Sewer Departments. Original system development was based on research conducted in a smaller water utility in Kenton Co...

  16. CITY OF TAMPA MANAGEMENT ANALYSIS AND REPORT SYSTEM (MARS). VOLUME 1. CASE STUDY

    Science.gov (United States)

    This three-volume report describes the development and implementation of a management analysis and report system (MARS) in the Tampa, Florida, Water and Sanitary Sewer Departments. Original system development was based on research conducted in a smaller water utility in Kenton Co...

  17. CITY OF TAMPA MANAGEMENT ANALYSIS AND REPORT SYSTEM (MARS). VOLUME 3. PROGRAMMING MANUAL

    Science.gov (United States)

    This three-volume report describes the development and implementation of a management analysis and report system (MARS) in the Tampa, Florida, Water and Sanitary Sewer Departments. MARS will help both the Water and Sanitary Sewer Departments control costs and manage expanding ser...

  18. Geotechnical Analysis Report for July 2004 - June 2005, Volume 2, Supporting Data

    International Nuclear Information System (INIS)

    This report is a compilation of geotechnical data presented as plots for each active instrument installed in the underground at the Waste Isolation Pilot Plant (WIPP) through June 30, 2005. A summary of the geotechnical analyses that were performed using the enclosed data is provided in Volume 1 of the Geotechnical Analysis Report (GAR).

  19. Supply-demand analysis. Volume II. of the offshore oil industry support craft market. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Durfee, J.H.

    1977-10-01

    Volume Two of this report presents a description of the market for offshore petroleum industry support craft and an analysis of that market. Financial performance of five major operating companies is described. A forecast of support craft supply and demand for 1977, 1982, and 1987 is included.

  20. Geotechnical Analysis Report for July 2004 - June 2005, Volume 2, Supporting Data

    Energy Technology Data Exchange (ETDEWEB)

    Washington TRU Solutions LLC

    2006-03-20

    This report is a compilation of geotechnical data presented as plots for each active instrument installed in the underground at the Waste Isolation Pilot Plant (WIPP) through June 30, 2005. A summary of the geotechnical analyses that were performed using the enclosed data is provided in Volume 1 of the Geotechnical Analysis Report (GAR).

  1. Waste Isolation Pilot Plant Geotechnical Analysis Report for July 2005 - June 2006, Volume 2, Supporting Data

    Energy Technology Data Exchange (ETDEWEB)

    Washington TRU Solutions LLC

    2007-03-25

    This report is a compilation of geotechnical data presented as plots for each active instrument installed in the underground at the Waste Isolation Pilot Plant (WIPP) through June 30, 2006. A summary of the geotechnical analyses that were performed using the enclosed data is provided in Volume 1 of the Geotechnical Analysis Report (GAR).

  2. Analysis of the psychometric properties of two different concept-map assessment tasks

    Science.gov (United States)

    Plummer, Kenneth James

    The ability to make sense of a wide array of stimuli presupposes the human tendency to organize information in a meaningful way. Efforts to assess the degree to which students organize information meaningfully have been hampered by several factors including the idiosyncratic way in which individuals represent their knowledge either with words or visually. Concept maps have been used as tools by researchers and educators alike to assist students in understanding the conceptual interrelationships within a subject domain. One concept-map assessment in particular known as the construct-a-map task has shown great promise in facilitating reliable and valid inferences from student concept-map ratings. With all of its promise, however, the construct-a-map task is burdened with several rating difficulties. One challenge in particular is that no published rubric has been developed that accounts for the degree to which individual propositions are important to an understanding of the overall topic or theme of the map. This study represents an attempt to examine the psychometric properties of two construct-a-map tasks designed to overcome in part this rating difficulty. The reliability of the concept-map ratings was calculated using a person-by-rater-by-occasion fully crossed design. This design made it possible to use generalizability theory to identify and estimate the variance in the ratings contributed by the three factors mentioned, the interaction effects, and unexplained error. The criterion validity of the concept-map ratings was examined by computing Pearson correlations between concept-map and essay ratings and concept-map and interview transcript ratings. The generalizability coefficients for student mean ratings were moderate to very high: .73 and .94 for the first concept-mapping task and .74 and .87 for the second concept-mapping task. A relatively large percentage of the rating variability was contributed by the object of measurement. Both tasks correlated highly

  3. Meta-analysis: Effects of glycerol administration on plasma volume, haemoglobin, and haematocrit.

    Science.gov (United States)

    Koehler, Karsten; Thevis, Mario; Schaenzer, Wilhelm

    2013-01-01

    The use of glycerol in combination with excess fluid can be used to increase total body water. Because glycerol hyperhydration may also be misused to mask the effects of blood doping on doping-relevant parameters, namely haemoglobin and haematocrit, glycerol has been prohibited by the World Anti-Doping Agency since 2010. In order to test this rationale, the purpose of this meta-analysis was to quantify the effects of glycerol hyperhydration on plasma volume, haemoglobin, and haematocrit in comparison to administration of fluid only. Following a literature search, a total of seven studies was included and meta-analyses were performed separately for the effects on plasma volume (5 studies, total n = 54) and on haemoglobin (6 studies, n = 52) and haematocrit (6 studies, n = 52). The meta-analysis revealed that the increase in plasma volume was 3.3% larger (95%-CI: 1.1-5.5%) after glycerol administration when compared to fluid only. Reductions in haemoglobin were 0.2 g/dl (95%-CI: -0.3, 0.0) larger and there was no difference in the changes in haematocrit between glycerol and fluid administration (95%-CI: -0.7-0.8%). In comparison with other plasma-volume expanding agents, glycerol hyperhydration has a very limited potential in increasing plasma volume and altering doping-relevant blood parameters. PMID:24353192

  4. Doppler sonography of diabetic feet: Quantitative analysis of blood flow volume

    Energy Technology Data Exchange (ETDEWEB)

    Seo, Young Lan; Kim, Ho Chul; Choi, Chul Soon; Yoon, Dae Young; Han, Dae Hee; Moon, Jeung Hee; Bae, Sang Hoon [Hallym University College of Medicine, Seoul (Korea, Republic of)

    2002-09-15

    To analyze Doppler sonographic findings of diabetic feet by estimating quantitative blood flow volume and by analyzing Doppler waveforms. Doppler sonography was performed in thirty-four patients (10 diabetic patients with foot ulceration, 14 diabetic patients without ulceration, and 10 normal patients as the control group) to measure the flow volume of the arteries of the lower extremities (posterior and anterior tibial arteries, and distal femoral artery). Analysis of Doppler waveforms was also done to evaluate the nature of the changed blood flow volume of diabetic patients; the waveforms were classified into triphasic, biphasic-1, biphasic-2, and monophasic patterns. Flow volume of arteries in diabetic patients with foot ulceration was increased with statistical significance when compared to that of diabetic patients without foot ulceration or that of the normal control group (p<0.05). Analysis of Doppler waveforms revealed that the frequency of the biphasic-2 pattern was significantly higher in diabetic patients than in the normal control group (p<0.05). Doppler sonography in diabetic feet showed increased flow volume and biphasic Doppler waveforms; these findings suggest neuropathy rather than ischemic change in diabetic feet.
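
    As background on the quantitative measure: volume flow is conventionally the time-averaged mean velocity multiplied by the vessel's cross-sectional area. A minimal sketch with illustrative numbers, not the study's measurements:

```python
import math

def flow_volume_ml_per_min(tamv_cm_s, diameter_cm):
    """Volume flow from time-averaged mean velocity (TAMV) and vessel
    diameter, assuming a circular lumen. The example values below are
    illustrative only."""
    area_cm2 = math.pi * (diameter_cm / 2.0) ** 2
    return tamv_cm_s * area_cm2 * 60.0     # cm^3/s * 60 -> mL/min

print(flow_volume_ml_per_min(tamv_cm_s=8.0, diameter_cm=0.25))  # ~23.6 mL/min
```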

  5. Guidelines on routine cerebrospinal fluid analysis. Report from an EFNS task force

    DEFF Research Database (Denmark)

    Deisenhammer, F; Bartos, A; Egg, R;

    2006-01-01

    total protein, albumin, immunoglobulins, glucose, lactate, cell count, cytological staining, and investigation of infectious CSF. The methods included a systematic Medline search for the above-mentioned variables and review of appropriate publications by one or more of the task force members. Grading … of evidence and recommendations was based on consensus by all task force members. It is recommended that CSF should be analysed immediately after collection. If storage is needed, 12 ml of CSF should be partitioned into three to four sterile tubes. Albumin CSF/serum ratio (Qalb) should be preferred to total …

  6. A 259.6 μW HRV-EEG Processor With Nonlinear Chaotic Analysis During Mental Tasks.

    Science.gov (United States)

    Roh, Taehwan; Hong, Sunjoo; Cho, Hyunwoo; Yoo, Hoi-Jun

    2016-02-01

    A system-on-chip (SoC) with nonlinear chaotic analysis (NCA) is presented for mental task monitoring. The proposed processor handles both heart rate variability (HRV) and electroencephalography (EEG). An independent component analysis (ICA) accelerator decreases the error of HRV extraction from 5.94% to 1.84% in the preprocessing step. Largest Lyapunov exponents (LLE), as well as linear features such as mean, standard deviation, and sub-band power, are calculated with NCA acceleration. Measurements with mental task protocols result in a confidence level of 95%. Thanks to the hardware acceleration, the chaos processor, fabricated in 0.13 μm CMOS technology, consumes only 259.6 μW. PMID:25616073

  7. Task and person-focused leadership behaviors and team performance: A meta-analysis.

    NARCIS (Netherlands)

    Ceri-Booms, Meltem; Curseu, P.L.; Oerlemans, L.A.G.

    2017-01-01

    This paper reports the results of a meta-analytic review of the relationship between person- and task-oriented leader behaviors, on the one hand, and team performance, on the other. The results, based on 89 independent samples, show a moderate positive (ρ = .33) association between both types of leader behaviors …

  8. Development and Confirmatory Factor Analysis of the Achievement Task Value Scale for University Students

    Science.gov (United States)

    Lou, Yu-Chiung; Lin, Hsiao-Fang; Lin, Chin-Wen

    2013-01-01

    The aims of the study were (a) to develop a scale to measure university students' task value and (b) to use confirmatory factor analytic techniques to investigate the construct validity of the scale. The questionnaire items were developed based on theoretical considerations and the final version contained 38 items divided into 4 subscales.…

  9. Path Analysis Examining Self-Efficacy and Decision-Making Performance on a Simulated Baseball Task

    Science.gov (United States)

    Hepler, Teri J.; Feltz, Deborah L.

    2012-01-01

    The purpose of this study was to examine the relationship between decision-making self-efficacy and decision-making performance in sport. Undergraduate students (N = 78) performed 10 trials of a decision-making task in baseball. Self-efficacy was measured before performing each trial. Decision-making performance was assessed by decision speed and…

  10. Agriculture Technical Advisory Committee on Curriculum Development. Job Clusters, Competencies and Task Analysis.

    Science.gov (United States)

    Northern Montana Coll., Havre. Montana Center for Vocational Education, Research, Curriculum and Personnel Development.

    This skills inventory for agricultural occupations was developed by a technical committee in Montana to assist in the development of model curricula and to address state labor market needs. The committee included employers from the forestry industry, members from trade and professional associations, and educators. The validated task list and…

  11. Forestry Technical Advisory Committee on Curriculum Development. Job Clusters, Competencies and Task Analysis.

    Science.gov (United States)

    Northern Montana Coll., Havre. Montana Center for Vocational Education, Research, Curriculum and Personnel Development.

    This skills inventory for forestry occupations was developed by a technical committee in Montana to assist in the development of model curricula and to address state labor market needs. The committee included employers from the forestry industry, members from trade and professional associations, and educators. The validated task list and defined…

  12. A meta-analysis of the impact of situationally induced achievement goals on task performance

    NARCIS (Netherlands)

    Van Yperen, Nico W.; Blaga, Monica; Postmes, Thomas

    2015-01-01

    The purpose of this research was to meta-analyze studies that experimentally induced an achievement goal state to examine its causal effect on the individual’s performance at the task at hand, and to investigate the moderator effects of feedback anticipation and time pressure. The data set comprised …

  13. Task analysis of information technology-mediated medication management in outpatient care

    NARCIS (Netherlands)

    Stiphout, F. van; Zwart-van Rijkom, J.E.F.; Maggio, L.A.; Aarts, J.E.C.M.; Bates, D.W.; Gelder, T. van; Jansen, P.A.F.; Schraagen, J.M.C.; Egberts, A.C.G.; Braak, E.W.M.T. ter

    2015-01-01

    Aims Educating physicians in the procedural as well as cognitive skills of information technology (IT)-mediated medication management could be one of the missing links for the improvement of patient safety. We aimed to compose a framework of tasks that need to be addressed to optimize medication management …

  14. Task analysis of IT-mediated medication management in outpatient care

    NARCIS (Netherlands)

    Stiphout, van F.; Zwart-van Rijkom, J.E.F.; Maggio, L.A.; Aarts, J.E.C.M.; Bates, D.W.; Gelder, van T.; Jansen, P.A.F.; Schraagen, J.M.C.; Egberts, A.C.G.; Braak, ter E.W.M.T.

    2015-01-01

    Aim Educating physicians in the procedural as well as cognitive skills of IT-mediated medication management could be one of the missing links for improvement of patient safety. We aimed to compose a framework of tasks that need to be addressed to optimize medication management in outpatient care. Methods …

  15. The effect of exercise-induced arousal on cognitive task performance: a meta-regression analysis.

    Science.gov (United States)

    Lambourne, Kate; Tomporowski, Phillip

    2010-06-23

    The effects of acute exercise on cognitive performance were examined using meta-analytic techniques. The overall mean effect size was dependent on the timing of cognitive assessment. During exercise, cognitive task performance was impaired by a mean effect of -0.14. However, impairments were only observed during the first 20 min of exercise. Otherwise, exercise-induced arousal enhanced performance on tasks that involved rapid decisions and automatized behaviors. Following exercise, cognitive task performance improved by a mean effect of 0.20. Arousal continued to facilitate speeded mental processes and also enhanced memory storage and retrieval. Positive effects were observed following exercise regardless of whether the study protocol was designed to measure the effects of steady-state exercise, fatiguing exercise, or the inverted-U hypothesis. Finally, cognitive performance was affected differentially by exercise mode. Cycling was associated with enhanced performance during and after exercise, whereas treadmill running led to impaired performance during exercise and a small improvement in performance following exercise. These results are indicative of the complex relation between exercise and cognition. Cognitive performance may be enhanced or impaired depending on when it is measured, the type of cognitive task selected, and the type of exercise performed. PMID:20381468

  16. An Analysis of Image Retrieval Tasks in the Field of Art History.

    Science.gov (United States)

    Chen, Hsin-liang

    2001-01-01

    Investigated undergraduate art history majors' image retrieval tasks and image query modes. Discusses gender differences; prior information retrieval experience; significant differences between the number of search terms users planned to use and the number they actually used; and implications for image indexing tools, image retrieval system…

  17. Effects of Task Analysis and Self-Monitoring for Children with Autism in Multiple Social Settings

    Science.gov (United States)

    Parker, Daniel; Kamps, Debra

    2011-01-01

    In this study, written task analyses with self-monitoring were used to teach functional skills and verbal interactions to two high-functioning students with autism in social settings with peers. A social script language intervention was included in two of the activities to increase the quantity of verbal interaction between the students and peers.…

  18. Oak Ridge Reservation volume I. Y-12 mercury task force files: A guide to record series of the Department of Energy and its contractors

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-02-17

    The purpose of this guide is to describe each of the series of records identified in the documents of the Y-12 Mercury Task Force Files that pertain to the use of mercury in the separation and enrichment of lithium isotopes at the Department of Energy's (DOE) Y-12 Plant in Oak Ridge, Tennessee. History Associates Incorporated (HAI) prepared this guide as part of DOE's Epidemiologic Records Inventory Project, which seeks to verify and conduct inventories of epidemiologic and health-related records at various DOE and DOE contractor sites. This introduction briefly describes the Epidemiologic Records Inventory Project and HAI's role in the project. Specific attention will be given to the history of the DOE-Oak Ridge Reservation, the development of the Y-12 Plant, and the use of mercury in the production of nuclear weapons during the 1950s and early 1960s. This introduction provides background information on the Y-12 Mercury Task Force Files, an assembly of documents resulting from the 1983 investigation of the Mercury Task Force into the effects of mercury toxicity upon workplace hygiene and worker health, the unaccountable loss of mercury, and the impact of those losses upon the environment. This introduction also explains the methodology used in the selection and inventory of these record series. Other topics include the methodology used to produce this guide, the arrangement of the detailed record series descriptions, and information concerning access to the collection.

  19. Oak Ridge Reservation volume I. Y-12 mercury task force files: A guide to record series of the Department of Energy and its contractors

    International Nuclear Information System (INIS)

    The purpose of this guide is to describe each of the series of records identified in the documents of the Y-12 Mercury Task Force Files that pertain to the use of mercury in the separation and enrichment of lithium isotopes at the Department of Energy's (DOE) Y-12 Plant in Oak Ridge, Tennessee. History Associates Incorporated (HAI) prepared this guide as part of DOE's Epidemiologic Records Inventory Project, which seeks to verify and conduct inventories of epidemiologic and health-related records at various DOE and DOE contractor sites. This introduction briefly describes the Epidemiologic Records Inventory Project and HAI's role in the project. Specific attention will be given to the history of the DOE-Oak Ridge Reservation, the development of the Y-12 Plant, and the use of mercury in the production of nuclear weapons during the 1950s and early 1960s. This introduction provides background information on the Y-12 Mercury Task Force Files, an assembly of documents resulting from the 1983 investigation of the Mercury Task Force into the effects of mercury toxicity upon workplace hygiene and worker health, the unaccountable loss of mercury, and the impact of those losses upon the environment. This introduction also explains the methodology used in the selection and inventory of these record series. Other topics include the methodology used to produce this guide, the arrangement of the detailed record series descriptions, and information concerning access to the collection

  20. Organizational analysis and safety for utilities with nuclear power plants: an organizational overview. Volume 1

    International Nuclear Information System (INIS)

    This two-volume report presents the results of initial research on the feasibility of applying organizational factors in nuclear power plant (NPP) safety assessment. A model is introduced for the purposes of organizing the literature review and showing key relationships among identified organizational factors and nuclear power plant safety. Volume I of this report contains an overview of the literature, a discussion of available safety indicators, and a series of recommendations for more systematically incorporating organizational analysis into investigations of nuclear power plant safety

  1. Pulsed Direct Current Electrospray: Enabling Systematic Analysis of Small Volume Sample by Boosting Sample Economy.

    Science.gov (United States)

    Wei, Zhenwei; Xiong, Xingchuang; Guo, Chengan; Si, Xingyu; Zhao, Yaoyao; He, Muyi; Yang, Chengdui; Xu, Wei; Tang, Fei; Fang, Xiang; Zhang, Sichun; Zhang, Xinrong

    2015-11-17

    We developed pulsed direct current electrospray ionization mass spectrometry (pulsed-dc-ESI-MS) for systematically profiling and determining components in small-volume samples. Pulsed-dc-ESI utilizes a constant high voltage to remotely induce the generation of a single-polarity pulsed electrospray. This method significantly boosts sample economy, obtaining several minutes of MS signal duration from a merely picoliter-volume sample. The elongated MS signal duration enables the collection of abundant MS(2) information on components of interest in a small-volume sample for systematic analysis. The method was successfully applied to single-cell metabolomics analysis. We obtained 2-D profiles of metabolites (including exact mass and MS(2) data) from single plant and mammalian cells, concerning 1034 components and 656 components for Allium cepa and HeLa cells, respectively. Further identification found 162 compounds and 28 different modification groups of 141 saccharides in a single Allium cepa cell, indicating that pulsed-dc-ESI is a powerful tool for the systematic analysis of small-volume samples.

  2. Improving the clinical correlation of multiple sclerosis black hole volume change by paired-scan analysis.

    Science.gov (United States)

    Tam, Roger C; Traboulsee, Anthony; Riddehough, Andrew; Li, David K B

    2012-01-01

    The change in T1-hypointense lesion ("black hole") volume is an important marker of pathological progression in multiple sclerosis (MS). Black hole boundaries often have low contrast and are difficult to determine accurately, and most (semi-)automated segmentation methods first compute the T2-hyperintense lesions, which are a superset of the black holes and are typically more distinct, to form a search space for the T1w lesions. Two main potential sources of measurement noise in longitudinal black hole volume computation are partial volume and variability in the T2w lesion segmentation. A paired analysis approach is proposed herein that uses registration to equalize partial volume and lesion mask processing to combine T2w lesion segmentations across time. The scans of 247 MS patients are used to compare a selected black hole computation method with an enhanced version incorporating paired analysis, using rank correlation to a clinical variable (MS functional composite) as the primary outcome measure. The comparison is done at nine different levels of intensity, as a previous study suggests that darker black holes may yield stronger correlations. The results demonstrate that paired analysis can strongly improve longitudinal correlation (from -0.148 to -0.303 in this sample) and may produce segmentations that are more sensitive to clinically relevant changes. PMID:24179734
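
    The primary outcome measure, rank correlation between volume change and a clinical variable, can be sketched as follows; the data are synthetic stand-ins, since the 247-patient set is not reproduced here.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)

# Synthetic stand-ins for the per-patient measurements:
# black-hole volume change vs. MSFC change.
vol_change  = rng.normal(0.0, 1.0, 247)
msfc_change = -0.3 * vol_change + rng.normal(0.0, 1.0, 247)

rho, p = spearmanr(vol_change, msfc_change)
print(f"Spearman rho = {rho:.3f} (p = {p:.2g})")
```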

  3. Estimation of cell volume and biomass of Penicillium chrysogenum using image analysis.

    Science.gov (United States)

    Packer, H L; Keshavarz-Moore, E; Lilly, M D; Thomas, C R

    1992-02-20

    A methodology for the estimation of biomass for the penicillin fermentation using image analysis is presented. Two regions of hyphae are defined to describe the growth of mycelia during fermentation: (1) the cytoplasmic region, and (2) the degenerated region including large vacuoles. The volume occupied by each of these regions in a fixed volume of sample is estimated from area measurements using image analysis. Areas are converted to volumes by treating the hyphae as solid cylinders with the hyphal diameter as the cylinder diameter. The volumes of the cytoplasmic and degenerated regions are converted into dry weight estimations using hyphal density values available from the literature. The image analysis technique is able to estimate biomass even in the presence of nondissolved solids of a concentration of up to 30 gL(-1). It is shown to estimate successfully concentrations of mycelia from 0.03 to 38 gL(-1). Although the technique has been developed for the penicillin fermentation, it should be applicable to other (nonpellected) fungal fermentations.
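
    The area-to-biomass conversion the abstract describes reduces to simple cylinder geometry; below is a sketch under assumed density and dry-matter values, which stand in for the literature figures the paper cites.

```python
import math

def hyphal_dry_weight_g(projected_area_um2, diameter_um,
                        density_g_per_ml=1.1, dry_fraction=0.25):
    """Convert an image-analysis area measurement to a dry-weight estimate,
    treating hyphae as solid cylinders: volume = area * pi * d / 4.
    Density and dry-matter fraction are illustrative stand-ins."""
    volume_um3 = projected_area_um2 * math.pi * diameter_um / 4.0
    volume_ml  = volume_um3 * 1e-12        # 1 um^3 = 1e-12 mL (= 1e-12 cm^3)
    return volume_ml * density_g_per_ml * dry_fraction

# Example: 1e9 um^2 of projected hyphal area at 3 um mean diameter.
print(hyphal_dry_weight_g(1e9, 3.0))
```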

  4. Leukocyte telomere length and hippocampus volume: a meta-analysis [version 1; referees: 2 approved

    Directory of Open Access Journals (Sweden)

    Gustav Nilsonne

    2015-10-01

    Leukocyte telomere length has been shown to correlate with hippocampus volume, but effect estimates differ in magnitude and are not uniformly positive. This study aimed primarily to investigate the relationship between leukocyte telomere length and hippocampus gray matter volume by meta-analysis, and secondarily to investigate possible effect moderators. Five studies were included with a total of 2107 participants, of which 1960 were contributed by one single influential study. A random-effects meta-analysis estimated the effect at r = 0.12 [95% CI -0.13, 0.37] in the presence of heterogeneity and a subjectively estimated moderate to high risk of bias. There was no evidence that apolipoprotein E (APOE) genotype was an effect moderator, nor that the ratio of leukocyte telomerase activity to telomere length was a better predictor than leukocyte telomere length for hippocampus volume. This meta-analysis, while not proving a positive relationship, is also unable to disprove the earlier finding of a positive correlation in the one large study included in the analyses. We propose that a relationship between leukocyte telomere length and hippocampus volume may be mediated by transmigrating monocytes which differentiate into microglia in the brain parenchyma.
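
    The usual mechanics of pooling correlations (Fisher z-transform, inverse-variance weighting, back-transform) can be sketched as follows. Note the study used a random-effects model, whereas this minimal sketch is fixed-effect, with invented inputs dominated by one large study, to mirror the situation described.

```python
import numpy as np

# Invented correlations and sample sizes for five studies (one large,
# four small), standing in for the real data set.
r = np.array([0.15, 0.30, -0.05, 0.10, 0.25])
n = np.array([1960,   40,    35,   42,   30])

z = np.arctanh(r)              # Fisher z-transform
w = n - 3                      # inverse sampling variance of z
z_bar = np.sum(w * z) / np.sum(w)
se = np.sqrt(1.0 / np.sum(w))
lo, hi = np.tanh(z_bar - 1.96 * se), np.tanh(z_bar + 1.96 * se)
print(f"pooled r = {np.tanh(z_bar):.2f} (95% CI {lo:.2f} to {hi:.2f})")
```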

  6. Molecular modeling and structural analysis of two-pore domain potassium channels TASK1 interactions with the blocker A1899

    Directory of Open Access Journals (Sweden)

    David Mauricio Ramirez

    2015-03-01

    A1899 is a potent and highly selective blocker of the two-pore domain potassium (K2P) channel TASK-1. It acts as an antagonist, blocking the K+ flux, binds to TASK-1 in the inner cavity, and shows activity in the nanomolar range. The drug travels through the central cavity and finally binds at the bottom of the selectivity filter, where threonines and water molecules form an H-bond network alongside several hydrophobic interactions. Using alanine mutagenesis screens, the binding site was identified as involving residues in the P1 and P2 pore loops, the M2 and M4 transmembrane segments, and the halothane response element; mutations were introduced in the human TASK-1 (KCNK3, NM_002246) expressed in oocytes from anesthetized Xenopus laevis frogs. Based on molecular modeling and structural analysis, such as molecular docking and binding free energy calculations, a pose was suggested using TASK-1 homology models. Recently, various K2P crystal structures have been obtained. We wanted to redefine, from a structural point of view, the binding mode of A1899 in TASK-1 homology models using the K2P crystal structures as templates. By computational structural analysis we describe the molecular basis of the A1899 binding mode and how A1899 travels to its binding site, and we suggest an interacting pose (Figure 1). After 100 ns of molecular dynamics simulation (MDs), we found an intramolecular H-bond (80% of the total MDs), an H-bond with Thr93 (42% of the total MDs), a pi-pi stacking interaction between a ring and Phe125 (88% of the total MDs), and several water bridges. Our experimental and computational results allow a molecular understanding of the structural binding mechanism of the selective blocker A1899 to TASK-1 channels. We identified the common and divergent structural features of the TASK-1 channel through our theoretical and experimental studies of A1899 drug action.

  7. Do depressive symptoms "blunt" effort? An analysis of cardiac engagement and withdrawal for an increasingly difficult task.

    Science.gov (United States)

    Silvia, Paul J; Mironovová, Zuzana; McHone, Ashley N; Sperry, Sarah H; Harper, Kelly L; Kwapil, Thomas R; Eddington, Kari M

    2016-07-01

    Research on depression and effort has suggested "depressive blunting"-lower cardiovascular reactivity in response to challenges and stressors. Many studies, however, find null effects or higher reactivity. The present research draws upon motivational intensity theory, a broad model of effort that predicts cases in which depressive symptoms should increase or decrease effort. Because depressive symptoms can influence task-difficulty appraisals-people see tasks as subjectively harder-people high in depressive symptoms should engage higher effort at objectively easier levels of difficulty but also quit sooner. A sample of adults completed a mental effort challenge with four levels of difficulty, from very easy to difficult-but-feasible. Depressive symptoms were assessed with the CESD and DASS; effort-related cardiac activity was assessed via markers of contractility (e.g., the cardiac pre-ejection period [PEP]) obtained with impedance cardiography. The findings supported the theory's predictions. When the task was relatively easier, people high in depressive symptoms showed higher contractility (shorter PEP), consistent with greater effort. When the task was relatively harder, people high in depressive symptoms showed diminished contractility, consistent with quitting. The results suggest that past research has been observing a small part of a larger trajectory of trying and quitting, and they illustrate the value of a theoretically grounded analysis of depressive symptoms and effort-related cardiac activity. PMID:27174723

  9. Investigation of advanced counterrotation blade configuration concepts for high speed turboprop systems. Task 5: Unsteady counterrotation ducted propfan analysis. Computer program user's manual

    Science.gov (United States)

    Hall, Edward J.; Delaney, Robert A.; Adamczyk, John J.; Miller, Christopher J.; Arnone, Andrea; Swanson, Charles

    1993-01-01

    The primary objective of this study was the development of a time-marching three-dimensional Euler/Navier-Stokes aerodynamic analysis to predict steady and unsteady compressible transonic flows about ducted and unducted propfan propulsion systems employing multiple blade rows. The computer codes resulting from this study are referred to as ADPAC-AOACR (Advanced Ducted Propfan Analysis Codes-Angle of Attack Coupled Row). This report is intended to serve as a computer program user's manual for the ADPAC-AOACR codes developed under Task 5 of NASA Contract NAS3-25270, Unsteady Counterrotating Ducted Propfan Analysis. The ADPAC-AOACR program is based on a flexible multiple blocked grid discretization scheme permitting coupled 2-D/3-D mesh block solutions with application to a wide variety of geometries. For convenience, several standard mesh block structures are described for turbomachinery applications. Aerodynamic calculations are based on a four-stage Runge-Kutta time-marching finite volume solution technique with added numerical dissipation. Steady flow predictions are accelerated by a multigrid procedure. Numerical calculations are compared with experimental data for several test cases to demonstrate the utility of this approach for predicting the aerodynamics of modern turbomachinery configurations employing multiple blade rows.
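
    The ADPAC-AOACR codes are full production solvers; as a minimal illustration of the four-stage Runge-Kutta time marching the abstract describes, here is a toy sketch on a 1-D advection residual. The stage coefficients and the upwind residual are generic assumptions, not the code's actual scheme.

```python
import numpy as np

def rk4_stage_step(u, residual, dt, alphas=(0.25, 1/3, 0.5, 1.0)):
    """One four-stage Runge-Kutta time step of the kind used in
    finite-volume flow solvers: u_k = u_0 - alpha_k * dt * R(u_{k-1})."""
    u0 = u.copy()
    for a in alphas:
        u = u0 - a * dt * residual(u)
    return u

# Toy residual: periodic 1-D linear advection with first-order upwinding.
def residual(u, dx=0.01, c=1.0):
    return c * (u - np.roll(u, 1)) / dx

x = np.linspace(0.0, 1.0, 100, endpoint=False)
u = np.exp(-200.0 * (x - 0.5) ** 2)
for _ in range(50):                       # CFL = c*dt/dx = 0.5
    u = rk4_stage_step(u, residual, dt=0.005)
```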

  10. Investigation of advanced counterrotation blade configuration concepts for high speed turboprop systems. Task 2: Unsteady ducted propfan analysis computer program users manual

    Science.gov (United States)

    Hall, Edward J.; Delaney, Robert A.; Bettner, James L.

    1991-01-01

    The primary objective of this study was the development of a time-dependent three-dimensional Euler/Navier-Stokes aerodynamic analysis to predict unsteady compressible transonic flows about ducted and unducted propfan propulsion systems at angle of attack. The computer codes resulting from this study are referred to as Advanced Ducted Propfan Analysis Codes (ADPAC). This report is intended to serve as a computer program user's manual for the ADPAC developed under Task 2 of NASA Contract NAS3-25270, Unsteady Ducted Propfan Analysis. Aerodynamic calculations were based on a four-stage Runge-Kutta time-marching finite volume solution technique with added numerical dissipation. A time-accurate implicit residual smoothing operator was utilized for unsteady flow predictions. For unducted propfans, a single H-type grid was used to discretize each blade passage of the complete propeller. For ducted propfans, a coupled system of five grid blocks utilizing an embedded C-grid about the cowl leading edge was used to discretize each blade passage. Grid systems were generated by a combined algebraic/elliptic algorithm developed specifically for ducted propfans. Numerical calculations were compared with experimental data for both ducted and unducted propfan flows. The solution scheme demonstrated efficiency and accuracy comparable with other schemes of this class.

  11. Organizational analysis and safety for utilities with nuclear power plants: perspectives for organizational assessment. Volume 2

    International Nuclear Information System (INIS)

    This two-volume report presents the results of initial research on the feasibility of applying organizational factors in nuclear power plant (NPP) safety assessment. Volume 1 of this report contains an overview of the literature, a discussion of available safety indicators, and a series of recommendations for more systematically incorporating organizational analysis into investigations of nuclear power plant safety. The six chapters of this volume discuss the major elements in our general approach to safety in the nuclear industry. The chapters include information on organizational design and safety; organizational governance; utility environment and safety related outcomes; assessments by selected federal agencies; review of data sources in the nuclear power industry; and existing safety indicators

  12. Volumes of the hippocampus and amygdala in patients with borderline personality disorder: a meta-analysis.

    Science.gov (United States)

    Nunes, Paulo Menezes; Wenzel, Amy; Borges, Karinne Tavares; Porto, Cristianne Ribeiro; Caminha, Renato Maiato; de Oliveira, Irismar Reis

    2009-08-01

    Individuals with borderline personality disorder (BPD) often exhibit impulsive and aggressive behavior. The hippocampus and amygdala form part of the limbic system, which plays a central role in controlling such expressions of emotional reactivity. There are mixed results in the literature regarding whether patients with BPD have smaller hippocampal and amygdalar volume relative to healthy controls. To clarify the precise nature of these mixed results, we performed a meta-analysis to aggregate data on the size of the hippocampus and amygdala in patients with BPD. Seven publications involving six studies and a total of 104 patients with BPD and 122 healthy controls were included. A significantly smaller volume was found in both the right and left hippocampi and amygdala of patients with BPD compared to healthy controls. These findings raise the possibility that reduced hippocampal and amygdalar volumes are biological substrates of some symptoms of BPD. PMID:19663654

  13. Laser Raman spectroscopic analysis of polymorphic forms in microliter fluid volumes.

    Science.gov (United States)

    Anquetil, Patrick A; Brenan, Colin J H; Marcolli, Claudia; Hunter, Ian W

    2003-01-01

    Knowledge and control of the polymorphic phase of chemical compounds are important aspects of drug development in the pharmaceutical industry. We report herein in situ and real-time Raman spectroscopic polymorphic analysis of optically trapped microcrystals in a microliter volume format. The system studied in particular was the recrystallization of carbamazepine (CBZ) in methanol. Raman spectrometry enabled noninvasive measurement of the amount of dissolved CBZ in a sample as well as polymorphic characterization, whereas exclusive recrystallization of either CBZ form I or CBZ form III from saturated solutions was achieved by specific selection of sample cell cooling profiles. Additionally, using a microcell versus a macroscopic volume gives the advantage of reaching equilibrium much faster while using little compound quantity. We demonstrate that laser Raman spectral polymorphic analysis in a microliter cell is a potentially viable screening platform for polymorphic analysis and could lead to a new high throughput method for polymorph screening.

  14. EEG Analysis during complex diagnostic tasks in Nuclear Power Plants - Simulator-based Experimental Study

    International Nuclear Information System (INIS)

    In the literature there are many studies based on EEG signals recorded during cognitive activities of human beings, but most of them deal with simple cognitive activities such as transforming letters into Morse code, subtraction, reading, semantic memory search, visual search, and memorizing a set of words. In this work, EEG signals were analyzed during complex diagnostic tasks in an NPP simulator-based environment. Investigated are the theta, alpha, beta, and gamma band EEG powers during the diagnostic tasks. The experimental design and procedure are presented in section 2 and the results are shown in section 3. Finally, some considerations are discussed and the direction of further work is proposed in section 4
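
    Band-power extraction of the kind described can be sketched with a Welch periodogram; the band edges below are the conventional ones and may differ from those used in the experiment.

```python
import numpy as np
from scipy.signal import welch

def band_powers(eeg, fs=256):
    """Absolute EEG power in the classical bands via Welch's PSD.
    Band edges are conventional values, assumed for illustration."""
    f, psd = welch(eeg, fs=fs, nperseg=fs * 2)
    df = f[1] - f[0]
    bands = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}
    return {name: psd[(f >= lo) & (f < hi)].sum() * df
            for name, (lo, hi) in bands.items()}

print(band_powers(np.random.randn(256 * 10)))   # 10 s of synthetic EEG
```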

  15. Brain wave correlates of attentional states: Event related potentials and quantitative EEG analysis during performance of cognitive and perceptual tasks

    Science.gov (United States)

    Freeman, Frederick G.

    1993-01-01

    presented target stimulus. In addition to the task requirements, irrelevant tones were presented in the background. Research has shown that even though these stimuli are not attended, ERPs to them can still be elicited. The amplitude of the ERP waves has been shown to change as a function of a person's level of alertness. ERPs were also collected and analyzed for the target stimuli for each task. Brain maps were produced based on the ERP voltages for the different stimuli. In addition to the ERPs, a quantitative EEG (QEEG) was performed on the data using a fast Fourier technique to produce a power spectral analysis of the EEG. This analysis was conducted on the continuous EEG while the subjects were performing the tasks. Finally, a QEEG was performed on periods during the task when subjects indicated that they were in an altered state of awareness. During the tasks, subjects were asked to indicate by pressing a button when they realized their level of task awareness had changed. EEG epochs were collected for times just before and just after subjects made this response. The purpose of this final analysis was to determine whether or not subjective indices of level of awareness could be correlated with different patterns of EEG.

  16. Perfusion analysis using a wide coverage flat-panel volume CT: feasibility study

    Science.gov (United States)

    Grasruck, M.; Gupta, R.; Reichardt, B.; Klotz, E.; Schmidt, B.; Flohr, T.

    2007-03-01

    We developed a flat-panel-detector-based Volume CT (VCT) prototype scanner with large z-coverage. In that prototype scanner, a Varian 4030CB a-Si flat-panel detector was mounted in a multi-slice CT gantry (Siemens Medical Solutions), which provides a 25 cm field of view with 18 cm z-coverage at isocenter. The large volume covered in one rotation can be used for visualization of complete organs of small animals, e.g. rabbits. By implementing a mode with continuous scanning, we are able to reconstruct the complete volume at any point in time during the propagation of a contrast bolus. Multiple volumetric reconstructions over time elucidate the first-pass dynamics of a contrast bolus, resulting in 4-D angiography and potentially allowing whole-organ perfusion analysis. We studied the extent to which pixel-based permeability and blood-volume calculation with a modified Patlak approach is possible. Experimental validation was performed by imaging the evolution of a contrast bolus in New Zealand rabbits. Despite the short circulation time of a rabbit, the temporal resolution was sufficient to visually resolve the various phases of the first pass of the contrast bolus. Perfusion imaging required substantial spatial smoothing but allowed a qualitative discrimination of different types of parenchyma in brain and liver. Whether a true quantitative analysis is possible requires further study.
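
    The modified Patlak approach builds on the classic Patlak linearization; below is a sketch of the general idea only (not the scanner-specific model), assuming a strictly positive plasma curve.

```python
import numpy as np

def patlak_fit(t, c_tissue, c_plasma):
    """Classic Patlak linearization:
        C_t(t)/C_p(t) = Ki * Integral(C_p) / C_p(t) + V0,
    fit by least squares; returns the influx/permeability constant Ki and
    the intercept V0 (~ blood volume). Assumes c_plasma > 0 everywhere."""
    integral = np.concatenate(([0.0],
        np.cumsum(0.5 * (c_plasma[1:] + c_plasma[:-1]) * np.diff(t))))
    x = integral / c_plasma
    y = c_tissue / c_plasma
    ki, v0 = np.polyfit(x, y, 1)           # slope = Ki, intercept = V0
    return ki, v0
```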

  17. Analysis of Load-Carrying Capacity for Redundant Free-Floating Space Manipulators in Trajectory Tracking Task

    Directory of Open Access Journals (Sweden)

    Qingxuan Jia

    2014-01-01

    The aim of this paper is to analyze the load-carrying capacity of redundant free-floating space manipulators (FFSM) in trajectory tracking tasks. Combined with the analysis of influential factors in the load-carrying process, evaluation of maximum load-carrying capacity (MLCC) is formulated as a multiconstrained nonlinear programming problem. An efficient algorithm based on repeated line search within a discontinuous feasible region is presented to determine MLCC for a given trajectory of the end-effector and the corresponding joint path. Then, considering the influence on MLCC of different initial configurations at the starting point of the given trajectory, a maximum-payload initial configuration planning method is proposed using the PSO algorithm. Simulations are performed for a particular trajectory tracking task of a 7-DOF space manipulator, for which MLCC is evaluated quantitatively. Through in-depth study of the simulation results, the significant gap between the values of MLCC obtained with different initial configurations is analyzed, and the discontinuity of allowable load-carrying capacity is illustrated. The proposed analytical method can serve as a theoretical foundation for feasibility analysis, trajectory optimization, and optimal control of trajectory tracking tasks in on-orbit load-carrying operations.
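
    The PSO step for choosing a maximum-payload initial configuration can be sketched generically as below; the objective function is a placeholder for the real MLCC evaluation, and all constants (swarm size, inertia, joint limits) are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def mlcc(q0):
    """Stand-in objective: maximum load-carrying capacity as a function of
    the initial configuration q0. The real evaluation (repeated line search
    over the constrained manipulator dynamics) is replaced by a placeholder."""
    return -np.sum((q0 - 0.3) ** 2)

dim, n_particles = 7, 20                     # 7-DOF manipulator
x = rng.uniform(-np.pi, np.pi, (n_particles, dim))
v = np.zeros_like(x)
p_best, p_val = x.copy(), np.array([mlcc(p) for p in x])
g_best = p_best[np.argmax(p_val)]

for _ in range(100):
    r1, r2 = rng.random((2, n_particles, dim))
    v = 0.7 * v + 1.5 * r1 * (p_best - x) + 1.5 * r2 * (g_best - x)
    x = np.clip(x + v, -np.pi, np.pi)        # respect joint limits
    vals = np.array([mlcc(p) for p in x])
    better = vals > p_val
    p_best[better], p_val[better] = x[better], vals[better]
    g_best = p_best[np.argmax(p_val)]

print("best initial configuration:", np.round(g_best, 3))
```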

  18. Team situation awareness in nuclear power plant process control: A literature review, task analysis and future research

    International Nuclear Information System (INIS)

    Operator achievement and maintenance of situation awareness (SA) in nuclear power plant (NPP) process control has emerged as an important concept in defining effective relationships between humans and automation in this complex system. A literature review on factors influencing SA revealed several variables to be important to team SA, including the overall task and team goals, individual tasks, team member roles, and the team members themselves. Team SA can also be adversely affected by a range of factors, including stress, mental over- or under-loading, system design (including human-machine interface design), complexity, human error in perception, and automation. Our research focused on the analysis of 'shared' SA and team SA among an assumed three-person, main-control-room team. Shared SA requirements represent the knowledge that is held in common by NPP operators, and team SA represents the collective, unique knowledge of all operators. The paper describes an approach to goal-directed task analysis (GDTA) applied to NPP main control room operations. In general, the GDTA method reveals critical operator decision and information requirements. It identifies operator SA requirements relevant to performing complex systems control. The GDTA can reveal requirements at various levels of cognitive processing, including perception, comprehension and projection, in NPP process control. Based on the literature review and GDTA approach, a number of potential research issues are proposed with an aim toward understanding and facilitating team SA in NPP process control. (authors)

  19. A Work-Demand Analysis Compatible with Preemption-Aware Scheduling for Power-Aware Real-Time Tasks

    Directory of Open Access Journals (Sweden)

    Da-Ren Chen

    2013-01-01

    Due to the importance of slack-time utilization for power-aware scheduling algorithms, we propose a work-demand analysis method called the parareclamation algorithm (PRA) to increase the slack-time utilization of existing real-time DVS algorithms. PRA is an online scheduling method for power-aware real-time tasks under the rate-monotonic (RM) policy. It can be implemented fully compatibly with preemption-aware or transition-aware scheduling algorithms without increasing their computational complexities. The key technique of the heuristic doubles the analytical interval and turns the deferrable workload into potential slack time. Theoretical proofs show that PRA guarantees task deadlines in a feasible RM schedule and takes linear time and space complexity. Experimental results indicate that the proposed method, combined seamlessly with preemption-aware methods, reduces energy consumption by 14% on average over the original algorithms.
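
    PRA itself cannot be reconstructed from the abstract, but the exact response-time test for rate-monotonic scheduling that such slack analyses build on is standard; a sketch (not the paper's algorithm):

```python
from math import ceil

def rm_response_times(tasks):
    """Exact worst-case response-time analysis for preemptive fixed-priority
    (rate-monotonic) tasks. `tasks` is a list of (C, T) pairs sorted by
    priority (shortest period first); deadlines are assumed equal to periods.
    Returns per-task response times, or None if some deadline is missed."""
    rts = []
    for i, (c_i, t_i) in enumerate(tasks):
        r, prev = c_i, 0
        while r != prev:                 # iterate to a fixed point
            prev = r
            r = c_i + sum(ceil(prev / t_j) * c_j for c_j, t_j in tasks[:i])
            if r > t_i:
                return None
        rts.append(r)
    return rts

print(rm_response_times([(1, 4), (2, 6), (3, 12)]))   # [1, 3, 10]
```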

  20. FINITE VOLUME NUMERICAL ANALYSIS FOR PARABOLIC EQUATION WITH ROBIN BOUNDARY CONDITION

    Institute of Scientific and Technical Information of China (English)

    Xia Cui

    2005-01-01

    In this paper, a finite volume method on unstructured meshes is studied for a parabolic convection-diffusion problem on an open bounded set of Rd (d = 2 or 3) with a Robin boundary condition. Upwinding approximations are adapted to treat both the convection term and the Robin boundary condition. Starting directly from the formulation of the finite volume scheme, a numerical analysis is carried out. By using several discrete functional analysis techniques, such as summation by parts and discrete norm inequalities, the stability and error estimates of the approximate solution are established; existence and uniqueness of the approximate solution and first-order temporal norm and L2 and H1 spatial norm convergence properties are obtained.
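
    To fix ideas, a drastically simplified 1-D, pure-diffusion analogue of a finite volume scheme with a Robin boundary condition is sketched below; the paper's unstructured-mesh convection-diffusion setting is far more general, and all constants here are assumptions.

```python
import numpy as np

# Toy problem: u_t = k*u_xx on (0, 1), cell-centered finite volumes,
# explicit Euler in time. Robin condition -k*u_x(0) = h*(g - u(0)) at the
# left face, insulated right face.
n = 50
dx = 1.0 / n
k, h, g = 1.0, 5.0, 1.0
dt = 0.4 * dx * dx / k                     # within the explicit stability limit
u = np.zeros(n)                            # cell averages, initially 0

for _ in range(2000):
    flux = np.empty(n + 1)                 # flux[i] ~ -k*u_x at face i
    flux[1:-1] = -k * np.diff(u) / dx
    flux[0] = h * (g - u[0])               # Robin inflow at the left face
    flux[-1] = 0.0                         # zero-flux right face
    u += dt / dx * (flux[:-1] - flux[1:])  # conservative update

print(u[:3], u[-3:])                       # approaches g = 1 everywhere
```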

  1. HYDRA-II: A hydrothermal analysis computer code: Volume 3, Verification/validation assessments

    International Nuclear Information System (INIS)

    HYDRA-II is a hydrothermal computer code capable of three-dimensional analysis of coupled conduction, convection, and thermal radiation problems. This code is especially appropriate for simulating the steady-state performance of spent fuel storage systems. The code has been evaluated for this application for the US Department of Energy's Commercial Spent Fuel Management Program. HYDRA-II provides a finite difference solution in cartesian coordinates to the equations governing the conservation of mass, momentum, and energy. A cylindrical coordinate system may also be used to enclose the cartesian coordinate system. This exterior coordinate system is useful for modeling cylindrical cask bodies. The difference equations for conservation of momentum are enhanced by the incorporation of directional porosities and permeabilities that aid in modeling solid structures whose dimensions may be smaller than the computational mesh. The equation for conservation of energy permits modeling of orthotropic physical properties and film resistances. Several automated procedures are available to model radiation transfer within enclosures and from fuel rod to fuel rod. The documentation of HYDRA-II is presented in three separate volumes. Volume I - Equations and Numerics describes the basic differential equations, illustrates how the difference equations are formulated, and gives the solution procedures employed. Volume II - User's Manual contains code flow charts, discusses the code structure, provides detailed instructions for preparing an input file, and illustrates the operation of the code by means of a model problem. This volume, Volume III - Verification/Validation Assessments, provides a comparison between the analytical solution and the numerical simulation for problems with a known solution. This volume also documents comparisons between the results of simulations of single- and multiassembly storage systems and actual experimental data. 11 refs., 55 figs., 13 tabs

  2. HYDRA-II: A hydrothermal analysis computer code: Volume 2, User's manual

    International Nuclear Information System (INIS)

    HYDRA-II is a hydrothermal computer code capable of three-dimensional analysis of coupled conduction, convection, and thermal radiation problems. This code is especially appropriate for simulating the steady-state performance of spent fuel storage systems. The code has been evaluated for this application for the US Department of Energy's Commercial Spent Fuel Management Program. HYDRA-II provides a finite-difference solution in cartesian coordinates to the equations governing the conservation of mass, momentum, and energy. A cylindrical coordinate system may also be used to enclose the cartesian coordinate system. This exterior coordinate system is useful for modeling cylindrical cask bodies. The difference equations for conservation of momentum incorporate directional porosities and permeabilities that are available to model solid structures whose dimensions may be smaller than the computational mesh. The equation for conservation of energy permits modeling of orthotropic physical properties and film resistances. Several automated methods are available to model radiation transfer within enclosures and from fuel rod to fuel rod. The documentation of HYDRA-II is presented in three separate volumes. Volume 1 - Equations and Numerics describes the basic differential equations, illustrates how the difference equations are formulated, and gives the solution procedures employed. This volume, Volume 2 - User's Manual, contains code flow charts, discusses the code structure, provides detailed instructions for preparing an input file, and illustrates the operation of the code by means of a sample problem. The final volume, Volume 3 - Verification/Validation Assessments, provides a comparison between the analytical solution and the numerical simulation for problems with a known solution. 6 refs

  3. HYDRA-II: A hydrothermal analysis computer code: Volume 2, User's manual

    Energy Technology Data Exchange (ETDEWEB)

    McCann, R.A.; Lowery, P.S.; Lessor, D.L.

    1987-09-01

    HYDRA-II is a hydrothermal computer code capable of three-dimensional analysis of coupled conduction, convection, and thermal radiation problems. This code is especially appropriate for simulating the steady-state performance of spent fuel storage systems. The code has been evaluated for this application for the US Department of Energy's Commercial Spent Fuel Management Program. HYDRA-II provides a finite-difference solution in cartesian coordinates to the equations governing the conservation of mass, momentum, and energy. A cylindrical coordinate system may also be used to enclose the cartesian coordinate system. This exterior coordinate system is useful for modeling cylindrical cask bodies. The difference equations for conservation of momentum incorporate directional porosities and permeabilities that are available to model solid structures whose dimensions may be smaller than the computational mesh. The equation for conservation of energy permits modeling of orthotropic physical properties and film resistances. Several automated methods are available to model radiation transfer within enclosures and from fuel rod to fuel rod. The documentation of HYDRA-II is presented in three separate volumes. Volume 1 - Equations and Numerics describes the basic differential equations, illustrates how the difference equations are formulated, and gives the solution procedures employed. This volume, Volume 2 - User's Manual, contains code flow charts, discusses the code structure, provides detailed instructions for preparing an input file, and illustrates the operation of the code by means of a sample problem. The final volume, Volume 3 - Verification/Validation Assessments, provides a comparison between the analytical solution and the numerical simulation for problems with a known solution. 6 refs.

  4. Predicting Nonauditory Adverse Radiation Effects Following Radiosurgery for Vestibular Schwannoma: A Volume and Dosimetric Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Hayhurst, Caroline; Monsalves, Eric; Bernstein, Mark; Gentili, Fred [Gamma Knife Unit, Division of Neurosurgery, University Health Network, Toronto (Canada); Heydarian, Mostafa; Tsao, May [Radiation Medicine Program, Princess Margaret Hospital, Toronto (Canada); Schwartz, Michael [Radiation Oncology Program and Division of Neurosurgery, Sunnybrook Hospital, Toronto (Canada); Prooijen, Monique van [Radiation Medicine Program, Princess Margaret Hospital, Toronto (Canada); Millar, Barbara-Ann; Menard, Cynthia [Radiation Oncology Program, Princess Margaret Hospital, Toronto (Canada); Kulkarni, Abhaya V. [Division of Neurosurgery, Hospital for Sick Children, University of Toronto (Canada); Laperriere, Norm [Radiation Oncology Program, Princess Margaret Hospital, Toronto (Canada); Zadeh, Gelareh, E-mail: Gelareh.Zadeh@uhn.on.ca [Gamma Knife Unit, Division of Neurosurgery, University Health Network, Toronto (Canada)

    2012-04-01

    Purpose: To define clinical and dosimetric predictors of nonauditory adverse radiation effects after radiosurgery for vestibular schwannoma treated with a 12 Gy prescription dose. Methods: We retrospectively reviewed our experience of vestibular schwannoma patients treated between September 2005 and December 2009. Two hundred patients were treated at a 12 Gy prescription dose; 80 had complete clinical and radiological follow-up for at least 24 months (median, 28.5 months). All treatment plans were reviewed for target volume and dosimetry characteristics; gradient index; homogeneity index, defined as the maximum dose in the treatment volume divided by the prescription dose; conformity index; brainstem dose; and trigeminal nerve dose. All adverse radiation effects (ARE) were recorded. Because the intent of our study was to focus on the nonauditory adverse effects, hearing outcome was not evaluated in this study. Results: Twenty-seven (33.8%) patients developed ARE: 5 (6%) developed hydrocephalus, 10 (12.5%) reported new ataxia, 17 (21%) developed trigeminal dysfunction, 3 (3.75%) had facial weakness, and 1 patient developed hemifacial spasm. The development of edema within the pons was significantly associated with ARE (p = 0.001). On multivariate analysis, only target volume is a significant predictor of ARE (p = 0.001). There is a target volume threshold of 5 cm3, above which ARE are more likely. The treatment plan dosimetric characteristics are not associated with ARE, although the maximum dose to the 5th nerve is a significant predictor of trigeminal dysfunction, with a threshold of 9 Gy. The overall 2-year tumor control rate was 96%. Conclusions: Target volume is the most important predictor of adverse radiation effects, and we identified the significant treatment volume threshold to be 5 cm3. We also established through our series that the maximum tolerable dose to the 5th nerve is 9 Gy.

  5. Evaluation of Cylinder Volume Estimation Methods for In–Cylinder Pressure Trace Analysis

    OpenAIRE

    Adrian Irimescu

    2012-01-01

    In-cylinder pressure trace analysis is an important investigation tool frequently employed in the study of internal combustion engines. While technical data are usually available for experimental engines, in some cases measurements are performed on automotive engines for which only the most basic geometry features are available. Therefore, several authors have aimed to determine the cylinder volume and the length of the connecting rod by methods other than direct measurement. This study …

  6. 3D photography in the objective analysis of volume augmentation including fat augmentation and dermal fillers.

    Science.gov (United States)

    Meier, Jason D; Glasgold, Robert A; Glasgold, Mark J

    2011-11-01

    The authors present quantitative and objective 3D data from their studies showing long-term results of facial volume augmentation. The first study analyzes fat grafting of the midface, and the second presents augmentation of the tear trough with hyaluronic filler. Surgeons using 3D quantitative analysis can learn the duration of results and the optimal amount to inject, as well as show patients results that are not demonstrable with standard 2D photography. PMID:22004863

  7. TADS: A CFD-based turbomachinery analysis and design system with GUI. Volume 2: User's manual

    Science.gov (United States)

    Myers, R. A.; Topp, D. A.; Delaney, R. A.

    1995-01-01

    The primary objective of this study was the development of a computational fluid dynamics (CFD) based turbomachinery airfoil analysis and design system, controlled by a graphical user interface (GUI). The computer codes resulting from this effort are referred to as the Turbomachinery Analysis and Design System (TADS). This document is intended to serve as a user's manual for the computer programs which comprise the TADS system. TADS couples a throughflow solver (ADPAC) with a quasi-3D blade-to-blade solver (RVCQ3D) in an interactive package. Throughflow analysis capability was developed in ADPAC through the addition of blade force and blockage terms to the governing equations. A GUI was developed to simplify user input and automate the many tasks required to perform turbomachinery analysis and design. The coupling of various programs was done in a way that alternative solvers or grid generators could be easily incorporated into the TADS framework.

  8. Using cognitive complexity analysis for the grading and sequencing of isiXhosa tasks in the curriculum design of a communication course for education students

    Directory of Open Access Journals (Sweden)

    Marianna Visser

    2011-09-01

    This article investigates the use of cognitive complexity analysis to inform the grading and sequencing of tasks in the curriculum design of a specific-purposes isiXhosa course for student teachers. Two frameworks of cognitive complexity, those of Skehan and Robinson, are discussed, after which two communication tasks are analysed in terms of Robinson’s framework.

  9. Analysis of Occupants’ Visual Perception to Refine Indoor Lighting Environment for Office Tasks

    Directory of Open Access Journals (Sweden)

    Ji-Hyun Lee

    2014-06-01

    The combined effects of color temperature and illuminance in a small office on visual response and mood under various lighting conditions were examined in this study. Visual annoyance tests were conducted with a sample of 20 subjects in a full-scale mock-up test space. Computer- and paper-based reading tasks were conducted at 500 lx and 750 lx illuminance levels under 3,000 K, 4,000 K, and 6,500 K conditions. Two hypotheses were considered for the test in this study. The primary hypothesis was that visual perception is affected by the color temperature of light sources. The secondary hypothesis was that better moods, such as relaxed and cozy feelings, are associated with low color temperatures at equal illuminance levels. The visual environment under the 3,000 K condition was characterized by glare and brightness, resulting in visual discomfort when target illuminance was higher than 500 lx. Occupants preferred 500 lx under the 6,500 K condition, and 500 lx and 750 lx under the 4,000 K condition, reporting better visual satisfaction when performing office tasks. Prediction models for visual comfort suggest that the less subjects are visually bothered by light during tasks, the more visual comfort they feel. User satisfaction with light source color is critical for the prediction of visual comfort under different lighting conditions. Visual comfort was the most influential factor on mood. Lower color temperature was associated with better mood at lower illuminance levels, while higher color temperature was preferred at higher illuminance levels.

  10. Human Factors Process Task Analysis Liquid Oxygen Pump Acceptance Test Procedure for the Advanced Technology Development Center

    Science.gov (United States)

    Diorio, Kimberly A.

    2002-01-01

    A process task analysis effort was undertaken by Dynacs Inc. commencing in June 2002 under contract from NASA YA-D6. Funding was provided through NASA's Ames Research Center (ARC), Code M/HQ, and Industrial Engineering and Safety (IES). The John F. Kennedy Space Center (KSC) Engineering Development Contract (EDC) Task Order was 5SMA768. The scope of the effort was to conduct a Human Factors Process Failure Modes and Effects Analysis (HF PFMEA) of a hazardous activity and provide recommendations to eliminate or reduce the effects of errors caused by human factors. The Liquid Oxygen (LOX) Pump Acceptance Test Procedure (ATP) was selected for this analysis. The HF PFMEA table (see appendix A) provides an analysis of six major categories evaluated for this study. These categories include Personnel Certification, Test Procedure Format, Test Procedure Safety Controls, Test Article Data, Instrumentation, and Voice Communication. For each specific requirement listed in appendix A, the following topics were addressed: Requirement, Potential Human Error, Performance-Shaping Factors, Potential Effects of the Error, Barriers and Controls, Risk Priority Numbers, and Recommended Actions. This report summarizes findings and gives recommendations as determined by the data contained in appendix A. It also includes a discussion of technology barriers and challenges to performing task analyses, as well as lessons learned. The HF PFMEA table in appendix A recommends the use of accepted and required safety criteria in order to reduce the risk of human error. The items with the highest risk priority numbers should receive the greatest amount of consideration. Implementation of the recommendations will result in a safer operation for all personnel.
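
    Risk Priority Numbers of the kind tabulated in appendix A are conventionally formed as the product of occurrence, severity, and detection ratings. The sketch below illustrates that ranking logic; the rating scales and the two example failure modes are hypothetical placeholders, not entries from the actual HF PFMEA table.

```python
# Minimal sketch of RPN ranking as used in an FMEA-style analysis.
# Ratings and failure modes below are hypothetical illustrations.
from dataclasses import dataclass

@dataclass
class FailureMode:
    requirement: str
    error: str
    occurrence: int  # 1 (rare) to 10 (frequent)
    severity: int    # 1 (negligible) to 10 (catastrophic)
    detection: int   # 1 (always caught) to 10 (never caught)

    @property
    def rpn(self) -> int:
        return self.occurrence * self.severity * self.detection

modes = [
    FailureMode("Voice communication", "readback omitted", 6, 7, 5),
    FailureMode("Test article data", "transcription error", 4, 5, 3),
]

# Items with the highest RPN should receive the greatest consideration.
for m in sorted(modes, key=lambda fm: fm.rpn, reverse=True):
    print(f"{m.requirement:20s} {m.error:20s} RPN={m.rpn}")
```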

  11. Human factors evaluation of remote afterloading brachytherapy: Human error and critical tasks in remote afterloading brachytherapy and approaches for improved system performance. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Callan, J.R.; Kelly, R.T.; Quinn, M.L. [Pacific Science and Engineering Group, San Diego, CA (United States)] [and others]

    1995-05-01

    Remote Afterloading Brachytherapy (RAB) is a medical process used in the treatment of cancer. RAB uses a computer-controlled device to remotely insert and remove radioactive sources close to a target (or tumor) in the body. Some RAB problems affecting the radiation dose to the patient have been reported and attributed to human error. To determine the root cause of human error in the RAB system, a human factors team visited 23 RAB treatment sites in the US. The team observed RAB treatment planning and delivery, interviewed RAB personnel, and performed walk-throughs, during which staff demonstrated the procedures and practices used in performing RAB tasks. Factors leading to human error in the RAB system were identified. The impact of those factors on the performance of RAB was then evaluated and prioritized in terms of safety significance. Finally, the project identified and evaluated alternative approaches for resolving the safety significant problems related to human error.

  12. Human factors evaluation of remote afterloading brachytherapy: Human error and critical tasks in remote afterloading brachytherapy and approaches for improved system performance. Volume 1

    International Nuclear Information System (INIS)

    Remote Afterloading Brachytherapy (RAB) is a medical process used in the treatment of cancer. RAB uses a computer-controlled device to remotely insert and remove radioactive sources close to a target (or tumor) in the body. Some RAB problems affecting the radiation dose to the patient have been reported and attributed to human error. To determine the root cause of human error in the RAB system, a human factors team visited 23 RAB treatment sites in the US. The team observed RAB treatment planning and delivery, interviewed RAB personnel, and performed walk-throughs, during which staff demonstrated the procedures and practices used in performing RAB tasks. Factors leading to human error in the RAB system were identified. The impact of those factors on the performance of RAB was then evaluated and prioritized in terms of safety significance. Finally, the project identified and evaluated alternative approaches for resolving the safety significant problems related to human error

  13. Effect of varicocelectomy on testis volume and semen parameters in adolescents: a meta-analysis

    Directory of Open Access Journals (Sweden)

    Tie Zhou

    2015-01-01

    Varicocele repair in adolescents remains controversial. Our aim is to identify and combine the results of clinical trials published thus far to ascertain the efficacy of varicocelectomy in improving testis volume and semen parameters compared with nontreatment controls. A literature search was performed using Medline, Embase and Web of Science, which included results obtained from meta-analyses, randomized and nonrandomized controlled studies. The study population was adolescents with clinically palpable varicocele, with or without testicular asymmetry or abnormal semen parameters. Cases were allocated to treatment and observation groups, and testis volume or semen parameters were adopted as outcome measures. As a result, seven randomized controlled trials (RCTs) and nonrandomized controlled trials studying bilateral testis volume or semen parameters in both treatment and observation groups were identified. Using a random effect model, the mean difference in testis volume between the treatment group and the observation group was 2.9 ml (95% confidence interval [CI]: 0.6, 5.2; P < 0.05) for the varicocele side and 1.5 ml (95% CI: 0.3, 2.7; P < 0.05) for the healthy side. The random effect model analysis demonstrated that the mean differences in semen concentration, total semen motility, and normal morphology between the two groups were 13.7 × 10⁶ ml⁻¹ (95% CI: −1.4, 28.8; P = 0.075), 2.5% (95% CI: −3.6, 8.6; P = 0.424), and 2.9% (95% CI: −3.0, 8.7; P = 0.336), respectively. In conclusion, although varicocelectomy significantly improved bilateral testis volume in adolescents with varicocele compared with observation cases, semen parameters did not show any statistically significant difference between the two groups. Well-planned, properly conducted RCTs are needed in order to further confirm the above-mentioned conclusion and to explore whether varicocele repair in adolescents could subsequently improve spontaneous pregnancy rates.
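
    The pooled estimates quoted above come from a random-effects model; a compact DerSimonian-Laird implementation is sketched below. The study-level mean differences and variances are synthetic placeholders, not the trial data from this meta-analysis.

```python
# Minimal DerSimonian-Laird random-effects pooling of per-study mean
# differences. Inputs are hypothetical, not the review's trial data.
import numpy as np

def random_effects_md(y, v):
    """y: per-study mean differences; v: their within-study variances."""
    y, v = np.asarray(y, float), np.asarray(v, float)
    w = 1.0 / v                                  # fixed-effect weights
    y_fe = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - y_fe) ** 2)              # Cochran's Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)      # between-study variance
    w_re = 1.0 / (v + tau2)                      # random-effect weights
    pooled = np.sum(w_re * y) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

md, ci = random_effects_md([2.1, 3.4, 2.8], [0.9, 1.2, 0.7])
print(f"pooled MD = {md:.2f} ml, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```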

  14. A distributional analysis of the effect of physical exercise on a choice reaction time task.

    Science.gov (United States)

    Davranche, Karen; Audiffren, Michel; Denjean, André

    2006-03-01

    The aim of this study was to examine the facilitating effects of physical exercise on the reaction process. Eleven participants with specific expertise in decision-making sports performed a choice reaction time task during moderate sub-maximal exercise (90% of their ventilatory threshold power). Participants were tested at rest and while cycling. During exercise, the participants were faster, without being more variable. We suggest that the effect of exercise on cognitive performance was due to a generalized improvement of the whole distribution of response times; although the beneficial effect was small, it was consistent throughout the entire range of reaction times. PMID:16368641

  15. Magnetic resonance velocity imaging derived pressure differential using control volume analysis

    Directory of Open Access Journals (Sweden)

    Cohen Benjamin

    2011-03-01

    Background: Diagnosis and treatment of hydrocephalus is hindered by a lack of systemic understanding of the interrelationships between pressures and flow of cerebrospinal fluid in the brain. Control volume analysis provides a fluid physics approach to quantify and relate pressure and flow information. The objective of this study was to use control volume analysis and magnetic resonance velocity imaging to non-invasively estimate pressure differentials in vitro. Method: A flow phantom was constructed, and water was the experimental fluid. The phantom was connected to a high-resolution differential pressure sensor and a computer-controlled pump producing sinusoidal flow. Magnetic resonance velocity measurements were taken and subsequently analyzed to derive pressure differential waveforms using momentum conservation principles. Independent sensor measurements were obtained for comparison. Results: Using the magnetic resonance data, the momentum balance in the phantom was computed. The measured differential pressure force had an amplitude of 14.4 dynes (pressure gradient amplitude 0.30 Pa/cm). A 12.5% normalized root mean square deviation between derived and directly measured pressure differentials was obtained. These experiments demonstrate one example of the potential utility of control volume analysis and the concepts involved in its application. Conclusions: This study validates a non-invasive measurement technique for relating velocity measurements to pressure differentials. These methods may be applied to clinical measurements to estimate pressure differentials in vivo which could not be obtained with current clinical sensors.
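
    The momentum-conservation step referred to above follows the standard integral momentum balance for a control volume; in the form below, the net pressure force on the control surface is recovered from the time-varying velocity field measured by MR (the viscous term is shown for completeness):

```latex
% Integral momentum balance over a control volume (CV) bounded by a
% control surface (CS); \vec{F}_p is the net pressure force.
\vec{F}_p + \vec{F}_{\mathrm{visc}} =
  \frac{\partial}{\partial t}\int_{CV} \rho\,\vec{v}\;dV
  \;+\; \oint_{CS} \rho\,\vec{v}\,(\vec{v}\cdot\hat{n})\;dA
```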

  16. Lobar analysis of collapsibility indices to assess functional lung volumes in COPD patients

    Directory of Open Access Journals (Sweden)

    Kitano M

    2014-12-01

    Full Text Available Mariko Kitano,1 Shingo Iwano,1 Naozumi Hashimoto,2 Keiji Matsuo,3 Yoshinori Hasegawa,2 Shinji Naganawa1 1Department of Radiology, 2Department of Respiratory Medicine, Graduate School of Medicine, Nagoya University, Nagoya, Aichi, Japan; 3Department of Radiology, Ichinomiya Municipal Hospital, Ichinomiya, Aichi, Japan Background: We investigated correlations between lung volume collapsibility indices and pulmonary function test (PFT results and assessed lobar differences in chronic obstructive pulmonary disease (COPD patients, using paired inspiratory and expiratory three dimensional (3D computed tomography (CT images. Methods: We retrospectively assessed 28 COPD patients who underwent paired inspiratory and expiratory CT and PFT exams on the same day. A computer-aided diagnostic system calculated total lobar volume and emphysematous lobar volume (ELV. Normal lobar volume (NLV was determined by subtracting ELV from total lobar volume, both for inspiratory phase (NLVI and for expiratory phase (NLVE. We also determined lobar collapsibility indices: NLV collapsibility ratio (NLVCR (% = (1 - NLVE/NLVI × 100%. Associations between lobar volumes and PFT results, and collapsibility indices and PFT results were determined by Pearson correlation analysis. Results: NLVCR values were significantly correlated with PFT results. Forced expiratory volume in 1 second, measured as percent of predicted results (FEV1%P was significantly correlated with NLVCR values for the lower lobes (P<0.01, whereas this correlation was not significant for the upper lobes (P=0.05. FEV1%P results were also moderately correlated with inspiratory, expiratory ELV (ELVI,E for the lower lobes (P<0.05. In contrast, the ratio of the diffusion capacity for carbon monoxide to alveolar gas volume, measured as percent of predicted (DLCO/VA%P results were strongly correlated with ELVI for the upper lobes (P<0.001, whereas this correlation with NLVCR values was weaker for upper lobes (P<0

  17. Fabrication, testing, and analysis of anisotropic carbon/glass hybrid composites: volume 1: technical report.

    Energy Technology Data Exchange (ETDEWEB)

    Wetzel, Kyle K. (Wetzel Engineering, Inc., Lawrence, Kansas); Hermann, Thomas M. (Wichita State University, Wichita, Kansas); Locke, James (Wichita State University, Wichita, Kansas)

    2005-11-01

    …° from the long axis for approximately two-thirds of the laminate volume (discounting skin layers), with reinforcing carbon fibers oriented axially comprising the remaining one-third of the volume. Finite element analysis of each laminate has been performed to examine first ply failure. Three failure criteria--maximum stress, maximum strain, and Tsai-Wu--have been compared. Failure predicted by all three criteria proves generally conservative, with the stress-based criteria the most conservative. For laminates that respond nonlinearly to loading, large error is observed in the prediction of failure using maximum strain as the criterion. This report documents the methods and results in two volumes. Volume 1 contains descriptions of the laminates, their fabrication and testing, the methods of analysis, the results, and the conclusions and recommendations. Volume 2 contains a comprehensive summary of the individual test results for all laminates.

  18. Correcting Working Postures in Water Pump Assembly Tasks using the OVAKO Work Analysis System (OWAS)

    Directory of Open Access Journals (Sweden)

    Atiya Kadhim Al-Zuheri

    2008-01-01

    The Ovako Working Posture Analysing System (OWAS) is a widely used method for studying awkward working postures in workplaces. This study used OWAS to analyze working postures for manual material handling of laminations at the stacking workstation of the water pump assembly line at the Electrical Industrial Company (EICO), Baghdad. A computer program, WinOWAS, was used for the study. It was found that more than 26% of the working postures observed in the real-life workstation were classified as either AC2 (slightly harmful) or AC3 (distinctly harmful). Postures that needed to be corrected soon (AC3), and the corresponding tasks, were identified. The most stressful tasks observed were grasping, handling, and positioning of the laminations by workers. The real-life workstation was then modified according to redesign suggestions for the location (positioning) factors of the stacking workstation, and the modified workstation was simulated by means of parametric CAD software. These modifications led to an improvement in the percentage of harmful postures. The use of supplementary methods is therefore recommended to identify ergonomic risk factors for handling work or other hand-intensive activities on industry sites.

  19. Mathematical tasks, study approaches, and course grades in undergraduate mathematics: a year-by-year analysis

    Science.gov (United States)

    Maciejewski, Wes; Merchant, Sandra

    2016-04-01

    Students approach learning in different ways, depending on the experienced learning situation. A deep approach is geared toward long-term retention and conceptual change while a surface approach focuses on quickly acquiring knowledge for immediate use. These approaches ultimately affect the students' academic outcomes. This study takes a cross-sectional look at the approaches to learning used by students from courses across all four years of undergraduate mathematics and analyses how these relate to the students' grades. We find that deep learning correlates with grade in the first year and not in the upper years. Surface learning has no correlation with grades in the first year and a strong negative correlation with grades in the upper years. Using Bloom's taxonomy, we argue that the nature of the tasks given to students is fundamentally different in lower and upper year courses. We find that first-year courses emphasize tasks that require only low-level cognitive processes. Upper year courses require higher level processes but, surprisingly, have a simultaneous greater emphasis on recall and understanding. These observations explain the differences in correlations between approaches to learning and course grades. We conclude with some concerns about the disconnect between first year and upper year mathematics courses and the effect this may have on students.

  20. Performance Prediction Modelling for Flexible Pavement on Low Volume Roads Using Multiple Linear Regression Analysis

    Directory of Open Access Journals (Sweden)

    C. Makendran

    2015-01-01

    Prediction models for low volume village roads in India are developed to evaluate the progression of different types of distress such as roughness, cracking, and potholes. Even though the Government of India invests huge sums of money in road construction every year, poor control over the quality of road construction and its subsequent maintenance is leading to faster road deterioration. In this regard, it is essential that scientific maintenance procedures be evolved on the basis of the performance of low volume flexible pavements. Considering the above, an attempt has been made in this research endeavor to develop prediction models to understand the progression of roughness, cracking, and potholes in flexible pavements exposed to little or no routine maintenance. Distress data were collected from low volume rural roads covering about 173 stretches spread across Tamil Nadu state in India. Based on the collected data, distress prediction models have been developed using multiple linear regression analysis. Further, the models have been validated using independent field data. It can be concluded that the models developed in this study can serve as useful tools for practicing engineers maintaining flexible pavements on low volume roads.
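
    As a sketch of the modelling approach, an ordinary least-squares fit of a distress measure against a few candidate predictors can be set up as below; the predictors, units, and data are hypothetical stand-ins for the paper's actual variables.

```python
# Minimal multiple linear regression sketch; data are illustrative.
import numpy as np

# Columns: pavement age (years), cumulative traffic, annual rainfall (mm)
X = np.array([[2, 120, 800],
              [4, 150, 950],
              [6, 180, 700],
              [8, 210, 1020],
              [10, 260, 880]], dtype=float)
y = np.array([2.9, 3.6, 4.2, 5.1, 6.0])  # observed roughness, IRI (m/km)

A = np.column_stack([np.ones(len(X)), X])      # add intercept column
coef, *_ = np.linalg.lstsq(A, y, rcond=None)   # OLS fit
print("intercept and coefficients:", np.round(coef, 4))
print("fitted roughness values:", np.round(A @ coef, 2))
```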

  1. Cortical networks for rotational uncertainty effect in mental rotation task by partial directed coherence analysis of EEG.

    Science.gov (United States)

    Yan, Jing; Guo, Xiaoli; Sun, Junfeng; Tong, Shanbao

    2011-01-01

    Partial directed coherence (PDC), as a frequency-domain representation of Granger causality (GC), can detect both the strength and the direction of cortical interactions through a multivariate autoregressive (MVAR) model of electroencephalography (EEG). In the present study, we investigated the neural network mechanisms underlying the "rotational uncertainty effect" during a mental rotation (MR) task by PDC analysis of multichannel EEG signals recorded before and after visual stimulus presentation. We found that (i) temporally, the "rotational uncertainty effect" involved a network activated before the visual stimuli were presented, which could also affect the later cognitive process of MR; and (ii) the causal functional connectivity network indicated that bi-directional frontal ↔ parietal networks played critical roles in maintaining readiness during the MR task. These findings suggest that functional networks of un-cued preparation before visual stimulus presentation deserve more attention, and that these networks provide crucial causality information for understanding the neural mechanism of the "rotational uncertainty effect" in MR tasks. PMID:22254583
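
    For readers unfamiliar with PDC, the sketch below shows the standard Baccala-Sameshima construction from a least-squares MVAR fit. The model order, channel count, and random data are placeholders, not the study's EEG pipeline.

```python
# Rough sketch of partial directed coherence (PDC) from an MVAR model
# fitted by least squares; inputs here are random placeholders.
import numpy as np

def pdc(x, order=3, nfreq=64):
    """x: (channels, samples). Returns a PDC array (nfreq, ch, ch)."""
    ch, n = x.shape
    # Lagged regression:  x[:, t] = sum_r A_r x[:, t-r] + noise
    Y = x[:, order:].T                                    # (n-order, ch)
    Z = np.hstack([x[:, order - r:n - r].T for r in range(1, order + 1)])
    coefs, *_ = np.linalg.lstsq(Z, Y, rcond=None)         # (order*ch, ch)
    A = coefs.T.reshape(ch, order, ch).transpose(1, 0, 2) # (order, ch, ch)
    out = np.empty((nfreq, ch, ch))
    for k, f in enumerate(np.linspace(0.0, 0.5, nfreq)):
        Af = np.eye(ch, dtype=complex)
        for r in range(order):
            Af -= A[r] * np.exp(-2j * np.pi * f * (r + 1))
        # Column-wise normalization: PDC_ij = |Af_ij| / ||Af_:j||
        out[k] = np.abs(Af) / np.sqrt((np.abs(Af) ** 2).sum(axis=0))
    return out

rng = np.random.default_rng(0)
print(pdc(rng.standard_normal((4, 1000))).shape)   # (64, 4, 4)
```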

  2. Pressurized fluidized-bed hydroretorting of eastern oil shales. Volume 2, Task 3, Testing of process improvement concepts: Final report, September 1987--May 1991

    Energy Technology Data Exchange (ETDEWEB)

    1992-03-01

    This final report, Volume 2, on "Process Improvement Concepts" presents the results of work conducted by the Institute of Gas Technology (IGT), the Illinois Institute of Technology (IIT), and the Ohio State University (OSU) to develop three novel approaches for desulfurization that have shown good potential with coal and could be cost-effective for oil shales. These are (1) In-Bed Sulfur Capture using different sorbents (IGT), (2) Electrostatic Desulfurization (IIT), and (3) Microbial Desulfurization and Denitrification (OSU and IGT). Results of work on electroseparation of shale oil and fines conducted by IIT are included in this report, as well as work conducted by IGT to evaluate the restricted pipe discharge system. The work was conducted as part of the overall program on "Pressurized Fluidized-Bed Hydroretorting of Eastern Oil Shales."

  3. Multifidus Muscle Volume Estimation Based on Three-Dimensional Wavelet Multi-Resolution Analysis (MRA) with Buttocks Computer Tomography (CT) Images

    OpenAIRE

    Kohei Arai

    2013-01-01

    A Multi-Resolution Analysis (MRA) based edge detection algorithm is proposed for estimating the volume of the multifidus muscle in Computer Tomography (CT) scanned images. The volume of the multifidus muscle would be a good measure for metabolic syndrome, rather than internal fat, from the point of view of processing complexity. The proposed measure shows an R-squared of 0.178 for the mutual correlation between internal fat and multifidus muscle volume. It is also found that the R-squared betwe...

  4. Multi-Task Learning for Food Identification and Analysis with Deep Convolutional Neural Networks

    Institute of Scientific and Technical Information of China (English)

    Xi-Jin Zhang; Yi-Fan Lu; Song-Hai Zhang

    2016-01-01

    In this paper, we propose a multi-task system that can identify dish types, food ingredients, and cooking methods from food images with deep convolutional neural networks. We built a dataset of 360 classes of different foods with at least 500 images for each class. To reduce noise in the data, which was collected from the Internet, outlier images were detected and eliminated through a one-class SVM trained with deep convolutional features. We simultaneously trained a dish identifier, a cooking method recognizer, and a multi-label ingredient detector. They share a few low-level layers in the deep network architecture. The proposed framework shows higher accuracy than traditional methods with handcrafted features, and the cooking method recognizer and ingredient detector can be applied to dishes that are not included in the training dataset to provide reference information for users.
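
    A minimal shared-trunk, multi-head architecture in the spirit described (shared low-level layers feeding three task heads) can be sketched in PyTorch as below; layer sizes and class counts are illustrative only. In training, the two single-label heads would take cross-entropy losses and the ingredient head a binary cross-entropy loss, summed into one objective.

```python
# Sketch of a shared-trunk multi-task network; sizes are illustrative.
import torch
import torch.nn as nn

class FoodMultiTaskNet(nn.Module):
    def __init__(self, n_dishes=360, n_methods=10, n_ingredients=200):
        super().__init__()
        self.trunk = nn.Sequential(            # shared low-level layers
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.dish_head = nn.Linear(64, n_dishes)             # single-label
        self.method_head = nn.Linear(64, n_methods)          # single-label
        self.ingredient_head = nn.Linear(64, n_ingredients)  # multi-label

    def forward(self, x):
        h = self.trunk(x)
        return self.dish_head(h), self.method_head(h), self.ingredient_head(h)

net = FoodMultiTaskNet()
dish, method, ingred = net(torch.randn(2, 3, 224, 224))
print(dish.shape, method.shape, ingred.shape)
```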

  5. Final safety analysis report for the Galileo Mission: Volume 1, Reference design document

    Energy Technology Data Exchange (ETDEWEB)

    1988-05-01

    The Galileo mission uses nuclear power sources called Radioisotope Thermoelectric Generators (RTGs) to provide the spacecraft's primary electrical power. Because these generators contain nuclear material, a Safety Analysis Report (SAR) is required. A preliminary SAR and an updated SAR were previously issued that provided an evolving status report on the safety analysis. As a result of the Challenger accident, the launch dates for both the Galileo and Ulysses missions were later rescheduled for November 1989 and October 1990, respectively. The decision was made by agreement between DOE and NASA to have a revised safety evaluation and report (FSAR) prepared on the basis of these revised vehicle accidents and environments. The results of this latest revised safety evaluation are presented in this document (Galileo FSAR). Volume I, this document, provides the background design information required to understand the analyses presented in Volumes II and III. It contains descriptions of the RTGs, the Galileo spacecraft, the Space Shuttle, the Inertial Upper Stage (IUS), the trajectory and flight characteristics including flight contingency modes, and the launch site. There are two appendices in Volume I which provide detailed material properties for the RTG.

  6. Fast implementation of kernel simplex volume analysis based on modified Cholesky factorization for endmember extraction

    Institute of Scientific and Technical Information of China (English)

    Jing LI; Xiao-run LI; Li-jiao WANG; Liao-ying ZHAO

    2016-01-01

    Endmember extraction is a key step in the hyperspectral image analysis process. The kernel new simplex growing algorithm (KNSGA), recently developed as a nonlinear alternative to the simplex growing algorithm (SGA), has proven a promising endmember extraction technique. However, KNSGA still suffers from two issues limiting its application. First, its random initialization leads to inconsistency in final results; second, excessive computation is caused by the iterations of a simplex volume calculation. To solve the first issue, the spatial pixel purity index (SPPI) method is used in this study to extract the first endmember, eliminating the initialization dependence. A novel approach tackles the second issue by initially using a modified Cholesky factorization to decompose the volume matrix into triangular matrices, in order to avoid directly computing the determinant in the simplex volume formula. Theoretical analysis and experiments on both simulated and real spectral data demonstrate that the proposed algorithm significantly reduces computational complexity, and runs faster than the original algorithm.
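
    The determinant-free idea can be illustrated generically: factor the Gram matrix of the simplex edge vectors with a Cholesky decomposition and read the volume off the diagonal. This is a plain illustration of the principle, not the paper's modified kernel algorithm.

```python
# Simplex volume via Cholesky of the Gram matrix: sqrt(det G) equals
# the product of the Cholesky diagonal, so no det() call is needed.
import numpy as np
from math import factorial

def simplex_volume(vertices):
    """vertices: (p+1, d) array of simplex vertices, p <= d."""
    e = vertices[1:] - vertices[0]        # edge vectors, shape (p, d)
    g = e @ e.T                           # Gram matrix, SPD
    L = np.linalg.cholesky(g)
    return np.prod(np.diag(L)) / factorial(len(e))

tetra = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], float)
print(simplex_volume(tetra))   # unit tetrahedron: 1/6 ~ 0.1667
```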

  7. Economic analysis of a volume reduction/polyethylene solidification system for low-level radioactive wastes

    International Nuclear Information System (INIS)

    A study was conducted at Brookhaven National Laboratory to determine the economic feasibility of a fluidized bed volume reduction/polyethylene solidification system for low-level radioactive wastes. These results are compared with the "null" alternative of no volume reduction and solidification of aqueous waste streams in hydraulic cement. The economic analysis employed a levelized revenue requirement (LRR) technique conducted over a ten-year period. An interactive computer program was written to conduct the LRR calculations. Both of the treatment/solidification options were considered for a number of scenarios, including type of plant (BWR or PWR) and transportation distance to the disposal site. If current trends in the escalation rates of cost components continue, the volume reduction/polyethylene solidification option will be cost-effective for both BWRs and PWRs. Data indicate that a minimum net annual savings of $0.8 million per year (for a PWR shipping its waste 750 miles) and a maximum net annual savings of $9 million per year (for a BWR shipping its waste 2500 miles) can be achieved. A sensitivity analysis was performed for the burial cost escalation rate, which indicated that variation of this factor will impact the total levelized revenue requirement. The burial cost escalation rate which yields a break-even condition was determined for each scenario considered. 11 refs., 8 figs., 39 tabs

  8. Systems Studies Department FY 78 activity report. Volume 2. Systems analysis. [Sandia Laboratories, Livermore

    Energy Technology Data Exchange (ETDEWEB)

    Gold, T.S.

    1979-02-01

    The Systems Studies Department at Sandia Laboratories Livermore (SLL) has two primary responsibilities: to provide computational and mathematical services and to perform systems analysis studies. This document (Volume 2) describes the FY 78 systems analysis highlights. The description is an unclassified overview of activities and is not complete or exhaustive. The objective of the systems analysis activities is to evaluate the relative value of alternative concepts and systems. SLL systems analysis activities reflect Sandia Laboratory programs and in 1978 consisted of study efforts in three areas: national security: evaluations of strategic, theater, and navy nuclear weapons issues; energy technology: particularly in support of Sandia's solar thermal programs; and nuclear fuel cycle physical security: a special project conducted for the Nuclear Regulatory Commission. Highlights of these activities are described in the following sections. 7 figures. (RWR)

  9. Evaluation of Cylinder Volume Estimation Methods for In–Cylinder Pressure Trace Analysis

    Directory of Open Access Journals (Sweden)

    Adrian Irimescu

    2012-09-01

    In-cylinder pressure trace analysis is an important investigation tool frequently employed in the study of internal combustion engines. While technical data are usually available for experimental engines, in some cases measurements are performed on automotive engines for which only the most basic geometry features are available. Therefore, several authors have aimed to determine the cylinder volume and the length of the connecting rod by methods other than direct measurement. This study performs an evaluation of two such methods. The most appropriate way was found to be the estimation of connecting rod length based on general engine category, as opposed to the use of an equation that predicts cylinder volume with good accuracy around top dead centre for most geometries.
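
    For reference, the slider-crank relation that such estimation methods feed into is sketched below; once bore, stroke, compression ratio, and an estimated connecting-rod length are fixed, cylinder volume follows from crank angle. The geometry values are illustrative.

```python
# Standard slider-crank cylinder volume as a function of crank angle.
import math

def cylinder_volume(theta_deg, bore, stroke, conrod, comp_ratio):
    """All lengths in metres; theta in crank degrees after TDC."""
    a = stroke / 2.0                          # crank radius
    vd = math.pi * bore ** 2 / 4.0 * stroke   # displaced volume
    vc = vd / (comp_ratio - 1.0)              # clearance volume
    th = math.radians(theta_deg)
    # piston pin position relative to crank axis
    x = a * math.cos(th) + math.sqrt(conrod ** 2 - (a * math.sin(th)) ** 2)
    return vc + math.pi * bore ** 2 / 4.0 * (conrod + a - x)

# 86 mm bore and stroke, estimated 143 mm rod, 10:1 compression ratio
print(f"{cylinder_volume(90.0, 0.086, 0.086, 0.143, 10.0) * 1e6:.1f} cm^3")
```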

  10. Pressure-volume analysis of the lung with an exponential and linear-exponential model in asthma and COPD

    NARCIS (Netherlands)

    BOGAARD, JM; OVERBEEK, SE; VERBRAAK, AFM; VONS, C; FOLGERING, HTM; VANDERMARK, TW; ROOS, CM; STERK, PJ

    1995-01-01

    The prevalence of abnormalities in lung elasticity in patients with asthma or chronic obstructive pulmonary disease (COPD) is still unclear. This might be due to uncertainties concerning the method of analysis of quasistatic deflation lung pressure-volume curves. Pressure-volume curves were obtained
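
    One common choice for the exponential model named in the title is the Salazar-Knowles form V(P) = A − B·exp(−kP); a minimal fitting sketch on synthetic deflation data is shown below (the linear-exponential variant adds a linear term). The data points are placeholders, not patient measurements.

```python
# Fit of the exponential lung pressure-volume model to synthetic data.
import numpy as np
from scipy.optimize import curve_fit

def pv_exponential(p, a, b, k):
    return a - b * np.exp(-k * p)

p = np.array([2.0, 5.0, 10.0, 15.0, 20.0, 25.0, 30.0])   # cmH2O
v = np.array([2.1, 3.0, 4.2, 4.9, 5.3, 5.5, 5.6])        # litres

(a, b, k), _ = curve_fit(pv_exponential, p, v, p0=(6.0, 5.0, 0.1))
print(f"A = {a:.2f} L, B = {b:.2f} L, k = {k:.3f} /cmH2O")
```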

  11. Crucial role of detailed function, task, timeline, link and human vulnerability analyses in HRA. [Human Reliability Analysis (HRA)

    Energy Technology Data Exchange (ETDEWEB)

    Ryan, T.G.; Haney, L.N.; Ostrom, L.T.

    1992-01-01

    This paper addresses one major cause of large uncertainties in human reliability analysis (HRA) results: the absence of detailed function, task, timeline, link and human vulnerability analyses. All too often this crucial step in the HRA process is done in a cursory fashion, using word of mouth or written procedures which may themselves incompletely or inaccurately represent the human action sequences and human error vulnerabilities being analyzed. The paper examines the potential contributions these detailed analyses can make in achieving quantitative and qualitative HRA results which are: (1) credible, that is, minimizing uncertainty; (2) auditable, that is, systematically linking quantitative results and the qualitative information from which the results are derived; (3) capable of supporting root cause analyses on human reliability factors determined to be major contributors to risk; and (4) capable of repeated measures and of being combined with similar results from other analyses to examine HRA issues transcending individual systems and facilities. Based on experience analyzing test and commercial nuclear reactors, and medical applications of nuclear technology, an iterative process is suggested for performing detailed function, task, timeline, link and human vulnerability analyses using documentation reviews, open-ended and structured interviews, direct observations, and group techniques. Finally, the paper concludes that detailed analyses done in this manner by knowledgeable human factors practitioners can contribute significantly to the credibility, auditability, causal factor analysis, and combining goals of the HRA.

  12. Man-machine Integration Design and Analysis System (MIDAS) Task Loading Model (TLM) experimental and software detailed design report

    Science.gov (United States)

    Staveland, Lowell

    1994-01-01

    This is the experimental and software detailed design report for the prototype task loading model (TLM) developed as part of the man-machine integration design and analysis system (MIDAS), as implemented and tested in phase 6 of the Army-NASA Aircrew/Aircraft Integration (A3I) Program. The A3I program is an exploratory development effort to advance the capabilities and use of computational representations of human performance and behavior in the design, synthesis, and analysis of manned systems. The MIDAS TLM computationally models the demands designs impose on operators to aid engineers in the conceptual design of aircraft crewstations. This report describes the TLM and the results of a series of experiments which were run during this phase to test its capabilities as a predictive task demand modeling tool. Specifically, it includes discussions of: the inputs and outputs of the TLM, the theories underlying it, the results of the test experiments, the use of the TLM as both a stand-alone tool and part of a complete human operator simulation, and a brief introduction to the TLM software design.

  13. Georgetown University Integrated Community Energy System (GU-ICES). Phase III, Stage I. Feasibility analysis. Final report. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    None

    1980-10-01

    This Feasibility Analysis covers a wide range of studies and evaluations. The Report is divided into five parts. Section 1 contains all material relating to the Institutional Assessment, including consideration of the requirements and position of the Potomac Electric Co. as they relate to cogeneration at Georgetown in parallel with the utility (Task 1). Sections 2 through 7 contain all technical information relating to the Alternative Subsystems Analysis (Task 4). This includes the energy demand profiles upon which the evaluations were based (Task 3). It further includes the results of the Life-Cycle-Cost Analyses (Task 5), which are developed in detail in the Appendix for evaluation in the Technical Report. Also included is the material relating to Incremental Savings and Optimization (Task 6) and the Conceptual Design for candidate alternate subsystems (Task 7). Section 8 contains all material relating to the Environmental Impact Assessment (Task 2). The Appendix contains supplementary material, including the budget cost estimates used in the life-cycle-cost analyses, the basic assumptions upon which the life-cycle analyses were developed, and the detailed life-cycle-cost analysis for each subsystem considered in detail.

  14. Engineering Task Plan for Development, Fabrication, and Deployment of a Mobile, Variable Depth Sampling and At-Tank Analysis System

    International Nuclear Information System (INIS)

    This engineering task plan identifies the resources, responsibilities, and schedules for the development and deployment of a mobile, variable depth sampling system and an at-tank analysis system. The mobile, variable depth sampling system concept was developed after a cost assessment indicated a high cost for multiple deployments of the nested, fixed-depth sampling system. The sampling will provide double-shell tank (DST) staging tank waste samples for assuring the readiness of the waste for shipment to the LAW/HLW plant for treatment and immobilization. The at-tank analysis system will provide "real-time" assessments of the samples' chemical and physical properties. These systems support the Hanford Phase 1B vitrification project

  15. Update of Part 61 Impacts Analysis Methodology. Methodology report. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Oztunali, O.I.; Roles, G.W.

    1986-01-01

    Under contract to the US Nuclear Regulatory Commission, the Envirosphere Company has expanded and updated the impacts analysis methodology used during the development of the 10 CFR Part 61 rule to allow improved consideration of the costs and impacts of treatment and disposal of low-level waste that is close to or exceeds Class C concentrations. The modifications described in this report principally include: (1) an update of the low-level radioactive waste source term, (2) consideration of additional alternative disposal technologies, (3) expansion of the methodology used to calculate disposal costs, (4) consideration of an additional exposure pathway involving direct human contact with disposed waste due to a hypothetical drilling scenario, and (5) use of updated health physics analysis procedures (ICRP-30). Volume 1 of this report describes the calculational algorithms of the updated analysis methodology.

  16. Update of Part 61 Impacts Analysis Methodology. Methodology report. Volume 1

    International Nuclear Information System (INIS)

    Under contract to the US Nuclear Regulatory Commission, the Envirosphere Company has expanded and updated the impacts analysis methodology used during the development of the 10 CFR Part 61 rule to allow improved consideration of the costs and impacts of treatment and disposal of low-level waste that is close to or exceeds Class C concentrations. The modifications described in this report principally include: (1) an update of the low-level radioactive waste source term, (2) consideration of additional alternative disposal technologies, (3) expansion of the methodology used to calculate disposal costs, (4) consideration of an additional exposure pathway involving direct human contact with disposed waste due to a hypothetical drilling scenario, and (5) use of updated health physics analysis procedures (ICRP-30). Volume 1 of this report describes the calculational algorithms of the updated analysis methodology

  17. Flight Technical Error Analysis of the SATS Higher Volume Operations Simulation and Flight Experiments

    Science.gov (United States)

    Williams, Daniel M.; Consiglio, Maria C.; Murdoch, Jennifer L.; Adams, Catherine H.

    2005-01-01

    This paper provides an analysis of Flight Technical Error (FTE) from recent SATS experiments, called the Higher Volume Operations (HVO) Simulation and Flight experiments, which NASA conducted to determine pilot acceptability of the HVO concept for normal operating conditions. Reported are FTE results from simulation and flight experiment data indicating the SATS HVO concept is viable and acceptable to low-time instrument-rated pilots when compared with today's system (baseline). Described is the comparative FTE analysis of lateral, vertical, and airspeed deviations from the baseline and SATS HVO experimental flight procedures. Based on the FTE analysis, all evaluation subjects, low-time instrument-rated pilots, flew the HVO procedures safely and proficiently in comparison to today's system. In all cases, the results of the flight experiment validated the results of the simulation experiment and confirm the utility of the simulation platform for comparative Human in the Loop (HITL) studies of SATS HVO and baseline operations.

  18. Fatigue monitoring and analysis of orthotropic steel deck considering traffic volume and ambient temperature

    Institute of Scientific and Technical Information of China (English)

    SONG YongSheng; DING YouLiang

    2013-01-01

    Fatigue has gradually become a serious issue for the orthotropic steel decks used in long-span bridges. Two fatigue effects, namely the number of stress cycles and the equivalent stress amplitude, were introduced as the investigated parameters in this paper. The investigation focused on their relationships with traffic volume and ambient temperature, using seven months of fatigue monitoring data from an actual bridge. A fatigue analytical model considering temperature-induced changes in the material properties of the asphalt pavement was established to verify these relationships. The analysis results revealed that the number of stress cycles and the equivalent stress amplitude showed a linear correlation with the traffic volume and the ambient temperature, respectively, and that the rib-to-deck welded joint was much more sensitive to traffic volume and ambient temperature than the rib-to-rib welded joint. The applicability of the code-recommended model for fatigue vehicle loading was also discussed, which revealed that the deterministic vehicle loading model requires improvement to account for the significant randomness of actual traffic conditions.
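
    The two fatigue effects named above can be computed directly from a counted stress-range histogram; the sketch below uses the Miner-consistent equivalent amplitude with the usual slope m = 3 for welded steel details, on illustrative numbers rather than the bridge's monitoring data.

```python
# Number of cycles and Miner-equivalent stress range from a counted
# stress-range histogram; bin values and counts are illustrative.
ranges = [10.0, 20.0, 30.0, 40.0]      # stress range bins, MPa
counts = [5000, 1200, 300, 40]         # cycles counted per bin
m = 3.0                                # typical S-N slope for welds

n_total = sum(counts)
eq_amp = (sum(n * s ** m for n, s in zip(counts, ranges)) / n_total) ** (1 / m)
print(f"cycles = {n_total}, equivalent stress range = {eq_amp:.1f} MPa")
```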

  19. Fuzzy logic approach to SWOT analysis for economics tasks and example of its computer realization

    OpenAIRE

    Chernov, Vladimir; Oleksandr DOROKHOV; Liudmyla DOROKHOVA

    2016-01-01

    The article discusses the widely used classic method of analysis, forecasting and decision-making for various economic problems, called SWOT analysis. As is known, it is a qualitative multicriteria comparison of the degrees of Strength, Weakness, Opportunity, and Threat for different kinds of risks, for forecasting development in markets, and for assessing the status and development prospects of enterprises, regions, economic sectors, territories, etc. It can also be successfully applied to the eva...

  20. Study of the free volume fraction in polylactic acid (PLA) by thermal analysis

    Science.gov (United States)

    Abdallah, A.; Benrekaa, N.

    2015-10-01

    Poly(lactic acid), or polylactide (PLA), is a biodegradable polymer with high modulus, strength and thermoplastic properties. In this work, the evolution of various properties of PLA, such as the glass transition temperature, mechanical moduli and elongation percentage, is studied with the aim of investigating the free volume fraction. To do so, two thermal techniques have been used: dynamic mechanical analysis (DMA) and dilatometry. The results obtained by these techniques are combined to recover the structural properties of the studied material.

  1. Composite materials. Volume 3 - Engineering applications of composites. Volume 4 - Metallic matrix composites. Volume 8 - Structural design and analysis, Part 2

    Science.gov (United States)

    Noton, B. R. (Editor); Kreider, K. G.; Chamis, C. C.

    1974-01-01

    This volume discusses a variety of applications of both low- and high-cost composite materials in a number of selected engineering fields. The text stresses the use of fiber-reinforced composites, along with interesting material systems used in the electrical and nuclear industries. As to technology transfer, a similarity is noted between many of the reasons responsible for the utilization of composites and those problems requiring urgent solution, such as mechanized fabrication processes and design for production. Featured topics include road transportation, rail transportation, civil aircraft, space vehicles, the building industry, chemical plants, and appliances and equipment. The laminate orientation code devised by the Air Force Materials Laboratory is included. Individual items are announced in this issue.

  2. The report of Task Group 100 of the AAPM: Application of risk analysis methods to radiation therapy quality management.

    Science.gov (United States)

    Huq, M Saiful; Fraass, Benedick A; Dunscombe, Peter B; Gibbons, John P; Ibbott, Geoffrey S; Mundt, Arno J; Mutic, Sasa; Palta, Jatinder R; Rath, Frank; Thomadsen, Bruce R; Williamson, Jeffrey F; Yorke, Ellen D

    2016-07-01

    The increasing complexity of modern radiation therapy planning and delivery challenges traditional prescriptive quality management (QM) methods, such as many of those included in guidelines published by organizations such as the AAPM, ASTRO, ACR, ESTRO, and IAEA. These prescriptive guidelines have traditionally focused on monitoring all aspects of the functional performance of radiotherapy (RT) equipment by comparing parameters against tolerances set at strict but achievable values. Many errors that occur in radiation oncology are not due to failures in devices and software; rather they are failures in workflow and process. A systematic understanding of the likelihood and clinical impact of possible failures throughout a course of radiotherapy is needed to direct limited QM resources efficiently to produce maximum safety and quality of patient care. Task Group 100 of the AAPM has taken a broad view of these issues and has developed a framework for designing QM activities, based on estimates of the probability of identified failures and their clinical outcome through the RT planning and delivery process. The Task Group has chosen a specific radiotherapy process required for "intensity modulated radiation therapy (IMRT)" as a case study. The goal of this work is to apply modern risk-based analysis techniques to this complex RT process in order to demonstrate to the RT community that such techniques may help identify more effective and efficient ways to enhance the safety and quality of our treatment processes. The task group generated by consensus an example quality management program strategy for the IMRT process performed at the institution of one of the authors. This report describes the methodology and nomenclature developed, presents the process maps, FMEAs, fault trees, and QM programs developed, and makes suggestions on how this information could be used in the clinic. The development and implementation of risk-assessment techniques will make radiation therapy

  3. Effects of Physical Exercise Interventions on Gait-Related Dual-Task Interference in Older Adults: A Systematic Review and Meta-Analysis.

    Science.gov (United States)

    Plummer, Prudence; Zukowski, Lisa A; Giuliani, Carol; Hall, Amber M; Zurakowski, David

    2015-01-01

    Dual-task interference during walking can substantially limit mobility and increase the risk of falls among community-dwelling older adults. Previous systematic reviews examining intervention effects on dual-task gait and mobility have not assessed relative dual-task costs (DTC) or investigated whether there are differences in treatment-related changes based on the type of dual task or the type of control group. The purpose of this systematic review was to examine the effects of physical exercise interventions on dual-task performance during walking in older adults. A meta-analysis of randomized controlled trials (RCTs) compared treatment effects between physical exercise intervention and control groups on single- and dual-task gait speed and on relative DTC on gait speed. A systematic search of the literature was conducted using the electronic databases PubMed, CINAHL, EMBASE, Web of Science, and PsycINFO, searched up to September 19, 2014. Randomized, nonrandomized, and uncontrolled studies published in English and involving older adults were selected. Studies had to include a physical exercise intervention protocol and measure gait parameters during continuous, unobstructed walking in single- and dual-task conditions before and after the intervention. Of 614 abstracts, 21 studies met the inclusion criteria and were included in the systematic review. Fourteen RCTs were included in the meta-analysis. The mean difference between the intervention and control groups significantly favored the intervention for single-task gait speed (mean difference: 0.06 m/s, 95% CI: 0.03, 0.10, p < 0.05), dual-task gait speed (mean difference: 0.11 m/s, 95% CI: 0.07, 0.15, p < 0.05), and relative DTC on gait speed (mean difference: 5.23%, 95% CI: 1.40, 9.05, p = 0.007). Evidence from subgroup comparisons showed no difference in treatment-related changes between cognitive-motor and motor-motor dual tasks, or when interventions were compared to active or inactive controls. In summary, physical exercise interventions can improve dual-task
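
    The relative dual-task cost (DTC) analyzed above is conventionally the percent decrement from single-task performance; the gait speeds in the sketch below are illustrative.

```python
# Relative dual-task cost: percent decrement from single-task speed.
def dtc_percent(single, dual):
    return (single - dual) / single * 100.0

print(f"DTC = {dtc_percent(1.20, 1.02):.1f}%")   # 15.0% cost
```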

  4. Final Report: LSAC Skills Analysis. Law School Task Survey. LSAC Research Report Series.

    Science.gov (United States)

    Luebke, Stephen W.; Swygert, Kimberly A.; McLeod, Lori D.; Dalessandro, Susan P.; Roussos, Louis A.

    The Law School Admission Council (LSAC) Skills Analysis Survey identifies the skills that are important for success in law school. This information provides validity evidence for the current Law School Admission Test (LSAT) and guides the development of new test items and test specifications. The key question of the survey is "what academic tasks…

  5. A Meta-Analysis of the Wisconsin Card Sort Task in Autism

    Science.gov (United States)

    Landry, Oriane; Al-Taie, Shems

    2016-01-01

    We conducted a meta-analysis of 31 studies, spanning 30 years, utilizing the WCST in participants with autism. We calculated Cohen's d effect sizes for four measures of performance: sets completed, perseveration, failure-to-maintain-set, and non-perseverative errors. The average weighted effect size ranged from 0.30 to 0.74 for each measure, all…

  6. Cognitive Task Analysis of Experts in Designing Multimedia Learning Object Guideline (M-LOG)

    Science.gov (United States)

    Razak, Rafiza Abdul; Palanisamy, Punithavathy

    2013-01-01

    The purpose of this study was to design and develop a set of guidelines for multimedia learning objects to inform instructional designers (IDs) about the procedures involved in the process of content analysis. This study was motivated by the absence of standardized procedures in the beginning phase of the multimedia learning object design which is…

  7. Review of licensed staff training plans: Benchmarking on task analysis and selection of learning environments

    International Nuclear Information System (INIS)

    The purpose of this paper is to present the findings and possible improvement actions identified after a technical exchange with the U.S. Surry nuclear power plant. The visit focused on the study of the methodology for the analysis and design of training programs according to INPO standards.

  8. Analysis of volume expansion data for periclase, lime, corundum and spinel at high temperatures

    Indian Academy of Sciences (India)

    B P Singh; H Chandra; R Shyam; A Singh

    2012-08-01

    We have presented an analysis of the volume expansion data for periclase (MgO), lime (CaO), corundum (Al2O3) and spinel (MgAl2O4) determined experimentally by Fiquet et al (1999) from 300 K up to 3000 K. The thermal equations of state due to Suzuki et al (1979) and Shanker et al (1997) are used to study the relationships between thermal pressure and volume expansion for the entire range of temperatures, starting from room temperature up to the melting temperatures of the solids under study. Comparison of the results obtained in the present study with the corresponding experimental data reveals that the thermal pressure changes with temperature almost linearly up to quite high temperatures. At extremely high temperatures close to the melting temperatures, the thermal pressure deviates significantly from linearity. This prediction is consistent with other recent investigations. A quantitative analysis based on the theory of anharmonic effects has been presented to account for the nonlinear variation of the thermal pressure at high temperatures.
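
    The near-linear behaviour reported here is what the standard thermal-pressure relation predicts when the product of the thermal expansivity and the isothermal bulk modulus is roughly constant at high temperature:

```latex
% Thermal pressure from volume expansion data: with alpha*K_T roughly
% constant above the Debye temperature, Delta P_th grows linearly in T
% until anharmonic effects intervene near melting.
\Delta P_{th} = \int_{T_0}^{T} \alpha(T')\,K_T(T')\,dT'
  \;\approx\; \alpha K_T\,(T - T_0)
```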

  9. Network analysis of returns and volume trading in stock markets: The Euro Stoxx case

    Science.gov (United States)

    Brida, Juan Gabriel; Matesanz, David; Seijas, Maria Nela

    2016-02-01

    This study applies network analysis to characterize the structure of the Euro Stoxx market over the long period from 2002 to 2014. The paper generalizes previous research on stock market networks by including asset returns and trading volume as the main variables used to study the financial market. A multidimensional generalization of the minimal spanning tree (MST) concept is introduced by adding the role of trading volume to the traditional approach, which only includes price returns. Additionally, we apply symbolization methods to the raw data to study the behavior of the market structure in different, normal and critical, situations. The hierarchical organization of the network is derived, and the MST for different sub-periods of 2002-2014 is created to illustrate how the structure of the market evolves over time. From the structural topologies of these trees, different clusters of companies are identified and analyzed according to their geographical and economic links. Two important results are achieved. Firstly, as other studies have highlighted, at the time of the financial crisis after 2008 the network became more centralized. Secondly, and most importantly, during the second period of analysis, 2008-2014, we observe that the hierarchy becomes more country-specific, with different sub-clusters of stocks belonging to France, Germany, Spain or Italy found apart from their business sector groups. This result may suggest that during this period financial investors were worried most about country-specific economic circumstances.
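
    A typical MST construction of this kind converts the return-correlation matrix into the Mantegna distance d_ij = sqrt(2(1 − ρ_ij)) before extracting the tree; the sketch below uses random returns rather than Euro Stoxx data and omits the paper's trading-volume extension.

```python
# MST of a stock network from the correlation-based Mantegna distance.
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree

rng = np.random.default_rng(1)
returns = rng.standard_normal((250, 6))        # 250 days, 6 stocks
rho = np.corrcoef(returns, rowvar=False)
dist = np.sqrt(2.0 * (1.0 - rho))              # d_ij = sqrt(2(1 - rho))

mst = minimum_spanning_tree(dist).toarray()
edges = [(i, j, round(mst[i, j], 3))
         for i in range(6) for j in range(6) if mst[i, j] > 0]
print(edges)   # 5 edges linking the 6 stocks
```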

  10. JV Task 99-Integrated Risk Analysis and Contaminant Reduction, Watford City, North Dakota

    Energy Technology Data Exchange (ETDEWEB)

    Jaroslav Solc; Barry W. Botnen

    2007-05-31

    The Energy & Environmental Research Center (EERC) conducted a limited site investigation and risk analysis for hydrocarbon-contaminated soils and groundwater at a Construction Services, Inc., site in Watford City, North Dakota. The site investigation confirmed the presence of free product and high concentrations of residual gasoline-based contaminants in several wells, the presence of 1,2-dichloroethane, and extremely high levels of electrical conductivity indicative of brine residuals in the tank area south of the facility. The risk analysis was based on a compilation of information from the site-specific geotechnical investigation, including a multiphase extraction pilot test, laser-induced fluorescence probing, evaluation of contaminant properties, a receptor survey, capture zone analysis, and evaluation of the wellhead protection area for the municipal well field. The project results indicate that the risks associated with contaminant occurrence at the Construction Services, Inc., site are low and that, under current conditions, there is no direct or indirect exposure pathway between the contaminated groundwater and soils and potential receptors.

  11. Improved Duct Systems Task Report with StageGate 2 Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Moyer, Neil [Florida Solar Energy Center, Cocoa, FL (United States)]; Stroer, Dennis [Calcs-Plus, Venice, FL (United States)]

    2007-12-31

    This report describes the Building America Industrialized Housing Partnership's work with two industry partners, Davalier Homes and Southern Energy Homes, in constructing and evaluating prototype interior duct systems. Issues of energy performance, comfort, DAPIA approval, manufacturability, and cost are addressed. A Stage Gate 2 analysis addresses the current status of the project, showing that refinements are still needed in the process of incorporating all of the ducts within the air and thermal boundaries of the envelope.

  12. Portable microcomputer for the analysis of plutonium gamma-ray spectra. Volume I. Data analysis methodology and hardware description

    International Nuclear Information System (INIS)

    A portable microcomputer has been developed and programmed for the International Atomic Energy Agency (IAEA) to perform in-field analysis of plutonium gamma-ray spectra. The unit includes a 16-bit LSI-11/2 microprocessor, 32-K words of memory, a 20-character display for user prompting, a numeric keyboard for user responses, and a 20-character thermal printer for hard-copy output of results. The unit weighs 11 kg and has dimensions of 33.5 x 30.5 x 23.0 cm. This compactness allows the unit to be stored under an airline seat. Only the positions of the 148-keV 241Pu and 208-keV 237U peaks are required for spectral analysis that gives plutonium isotopic ratios and weight percent abundances. Volume I of this report provides a detailed description of the data analysis methodology, operation instructions, hardware, and maintenance and troubleshooting. Volume II describes the software and provides software listings.

  13. Human factors evaluation of teletherapy: Training and organizational analysis. Volume 4

    International Nuclear Information System (INIS)

    A series of human factors evaluations were undertaken to better understand the contributing factors to human error in the teletherapy environment. Teletherapy is a multidisciplinary methodology for treating cancerous tissue through selective exposure to an external beam of ionizing radiation. A team of human factors specialists, assisted by a panel of radiation oncologists, medical physicists, and radiation therapists, conducted site visits to radiation oncology departments at community hospitals, university centers, and free-standing clinics. A function and task analysis was initially performed to guide subsequent evaluations in the areas of system-user interfaces, procedures, training and qualifications, and organizational policies and practices. The present work focuses solely on training and qualifications of personnel (e.g., training received before and during employment), and the potential impact of organizational factors on the performance of teletherapy. Organizational factors include such topics as adequacy of staffing, performance evaluations, commonly occurring errors, implementation of quality assurance programs, and organizational climate

  14. Human factors evaluation of teletherapy: Training and organizational analysis. Volume 4

    Energy Technology Data Exchange (ETDEWEB)

    Henriksen, K.; Kaye, R.D.; Jones, R. [Hughes Training, Inc., Falls Church, VA (United States)]; Morisseau, D.S.; Serig, D.I. [Nuclear Regulatory Commission, Washington, DC (United States). Div. of Systems Technology]

    1995-07-01

    A series of human factors evaluations were undertaken to better understand the contributing factors to human error in the teletherapy environment. Teletherapy is a multidisciplinary methodology for treating cancerous tissue through selective exposure to an external beam of ionizing radiation. A team of human factors specialists, assisted by a panel of radiation oncologists, medical physicists, and radiation therapists, conducted site visits to radiation oncology departments at community hospitals, university centers, and free-standing clinics. A function and task analysis was initially performed to guide subsequent evaluations in the areas of system-user interfaces, procedures, training and qualifications, and organizational policies and practices. The present work focuses solely on training and qualifications of personnel (e.g., training received before and during employment), and the potential impact of organizational factors on the performance of teletherapy. Organizational factors include such topics as adequacy of staffing, performance evaluations, commonly occurring errors, implementation of quality assurance programs, and organizational climate.

  15. A simple algorithm for subregional striatal uptake analysis with partial volume correction in dopaminergic PET imaging

    International Nuclear Information System (INIS)

    In positron emission tomography (PET) of the dopaminergic system, quantitative measurements of nigrostriatal dopamine function are useful for differential diagnosis. A subregional analysis of striatal uptake enables the diagnostic performance to be more powerful. However, the partial volume effect (PVE) induces an underestimation of the true radioactivity concentration in small structures. This work proposes a simple algorithm for subregional analysis of striatal uptake with partial volume correction (PVC) in dopaminergic PET imaging. The PVC algorithm analyzes the separate striatal subregions and takes into account the PVE based on the recovery coefficient (RC). The RC is defined as the ratio of the PVE-uncorrected to PVE-corrected radioactivity concentration, and is derived from a combination of the traditional volume of interest (VOI) analysis and the large VOI technique. The clinical studies, comprising 11 patients with Parkinson's disease (PD) and 6 healthy subjects, were used to assess the impact of PVC on the quantitative measurements. Simulations on a numerical phantom that mimicked realistic healthy and neurodegenerative situations were used to evaluate the performance of the proposed PVC algorithm. In both the clinical and the simulation studies, the striatal-to-occipital ratio (SOR) values for the entire striatum and its subregions were calculated with and without PVC. In the clinical studies, the SOR values in each structure (caudate, anterior putamen, posterior putamen, putamen, and striatum) were significantly higher with PVC than without. Among the PD patients, the SOR values in each structure and quantitative disease severity ratings were shown to be significantly related only when PVC was used. For the simulation studies, the average absolute percentage errors of the SOR estimates before and after PVC were 22.74% and 1.54% in the healthy situation, respectively; those in the neurodegenerative situation were 20.69% and 2
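
    The recovery-coefficient correction described above lends itself to a compact illustration. The following Python sketch is a minimal, hypothetical rendering of the idea; the record does not give the actual VOI definitions or RC derivation, so all names and numbers here are illustrative.

        def corrected_sor(striatal_meas, occipital_meas, rc):
            """Partial-volume-corrected striatal-to-occipital ratio (SOR).

            rc is the recovery coefficient defined in the abstract as the ratio
            of the PVE-uncorrected to the PVE-corrected radioactivity
            concentration, so dividing the measured striatal value by rc undoes
            the PVE underestimation. SOR is taken literally as a ratio here."""
            return (striatal_meas / rc) / occipital_meas

        # Hypothetical numbers: a small caudate VOI whose RC is 0.6
        print(corrected_sor(striatal_meas=12.0, occipital_meas=4.0, rc=0.6))  # 5.0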

  16. Effects of elevated vacuum on in-socket residual limb fluid volume: Case study results using bioimpedance analysis

    Directory of Open Access Journals (Sweden)

    Joan E. Sanders, PhD

    2011-12-01

    Bioimpedance analysis was used to measure the residual limb fluid volume of seven transtibial amputee subjects using elevated vacuum sockets and nonelevated vacuum sockets. Fluid volume changes were assessed during sessions with the subjects sitting, standing, and walking. In general, fluid volume losses during 3 or 5 min walks and losses over the course of the 30 min test session were less for elevated vacuum than for suction. Numerous variables, including the time of day that data were collected, soft tissue consistency, socket-to-limb size and shape differences, and subject health, may have affected the results and had an equivalent or greater effect on limb fluid volume compared with elevated vacuum. Researchers should carefully consider these variables in the study design of future investigations on the effects of elevated vacuum on residual limb volume.

  17. The history of NATO TNF policy: The role of studies, analysis and exercises conference proceedings. Volume 1, Introduction and summary

    Energy Technology Data Exchange (ETDEWEB)

    Rinne, R.L. [ed.]

    1994-02-01

    This conference was organized to study and analyze the role of simulation, analysis, modeling, and exercises in the history of NATO policy. The premise was not that the results of past studies will apply to future policy, but rather that understanding what influenced the decision process -- and how -- would be of value. The structure of the conference was built around discussion panels. The panels were augmented by a series of papers and presentations focusing on particular TNF events, issues, studies, or exercises. The conference proceedings consist of three volumes. This volume, Volume 1, contains the conference introduction, agenda, biographical sketches of principal participants, and analytical summary of the presentations and discussion panels. Volume 2 contains a short introduction and the papers and presentations from the conference. Volume 3 contains selected papers by Brig. Gen. Robert C. Richardson III (Ret.).

  18. The history of NATO TNF policy: The role of studies, analysis and exercises conference proceedings. Volume 2: Papers and presentations

    Energy Technology Data Exchange (ETDEWEB)

    Rinne, R.L.

    1994-02-01

    This conference was organized to study and analyze the role of simulation, analysis, modeling, and exercises in the history of NATO policy. The premise was not that the results of past studies will apply to future policy, but rather that understanding what influenced the decision process -- and how -- would be of value. The structure of the conference was built around discussion panels. The panels were augmented by a series of papers and presentations focusing on particular TNF events, issues, studies, or exercises. The conference proceedings consist of three volumes. Volume 1 contains the conference introduction, agenda, biographical sketches of principal participants, and analytical summary of the presentations and panels. This volume contains a short introduction and the papers and presentations from the conference. Volume 3 contains selected papers by Brig. Gen. Robert C. Richardson III (Ret.). Individual papers in this volume were abstracted and indexed for the database.

  19. Development of a Quality Assurance Procedure for Dose Volume Histogram Analysis

    Science.gov (United States)

    Davenport, David A.

    The role of the dose-volume histogram (DVH) is rapidly expanding in radiation oncology treatment planning. DVHs are already relied upon to differentiate between two similar plans and evaluate organ-at-risk dosage. Their role will become even more important as progress continues towards implementing biologically based treatment planning systems. Therefore it is imperative that the accuracy of DVHs is evaluated and reappraised after any major software or hardware upgrades affecting a treatment planning system (TPS). The purpose of this work is to create and implement a comprehensive quality assurance procedure evaluating dose volume histograms to ensure their accuracy while satisfying American College of Radiology guidelines. Virtual phantoms of known volumes were created in Pinnacle TPS and exposed to different beam arrangements. Variables including grid size and slice thickness were varied and their effects were analyzed. The resulting DVHs were evaluated by comparison to the commissioned percent depth dose values using a custom Excel spreadsheet. After determining the uncertainty of the DVH based on these variables, multiple second check calculations were performed using MIM Maestro and Matlab software packages. The uncertainties of the DVHs were shown to be less than +/- 3%. The average uncertainty was shown to be less than +/- 1%. The second check procedures resulted in mean percent differences less than 1%, which confirms the accuracy of DVH calculation in Pinnacle and the effectiveness of the quality assurance template. The importance of knowing the limits of accuracy of the DVHs, which are routinely used to assess the quality of clinical treatment plans, cannot be overestimated. The developed comprehensive QA procedure evaluating the accuracy of the DVH statistical analysis will become a part of our clinical arsenal for periodic tests of the treatment planning system. It will also be performed at the time of commissioning and after any major software
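
    A dose-volume histogram of the kind checked by such a QA procedure can be computed from first principles in a few lines. The Python sketch below uses hypothetical phantom values and does not reproduce the Pinnacle, MIM Maestro, or Matlab implementations referenced above; it simply builds a cumulative DVH and checks it against a uniform-dose virtual phantom.

        import numpy as np

        def cumulative_dvh(dose, mask, bin_width=0.1):
            """Cumulative DVH for one structure.

            dose: 3-D dose grid in Gy; mask: boolean array selecting the
            structure. Returns dose levels and the percent of the structure
            volume receiving at least each level."""
            d = dose[mask]
            levels = np.arange(0.0, d.max() + bin_width, bin_width)
            volume_pct = np.array([100.0 * np.mean(d >= lv) for lv in levels])
            return levels, volume_pct

        # Virtual-phantom style check: a uniform 50 Gy dose to a cubic "organ"
        dose = np.full((20, 20, 20), 50.0)
        mask = np.zeros(dose.shape, dtype=bool)
        mask[5:15, 5:15, 5:15] = True
        levels, vol = cumulative_dvh(dose, mask)
        assert vol[0] == 100.0 and vol[-1] > 0.0  # all voxels at or above 50 Gy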

  20. A Meta-analysis of the Wisconsin Card Sort Task in Autism.

    Science.gov (United States)

    Landry, Oriane; Al-Taie, Shems

    2016-04-01

    We conducted a meta-analysis of 31 studies, spanning 30 years, utilizing the WCST in participants with autism. We calculated Cohen's d effect sizes for four measures of performance: sets completed, perseveration, failure-to-maintain-set, and non-perseverative errors. The average weighted effect size ranged from 0.30 to 0.74 for each measure, all statistically greater than 0. No evidence was found for reduced impairment when WCST is administered by computer. Age and PIQ predicted perseverative error rates, while VIQ predicted non-perseverative error rates, and both perseverative and non-perseverative error rates in turn predicted number of sets completed. No correlates of failure-to-maintain set errors were found; further research is warranted on this aspect of WCST performance in autism. PMID:26614085
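
    An average weighted effect size of the sort reported above is commonly computed with inverse-variance weights; the Python sketch below assumes that scheme (the record does not state the weighting actually used), and all numbers are hypothetical.

        import numpy as np

        def weighted_mean_effect(d, var_d):
            """Fixed-effect meta-analysis: inverse-variance weighted mean of
            Cohen's d, its standard error, and a z statistic against zero."""
            d, w = np.asarray(d), 1.0 / np.asarray(var_d)
            d_bar = np.sum(w * d) / np.sum(w)
            se = np.sqrt(1.0 / np.sum(w))
            return d_bar, se, d_bar / se

        # Hypothetical per-study effect sizes and their variances
        print(weighted_mean_effect([0.45, 0.62, 0.30, 0.74],
                                   [0.04, 0.09, 0.02, 0.06]))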

  1. Thermoeconomic analysis of storage systems for solar heating and cooling systems: A comparison between variable-volume and fixed-volume tanks

    International Nuclear Information System (INIS)

    The paper investigates different control strategies for the thermal storage management in SHC (Solar Heating and Cooling) systems. The SHC system under investigation is based on a field of evacuated solar collectors coupled with a single-stage LiBr–H2O absorption chiller; auxiliary thermal energy is supplied by a gas-fired boiler. The SHC is also equipped with a novel thermal storage system, consisting of a variable-volume storage tank. It includes three separate tanks and a number of mixers and diverters managed by novel control strategies, based on combinations of series/parallel charging and discharging approaches. The aim of this component is to vary the thermal storage capacity as a function of the combinations of solar radiation availability and user thermal/cooling energy demands. The system allows one to increase the number of active tanks when the time shift between solar energy and user demand is high. Conversely, when this time shift is low, the number of active tanks is automatically reduced. In addition, when the solar energy in excess cannot be stored in such tanks, a heat exchanger is also used in the solar loop for producing DHW (Domestic Hot Water). The analysis is carried out by means of a zero-dimensional transient simulation model, developed by using the TRNSYS software. In order to assess the operating and capital costs of the systems under analysis, an economic model is also proposed. In addition, in order to determine the set of the synthesis/design variables which maximize the system profitability, a parametric analysis was implemented. The novel variable-volume storage system, in both the proposed configurations, was also compared with a constant-volume storage system from the energy and economic points of view. The results showed that the presented storage system allows one to save up to 20% of the natural gas used by the auxiliary boiler only for very high solar fractions. In all the other cases, marginal savings are achieved by the

  2. Predicted costs of environmental controls for a commercial oil shale industry. Volume 1. An engineering analysis

    Energy Technology Data Exchange (ETDEWEB)

    Nevens, T.D.; Culbertson, W.J. Jr.; Wallace, J.R.; Taylor, G.C.; Jovanovich, A.P.; Prien, C.H.; Hicks, R.E.; Probstein, R.F.; Domahidy, G.

    1979-07-01

    The pollution control costs for a commercial oil shale industry were determined in a joint effort by Denver Research Institute, Water Purification Associates of Cambridge, and Stone and Webster Engineering of Boston and Denver. Four commercial oil shale processes were considered. The results in terms of cost per barrel of syncrude oil are predicted to be as follows: Paraho Process, $0.67 to $1.01; TOSCO II Process, $1.43 to $1.91; MIS Process, $2.02 to $3.03; and MIS/Lurgi-Ruhrgas Process, $1.68 to $2.43. Alternative pollution control equipment and integrated pollution control strategies were considered and optimal systems selected for each full-scale plant. A detailed inventory of equipment (along with the rationale for selection), a detailed description of control strategies, itemized costs and predicted emission levels are presented for each process. Capital and operating cost data are converted to a cost per barrel basis using detailed economic evaluation procedures. Ranges of cost are determined using a subjective self-assessment of uncertainty approach. An accepted methodology for probability encoding was used, and cost ranges are presented as subjective probability distributions. Volume I presents the detailed engineering results. Volume II presents the detailed analysis of uncertainty in the predicted costs.

  3. Simplification of the spectral analysis of the volume operator in loop quantum gravity

    International Nuclear Information System (INIS)

    The volume operator plays a crucial role in the definition of the quantum dynamics of loop quantum gravity (LQG). Efficient calculations for dynamical problems of LQG can therefore be performed only if one has sufficient control over the volume spectrum. While closed formulae for the matrix elements are currently available in the literature, these are complicated polynomials in 6j symbols, which in turn are given in terms of Racah's formula, which is too complicated to permit even numerical calculations in the semiclassically important regime of large spins. Hence, the spectrum has so far been inaccessible even numerically. In this paper, we demonstrate that by means of the Elliot-Biedenharn identity one can get rid of all the 6j symbols for any valence of the gauge-invariant vertex, thus immensely reducing the computational effort. We use the resulting compact formula to study numerically the spectrum of the gauge-invariant 4-vertex. The techniques derived in this paper could also be of use for the analysis of spin-spin interaction Hamiltonians of many-particle problems in atomic and nuclear physics.

  4. Precise measurement of liquid petroleum tank volume based on data cloud analysis

    Science.gov (United States)

    Wang, Jintao; Liu, Ziyong; Zhang, Long; Guo, Ligong; Bao, Xuesong; Tong, Lin

    2010-08-01

    Metal tanks are generally used for the measurement of liquid petroleum products for fiscal or custody transfer applications. A precise tank volume measurement method based on the analysis of point-cloud data acquired by laser scanning was studied. Distance measurement by laser phase shift and angular measurement by optical grating were applied to acquire the coordinates of points on the tank shell under the control of a servo system. The Direct Iterative Method (DIM) and the Section Area Method (SAM) were used to process the measured data for vertical and horizontal tanks, respectively. In a comparison experiment, one 1000 m3 vertical tank and one 30 m3 horizontal tank were used as test objects. In the vertical tank experiment, the largest measured radius difference between the new laser method and the strapping method (the international arbitration standard) is 2.8 mm. In the horizontal tank experiment, the calibration result from the laser scanning method is closer to the reference than that of the manual geometric method; the mean deviations over the full-scale range of the former and the latter are 75 L and 141 L, respectively. With increasing liquid level, the relative errors of the laser scanning method and the manual geometric method become smaller, with mean relative errors of 0.6% and 1.5%, respectively. By using the method discussed, the efficiency of tank volume calibration can be improved.
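
    The record does not detail the DIM and SAM computations themselves, but the general slice-and-integrate idea behind point-cloud volume estimation for a vertical tank can be sketched as follows (Python; the point-cloud format, slicing scheme, and test values are assumptions, not the paper's).

        import numpy as np

        def vertical_tank_volume(points, n_slices=100):
            """Estimate a vertical tank's volume from a laser-scanned point
            cloud of the shell (columns x, y, z in metres): slice along the
            height and integrate the fitted circular cross-section areas."""
            z = points[:, 2]
            edges = np.linspace(z.min(), z.max(), n_slices + 1)
            volume = 0.0
            for lo, hi in zip(edges[:-1], edges[1:]):
                ring = points[(z >= lo) & (z < hi)]
                if len(ring) < 3:
                    continue
                # mean distance from the slice centroid approximates the radius
                cx, cy = ring[:, 0].mean(), ring[:, 1].mean()
                r = np.hypot(ring[:, 0] - cx, ring[:, 1] - cy).mean()
                volume += np.pi * r**2 * (hi - lo)
            return volume

        # Hypothetical scan of a 10 m tall, radius-5 m cylindrical shell
        rng = np.random.default_rng(0)
        theta = rng.uniform(0, 2 * np.pi, 20000)
        zz = rng.uniform(0, 10, 20000)
        pts = np.column_stack([5 * np.cos(theta), 5 * np.sin(theta), zz])
        print(vertical_tank_volume(pts))  # ~785 m3, i.e. pi * 5**2 * 10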

  5. Wind-electric icemaking project: Analysis and dynamometer testing. Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    Holz, R.; Gervorgian, V.; Drouilhet, S.; Muljadi, E.

    1998-07-01

    The wind/hybrid systems group at the National Renewable Energy Laboratory has been researching the most practical and cost-effective methods for producing ice from off-grid wind-electric power systems. The first phase of the project, conducted in 1993--1994, included full-scale dynamometer and field testing of two different electric ice makers directly connected to a permanent magnet alternator. The results of that phase were encouraging and the second phase of the project was launched in which steady-state and dynamic numerical models of these systems were developed and experimentally validated. The third phase of the project was the dynamometer testing of the North Star ice maker, which is powered by a 12-kilowatt Bergey Windpower Company, Inc., alternator. This report describes both the second and third project phases. Also included are detailed economic analyses and a discussion of the future prospects of wind-electric ice-making systems. The main report is contained in Volume 1. Volume 2 consists of the report appendices, which include the actual computer programs used in the analysis and the detailed test results.

  6. Performance Analysis of Fractured Wells with Stimulated Reservoir Volume in Coal Seam Reservoirs

    Directory of Open Access Journals (Sweden)

    Yu-long Zhao

    2016-01-01

    CoalBed Methane (CBM), as one kind of unconventional gas, is an important energy resource, attracting industry interest in research and development. Using the Langmuir adsorption isotherm, Fick's law in the matrix and Darcy flow in cleat fractures, and treating the Stimulated Reservoir Volume (SRV) induced by hydraulic fracturing as a radial composite model, the continuous linear source function with constant production is derived by the methods of the Laplace transform and Duhamel theory. Based on the linear source function, semi-analytical solutions are obtained for a fractured vertical well producing at a constant production rate or constant bottom-hole pressure. With the help of the Stehfest numerical algorithm and computer programming, the well test and rate decline type curves are obtained, and the key flow regimes of fractured CBM wells are: wellbore storage, linear flow in the SRV region, diffusion flow and late pseudo-radial flow. Finally, we analyze the effect of various parameters, such as the Langmuir volume and the radius and permeability of the SRV region, on the production performance. The results of this paper are significant for the development, well test interpretation and production performance analysis of unconventional gas.
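
    The Stehfest algorithm cited above is a standard way to invert Laplace-space well-test solutions numerically. A minimal Python implementation follows, verified against a transform pair with a known inverse; the radial-composite SRV solution itself is too involved to reproduce here.

        import math

        def stehfest_coeffs(N=12):
            """Gaver-Stehfest weights V_i (N must be even)."""
            half = N // 2
            V = []
            for i in range(1, N + 1):
                s = 0.0
                for k in range((i + 1) // 2, min(i, half) + 1):
                    s += (k ** half * math.factorial(2 * k)) / (
                        math.factorial(half - k) * math.factorial(k)
                        * math.factorial(k - 1) * math.factorial(i - k)
                        * math.factorial(2 * k - i))
                V.append((-1) ** (half + i) * s)
            return V

        def stehfest_invert(F, t, N=12):
            """Approximate f(t) from its Laplace transform F(s), real s only."""
            a = math.log(2.0) / t
            return a * sum(v * F((i + 1) * a)
                           for i, v in enumerate(stehfest_coeffs(N)))

        # Sanity check against a known pair: L{exp(-t)}(s) = 1/(s + 1)
        print(stehfest_invert(lambda s: 1.0 / (s + 1.0), t=1.0))  # ~0.3679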

  7. Simplification of the spectral analysis of the volume operator in loop quantum gravity

    Energy Technology Data Exchange (ETDEWEB)

    Brunnemann, J; Thiemann, T [Perimeter Institute for Theoretical Physics and University of Waterloo Waterloo, Ontario (Canada)

    2006-02-21

    The volume operator plays a crucial role in the definition of the quantum dynamics of loop quantum gravity (LQG). Efficient calculations for dynamical problems of LQG can therefore be performed only if one has sufficient control over the volume spectrum. While closed formulae for the matrix elements are currently available in the literature, these are complicated polynomials in 6j symbols, which in turn are given in terms of Racah's formula, which is too complicated to permit even numerical calculations in the semiclassically important regime of large spins. Hence, the spectrum has so far been inaccessible even numerically. In this paper, we demonstrate that by means of the Elliot-Biedenharn identity one can get rid of all the 6j symbols for any valence of the gauge-invariant vertex, thus immensely reducing the computational effort. We use the resulting compact formula to study numerically the spectrum of the gauge-invariant 4-vertex. The techniques derived in this paper could also be of use for the analysis of spin-spin interaction Hamiltonians of many-particle problems in atomic and nuclear physics.

  8. Wind-electric icemaking project: Analysis and dynamometer testing. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Holz, R.; Gervorgian, V.; Drouilhet, S.; Muljadi, E.

    1998-07-01

    The wind/hybrid systems group at the National Renewable Energy Laboratory has been researching the most practical and cost-effective methods for producing ice from off-grid wind-electric power systems. The first phase of the project, conducted in 1993--1994, included full-scale dynamometer and field testing of two different electric ice makers directly connected to a permanent magnet alternator. The results of that phase were encouraging and the second phase of the project was launched in which steady-state and dynamic numerical models of these systems were developed and experimentally validated. The third phase of the project was the dynamometer testing of the North Star ice maker, which is powered by a 12-kilowatt Bergey Windpower Company, Inc., alternator. This report describes both the second and third project phases. Also included are detailed economic analyses and a discussion of the future prospects of wind-electric ice-making systems. The main report is contained in Volume 1. Volume 2 consists of the report appendices, which include the actual computer programs used in the analysis and the detailed test results.

  9. Principal components analysis of reward prediction errors in a reinforcement learning task.

    Science.gov (United States)

    Sambrook, Thomas D; Goslin, Jeremy

    2016-01-01

    Models of reinforcement learning represent reward and punishment in terms of reward prediction errors (RPEs), quantitative signed terms describing the degree to which outcomes are better than expected (positive RPEs) or worse (negative RPEs). An electrophysiological component known as feedback-related negativity (FRN) occurs at frontocentral sites 240-340 ms after feedback on whether a reward or punishment is obtained, and has been claimed to neurally encode an RPE. An outstanding question, however, is whether the FRN is sensitive to the size of both positive RPEs and negative RPEs. Previous attempts to answer this question have examined the simple effects of RPE size for positive RPEs and negative RPEs separately. However, this methodology can be compromised by overlap from components coding for unsigned prediction error size, or "salience", which are sensitive to the absolute size of a prediction error but not its valence. In our study, positive and negative RPEs were parametrically modulated using both reward likelihood and magnitude, with principal components analysis used to separate out overlying components. This revealed a single RPE encoding component responsive to the size of positive RPEs, peaking at ~330 ms, and occupying the delta frequency band. Other components responsive to unsigned prediction error size were shown, but no component sensitive to negative RPE size was found. PMID:26196667
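
    A temporal PCA of the kind used to separate overlying ERP components can be sketched briefly. The Python below is a bare-bones analogue: it omits the rotation step and trial structure a full ERP analysis would involve, and the data are synthetic.

        import numpy as np

        def temporal_pca(erp, n_components=5):
            """Temporal PCA of ERP waveforms: rows are condition/electrode
            averages, columns are time points. Returns component time courses
            (loadings) and per-waveform amplitudes (scores)."""
            X = erp - erp.mean(axis=0)            # centre each time point
            U, s, Vt = np.linalg.svd(X, full_matrices=False)
            loadings = Vt[:n_components]          # component time courses
            scores = X @ loadings.T               # amplitude per waveform
            return loadings, scores

        # Synthetic data: 40 averaged waveforms, 300 time samples
        rng = np.random.default_rng(0)
        erp = rng.standard_normal((40, 300))
        loadings, scores = temporal_pca(erp)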

  10. Principal components analysis of reward prediction errors in a reinforcement learning task.

    Science.gov (United States)

    Sambrook, Thomas D; Goslin, Jeremy

    2016-01-01

    Models of reinforcement learning represent reward and punishment in terms of reward prediction errors (RPEs), quantitative signed terms describing the degree to which outcomes are better than expected (positive RPEs) or worse (negative RPEs). An electrophysiological component known as feedback-related negativity (FRN) occurs at frontocentral sites 240-340 ms after feedback on whether a reward or punishment is obtained, and has been claimed to neurally encode an RPE. An outstanding question, however, is whether the FRN is sensitive to the size of both positive RPEs and negative RPEs. Previous attempts to answer this question have examined the simple effects of RPE size for positive RPEs and negative RPEs separately. However, this methodology can be compromised by overlap from components coding for unsigned prediction error size, or "salience", which are sensitive to the absolute size of a prediction error but not its valence. In our study, positive and negative RPEs were parametrically modulated using both reward likelihood and magnitude, with principal components analysis used to separate out overlying components. This revealed a single RPE encoding component responsive to the size of positive RPEs, peaking at ~330 ms, and occupying the delta frequency band. Other components responsive to unsigned prediction error size were shown, but no component sensitive to negative RPE size was found.

  11. Multifidus Muscle Volume Estimation Based on Three-Dimensional Wavelet Multi-Resolution Analysis (MRA) with Buttocks Computer-Tomography (CT) Images

    Directory of Open Access Journals (Sweden)

    Kohei Arai

    2013-12-01

    A Multi-Resolution Analysis (MRA) based edge detection algorithm is proposed for estimating the volume of the multifidus muscle in Computed Tomography (CT) scanned images. From the point of view of processing complexity, the volume of the multifidus muscle would be a better measure for metabolic syndrome than internal fat. The proposed measure shows an R-squared of 0.178, which corresponds to the mutual correlation between internal fat and the volume of the multifidus muscle. It is also found that the R-squared between internal fat and the other candidate measures is smaller than that of the multifidus muscle.
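
    A two-dimensional analogue of the wavelet MRA edge detection described above can be sketched with PyWavelets; the record's algorithm is three-dimensional and its parameters are not given, so everything below (wavelet choice, level, test image) is illustrative.

        import numpy as np
        import pywt

        def wavelet_edges(image, wavelet="haar", level=2):
            """Edge map via wavelet MRA: decompose, zero the coarse
            approximation so only detail (edge) coefficients remain,
            then reconstruct."""
            coeffs = pywt.wavedec2(image, wavelet, level=level)
            coeffs[0] = np.zeros_like(coeffs[0])   # drop the approximation band
            return pywt.waverec2(coeffs, wavelet)

        # Hypothetical slice: a bright square on a dark background
        img = np.zeros((64, 64))
        img[20:44, 20:44] = 1.0
        edges = np.abs(wavelet_edges(img))  # energy concentrates on the boundary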

  12. Vigilance Task-Related Change in Brain Functional Connectivity as Revealed by Wavelet Phase Coherence Analysis of Near-Infrared Spectroscopy Signals.

    Science.gov (United States)

    Wang, Wei; Wang, Bitian; Bu, Lingguo; Xu, Liwei; Li, Zengyong; Fan, Yubo

    2016-01-01

    This study aims to assess the vigilance task-related change in connectivity in healthy adults using wavelet phase coherence (WPCO) analysis of near-infrared spectroscopy signals (NIRS). NIRS is a non-invasive neuroimaging technique for assessing brain activity. Continuous recordings of the NIRS signals were obtained from the prefrontal cortex (PFC) and sensorimotor cortical areas of 20 young healthy adults (24.9 ± 3.3 years) during a 10-min resting state and a 20-min vigilance task state. The vigilance task was used to simulate driving mental load by judging three random numbers (i.e., whether odd numbers). The task was divided into two sessions: the first 10 min (Task t1) and the second 10 min (Task t2). The WPCO of six channel pairs were calculated in five frequency intervals: 0.6-2 Hz (I), 0.145-0.6 Hz (II), 0.052-0.145 Hz (III), 0.021-0.052 Hz (IV), and 0.0095-0.021 Hz (V). The significant WPCO formed global connectivity (GC) maps in intervals I and II and functional connectivity (FC) maps in intervals III to V. Results show that the GC levels in interval I and FC levels in interval III were significantly lower in the Task t2 than in the resting state (p < 0.05), particularly between the left PFC and bilateral sensorimotor regions. Also, the reaction time (RT) shows an increase in Task t2 compared with that in Task t1. However, no significant difference in WPCO was found between Task t1 and resting state. The results showed that the change in FC at the range of 0.6-2 Hz was attributable not to the vigilance task per se but to the interaction effect of vigilance task and time factors. The findings suggest that the decreased attention level might be partly attributed to the reduced GC levels between the left prefrontal region and sensorimotor area. The present results provide a new insight into the vigilance task-related brain activity. PMID:27547182
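
    Wavelet phase coherence itself is straightforward to sketch: transform both signals with a complex Morlet wavelet and average the unit phasor of their phase difference. The Python below is illustrative only; the study's wavelet parameters, channel pairs, and any surrogate-based significance testing are not reproduced, and the test signals are synthetic.

        import numpy as np

        def morlet_cwt(sig, fs, freqs, w=6.0):
            """Continuous wavelet transform with a complex Morlet wavelet.
            Amplitude normalisation is left rough: only phases matter here."""
            rows = []
            for f in freqs:
                s = w * fs / (2 * np.pi * f)              # scale in samples
                tt = np.arange(-int(4 * s), int(4 * s) + 1)
                wavelet = np.exp(1j * w * tt / s - 0.5 * (tt / s) ** 2)
                rows.append(np.convolve(sig, wavelet, mode="same"))
            return np.array(rows)

        def wpco(x, y, fs, freqs):
            """Wavelet phase coherence: modulus of the time-averaged unit
            phasor of the phase difference (1 = locked, 0 = random)."""
            dphi = (np.angle(morlet_cwt(x, fs, freqs))
                    - np.angle(morlet_cwt(y, fs, freqs)))
            return np.abs(np.exp(1j * dphi).mean(axis=1))

        # Hypothetical 10-min signal pair sharing a 0.1 Hz oscillation
        fs = 10.0
        t = np.arange(0, 600, 1 / fs)
        rng = np.random.default_rng(0)
        x = np.sin(2 * np.pi * 0.1 * t) + rng.standard_normal(t.size)
        y = np.sin(2 * np.pi * 0.1 * t + 0.7) + rng.standard_normal(t.size)
        print(wpco(x, y, fs, np.array([0.02, 0.05, 0.1, 0.5, 1.0])))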

  13. Vigilance task-related change in brain functional connectivity as revealed by wavelet phase coherence analysis of near-infrared spectroscopy signals

    Directory of Open Access Journals (Sweden)

    Wang Wei

    2016-08-01

    This study aims to assess the vigilance task-related change in connectivity in healthy adults using wavelet phase coherence (WPCO) analysis of near-infrared spectroscopy signals (NIRS). NIRS is a non-invasive neuroimaging technique for assessing brain activity. Continuous recordings of the NIRS signals were obtained from the prefrontal cortex (PFC) and sensorimotor cortical areas of 20 young healthy adults (24.9±3.3 years) during a 10-min resting state and a 20-min vigilance task state. The vigilance task was used to simulate driving mental load by judging three random numbers (i.e., whether odd numbers). The task was divided into two sessions: the first 10 minutes (Task t1) and the second 10 minutes (Task t2). The WPCO of six channel pairs were calculated in five frequency intervals: 0.6–2 Hz (I), 0.145–0.6 Hz (II), 0.052–0.145 Hz (III), 0.021–0.052 Hz (IV), and 0.0095–0.021 Hz (V). The significant WPCO formed global connectivity (GC) maps in intervals I and II and functional connectivity (FC) maps in intervals III to V. Results show that the GC levels in interval I and FC levels in interval III were significantly lower in the Task t2 than in the resting state (p < 0.05), particularly between the left PFC and bilateral sensorimotor regions. Also, the reaction time shows an increase in Task t2 compared with that in Task t1. However, no significant difference in WPCO was found between Task t1 and resting state. The results showed that the change in FC at the range of 0.6–2 Hz was attributable not to the vigilance task per se but to the interaction effect of vigilance task and time factors. The findings suggest that the decreased attention level might be partly attributed to the reduced GC levels between the left prefrontal region and sensorimotor area. The present results provide a new insight into the vigilance task-related brain activity.

  14. Vigilance Task-Related Change in Brain Functional Connectivity as Revealed by Wavelet Phase Coherence Analysis of Near-Infrared Spectroscopy Signals

    Science.gov (United States)

    Wang, Wei; Wang, Bitian; Bu, Lingguo; Xu, Liwei; Li, Zengyong; Fan, Yubo

    2016-01-01

    This study aims to assess the vigilance task-related change in connectivity in healthy adults using wavelet phase coherence (WPCO) analysis of near-infrared spectroscopy signals (NIRS). NIRS is a non-invasive neuroimaging technique for assessing brain activity. Continuous recordings of the NIRS signals were obtained from the prefrontal cortex (PFC) and sensorimotor cortical areas of 20 young healthy adults (24.9 ± 3.3 years) during a 10-min resting state and a 20-min vigilance task state. The vigilance task was used to simulate driving mental load by judging three random numbers (i.e., whether odd numbers). The task was divided into two sessions: the first 10 min (Task t1) and the second 10 min (Task t2). The WPCO of six channel pairs were calculated in five frequency intervals: 0.6–2 Hz (I), 0.145–0.6 Hz (II), 0.052–0.145 Hz (III), 0.021–0.052 Hz (IV), and 0.0095–0.021 Hz (V). The significant WPCO formed global connectivity (GC) maps in intervals I and II and functional connectivity (FC) maps in intervals III to V. Results show that the GC levels in interval I and FC levels in interval III were significantly lower in the Task t2 than in the resting state (p < 0.05), particularly between the left PFC and bilateral sensorimotor regions. Also, the reaction time (RT) shows an increase in Task t2 compared with that in Task t1. However, no significant difference in WPCO was found between Task t1 and resting state. The results showed that the change in FC at the range of 0.6–2 Hz was not attributed to the vigilance task per se, but the interaction effect of vigilance task and time factors. The findings suggest that the decreased attention level might be partly attributed to the reduced GC levels between the left prefrontal region and sensorimotor area. The present results provide a new insight into the vigilance task-related brain activity. PMID:27547182

  15. Calcium Isolation from Large-Volume Human Urine Samples for 41Ca Analysis by Accelerator Mass Spectrometry

    Science.gov (United States)

    Miller, James J; Hui, Susanta K; Jackson, George S; Clark, Sara P; Einstein, Jane; Weaver, Connie M; Bhattacharyya, Maryka H

    2013-01-01

    Calcium oxalate precipitation is the first step in preparation of biological samples for 41Ca analysis by accelerator mass spectrometry. A simplified protocol for large-volume human urine samples was characterized, with statistically significant increases in ion current and decreases in interference. This large-volume assay minimizes cost and effort and maximizes time after 41Ca administration during which human samples, collected over a lifetime, provide 41Ca:Ca ratios that are significantly above background. PMID:23672965

  16. The 1999-2000 ACC task analysis of nurse-midwifery/midwifery practice: a consideration of the concept of professional issues.

    Science.gov (United States)

    Johnson, P G; Oshio, S; Fisher, M C; Fullerton, J T

    2001-01-01

    The American College of Nurse-Midwives (ACNM) Certification Council periodically conducts a task analysis study as evidence supporting the content validity of the national certification examination in nurse-midwifery and midwifery. The purpose of this article is to report findings related to the examination of the relationship between professional issues and safe beginning-level midwifery as measured by the 1999-2000 Task Analysis of American Nurse Midwifery and Midwifery Practice. Study findings suggest that newly certified midwives place strong emphasis on the importance of tasks related to the ACNM "Hallmarks of Midwifery," which characterize the art and science of the profession: these include tasks dealing with health promotion and cultural competency. The beginning midwives, however, gave consistently low ratings to tasks related to ACNM "Core Competencies" that mirror the professional responsibilities of midwives; these include tasks related to the history of midwifery, research, or health policy. The study has implications for nurse-midwifery/midwifery educators, experienced midwifery mentors, and other persons interested in reinforcing the relevance of these important professional issues to the new midwife.

  17. Dissociated multi-unit activity and local field potentials: a theory inspired analysis of a motor decision task.

    Science.gov (United States)

    Mattia, Maurizio; Ferraina, Stefano; Del Giudice, Paolo

    2010-09-01

    Local field potentials (LFP) and multi-unit activity (MUA) recorded in vivo are known to convey different information about the underlying neural activity. Here we extend and support the idea that single-electrode LFP-MUA task-related modulations can shed light on the involved large-scale, multi-modular neural dynamics. We first illustrate a theoretical scheme and associated simulation evidence, proposing that in a multi-modular neural architecture local and distributed dynamic properties can be extracted from the local spiking activity of one pool of neurons in the network. From this new perspective, the spectral features of the field potentials reflect the time structure of the ongoing fluctuations of the probed local neuronal pool on a wide frequency range. We then report results obtained recording from the dorsal premotor (PMd) cortex of monkeys performing a countermanding task, in which a reaching movement is performed, unless a visual stop signal is presented. We find that the LFP and MUA spectral components on a wide frequency band (3-2000 Hz) are very differently modulated in time for successful reaching, successful and wrong stop trials, suggesting an interplay of local and distributed components of the underlying neural activity in different periods of the trials and for different behavioural outcomes. Besides, the MUA spectral power is shown to possess a time-dependent structure, which we suggest could help in understanding the successive involvement of different local neuronal populations. Finally, we compare signals recorded from PMd and dorso-lateral prefrontal (PFCd) cortex in the same experiment, and speculate that the comparative time-dependent spectral analysis of LFP and MUA can help reveal patterns of functional connectivity in the brain.
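
    The band-limited spectral modulations discussed above reduce, at their simplest, to band-averaged power estimates. The Python sketch below computes Welch power in nominal LFP and MUA bands from a synthetic trace; the band edges are assumptions for illustration, not the study's analysis.

        import numpy as np
        from scipy.signal import welch

        def band_power(sig, fs, f_lo, f_hi, nperseg=1024):
            """Average spectral power of a recording within one frequency band,
            the basic quantity behind task-related LFP/MUA modulations."""
            f, pxx = welch(sig, fs=fs, nperseg=nperseg)
            band = (f >= f_lo) & (f <= f_hi)
            return pxx[band].mean()

        # Hypothetical wide-band extracellular trace sampled at 10 kHz
        fs = 10_000
        rng = np.random.default_rng(1)
        trace = rng.standard_normal(fs * 10)
        lfp_power = band_power(trace, fs, 3, 100)      # LFP-range power
        mua_power = band_power(trace, fs, 300, 2000)   # MUA-range power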

  18. Performance Analysis and Design Synthesis (PADS) computer program. Volume 2: Program description, part 2

    Science.gov (United States)

    1972-01-01

    The QL module of the Performance Analysis and Design Synthesis (PADS) computer program is described. Execution of this module is initiated when and if subroutine PADSI calls subroutine GROPE. Subroutine GROPE controls the high level logical flow of the QL module. The purpose of the module is to determine a trajectory that satisfies the necessary variational conditions for optimal performance. The module achieves this by solving a nonlinear multi-point boundary value problem. The numerical method employed is described. It is an iterative technique that converges quadratically when it does converge. The three basic steps of the module are: (1) initialization, (2) iteration, and (3) culmination. For Volume 1 see N73-13199.

  19. Performance Analysis and Design Synthesis (PADS) computer program. Volume 2: Program description, part 1

    Science.gov (United States)

    1972-01-01

    The Performance Analysis and Design Synthesis (PADS) computer program has a two-fold purpose. It can size launch vehicles in conjunction with calculus-of-variations optimal trajectories and can also be used as a general-purpose branched trajectory optimization program. In the former use, it has the Space Shuttle Synthesis Program as well as a simplified stage weight module for optimally sizing manned recoverable launch vehicles. For trajectory optimization alone or with sizing, PADS has two trajectory modules. The first trajectory module uses the method of steepest descent; the second employs the method of quasilinearization, which requires a starting solution from the first trajectory module. For Volume 1 see N73-13199.

  20. The analysis of deviations on measured volumes between OSBRA pipeline tank farms and its customers

    Energy Technology Data Exchange (ETDEWEB)

    Kotchetkoff Neto, Andre Paulo; Kawamoto, Fabio Yoshikazu [Petrobras Transporte S.A. - Transpetro (Brazil)

    2010-07-01

    The Sao Paulo - Brasilia pipeline (OSBRA) is a very long multi-product pipeline in Brazil. It has a network of several tank farms, pump stations and truck loadings to deliver oil, gas and LPG to customers. The volume of these fuels is usually measured before delivery at two measurement points. This paper reports the use of statistical tools to analyze the measurement data. The purpose was to understand the computed differences between the OSBRA tank farm installations and the customer tank farm installations. The use of statistical tools brought new insights, such as the existence of systematic error or the variability of each individual system. These tools were also used to verify the accuracy of operational measurement devices. An analysis based on in-field data was carried out between two OSBRA tank farms. This paper showed that the use of statistical tools, rather than fixed limits, can provide more precise information about the behavior of measurement systems.
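
    One simple version of the statistical comparison described above treats each delivered batch as a paired measurement and tests the mean difference for systematic error. The Python sketch below uses hypothetical volumes; the paper's actual statistical tools are not specified in the record.

        import numpy as np
        from scipy import stats

        def deviation_summary(sender_m3, receiver_m3):
            """Paired analysis of delivered-volume measurements: the mean
            difference estimates any systematic error between the two
            installations; a paired t test asks whether it differs from zero;
            the standard deviation of the differences reflects the combined
            variability of the two measurement systems."""
            diff = np.asarray(sender_m3) - np.asarray(receiver_m3)
            t, p = stats.ttest_1samp(diff, 0.0)
            return diff.mean(), diff.std(ddof=1), t, p

        # Hypothetical batch volumes (m3) at sending and receiving tank farms
        sender = [1001.2, 998.7, 1003.4, 997.9, 1000.8]
        receiver = [1000.1, 997.2, 1002.8, 996.4, 999.9]
        print(deviation_summary(sender, receiver))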

  1. Final safety analysis report for the Galileo Mission: Volume 2, Book 2: Accident model document: Appendices

    Energy Technology Data Exchange (ETDEWEB)

    1988-12-15

    This section of the Accident Model Document (AMD) presents the appendices which describe the various analyses that have been conducted for use in the Galileo Final Safety Analysis Report II, Volume II. Included in these appendices are the approaches, techniques, conditions and assumptions used in the development of the analytical models plus the detailed results of the analyses. Also included in these appendices are summaries of the accidents and their associated probabilities and environment models taken from the Shuttle Data Book (NSTS-08116), plus summaries of the several segments of the recent GPHS safety test program. The information presented in these appendices is used in Section 3.0 of the AMD to develop the Failure/Abort Sequence Trees (FASTs) and to determine the fuel releases (source terms) resulting from the potential Space Shuttle/IUS accidents throughout the missions.

  2. Multiplexed Volume Bragg Gratings in Narrowand Broad-band Spectral Systems: Analysis and Application

    Science.gov (United States)

    Ingersoll, Gregory B.

    Volume Bragg gratings (VBGs) are important holographic optical elements in many spectral systems. Using multiple volume gratings, whether multiplexed or arranged sequentially, provides advantages to many types of systems in overall efficiency, dispersion performance, flexibility of design, etc. However, the use of multiple gratings---particularly when the gratings are multiplexed in a single holographic optical element (HOE)---is subject to inter-grating coupling effects that ultimately limit system performance. Analyzing these coupling effects requires a more complex mathematical model than the straightforward analysis of a single volume grating. We present a matrix-based algorithm for determining diffraction efficiencies of significant coupled waves in these multiplexed grating holographic optical elements (HOEs). Several carefully constructed experiments with spectrally multiplexed gratings in dichromated gelatin verify our conclusions. Applications of this theory to broad- and narrow-band systems are explored in detailed simulations. Broadband systems include spectrum splitters for diverse-bandgap photovoltaic (PV) cells. Volume Bragg gratings can serve as effective spectrum splitters, but the inherent dispersion of a VBG can be detrimental given a broad-spectrum input. The performance of a holographic spectrum splitter element can be improved by utilizing multiple volume gratings, each operating in a slightly different spectral band. However, care must be taken to avoid inter-grating coupling effects that limit ultimate performance. We explore broadband multi-grating holographic optical elements (HOEs) in sandwiched arrangements where individual single-grating HOEs are placed in series, and in multiplexed arrangements where multiple gratings are recorded in a single HOE. Particle swarm optimization (PSO) is used to tailor these systems to the solar spectrum taking into account both efficiency and dispersion. Both multiplexed and sandwiched two-grating systems

  3. Open Educational Resources from Performance Task using Video Analysis and Modeling - Tracker and K12 science education framework

    CERN Document Server

    Wee, Loo Kang

    2014-01-01

    This invited paper discusses why the physics performance task taken by grade 9 students in Singapore is worth participating in, for two reasons: 1) the video analysis and modeling resources are open access, licensed Creative Commons Attribution, advancing open educational resources in the world; and 2) it allows students to work like physicists, where the K12 science education framework is adopted. We add personal reflections on how physics education can be made more meaningful, in particular through Practice 1: Ask Questions, Practice 2: Use Models, and Practice 5: Mathematical and Computational Thinking, using video modeling supported by evidence-based data from video analysis. This paper hopes to spur fellow colleagues to look into open education initiatives such as our Singapore Tracker community open educational resources curated on http://weelookang.blogspot.sg/p/physics-applets-virtual-lab.html as well as the digital libraries at http://iwant2study.org/lookangejss/, directly accessible through Tracker 4.86, the EJSS reader app on Android and iOS, and EJS 5....

  4. Computational micromechanical analysis of the representative volume element of bituminous composite materials

    Science.gov (United States)

    Ozer, Hasan; Ghauch, Ziad G.; Dhasmana, Heena; Al-Qadi, Imad L.

    2016-08-01

    Micromechanical computational modeling is used in this study to determine the smallest domain, or Representative Volume Element (RVE), that can be used to characterize the effective properties of composite materials such as Asphalt Concrete (AC). Computational Finite Element (FE) micromechanical modeling was coupled with digital image analysis of surface scans of AC specimens. Three mixtures with varying Nominal Maximum Aggregate Size (NMAS) of 4.75 mm, 12.5 mm, and 25 mm were prepared for digital image analysis and computational micromechanical modeling. The effects of window size and phase modulus mismatch on the apparent viscoelastic response of the composite were numerically examined. A good agreement was observed in the RVE size predictions based on micromechanical computational modeling and image analysis. Micromechanical results indicated that a degradation in the matrix stiffness increases the corresponding RVE size. Statistical homogeneity was observed for window sizes equal to two to three times the NMAS. A model was presented for relating the degree of statistical homogeneity associated with each window size for materials with varying inclusion dimensions.

  5. Thermodynamic analysis of energy density in pressure retarded osmosis: The impact of solution volumes and costs

    Energy Technology Data Exchange (ETDEWEB)

    Reimund, Kevin K. [Univ. of Connecticut, Storrs, CT (United States). Dept. of Chemical and Biomolecular Engineering; McCutcheon, Jeffrey R. [Univ. of Connecticut, Storrs, CT (United States). Dept. of Chemical and Biomolecular Engineering; Wilson, Aaron D. [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-08-01

    A general method was developed for estimating the volumetric energy efficiency of pressure retarded osmosis via pressure-volume analysis of a membrane process. The resulting model requires only the osmotic pressure, π, and mass fraction, w, of water in the concentrated and dilute feed solutions to estimate the maximum achievable specific energy density, u, as a function of operating pressure. The model is independent of any membrane or module properties. This method utilizes equilibrium analysis to specify the volumetric mixing fraction of concentrated and dilute solution as a function of operating pressure, and provides results for the total volumetric energy density of similar order to more complex models for the mixing of seawater and river water. Within the framework of this analysis, the total volumetric energy density is maximized, for an idealized case, when the operating pressure is π/(1+√w⁻¹), which is lower than the maximum power density operating pressure, Δπ/2, derived elsewhere, and is a function of the solute osmotic pressure at a given mass fraction. It was also found that a minimum 1.45 kmol of ideal solute is required to produce 1 kWh of energy while a system operating at “maximum power density operating pressure” requires at least 2.9 kmol. Utilizing this methodology, it is possible to examine the effects of volumetric solution cost, operation of a module at various pressures, and operation of a constant pressure module with various feeds.
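
    The quoted optimal-pressure expression is easy to evaluate. The Python sketch below computes it for a hypothetical seawater-like concentrated solution and compares it with the maximum-power-density pressure Δπ/2 (taking Δπ ≈ π for a nearly pure dilute feed); the input values are assumptions for illustration, not the paper's.

        def optimal_pressure(pi_conc, w):
            """Operating pressure that maximizes total volumetric energy
            density in the idealized case quoted above: pi / (1 + sqrt(1/w)),
            with pi the osmotic pressure of the concentrated solution and w
            its water mass fraction."""
            return pi_conc / (1.0 + (1.0 / w) ** 0.5)

        # Hypothetical seawater-like draw: pi ~ 2.7 MPa, w ~ 0.965
        pi_c, w = 2.7e6, 0.965
        p_star = optimal_pressure(pi_c, w)
        print(p_star / 1e6, "MPa, vs", pi_c / 2 / 1e6,
              "MPa at maximum power density")  # the former is slightly lower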

  6. Beyond the initial 140 ms, lexical decision and reading aloud are different tasks: An ERP study with topographic analysis.

    Science.gov (United States)

    Mahé, Gwendoline; Zesiger, Pascal; Laganaro, Marina

    2015-11-15

    Most of our knowledge on the time-course of the mechanisms involved in reading derived from electrophysiological studies is based on lexical decision tasks. By contrast, very few ERP studies investigated the processes involved in reading aloud. It has been suggested that the lexical decision task provides a good index of the processes occurring during reading aloud, with only late processing differences related to task response modalities. However, some behavioral studies reported different sensitivity to psycholinguistic factors between the two tasks, suggesting that print processing could differ at earlier processing stages. The aim of the present study was thus to carry out an ERP comparison between lexical decision and reading aloud in order to determine when print processing differs between these two tasks. Twenty native French speakers performed a lexical decision task and a reading aloud task with the same written stimuli. Results revealed different electrophysiological patterns on both waveform amplitudes and global topography between lexical decision and reading aloud from about 140 ms after stimulus presentation for both words and pseudowords, i.e., as early as the N170 component. These results suggest that only very early, low-level visual processes are common to the two tasks which differ in core processes. Taken together, our main finding questions the use of the lexical decision task as an appropriate paradigm to investigate reading processes and warns against generalizing its results to word reading. PMID:26244274

  7. Conceptual design and systems analysis of photovoltaic power systems. Final report. Volume V. Additional studies

    Energy Technology Data Exchange (ETDEWEB)

    Pittman, P.F.

    1977-03-01

    In the first of four tasks, the performances of autonomous (stand-alone) residences were determined in seven locations throughout the country. A non-autonomous residence must obtain its supplemental energy from a utility. The second task dealt with considerations of the rate to be charged by the utility for this energy in an effort to define the pertinent issues of this utility/residence interface. In the third task, the configuration of a fixed linear Fresnel lens provided with a tracking absorber was analyzed optically. The fourth task explored utility Loss-of-Load probability methodology.

  8. Industrial Fuel Gas Demonstration Plant Program. Volume 1. Demonstration plant environmental analysis (Deliverable No. 27)

    Energy Technology Data Exchange (ETDEWEB)

    Gray, Robert W.; Swift, Richard J.; Krause, Arthur J.; Berkey, Edgar

    1979-08-01

    This environmental report describes the proposed action to construct, test and operate a coal gasification demonstration plant in Memphis, Tennessee, under the co-sponsorship of the Memphis Light, Gas and Water Division (MLGW) and the US Department of Energy (DOE). This document is Volume I of a three-volume Environmental Report. Volume I consists of the Summary, Introduction and the Description of the Proposed Action. Volume II consists of the Description of the Existing Environment. Volume III contains the Environmental Impacts of the Proposed Action, Mitigating Measures and Alternatives to the Proposed Action.

  9. Industrial Fuel Gas Demonstration Plant Program. Volume III. Demonstration plant environmental analysis (Deliverable No. 27)

    Energy Technology Data Exchange (ETDEWEB)

    1979-08-01

    An Environmental Report on the Memphis Light, Gas and Water Division Industrial Fuel Demonstration Plant was prepared for submission to the US Department of Energy under Contract ET-77-C-01-2582. This document is Volume III of a three-volume Environmental Report. Volume I consists of the Summary, Introduction and the Description of the Proposed Action. Volume II consists of the Description of the Existing Environment. Volume III contains the Environmental Impacts of the Proposed Action, Mitigating Measures and Alternatives to the Proposed Action.

  10. The stability analysis using two fluids (SAT) code for boiling flow systems: Volume 4, Experiments and model validation

    Energy Technology Data Exchange (ETDEWEB)

    Roy, R.P.; Dykhuizen, R.C.; Su, M.G.; Jain, P.

    1988-12-01

    This report presents analyses of dynamic instability and frequency response characteristics of boiling flow systems based on an unequal velocity, unequal temperature two-fluid model of such flow. The dynamic instability analyses in the time domain are incorporated into three options of a computer code SAT (steady state, or equilibrium point analyses; linear stability analysis; and nonlinear analysis). The frequency response analysis is incorporated into a fourth option FREQ. Results from dynamic instability experiments carried out in a Refrigerant-113 boiling flow rig are also reported as are comparison of these with linear stability analysis predictions. Descriptions of the model, the computational techniques, the computer codes, the experiments and model validation are divided into the following volumes: Volume 1, theoretical model and computational formulation; Volume 2, coding description; Volume 3, user's manual; and Volume 4, experiments and model validation. Instability experiments run in our Refrigerant-113 boiling flow facility are described in this document. Results from these experiments are compared with predictions of the theoretical model. Instability experiment data from two other facilities and frequency response results from one are compared with theoretical model predictions also. 19 refs., 41 figs.

  11. Specimen Preparation for Metal Matrix Composites with a High Volume Fraction of Reinforcing Particles for EBSD Analysis

    Science.gov (United States)

    Smirnov, A. S.; Belozerov, G. A.; Smirnova, E. O.; Konovalov, A. V.; Shveikin, V. P.; Muizemnek, O. Yu.

    2016-07-01

    The paper deals with a procedure of preparing a specimen surface for the EBSD analysis of a metal matrix composite (MMC) with a high volume fraction of reinforcing particles. Unlike standard procedures of preparing a specimen surface for the EBSD analysis, the proposed procedure is iterative with consecutive application of mechanical and electrochemical polishing. This procedure significantly improves the results of an indexed MMC matrix in comparison with the standard procedure of specimen preparation. The procedure was verified on a MMC with pure aluminum (99.8% Al) as the matrix, SiC particles being used as reinforcing elements. The average size of the SiC particles is 14 μm, and their volume fraction amounts to 50% of the total volume of the composite. It has been experimentally found that, for making the EBSD analysis of a material matrix near reinforcing particles, the difference in height between the particles and the matrix should not exceed 2 µm.

  12. Parameter uncertainty analysis in the task of internal dose reconstruction based on 241Am organ activity measurements

    International Nuclear Information System (INIS)

    Retrospective individual dose assessment of workers chronically exposed to plutonium is an important task in the investigation of possible health effects from internal plutonium depositions. In most cases inhalation is the primary mode of the plutonium exposures, though an additional route of plutonium intake through wounds (pinpricks) also occurs in some cases. Estimation of the systemic and total body deposition of plutonium from urinalysis is usually used to carry out the task of individual dose assessment. But this technique used alone gives no information about the routes of intake and solubility of aerosol particles, which could result in wrong lung dose calculations. Direct in vivo measurements of 241Am content in lungs, skeleton and liver by whole-body counting technique allow one to improve the estimation of the organ deposition of plutonium and the estimation of the individual lung doses. Our method of calculating plutonium doses uses the fact that plutonium of reactor origin is accompanied by 241Am that is grown in from decay of the 241Pu parent. The algorithm applies the new ICRP Publication 66/67 models and takes into account such parameters as the 241Pu/239Pu ratio, the activity ratio of 239Pu to the sum of alpha-emitting plutonium isotopes, the effective aging time of mixed fission products and actinides, the particle size distribution of plutonium aerosols (AMAD) and the rate of intake within the period of employment. The purpose of this paper is to quantify the reliability of the model's predictions and retrospective dose calculations by parameter uncertainty analysis. It takes into account the uncertainty of all parameters describing the components of the plutonium aerosol, AMAD and the rate of intake. The parameter uncertainty analysis involved assigning probability distributions to each parameter and the use of Monte Carlo simulation technique to produce a quantitative statement of confidence in the model's prediction. It is shown that cumulative distribution
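
    The Monte Carlo parameter uncertainty analysis described above can be sketched generically: sample each uncertain parameter from an assigned distribution, push the samples through the dose model, and summarize the output distribution. In the Python below the dose model is a placeholder (the ICRP 66/67 biokinetics are far too large to reproduce) and every distribution is illustrative, not the study's.

        import numpy as np

        rng = np.random.default_rng(42)

        def dose_model(amad_um, pu_ratio, intake_rate):
            """Placeholder standing in for the ICRP 66/67 dose calculation;
            the functional form here is purely hypothetical."""
            return 0.8 * intake_rate * pu_ratio / amad_um

        N = 10_000
        # Assumed parameter distributions (illustrative only):
        amad = rng.lognormal(mean=np.log(5.0), sigma=0.4, size=N)   # AMAD, um
        ratio = rng.normal(0.97, 0.01, size=N)     # 239Pu alpha-activity share
        intake = rng.lognormal(mean=np.log(1.0), sigma=0.5, size=N)  # intake

        doses = dose_model(amad, ratio, intake)
        lo, med, hi = np.percentile(doses, [2.5, 50, 97.5])
        print(f"median {med:.3f}, 95% interval ({lo:.3f}, {hi:.3f})")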

  13. A Typology of Tasks for Mobile-Assisted Language Learning: Recommendations from a Small-Scale Needs Analysis

    Science.gov (United States)

    Park, Moonyoung; Slater, Tammy

    2014-01-01

    In response to the research priorities of members of TESOL (Teachers of English to Speakers of Other Languages), this study investigated language learners' real-world tasks in mobile-assisted language learning (MALL) to inform the future development of pedagogic tasks for academic English as a second language (ESL) courses. The data included…

  14. Reflective Analysis as a Tool for Task Redesign: The Case of Prospective Elementary Teachers Solving and Posing Fraction Comparison Problems

    Science.gov (United States)

    Thanheiser, Eva; Olanoff, Dana; Hillen, Amy; Feldman, Ziv; Tobias, Jennifer M.; Welder, Rachael M.

    2016-01-01

    Mathematical task design has been a central focus of the mathematics education research community over the last few years. In this study, six university teacher educators from six different US institutions formed a community of practice to explore key aspects of task design (planning, implementing, reflecting, and modifying) in the context of…

  15. Solar Pilot Plant, Phase I. Preliminary design report. Volume II. System description and system analysis. CDRL item 2

    Energy Technology Data Exchange (ETDEWEB)

    None

    1977-05-01

    Honeywell conducted a parametric analysis of the 10-MW(e) solar pilot plant requirements and expected performance and established an optimum system design. The main analytical simulation tools were the optical (ray trace) and the dynamic simulation models. These are described in detail in Books 2 and 3 of this volume under separate cover. In making design decisions, available performance and cost data were used to provide a design reflecting the overall requirements and economics of a commercial-scale plant. This volume contains a description of this analysis/design process and resultant system/subsystem design and performance.

  16. Density-viscosity product of small-volume ionic liquid samples using quartz crystal impedance analysis.

    Science.gov (United States)

    McHale, Glen; Hardacre, Chris; Ge, Rile; Doy, Nicola; Allen, Ray W K; MacInnes, Jordan M; Bown, Mark R; Newton, Michael I

    2008-08-01

    Quartz crystal impedance analysis has been developed as a technique to assess whether room-temperature ionic liquids are Newtonian fluids and as a small-volume method for determining the values of their viscosity-density product, ρη. Changes in the impedance spectrum of a 5-MHz fundamental frequency quartz crystal induced by a water-miscible room-temperature ionic liquid, 1-butyl-3-methylimidazolium trifluoromethylsulfonate ([C4mim][OTf]), were measured. From coupled frequency shift and bandwidth changes as the concentration was varied from 0 to 100% ionic liquid, it was determined that this liquid provided a Newtonian response. A second, water-immiscible ionic liquid, 1-butyl-3-methylimidazolium bis(trifluoromethanesulfonyl)imide [C4mim][NTf2], with concentration varied using methanol, was tested and also found to provide a Newtonian response. In both cases, the values of the square root of the viscosity-density product deduced from the small-volume quartz crystal technique were consistent with those measured using a viscometer and density meter. The third harmonic of the crystal was found to provide the closest agreement between the two measurement methods; the pure ionic liquids had the largest difference, of approximately 10%. In addition, 18 pure ionic liquids were tested; good-quality frequency shift and bandwidth data were obtained for 12 of these, and all 12 had a Newtonian response. The frequency shift of the third harmonic was found to vary linearly with the square root of the viscosity-density product of the pure ionic liquids up to a value of √(ρη) ≈ 18 kg m^-2 s^-1/2, but with a slope 10% smaller than that predicted by the Kanazawa and Gordon equation. It is envisaged that the quartz crystal technique could be used in a high-throughput microfluidic system for characterizing ionic liquids.
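    The Kanazawa and Gordon equation mentioned above relates the frequency shift of a quartz crystal in a Newtonian liquid to √(ρη). A small sketch, assuming standard AT-cut quartz constants and treating the operating harmonic simply as the working frequency (the paper's harmonic-dependent corrections are not reproduced):

```python
import numpy as np

RHO_Q = 2648.0    # density of quartz, kg/m^3
MU_Q = 2.947e10   # shear modulus of AT-cut quartz, N/m^2

def kanazawa_shift(f0, rho_eta):
    """Kanazawa-Gordon frequency shift (Hz) for a QCM in a Newtonian liquid."""
    return -f0**1.5 * np.sqrt(rho_eta / (np.pi * RHO_Q * MU_Q))

def rho_eta_from_shift(f0, df):
    """Invert the relation to recover the density-viscosity product."""
    return (df / f0**1.5) ** 2 * np.pi * RHO_Q * MU_Q

f0 = 15e6                                   # third harmonic of a 5 MHz crystal
df = kanazawa_shift(f0, rho_eta=18.0**2)    # sqrt(rho*eta) = 18 kg m^-2 s^-1/2
print(f"predicted shift: {df:.0f} Hz")
print(f"recovered sqrt(rho*eta): {np.sqrt(rho_eta_from_shift(f0, df)):.1f}")
```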

  17. Approach, avoidance, and affect: A meta-analysis of approach-avoidance tendencies in manual reaction time tasks

    Directory of Open Access Journals (Sweden)

    Hans Phaf

    2014-05-01

    Approach action tendencies towards positive stimuli and avoidance tendencies away from negative stimuli are widely seen to foster survival. Many studies have shown that approach and avoidance arm movements are facilitated by positive and negative affect, respectively. There is considerable debate whether positively and negatively valenced stimuli prime approach and avoidance movements directly (i.e., immediately, unintentionally, implicitly, automatically, and stimulus-based) or indirectly (i.e., after conscious or nonconscious interpretation of the situation). The direction and size of these effects were often found to depend on whether the instructions referred to the stimulus object or the self, and on explicit vs. implicit stimulus evaluation. We present a meta-analysis of 29 studies, included for their use of strongly positive and negative stimuli, with 81 effect sizes derived solely from means and standard deviations (combined N = 1538), to examine the automaticity of the link between affective information processing and approach-avoidance, and to test whether it depends on instruction, type of approach-avoidance task, and stimulus type. Results show a significant small-to-medium-sized effect after correction for publication bias. The strongest arguments for an indirect link between affect and approach-avoidance were the absence of evidence for an effect with implicit evaluation, and the opposite directions of the effect under self- and object-related interpretations. The link appears to be influenced by conscious or nonconscious intentions to deal with affective stimuli.
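    A hedged sketch of the effect-size pipeline implied above: standardized mean differences computed solely from means and standard deviations, pooled with inverse-variance weights. The per-study numbers are hypothetical, and the simple fixed-effect pooling stands in for the full meta-analytic model:

```python
import numpy as np

def cohens_d(m1, sd1, n1, m2, sd2, n2):
    """Standardised mean difference from group means and SDs."""
    sp = np.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    return (m1 - m2) / sp

def var_d(d, n1, n2):
    """Approximate sampling variance of d."""
    return (n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2))

# Hypothetical per-study summaries: mean RT (ms), SD, n for affect-incompatible
# movements first, affect-compatible movements second.
studies = [
    (655, 95, 30, 620, 90, 30),
    (560, 85, 25, 540, 80, 25),
    (760, 120, 40, 700, 110, 40),
]

ds = np.array([cohens_d(*s) for s in studies])
vs = np.array([var_d(d, s[2], s[5]) for d, s in zip(ds, studies)])
w = 1.0 / vs                                  # inverse-variance weights
d_bar = np.sum(w * ds) / np.sum(w)
se = np.sqrt(1.0 / np.sum(w))
print(f"pooled d = {d_bar:.2f}, 95% CI [{d_bar - 1.96*se:.2f}, {d_bar + 1.96*se:.2f}]")
```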

  18. Latent growth curve analysis of fear during a speech task before and after treatment for social phobia.

    Science.gov (United States)

    Price, Matthew; Anderson, Page L

    2011-11-01

    Models of social phobia highlight the importance of anticipatory anxiety in the experience of fear during a social situation. Anticipatory anxiety has been shown to be highly correlated with performance anxiety for a variety of social situations. A few studies show that average ratings of anxiety during the anticipation and performance phases of a social situation decline following treatment. Evidence also suggests that the point of confrontation with the feared stimulus is the peak level of fear. No study to date has evaluated the pattern of anxious responding across the anticipation, confrontation, and performance phases before and after treatment, which is the focus of the current study. Socially phobic individuals (N = 51) completed a behavioral avoidance task before and after two types of manualized cognitive behavioral therapy, and gave ratings of fear during the anticipation and performance phases. Results from latent growth curve analysis were the same for the two treatments and suggested that before treatment, anxiety sharply increased during the anticipation phase, was highly elevated at the confrontation, and gradually increased during the performance phase. After treatment, anxiety increased during the anticipation phase, although at a much slower rate than at pretreatment, peaked at confrontation, and declined during the performance phase. The findings suggest that anticipatory experiences are critical to the experience of fear for public speaking and should be incorporated into exposures. PMID:21907972

  19. Use of Human Modeling Simulation Software in the Task Analysis of the Environmental Control and Life Support System Component Installation Procedures

    Science.gov (United States)

    Estes, Samantha; Parker, Nelson C. (Technical Monitor)

    2001-01-01

    Virtual reality and simulation applications are becoming widespread in human task analysis. These programs have many benefits for the Human Factors Engineering field. Not only do creating and using virtual environments for human engineering analyses save money and time, this approach also promotes user experimentation and provides increased quality of analyses. This paper explains the human engineering task analysis performed on the Environmental Control and Life Support System (ECLSS) space station rack and its Distillation Assembly (DA) subsystem using EAI's human modeling simulation software, Jack. When installed on the International Space Station (ISS), ECLSS will provide the life and environment support needed to adequately sustain crew life. The DA is an Orbital Replaceable Unit (ORU) that provides means of wastewater (primarily urine from flight crew and experimental animals) reclamation. Jack was used to create a model of the weightless environment of the ISS Node 3, where the ECLSS is housed. Computer aided drawings of the ECLSS rack and DA system were also brought into the environment. Anthropometric models of a 95th percentile male and 5th percentile female were used to examine the human interfaces encountered during various ECLSS and DA tasks. The results of the task analyses were used in suggesting modifications to hardware and crew task procedures to improve accessibility, conserve crew time, and add convenience for the crew. This paper will address some of those suggested modifications and the method of presenting final analyses for requirements verification.

  20. fMRI-constrained source analysis reveals early top-down modulations of interference processing using a flanker task.

    Science.gov (United States)

    Siemann, Julia; Herrmann, Manfred; Galashan, Daniela

    2016-08-01

    Usually, incongruent flanker stimuli provoke conflict processing, whereas congruent flankers should facilitate task performance. Various behavioral studies have reported improved or even absent conflict processing with correctly oriented selective attention. In the present study we attempted to reinvestigate these behavioral effects and to disentangle the neuronal activity patterns underlying the attentional cueing effect, taking advantage of a combination of the high temporal resolution of electroencephalography (EEG) and the spatial resolution of functional magnetic resonance imaging (fMRI). Data from 20 participants were acquired in separate sessions for each method. We expected the conflict-related N200 event-related potential (ERP) component and areas associated with flanker processing to show validity-specific modulations. Additionally, the spatio-temporal dynamics during cued flanker processing were examined using an fMRI-constrained source analysis approach. In the ERP data we found early differences in flanker processing between validity levels. An early centro-parietal relative positivity for incongruent stimuli occurred only with valid cueing during the N200 time window, while a subsequent fronto-central negativity was specific to invalidly cued interference processing. The source analysis additionally pointed to separate neural generators of these effects. Regional sources in visual areas were involved in conflict processing with valid cueing, while a regional source in the anterior cingulate cortex (ACC) seemed to contribute to the ERP differences with invalid cueing. Moreover, the ACC and precentral gyrus demonstrated an early and a late phase of congruency-related activity differences with invalid cueing. We discuss the first effect as reflecting conflict detection and response activation, while the latter more likely originated from conflict monitoring and control processes during response competition. PMID:27181762

  1. Investigation of advanced counterrotation blade configuration concepts for high speed turboprop systems. Task 4: Advanced fan section aerodynamic analysis

    Science.gov (United States)

    Crook, Andrew J.; Delaney, Robert A.

    1992-01-01

    The purpose of this study is the development of a three-dimensional Euler/Navier-Stokes flow analysis for fan section/engine geometries containing multiple blade rows and multiple spanwise flow splitters. An existing procedure developed by Dr. J. J. Adamczyk and associates at the NASA Lewis Research Center was modified to accept multiple spanwise splitter geometries and simulate engine core conditions. The procedure was also modified to allow coarse parallelization of the solution algorithm. This document is a final report outlining the development of and techniques used in the procedure. The numerical solution is based upon a finite volume technique with a four-stage Runge-Kutta time marching procedure. Numerical dissipation is used to gain solution stability but is reduced in viscous-dominated flow regions. Local time stepping and implicit residual smoothing are used to increase the rate of convergence. Multiple blade row solutions are based upon the average-passage system of equations. The numerical solutions are performed on an H-type grid system, with meshes generated by the system (TIGG3D) developed earlier under this contract. The grid generation scheme meets the average-passage requirement of maintaining a common axisymmetric mesh for each blade row grid. The analysis was run on several geometry configurations ranging from one to five blade rows and from one to four radial flow splitters. Pure internal flow solutions were obtained, as well as solutions with flow about the cowl/nacelle and various engine core flow conditions. The efficiency of the solution procedure was shown to be the same as the original analysis.
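    A minimal illustration of the time-marching scheme named above: a finite-volume update driven by a four-stage Runge-Kutta loop (Jameson-style stage coefficients), here on 1D linear advection rather than the report's 3D average-passage equations. Everything below is a toy stand-in:

```python
import numpy as np

# Toy 1D finite-volume solver for u_t + a*u_x = 0, advanced with a
# Jameson-style four-stage Runge-Kutta scheme (stage coefficients
# 1/4, 1/3, 1/2, 1), first-order upwind fluxes, periodic boundaries.
a, nx, cfl = 1.0, 200, 0.5
dx = 1.0 / nx
dt = cfl * dx / abs(a)
x = (np.arange(nx) + 0.5) * dx
u = np.exp(-200.0 * (x - 0.3) ** 2)      # initial cell averages

def residual(u):
    """-(F_right - F_left)/dx per cell, with upwind face fluxes (a > 0)."""
    f = a * np.roll(u, 1)                # flux at each cell's left face
    return -(np.roll(f, -1) - f) / dx

for _ in range(int(0.4 / dt)):
    u0 = u.copy()
    for alpha in (0.25, 1.0 / 3.0, 0.5, 1.0):   # four RK stages
        u = u0 + alpha * dt * residual(u)

print("pulse peak after transport:", float(u.max()))
```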

  2. Analysis of a finite volume element method for the Stokes problem

    OpenAIRE

    Quarteroni, Alfio; Ruiz-Baier, Ricardo

    2011-01-01

    In this paper we propose a stabilized conforming finite volume element method for the Stokes equations. In establishing the convergence of the method, optimal a priori error estimates in different norms are obtained through an adequate connection between the finite volume and stabilized finite element formulations. A superconvergence result is also derived by using a postprocessing projection method. In particular, the stabilization of the continuous lowest equal-order pair finite volum...

  3. Comparative analysis of two methods for measuring sales volumes during malaria medicine outlet surveys.

    OpenAIRE

    Patouillard, E; Kleinschmidt, I; Hanson, K.; Pok, S; Palafox, B; Tougher, S; O'Connell, K; Goodman, C.

    2013-01-01

    BACKGROUND There is increased interest in using commercial providers for improving access to quality malaria treatment. Understanding their current role is an essential first step, notably in terms of the volume of diagnostics and anti-malarials they sell. Sales volume data can be used to measure the importance of different provider and product types, frequency of parasitological diagnosis and impact of interventions. Several methods for measuring sales volumes are available, yet all have met...

  4. Comparative analysis of two methods for measuring sales volumes during malaria medicine outlet surveys

    OpenAIRE

    Patouillard, Edith; Kleinschmidt, Immo; Hanson, Kara; Pok, Sochea; Palafox, Benjamin; Tougher, Sarah; O’Connell, Kate; Goodman, Catherine

    2013-01-01

    Background There is increased interest in using commercial providers for improving access to quality malaria treatment. Understanding their current role is an essential first step, notably in terms of the volume of diagnostics and anti-malarials they sell. Sales volume data can be used to measure the importance of different provider and product types, frequency of parasitological diagnosis and impact of interventions. Several methods for measuring sales volumes are available, yet all have met...

  5. Temporal discrimination thresholds in adult-onset primary torsion dystonia: an analysis by task type and by dystonia phenotype.

    LENUS (Irish Health Repository)

    Bradley, D

    2012-01-01

    Adult-onset primary torsion dystonia (AOPTD) is an autosomal dominant disorder with markedly reduced penetrance. Sensory abnormalities are present in AOPTD and also in unaffected relatives, possibly indicating non-manifesting gene carriage (acting as an endophenotype). The temporal discrimination threshold (TDT) is the shortest time interval at which two stimuli are detected to be asynchronous. We aimed to compare the sensitivity and specificity of three different TDT tasks (visual, tactile and mixed/visual-tactile). We also aimed to examine the sensitivity of TDTs in different AOPTD phenotypes. To examine tasks, we tested TDT in 41 patients and 51 controls using visual (2 lights), tactile (non-painful electrical stimulation) and mixed (1 light, 1 electrical) stimuli. To investigate phenotypes, we examined 71 AOPTD patients (37 cervical dystonia, 14 writer's cramp, 9 blepharospasm, 11 spasmodic dysphonia) and 8 musician's dystonia patients. The upper limit of normal was defined as the control mean + 2.5 SD. In dystonia patients, the visual task detected abnormalities in 35/41 (85%), the tactile task in 35/41 (85%) and the mixed task in 26/41 (63%); the mixed task was less sensitive than the other two (p = 0.04). Specificity was 100% for the visual and tactile tasks. Abnormal TDTs were found in 36 of 37 (97.3%) cervical dystonia, 12 of 14 (85.7%) writer's cramp, 8 of 9 (88.9%) blepharospasm and 10 of 11 (90.9%) spasmodic dysphonia patients, and in 5 of 8 (62.5%) musicians. The visual and tactile tasks were found to be more sensitive than the mixed task. Temporal discrimination threshold results were comparable across common adult-onset primary torsion dystonia phenotypes, with lower sensitivity in the musicians.

  6. Molar Volume Analysis of Molten Ni-Al-Co Alloy by Measuring the Density

    Institute of Scientific and Technical Information of China (English)

    XIAO Feng; FANG Liang; FU Yuechao; YANG Lingchuan

    2004-01-01

    The density of molten Ni-Al-Co alloys was measured in the temperature range 1714–1873 K using a modified pycnometric method, and the molar volume of the molten alloys was analyzed. The density of molten Ni-Al-Co alloys was found to decrease with increasing temperature and Co concentration in the alloys. The molar volume of molten Ni-Al-Co alloys increases with increasing Co concentration in the alloys. The molar volume of molten Ni-Al-Co alloys shows a negative deviation from the linear molar volume.
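    The molar-volume bookkeeping used here is simple enough to show directly: V_m = Σ x_i M_i / ρ, compared against the mole-fraction-weighted linear value. Composition, density, and pure-metal volumes below are rough illustrative numbers, not the paper's data:

```python
# Molar volume of a ternary melt from a measured density, and its deviation
# from the linear (ideal-mixing) reference value.
M = {"Ni": 58.69, "Al": 26.98, "Co": 58.93}     # molar masses, g/mol

def molar_volume(x, rho):
    """V_m = sum(x_i * M_i) / rho; rho in g/cm^3 gives V_m in cm^3/mol."""
    return sum(x[el] * M[el] for el in x) / rho

def linear_molar_volume(x, v_pure):
    """Mole-fraction-weighted pure-metal molar volumes (ideal mixing)."""
    return sum(x[el] * v_pure[el] for el in x)

x = {"Ni": 0.70, "Al": 0.20, "Co": 0.10}        # hypothetical composition
v_pure = {"Ni": 7.43, "Al": 11.3, "Co": 7.60}   # cm^3/mol near melting, approx.
rho = 7.6                                       # hypothetical measured density

vm = molar_volume(x, rho)
print(f"V_m = {vm:.2f} cm^3/mol, deviation from linear: "
      f"{vm - linear_molar_volume(x, v_pure):+.2f} cm^3/mol")
```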

  7. Site specific analysis of geothermal development-data files of prospective sites. Volume II

    Energy Technology Data Exchange (ETDEWEB)

    Trehan, R.; Cohen, A.; Gupta, J.; Jacobsen, W.; Leigh, J.; True, S.

    1978-08-01

    Development scenarios for 37 hydrothermal and geopressured prospects in the United States were analyzed to assist DOE's Division of Geothermal Energy in mission-oriented planning of geothermal resource development. This second volume of the three-volume series contains the detailed site-specific analyses in terms of technological, economic, and other requirements for meeting the postulated schedules. This presentation should be used in conjunction with Volume III, which contains detailed descriptive data files for each of the 37 prospects. These data files were used for the analyses contained in Volume II and should be useful for other geothermal resource studies. (JGB)

  8. Predictability and Market Efficiency in Agricultural Futures Markets: a Perspective from Price-Volume Correlation Based on Wavelet Coherency Analysis

    Science.gov (United States)

    He, Ling-Yun; Wen, Xing-Chun

    2015-12-01

    In this paper, we use a time-frequency domain technique, namely wavelet squared coherency, to examine the associations between the trading volumes of three agricultural futures and three different forms of these futures' daily closing prices, i.e. prices, returns and volatilities, over the past several years. These agricultural futures markets are selected from China as a typical case of the emerging countries, and from the US as a representative of the developed economies. We investigate correlations and lead-lag relationships between the trading volumes and the prices to detect the predictability and efficiency of these futures markets. The results suggest that the information contained in the trading volumes of the three agricultural futures markets in China can be applied to predict the prices or returns, while that in the US has extremely weak predictive power for prices or returns. We also conduct the wavelet analysis on the relationships between the volumes and returns or volatilities to examine the existence of the two "stylized facts" proposed by Karpoff [J. M. Karpoff, The relation between price changes and trading volume: A survey, J. Financ. Quant. Anal. 22(1) (1987) 109-126]. Different markets in the two countries perform differently in reproducing the two stylized facts. As the wavelet tools can decode nonlinear regularities and hidden patterns behind the price-volume relationship in time-frequency space, unlike the conventional econometric framework, this paper offers a new perspective on market predictability and efficiency.

  9. A Comparative Analysis of 2D and 3D Tasks for Virtual Reality Therapies Based on Robotic-Assisted Neurorehabilitation for Post-stroke Patients.

    Science.gov (United States)

    Lledó, Luis D; Díez, Jorge A; Bertomeu-Motos, Arturo; Ezquerro, Santiago; Badesa, Francisco J; Sabater-Navarro, José M; García-Aracil, Nicolás

    2016-01-01

    Post-stroke neurorehabilitation based on virtual therapies is performed by completing repetitive exercises shown on visual electronic devices, whose content represents imaginary or daily-life tasks. Currently, there are two ways of visualizing these tasks. 3D virtual environments are used to create a three-dimensional space that represents the real world with a high level of detail, whose realism is determined by the resolution and fidelity of the objects in the task. 2D virtual environments, in contrast, represent the tasks with a low degree of realism using bidimensional graphics techniques. However, the type of visualization can influence the quality of perception of the task, affecting the patient's sensorimotor performance. The purpose of this paper was to evaluate whether there were differences in the patterns of kinematic movements when post-stroke patients performed a reaching task while viewing a virtual therapeutic game with two different types of visualization of the virtual environment: 2D and 3D. Nine post-stroke patients participated in the study, receiving virtual therapy assisted by the PUPArm rehabilitation robot. Horizontal movements of the upper limb were performed to complete the aim of the tasks, which consisted of reaching peripheral or perspective targets depending on the virtual environment shown. Parameters such as maximum speed, reaction time, path length, and initial movement were analyzed from the data acquired objectively by the robotic device to evaluate the influence of task visualization. At the end of the study, a usability survey was given to each patient to assess his or her satisfaction level. For all patients, the movement trajectories improved as they completed the therapy, suggesting that the patients' motor recovery increased. Despite the similarity of most of the kinematic parameters, differences in reaction time and path length were larger in the 3D task. Regarding the success rates…
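    A sketch of how the kinematic parameters named above (path length, maximum speed, reaction time) can be computed from a sampled end-effector trajectory. The movement-onset threshold and the synthetic trajectory are assumptions for illustration:

```python
import numpy as np

def kinematic_summary(t, xy, speed_thresh=0.05):
    """Path length, peak speed, and reaction time from a sampled 2D
    end-effector trajectory (t in s, xy as an (n, 2) array in metres).
    The onset speed threshold is an assumed convention."""
    d = np.diff(xy, axis=0)
    seg = np.linalg.norm(d, axis=1)
    speed = seg / np.diff(t)
    onset = np.argmax(speed > speed_thresh)   # first sample above threshold
    return {
        "path_length_m": seg.sum(),
        "max_speed_m_s": speed.max(),
        "reaction_time_s": t[onset],
    }

# Hypothetical reaching movement sampled at 100 Hz: still for 0.3 s,
# then a smooth 0.5 m reach along the x axis.
t = np.linspace(0, 1.5, 151)
xy = np.column_stack([0.25 * (1 - np.cos(np.pi * np.clip(t - 0.3, 0, 1))),
                      np.zeros_like(t)])
print(kinematic_summary(t, xy))
```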

  11. A comparative analysis of 2D and 3D tasks for virtual reality therapies based on robotic-assisted neurorehabilitation for post-stroke patients

    Directory of Open Access Journals (Sweden)

    Luis Daniel Lledó

    2016-08-01

    Post-stroke neurorehabilitation based on virtual therapies is performed by completing repetitive exercises shown on visual electronic devices, whose content represents imaginary or daily-life tasks. Currently, there are two ways of visualizing these tasks. 3D virtual environments are used to create a three-dimensional space that represents the real world with a high level of detail, whose realism is determined by the resolution and fidelity of the objects in the task. 2D virtual environments, in contrast, represent the tasks with a low degree of realism using bidimensional graphics techniques. However, the type of visualization can influence the quality of perception of the task, affecting the patient's sensorimotor performance. The purpose of this paper was to evaluate whether there were differences in the patterns of kinematic movements when post-stroke patients performed a reaching task while viewing a virtual therapeutic game with two different types of visualization of the virtual environment: 2D and 3D. Nine post-stroke patients participated in the study, receiving virtual therapy assisted by the PUPArm rehabilitation robot. Horizontal movements of the upper limb were performed to complete the aim of the tasks, which consisted of reaching peripheral or perspective targets depending on the virtual environment shown. Parameters such as maximum speed, reaction time, path length and initial movement were analyzed from the data acquired objectively by the robotic device to evaluate the influence of task visualization. At the end of the study, a usability survey was given to each patient to assess his or her satisfaction level. For all patients, the movement trajectories improved as they completed the therapy, suggesting that the patients' motor recovery increased. Despite the similarity of most of the kinematic parameters, differences in reaction time and path length were larger in the 3D task. Regarding…

  12. Quantitative gait analysis under dual-task in older people with mild cognitive impairment: a reliability study

    Directory of Open Access Journals (Sweden)

    Gutmanis Iris

    2009-09-01

    Background: Reliability of quantitative gait assessment while dual-tasking (walking while doing a secondary task such as talking) in people with cognitive impairment is unknown. Dual-task gait assessment is becoming highly important for mobility research with older adults, since it better reflects their performance in basic activities of daily living. Our purpose was to establish the test-retest reliability of assessing quantitative gait variables using an electronic walkway in older adults with mild cognitive impairment (MCI) under single- and dual-task conditions. Methods: The gait performance of 11 elderly individuals with MCI was evaluated using an electronic walkway (GAITRite® System) in two sessions, one week apart. Six gait parameters (gait velocity, step length, stride length, step time, stride time, and double support time) were assessed under two conditions: single-task (sG: usual walking) and dual-task (dG: counting backwards from 100 while walking). Test-retest reliability was determined using the intra-class correlation coefficient (ICC). Gait variability was measured using the coefficient of variation (CoV). Results: Eleven participants (average age = 76.6 years, SD = 7.3) were assessed. They were high functioning (Clinical Dementia Rating score = 0.5) with a mean Mini-Mental Status Exam (MMSE) score of 28 (SD = 1.56) and a mean Montreal Cognitive Assessment (MoCA) score of 22.8 (SD = 1.23). Under dual-task conditions, mean gait velocity (GV) decreased significantly (sGV = 119.11 ± 20.20 cm/s; dGV = 110.88 ± 19.76 cm/s; p = 0.005). Additionally, under dual-task conditions, higher gait variability was found in stride time, step time, and double support time. Test-retest reliability was high (ICC > 0.85) for the six parameters evaluated under both conditions. Conclusion: In older people with MCI, variability of time-related gait parameters increased with dual-tasking, suggesting cognitive control of gait performance. Assessment of quantitative gait…
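    The two derived quantities in this record, the coefficient of variation and the dual-task decline in gait velocity, are straightforward to compute; a small sketch with hypothetical stride times (the velocity figures are the abstract's own means), where the dual-task cost measure is a common convention rather than one reported in the abstract:

```python
import numpy as np

def cov_percent(x):
    """Coefficient of variation (%), the gait-variability measure used above."""
    x = np.asarray(x, dtype=float)
    return 100.0 * x.std(ddof=1) / x.mean()

def dual_task_cost(single, dual):
    """Relative decline under dual-task (%); a common derived measure."""
    return 100.0 * (single - dual) / single

# Hypothetical stride-time series (s) for one participant:
stride_single = [1.05, 1.07, 1.04, 1.06, 1.05, 1.08]
stride_dual = [1.10, 1.18, 1.06, 1.21, 1.09, 1.16]

print(f"CoV single-task: {cov_percent(stride_single):.1f}%")
print(f"CoV dual-task:   {cov_percent(stride_dual):.1f}%")
print(f"dual-task cost on velocity: {dual_task_cost(119.11, 110.88):.1f}%")
```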

  13. Probabilistic accident consequence uncertainty analysis -- Late health effects uncertainty assessment. Volume 2: Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Little, M.P.; Muirhead, C.R. [National Radiological Protection Board (United Kingdom); Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequence from the accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA late health effects models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the expert panel on late health effects, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  14. Left ventricular time volume curve analysis in the detection of limited ischaemic heart disease

    International Nuclear Information System (INIS)

    The aim of the study was to determine whether limited coronary artery disease (CAD) could be accurately detected using the Cardiac Gated Blood Pool (CGBP) scan with exercise. Regional left ventricular time volume curves (RLTVC) were generated from 52 studies (46 patients: 22 normal, 24 abnormal). The parameters assessed, both globally and regionally and at rest (R) and exercise (Ex), were: (1) the ejection fraction (EF); (2) the change in ejection fraction from R to Ex (δEF); (3) an early contraction index (ECI); (4) a maximal emptying index (DR); and (5) a maximal refilling index (AR). After careful analysis of these parameters it was decided that our diagnostic criteria would rely on the following: (1) the EF at R and Ex; (2) the δEF; (3) the ECI at Ex; and (4) the AR at Ex. This study showed that both the sensitivity and the specificity of the CGBP scan can be improved considerably by the inclusion of RLTVC over the levels obtained when the EF parameters alone are considered. It is possible with this technique to accurately diagnose limited CAD. (Author)
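    A sketch of extracting EF-type indices from a time-volume curve. The DR and AR definitions below (peak dV/dt normalised by end-diastolic volume) are assumptions, since the abstract does not define the indices precisely, and the one-beat curve is synthetic:

```python
import numpy as np

def tv_curve_indices(t, v):
    """EF plus simple emptying/refilling indices from a time-volume curve."""
    edv, esv = v.max(), v.min()
    dvdt = np.gradient(v, t)
    return {
        "EF": (edv - esv) / edv,
        "DR": -dvdt.min() / edv,   # peak emptying rate, EDV/s (assumed form)
        "AR": dvdt.max() / edv,    # peak refilling rate, EDV/s (assumed form)
    }

# Hypothetical one-beat curve: 120 mL EDV, 60 mL ESV at 0.3 s,
# refilled by 0.6 s, then diastasis.
t = np.linspace(0, 1.0, 50)
v = 120 - 60 * np.sin(np.pi * np.clip(t / 0.6, 0, 1)) ** 2
print(tv_curve_indices(t, v))
```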

  15. An Analysis of Finite-Difference and Finite-Volume Formulations of Conservation Laws

    Science.gov (United States)

    Vinokur, Marcel

    1989-03-01

    Finite-difference and finite-volume formulations are analyzed in order to clear up the confusion concerning their application to the numerical solution of conservation laws. A new coordinate-free formulation of systems of conservation laws is developed, which clearly distinguishes the role of physical vectors from that of algebraic vectors which characterize the system. The analysis considers general types of equations: potential, Euler, and Navier-Stokes. Three-dimensional unsteady flows with time-varying grids are described using a single, consistent nomenclature for both formulations. Grid motion due to a non-inertial reference frame as well as flow adaptation is covered. In comparing the two formulations, it is found useful to distinguish between differences in numerical methods and differences in grid definition. The former plays a role for non-Cartesian grids and results in only cosmetic differences in the manner in which geometric terms are handled. The differences in grid definition for the two formulations are found to be more important, since they affect the manner in which boundary conditions, zonal procedures, and grid singularities are handled at computational boundaries. The proper interpretation of strong and weak conservation-law forms for quasi-one-dimensional and axisymmetric flows is brought out.

  16. An analysis of finite-difference and finite-volume formulations of conservation laws

    Science.gov (United States)

    Vinokur, Marcel

    1986-06-01

    Finite-difference and finite-volume formulations are analyzed in order to clear up the confusion concerning their application to the numerical solution of conservation laws. A new coordinate-free formulation of systems of conservation laws is developed, which clearly distinguishes the role of physical vectors from that of algebraic vectors which characterize the system. The analysis considers general types of equations: potential, Euler, and Navier-Stokes. Three-dimensional unsteady flows with time-varying grids are described using a single, consistent nomenclature for both formulations. Grid motion due to a non-inertial reference frame as well as flow adaptation is covered. In comparing the two formulations, it is found useful to distinguish between differences in numerical methods and differences in grid definition. The former plays a role for non-Cartesian grids, and results in only cosmetic differences in the manner in which geometric terms are handled. The differences in grid definition for the two formulations are found to be more important, since they affect the manner in which boundary conditions, zonal procedures, and grid singularities are handled at computational boundaries. The proper interpretation of strong and weak conservation-law forms for quasi-one-dimensional and axisymmetric flows is brought out.

  17. Two-dimensional thermal analysis of a fuel rod by finite volume method

    Energy Technology Data Exchange (ETDEWEB)

    Costa, Rhayanne Y.N.; Silva, Mario A.B. da; Lira, Carlos A.B. de O., E-mail: ryncosta@gmail.com, E-mail: mabs500@gmail.com, E-mail: cabol@ufpe.br [Universidade Federal de Pernambuco (UFPE), Recife, PE (Brazil). Departamaento de Energia Nuclear

    2015-07-01

    In a nuclear reactor, the amount of power generation is limited by thermal and physical limitations rather than by nuclear parameters. The operation of a reactor core, considering the best heat removal system, must take into account the fact that the temperatures of fuel and cladding shall not exceed safety limits anywhere in the core. If these limits are not respected, damage to the fuel element may release large quantities of radioactive material into the coolant or even lead to core meltdown. Thermal analyses of fuel rods are often accomplished by considering the one-dimensional heat diffusion equation. The aim of this study is to verify the temperature distribution for a two-dimensional heat transfer problem in an advanced reactor. The methodology is based on the Finite Volume Method (FVM), which considers a balance of the property of interest over each control volume. The validation of the methodology is made by comparing numerical and analytical solutions. For the two-dimensional analysis, the results indicate that the temperature profile agrees with the expected physical behaviour, providing quantitative information for the development of advanced reactors. (author)
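    A minimal two-dimensional finite-volume heat-conduction sketch in the spirit of the paper's method: each interior cell balances its four face fluxes against a volumetric source, solved by Jacobi iteration. Geometry, properties, and the source term are illustrative, not the paper's fuel-rod model:

```python
import numpy as np

nx = ny = 40
dx = dy = 0.00025        # m; a 1 cm x 1 cm cross-section
k = 3.0                  # W/(m K), ballpark conductivity for oxide fuel
q = 3.0e8                # W/m^3, volumetric heat source
T_wall = 600.0           # K, fixed boundary temperature

T = np.full((nx, ny), T_wall)
for _ in range(20_000):
    Tn = T.copy()
    # Interior cell balance: sum of the four face fluxes plus source = 0,
    # i.e. T_P = (T_E + T_W + T_N + T_S + q*dx*dy/k) / 4 on a uniform grid.
    T[1:-1, 1:-1] = 0.25 * (Tn[2:, 1:-1] + Tn[:-2, 1:-1] +
                            Tn[1:-1, 2:] + Tn[1:-1, :-2] +
                            q * dx * dy / k)

print(f"peak temperature: {T.max():.0f} K")
```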

  18. Office for Analysis and Evaluation of Operational Data. 1992 annual report: Nonreactors: Volume 7, No. 2

    Energy Technology Data Exchange (ETDEWEB)

    1993-10-01

    The annual report of the US Nuclear Regulatory Commission's Office for Analysis and Evaluation of Operational Data (AEOD) is devoted to the activities performed during 1992. The report is published in two separate parts. NUREG-1272, Vol. 7, No. 1, covers power reactors and presents an overview of the operating experience of the nuclear power industry from the NRC perspective, including comments about the trends of some key performance measures. The report also includes the principal findings and issues identified in AEOD studies over the past year and summarizes information from such sources as licensee event reports, diagnostic evaluations, and reports to the NRC's Operations Center. NUREG-1272, Vol. 7, No. 2, covers nonreactors and presents a review of the events and concerns during 1992 associated with the use of licensed material in nonreactor applications, such as personnel overexposures and medical misadministrations. Both reports also contain a discussion of the Incident Investigation Team program and summarize both the Incident Investigation Team and Augmented Inspection Team reports. Each volume contains a list of the AEOD reports issued for 1981--1992.

  19. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for internal dosimetry. Volume 2: Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Harrison, J.D. [National Radiological Protection Board (United Kingdom); Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)

    1998-04-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequence from the accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA internal dosimetry models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on internal dosimetry, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  20. A statistical approach to the initial volume problem in Single Particle Analysis by Electron Microscopy.

    Science.gov (United States)

    Sorzano, C O S; Vargas, J; de la Rosa-Trevín, J M; Otón, J; Álvarez-Cabrera, A L; Abrishami, V; Sesmero, E; Marabini, R; Carazo, J M

    2015-03-01

    Cryo-electron microscopy is a powerful structural biology technique, allowing the elucidation of the three-dimensional structure of biological macromolecules. In particular, the structural study of purified macromolecules, often referred to as Single Particle Analysis (SPA), is normally performed through an iterative process that needs a first estimate of the three-dimensional structure, which is progressively refined using experimental data. This refinement is well known to be a local optimisation, so that the initial choice of the first structure may substantially change the final result. Computational algorithms aiming to provide this first structure already exist. However, the question is far from settled, and more robust algorithms are still needed so that the refinement process can be performed with sufficient guarantees. In this article we present a new algorithm that addresses the initial volume problem in SPA by setting it in a Weighted Least Squares framework and calculating the weights through a statistical approach based on the cumulative density function of different image similarity measures. We show that the new algorithm is significantly more robust than other state-of-the-art algorithms currently in use in the field. The algorithm is available as part of the software suites Xmipp (http://xmipp.cnb.csic.es) and Scipion (http://scipion.cnb.csic.es) under the name "Significant".
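    A sketch of the weighting idea described above: per-image similarity scores converted to weights through their empirical cumulative distribution function, then used in a weighted least-squares solve. The linear system below is a generic stand-in for the reconstruction step, not the Significant algorithm itself:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 200, 5
A = rng.normal(size=(n, p))                  # stand-in design/projection matrix
x_true = rng.normal(size=p)
b = A @ x_true + rng.normal(scale=0.1, size=n)
b[::20] += 3.0                               # a few grossly inconsistent "images"

# Similarity score per observation: worse fit -> lower similarity.
x_ols, *_ = np.linalg.lstsq(A, b, rcond=None)
similarity = -np.abs(b - A @ x_ols)

# Weight each observation by the empirical CDF value of its similarity,
# so outliers (lowest similarity) get the smallest weights.
w = (similarity.argsort().argsort() + 1) / n

sw = np.sqrt(w)
x_wls, *_ = np.linalg.lstsq(A * sw[:, None], b * sw, rcond=None)
print("OLS error:", float(np.linalg.norm(x_ols - x_true)))
print("WLS error:", float(np.linalg.norm(x_wls - x_true)))
```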

  1. Probabilistic accident consequence uncertainty analysis -- Early health effects uncertainty assessment. Volume 2: Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Haskin, F.E. [Univ. of New Mexico, Albuquerque, NM (United States); Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Goossens, L.H.J.; Kraan, B.C.P. [Delft Univ. of Technology (Netherlands)

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequence from the accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA early health effects models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on early health effects, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  2. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for deposited material and external doses. Volume 2: Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Boardman, J. [AEA Technology (United Kingdom); Jones, J.A. [National Radiological Protection Board (United Kingdom); Harper, F.T.; Young, M.L. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequence from the accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA deposited material and external dose models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on deposited material and external doses, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  3. Office for Analysis and Evaluation of Operational Data 1996 annual report. Volume 10, Number 1: Reactors

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-12-01

    This annual report of the US Nuclear Regulatory Commission's Office for Analysis and Evaluation of Operational Data (AEOD) describes activities conducted during 1996. The report is published in three parts. NUREG-1272, Vol. 10, No. 1, covers power reactors and presents an overview of the operating experience of the nuclear power industry from the NRC perspective, including comments about trends of some key performance measures. The report also includes the principal findings and issues identified in AEOD studies over the past year and summarizes information from such sources as licensee event reports and reports to the NRC's Operations Center. NUREG-1272, Vol. 10, No. 2, covers nuclear materials and presents a review of the events and concerns during 1996 associated with the use of licensed material in nonreactor applications, such as personnel overexposures and medical misadministrations. Both reports also contain a discussion of the Incident Investigation Team program and summarize both the Incident Investigation Team and Augmented Inspection Team reports. Each volume contains a list of the AEOD reports issued from CY 1980 through 1996. NUREG-1272, Vol. 10, No. 3, covers technical training and presents the activities of the Technical Training Center in support of the NRC's mission in 1996.

  4. Office for Analysis and Evaluation of Operational Data 1993 annual report: Volume 8, Number 1

    International Nuclear Information System (INIS)

    This annual report of the US Nuclear Regulatory Commission's Office for Analysis and Evaluation of Operational Data (AEOD) describes activities conducted during 1993. The report is published in two parts. NUREG-1272, Vol. 8, No. 1, covers power reactors and presents an overview of the operating experience of the nuclear power industry from the NRC perspective, including comments about the trends of some key performance measures. The report also includes the principal findings and issues identified in AEOD studies over the past year and summarizes information from such sources as licensee event reports, diagnostic evaluations, and reports to the NRC's Operations Center. NUREG-1272, Vol. 8, No. 2, covers nuclear materials and presents a review of the events and concerns during 1993 associated with the use of licensed material in nonreactor applications, such as personnel overexposures and medical misadministrations. Both reports also contain a discussion of the Incident Investigation Team program and summarize both the Incident Investigation Team and Augmented Inspection Team reports. Each volume contains a list of the AEOD reports issued from 1980 through 1993

  5. Parameter Analysis of the VPIN (Volume synchronized Probability of Informed Trading) Metric

    Energy Technology Data Exchange (ETDEWEB)

    Song, Jung Heon; Wu, Kesheng; Simon, Horst D.

    2014-03-01

    VPIN (Volume-synchronized Probability of Informed trading) is a leading indicator of liquidity-induced volatility. It is best known for having produced a signal hours before the Flash Crash of 2010. On that day, the market saw the biggest one-day point decline in the Dow Jones Industrial Average, which culminated in roughly $1 trillion of market value disappearing, only to recover those losses twenty minutes later (Lauricella 2010). The computation of VPIN requires the user to set a handful of free parameters. The values of these parameters significantly affect the effectiveness of VPIN as measured by the false positive rate (FPR). An earlier publication reported that a brute-force search of simple parameter combinations yielded a number of parameter combinations with an FPR of 7%. This work is a systematic attempt to find an optimal parameter set using an optimization package, NOMAD (Nonlinear Optimization by Mesh Adaptive Direct Search) by Audet, Le Digabel, and Tribes (2009) and Le Digabel (2011). We have implemented a number of techniques to reduce the computation time with NOMAD. Tests show that we can reduce the FPR to only 2%. To better understand the parameter choices, we have conducted a series of sensitivity analyses via uncertainty quantification on the parameter spaces using UQTK (Uncertainty Quantification Toolkit). Results have shown the dominance of two parameters in the computation of the FPR. Using the outputs from the NOMAD optimization and the sensitivity analysis, we recommend a range of values for each of the free parameters that perform well on a large set of futures trading records.
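    For orientation, a toy VPIN computation showing where the free parameters enter (bucket size and window length are exactly the kind of parameters tuned in this work). The tick-rule classification below is a simplification of the bulk volume classification used in the VPIN literature, and all data are synthetic:

```python
import numpy as np

def vpin(price, volume, bucket_size, window):
    """Toy VPIN: tick-rule buy/sell classification, equal-volume buckets,
    rolling mean of absolute order-flow imbalance over `window` buckets."""
    sign = np.sign(np.diff(price, prepend=price[0]))
    sign[sign == 0] = 1
    buy = np.where(sign > 0, volume, 0.0)

    imbalances, vb, vtot = [], 0.0, 0.0
    for b, v in zip(buy, volume):
        vb, vtot = vb + b, vtot + v
        if vtot >= bucket_size:                 # close the current bucket
            imbalances.append(abs(2 * vb - vtot) / vtot)  # |Vbuy - Vsell|/V
            vb = vtot = 0.0
    imb = np.array(imbalances)
    return np.convolve(imb, np.ones(window) / window, mode="valid")

rng = np.random.default_rng(1)
price = 100 + np.cumsum(rng.normal(scale=0.01, size=5000))
volume = rng.integers(1, 100, size=5000).astype(float)
print("last VPIN value:", round(float(vpin(price, volume, 2000, 50)[-1]), 3))
```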

  6. Analysis of end-systolic pressure-volume relation by gated radionuclide angiocardiography

    International Nuclear Information System (INIS)

    The left ventricular end-systolic pressure-volume relation has been proved experimentally to be a useful index of left ventricular contractility that is relatively independent of preload and afterload. But its clinical application has been limited because of its invasive nature, and we evaluated this relationship non-invasively, using gated radionuclide angiocardiography for volume determination and a cuff sphygmomanometer on the arm for pressure measurement. Gated equilibrium blood pool scintigrams were obtained at rest and during intravenous infusion of angiotensin or nitrate. Ventricular volumes were derived from ventricular activity and peripheral blood volume and activity. The relations of peak systolic pressure (PSP) by the cuff method to end-systolic volume index (ESVI) showed good linearity (r > 0.930 in 84% of 50 consecutive cases), and their slopes were shallower in the groups with more impaired left ventricular function. Emax was related exponentially to ejection fraction (EF) and hyperbolically to end-diastolic volume index. The dead volume (V0) was not fixed, took positive or negative values, and was not related to EF under the control condition. PSP/ESVI in each loading condition was less variable with alterations of blood pressure than EF. A linear relation was found between PSP/ESVI under the control condition and Emax (PSP/ESVI = 0.651·Emax + 0.958, r = 0.841, p < 0.001). Thus, in measuring ventricular volume, gated radionuclide angiocardiography is a non-invasive method little affected by the geometry of the left ventricle. Non-invasive determination of the end-systolic pressure-volume relation, using volumes from radionuclide imaging and blood pressure from the cuff method, is clinically useful in the assessment of left ventricular contractility. (author)
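    The end-systolic line itself is a one-line fit: regressing PSP on ESVI across loading conditions gives Emax as the slope and the dead volume as the volume-axis intercept. The three data points below are hypothetical; the final line reuses the abstract's own regression to back out Emax from a single-condition PSP/ESVI ratio:

```python
import numpy as np

# Hypothetical (ESVI, PSP) points across loading conditions:
esvi = np.array([25.0, 32.0, 40.0])     # mL/m^2: nitrate / control / angiotensin
psp = np.array([85.0, 105.0, 128.0])    # mmHg

emax, intercept = np.polyfit(esvi, psp, 1)   # PSP = Emax * ESVI + c
v0 = -intercept / emax                       # volume-axis intercept ("dead volume")
print(f"Emax = {emax:.2f} mmHg/(mL/m^2), V0 = {v0:.1f} mL/m^2")

# Inverting the abstract's regression PSP/ESVI = 0.651*Emax + 0.958:
psp_over_esvi = 105.0 / 32.0                 # control-condition ratio
print("Emax from PSP/ESVI:", round((psp_over_esvi - 0.958) / 0.651, 2))
```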

  7. Whole-brain, time-locked activation with simple tasks revealed using massive averaging and model-free analysis

    Science.gov (United States)

    Gonzalez-Castillo, Javier; Saad, Ziad S.; Handwerker, Daniel A.; Inati, Souheil J.; Brenowitz, Noah; Bandettini, Peter A.

    2012-01-01

    The brain is the body's largest energy consumer, even in the absence of demanding tasks. Electrophysiologists report ongoing neuronal firing during stimulation or task in regions beyond those of primary relationship to the perturbation. Although the biological origin of consciousness remains elusive, it is argued that it emerges from complex, continuous whole-brain neuronal collaboration. Despite converging evidence suggesting the whole brain is continuously working and adapting to anticipate and actuate in response to the environment, over the last 20 years task-based functional MRI (fMRI) has emphasized a localizationist view of brain function, with fMRI showing only a handful of activated regions in response to task/stimulation. Here, we challenge that view with evidence that under optimal noise conditions, fMRI activations extend well beyond areas of primary relationship to the task, and blood-oxygen-level-dependent signal changes correlated with task timing appear in over 95% of the brain for a simple visual stimulation plus attention control task. Moreover, we show that response shape varies substantially across regions, and that whole-brain parcellations based on those differences produce distributed clusters that are anatomically and functionally meaningful, symmetrical across hemispheres, and reproducible across subjects. These findings highlight the exquisite detail lying in fMRI signals beyond what is normally examined, and emphasize both the pervasiveness of false negatives, and how the sparseness of fMRI maps is not a result of localized brain function, but a consequence of high noise and overly strict predictive response models. PMID:22431587

  8. Impact of Non-Gaussian Error Volumes on Conjunction Assessment Risk Analysis

    Science.gov (United States)

    Ghrist, Richard W.; Plakalovic, Dragan

    2012-01-01

    An understanding of how an initially Gaussian error volume becomes non-Gaussian over time is an important consideration for space-vehicle conjunction assessment. Traditional assumptions applied to the error volume artificially suppress the true non-Gaussian nature of the space-vehicle position uncertainties. For typical conjunction assessment objects, representation of the error volume by a state error covariance matrix in a Cartesian reference frame is a more significant limitation than the assumption of linearized dynamics for propagating the error volume. In this study, the impact of each assumption is examined and isolated for each point in the volume. Limitations arising from representing the error volume in a Cartesian reference frame are corrected by employing a Monte Carlo approach to the probability of collision (Pc), using equinoctial samples from the Cartesian position covariance at the time of closest approach (TCA) between the pair of space objects. A set of actual, higher-risk (Pc ≥ 10^-4) conjunction events in various low-Earth orbits is analyzed using Monte Carlo methods. The impact of non-Gaussian error volumes on Pc for these cases is minimal, even when the deviation from a Gaussian distribution is significant.
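    A minimal sketch of the Monte Carlo probability-of-collision estimate described above: sample relative positions at TCA from the combined position covariance and count samples inside the combined hard-body radius. The paper samples in equinoctial elements; sampling directly in Cartesian coordinates here is a simplification, and all values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(7)
miss_mean = np.array([120.0, 40.0, -30.0])   # m, nominal relative position at TCA
cov = np.diag([80.0, 200.0, 60.0]) ** 2      # m^2, combined position covariance
hbr = 20.0                                   # m, combined hard-body radius

# Draw relative-position samples and count those inside the hard-body sphere.
samples = rng.multivariate_normal(miss_mean, cov, size=2_000_000)
pc = np.mean(np.linalg.norm(samples, axis=1) < hbr)
print(f"Monte Carlo Pc = {pc:.2e}")
```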

  9. Percutaneous Vertebroplasty for Compression Fracture: Analysis of Vertebral Body Volume by CT Volumetry

    Energy Technology Data Exchange (ETDEWEB)

    Komemushi, A.; Tanigawa, N.; Kariya, S.; Kojima, H.; Shomura, Y.; Sawada, S. [Kansai Medical Univ., Osaka (Japan). Dept. of Radiology

    2005-05-01

    Purpose: To evaluate the relationships between the volume of vertebral bodies with compression fracture (measured by CT volumetry) before percutaneous vertebroplasty, the amount of bone cement injected, and the effect of treatment. Material and Methods: We examined 49 consecutive patients, with 104 vertebral body compression fractures, who underwent percutaneous injection of bone cement. Vertebral body volume was measured by CT volumetry. The patient's pain level was assessed using a visual analog scale (VAS) before and after the procedure. Improvement in VAS was defined as the decrease in VAS after the procedure. Relationships between vertebral body volume, the amount of bone cement, and the effect of treatment were evaluated using Pearson's correlation coefficient test. Results: Average vertebral body volume was 26.3 ± 8.1 cm³; the average amount of bone cement was 3.2 ± 1.1 ml; and the average improvement in VAS was 4.9 ± 2.7. More bone cement was injected into vertebral bodies of greater volume, with a significant positive correlation between vertebral body volume and amount of bone cement (r ≈ 0.44; P < 0.0001). However, there was no correlation between vertebral body volume and improvement in VAS, or between amount of bone cement and improvement in VAS. Conclusion: In percutaneous vertebroplasty for vertebral body compression fracture, there is a positive correlation between vertebral body volume and amount of bone cement, but improvement in VAS does not correlate with vertebral body volume or amount of bone cement.

  10. Cosmetology: Task Analyses. Competency-Based Education.

    Science.gov (United States)

    Henrico County Public Schools, Glen Allen, VA. Virginia Vocational Curriculum Center.

    These task analyses are designed to be used in combination with the "Trade and Industrial Education Service Area Resource" in order to implement competency-based education in the cosmetology program in Virginia. The task analysis document contains the task inventory, suggested task sequence lists, and content outlines for the secondary courses…

  11. Organizational analysis and safety for utilities with nuclear power plants: perspectives for organizational assessment. Volume 2.

    Energy Technology Data Exchange (ETDEWEB)

    Osborn, R.N.; Olson, J.; Sommers, P.E.; McLaughlin, S.D.; Jackson, M.S.; Nadel, M.V.; Scott, W.G.; Connor, P.E.; Kerwin, N.; Kennedy, J.K. Jr.

    1983-08-01

    This two-volume report presents the results of initial research on the feasibility of applying organizational factors in nuclear power plant (NPP) safety assessment. Volume 1 of this report contains an overview of the literature, a discussion of available safety indicators, and a series of recommendations for more systematically incorporating organizational analysis into investigations of nuclear power plant safety. The six chapters of this volume discuss the major elements in our general approach to safety in the nuclear industry. The chapters include information on organizational design and safety; organizational governance; utility environment and safety related outcomes; assessments by selected federal agencies; review of data sources in the nuclear power industry; and existing safety indicators.

  12. The connection between selective referrals for radical cystectomy and radical prostatectomy and volume-outcome effects: an instrumental variables analysis.

    Science.gov (United States)

    Allareddy, Veerasathpurush; Ward, Marcia M; Wehby, George L; Konety, Badrinath R

    2012-01-01

    This study delineates the roles of "selective referrals" and "practice makes perfect" in the association between hospital procedure volume and in-hospital mortality for radical cystectomy and radical prostatectomy. This is a retrospective analysis of the Nationwide Inpatient Sample (years 2000-2004). All hospitalizations with primary procedure codes for radical cystectomy and radical prostatectomy were selected. The association between hospital procedure volume and in-hospital mortality was examined using generalized estimating equations and by instrumental variables approaches. There was an inverse association between hospital procedure volume and in-hospital mortality for radical cystectomy (odds ratio = 0.57; 95% confidence interval = 0.38-0.87; P …) … "practice makes perfect." PMID:22205768

  13. COBRA-SFS [Spent Fuel Storage]: A thermal-hydraulic analysis computer code: Volume 1, Mathematical models and solution method

    International Nuclear Information System (INIS)

    COBRA-SFS (Spent Fuel Storage) is a general thermal-hydraulic analysis computer code used to predict temperatures and velocities in a wide variety of systems. The code was refined and specialized for spent fuel storage system analyses for the US Department of Energy's Commercial Spent Fuel Management Program. The finite-volume equations governing mass, momentum, and energy conservation are written for an incompressible, single-phase fluid. The flow equations model a wide range of conditions including natural circulation. The energy equations include the effects of solid and fluid conduction, natural convection, and thermal radiation. The COBRA-SFS code is structured to perform both steady-state and transient calculations; however, the transient capability has not yet been validated. This volume describes the finite-volume equations and the method used to solve these equations. It is directed toward the user who is interested in gaining a more complete understanding of these methods.

  14. [Modeling and analysis of volume conduction based on field-circuit coupling].

    Science.gov (United States)

    Tang, Zhide; Liu, Hailong; Xie, Xiaohui; Chen, Xiufa; Hou, Deming

    2012-08-01

    Numerical simulations of volume conduction can be used to analyze the process of energy transfer and explore the effects of some physical factors on energy transfer efficiency. We analyzed the 3D quasi-static electric field by the finite element method and developed a 3D coupled field-circuit model of volume conduction based on the coupling between the circuit and the electric field. The model includes a circuit simulation of the volume conduction to provide direct theoretical guidance for energy transfer optimization design. A field-circuit coupling model with circular cylinder electrodes was established on the platform of the software FEM3.5. Based on this, the effects of electrode cross-section area, electrode distance and circuit parameters on the performance of the volume conduction system were obtained, which provides a basis for the optimized design of energy transfer efficiency.

  15. Initial Northwest Power Act Power Sales Contracts : Final Environmental Impact Statement. Volume 1, Environmental Analysis.

    Energy Technology Data Exchange (ETDEWEB)

    United States. Bonneville Power Administration.

    1992-01-01

    This is volume 1 of the final environmental impact statement of the Bonneville Power Administration. Information is included on the following: purpose of and need for action; alternatives, including the proposed action; affected environment; and environmental consequences.

  16. Conceptual design and systems analysis of photovoltaic systems. Volume II. Study results. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Kirpich, A.

    1977-03-19

    This investigation of terrestrial PV systems considered the technical and economic feasibility for systems in three size categories: a small system of about 12 kW peak output for on-site residential use; a large 1500 MW central power plant contributing to the bulk energy of a utility system power grid; and an intermediate size system of about 250 kW for use on public or commercial buildings. In each category, conceptual designs were developed, performance was analyzed for a range of climatic regions, economic analyses were performed, and assessments were made of pertinent institutional issues. The report consists of three volumes. Volume I contains a Study Summary of the major study results. This volume contains the detailed results pertaining to on-site residential photovoltaic systems, central power plant photovoltaic systems, and intermediate size systems applied to commercial and public buildings. Volume III contains supporting appendix material. (WHK)

  17. Finite volume method for analysis of stress and strain in wood

    OpenAIRE

    Izet Horman; Dunja Martinović; Seid Hajdarević

    2009-01-01

    This paper presents a numerical method (the finite volume method) for analysing stress and strain in wood as a solid body. The method is very simple and easy to use. It starts from an integral form of the equations governing momentum, heat and mass balance. Second-order in both time and space finite volume discretisation is performed using the corresponding constitutive relations, resulting in a set of algebraic equations, which are then solved by an efficient segregated iterative procedure. ...

  18. The analysis of subsidence associated with geothermal development. Volume 1. Handbook

    Energy Technology Data Exchange (ETDEWEB)

    Atherton, R.W.; Finnemore, E.J.; Gillam, M.L.

    1976-09-01

    This study evaluates the state of knowledge of subsidence associated with geothermal development, and provides preliminary methods to assess the potential of land subsidence for any specific geothermal site. The results of this study are presented in three volumes. Volume 1 is designed to serve as a concise reference, a handbook, for the evaluation of the potential for land subsidence from the development of geothermal resources.

  19. Analysis of primary coolant pump seal water distribution influence to chemical and volume system design

    International Nuclear Information System (INIS)

    The possible influences of coolant pump seal water distribution on the Chemical and Volume Control System design are discussed, and the essential reason for them is identified. The temperature drop of the charging flow at the regenerative heat exchanger outlet is calculated, and feasible retrofits of the Chemical and Volume Control System are illustrated. The thermal-hydraulic software Flowmaster 7.5 is employed to numerically investigate the capability of the charging pump under different coolant pump seal requirements. (authors)

  20. Geomorphic Analysis of Boulder Volumes and Surface Roughness Along Talus Slopes in Yosemite Valley, California

    Science.gov (United States)

    Takahashi, K.; Stock, G. M.; Finnegan, N. J.

    2015-12-01

    Talus slopes in Yosemite Valley, California, are a rich archive of rock fall processes occurring since deglaciation (~ 15 ka). The valley is an ideal natural laboratory for investigating rock fall processes because the cliffs display a wide range of heights, steepnesses, orientations, and granitic lithologies. We measured the spatial distribution of boulder volumes on rock fall-dominated talus slopes along 10 transects at 8 locations in Yosemite Valley. Boulder volumes span 6 orders of magnitude, from 0.003 to 3000 m³. As expected, boulder volumes increase non-linearly downslope, with the largest boulders located at or beyond the base of talus slopes. Boulder volumes are smaller below cliffs composed of more mafic lithologies, likely reflecting the greater fracture density in those cliffs. Moderately tall cliffs (400-550 m) tend to produce larger boulders than the tallest and shortest cliffs. Using airborne lidar data, we calculated talus surface roughness and found modest increases in roughness as a function of downslope distance, likely related to the downslope increase in boulder volume. By quantifying the spatial distribution of boulder volumes, our results can be used to improve future assessments of rockfall hazard adjacent to talus slopes.

  1. Quantitative analysis of the corpus callosum in children with cerebral palsy and developmental delay: correlation with cerebral white matter volume

    Energy Technology Data Exchange (ETDEWEB)

    Panigrahy, Ashok [Childrens Hospital Los Angeles, Department of Radiology, Los Angeles, CA (United States); Barnes, Patrick D. [Stanford University Medical Center, Department of Radiology, Lucile Salter Packard Children's Hospital, Palo Alto, CA (United States); Robertson, Robert L. [Children's Hospital Boston, Department of Radiology, Boston, MA (United States); Sleeper, Lynn A. [New England Research Institute, Watertown, MA (United States); Sayre, James W. [UCLA Medical Center, Departments of Radiology and Biostatistics, Los Angeles, CA (United States)

    2005-12-01

    This study was conducted to quantitatively correlate the thickness of the corpus callosum with the volume of cerebral white matter in children with cerebral palsy and developmental delay. Material and methods: A clinical database of 70 children with cerebral palsy and developmental delay was established with children between the ages of 1 and 5 years. These children also demonstrated abnormal periventricular T2 hyperintensities associated with and without ventriculomegaly. Mid-sagittal T1-weighted images were used to measure the thickness (genu, mid-body, and splenium) and length of the corpus callosum. Volumes of interest were digitized based on gray-scale densities to define the hemispheric cerebral white matter on axial T2-weighted and FLAIR images. The thickness of the mid-body of the corpus callosum was correlated with cerebral white matter volume. Subgroup analysis was also performed to examine the relationship of this correlation with both gestational age and neuromotor outcome. Statistical analysis was performed using analysis of variance and Pearson correlation coefficients. There was a positive correlation between the thickness of the mid-body of the corpus callosum and the volume of cerebral white matter across all children studied (R=0.665, P=0.0001). This correlation was not dependent on gestational age. The thickness of the mid-body of the corpus callosum was decreased in the spastic diplegia group compared to the two other groups (hypotonia and developmental delay only; P<0.0001). Within each neuromotor subgroup, there was a positive correlation between thickness of the mid-body of the corpus callosum and volume of the cerebral white matter. (orig.)

  2. Quantitative analysis of the corpus callosum in children with cerebral palsy and developmental delay: correlation with cerebral white matter volume

    International Nuclear Information System (INIS)

    This study was conducted to quantitatively correlate the thickness of the corpus callosum with the volume of cerebral white matter in children with cerebral palsy and developmental delay. Material and methods: A clinical database of 70 children with cerebral palsy and developmental delay was established with children between the ages of 1 and 5 years. These children also demonstrated abnormal periventricular T2 hyperintensities associated with and without ventriculomegaly. Mid-sagittal T1-weighted images were used to measure the thickness (genu, mid-body, and splenium) and length of the corpus callosum. Volumes of interest were digitized based on gray-scale densities to define the hemispheric cerebral white matter on axial T2-weighted and FLAIR images. The thickness of the mid-body of the corpus callosum was correlated with cerebral white matter volume. Subgroup analysis was also performed to examine the relationship of this correlation with both gestational age and neuromotor outcome. Statistical analysis was performed using analysis of variance and Pearson correlation coefficients. There was a positive correlation between the thickness of the mid-body of the corpus callosum and the volume of cerebral white matter across all children studied (R=0.665, P=0.0001). This correlation was not dependent on gestational age. The thickness of the mid-body of the corpus callosum was decreased in the spastic diplegia group compared to the two other groups (hypotonia and developmental delay only; P<0.0001). Within each neuromotor subgroup, there was a positive correlation between thickness of the mid-body of the corpus callosum and volume of the cerebral white matter. (orig.)

  3. Age-related changes in task related functional network connectivity.

    Directory of Open Access Journals (Sweden)

    Jason Steffener

    Aging has a multi-faceted impact on brain structure, brain function and cognitive task performance, but the interaction of these different age-related changes is largely unexplored. We hypothesize that age-related structural changes alter the functional connectivity within the brain, resulting in altered task performance during cognitive challenges. In this neuroimaging study, we used independent components analysis to identify spatial patterns of coordinated functional activity involved in the performance of a verbal delayed item recognition task from 75 healthy young and 37 healthy old adults. Strength of functional connectivity between spatial components was assessed for age group differences and related to speeded task performance. We then assessed whether age-related differences in global brain volume were associated with age-related differences in functional network connectivity. Both age groups used a series of spatial components during the verbal working memory task and the strength and distribution of functional network connectivity between these components differed across the age groups. Poorer task performance, i.e. slower speed with increasing memory load, in the old adults was associated with decreases in functional network connectivity between components comprised of the supplementary motor area and the middle cingulate and between the precuneus and the middle/superior frontal cortex. Advancing age also led to decreased brain volume; however, there was no evidence to support the hypothesis that age-related alterations in functional network connectivity were the result of global brain volume changes. These results suggest that age-related differences in the coordination of neural activity between brain regions partially underlie differences in cognitive performance.
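
    The analysis pattern described (a spatial ICA decomposition followed by correlating the component time courses to obtain functional network connectivity) can be sketched as follows; the data are synthetic, and scikit-learn's FastICA stands in for whatever ICA implementation the study used.

        import numpy as np
        from sklearn.decomposition import FastICA

        rng = np.random.default_rng(2)

        # Synthetic "fMRI" data: three hidden time courses mixed into 5000 voxels.
        t = np.arange(200)
        sources = np.stack([np.sin(t / 5.0), np.sign(np.sin(t / 9.0)),
                            rng.standard_normal(200)])
        maps = rng.standard_normal((3, 5000))
        data = sources.T @ maps + 0.1 * rng.standard_normal((200, 5000))

        # Spatial ICA: voxels as samples, time points as features, so each
        # component is a spatial map with an associated mixing time course.
        ica = FastICA(n_components=3, random_state=0)
        spatial_maps = ica.fit_transform(data.T)   # (5000 voxels, 3 components)
        time_courses = ica.mixing_                 # (200 time points, 3 components)

        # Functional network connectivity: correlations between time courses.
        fnc = np.corrcoef(time_courses.T)          # (3, 3) connectivity matrix
        print(np.round(fnc, 2))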

  4. Nuclear Dynamics Consequence Analysis (NDCA) for the Disposal of Spent Nuclear Fuel in an Underground Geologic Repository - Volume 3: Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Taylor, L.L.; Wilson, J.R. (INEEL); Sanchez, L.C.; Aguilar, R.; Trellue, H.R.; Cochrane, K. (SNL); Rath, J.S. (New Mexico Engineering Research Institute)

    1998-10-01

    The United States Department of Energy Office of Environmental Management's (DOE/EM's) National Spent Nuclear Fuel Program (NSNFP), through a collaboration between Sandia National Laboratories (SNL) and Idaho National Engineering and Environmental Laboratory (INEEL), is conducting a systematic Nuclear Dynamics Consequence Analysis (NDCA) of the disposal of SNFs in an underground geologic repository sited in unsaturated tuff. This analysis is intended to provide interim guidance to the DOE for the management of the SNF while they prepare for final compliance evaluation. This report presents results from a Nuclear Dynamics Consequence Analysis (NDCA) that examined the potential consequences and risks of criticality during the long-term disposal of spent nuclear fuel owned by DOE-EM. This analysis investigated the potential of post-closure criticality, the consequences of a criticality excursion, and the probability frequency for post-closure criticality. The results of the NDCA are intended to provide the DOE-EM with a technical basis for measuring risk which can be used for screening arguments to eliminate post-closure criticality FEPs (features, events and processes) from consideration in the compliance assessment because of either low probability or low consequences. This report is composed of an executive summary (Volume 1), the methodology and results of the NDCA (Volume 2), and the applicable appendices (Volume 3).

  5. TADS: A CFD-based turbomachinery analysis and design system with GUI. Volume 1: Method and results

    Science.gov (United States)

    Topp, D. A.; Myers, R. A.; Delaney, R. A.

    1995-01-01

    The primary objective of this study was the development of a CFD (Computational Fluid Dynamics) based turbomachinery airfoil analysis and design system, controlled by a GUI (Graphical User Interface). The computer codes resulting from this effort are referred to as TADS (Turbomachinery Analysis and Design System). This document is the Final Report describing the theoretical basis and analytical results from the TADS system, developed under Task 18 of NASA Contract NAS3-25950, ADPAC System Coupling to Blade Analysis & Design System GUI. TADS couples a throughflow solver (ADPAC) with a quasi-3D blade-to-blade solver (RVCQ3D) in an interactive package. Throughflow analysis capability was developed in ADPAC through the addition of blade force and blockage terms to the governing equations. A GUI was developed to simplify user input and automate the many tasks required to perform turbomachinery analysis and design. The coupling of the various programs was done in such a way that alternative solvers or grid generators could be easily incorporated into the TADS framework. Results of aerodynamic calculations using the TADS system are presented for a highly loaded fan, a compressor stator, a low speed turbine blade and a transonic turbine vane.

  6. High Statistics Analysis using Anisotropic Clover Lattices: (IV) The Volume Dependence of the Light Hadron Masses

    Energy Technology Data Exchange (ETDEWEB)

    Beane, S R; Detmold, W; Lin, H W; Luu, T C; Orginos, K; Parreno, A; Savage, M J; Torok, A; Walker-Loud, A

    2011-07-01

    The volume dependence of the octet baryon masses and relations among them are explored with Lattice QCD. Calculations are performed with an n_f = 2 + 1 clover fermion discretization in four lattice volumes, with spatial extent L ≈ 2.0, 2.5, 3.0, and 4.0 fm, with an anisotropic lattice spacing of b_s ≈ 0.123 fm in the spatial direction, b_t = b_s/3.5 in the time direction, and at a pion mass of m_π ≈ 390 MeV. The typical precision of the ground-state baryon mass determination is sufficient to enable a precise exploration of the volume dependence of the masses, the Gell-Mann-Okubo mass relation, and of other mass combinations. A comparison with the predictions of heavy baryon chiral perturbation theory is performed in both the SU(2)_L ⊗ SU(2)_R and SU(3)_L ⊗ SU(3)_R expansions. Predictions of the three-flavor expansion for the hadron masses are found to describe the observed volume dependences reasonably well. Further, the πNΔ axial coupling constant is extracted from the volume dependence of the nucleon mass in the two-flavor expansion, with only small modifications in the three-flavor expansion from the inclusion of kaons and etas. At a given value of m_πL, the finite-volume contributions to the nucleon mass are predicted to be significantly smaller at m_π ≈ 140 MeV than at m_π ≈ 390 MeV due to a coefficient that scales as m_π³. This is relevant for the design of future ensembles of lattice gauge-field configurations. Finally, the volume dependence of the pion and kaon masses is analyzed with two-flavor and three-flavor chiral perturbation theory.
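
    The m_π³ statement corresponds to the leading pion-loop finite-volume correction to the nucleon mass in heavy baryon chiral perturbation theory, whose large-m_πL asymptotic form can be written schematically as below; the order-one constant c and the normalization conventions for g_A and f_π are generic here and are not taken from the record above.

        \[
        \delta M_N(L) \;\equiv\; M_N(L) - M_N(\infty)
        \;\sim\; c\,\frac{g_A^{2}\, m_\pi^{3}}{16\pi f_\pi^{2}}\,
        \frac{e^{-m_\pi L}}{m_\pi L},
        \]

    so that at a fixed value of m_πL the correction carries an explicit factor of m_π³ and shrinks rapidly as the pion mass is lowered toward its physical value.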

  7. A macroecological analysis of SERA derived forest heights and implications for forest volume remote sensing.

    Directory of Open Access Journals (Sweden)

    Matthew Brolly

    Individual trees have been shown to exhibit strong relationships between DBH, height and volume. Often such studies are cited as justification for forest volume or standing biomass estimation through remote sensing. With the resolution of common satellite remote sensing systems generally too low to resolve individuals, and a need for larger coverage, these systems rely on descriptive heights, which account for tree collections in forests. For remote sensing and allometric applications, this height is not entirely understood in terms of its location. Here, a forest growth model (SERA) is used to analyze forest canopy height relationships with forest wood volume. Maximum height, mean height, H₁₀₀, and Lorey's height are examined for variability under plant number density, resource and species. Our findings, shown to be allometrically consistent with empirical measurements for forested communities world-wide, are analyzed for implications to forest remote sensing techniques such as LiDAR and RADAR. Traditional forestry measures of maximum height, and to a lesser extent H₁₀₀ and Lorey's, exhibit little consistent correlation with forest volume across modeled conditions. The implication is that using forest height to infer volume or biomass from remote sensing requires species and community behavioral information to infer accurate estimates using height alone. SERA predicts mean height to provide the most consistent relationship with volume of the height classifications studied and overall across forest variations. This prediction agrees with empirical data collected from conifer and angiosperm forests with plant densities ranging between 10²-10⁶ plants/hectare and heights 6-49 m. Height classifications investigated are potentially linked to radar scattering centers with implications for allometry. These findings may be used to advance forest biomass estimation accuracy through remote sensing. Furthermore, Lorey's height with its specific relationship to…

  8. Integrating Information: An Analysis of the Processes Involved and the Products Generated in a Written Synthesis Task

    Science.gov (United States)

    Sole, Isabel; Miras, Mariana; Castells, Nuria; Espino, Sandra; Minguela, Marta

    2013-01-01

    The case study reported here explores the processes involved in producing a written synthesis of three history texts and their possible relation to the characteristics of the texts produced and the degree of comprehension achieved following the task. The processes carried out by 10 final-year compulsory education students (15 and 16 years old) to…

  9. When Negotiation of Meaning is Also Negotiation of Task: Analysis of the Communication in an Applied Mathematics High School Course.

    Science.gov (United States)

    Christiansen, Iben Maj

    1997-01-01

    The negotiation of meaning presupposes a taken-to-be-shared understanding of a situation. Uses an example to illustrate how negotiation of meaning and task may be linked in ways inappropriate to the learning of modeling and critical reflections. Contains 16 references. (Author/ASK)

  10. A Field-Tested Task Analysis for Creating Single-Subject Graphs Using Microsoft[R] Office Excel

    Science.gov (United States)

    Lo, Ya-yu; Konrad, Moira

    2007-01-01

    Creating single-subject (SS) graphs is challenging for many researchers and practitioners because it is a complex task with many steps. Although several authors have introduced guidelines for creating SS graphs, many users continue to experience frustration. The purpose of this article is to minimize these frustrations by providing a field-tested…

  11. Exploring General versus Task-Specific Assessments of Metacognition in University Chemistry Students: A Multitrait-Multimethod Analysis

    Science.gov (United States)

    Wang, Chia-Yu

    2015-01-01

    The purpose of this study was to use multiple assessments to investigate the general versus task-specific characteristics of metacognition in dissimilar chemistry topics. This mixed-method approach investigated the nature of undergraduate general chemistry students' metacognition using four assessments: a self-report questionnaire, assessment of…

  12. Designing simulator-based training: An approach integrating cognitive task analysis and four-component instructional design

    NARCIS (Netherlands)

    Tjiam, I.M.; Schout, B.M.; Hendrikx, A.J.M.; Scherpbier, A.J.J.A.; Witjes, J.A.; Merrienboer, J.J. van

    2012-01-01

    Most studies of simulator-based surgical skills training have focused on the acquisition of psychomotor skills, but surgical procedures are complex tasks requiring both psychomotor and cognitive skills. As skills training is modelled on expert performance consisting partly of unconscious automatic p…

  13. Analysis of nuclear waste disposal in space, phase 3. Volume 1: Executive summary of technical report

    Science.gov (United States)

    Rice, E. E.; Miller, N. E.; Yates, K. R.; Martin, W. E.; Friedlander, A. L.

    1980-01-01

    The objectives, approach, assumptions, and limitations of a study of nuclear waste disposal in space are discussed with emphasis on the following: (1) payload characterization; (2) safety assessment; (3) health effects assessment; (4) long-term risk assessment; and (5) program planning support to NASA and DOE. Conclusions are presented for each task.

  14. Functional Data Analysis in NTCP Modeling: A New Method to Explore the Radiation Dose-Volume Effects

    International Nuclear Information System (INIS)

    Purpose/Objective(s): To describe a novel method to explore radiation dose-volume effects. Functional data analysis is used to investigate the information contained in differential dose-volume histograms. The method is applied to the normal tissue complication probability modeling of rectal bleeding (RB) for patients irradiated in the prostatic bed by 3-dimensional conformal radiation therapy. Methods and Materials: Kernel density estimation was used to estimate the individual probability density functions from each of the 141 rectum differential dose-volume histograms. Functional principal component analysis was performed on the estimated probability density functions to explore the variation modes in the dose distribution. The functional principal components were then tested for association with RB using logistic regression adapted to functional covariates (FLR). For comparison, 3 other normal tissue complication probability models were considered: the Lyman-Kutcher-Burman model, the logistic model based on standard dosimetric parameters (LM), and the logistic model based on multivariate principal component analysis (PCA). Results: The incidence rate of grade ≥2 RB was 14%. V65Gy was the most predictive factor for the LM (P=.058). The best fit for the Lyman-Kutcher-Burman model was obtained with n=0.12, m=0.17, and TD50=72.6 Gy. In PCA and FLR, the components that describe the interdependence between the relative volumes exposed at intermediate and high doses were the most correlated with the complication. The FLR parameter function leads to a better understanding of the volume effect by including the treatment specificity in the delivered mechanistic information. For RB grade ≥2, patients with advanced age are significantly at risk (odds ratio, 1.123; 95% confidence interval, 1.03-1.22), and the fits of the LM, PCA, and functional principal component analysis models are significantly improved by including this clinical factor. Conclusion: Functional data…
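
    The pipeline described (kernel density estimation of each differential dose-volume histogram, principal component analysis of the resulting density curves, then logistic regression of the complication on the component scores) can be sketched in a few lines; the data below are synthetic placeholders, and ordinary PCA on a dense dose grid stands in for a full functional PCA.

        import numpy as np
        from scipy.stats import gaussian_kde
        from sklearn.decomposition import PCA
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(3)

        # Synthetic stand-ins: per-patient rectal dose samples [Gy] and outcomes.
        n_patients = 141
        dose_grid = np.linspace(0.0, 80.0, 200)
        doses = [rng.normal(rng.uniform(30, 60), 12, size=500)
                 for _ in range(n_patients)]
        outcome = rng.binomial(1, 0.14, size=n_patients)   # ~14% incidence

        # Step 1: kernel density estimate of each dose distribution.
        densities = np.array([gaussian_kde(d)(dose_grid) for d in doses])

        # Step 2: principal components of the discretized density curves.
        scores = PCA(n_components=3).fit_transform(densities)

        # Step 3: logistic regression of the complication on component scores.
        model = LogisticRegression().fit(scores, outcome)
        print(model.coef_)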

  15. Finite volume analysis of temperature effects induced by active MRI implants: 2. Defects on active MRI implants causing hot spots

    Directory of Open Access Journals (Sweden)

    Grönemeyer Dietrich HW

    2006-05-01

    … investigations. The finite volume analysis calculates the time-developing temperature maps for the model of a broken linear metallic wire embedded in tissue. Half of the total hot spot power loss is assumed to diffuse into each of the two wire parts at the location of a defect. The energy is distributed from there by heat conduction. Additionally, the effect of blood perfusion and blood flow is taken into account in some simulations, because the simultaneous appearance of all worst-case conditions, especially the absence of blood perfusion and blood flow near the hot spot, is very unlikely for vessel implants. Results: The analytical solution as worst-case scenario, as well as the finite volume analysis for near-worst-case situations, shows non-negligible volumes with critical temperature increases for part of the modeled hot spot situations. MR investigations with a high rf-pulse density lasting below a minute can establish volumes of several cubic millimeters with temperature increases high enough to start cell destruction. Longer exposure times can involve volumes larger than 100 mm³. Even temperature increases in the range of thermal ablation are reached for substantial volumes. MR sequence exposure time and hot spot power loss are the primary factors influencing the volume with critical temperature increases. Wire radius, wire material, as well as the physiological parameters blood perfusion and blood flow inside larger vessels, reduce the volume with critical temperature increases, but do not exclude a volume with critical tissue heating for resonators with a large product of resonator volume and quality factor. Conclusion: The worst-case scenario assumes thermal equilibrium for a hot spot embedded in homogeneous tissue without any cooling due to blood perfusion or flow. The finite volume analysis can calculate the results for conditions near to, but not at, the worst case. For both cases a substantial volume can reach a critical temperature increase in a short time. The analytical solution, as absolute…

  16. Task modeling for collaborative authoring

    NARCIS (Netherlands)

    Veer, van der Gerrit; Kulyk, Olga; Vyas, Dhaval; Kubbe, Onno; Ebert, Achim; Dittmar, A.; Forbrig, P.

    2011-01-01

    Motivation – Task analysis for designing modern collaborative work needs a more fine-grained approach. Especially in a complex task domain, like collaborative scientific authoring, when there is a single overall goal that can only be accomplished by collaboration between multiple roles, each req…

  17. Upper Airway Volume Segmentation Analysis Using Cine MRI Findings in Children with Tracheostomy Tubes

    Energy Technology Data Exchange (ETDEWEB)

    Fricke, Bradley L.; Abbott, M. Bret; Donnelly, Lane F.; Dardzinski, Bernard J.; Poe, Stacy A.; Kalra, Maninder; Amin, Raouf S.; Cotton, Robin T. [Cincinnati Children's Hospital Medical Center, Cincinnati (United States)

    2007-12-15

    The purpose of this study is to evaluate the airway dynamics of the upper airway as depicted on cine MRI in children with tracheotomy tubes during two states of airflow through the upper airway. Sagittal fast gradient echo cine MR images of the supra-glottic airway were obtained with a 1.5T MRI scanner on seven children with tracheotomy tubes. Two sets of images were obtained with either the tubes capped or uncapped. The findings of the cine MRI were retrospectively reviewed. Volume segmentation of the cine images to compare the airway volume change over time (mean volume, standard deviation, normalized range, and coefficient of variance) was performed for the capped and uncapped tubes in both the nasopharynx and hypopharynx (Signed Rank Test). Graphical representation of the airway volume over time demonstrates a qualitative increase in fluctuation in patients with the tracheotomy tube capped as compared to uncapped in both the nasopharyngeal and hypopharyngeal regions of interest. In the nasopharynx, the mean airway volume (capped 2.72 mL, uncapped 2.09 mL, p = 0.0313), the airway volume standard deviation (capped 0.42 mL, uncapped 0.20 mL, p = 0.0156), and the airway volume range (capped 2.10 mL, uncapped 1.09 mL, p = 0.0156) were significantly larger in the capped group of patients. In the hypopharynx, the airway volume standard deviation (capped 1.54 mL, uncapped 0.67 mL, p = 0.0156), and the airway volume range (capped 6.44 mL, uncapped 2.93 mL, p = 0.0156) were significantly larger in the capped tubes. The coefficient of variance (capped 0.37, uncapped 0.26, p = 0.0469) and the normalized range (capped 1.52, uncapped 1.09, p = 0.0313) were significantly larger in the capped tubes. There is a statistically significant change in airway dynamics in children with tracheotomy tubes when breathing via the airway as compared to breathing via the tracheotomy tube.
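
    With seven children and paired capped/uncapped measurements, the reported p-values of 0.0156 are exactly what the Wilcoxon signed-rank test yields when all seven differences go in the same direction (2/2⁷). A small scipy sketch with made-up paired volumes (not the study's data) reproduces this:

        import numpy as np
        from scipy.stats import wilcoxon

        # Made-up paired nasopharyngeal volume ranges [mL], capped vs. uncapped.
        capped   = np.array([2.30, 1.75, 2.55, 1.90, 2.25, 1.68, 2.07])
        uncapped = np.array([1.20, 0.90, 1.50, 1.00, 1.30, 0.80, 1.05])

        # All seven differences are positive, so the exact two-sided p-value
        # is 2 / 2**7 = 0.0156, matching the values reported above.
        stat, p = wilcoxon(capped, uncapped)
        print(f"W = {stat}, p = {p:.4f}")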

  18. Probabilistic accident consequence uncertainty analysis: Food chain uncertainty assessment. Volume 1: Main report

    International Nuclear Information System (INIS)

    This volume is the first of a two-volume document that summarizes a joint project conducted by the US Nuclear Regulatory Commission and the European Commission to assess uncertainties in the MACCS and COSYMA probabilistic accident consequence codes. These codes were developed primarily for estimating the risks presented by nuclear reactors based on postulated frequencies and magnitudes of potential accidents. This document reports on an ongoing project to assess uncertainty in the MACCS and COSYMA calculations for the offsite consequences of radionuclide releases by hypothetical nuclear power plant accidents. A panel of sixteen experts was formed to compile credible and traceable uncertainty distributions for food chain variables that affect calculations of offsite consequences. The expert judgment elicitation procedure and its outcomes are described in these volumes. Other panels were formed to consider uncertainty in other aspects of the codes. Their results are described in companion reports. Volume 1 contains background information and a complete description of the joint consequence uncertainty study. Volume 2 contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures for both panels, (3) the rationales and results for the panels on soil and plant transfer and animal transfer, (4) short biographies of the experts, and (5) the aggregated results of their responses

  19. Probabilistic accident consequence uncertainty analysis: Food chain uncertainty assessment. Volume 1: Main report

    Energy Technology Data Exchange (ETDEWEB)

    Brown, J. [National Radiological Protection Board (United Kingdom)]; Goossens, L.H.J.; Kraan, B.C.P. [Delft Univ. of Technology (Netherlands)] [and others]

    1997-06-01

    This volume is the first of a two-volume document that summarizes a joint project conducted by the US Nuclear Regulatory Commission and the European Commission to assess uncertainties in the MACCS and COSYMA probabilistic accident consequence codes. These codes were developed primarily for estimating the risks presented by nuclear reactors based on postulated frequencies and magnitudes of potential accidents. This document reports on an ongoing project to assess uncertainty in the MACCS and COSYMA calculations for the offsite consequences of radionuclide releases by hypothetical nuclear power plant accidents. A panel of sixteen experts was formed to compile credible and traceable uncertainty distributions for food chain variables that affect calculations of offsite consequences. The expert judgment elicitation procedure and its outcomes are described in these volumes. Other panels were formed to consider uncertainty in other aspects of the codes. Their results are described in companion reports. Volume 1 contains background information and a complete description of the joint consequence uncertainty study. Volume 2 contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures for both panels, (3) the rationales and results for the panels on soil and plant transfer and animal transfer, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  20. Multi-temporal MRI carpal bone volumes analysis by principal axes registration

    Science.gov (United States)

    Ferretti, Roberta; Dellepiane, Silvana

    2016-03-01

    In this paper, a principal axes registration technique is presented, with an application to segmented volumes. The purpose of the proposed registration is to compare multi-temporal volumes of carpal bones from Magnetic Resonance Imaging (MRI) acquisitions. Starting from the second-order moment matrix, the eigenvectors are calculated to allow the rotation of the volumes with respect to reference axes. The volumes are then spatially translated so that they overlap completely. A quantitative evaluation of the results is carried out by computing classical indices from the confusion matrix, which provide similarity measures between the volumes of the same organ as extracted from MRI acquisitions executed at different moments. Within the medical field, the way a registration can be used to compare multi-temporal images is of great interest, since it provides the physician with a tool for visually monitoring the evolution of a disease. The segmentation method used herein is based on graph theory and is a robust, unsupervised and parameter-independent method. Patients affected by rheumatic diseases have been considered.
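
    The registration step described above reduces to diagonalizing each volume's second-order central moment matrix and expressing its voxels in the resulting principal-axes frame; matching centroids then overlaps the volumes. A minimal numpy sketch of that idea (sign and ordering ambiguities of the eigenvectors, which a practical implementation must resolve, are ignored here):

        import numpy as np

        def to_principal_axes(volume):
            """Return a segmented volume's voxel coordinates expressed in its
            centroid-centered principal-axes frame."""
            pts = np.argwhere(volume > 0).astype(float)    # occupied voxels
            centered = pts - pts.mean(axis=0)              # subtract centroid
            moments = centered.T @ centered / len(pts)     # 3x3 moment matrix
            _, eigvecs = np.linalg.eigh(moments)           # columns = axes
            return centered @ eigvecs

        # Two multi-temporal segmentations of the same bone would now be
        # compared voxel-wise (e.g., via a confusion matrix) after alignment.
        aligned_a = to_principal_axes(np.ones((10, 12, 8)))
        aligned_b = to_principal_axes(np.ones((12, 10, 8)))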

  1. Nicotine effects on brain function during a visual oddball task: a comparison between conventional and EEG-informed fMRI analysis.

    Science.gov (United States)

    Warbrick, Tracy; Mobascher, Arian; Brinkmeyer, Jürgen; Musso, Francesco; Stoecker, Tony; Shah, N Jon; Fink, Gereon R; Winterer, Georg

    2012-08-01

    In a previous oddball task study, it was shown that the inclusion of electrophysiology (EEG), that is, single-trial P3 ERP parameters, in the analysis of fMRI responses can detect activation that is not apparent with conventional fMRI data modeling strategies [Warbrick, T., Mobascher, A., Brinkmeyer, J., Musso, F., Richter, N., Stoecker, T., et al. Single-trial P3 amplitude and latency informed event-related fMRI models yield different BOLD response patterns to a target detection task. Neuroimage, 47, 1532-1544, 2009]. Given that P3 is modulated by nicotine, including P3 parameters in the fMRI analysis might provide additional information about nicotine effects on brain function. A 1-mg nasal nicotine spray (0.5 mg each nostril) or placebo (pepper) spray was administered in a double-blind, placebo-controlled, within-subject, randomized, cross-over design. Simultaneous EEG-fMRI and behavioral data were recorded from 19 current smokers in response to an oddball-type visual choice RT task. Conventional general linear model analysis and single-trial P3 amplitude informed general linear model analysis of the fMRI data were performed. Comparing the nicotine with the placebo condition, reduced RTs in the nicotine condition were related to decreased BOLD responses in the conventional analysis encompassing the superior parietal lobule, the precuneus, and the lateral occipital cortex. On the other hand, reduced RTs were related to increased BOLD responses in the precentral and postcentral gyri, and ACC in the EEG-informed fMRI analysis. Our results show how integrated analyses of simultaneous EEG-fMRI data can be used to detect nicotine effects that would not have been revealed through conventional analysis of either measure in isolation. This emphasizes the significance of applying multimodal imaging methods to pharmacoimaging. PMID:22452559
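
    The essence of an EEG-informed fMRI analysis is to add a single-trial EEG parameter (here, P3 amplitude) as a parametric modulator alongside the conventional task regressor in the voxelwise general linear model. A schematic numpy version with synthetic signals standing in for real data (the actual study used dedicated fMRI software rather than this bare least-squares fit):

        import numpy as np

        rng = np.random.default_rng(4)
        n_scans, n_trials = 300, 40
        onsets = rng.choice(np.arange(10, n_scans - 10), size=n_trials,
                            replace=False)
        p3_amp = rng.normal(5.0, 1.5, size=n_trials)  # single-trial P3 amplitudes

        # Stick functions: conventional task regressor and mean-centered
        # P3-amplitude-modulated regressor.
        task = np.zeros(n_scans)
        task[onsets] = 1.0
        p3_mod = np.zeros(n_scans)
        p3_mod[onsets] = p3_amp - p3_amp.mean()

        # Convolve with a crude haemodynamic response; assemble design matrix.
        t = np.arange(20)
        hrf = (t / 4.0) ** 2 * np.exp(-t / 4.0)
        X = np.column_stack([np.convolve(r, hrf)[:n_scans] for r in (task, p3_mod)]
                            + [np.ones(n_scans)])

        # Fit one synthetic voxel by least squares; the second beta captures
        # BOLD variance coupled to trial-by-trial P3 amplitude.
        y = X @ np.array([2.0, 1.0, 0.5]) + rng.standard_normal(n_scans)
        betas, *_ = np.linalg.lstsq(X, y, rcond=None)
        print(np.round(betas, 2))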

  2. Fukushima Daiichi Unit 1 Uncertainty Analysis-Exploration of Core Melt Progression Uncertain Parameters-Volume II.

    Energy Technology Data Exchange (ETDEWEB)

    Denman, Matthew R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Brooks, Dusty Marie [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]

    2015-08-01

    Sandia National Laboratories (SNL) has conducted an uncertainty analysis (UA) on the Fukushima Daiichi Unit 1 (1F1) accident progression with the MELCOR code. Volume I of the 1F1 UA discusses the physical modeling details and time history results of the UA. Volume II of the 1F1 UA discusses the statistical viewpoint. The model used was developed for a previous accident reconstruction investigation jointly sponsored by the US Department of Energy (DOE) and Nuclear Regulatory Commission (NRC). The goal of this work was to perform a focused evaluation of uncertainty in core damage progression behavior and its effect on key figures of merit (e.g., hydrogen production, fraction of intact fuel, vessel lower head failure) and in doing so assess the applicability of traditional sensitivity analysis techniques.

  3. Analysis of airborne radiometric data. Volume 2. Description, listing, and operating instructions for the code DELPHI/MAZAS. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Sperling, M.; Shreve, D.C.

    1978-12-01

    The computer code DELPHI is an interactive English language command system for the analysis of airborne radiometric data. The code includes modules for data reduction, data simulation, time filtering, data adjustment and graphical presentation of the results. DELPHI is implemented in FORTRAN on a DEC-10 computer. This volume gives a brief set of operations instructions, samples of the output obtained from hard copies of the display on a Tektronix terminal and finally a listing of the code.

  4. An Analysis of Finite Volume, Finite Element, and Finite Difference Methods Using Some Concepts from Algebraic Topology

    OpenAIRE

    Mattiussi, Claudio

    1997-01-01

    In this paper we apply the ideas of algebraic topology to the analysis of the finite volume and finite element methods, illuminating the similarity between the discretization strategies adopted by the two methods, in the light of a geometric interpretation proposed for the role played by the weighting functions in finite elements. We discuss the intrinsic discrete nature of some of the factors appearing in the field equations, underlining the exception represented by the constitutive term, th...

  5. Analysis of airborne radiometric data. Volume 2. Description, listing, and operating instructions for the code DELPHI/MAZAS. Final report

    International Nuclear Information System (INIS)

    The computer code DELPHI is an interactive English language command system for the analysis of airborne radiometric data. The code includes modules for data reduction, data simulation, time filtering, data adjustment and graphical presentation of the results. DELPHI is implemented in FORTRAN on a DEC-10 computer. This volume gives a brief set of operations instructions, samples of the output obtained from hard copies of the display on a Tektronix terminal and finally a listing of the code

  6. Analysis of Androgenic Steroids in Environmental Waters by Large-volume Injection Liquid Chromatography Tandem Mass Spectrometry

    OpenAIRE

    Backe, Will J.; Ort, Christoph; Brewer, Alex J.; Field, Jennifer A.

    2011-01-01

    A new method was developed for the analysis of natural and synthetic androgenic steroids and their selected metabolites in aquatic environmental matrices using direct large-volume injection (LVI) high performance liquid chromatography (HPLC) tandem mass spectrometry (MS/MS). Method accuracy ranged from 88 to 108% for analytes with well-matched internal standards. Precision, quantified by relative standard deviation (RSD), was less than 12%. Detection limits for the method ranged from 1.2 to 3...

  7. Light-Weight Radioisotope Heater Unit Safety Analysis Report (LWRHU-SAR). Volume II. Accident model document

    International Nuclear Information System (INIS)

    The purposes of this volume (AMD) are to: identify all malfunctions, both singular and multiple, which can occur during the complete mission profile that could lead to release outside the clad of the radioisotopic material contained therein; provide estimates of occurrence probabilities associated with these various accidents; evaluate the response of the LWRHU (or its components) to the resultant accident environments; and associate the potential event history with test data or analysis to determine the potential interaction of the released radionuclides with the biosphere.

  8. A preliminary analysis of lunar extra-mare basalts - Distribution, compositions, ages, volumes, and eruption styles

    Science.gov (United States)

    Whitford-Stark, J. L.

    1982-01-01

    Extra-mare basalts occupy 8.5% of the lunar basalt area and comprise 1% of the total mare basalt volume. They are preferentially located where the crust is thin and topographically low. In terms of age, eruption style, and composition they are as variable as the mare basalts. In some instances extrusion in extra-mare craters was preceded by floor-fracturing whereas in other cases it apparently was not. The volume of lava erupted may have been controlled more by the volume of magma produced than by hydrostatic effects. A minimum of nearly 1300 separate basalt eruptions is indicated; the true value could be nearer 30,000 separate eruptions.

  9. Finite volume element method for analysis of unsteady reaction-diffusion problems

    Institute of Scientific and Technical Information of China (English)

    Sutthisak Phongthanapanich; Pramote Dechaumphai

    2009-01-01

    A finite volume element method is developed for analyzing unsteady scalar reaction-diffusion problems in two dimensions. The method combines concepts employed in the finite volume and the finite element methods. The finite volume method is used to discretize the unsteady reaction-diffusion equation, while the finite element method is applied to estimate the gradient quantities at cell faces. Robustness and efficiency of the combined method have been evaluated on uniform rectangular grids using available numerical solutions of two-dimensional reaction-diffusion problems. The numerical solutions demonstrate that the combined method is stable and can provide accurate solutions without spurious oscillation along high-gradient boundary layers.
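
    For concreteness, a bare finite-volume update for an unsteady scalar reaction-diffusion equation du/dt = D ∇²u + R(u) on a uniform 2D grid is sketched below, with explicit time stepping, zero-flux boundaries, and a simple linear decay reaction; these are illustrative choices, and the paper's scheme additionally estimates face gradients with finite-element shape functions rather than the plain central differences used here.

        import numpy as np

        D, k, dx, dt = 1.0, 0.5, 0.01, 2e-5  # dt below the 2D stability limit dx^2/(4D)
        u = np.zeros((100, 100))
        u[40:60, 40:60] = 1.0                # initial concentration patch

        for _ in range(500):
            # Diffusive face fluxes F = -D du/dn from central differences.
            flux_x = -D * np.diff(u, axis=0) / dx     # (99, 100) x-faces
            flux_y = -D * np.diff(u, axis=1) / dx     # (100, 99) y-faces

            # Flux divergence per cell; untouched boundary faces carry zero flux.
            div = np.zeros_like(u)
            div[:-1, :] += flux_x / dx
            div[1:, :]  -= flux_x / dx
            div[:, :-1] += flux_y / dx
            div[:, 1:]  -= flux_y / dx

            u += dt * (-div - k * u)          # explicit update with R(u) = -k u

        print(u.sum())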

  10. Stereological analysis of the mediodorsal thalamic nucleus in schizophrenia: volume, neuron number, and cell types

    DEFF Research Database (Denmark)

    Dorph-Petersen, Karl-Anton; Pierri, Joseph N; Sun, Zhuoxin;

    2004-01-01

    The mediodorsal thalamic nucleus (MD) is the principal relay nucleus for the prefrontal cortex, a brain region thought to be dysfunctional in schizophrenia. Several, but not all, postmortem studies of the MD in schizophrenia have reported decreased volume and total neuronal number. However, it is not clear whether the findings are specific for schizophrenia, nor is it known which subtypes of thalamic neurons are affected. We studied the left MD in 11 subjects with schizophrenia, 9 control subjects, and 12 subjects with mood disorders. Based on morphological criteria, we divided the neurons into two subclasses, presumably corresponding to projection neurons and local circuit neurons. We estimated MD volume and the neuron number of each subclass using methods based on modern unbiased stereological principles. We also estimated the somal volumes of each subclass using a robust, but biased, approach…

  11. EFFICIENT ANALYSIS OF ELECTROMAGNETIC RADIATION FROM MICROSTRIP ANTENNA USING VOLUME-SURFACE CURRENT CONTINUITY METHOD

    Institute of Scientific and Technical Information of China (English)

    Yuan Jiade; Su Kaixiong; Ye Yuhuang

    2012-01-01

    The Volume-Surface Current Continuity Method (VSCCM) is presented to analyze electromagnetic radiation from a microstrip antenna. The microstrip antenna is discretized into small triangular patches on the conducting surface and tetrahedral volume cells in the dielectric region. The Method of Moments (MoM) is applied to solve the integral equation. An equation containing the restriction relation between the volume and surface current coefficients is derived from the current continuity equation at those parts where the conducting surface is in contact with the dielectric material. A simple equivalent strip model is introduced in the treatment of the feeding probe in VSCCM. The VSCCM can reduce the number of unknowns required to be solved in MoM, as well as the condition number of the matrix equation. Numerical results are given to validate the accuracy and efficiency of this method.

  12. Analysis of Partial Volume Effects on Arterial Input Functions Using Gradient Echo: A Simulation Study

    DEFF Research Database (Denmark)

    Kjølby, Birgitte Fuglsang; Mikkelsen, Irene Klærke; Pedersen, Michael;

    2009-01-01

    Absolute blood flow and blood volume measurements using perfusion weighted MRI require an accurately measured arterial input function (AIF). Because of the limited spatial resolution of MR images, AIF voxels cannot be placed completely within a feeding artery. We present a two-compartment model of an AIF voxel including the relaxation properties of blood and tissue. Artery orientations parallel and perpendicular to the main magnetic field were investigated, and AIF voxels were modeled to either include or be situated close to a large artery. The impact of partial volume effects on quantitative...

  13. Analysis of the Microstructure and Permeability of the Laminates with Different Fiber Volume Fraction

    Institute of Scientific and Technical Information of China (English)

    MA Yue; LI Wei; LIANG Zi-qing

    2008-01-01

    Microstructures of laminates produced from epoxy/carbon fibers with different fiber volume fractions were studied by analyzing the composite cross-sections. The main results of reinforcement compaction are the flattening of bundle shape, the reduction of gaps, and the nesting of bundles between layers. The void content outside the bundles decreased sharply during compaction, falling below that inside the bundles once the fiber volume fraction exceeds 60%. The resin flow velocity in the fiber tow is 10²-10⁴ times greater than the flow velocity outside the fiber tow, whether or not the capillary pressure is taken into account.

  14. Frame of reference for electronic maps - The relevance of spatial cognition, mental rotation, and componential task analysis

    Science.gov (United States)

    Wickens, Christopher D.; Aretz, Anthony; Harwood, Kelly

    1989-01-01

    Three experiments are reported that examine the difference between north-up and track-up maps for airborne navigation. The results of the first two experiments, conducted in a basic laboratory setting, identified the cost associated with mental rotation, when a north-up map is used. However, the data suggest that these costs are neither large nor consistent. The third experiment examined a range of tasks in a higher fidelity helicopter flight simulation, and associated the costs of north-up maps with a cognitive component related to orientation, and the costs of track-up maps with a cognitive component related to inconsistent landmark location. Different tasks are associated with different dependence on these components. The results are discussed in terms of their implications for map design, and for cognitive models of navigational processes.

  15. Assessment of Cognitive Function in the Water Maze Task: Maximizing Data Collection and Analysis in Animal Models of Brain Injury.

    Science.gov (United States)

    Whiting, Mark D; Kokiko-Cochran, Olga N

    2016-01-01

    Animal models play a critical role in understanding the biomechanical, pathophysiological, and behavioral consequences of traumatic brain injury (TBI). In preclinical studies, cognitive impairment induced by TBI is often assessed using the Morris water maze (MWM). Frequently described as a hippocampally dependent spatial navigation task, the MWM is a highly integrative behavioral task that requires intact functioning in numerous brain regions and involves an interdependent set of mnemonic and non-mnemonic processes. In this chapter, we review the special considerations involved in using the MWM in animal models of TBI, with an emphasis on maximizing the degree of information extracted from performance data. We include a theoretical framework for examining deficits in discrete stages of cognitive function and offer suggestions for how to make inferences regarding the specific nature of TBI-induced cognitive impairment. The ultimate goal is more precise modeling of the animal equivalents of the cognitive deficits seen in human TBI. PMID:27604738

  16. Task management in the new ATLAS production system

    International Nuclear Information System (INIS)

    This document describes the design of the new Production System of the ATLAS experiment at the LHC [1]. The Production System is the top-level workflow manager which translates physicists' needs for production-level processing and analysis into actual workflows executed across over a hundred Grid sites used globally by ATLAS. As the production workload increased in volume and complexity in recent years (the ATLAS production task count is above one million, with each task containing hundreds or thousands of jobs), there is a need to upgrade the Production System to meet the challenging requirements of the next LHC run while minimizing the operating costs. In the new design, the main subsystems are the Database Engine for Tasks (DEFT) and the Job Execution and Definition Interface (JEDI). Based on users' requests, DEFT manages inter-dependent groups of tasks (Meta-Tasks) and generates corresponding data processing workflows. The JEDI component then dynamically translates the task definitions from DEFT into actual workload jobs executed in the PanDA Workload Management System [2]. We present the requirements, design parameters, basics of the object model and concrete solutions utilized in building the new Production System and its components.
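
    As a purely illustrative sketch of the object model described (a Meta-Task as a dependency-ordered group of tasks, each later expanded into jobs), consider the following; all class and field names are invented for illustration and are not taken from the DEFT or JEDI code bases.

        from dataclasses import dataclass, field

        @dataclass
        class Task:
            name: str
            depends_on: list = field(default_factory=list)  # upstream task names
            n_jobs: int = 0       # filled in later by the job-definition layer

        @dataclass
        class MetaTask:
            tasks: dict

            def ready_tasks(self, done):
                """Tasks whose upstream dependencies are all complete."""
                return [t for t in self.tasks.values()
                        if t.name not in done
                        and all(d in done for d in t.depends_on)]

        meta = MetaTask(tasks={
            "evgen": Task("evgen"),
            "simul": Task("simul", depends_on=["evgen"]),
            "recon": Task("recon", depends_on=["simul"]),
        })
        print([t.name for t in meta.ready_tasks(done=set())])  # ['evgen']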

  17. Analysis of the situation of the vacuum in FTU (Report of the work carried out by the task force)

    Energy Technology Data Exchange (ETDEWEB)

    Alessandrini, C.; Angelini, B.; Apicella, M.L.; Mazzitelli, G.; Pirani, S.; Zanza, V. [ENEA, Centro Ricerche Frascati, Rome (Italy). Dipt. Energia]

    1999-01-01

    To analyze the vacuum situation in the FTU tokamak, a task force was set up on 22/5/96 to identify the problem(s) and to settle the operating and cleaning procedures. The main actions of the task force were: leak tests, an automatic procedure to monitor the state of the machine vacuum on line, and an exhaustive analysis of the work done before. The task force reviewed the outgassing measurements of the plastic materials inserted into the machine, and it was decided to repeat the test on the thermocouples. The results pointed out that the thermocouples are a practically infinite reservoir of water. The outcome of the task force was a set of new procedures and recommendations for both the operation of FTU and the shutdown periods. FTU is now operating at more acceptable plasma purity. [Italian] On 22/5/96 a Task Force (TF) was set up to analyze the causes of the "uncleanliness" of FTU and to identify the new procedures to be followed for cleaning the vacuum chamber. The actions undertaken by the TF were: leak tests, continuous monitoring of the vacuum state, and a critical review of the work done previously. Measurements made earlier on the materials present in the FTU vacuum chamber were then analyzed and, in the case of the thermocouples, it was decided to repeat the test. This second test showed that the cylinders of these thermocouples are inexhaustible reservoirs of water, replenished at every reopening of the machine. The work of the TF concluded with a series of procedures and recommendations to be followed before every reopening of the machine and during experimental campaigns with the machine cold; these procedures take into account the fact that the amount of plastic material present in the vacuum chamber, especially in the ports, has increased over the years. FTU is currently operating in more acceptable conditions of chamber cleanliness.

  18. Task-based strategy for optimized contrast enhanced breast imaging: Analysis of six imaging techniques for mammography and tomosynthesis

    Energy Technology Data Exchange (ETDEWEB)

    Ikejimba, Lynda C., E-mail: lci@duke.edu [Medical Physics Graduate Program, Duke University, Durham, North Carolina 27705 and Carl E. Ravin Advanced Imaging Laboratories, Department of Radiology, Duke University Medical Center, Durham, North Carolina 27705 (United States); Kiarashi, Nooshin [Carl E. Ravin Advanced Imaging Laboratories, Department of Radiology, Duke University Medical Center, Durham, North Carolina 27705 and Department of Electrical and Computer Engineering, Duke University, Durham, North Carolina 27705 (United States); Ghate, Sujata V. [Carl E. Ravin Advanced Imaging Laboratories, Department of Radiology, Duke University Medical Center, Durham, North Carolina 27705 (United States); Samei, Ehsan [Medical Physics Graduate Program, Duke University, Durham, North Carolina 27705 (United States); Carl E. Ravin Advanced Imaging Laboratories, Department of Radiology, Duke University Medical Center, Durham, North Carolina 27705 (United States); Department of Electrical and Computer Engineering, Duke University, Durham, North Carolina 27705 (United States); Department of Physics, Duke University, Durham, North Carolina 27705 (United States); Department of Biomedical Engineering, Duke University, Durham, North Carolina 27705 (United States); Lo, Joseph Y. [Medical Physics Graduate Program, Duke University, Durham, North Carolina 27705 (United States); Carl E. Ravin Advanced Imaging Laboratories, Department of Radiology, Duke University Medical Center, Durham, North Carolina 27705 (United States); Department of Electrical and Computer Engineering, Duke University, Durham, North Carolina 27705 (United States); Department of Biomedical Engineering, Duke University, Durham, North Carolina 27705 (United States)

    2014-06-15

    Purpose: The use of contrast agents in breast imaging can enhance nodule detectability and provide physiological information. Accordingly, there has been a growing trend toward using iodine as a contrast medium in digital mammography (DM) and digital breast tomosynthesis (DBT). Widespread use raises the question of how best to use iodine in DM and DBT, so a comparison is needed to evaluate typical iodine-enhanced imaging methods. This study used a task-based observer model to determine the optimal imaging approach by analyzing six imaging paradigms in terms of their ability to resolve iodine at a given dose: unsubtracted mammography and tomosynthesis, temporal subtraction mammography and tomosynthesis, and dual energy subtraction mammography and tomosynthesis. Methods: Imaging performance was characterized using a detectability index d′, derived from the system task transfer function (TTF), an imaging task, the iodine signal difference, and the noise power spectrum (NPS). The task modeled a 10 mm diameter lesion containing iodine concentrations between 2.1 mg/cc and 8.6 mg/cc. The TTF was obtained using an edge phantom, and the NPS was measured over several exposure levels, energies, and target-filter combinations. Using a structured CIRS phantom, d′ was generated as a function of dose and iodine concentration. Results: For all iodine concentrations and doses, the temporal subtraction techniques for mammography and tomosynthesis yielded the highest d′, while the dual energy techniques for both modalities demonstrated the next best performance. Unsubtracted imaging resulted in the lowest d′ values for both modalities, with unsubtracted mammography performing the worst of all six paradigms. Conclusions: At any dose, temporal subtraction imaging provides the greatest detectability, with temporally subtracted DBT performing best. The authors attribute this performance to the excellent cancellation of in-plane structures.
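
    For reference, the detectability index described above is commonly computed, for a prewhitening model observer, from the task function, TTF, and NPS as

        \[
        {d'}^{2} \;=\; \iint \frac{\bigl|W_{\mathrm{task}}(u,v)\bigr|^{2}\,\mathrm{TTF}^{2}(u,v)}{\mathrm{NPS}(u,v)}\, du\, dv,
        \]

    where W_task(u,v) is the Fourier transform of the task function (here carrying the iodine signal difference of the 10 mm lesion) and (u,v) are spatial frequencies. This is the standard form from the task-based image-quality literature and is offered here as an assumption; the paper may use a variant (e.g., a non-prewhitening observer with an eye filter).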

  19. Task-based strategy for optimized contrast enhanced breast imaging: Analysis of six imaging techniques for mammography and tomosynthesis

    International Nuclear Information System (INIS)

    Purpose: The use of contrast agents in breast imaging can enhance nodule detectability and provide physiological information. Accordingly, there has been a growing trend toward using iodine as a contrast medium in digital mammography (DM) and digital breast tomosynthesis (DBT). Widespread use raises the question of how best to use iodine in DM and DBT, so a comparison is needed to evaluate typical iodine-enhanced imaging methods. This study used a task-based observer model to determine the optimal imaging approach by analyzing six imaging paradigms in terms of their ability to resolve iodine at a given dose: unsubtracted mammography and tomosynthesis, temporal subtraction mammography and tomosynthesis, and dual energy subtraction mammography and tomosynthesis. Methods: Imaging performance was characterized using a detectability index d′, derived from the system task transfer function (TTF), an imaging task, the iodine signal difference, and the noise power spectrum (NPS). The task modeled a 10 mm diameter lesion containing iodine concentrations between 2.1 mg/cc and 8.6 mg/cc. The TTF was obtained using an edge phantom, and the NPS was measured over several exposure levels, energies, and target-filter combinations. Using a structured CIRS phantom, d′ was generated as a function of dose and iodine concentration. Results: For all iodine concentrations and doses, the temporal subtraction techniques for mammography and tomosynthesis yielded the highest d′, while the dual energy techniques for both modalities demonstrated the next best performance. Unsubtracted imaging resulted in the lowest d′ values for both modalities, with unsubtracted mammography performing the worst of all six paradigms. Conclusions: At any dose, temporal subtraction imaging provides the greatest detectability, with temporally subtracted DBT performing best. The authors attribute this performance to the excellent cancellation of in-plane structures.

  20. Effectiveness of the random sequential absorption algorithm in the analysis of volume elements with nanoplatelets

    DEFF Research Database (Denmark)

    Pontefisso, Alessandro; Zappalorto, Michele; Quaresimin, Marino

    2016-01-01

    In this work, a study of the Random Sequential Absorption (RSA) algorithm in the generation of nanoplatelet Volume Elements (VEs) is carried out. The effect of the algorithm input parameters on the reinforcement distribution is studied through the implementation of statistical tools, showing...
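
    The RSA generation loop referred to above can be sketched generically: candidate particles are placed at random, one at a time, and rejected whenever they overlap an already accepted particle, until the target reinforcement fraction or an attempt budget is reached. The Python sketch below is a 2-D illustration with circular particles and invented parameters, not the authors' implementation (which generates 3-D volume elements with platelet geometries).

        # Generic sketch of the Random Sequential Absorption (RSA) algorithm
        # for a square volume element; all parameters are illustrative.
        import math
        import random

        def generate_rsa(ve_size=100.0, radius=5.0, target_fraction=0.2,
                         max_attempts=100_000, seed=0):
            rng = random.Random(seed)
            centers = []
            area_target = target_fraction * ve_size ** 2
            area_particle = math.pi * radius ** 2
            attempts = 0
            while (len(centers) * area_particle < area_target
                   and attempts < max_attempts):
                attempts += 1
                x = rng.uniform(radius, ve_size - radius)
                y = rng.uniform(radius, ve_size - radius)
                # Accept the candidate only if it overlaps no accepted particle.
                if all((x - cx) ** 2 + (y - cy) ** 2 >= (2 * radius) ** 2
                       for cx, cy in centers):
                    centers.append((x, y))
            return centers

        particles = generate_rsa()
        print(f"placed {len(particles)} particles")

    The inputs of the sketch (particle size, target fraction, attempt budget, random seed) correspond to the kind of algorithm input parameters whose effect on the reinforcement distribution the record describes as being studied statistically.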