WorldWideScience

Sample records for task analysis diagram

  1. Analysis of Sequence Diagram Layout in Advanced UML Modelling Tools

    Directory of Open Access Journals (Sweden)

    Ņikiforova Oksana

    2016-05-01

    Full Text Available System modelling using the Unified Modelling Language (UML) is a task that has to be solved during software development. The more complex the software becomes, the higher the requirements for demonstrating the system to be developed, especially in its dynamic aspect, which in UML is offered by the sequence diagram. To solve this task, the main attention is devoted to the graphical presentation of the system, where diagram layout plays the central role in information perception. The UML sequence diagram, due to its specific structure, is selected for a deeper analysis of element layout. The authors' research examines the ability of modern UML modelling tools to lay out the UML sequence diagram automatically and analyses them according to the criteria required for diagram perception.

  2. Formal Analysis Of Use Case Diagrams

    Directory of Open Access Journals (Sweden)

    Radosław Klimek

    2010-01-01

    Full Text Available Use case diagrams play an important role in modeling with UML. Careful modeling is crucial in obtaining a correct and efficient system architecture. The paper concerns the formal analysis of use case diagrams. A formal model of use cases is proposed and its construction for typical relationships between use cases is described. Two methods of formal analysis and verification are presented. The first one, based on state exploration, represents a model checking approach. The second one refers to symbolic reasoning using formal methods of temporal logic. A simple but representative example of use case scenario verification is discussed.

  3. Using influence diagrams for data worth analysis

    International Nuclear Information System (INIS)

    Sharif Heger, A.; White, Janis E.

    1997-01-01

    Decision-making under uncertainty describes most environmental remediation and waste management problems. Inherent limitations in knowledge concerning contaminants, environmental fate and transport, remedies, and risks force decision-makers to select a course of action based on uncertain and incomplete information. Because uncertainties can be reduced by collecting additional data, uncertainty and sensitivity analysis techniques have received considerable attention. When costs associated with reducing uncertainty are considered in a decision problem, the objective changes; rather than determine what data to collect to reduce overall uncertainty, the goal is to determine what data to collect to best differentiate between possible courses of action or decision alternatives. Environmental restoration and waste management requires cost-effective methods for characterization and monitoring, and these methods must also satisfy regulatory requirements. Characterization and monitoring activities imply that, sooner or later, a decision must be made about collecting new field data. Limited fiscal resources for data collection should be committed only to those data that have the most impact on the decision at the lowest possible cost. Applying influence diagrams in combination with data worth analysis produces a method which not only satisfies these requirements but also gives rise to an intuitive representation of complex structures not possible in the more traditional decision tree representation. This paper demonstrates the use of influence diagrams in data worth analysis by applying them to a monitor-and-treat problem often encountered in environmental decision problems.
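
    Data worth calculations of this kind are usually bounded by the expected value of perfect information (EVPI): no data collection campaign can be worth more than resolving the uncertainty completely before acting. The following minimal Python sketch illustrates the calculation with an entirely hypothetical payoff table and prior that are not taken from the paper.

        import numpy as np

        # Hypothetical decision problem: choose a remediation alternative before knowing the
        # true contaminant state. The payoff table and prior are illustrative, not from the paper.
        alternatives = ["monitor only", "pump and treat", "excavate"]
        prior = np.array([0.7, 0.3])                      # P(low contamination), P(high contamination)
        utility = np.array([[ -1.0, -20.0],               # utility (say, -cost in M$) of each
                            [ -6.0,  -8.0],               # alternative under each state
                            [-12.0, -12.0]])

        expected = utility @ prior                        # expected utility of acting now
        best_now = expected.max()
        best_if_known = utility.max(axis=0) @ prior       # expected utility with perfect information
        evpi = best_if_known - best_now                   # upper bound on the worth of any new data
        print("best alternative now:", alternatives[int(expected.argmax())])
        print("expected value of perfect information: %.2f M$" % evpi)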

  4. Cognitive task analysis

    NARCIS (Netherlands)

    Schraagen, J.M.C.

    2000-01-01

    Cognitive task analysis is defined as the extension of traditional task analysis techniques to yield information about the knowledge, thought processes and goal structures that underlie observable task performance. Cognitive task analyses are conducted for a wide variety of purposes, including the

  5. Cost-effectiveness Analysis with Influence Diagrams.

    Science.gov (United States)

    Arias, M; Díez, F J

    2015-01-01

    Cost-effectiveness analysis (CEA) is used increasingly in medicine to determine whether the health benefit of an intervention is worth the economic cost. Decision trees, the standard decision modeling technique for non-temporal domains, can only perform CEA for very small problems. Our objective was to develop a method for CEA in problems involving several dozen variables. We explain how to build influence diagrams (IDs) that explicitly represent cost and effectiveness. We propose an algorithm for evaluating cost-effectiveness IDs directly, i.e., without expanding an equivalent decision tree. The evaluation of an ID returns a set of intervals for the willingness to pay - separated by cost-effectiveness thresholds - and, for each interval, the cost, the effectiveness, and the optimal intervention. The algorithm that evaluates the ID directly is in general much more efficient than the brute-force method, which is in turn more efficient than the expansion of an equivalent decision tree. Using OpenMarkov, an open-source software tool that implements this algorithm, we have been able to perform CEAs on several IDs whose equivalent decision trees contain millions of branches. IDs can perform CEA on large problems that cannot be analyzed with decision trees.
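
    The interval output described above can be reproduced in miniature with net monetary benefit: for each willingness-to-pay value lambda, the optimal intervention maximizes lambda * effectiveness - cost, and the thresholds are the values where the maximizer changes. Below is a rough Python sketch with made-up costs and effectiveness values; it is not OpenMarkov and not data from the paper.

        import numpy as np

        # Hypothetical interventions: (cost in euros, effectiveness in QALYs); not from the paper.
        interventions = {"no therapy": (0.0, 1.0), "drug A": (5000.0, 1.4), "drug B": (14000.0, 1.6)}

        lambdas = np.linspace(0, 60000, 601)              # willingness-to-pay grid (euro/QALY)
        best = []
        for lam in lambdas:
            # net monetary benefit: NMB = lambda * effectiveness - cost
            nmb = {name: lam * e - c for name, (c, e) in interventions.items()}
            best.append(max(nmb, key=nmb.get))

        # report willingness-to-pay intervals and the thresholds where the optimum changes
        current, start = best[0], lambdas[0]
        for lam, b in zip(lambdas, best):
            if b != current:
                print("lambda in [%8.0f, %8.0f): optimal = %s" % (start, lam, current))
                current, start = b, lam
        print("lambda >= %8.0f: optimal = %s" % (start, current))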

  6. Critical point analysis of phase envelope diagram

    Energy Technology Data Exchange (ETDEWEB)

    Soetikno, Darmadi; Siagian, Ucok W. R. [Department of Petroleum Engineering, Institut Teknologi Bandung, Jl. Ganesha 10, Bandung 40132 (Indonesia); Kusdiantara, Rudy, E-mail: rkusdiantara@s.itb.ac.id; Puspita, Dila, E-mail: rkusdiantara@s.itb.ac.id; Sidarto, Kuntjoro A., E-mail: rkusdiantara@s.itb.ac.id; Soewono, Edy; Gunawan, Agus Y. [Department of Mathematics, Institut Teknologi Bandung, Jl. Ganesha 10, Bandung 40132 (Indonesia)

    2014-03-24

    A phase diagram, or phase envelope, is a relation between temperature and pressure that shows the conditions of equilibrium between the different phases of chemical compounds, mixtures of compounds, and solutions. The phase diagram is an important issue in chemical thermodynamics and hydrocarbon reservoir engineering. It is very useful for process simulation, hydrocarbon reactor design, and petroleum engineering studies. It is constructed from the bubble line, the dew line, and the critical point. The bubble line and the dew line are composed of bubble points and dew points, respectively. The bubble point is the first point at which gas is formed when a liquid is heated, whereas the dew point is the first point at which liquid is formed when a gas is cooled. The critical point is the point where all the properties of the gas and the liquid become equal, such as temperature, pressure, amount of substance, and others. The critical point is very useful in fuel processing and in the dissolution of certain chemicals. In this paper, we derive the critical point analytically and compare it with numerical calculations based on the Peng-Robinson equation of state using the Newton-Raphson method. As case studies, several hydrocarbon mixtures are simulated using Matlab.
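
    For a pure component, the Newton-Raphson step used in such calculations can be shown in a few lines by solving the cubic form of the Peng-Robinson equation of state for the vapour-like compressibility factor. The Python sketch below uses assumed critical constants for methane; the paper itself treats multicomponent mixtures and the full envelope, which require mixing rules and flash calculations on top of this step.

        import numpy as np

        R = 8.314  # J/(mol K)

        def pr_Z(T, P, Tc, Pc, omega, tol=1e-10, max_iter=100):
            """Vapour-like compressibility factor from the Peng-Robinson EOS via Newton-Raphson."""
            kappa = 0.37464 + 1.54226 * omega - 0.26992 * omega**2
            alpha = (1.0 + kappa * (1.0 - np.sqrt(T / Tc)))**2
            a = 0.45724 * R**2 * Tc**2 / Pc * alpha
            b = 0.07780 * R * Tc / Pc
            A = a * P / (R * T)**2
            B = b * P / (R * T)
            # cubic form: Z^3 - (1 - B) Z^2 + (A - 3B^2 - 2B) Z - (AB - B^2 - B^3) = 0
            f  = lambda Z: Z**3 - (1 - B) * Z**2 + (A - 3*B**2 - 2*B) * Z - (A*B - B**2 - B**3)
            df = lambda Z: 3*Z**2 - 2*(1 - B) * Z + (A - 3*B**2 - 2*B)
            Z = 1.0                      # ideal-gas start converges to the vapour root
            for _ in range(max_iter):
                step = f(Z) / df(Z)
                Z -= step
                if abs(step) < tol:
                    break
            return Z

        # Assumed critical data for methane: Tc = 190.6 K, Pc = 4.599 MPa, acentric factor 0.011.
        print(pr_Z(T=150.0, P=1.0e6, Tc=190.6, Pc=4.599e6, omega=0.011))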

  7. Critical point analysis of phase envelope diagram

    International Nuclear Information System (INIS)

    Soetikno, Darmadi; Siagian, Ucok W. R.; Kusdiantara, Rudy; Puspita, Dila; Sidarto, Kuntjoro A.; Soewono, Edy; Gunawan, Agus Y.

    2014-01-01

    A phase diagram, or phase envelope, is a relation between temperature and pressure that shows the conditions of equilibrium between the different phases of chemical compounds, mixtures of compounds, and solutions. The phase diagram is an important issue in chemical thermodynamics and hydrocarbon reservoir engineering. It is very useful for process simulation, hydrocarbon reactor design, and petroleum engineering studies. It is constructed from the bubble line, the dew line, and the critical point. The bubble line and the dew line are composed of bubble points and dew points, respectively. The bubble point is the first point at which gas is formed when a liquid is heated, whereas the dew point is the first point at which liquid is formed when a gas is cooled. The critical point is the point where all the properties of the gas and the liquid become equal, such as temperature, pressure, amount of substance, and others. The critical point is very useful in fuel processing and in the dissolution of certain chemicals. In this paper, we derive the critical point analytically and compare it with numerical calculations based on the Peng-Robinson equation of state using the Newton-Raphson method. As case studies, several hydrocarbon mixtures are simulated using Matlab.

  8. Stabilization diagrams using operational modal analysis and sliding filters

    DEFF Research Database (Denmark)

    Olsen, Peter; Juul, Martin Ørum Ørhem; Tarpø, Marius Glindtvad

    2017-01-01

    This paper presents a filtering technique for doing effective operational modal analysis. The result of the filtering method is the construction of a stabilization diagram that clearly separates physical poles from the spurious noise poles needed for unbiased fitting. A band pass filter is moved slowly over...

  9. Level Diagrams analysis of Pareto Front for multiobjective system redundancy allocation

    International Nuclear Information System (INIS)

    Zio, E.; Bazzo, R.

    2011-01-01

    Reliability-based and risk-informed design, operation, maintenance and regulation lead to multiobjective (multicriteria) optimization problems. In this context, the Pareto Front and Set found in a multiobjective optimality search provide a family of solutions among which the decision maker has to look for the best choice according to his or her preferences. Efficient visualization techniques for Pareto Front and Set analyses are needed for helping decision makers in the selection task. In this paper, we consider the multiobjective optimization of system redundancy allocation and use the recently introduced Level Diagrams technique for graphically representing the resulting Pareto Front and Set. Each objective and decision variable is represented on separate diagrams where the points of the Pareto Front and Set are positioned according to their proximity to ideally optimal points, as measured by a metric of normalized objective values. All diagrams are synchronized across all objectives and decision variables. On the basis of the analysis of the Level Diagrams, we introduce a procedure for reducing the number of solutions in the Pareto Front; from the reduced set of solutions, the decision maker can more easily identify his or her preferred solution.
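
    The norm behind the Level Diagrams can be reproduced in a few lines: each objective is normalized to [0, 1] (0 being the preferred end) and every Pareto solution is assigned the Euclidean norm of its normalized objective vector, which becomes the common y-axis of all diagrams. The Python sketch below uses an invented two-objective redundancy-allocation front; the fronts and the reduction procedure in the paper are more elaborate.

        import numpy as np

        def level_diagram_norms(front, minimize=None):
            """Normalize each objective of a Pareto front to [0, 1] (0 = preferred end) and
            return the Euclidean norm of each solution's normalized objective vector."""
            F = np.asarray(front, dtype=float)
            minimize = [True] * F.shape[1] if minimize is None else minimize
            G = np.empty_like(F)
            for j in range(F.shape[1]):
                g = (F[:, j] - F[:, j].min()) / (F[:, j].max() - F[:, j].min())
                G[:, j] = g if minimize[j] else 1.0 - g
            return np.linalg.norm(G, axis=1)

        # Invented two-objective front: (system unavailability, cost), both to be minimized.
        front = [(0.010, 120.0), (0.008, 150.0), (0.005, 210.0), (0.003, 320.0), (0.002, 500.0)]
        for (u, c), n in zip(front, level_diagram_norms(front)):
            print("unavailability=%.3f  cost=%6.1f  level=%.3f" % (u, c, n))
        # Each Level Diagram plots one objective (x-axis) against this same norm (y-axis), so a
        # given solution appears at the same height in every diagram (synchronized views).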

  10. Drawing Euler Diagrams with Circles

    OpenAIRE

    Stapleton, Gem; Zhang, Leishi; Howse, John; Rodgers, Peter

    2010-01-01

    Euler diagrams are a popular and intuitive visualization tool used in a wide variety of application areas, including biological and medical data analysis. As with other data visualization methods, such as graphs, bar charts, or pie charts, the automated generation of an Euler diagram from a suitable data set would be advantageous, removing the burden of manual data analysis and the subsequent task of drawing an appropriate diagram. Various methods have emerged that automatically dra...

  11. Development and Application of a Rubric for Analysis of Novice Students' Laboratory Flow Diagrams

    Science.gov (United States)

    Davidowitz, Bette; Rollnick, Marissa; Fakudze, Cynthia

    2005-01-01

    The purpose of this study was to develop and apply a scheme for the analysis of flow diagrams. The flow diagrams in question are a schematic representation of written instructions that require students to process the text of their practical manual. It was hoped that an analysis of the flow diagrams would provide insight into students'…

  12. Stock flow diagram analysis on solid waste management in Malaysia

    Science.gov (United States)

    Zulkipli, Faridah; Nopiah, Zulkifli Mohd; Basri, Noor Ezlin Ahmad; Kie, Cheng Jack

    2016-10-01

    The effectiveness of solid waste management is of major importance to society. The large amount of solid waste generated by our daily activities poses a risk to our communities, due to rapid population growth and advances in economic development. Moreover, solid waste management is inherently large in scale, diverse, and subject to uncertainties, and it must serve stakeholders with diverging objectives. In this paper, we propose a system dynamics simulation, developing a stock flow diagram to illustrate the solid waste generation process and the waste recycling process. The analysis highlights the impact of a growing population on the amount of solid waste generated and the amount of recycled waste. The results show that an increase in population, together with an increase in the amount of recycled waste, will decrease the amount of waste generated for disposal. This positively supports the government's aim of minimizing the amount of waste to be disposed of by the year 2020.
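
    A stock flow structure of this kind boils down to stocks that are updated by flows at each time step. The toy Python simulation below shows the mechanics with assumed population, generation and recycling rates; these are not the calibrated values of the study.

        # Minimal stock-flow sketch with hypothetical rates (not the calibrated model of the study).
        # Stocks: population, accumulated disposed waste; flows: waste generation and recycling.
        population = 31.0e6          # initial population (assumed)
        growth_rate = 0.015          # fractional population growth per year (assumed)
        waste_per_capita = 0.4       # tonnes generated per person per year (assumed)
        recycle_fraction = 0.10      # initial recycling fraction (assumed)
        recycle_growth = 0.02        # yearly increase of the recycling fraction (assumed)
        dt = 1.0                     # time step in years
        disposed = 0.0               # stock of waste sent to disposal

        for year in range(2016, 2031):
            generated = population * waste_per_capita            # generation flow (t/year)
            recycled = recycle_fraction * generated              # recycling flow (t/year)
            disposed += (generated - recycled) * dt              # stock update
            print("%d: generated %.2f Mt, recycled %.2f Mt, disposed stock %.1f Mt"
                  % (year, generated / 1e6, recycled / 1e6, disposed / 1e6))
            population *= 1 + growth_rate * dt                   # population stock update
            recycle_fraction = min(recycle_fraction + recycle_growth * dt, 0.40)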

  13. The root cause of ability and inability to assemble and install components using written manual with or without diagrams among non-native English speakers: Root cause analysis

    Science.gov (United States)

    Shukri, S. Ahmad; Millar, R.; Gratton, G.; Garner, M.; Noh, H. Mohd

    2017-12-01

    Documentation errors and human errors are often claimed to be contributory factors in aircraft maintenance mistakes. This paper highlights the preliminary results of the third phase of a four-phase study on the communication media used in an aircraft maintenance organisation. The second phase examined the probability of 60 subjects succeeding or failing to complete a task; in this third phase, the same subjects were interviewed immediately after completing the task, using the Root Cause Analysis (RCA) method. It was found that the root cause of the inability to finish the task when using only the written manual was the absence of diagrams, whereas haste was identified as the root cause of task incompletion when both the manual and the diagram were given to the participants. Those who were able to complete the task did so by referring to both the manual and the diagram simultaneously.

  14. A fault tree analysis strategy using binary decision diagrams

    International Nuclear Information System (INIS)

    Reay, Karen A.; Andrews, John D.

    2002-01-01

    The use of binary decision diagrams (BDDs) in fault tree analysis provides both an accurate and efficient means of analysing a system. There is a problem, however, with the conversion process of the fault tree to the BDD. The variable ordering scheme chosen for the construction of the BDD has a crucial effect on its resulting size, and previous research has failed to identify any scheme that is capable of producing BDDs for all fault trees. This paper proposes an analysis strategy aimed at increasing the likelihood of obtaining a BDD for any given fault tree, by ensuring the associated calculations are as efficient as possible. The method implements simplification techniques, which are applied to the fault tree to obtain a set of 'minimal' subtrees, equivalent to the original fault tree structure. BDDs are constructed for each, using ordering schemes most suited to their particular characteristics. Quantitative analysis is performed simultaneously on the set of BDDs to obtain the top event probability, the system unconditional failure intensity and the criticality of the basic events.
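
    The core of the approach, building a reduced ordered BDD for the fault tree under a chosen variable ordering and then reading off the exact top-event probability by Shannon decomposition, can be sketched compactly. The Python code below is a minimal illustrative implementation, not the authors' tool; the gate structure and failure probabilities are invented.

        class BDD:
            """Minimal reduced ordered BDD: terminals are the ints 0 and 1,
            internal nodes are tuples (var, low_child, high_child)."""

            def __init__(self, order):
                self.rank = {v: i for i, v in enumerate(order)}   # variable ordering

            def mk(self, var, lo, hi):
                return lo if lo == hi else (var, lo, hi)          # reduction rule

            def var(self, v):
                return self.mk(v, 0, 1)

            def apply(self, op, u, v, memo=None):
                """Combine two BDDs with 'and' / 'or' (Bryant-style apply)."""
                memo = {} if memo is None else memo
                key = (op, u, v)
                if key in memo:
                    return memo[key]
                if u in (0, 1) and v in (0, 1):
                    res = (u | v) if op == "or" else (u & v)
                else:
                    tops = [n[0] for n in (u, v) if isinstance(n, tuple)]
                    x = min(tops, key=lambda t: self.rank[t])     # earliest variable in the ordering
                    u0, u1 = (u[1], u[2]) if isinstance(u, tuple) and u[0] == x else (u, u)
                    v0, v1 = (v[1], v[2]) if isinstance(v, tuple) and v[0] == x else (v, v)
                    res = self.mk(x, self.apply(op, u0, v0, memo), self.apply(op, u1, v1, memo))
                memo[key] = res
                return res

            def prob(self, u, p, memo=None):
                """Exact top-event probability for independent basic events, P(failure) = p[var]."""
                memo = {} if memo is None else memo
                if u in (0, 1):
                    return float(u)
                if u not in memo:
                    var, lo, hi = u
                    memo[u] = (1 - p[var]) * self.prob(lo, p, memo) + p[var] * self.prob(hi, p, memo)
                return memo[u]

        # Invented example: TOP = (A AND B) OR (A AND C), ordering A < B < C
        bdd = BDD(order=["A", "B", "C"])
        A, B, C = bdd.var("A"), bdd.var("B"), bdd.var("C")
        top = bdd.apply("or", bdd.apply("and", A, B), bdd.apply("and", A, C))
        print(bdd.prob(top, {"A": 0.01, "B": 0.02, "C": 0.03}))   # 0.01 * (1 - 0.98*0.97) = 4.94e-4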

  15. Performance of Phonatory Deviation Diagrams in Synthesized Voice Analysis.

    Science.gov (United States)

    Lopes, Leonardo Wanderley; da Silva, Karoline Evangelista; da Silva Evangelista, Deyverson; Almeida, Anna Alice; Silva, Priscila Oliveira Costa; Lucero, Jorge; Behlau, Mara

    2018-05-02

    To analyze the performance of a phonatory deviation diagram (PDD) in discriminating the presence and severity of voice deviation and the predominant voice quality of synthesized voices. A speech-language pathologist performed the auditory-perceptual analysis of the synthesized voice (n = 871). The PDD distribution of voice signals was analyzed according to area, quadrant, shape, and density. Differences in signal distribution regarding the PDD area and quadrant were detected when differentiating the signals with and without voice deviation and with different predominant voice quality. Differences in signal distribution were found in all PDD parameters as a function of the severity of voice disorder. The PDD area and quadrant can differentiate normal voices from deviant synthesized voices. There are differences in signal distribution in PDD area and quadrant as a function of the severity of voice disorder and the predominant voice quality. However, the PDD area and quadrant do not differentiate the signals as a function of severity of voice disorder and differentiated only the breathy and rough voices from the normal and strained voices. PDD density is able to differentiate only signals with moderate and severe deviation. PDD shape shows differences between signals with different severities of voice deviation. © 2018 S. Karger AG, Basel.

  16. Task analysis and support for problem solving tasks

    International Nuclear Information System (INIS)

    Bainbridge, L.

    1987-01-01

    This paper is concerned with task analysis as the basis for ergonomic design to reduce human error rates, rather than for predicting human error rates. Task analysis techniques usually provide a set of categories for describing sub-tasks and a framework describing the relations between sub-tasks. Both the task-type categories and their organisation have implications for optimum interface and training design. In this paper, the framework needed for considering the most complex tasks faced by operators in the process industries, such as fault management in unexpected situations, is discussed, together with what is likely to minimise human error in these circumstances. (author)

  17. Quark-diagram analysis of charmed-baryon decays

    International Nuclear Information System (INIS)

    Kohara, Y.

    1991-01-01

    The Cabibbo-allowed two-body nonleptonic decays of charmed baryons to an SU(3)-octet (or -decuplet) baryon and a pseudoscalar meson are examined on the basis of the quark-diagram scheme. Some relations among the decay amplitudes or rates of various decay modes are derived. The decays of Ξc+ to a decuplet baryon are forbidden.

  18. Chiral symmetry breaking in gauge theories from Reggeon diagram analysis

    International Nuclear Information System (INIS)

    White, A.R.

    1991-01-01

    It is argued that reggeon diagrams can be used to study dynamical properties of gauge theories containing a large number of massless fermions. SU(2) gauge theory is studied in detail and it is argued that there is a high energy solution which is analogous to the solution of the massless Schwinger model. A generalized winding-number condensate produces the massless pseudoscalar spectrum associated with chiral symmetry breaking and a ''trivial'' S-Matrix

  19. Development of advanced MCR task analysis methods

    International Nuclear Information System (INIS)

    Na, J. C.; Park, J. H.; Lee, S. K.; Kim, J. K.; Kim, E. S.; Cho, S. B.; Kang, J. S.

    2008-07-01

    This report describes a task analysis methodology for advanced HSI designs. Task analysis was performed using procedure-based hierarchical task analysis and task decomposition methods, and the results were recorded in a database. Using the TA results, we developed a static prototype of the advanced HSI and human factors engineering verification and validation methods for evaluating the prototype. In addition to the procedure-based task analysis methods, workload estimation based on the analysis of task performance time, and analyses for the design of information structures and interaction structures, will be necessary.

  20. Cognitive task load analysis : Allocating tasks and designing support

    NARCIS (Netherlands)

    Neerincx, M.A.

    2003-01-01

    We present a method for Cognitive Task Analysis that guides the early stages of software development, aiming at an optimal cognitive load for operators of process control systems. The method is based on a practical theory of cognitive task load and support. In addition to the classical measure

  1. A study on the influence diagrams for the application to containment performance analysis

    International Nuclear Information System (INIS)

    Park, Joon Won

    1995-02-01

    Influence diagrams have been applied to the containment performance analysis of Young-Gwang 3 and 4 in an effort to explicitly display the dependencies between events and to treat operator intervention more generally. This study was initiated to remove three major drawbacks of the current event tree methodology: 1) the event tree cannot express dependencies between events explicitly; 2) the Accident Progression Event Tree (APET) cannot represent the entire containment system; 3) it is difficult to consider operator intervention with an event tree. To resolve these problems, a new approach, influence diagrams, is proposed. In the present work, the applicability of influence diagrams has been demonstrated for the YGN 3 and 4 containment performance analysis and for an assessment of accident management strategies. To show that the results of the application of influence diagrams are reasonable, they are compared with those of the YGN 3 and 4 IPE, and both are in good agreement. In addition, influence diagrams are used to assess two accident management strategies: 1) RCS depressurization and 2) cavity flooding. Cavity flooding has a favorable effect on late containment failure and basemat melt-through, and depressurization of the RCS is beneficial for steam generator tube rupture; however, the early containment failure probability is worse in both cases. As a result of the present study, it is shown that influence diagrams can be applied to containment performance analysis.

  2. Overview of job and task analysis

    International Nuclear Information System (INIS)

    Gertman, D.I.

    1984-01-01

    During the past few years the nuclear industry has become concerned with predicting human performance in nuclear power plants. One of the best means available at the present time to make sure that training, procedures, job performance aids and plant hardware match the capabilities and limitations of personnel is by performing a detailed analysis of the tasks required in each job position. The approved method for this type of analysis is referred to as job or task analysis. Job analysis is a broader type of analysis and is usually thought of in terms of establishing overall performance objectives, and in establishing a basis for position descriptions. Task analysis focuses on the building blocks of task performance, task elements, and places them within the context of specific performance requirements including time to perform, feedback required, special tools used, and required systems knowledge. The use of task analysis in the nuclear industry has included training validation, preliminary risk screening, and procedures development

  3. Radial frequency diagram (sunflower) for the analysis of diurnal cycle parameters: Solar energy application

    International Nuclear Information System (INIS)

    Božnar, Marija Zlata; Grašič, Boštjan; Mlakar, Primož; Soares, Jacyra; Pereira de Oliveira, Amauri; Costa, Tássio Santos

    2015-01-01

    Graphical abstract: A new type of graphical presentation showing the diurnal cycle of a solar energy forecast. Application to other parameters related to weather and green energy production is possible.
    Highlights: • The diurnal cycle of solar energy is important for the management of the electrical grid. • A solar plant’s average production depends on the statistical features of solar radiation. • A new tool, the “sunflower”, is proposed for representing solar energy availability. • The sunflower identifies and quantifies information with a clear diurnal cycle. • The sunflower diagram has been developed from the “wind rose” diagram.
    Abstract: Many meteorological parameters present a natural diurnal cycle because they are directly or indirectly dependent on sunshine exposure. The solar radiation diurnal pattern is important for energy production, agriculture, prognostic models, health and general climatology. This article introduces a new type of radial frequency diagram – hereafter called the sunflower – for the analysis of solar radiation data in connection with energy production and also for climatological studies. The diagram is based on two-dimensional data sorting. First, the data are sorted into classes representing the hours of the day. Then, the data in each hourly class are sorted into classes of the observed variable values. The relative frequencies of the value classes are shown as sections on each hour’s segment in a radial diagram. The radial diagram forms a unique pattern for each analysed dataset, which enables quick detection of features and comparison of several such patterns belonging to different datasets. The sunflower diagram enables a quick and comprehensive understanding of the diurnal cycle of the solar radiation data. It enables, in graphical form, quick screening and long-term statistics of huge data quantities when searching for their diurnal features and
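
    The two-dimensional sorting can be prototyped directly: bin the time series by hour of day, bin each hourly subset by value class, and draw the relative frequencies as stacked sectors on a polar plot. The Python sketch below uses synthetic hourly radiation values and is not the authors' implementation.

        import numpy as np
        import matplotlib.pyplot as plt

        # Synthetic hourly global radiation (W/m2) for one year -- purely illustrative data.
        rng = np.random.default_rng(0)
        hours = np.tile(np.arange(24), 365)
        shape = np.clip(np.sin((hours - 6) / 12 * np.pi), 0, None)        # crude clear-sky diurnal shape
        radiation = 900 * shape * rng.uniform(0.3, 1.0, size=hours.size)  # cloud variability

        # Two-dimensional sorting: hour-of-day classes, then value classes within each hour.
        edges = np.array([0.0, 1.0, 200.0, 400.0, 600.0, 800.0, 1000.0])  # W/m2 classes (first = night)
        freq = np.zeros((24, len(edges) - 1))
        for h in range(24):
            vals = radiation[hours == h]
            counts, _ = np.histogram(vals, bins=edges)
            freq[h] = counts / vals.size                                  # relative frequency per hour

        # Radial "sunflower": one angular segment per hour, stacked sections per value class.
        theta = 2 * np.pi * np.arange(24) / 24
        ax = plt.subplot(projection="polar")
        bottom = np.zeros(24)
        for j in range(freq.shape[1]):
            ax.bar(theta, freq[:, j], width=2 * np.pi / 24, bottom=bottom,
                   label="%.0f-%.0f W/m2" % (edges[j], edges[j + 1]))
            bottom += freq[:, j]
        ax.set_theta_zero_location("N")
        ax.set_theta_direction(-1)
        ax.legend(loc="upper left", bbox_to_anchor=(1.05, 1.0))
        plt.show()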

  4. Safety- barrier diagrams

    DEFF Research Database (Denmark)

    Duijm, Nijs Jan

    2008-01-01

    Safety-barrier diagrams and the related so-called 'bow-tie' diagrams have become popular methods in risk analysis. This paper describes the syntax and principles for constructing consistent and valid safety-barrier diagrams. The relation of safety-barrier diagrams to other methods such as fault trees and Bayesian networks is discussed. A simple method for quantification of safety-barrier diagrams is proposed. It is concluded that safety-barrier diagrams provide a useful framework for an electronic data structure that integrates information from risk analysis with operational safety management.

  5. Analysis of Human Communication during Assembly Tasks.

    Science.gov (United States)

    1986-06-01

    Report cover page (OCR): Analysis of Human Communication During Assembly Tasks, K. Suzanne Barber and Gerald J. Agin, CMU-RI-TR-86-1, The Robotics Institute, Carnegie-Mellon University, Pittsburgh, PA; interim report.

  6. How little data is enough? Phase-diagram analysis of sparsity-regularized X-ray computed tomography

    DEFF Research Database (Denmark)

    Jørgensen, Jakob Sauer; Sidky, E. Y.

    2015-01-01

    We introduce phase-diagram analysis, a standard tool in compressed sensing (CS), to the X-ray computed tomography (CT) community as a systematic method for determining how few projections suffice for accurate sparsity-regularized reconstruction. In CS, a phase diagram is a convenient way to study and express certain theoretical relations between sparsity and sufficient sampling. We adapt phase-diagram analysis for empirical use in X-ray CT for which the same theoretical results do not hold. We demonstrate in three case studies the potential of phase-diagram analysis for providing quantitative answers to questions of undersampling... taking randomized CT measurements does not lead to improved performance compared with standard structured sampling patterns. Finally, we show preliminary results of how well phase-diagram analysis can predict the sufficient number of projections for accurately reconstructing a large-scale image of a given sparsity by means...

  7. Network and system diagrams revisited: Satisfying CEA requirements for causality analysis

    International Nuclear Information System (INIS)

    Perdicoulis, Anastassios; Piper, Jake

    2008-01-01

    Published guidelines for Cumulative Effects Assessment (CEA) have called for the identification of cause-and-effect relationships, or causality, challenging researchers to identify methods that can possibly meet CEA's specific requirements. Together with an outline of these requirements from CEA key literature, the various definitions of cumulative effects point to the direction of a method for causality analysis that is visually-oriented and qualitative. This article consequently revisits network and system diagrams, resolves their reported shortcomings, and extends their capabilities with causal loop diagramming methodology. The application of the resulting composite causality analysis method to three Environmental Impact Assessment (EIA) case studies appears to satisfy the specific requirements of CEA regarding causality. Three 'moments' are envisaged for the use of the proposed method: during the scoping stage, during the assessment process, and during the stakeholder participation process

  8. An ergonomic task analysis of spinal anaesthesia.

    LENUS (Irish Health Repository)

    Ajmal, Muhammad

    2009-12-01

    Ergonomics is the study of physical interaction between humans and their working environment. The objective of this study was to characterize the performance of spinal anaesthesia in an acute hospital setting, applying ergonomic task analysis.

  9. Lattice and Phase Diagram in QCD

    International Nuclear Information System (INIS)

    Lombardo, Maria Paola

    2008-01-01

    Model calculations have produced a number of very interesting expectations for the QCD Phase Diagram, and the task of lattice calculations is to put these studies on quantitative grounds. I will give an overview of the current status of the lattice analysis of the QCD phase diagram, from the quantitative results of mature calculations at zero and small baryochemical potential, to the exploratory studies of the colder, denser phase.

  10. Safety-barrier diagrams

    DEFF Research Database (Denmark)

    Duijm, Nijs Jan

    2007-01-01

    Safety-barrier diagrams and the related so-called "bow-tie" diagrams have become popular methods in risk analysis. This paper describes the syntax and principles for constructing consistent and valid safety-barrier diagrams. The relation with other methods such as fault trees and Bayesian networks is discussed. A simple method for quantification of safety-barrier diagrams is proposed, including situations where safety barriers depend on shared common elements. It is concluded that safety-barrier diagrams provide a useful framework for an electronic data structure that integrates information from risk analysis with operational safety management.

  11. [Identification of meridian-acupoint diagrams and meridian diagrams].

    Science.gov (United States)

    Shen, Wei-hong

    2008-08-01

    In the acu-moxibustion literature, there are two kinds of diagrams: meridian-acupoint diagrams and meridian diagrams. Because they are very similar in outline, and because people today have seldom seen the typical ancient meridian diagrams, meridian-acupoint diagrams have long been incorrectly regarded as meridian diagrams, which has caused confusion in the acu-moxibustion community. The present paper stresses the importance of the distinction for academic research and introduces some methods for identifying the two kinds of diagrams correctly. The key points for identifying meridian-acupoint diagrams and meridian diagrams are the legends of the diagrams and the drawing style of the ancient charts. In addition, the author gives a detailed explanation of some acu-moxibustion charts that are easily confused. In order to distinguish meridian-acupoint diagrams from meridian diagrams correctly, one should understand the diagrams' intrinsic information as much as possible and analyse them comprehensively.

  12. Job and task analysis for technical staff

    International Nuclear Information System (INIS)

    Toline, B.C.

    1991-01-01

    In September 1989, Cooper Nuclear Station began a project to upgrade the Technical Staff Training Program. The project started with a job and task analysis for the technical staff. While the industry has long been committed to job and task analysis for targeting performance-based instruction for single job positions, this approach was unique in that it was not originally considered appropriate for a group as diverse as the technical staff. Much to the author's satisfaction, the job and task analysis proved much less complicated for the technical staff than he had imagined, and its benefits have become increasingly obvious as lesson plan development and course revisions proceed. The outline of this presentation is as follows: the philosophy adopted; preparation of the job survey document; performing the job analysis; performing the task analysis for technical staff and associated pitfalls; clustering objectives for training and comparison with the existing program; benefits now and in the future; the final phase (comparison with INPO guides and meeting the needs of non-degreed engineering professionals); and conclusion. By focusing on the performance-based needs of engineers rather than traditional academics for training, the author is confident that the future Technical Staff Program will meet the challenges ahead and will exceed the requirements for accreditation.

  13. [Comparison of film-screen combination in a contrast detail diagram and with interactive image analysis. 1: Contrast detail diagram].

    Science.gov (United States)

    Hagemann, G; Eichbaum, G

    1997-07-01

    The following three film-screen combinations were compared: a) a combination of anticrossover film and UV-light emitting screens, b) a combination of blue-light emitting screens and film, and c) a conventional green fluorescing screen film combination. Radiographs of a specially designed plexiglass phantom (0.2 x 0.2 x 0.12 m3) were obtained that contained bar patterns of lead and plaster (calcium sulfate) to test high and intermediate contrast resolution and bar patterns of air to test low contrast resolution, respectively. An aluminum step wedge was integrated to evaluate dose-density curves of the radiographs. The dose values for the various step thicknesses were measured as percentage of the dose value in air for 60, 81, and 117 kV. Exposure conditions were the following: 12 pulse generator, 0.6 mm focus size, 4.7 mm aluminum prefilter, a grid with 40 lines/cm (12:1), and a focus-detector distance of 1.15 m. The thresholds of visible bars of the various pattern materials were assessed by seven radiologists, one technician, and the authors. The resulting contrast detail diagram could not prove any significant differences between the three tested screen film combinations. The pairwise comparison, however, found 8 of the 18 paired differences to be statistically significant between the conventional and the two new screen-film combinations. The authors concluded that subjective visual assessment of the threshold in a contrast detail study alone is of only limited value to grade image quality if no well-defined criteria are used (BIR report 20 [1989] 137-139). The statistical approach of paired differences of the estimated means appeared to be more appropriate.

  14. Roundhouse Diagrams.

    Science.gov (United States)

    Ward, Robin E.; Wandersee, James

    2000-01-01

    Students must understand key concepts through reasoning, searching out related concepts, and making connections within multiple systems to learn science. The Roundhouse diagram was developed to be a concise, holistic, graphic representation of a science topic, process, or activity. Includes sample Roundhouse diagrams, a diagram checklist, and…

  15. Employment of neural networks for analysis of chemical composition and cooling rate effect on CCT diagrams shape

    International Nuclear Information System (INIS)

    Dobrzanski, L.A.; Trzaska, J.

    2004-01-01

    The paper presents the possibility of employing the original method for forecasting anisothermal supercooled-austenite transformation diagrams in an analysis of the effect of chemical composition on the shape of CCT diagrams. The developed model makes it possible to substitute computer simulation for the costly and time-consuming experiments. The information derived from the calculations makes it possible to plot diagrams illustrating the effects of particular elements or pairs of elements, as well as of the cooling rate and/or austenitizing temperature, on any temperature or time describing the transformations in steel during its continuous cooling. Evaluation is also possible of the effect of the aforementioned factors on hardness and on the fractions of the particular structural constituents. (author)

  16. A Cognitive Task Analysis for Dental Hygiene.

    Science.gov (United States)

    Cameron, Cheryl A.; Beemsterboer, Phyllis L.; Johnson, Lynn A.; Mislevy, Robert J.; Steinberg, Linda S.; Breyer, F. Jay

    2000-01-01

    As part of the development of a scoring algorithm for a simulation-based dental hygiene initial licensure examination, this effort conducted a task analysis of the dental hygiene domain. Broad classes of behaviors that distinguish along the dental hygiene expert-novice continuum were identified and applied to the design of nine paper-based cases…

  17. Task analysis in neurosciences programme design - neurological ...

    African Journals Online (AJOL)

    Defining educational objectives is the key to achieving the goal of professional competence in students. The technique of task analysis was selected to determine components of competence in clinical neurology appropriate to the needs of primary care. A survey of neurological problems in general practice revealed that ...

  18. Radiation protection technician job task analysis manual

    International Nuclear Information System (INIS)

    1990-03-01

    This manual was developed to assist all DOE contractors in the design and conduct of job task analysis (JTA) for the radiation protection technician. Experience throughout the nuclear industry and the DOE system has indicated that the quality and efficiency in conducting a JTA at most sites is greatly enhanced by using a generic task list for the position, and clearly written guidelines on the JTA process. This manual is designed to provide this information for personnel to use in developing and conducting site-specific JTAs. (VC)

  19. Analysis of Business Process at PT XYZ by Using SCOR Thread Diagram

    Science.gov (United States)

    Sembiring, M. T.; Rambe, H. C.

    2017-03-01

    Supply Chain Operations Reference (SCOR) is a standard supply chain performance evaluation model proposed by the Supply Chain Council (SCC). SCOR allows companies to analyse and evaluate their supply chain performance. SCOR includes the Thread Diagram, which describes a business process simply and systematically to support the analysis of a company's business processes. This research takes place at PT XYZ, a company in the Crude Palm Oil (CPO) industry. PT XYZ used to be the market leader in the CPO industry but nowadays it struggles to compete with new competitors. The purpose of this study is to provide input for the improvement of PT XYZ's business process in order to enhance the competitiveness of the company. The results show that two performance metrics are not reached. The analysis of the business process shows a lack of control by PT XYZ over the supplier and customer sides, which becomes the basis for the suggested improvements.

  20. VennDiagram: a package for the generation of highly-customizable Venn and Euler diagrams in R.

    Science.gov (United States)

    Chen, Hanbo; Boutros, Paul C

    2011-01-26

    Visualization of orthogonal (disjoint) or overlapping datasets is a common task in bioinformatics. Few tools exist to automate the generation of extensively-customizable, high-resolution Venn and Euler diagrams in the R statistical environment. To fill this gap we introduce VennDiagram, an R package that enables the automated generation of highly-customizable, high-resolution Venn diagrams with up to four sets and Euler diagrams with up to three sets. The VennDiagram package offers the user the ability to customize essentially all aspects of the generated diagrams, including font sizes, label styles and locations, and the overall rotation of the diagram. We have implemented scaled Venn and Euler diagrams, which increase graphical accuracy and visual appeal. Diagrams are generated as high-definition TIFF files, simplifying the process of creating publication-quality figures and easing integration with established analysis pipelines. The VennDiagram package allows the creation of high quality Venn and Euler diagrams in the R statistical environment.

  1. VennDiagram: a package for the generation of highly-customizable Venn and Euler diagrams in R

    Directory of Open Access Journals (Sweden)

    Boutros Paul C

    2011-01-01

    Full Text Available Abstract Background Visualization of orthogonal (disjoint) or overlapping datasets is a common task in bioinformatics. Few tools exist to automate the generation of extensively-customizable, high-resolution Venn and Euler diagrams in the R statistical environment. To fill this gap we introduce VennDiagram, an R package that enables the automated generation of highly-customizable, high-resolution Venn diagrams with up to four sets and Euler diagrams with up to three sets. Results The VennDiagram package offers the user the ability to customize essentially all aspects of the generated diagrams, including font sizes, label styles and locations, and the overall rotation of the diagram. We have implemented scaled Venn and Euler diagrams, which increase graphical accuracy and visual appeal. Diagrams are generated as high-definition TIFF files, simplifying the process of creating publication-quality figures and easing integration with established analysis pipelines. Conclusions The VennDiagram package allows the creation of high quality Venn and Euler diagrams in the R statistical environment.
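
    A comparable workflow can be sketched in Python; note that the records above describe the R package VennDiagram, whereas the snippet below uses the separate matplotlib_venn library and made-up gene sets, so it is only an analogue of the task, not the package's API.

        import matplotlib.pyplot as plt
        from matplotlib_venn import venn3

        # Made-up gene sets from three hypothetical experiments.
        genes_a = {"TP53", "BRCA1", "EGFR", "MYC"}
        genes_b = {"TP53", "EGFR", "KRAS"}
        genes_c = {"TP53", "MYC", "PTEN", "KRAS"}

        venn3([genes_a, genes_b, genes_c],
              set_labels=("Experiment A", "Experiment B", "Experiment C"))
        plt.savefig("venn.tiff", dpi=300)   # high-resolution output (TIFF saving requires Pillow)
        plt.show()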

  2. Final report on the Pathway Analysis Task

    International Nuclear Information System (INIS)

    Whicker, F.W.; Kirchner, T.B.

    1993-04-01

    The Pathway Analysis Task constituted one of several multi-laboratory efforts to estimate radiation doses to people, considering all important pathways of exposure, from the testing of nuclear devices at the Nevada Test Site (NTS). The primary goal of the Pathway Analysis Task was to predict radionuclide ingestion by residents of Utah, Nevada, and portions of seven other adjoining western states following radioactive fallout deposition from individual events at the NTS. This report provides comprehensive documentation of the activities and accomplishments of Colorado State University's Pathway Analysis Task during the entire period of support (1979--91). The history of the project will be summarized, indicating the principal dates and milestones, personnel involved, subcontractors, and budget information. Accomplishments, both primary and auxiliary, will be summarized with general results rather than technical details being emphasized. This will also serve as a guide to the reports and open literature publications produced, where the methodological details and specific results are documented. Selected examples of results on internal dose estimates are provided in this report because the data have not been published elsewhere

  3. Final report on the Pathway Analysis Task

    Energy Technology Data Exchange (ETDEWEB)

    Whicker, F.W.; Kirchner, T.B. [Colorado State Univ., Fort Collins, CO (United States)

    1993-04-01

    The Pathway Analysis Task constituted one of several multi-laboratory efforts to estimate radiation doses to people, considering all important pathways of exposure, from the testing of nuclear devices at the Nevada Test Site (NTS). The primary goal of the Pathway Analysis Task was to predict radionuclide ingestion by residents of Utah, Nevada, and portions of seven other adjoining western states following radioactive fallout deposition from individual events at the NTS. This report provides comprehensive documentation of the activities and accomplishments of Colorado State University's Pathway Analysis Task during the entire period of support (1979--91). The history of the project will be summarized, indicating the principal dates and milestones, personnel involved, subcontractors, and budget information. Accomplishments, both primary and auxiliary, will be summarized with general results rather than technical details being emphasized. This will also serve as a guide to the reports and open literature publications produced, where the methodological details and specific results are documented. Selected examples of results on internal dose estimates are provided in this report because the data have not been published elsewhere.

  4. How little data is enough? Phase-diagram analysis of sparsity-regularized X-ray computed tomography.

    Science.gov (United States)

    Jørgensen, J S; Sidky, E Y

    2015-06-13

    We introduce phase-diagram analysis, a standard tool in compressed sensing (CS), to the X-ray computed tomography (CT) community as a systematic method for determining how few projections suffice for accurate sparsity-regularized reconstruction. In CS, a phase diagram is a convenient way to study and express certain theoretical relations between sparsity and sufficient sampling. We adapt phase-diagram analysis for empirical use in X-ray CT for which the same theoretical results do not hold. We demonstrate in three case studies the potential of phase-diagram analysis for providing quantitative answers to questions of undersampling. First, we demonstrate that there are cases where X-ray CT empirically performs comparably with a near-optimal CS strategy, namely taking measurements with Gaussian sensing matrices. Second, we show that, in contrast to what might have been anticipated, taking randomized CT measurements does not lead to improved performance compared with standard structured sampling patterns. Finally, we show preliminary results of how well phase-diagram analysis can predict the sufficient number of projections for accurately reconstructing a large-scale image of a given sparsity by means of total-variation regularization.
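
    In the CS setting, the empirical phase diagram is built by sweeping an undersampling ratio delta = m/n and a sparsity ratio rho = k/m and counting how often l1 recovery from Gaussian measurements succeeds. A small self-contained Python sketch of that procedure follows; it uses Gaussian matrices only, whereas adapting it to CT projection operators and total-variation regularization, as the paper does, requires replacing both the operator and the solver.

        import numpy as np
        from scipy.optimize import linprog

        def l1_recover(A, y):
            """Basis pursuit min ||x||_1 s.t. Ax = y, as an LP over [x, u] with |x| <= u."""
            m, n = A.shape
            c = np.concatenate([np.zeros(n), np.ones(n)])
            A_eq = np.hstack([A, np.zeros((m, n))])
            I = np.eye(n)
            A_ub = np.vstack([np.hstack([I, -I]), np.hstack([-I, -I])])   # x-u<=0, -x-u<=0
            res = linprog(c, A_ub=A_ub, b_ub=np.zeros(2 * n), A_eq=A_eq, b_eq=y,
                          bounds=[(None, None)] * n + [(0, None)] * n, method="highs")
            return res.x[:n]

        def empirical_phase_diagram(n=32, trials=5, seed=0):
            """Success rate of l1 recovery on a grid of (delta = m/n, rho = k/m), Gaussian A."""
            rng = np.random.default_rng(seed)
            deltas, rhos = np.linspace(0.1, 0.9, 9), np.linspace(0.1, 0.9, 9)
            success = np.zeros((len(rhos), len(deltas)))
            for j, d in enumerate(deltas):
                m = max(1, int(round(d * n)))
                for i, r in enumerate(rhos):
                    k = max(1, int(round(r * m)))
                    for _ in range(trials):
                        x = np.zeros(n)
                        x[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)
                        A = rng.standard_normal((m, n)) / np.sqrt(m)
                        xhat = l1_recover(A, A @ x)
                        success[i, j] += np.linalg.norm(xhat - x) <= 1e-4 * np.linalg.norm(x)
            return deltas, rhos, success / trials

        deltas, rhos, success = empirical_phase_diagram()
        print(np.round(success, 1))        # rows: rho (sparsity), columns: delta (undersampling)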

  5. Electronic diagrams

    CERN Document Server

    Colwell, Morris A

    1976-01-01

    Electronic Diagrams is a ready reference and general guide to systems and circuit planning and in the preparation of diagrams for both newcomers and the more experienced. This book presents guidelines and logical procedures that the reader can follow and then be equipped to tackle large complex diagrams by recognition of characteristic 'building blocks' or 'black boxes'. The goal is to break down many of the barriers that often seem to deter students and laymen in learning the art of electronics, especially when they take up electronics as a spare time occupation. This text is comprised of nin

  6. Task Decomposition in Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald Laurids [Idaho National Laboratory; Joe, Jeffrey Clark [Idaho National Laboratory

    2014-06-01

    In the probabilistic safety assessments (PSAs) used in the nuclear industry, human failure events (HFEs) are determined as a subset of hardware failures, namely those hardware failures that could be triggered by human action or inaction. This approach is top-down, starting with hardware faults and deducing human contributions to those faults. Elsewhere, more traditionally human factors driven approaches would tend to look at opportunities for human errors first in a task analysis and then identify which of those errors is risk significant. The intersection of top-down and bottom-up approaches to defining HFEs has not been carefully studied. Ideally, both approaches should arrive at the same set of HFEs. This question remains central as human reliability analysis (HRA) methods are generalized to new domains like oil and gas. The HFEs used in nuclear PSAs tend to be top-down— defined as a subset of the PSA—whereas the HFEs used in petroleum quantitative risk assessments (QRAs) are more likely to be bottom-up—derived from a task analysis conducted by human factors experts. The marriage of these approaches is necessary in order to ensure that HRA methods developed for top-down HFEs are also sufficient for bottom-up applications.

  7. System risk evolution analysis and risk critical event identification based on event sequence diagram

    International Nuclear Information System (INIS)

    Luo, Pengcheng; Hu, Yang

    2013-01-01

    During system operation, the environmental, operational and usage conditions are time-varying, which causes fluctuations of the system state variables (SSVs). These fluctuations change the accident probabilities and thus result in system risk evolution (SRE). This inherent relation makes it feasible to realize risk control by monitoring the SSVs in real time; hence, a quantitative analysis of SRE is essential. Besides, some events in the process of SRE are critical to system risk, because they act like the “demarcative points” between safety and accident, and this characteristic makes each of them a key point of risk control. Therefore, analysis of SRE and identification of risk critical events (RCEs) are remarkably meaningful for ensuring that the system operates safely. In this context, an event sequence diagram (ESD) based method of SRE analysis and the related Monte Carlo solution are presented; RCE and risk sensitive variable (RSV) are defined, and the corresponding identification methods are also proposed. Finally, the proposed approaches are exemplified with an accident scenario of an aircraft getting into the icing region.
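
    A Monte Carlo treatment of risk evolution along an event sequence can be sketched with a toy model in which one pivotal-event probability depends on a drifting system state variable. Everything in the Python snippet below (the logistic link, the rates, the three-event sequence) is invented for illustration and is not the authors' model.

        import numpy as np

        rng = np.random.default_rng(42)

        def pivotal_failure_prob(ssv):
            """Hypothetical logistic link between a system state variable and a pivotal-event
            failure probability: higher SSV -> higher failure probability."""
            return 1.0 / (1.0 + np.exp(-(ssv - 5.0)))

        def simulate_risk(hours=48, runs=20000):
            """Monte Carlo over a toy sequence: initiator -> protection fails -> mitigation fails."""
            p_init, p_mitigation = 1e-3, 0.1      # assumed initiator and mitigation failure probabilities
            risk = np.zeros(hours)
            for t in range(hours):
                ssv = 4.0 + 0.05 * t + rng.normal(0.0, 0.2, runs)     # drifting, fluctuating SSV
                p_protection = pivotal_failure_prob(ssv)              # SSV-dependent branch probability
                accident = ((rng.random(runs) < p_init)
                            & (rng.random(runs) < p_protection)
                            & (rng.random(runs) < p_mitigation))
                risk[t] = accident.mean()
            return risk

        risk = simulate_risk()
        print("accident probability per hour: t=0 -> %.2e, t=47 -> %.2e" % (risk[0], risk[-1]))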

  8. A cognitive task analysis of the SGTR scenario

    International Nuclear Information System (INIS)

    Hollnagel, E.; Edland, A.; Svenson, O.

    1996-04-01

    This report constitutes a contribution to the NKS/RAK-1:3 project on Integrated Sequence Analysis. Following the meeting at Ringhals, the work was proposed to be performed by the following three steps: Task 1. Cognitive Task Analysis of the E-3 procedure. Task 2. Evaluation and revision of task analysis with Ringhals/KSU experts. Task 3. Integration with simulator data. The Cognitive Task Analysis (CTA) of Task 1 uses the Goals-Means Task Analysis (GMTA) method to identify the sequence of tasks and task steps necessary to achieve the goals of the procedure. It is based on material supplied by Ringhals, which describes the E-3 procedure, including the relevant ES and ECA procedures. The analysis further outlines the cognitive demands profile associated with individual task steps as well as with the task as a whole, as an indication of the nominal task load. The outcome of the cognitive task analysis provides a basis for proposing an adequate event tree. This report describes the results from Task 1. The work has included a two-day meeting between the three contributors, as well as the exchange of intermediate results and comments throughout the period. After the initial draft of the report was prepared, an opportunity was given to observe the SGTR scenario in a full-scope training simulator, and to discuss the details with the instructors. This led to several improvements from the initial draft. (EG)

  9. 5D Task Analysis Visualization Tool Phase II, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The creation of a five-dimensional task analysis visualization (5D-TAV) software tool for Task Analysis and Workload Planning using multi-dimensional visualization...

  10. 5D Task Analysis Visualization Tool, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The creation of a five-dimensional task analysis visualization (5D-TAV) software tool for Task Analysis and Workload Planning using multi-dimensional visualization...

  11. Importance analysis based on logical differential calculus and Binary Decision Diagram

    International Nuclear Information System (INIS)

    Zaitseva, Elena; Levashenko, Vitaly; Kostolny, Jozef

    2015-01-01

    System availability evaluation, sensitivity analysis, Importance Measures, and optimal design are important issues that have become research topics in reliability engineering. There are different mathematical approaches to these topics; the structure function based approach is one of them. The structure function enables one to analyse a system of any complexity, but the computational complexity of structure function based methods is time consuming for large-scale networks. We propose two mathematical approaches to address this problem in system importance analysis. The first is the Direct Partial Boolean Derivative; new equations for calculating the Importance Measures are developed in terms of these derivatives. The second is the Binary Decision Diagram (BDD), which supports efficient manipulation of Boolean algebra. Two algorithms for calculating Direct Partial Boolean Derivatives based on the BDD of the structure function are proposed in this paper. The experimental results show the efficiency of the new algorithms for calculating Direct Partial Boolean Derivatives and Importance Measures. - Highlights: • A new approach for the calculation of Importance Measures is proposed. • Direct Partial Boolean Derivatives are used for the calculation of Importance Measures. • New equations for Importance Measures are obtained. • A new algorithm to calculate Direct Partial Boolean Derivatives by BDD is developed
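
    For small systems the Direct Partial Boolean Derivative can be evaluated by brute force: the Birnbaum importance of component i is the probability that the derivative equals 1, i.e. that flipping x_i flips the structure function. The Python sketch below does exactly that for an invented 2-out-of-3 system; the paper's contribution is computing these derivatives efficiently on a BDD rather than by enumeration.

        from itertools import product

        def birnbaum_importance(phi, p):
            """Birnbaum importance via the direct partial Boolean derivative: the probability,
            over the states of the other components, that flipping x_i flips phi."""
            n = len(p)
            importance = []
            for i in range(n):
                others = [j for j in range(n) if j != i]
                total = 0.0
                for states in product((0, 1), repeat=n - 1):
                    x = [0] * n
                    for j, s in zip(others, states):
                        x[j] = s
                    x_hi, x_lo = list(x), list(x)
                    x_hi[i], x_lo[i] = 1, 0
                    if phi(x_hi) != phi(x_lo):                   # derivative equals 1 here
                        weight = 1.0
                        for j, s in zip(others, states):
                            weight *= p[j] if s else 1 - p[j]
                        total += weight
                importance.append(total)
            return importance

        # Invented example: 2-out-of-3 system; p[i] = probability that component i works.
        phi = lambda x: int(x[0] + x[1] + x[2] >= 2)
        print(birnbaum_importance(phi, p=[0.9, 0.8, 0.7]))       # [0.38, 0.34, 0.26]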

  12. Task Analysis in Instructional Program Development. Theoretical Paper No. 52.

    Science.gov (United States)

    Bernard, Michael E.

    A review of task analysis procedures beginning with the military training and systems development approach and covering the more recent work of Gagne, Klausmeier, Merrill, Resnick, and others is presented along with a plan for effective instruction based on the review of task analysis. Literature dealing with the use of task analysis in programmed…

  13. Top-down versus bottom-up processing of influence diagrams in probabilistic analysis

    International Nuclear Information System (INIS)

    Timmerman, R.D.; Burns, T.J.; Dodds, H.L. Jr.

    1986-01-01

    Recent work by Phillips and Selby has shown that influence diagram methodology can be a useful analytical tool in reactor safety studies. In some instances an influence diagram can be used as a graphical representation of probabilistic dependence within a system or event sequence. Under these circumstances, Bayesian statistics is employed to transform the relationships depicted in the influence diagram into the correct expression for a desired marginal probability (e.g. the top node). Top-down and bottom-up algorithms have emerged as the dominant methods for quantifying influence diagrams. The purpose of this paper is to demonstrate a potential error in employing the bottom-up algorithm when dealing with interdependencies
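
    The marginalization step the abstract refers to is, at its simplest, repeated application of the law of total probability along the arcs. The Python toy below uses a three-node chain with invented conditional probability tables and assumes the Y -> X -> T independence structure; with richer interdependencies the summations must follow the arcs carefully, which is where the bottom-up pitfall discussed by the authors arises.

        import numpy as np

        # Toy probabilistic part of an influence diagram: Y -> X -> T (invented CPTs).
        p_y = np.array([0.9, 0.1])                     # P(Y): plant state nominal / degraded
        p_x_given_y = np.array([[0.95, 0.05],          # P(X | Y): rows Y, columns X
                                [0.60, 0.40]])
        p_t_given_x = np.array([[0.999, 0.001],        # P(T | X): rows X, columns T
                                [0.800, 0.200]])

        p_x = p_y @ p_x_given_y                        # P(X) = sum_Y P(Y) P(X|Y)
        p_t = p_x @ p_t_given_x                        # P(T) = sum_X P(X) P(T|X)
        print("marginal probability of the top node:", p_t[1])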

  14. Top-down versus bottom-up processing of influence diagrams in probabilistic analysis

    International Nuclear Information System (INIS)

    Timmerman, R.D.; Burns, T.J.; Dodds, H.L. Jr.

    1984-01-01

    Recent work by Phillips et al. and Selby et al. has shown that influence diagram methodology can be a useful analytical tool in reactor safety studies. An influence diagram is a graphical representation of probabilistic dependence within a system or event sequence. Bayesian statistics are employed to transform the relationships depicted in the influence diagram into the correct expression for a desired marginal probability (e.g. the top event). As with fault trees, top-down and bottom-up algorithms have emerged as the dominant methods for quantifying influence diagrams. The purpose of this paper is to demonstrate a potential error in employing the bottom-up algorithm when dealing with interdependencies. In addition, the computing efficiency of both methods is discussed.

  15. Top-down versus bottom-up processing of influence diagrams in probabilistic analysis

    International Nuclear Information System (INIS)

    Timmerman, R.D.; Burns, T.J.; Dodds, H.L. Jr.

    1986-01-01

    Recent work by Phillips et al. and Selby et al. has shown that influence diagram methodology can be a useful analytical tool in reactor safety studies. In some instances, an influence diagram can be used as a graphical representation of probabilistic dependence within a system or event sequence. Under these circumstances, Bayesian statistics is employed to transform the relationships depicted in the influence diagram into the correct expression for a desired marginal probability (e.g., the top node). In the references cited above, the authors demonstrated the usefulness of influence diagrams for assessing the reliability of operator performance during pressurized thermal shock transients. In addition, the use of influence diagrams identified the critical variables that had the greatest impact on operator reliability for a particular scenario (e.g., control room design, procedures, etc.). Top-down and bottom-up algorithms have emerged as the dominant methods for quantifying influence diagrams. The purpose of this paper is to demonstrate a potential error in employing the bottom-up algorithm when dealing with interdependencies
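
    The reactor-safety diagrams and probabilities from these studies are not given in the abstract, so the sketch below is only a minimal, hypothetical two-node illustration of the bookkeeping involved: conditional dependencies are marginalized to obtain the probability of a top node, and Bayes' rule inverts them for bottom-up-style reasoning.

```python
# Two-node influence diagram: A influences the top node B.
# Marginalization ("top-down" direction): P(B) = sum_a P(B | A=a) * P(A=a).
P_A = {True: 0.2, False: 0.8}            # hypothetical prior on A
P_B_given_A = {True: 0.7, False: 0.1}    # hypothetical P(B=True | A)

P_B = sum(P_B_given_A[a] * P_A[a] for a in P_A)
print(f"P(B) = {P_B:.3f}")               # 0.7*0.2 + 0.1*0.8 = 0.22

# Bayesian inversion ("bottom-up" direction): P(A | B=True) via Bayes' rule.
P_A_given_B = {a: P_B_given_A[a] * P_A[a] / P_B for a in P_A}
print(f"P(A=True | B=True) = {P_A_given_B[True]:.3f}")   # ~0.636
```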

  16. Application of Failure Mode and Effect Analysis (FMEA), cause and effect analysis, and Pareto diagram in conjunction with HACCP to a corn curl manufacturing plant.

    Science.gov (United States)

    Varzakas, Theodoros H; Arvanitoyannis, Ioannis S

    2007-01-01

    The Failure Mode and Effect Analysis (FMEA) model has been applied for the risk assessment of corn curl manufacturing. A tentative approach of FMEA application to the snacks industry was attempted in an effort to exclude the presence of GMOs in the final product. This is of crucial importance both from the ethics and the legislation (Regulations EC 1829/2003; EC 1830/2003; Directive EC 18/2001) point of view. The Preliminary Hazard Analysis and the Fault Tree Analysis were used to analyze and predict the occurring failure modes in a food chain system (corn curls processing plant), based on the functions, characteristics, and/or interactions of the ingredients or the processes, upon which the system depends. Critical Control points have been identified and implemented in the cause and effect diagram (also known as Ishikawa, tree diagram, and the fishbone diagram). Finally, Pareto diagrams were employed towards the optimization of GMOs detection potential of FMEA.

  17. Point Cluster Analysis Using a 3D Voronoi Diagram with Applications in Point Cloud Segmentation

    Directory of Open Access Journals (Sweden)

    Shen Ying

    2015-08-01

    Full Text Available Three-dimensional (3D) point analysis and visualization is one of the most effective methods of point cluster detection and segmentation in geospatial datasets. However, serious scattering and clotting characteristics interfere with the visual detection of 3D point clusters. To overcome this problem, this study proposes the use of 3D Voronoi diagrams to analyze and visualize 3D points instead of the original data items. The proposed algorithm computes the cluster of 3D points by applying a set of 3D Voronoi cells to describe and quantify the 3D points. The decomposition of the point cloud of 3D models is guided by the 3D Voronoi cell parameters. The parameter values are mapped from the Voronoi cells to the 3D points to show the spatial pattern and relationships; thus, a 3D point cluster pattern can be highlighted and easily recognized. To capture different cluster patterns, continuous progressive clusters and segmentations are tested. The 3D spatial relationship is shown to facilitate cluster detection. Furthermore, the generated segmentations of real 3D data cases are exploited to demonstrate the feasibility of our approach in detecting different spatial clusters for continuous point cloud segmentation.
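
    The specific Voronoi cell parameters used in the paper are not listed in the abstract; the sketch below computes one commonly used descriptor, the volume of each bounded 3D cell, with SciPy on synthetic data. The volumes can then be mapped back to the points as a density-like feature for cluster detection; function and variable names are illustrative only.

```python
import numpy as np
from scipy.spatial import Voronoi, ConvexHull

def voronoi_cell_volumes(points):
    """Volume of each bounded 3D Voronoi cell; unbounded cells get np.inf."""
    vor = Voronoi(points)
    volumes = np.full(len(points), np.inf)
    for i, region_index in enumerate(vor.point_region):
        region = vor.regions[region_index]
        if len(region) == 0 or -1 in region:      # unbounded cell
            continue
        volumes[i] = ConvexHull(vor.vertices[region]).volume
    return volumes

rng = np.random.default_rng(0)
cluster = rng.normal(0.0, 0.05, size=(50, 3))      # dense synthetic cluster
background = rng.uniform(-1.0, 1.0, size=(50, 3))  # sparse background points
vols = voronoi_cell_volumes(np.vstack([cluster, background]))

finite = np.isfinite(vols)
print("median cell volume, cluster vs background:",
      np.median(vols[:50][finite[:50]]),
      np.median(vols[50:][finite[50:]]))
```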

  18. Analysis of diffuse scattering in neutron powder diagrams. Application to glassy carbon

    International Nuclear Information System (INIS)

    Boysen, H.

    1985-01-01

    From the quantitative analysis of the diffusely scattered intensity in powder diagrams, valuable information about the disorder in crystals may be obtained. According to the dimensionality of this disorder (0D, 1D, 2D or 3D, corresponding to diffuse peaks, streaks, planes or volume in reciprocal space), a characteristic modulation of the background is observed, which is described by specific functions. These are derived by averaging the appropriate cross sections over all crystallite orientations in the powder and folding with the resolution function of the instrument. If proper account is taken of all proportionality factors, different components of the background can be put on one relative scale. The results are applied to two samples of glassy carbon differing in their degree of disorder. The neutron powder patterns contain contributions from 0D (00l peaks due to the stacking of graphitic layers), 1D (hkζ streaks caused by the random orientation of these layers) and 3D (incoherent scattering, averaged thermal diffuse scattering, multiple scattering). From the fit to the observed data, various parameters of the disorder such as domain sizes, strains, interlayer distances, amount of incorporated hydrogen, pore sizes etc. are determined. It is shown that the omission of resolution corrections leads to false parameters. (orig.)

  19. Task analysis methods applicable to control room design review (CDR)

    International Nuclear Information System (INIS)

    Moray, N.P.; Senders, J.W.; Rhodes, W.

    1985-06-01

    This report presents the results of a research study conducted in support of the human factors engineering program of the Atomic Energy Control Board in Canada. It contains five products which may be used by the Atomic Energy Control Board in relation to task analysis of jobs in CANDU nuclear power plants: 1. a detailed method for preparing for a task analysis; 2. a Task Data Form for recording task analysis data; 3. a detailed method for carrying out task analyses; 4. a guide to assessing alternative methods for performing task analyses, if such are proposed by utilities or consultants; and 5. an annotated bibliography on task analysis. In addition, a short explanation of the origins, nature and uses of task analysis is provided, with some examples of its cost effectiveness. 35 refs

  20. Cue Representation and Situational Awareness in Task Analysis

    Science.gov (United States)

    Carl, Diana R.

    2009-01-01

    Task analysis in human performance technology is used to determine how human performance can be well supported with training, job aids, environmental changes, and other interventions. Early work by Miller (1953) and Gilbert (1969, 1974) addressed cue processing in task execution and recommended cue descriptions in task analysis. Modern task…

  1. Diagrammatic analysis of correlations in polymer fluids: Cluster diagrams via Edwards' field theory

    International Nuclear Information System (INIS)

    Morse, David C.

    2006-01-01

    Edwards' functional integral approach to the statistical mechanics of polymer liquids is amenable to a diagrammatic analysis in which free energies and correlation functions are expanded as infinite sums of Feynman diagrams. This analysis is shown to lead naturally to a perturbative cluster expansion that is closely related to the Mayer cluster expansion developed for molecular liquids by Chandler and co-workers. Expansion of the functional integral representation of the grand-canonical partition function yields a perturbation theory in which all quantities of interest are expressed as functionals of a monomer-monomer pair potential, as functionals of intramolecular correlation functions of non-interacting molecules, and as functions of molecular activities. In different variants of the theory, the pair potential may be either a bare or a screened potential. A series of topological reductions yields a renormalized diagrammatic expansion in which collective correlation functions are instead expressed diagrammatically as functionals of the true single-molecule correlation functions in the interacting fluid, and as functions of molecular number density. Similar renormalized expansions are also obtained for a collective Ornstein-Zernike direct correlation function, and for intramolecular correlation functions. A concise discussion is given of the corresponding Mayer cluster expansion, and of the relationship between the Mayer and perturbative cluster expansions for liquids of flexible molecules. The application of the perturbative cluster expansion to coarse-grained models of dense multi-component polymer liquids is discussed, and a justification is given for the use of a loop expansion. As an example, the formalism is used to derive a new expression for the wave-number dependent direct correlation function and recover known expressions for the intramolecular two-point correlation function to first-order in a renormalized loop expansion for coarse-grained models of

  2. Task Analysis data Processing and Enhanced Representations (TAPER), Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Task Analysis (TA) is a fundamental part of NASA system design and validation. TAs are used to produce Master Task Lists that support engineering teams and...

  3. Delimiting diagrams

    NARCIS (Netherlands)

    Oostrom, V. van

    2004-01-01

    We introduce the unifying notion of delimiting diagram. Hitherto unrelated results such as: Minimality of the internal needed strategy for orthogonal first-order term rewriting systems, maximality of the limit strategy for orthogonal higher-order pattern rewrite systems (with maximality of the

  4. Nuclear power station: validation of a method for designing operation mimic diagrams

    International Nuclear Information System (INIS)

    Colard, M.I.; De Vlaminck, M.; Javaux, D.

    1995-01-01

    This is the first time in a Belgian nuclear power station that the design of mimic diagrams has involved biotechnologists. The methodology is based upon task analysis, the formalism of which has been evaluated. It is now integrated in the Dimos assisted supervision system to produce the final design of mimic diagram images. (authors). 3 figs., 3 refs

  5. Analysis of leak and break behavior in a failure assessment diagram for carbon steel pipes

    International Nuclear Information System (INIS)

    Kanno, Satoshi; Hasegawa, Kunio; Shimizu, Tasuku; Saitoh, Takashi; Gotoh, Nobuho

    1992-01-01

    The leak and break behavior of a cracked coolant pipe subjected to an internal pressure and a bending moment was analyzed with a failure assessment diagram using the R6 approach. This paper examines the conditions for detectable coolant leakage without breakage. A leakage assessment curve, a locus of assessment points for detectable coolant leakage, was defined in the failure assessment diagram. The region between the leak assessment and failure assessment curves satisfies the condition of detectable leakage without breakage. In this region, a crack can be safely inspected by a coolant leak detector. (orig.)
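
    The 1992 paper's actual assessment curve and loading data are not reproduced in the abstract. The sketch below therefore uses one widely quoted, general-purpose failure assessment curve (an R6 Option-1 / BS 7910 Level-2A type expression), which may differ from the R6 revision the authors applied, and purely hypothetical stress and toughness values, just to show how an assessment point (Lr, Kr) is checked against the curve.

```python
import math

def fad_curve(Lr, Lr_max=1.25):
    """A commonly quoted general failure assessment curve
    (assumed here for illustration; not necessarily the one used in the paper)."""
    if Lr > Lr_max:
        return 0.0
    return (1.0 - 0.14 * Lr**2) * (0.3 + 0.7 * math.exp(-0.65 * Lr**6))

def assessment_point(K_applied, K_mat, sigma_ref, sigma_yield):
    """Kr = applied stress intensity / fracture toughness,
    Lr = reference stress / yield stress."""
    return K_applied / K_mat, sigma_ref / sigma_yield

# Hypothetical values for a through-wall crack in a carbon steel pipe.
Kr, Lr = assessment_point(K_applied=40.0, K_mat=150.0,
                          sigma_ref=120.0, sigma_yield=250.0)
verdict = "inside the FAD (acceptable)" if Kr < fad_curve(Lr) else "outside the FAD"
print(f"Lr = {Lr:.2f}, Kr = {Kr:.2f}, curve = {fad_curve(Lr):.2f} -> {verdict}")
```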

  6. Workplace for analysis of task performance

    NARCIS (Netherlands)

    Bos, J; Mulder, LJM; van Ouwerkerk, RJ; Maarse, FJ; Akkerman, AE; Brand, AN; Mulder, LJM

    2003-01-01

    In current research on mental workload and task performance a large gap exists between laboratory based studies and research projects in real life working practice. Tasks conducted within a laboratory environment often lack a strong resemblance with real life working situations. This paper presents

  7. A Framework for the Cognitive Task Analysis in Systems Design

    DEFF Research Database (Denmark)

    Rasmussen, Jens

    The present rapid development of advanced information technology and its use for support of operators of complex technical systems are changing the content of task analysis towards the analysis of mental activities in decision making. Automation removes the humans from routine tasks, and operators...... are left with disturbance control and critical diagnostic tasks, for which computers are suitable for support, if it is possible to match the computer strategies and interface formats dynamically to the requirements of the current task by means of an analysis of the cognitive task....

  8. PBF task and training requirements analysis

    International Nuclear Information System (INIS)

    Blackman, H.S.; Gertman, D.I.; Petersen, R.J.

    1983-05-01

    Task analyses were used to assist in identifying improvements needed in the training curriculum for selected positions at the Power Burst Facility (PBF). Four positions were examined: Experiment Power Reactor Operator, Experiment (EPRO-Ex); Experiment Power Reactor Operator, Plant (EPRO-P); Experiment Power Reactor Operator, Console (EPRO-Co); and Shift Supervisor (SS). A complete position task listing and a core of tasks defined in terms of (a) level of difficulty to perform, (b) severity of consequence if performed improperly, and (c) associated error probability were identified for each position. The systems, academic, and administrative knowledge needed by job incumbents to perform each task was noted. Strategies for teaching the knowledge associated with these tasks are presented

  9. Analysis of Optimal Process Flow Diagrams of Light Naphtha Isomerization Process by Mathematic Modelling Method

    OpenAIRE

    Chuzlov, Vyacheslav Alekseevich; Molotov, Konstantin

    2016-01-01

    An approach to the simulation of catalytic reactors for hydrocarbon refining processes is presented. The kinetic and thermodynamic research of the light naphtha isomerization process was conducted. The kinetic parameters of the chemical conversion of hydrocarbon feedstock on different types of platinum-containing catalysts were established. The efficiency of including different types of isomerization technologies in the oil refinery flow diagram was estimated.

  10. Analysis of Optimal Process Flow Diagrams of Light Naphtha Isomerization Process by Mathematic Modelling Method

    Directory of Open Access Journals (Sweden)

    Chuzlov Vjacheslav

    2016-01-01

    Full Text Available An approach to the simulation of catalytic reactors for hydrocarbon refining processes is presented. The kinetic and thermodynamic research of the light naphtha isomerization process was conducted. The kinetic parameters of the chemical conversion of hydrocarbon feedstock on different types of platinum-containing catalysts were established. The efficiency of including different types of isomerization technologies in the oil refinery flow diagram was estimated.

  11. The Use of the Skew T, Log P Diagram in Analysis and Forecasting. Revision

    Science.gov (United States)

    1990-03-01

    A detailed description of the Skew T, Log P diagram (chart size 28 x 30 inches) and its use in analysis and forecasting is provided; several aids have been added in this revision to further enhance the value of the diagram.

  12. Job and task analysis: a view from the inside

    International Nuclear Information System (INIS)

    Allison, C.E.

    1981-01-01

    This paper is not intended to describe how to perform a Job and Task Analysis. There are a wide variety of approaches to conducting a Job and Task Analysis, many of which have been developed by highly seasoned and skilled professionals in this field. This paper is intended to discuss the internal support, in terms of money, time, and people, required for the Job and Task Analysis Project

  13. Guidelines for job and task analysis for DOE nuclear facilities

    International Nuclear Information System (INIS)

    1983-06-01

    The guidelines are intended to be responsive to the need for information on methodology, procedures, content, and use of job and task analysis since the establishment of a requirement for position task analysis for Category A reactors in DOE 5480.1A, Chapter VI. The guide describes the general approach and methods currently being utilized in the nuclear industry and by several DOE contractors for the conduct of job and task analysis and applications to the development of training programs or evaluation of existing programs. In addition other applications for job and task analysis are described including: operating procedures development, personnel management, system design, communications, and human performance predictions

  14. Diagram analysis of the Hubbard model: Stationarity property of the thermodynamic potential

    International Nuclear Information System (INIS)

    Moskalenko, V. A.; Dohotaru, L. A.; Cebotari, I. D.

    2010-01-01

    The diagram approach proposed many years ago for the strongly correlated Hubbard model is developed with the aim to analyze the thermodynamic potential properties. A new exact relation between renormalized quantities such as the thermodynamic potential, the one-particle propagator, and the correlation function is established. This relation contains an additional integration of the one-particle propagator with respect to an auxiliary constant. The vacuum skeleton diagrams constructed from the irreducible Green's functions and tunneling propagator lines are determined and a special functional is introduced. The properties of this functional are investigated and its relation to the thermodynamic potential is established. The stationarity property of this functional with respect to first-order variations of the correlation function is demonstrated; as a consequence, the stationarity property of the thermodynamic potential is proved.

  15. Thermodynamic analysis of 6xxx series Al alloys: Phase fraction diagrams

    Directory of Open Access Journals (Sweden)

    Cui S.

    2018-01-01

    Full Text Available Microstructural evolution of 6xxx Al alloys during various metallurgical processes was analyzed using an accurate thermodynamic database. Phase fractions of all the possible precipitate phases which can form in the as-cast and equilibrium states of the Al-Mg-Si-Cu-Fe-Mn-Cr alloys were calculated over the technically useful composition range. The influence of minor elements such as Cu, Fe, Mn, and Cr on the amount of each type of precipitate in the as-cast and equilibrium conditions was analyzed. Phase fraction diagrams at 500 °C were mapped in the composition range of 0-1.1 wt.% Mg and 0-0.7 wt.% Si to investigate the as-homogenized microstructure. In addition, the phase fraction diagram of Mg2Si at 177 °C was mapped to understand the microstructure after final annealing of 6xxx Al alloys. Based on the calculated diagrams, the design strategy for 6xxx Al alloys to produce the highest strength due to Mg2Si is discussed.

  16. Task Analysis Assessment on Intrastate Bus Traffic Controllers

    Science.gov (United States)

    Yen Bin, Teo; Azlis-Sani, Jalil; Nur Annuar Mohd Yunos, Muhammad; Ismail, S. M. Sabri S. M.; Tajedi, Noor Aqilah Ahmad

    2016-11-01

    Public transportation provides social mobility and caters for the daily need of passengers to travel from one place to another. This is true for a country like Malaysia, where international trade has been growing significantly over the past few decades. Task analysis assessment was conducted from a cognitive ergonomics perspective on problems related to human factors. Research on the task analysis of bus traffic controllers allows a better understanding of the nature of the work and the overall monitoring activities of the bus services. This paper studies task analysis assessment of intrastate bus traffic controllers; the task analysis was developed via Hierarchical Task Analysis (HTA). A total of five subsidiary tasks were identified at level one, of which only two could be further broken down at level two. Development of the HTA allowed a better understanding of the work and can further ease the evaluation of the tasks conducted by the bus traffic controllers. Thus, human error could be reduced for the safety of all passengers, increasing the overall efficiency of the system. Besides, it could assist in improving the operation of the bus traffic controllers by modelling or synthesizing the existing tasks where necessary.
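
    The study's actual task decomposition is not reproduced in the abstract; the snippet below only sketches how an HTA can be held as a simple tree and printed, using hypothetical controller tasks that mirror the structure described (five level-one tasks, two of which decompose further).

```python
# Hierarchical task analysis held as nested dictionaries; task names are
# hypothetical, not those identified in the study.
hta = {
    "0 Monitor intrastate bus operations": {
        "1 Track bus locations": {},
        "2 Handle schedule deviations": {
            "2.1 Identify delayed services": {},
            "2.2 Notify drivers and update the schedule": {},
        },
        "3 Log incidents": {},
        "4 Communicate with depots": {
            "4.1 Relay passenger-load information": {},
            "4.2 Request replacement buses": {},
        },
        "5 Produce the shift report": {},
    }
}

def print_hta(node, indent=0):
    """Depth-first print of the task hierarchy."""
    for task, subtasks in node.items():
        print("  " * indent + task)
        print_hta(subtasks, indent + 1)

print_hta(hta)
```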

  17. Initial development of an automated task analysis profiling system

    International Nuclear Information System (INIS)

    Jorgensen, C.C.

    1984-01-01

    A program for automated task analysis is described. Called TAPS (task analysis profiling system), the program accepts normal English prose and outputs skills, knowledges, attitudes, and abilities (SKAAs) along with specific guidance and recommended ability measurement tests for nuclear power plant operators. A new method for defining SKAAs is presented along with a sample program output

  18. 49 CFR 236.1043 - Task analysis and basic requirements.

    Science.gov (United States)

    2010-10-01

    ... Train Control Systems § 236.1043 Task analysis and basic requirements. (a) Training structure and... classroom, simulator, computer-based, hands-on, or other formally structured training designed to impart the... 49 Transportation 4 2010-10-01 2010-10-01 false Task analysis and basic requirements. 236.1043...

  19. 49 CFR 236.923 - Task analysis and basic requirements.

    Science.gov (United States)

    2010-10-01

    ... for Processor-Based Signal and Train Control Systems § 236.923 Task analysis and basic requirements... structured training designed to impart the knowledge, skills, and abilities identified as necessary to... 49 Transportation 4 2010-10-01 2010-10-01 false Task analysis and basic requirements. 236.923...

  20. The development of a task analysis method applicable to the tasks of nuclear power plant

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, Wan Chul; Park, Ji Soo; Baek, Dong Hyeon; Ham, Dong Han; Kim, Huhn [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1996-07-01

    While task analysis is one of the essential processes for human factors studies, traditional methods reveal weaknesses in dealing with the cognitive aspects, which become more critical in modern complex systems. This report proposes a cognitive task analysis (CTA) method for identifying the cognitive requirements of operators' tasks in nuclear power plants. The proposed CTA method is characterized by an information-oriented concept and a procedure-based approach. The task prescription identifies the information requirements and traces the information flow to reveal the cognitive organization of the task procedure, with emphasis on the relations among the information requirements. The cognitive requirements are then analyzed in terms of the cognitive span of task information, the cognitive envelope and working memory relief points of the procedures, and working memory load. The proposed method is relatively simple and, possibly being incorporated in a full task analysis scheme, directly applicable to the design/evaluation of human-machine interfaces and operating procedures. A prototype of a computerized support system is developed for supporting the practicality of the proposed method. (Author) 104 refs., 8 tabs., 7 figs.

  2. Analysis and application of impedance polar diagram and zstrike rose diagram of magnetotellurics data in southern part of the Wayang Windu geothermal field

    Science.gov (United States)

    Rohayat, O. R.; Wicaksono, R. A.; Daud, Y.

    2018-03-01

    In this study, we determined the main direction of the geoelectric strike in the southern part of the Wayang Windu geothermal field using magnetotelluric (MT) data. The strike direction was obtained by analyzing the data using impedance polar diagrams and the Zstrike rose diagram. We investigated 51 MT data sets from different sites in the southern part of the Wayang Windu geothermal field. Determination of the geoelectric strike direction is important since the strike is the rotation reference in MT data processing. Our findings point out that the geoelectric strike direction in this study area is in accordance with the direction of the geological structures and correlates well with structures delineated from the 3D MT inversion model.
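
    Neither the impedance tensors nor the exact rotation convention used by the authors is given in the abstract. The sketch below estimates a Swift-style geoelectric strike for a single (hypothetical) 2x2 impedance tensor by rotating it and maximizing the off-diagonal energy; the result carries the usual 90-degree ambiguity, and the convention may differ from the processing package used in the study.

```python
import numpy as np

def rotate_impedance(Z, theta_deg):
    """Rotate a 2x2 MT impedance tensor by theta degrees: Z' = R Z R^T."""
    t = np.deg2rad(theta_deg)
    R = np.array([[np.cos(t), np.sin(t)],
                  [-np.sin(t), np.cos(t)]])
    return R @ Z @ R.T

def swift_style_strike(Z, step=0.5):
    """Scan rotation angles and return the one maximizing |Zxy'|^2 + |Zyx'|^2
    (a Swift-type criterion), modulo the inherent 90-degree ambiguity."""
    angles = np.arange(0.0, 90.0, step)
    energy = [abs(rotate_impedance(Z, a)[0, 1]) ** 2 +
              abs(rotate_impedance(Z, a)[1, 0]) ** 2 for a in angles]
    return angles[int(np.argmax(energy))]

# Hypothetical complex impedance tensor at one site and frequency.
Z = np.array([[0.1 + 0.05j, 1.2 + 0.9j],
              [-1.0 - 0.8j, -0.2 - 0.1j]])
print("estimated geoelectric strike:", swift_style_strike(Z), "degrees")
```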

  3. The use of Ternary Diagrams in the Analysis and the Mathematical Modeling of Bank Assets Structure

    Directory of Open Access Journals (Sweden)

    Ramona Mariana CALINICA

    2012-04-01

    Full Text Available The objectives pursued in this paper are: to obtain, by means of ternary diagrams, imagistic representations of the structure of assets in credit banking institutions operating in Romania in cases of stability, turbulence and intense manifestation of the crisis; to identify functional discontinuities; and to build a comparative database. The ultimate goal of this paper is to report the results obtained from the comparative database in order to find out which signals precede a turbulent situation in the banking sector and how far the banking system is from the normal situation.
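
    The underlying asset data are not available from the abstract, so the snippet below only shows the barycentric mapping that a ternary diagram relies on, applied to one hypothetical asset composition (the three categories are assumed for illustration).

```python
import math

def ternary_to_xy(a, b, c):
    """Map a three-part composition (a, b, c) to 2D ternary-plot coordinates.
    Corners: A = (0, 0), B = (1, 0), C = (0.5, sqrt(3)/2)."""
    total = a + b + c
    a, b, c = a / total, b / total, c / total
    return b + 0.5 * c, (math.sqrt(3) / 2.0) * c

# Hypothetical bank asset structure: (loans, securities, other assets).
print(ternary_to_xy(0.60, 0.25, 0.15))
```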

  4. Physical Education-in-CLIL tasks. Determining tasks characteristics through the analysis of the diaries

    Directory of Open Access Journals (Sweden)

    Josep Coral Mateu

    2013-07-01

    Full Text Available This article focuses on the characteristics of Physical Education-in-CLIL (PE-in-CLIL) tasks. CLIL (Content and Language Integrated Learning) is a teaching approach which uses foreign language as a tool to enhance the subject learning process. We connect PE-in-CLIL with key competences and we introduce the CLIL 4Cs framework. We establish the aims of the study, that is, to describe the features of tasks which are most suitable to PE-in-CLIL and identify integrated tasks which appeal most to learners. We use Action-Research and we collect data through diaries. The participants of the study were twenty-six learners of 5th grade of primary school. We described the strategies of rigour and quality applied and we analysed data using a qualitative data analysis software programme (NVivo). In the results, we identify both the tasks that appeal to students and the tasks that are developed successfully. In the conclusions, we provide teaching guidelines to plan successful PE-in-CLIL tasks that appeal to students. At this point, we emphasise tasks that combined both cooperative learning and oracy with motor activity and games. We also declare the necessity of incorporating scaffolding strategies in order to accommodate students’ linguistic needs and facilitate tasks development. Future CLIL research possibilities emerge in the Physical Education field of work.

  5. Uhlenbeck-Ford model: Phase diagram and corresponding-states analysis

    Science.gov (United States)

    Paula Leite, Rodolfo; Santos-Flórez, Pedro Antonio; de Koning, Maurice

    2017-09-01

    Using molecular dynamics simulations and nonequilibrium thermodynamic-integration techniques we compute the Helmholtz free energies of the body-centered-cubic (bcc), face-centered-cubic (fcc), hexagonal close-packed, and fluid phases of the Uhlenbeck-Ford model (UFM) and use the results to construct its phase diagram. The pair interaction associated with the UFM is characterized by an ultrasoft, purely repulsive pair potential that diverges logarithmically at the origin. We find that the bcc and fcc are the only thermodynamically stable crystalline phases in the phase diagram. Furthermore, we report the existence of two reentrant transition sequences as a function of the number density, one featuring a fluid-bcc-fluid succession and another displaying a bcc-fcc-bcc sequence near the triple point. We find strong resemblances to the phase behavior of other soft, purely repulsive systems such as the Gaussian-core model (GCM), inverse-power-law, and Yukawa potentials. In particular, we find that the fcc-bcc-fluid triple point and the phase boundaries in its vicinity are in good agreement with the prediction supplied by a recently proposed corresponding-states principle [J. Chem. Phys. 134, 241101 (2011), 10.1063/1.3605659; Europhys. Lett. 100, 66004 (2012), 10.1209/0295-5075/100/66004]. The particularly strong resemblance between the behavior of the UFM and GCM models are also discussed.
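
    As a small companion to the record above, the snippet below evaluates the Uhlenbeck-Ford pair potential in the form commonly written in the literature, u(r) = -ε p ln(1 - exp(-(r/σ)²)); readers should verify the exact definition and the scaling-factor values against the cited papers.

```python
import numpy as np

def uf_potential(r, p=1.0, sigma=1.0, eps=1.0):
    """Uhlenbeck-Ford pair potential, as commonly written:
    u(r) = -eps * p * ln(1 - exp(-(r/sigma)**2)).
    Purely repulsive, diverging logarithmically as r -> 0."""
    x2 = (np.asarray(r, dtype=float) / sigma) ** 2
    return -eps * p * np.log1p(-np.exp(-x2))

r = np.linspace(0.1, 3.0, 6)
print(np.round(uf_potential(r, p=25.0), 3))   # p = 25 chosen only as an example
```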

  6. Diagrams of natural deductions

    Energy Technology Data Exchange (ETDEWEB)

    Popov, S V

    1982-01-01

    The concept of natural deductions was investigated by the author in his analysis of the complexity of deductions in propositional calculi (1975). Here some natural deduction systems are considered, and an analytical procedure is proposed which results in a deduction diagram for each system. Each diagram takes the form of an oriented, charged graph, features of which can be used to establish the equivalence of classes of deductions. For each of the natural deduction systems considered, a system of equivalent transformation schemes is derived, which is complete with respect to the given definition of equivalence. 2 references.

  7. Task analysis: a detailed example of stepping up from JSA

    International Nuclear Information System (INIS)

    Banks, W.W.; Paramore, B.A.; Buys, J.R.

    1984-10-01

    This paper discusses a pilot task analysis of operations in a proposed facility for the cutting and packaging of radioactively contaminated gloveboxes, for long-term storage or burial. The objective was to demonstrate how task analysis may be used as a tool for planning and risk management. Two specific products were generated - preliminary operating procedures and training requirements. The task data base, procedures list and training requirements developed were intended as first order categorizations. The analysis was limited to tasks that will be performed within the boundaries of the operational facility and the associated load-out area. The analysis documents tasks to be performed by D and D (Decontamination and Decommissioning) Workers. However, the analysis included all tasks identified as an integral part of glovebox processing within the facility. Thus tasks involving Radiation Protection Technicians (RPTs) are included. Based on hazard assessments, it is planned that at least two RPTs will be assigned full-time to the facility, so they may be considered part of its crew. Similarly, supervisory/administrative tasks are included where they were determined to be directly part of process sequences, such as obtaining appropriate certification. 11 tables

  8. Extended sequence diagram for human system interaction

    International Nuclear Information System (INIS)

    Hwang, Jong Rok; Choi, Sun Woo; Ko, Hee Ran; Kim, Jong Hyun

    2012-01-01

    Unified Modeling Language (UML) is a modeling language in the field of object-oriented software engineering. The sequence diagram is a kind of interaction diagram that shows how processes operate with one another and in what order. It is a construct of a message sequence chart. It depicts the objects and classes involved in the scenario and the sequence of messages exchanged between the objects needed to carry out the functionality of the scenario. This paper proposes the Extended Sequence Diagram (ESD), which is capable of depicting human-system interaction for nuclear power plants, as well as the cognitive processes of operators. The conventional sequence diagram is limited to identifying the activities involved in human-system interactions; the ESD is extended to describe operators' cognitive processes in more detail. The ESD is expected to be used as a task analysis method for describing human-system interaction. The ESD can also present key steps causing abnormal operations or failures and diverse human errors based on cognitive conditions

  9. A Task-Content Analysis of an Introductory Entomology Curriculum.

    Science.gov (United States)

    Brandenburg, R.

    Described is an analysis of the content, tasks, and strategies needed by students to enable them to identify insects to order by sight and to family by use of a standard dichotomous taxonomic key. Tasks and strategies are broken down and arranged progressively in the approximate order in which students should progress. Included are listings of…

  10. MIDAS, prototype Multivariate Interactive Digital Analysis System, phase 1. Volume 3: Wiring diagrams

    Science.gov (United States)

    Kriegler, F. J.; Christenson, D.; Gordon, M.; Kistler, R.; Lampert, S.; Marshall, R.; Mclaughlin, R.

    1974-01-01

    The Midas System is a third-generation, fast, multispectral recognition system able to keep pace with the large quantity and high rates of data acquisition from present and projected sensors. A principal objective of the MIDAS Program is to provide a system well interfaced with the human operator and thus to obtain large overall reductions in turn-around time and significant gains in throughput. The hardware and software generated in Phase I of the overall program are described. The system contains a mini-computer to control the various high-speed processing elements in the data path and a classifier which implements an all-digital prototype multivariate-Gaussian maximum likelihood decision algorithm operating at 2 x 100,000 pixels/sec. Sufficient hardware was developed to perform signature extraction from computer-compatible tapes, compute classifier coefficients, control the classifier operation, and diagnose operation. The MIDAS construction and wiring diagrams are given.

  11. Experiment with expert system guidance of an engineering analysis task

    International Nuclear Information System (INIS)

    Ransom, V.H.; Fink, R.K.; Callow, R.A.

    1986-01-01

    An experiment is being conducted in which expert systems are used to guide the performance of an engineering analysis task. The task chosen for experimentation is the application of a large thermal hydraulic transient simulation computer code. The expectation from this work is that the expert system will result in an improved analytical result with a reduction in the amount of human labor and expertise required. The code-associated functions of model formulation, data input, code execution, and analysis of the computed output have all been identified as candidate tasks that could benefit from the use of expert systems. Expert system modules have been built for the model building and data input tasks. Initial results include the observation that human expectations of an intelligent environment rapidly escalate, and that structured or stylized tasks tolerated in the unaided system become frustrating within the intelligent environment

  12. Asymmetric simple exclusion process with position-dependent hopping rates: Phase diagram from boundary-layer analysis.

    Science.gov (United States)

    Mukherji, Sutapa

    2018-03-01

    In this paper, we study a one-dimensional totally asymmetric simple exclusion process with position-dependent hopping rates. Under open boundary conditions, this system exhibits boundary-induced phase transitions in the steady state. Similarly to totally asymmetric simple exclusion processes with uniform hopping, the phase diagram consists of low-density, high-density, and maximal-current phases. In various phases, the shape of the average particle density profile across the lattice including its boundary-layer parts changes significantly. Using the tools of boundary-layer analysis, we obtain explicit solutions for the density profile in different phases. A detailed analysis of these solutions under different boundary conditions helps us obtain the equations for various phase boundaries. Next, we show how the shape of the entire density profile including the location of the boundary layers can be predicted from the fixed points of the differential equation describing the boundary layers. We discuss this in detail through several examples of density profiles in various phases. The maximal-current phase appears to be an especially interesting phase where the boundary layer flows to a bifurcation point on the fixed-point diagram.
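
    The paper itself proceeds analytically via boundary-layer theory; the sketch below is only a brute-force kinetic Monte Carlo version of the same model — an open-boundary TASEP with site-dependent hopping rates — that can be used to eyeball the steady-state density profile. Lattice size, rates, and the linear rate ramp are hypothetical choices.

```python
import numpy as np

def simulate_tasep(L=100, alpha=0.3, beta=0.3, steps=500_000, seed=1):
    """Open-boundary TASEP with site-dependent hopping rates p[i]
    (rate to hop from site i to i+1), random-sequential update.
    Returns the time-averaged density profile after a burn-in period."""
    rng = np.random.default_rng(seed)
    p = 0.5 + 0.5 * np.linspace(0.0, 1.0, L)   # hypothetical position-dependent rates
    occ = np.zeros(L, dtype=int)
    density = np.zeros(L)
    burn_in, samples = steps // 2, 0
    for t in range(steps):
        i = int(rng.integers(-1, L))           # -1: entry attempt, L-1: exit attempt
        if i == -1:
            if occ[0] == 0 and rng.random() < alpha:
                occ[0] = 1
        elif i == L - 1:
            if occ[L - 1] == 1 and rng.random() < beta:
                occ[L - 1] = 0
        elif occ[i] == 1 and occ[i + 1] == 0 and rng.random() < p[i]:
            occ[i], occ[i + 1] = 0, 1
        if t >= burn_in:
            density += occ
            samples += 1
    return density / samples

profile = simulate_tasep()
print("density near the two boundaries:", profile[:3], profile[-3:])
```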

  13. Cognitive Task Analysis of the Battalion Level Visualization Process

    National Research Council Canada - National Science Library

    Leedom, Dennis K; McElroy, William; Shadrick, Scott B; Lickteig, Carl; Pokorny, Robet A; Haynes, Jacqueline A; Bell, James

    2007-01-01

    ... position or as a battalion Operations Officer or Executive Officer. Based on findings from the cognitive task analysis, 11 skill areas were identified as potential focal points for future training development...

  14. From State Diagram to Class Diagram

    DEFF Research Database (Denmark)

    Borch, Ole; Madsen, Per Printz

    2009-01-01

    UML class diagrams and Java source code are interrelated, and Java code is a kind of interchange format. When working with UML state diagrams in CASE tools, a corresponding XML file is maintained. Designing state diagrams is mostly performed manually using design patterns and coding templates - a time-consuming process. This article demonstrates how to compile such a diagram into Java code and later, by reverse engineering, produce a class diagram. The process from state diagram via intermediate SAX-parsed XML file to Apache Velocity-generated Java code is described. The result is a fast reproducible...

  15. Cognitive Task Analysis Based Training for Cyber Situation Awareness

    OpenAIRE

    Huang , Zequn; Shen , Chien-Chung; Doshi , Sheetal; Thomas , Nimmi; Duong , Ha

    2015-01-01

    Part 1: Innovative Methods; International audience; Cyber attacks have been increasing significantly in both number and complexity, prompting the need for better training of cyber defense analysts. To conduct effective training for cyber situation awareness, it becomes essential to design realistic training scenarios. In this paper, we present a Cognitive Task Analysis based approach to address this training need. The technique of Cognitive Task Analysis is to capture and represent knowledge ...

  16. Analysis of the resolution processes of three modeling tasks

    Directory of Open Access Journals (Sweden)

    Cèsar Gallart Palau

    2017-08-01

    Full Text Available In this paper we present a comparative analysis of the resolution processes of three modeling tasks performed by secondary education students (13-14 years), designed from three different points of view: the Modelling-eliciting Activities, the LEMA project, and the Realistic Mathematical Problems. The purpose of this analysis is to obtain a methodological characterization of them in order to provide secondary education teachers with a proper selection and sequencing of tasks for their implementation in the classroom.

  17. Unavailability Analysis of the Reactor Core Protection System using Reliability Block Diagram

    International Nuclear Information System (INIS)

    Shin, Hyun Kook; Kim, Sung Ho; Choi, Woong Suk; Kim, Jae Hack

    2006-01-01

    The reactor core of nuclear power plants needs to be monitored for the early detection of abnormal core conditions to protect plants from a severe accident. The core protection calculator system (CPCS) has been provided to calculate the departure from nucleate boiling ratio (DNBR) and the local power density (LPD) based on measured parameters of the reactor and coolant system. The original CPCS for OPR 1000 was designed and implemented based on the Concurrent 3205 computer system, whose components are obsolete. The CPCS based on the Westinghouse Common-Q system has recently been implemented for the Shin-Kori Nuclear Power Plant, Units 1 and 2 (SKN 1 and 2). An R and D project has been launched to develop a new core protection system, called RCOPS (Reactor Core Protection System), in partnership with KOPEC and Doosan Heavy Industries and Construction Co. RCOPS is implemented on the HFC-6000 safety-class programmable logic controller (PLC). In this paper, the reliability of RCOPS is analyzed using the reliability block diagram (RBD) method. The calculated results are compared with those of the CPCS for SKN 1 and 2
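
    The actual RCOPS architecture and failure data are not given in the abstract, so the snippet below only illustrates the basic series/parallel and k-out-of-n algebra that an RBD evaluation rests on, using hypothetical channel and sensor unavailabilities.

```python
from math import prod, comb

def series(qs):
    """Series blocks: the path fails if any block fails (qs are unavailabilities)."""
    return 1.0 - prod(1.0 - q for q in qs)

def parallel(qs):
    """Fully redundant blocks: the path fails only if all blocks fail."""
    return prod(qs)

def k_out_of_n(q, k, n):
    """Unavailability of a k-out-of-n:G group of identical blocks with
    per-block unavailability q (the group fails when fewer than k survive)."""
    return sum(comb(n, j) * (1.0 - q) ** j * q ** (n - j) for j in range(k))

# Hypothetical figures: two redundant processing channels in series
# with a 2-out-of-4 sensor group.
q_channel, q_sensor = 1e-3, 5e-3
q_system = series([parallel([q_channel, q_channel]),
                   k_out_of_n(q_sensor, k=2, n=4)])
print(f"system unavailability ~ {q_system:.2e}")
```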

  18. Quantitative analysis of dynamic fault trees using improved Sequential Binary Decision Diagrams

    International Nuclear Information System (INIS)

    Ge, Daochuan; Lin, Meng; Yang, Yanhua; Zhang, Ruoxing; Chou, Qiang

    2015-01-01

    Dynamic fault trees (DFTs) are powerful in modeling systems with sequence- and function-dependent failure behaviors. The key point lies in how to quantify complex DFTs analytically and efficiently. Unfortunately, the existing methods for analyzing DFTs all have their own disadvantages: they either suffer from the problem of combinatorial explosion or need a long computation time to obtain an accurate solution. Sequential Binary Decision Diagrams (SBDDs) are regarded as a novel and efficient approach to deal with DFTs, but two apparent shortcomings remain to be handled: SBDDs may generate invalid nodes when given an unfavourable variable index, and the scale of the resultant cut sequences relies greatly on the chosen variable index. An improved SBDD method is proposed in this paper to deal with the two mentioned problems. It uses an improved ite (If-Then-Else) algorithm to avoid generating invalid nodes when building SBDDs, and a heuristic variable index to keep the scale of the resultant cut sequences as small as possible. To confirm the applicability and merits of the proposed method, several benchmark examples are demonstrated, and the results indicate this approach is efficient as well as reasonable. - Highlights: • New ITE method. • Linear complexity-based finding algorithm. • Heuristic variable index.
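
    The paper's improved, sequence-aware ite and its heuristic variable index are not reproduced in the abstract. For orientation only, the sketch below implements the classic reduced ordered BDD ite operator with a unique table and a computed-table cache — the standard mechanism the improved SBDD algorithm builds on — for a small static structure function.

```python
# Classic ROBDD with hash-consing and the ite operator (illustration only;
# not the paper's sequence-aware SBDD variant).
TERMINAL_0, TERMINAL_1 = ("0",), ("1",)
unique, computed = {}, {}

def mk(var, low, high):
    """Create (or reuse) a reduced node (var, low, high)."""
    if low is high:
        return low
    key = (var, id(low), id(high))
    if key not in unique:
        unique[key] = (var, low, high)
    return unique[key]

def top_var(*nodes):
    return min(n[0] for n in nodes if len(n) == 3)

def cofactor(node, var, value):
    if len(node) == 1 or node[0] != var:
        return node
    return node[2] if value else node[1]

def ite(f, g, h):
    """'if f then g else h', recursing on the top variable."""
    if f is TERMINAL_1:
        return g
    if f is TERMINAL_0:
        return h
    if g is TERMINAL_1 and h is TERMINAL_0:
        return f
    key = (id(f), id(g), id(h))
    if key not in computed:
        v = top_var(f, g, h)
        low = ite(cofactor(f, v, 0), cofactor(g, v, 0), cofactor(h, v, 0))
        high = ite(cofactor(f, v, 1), cofactor(g, v, 1), cofactor(h, v, 1))
        computed[key] = mk(v, low, high)
    return computed[key]

def var(i):
    return mk(i, TERMINAL_0, TERMINAL_1)

# f = x1 AND (x2 OR x3): AND(a, b) = ite(a, b, 0), OR(a, b) = ite(a, 1, b).
x1, x2, x3 = var(1), var(2), var(3)
f = ite(x1, ite(x2, TERMINAL_1, x3), TERMINAL_0)
print("BDD nodes created:", len(unique))   # 5 for this ordering
```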

  19. Computational analysis of RNA-protein interaction interfaces via the Voronoi diagram.

    Science.gov (United States)

    Mahdavi, Sedigheh; Mohades, Ali; Salehzadeh Yazdi, Ali; Jahandideh, Samad; Masoudi-Nejad, Ali

    2012-01-21

    Cellular functions are mediated by various biological processes including biomolecular interactions, such as protein-protein, DNA-protein and RNA-protein interactions, among which RNA-protein interactions are indispensable for many biological processes like cell development and viral replication. Unlike protein-protein and protein-DNA interactions, the accurate mechanisms and structures of RNA-protein complexes are not fully understood. A large amount of theoretical evidence has shown during the past several years that computational geometry is the first step in understanding binding profiles and plays a key role in the study of intricate biological structures, interactions and complexes. In this paper, the RNA-protein interaction interface surface is computed via the weighted Voronoi diagram of atoms. Using two filter operations provides a natural definition for interface atoms, as in classic methods. Unbounded parts of Voronoi facets that are far from the complex are trimmed using a modified convex hull of the atom centers. This algorithm is applied to a database of different RNA-protein complexes extracted from the Protein Data Bank (PDB). Afterward, the features of the interfaces are computed and compared with the classic method. The results show high correlation coefficients between interface size in the Voronoi model and the classical model based on solvent accessibility, as well as high accuracy and precision in comparison to the classical model. Copyright © 2011 Elsevier Ltd. All rights reserved.

  20. Imaging gait analysis: An fMRI dual task study.

    Science.gov (United States)

    Bürki, Céline N; Bridenbaugh, Stephanie A; Reinhardt, Julia; Stippich, Christoph; Kressig, Reto W; Blatow, Maria

    2017-08-01

    In geriatric clinical diagnostics, gait analysis with cognitive-motor dual tasking is used to predict fall risk and cognitive decline. To date, the neural correlates of cognitive-motor dual tasking processes are not fully understood. To investigate these underlying neural mechanisms, we designed an fMRI paradigm to reproduce the gait analysis. We tested the fMRI paradigm's feasibility in a substudy with fifteen young adults and assessed 31 healthy older adults in the main study. First, gait speed and variability were quantified using the GAITRite © electronic walkway. Then, participants lying in the MRI-scanner were stepping on pedals of an MRI-compatible stepping device used to imitate gait during functional imaging. In each session, participants performed cognitive and motor single tasks as well as cognitive-motor dual tasks. Behavioral results showed that the parameters of both gait analyses, GAITRite © and fMRI, were significantly positively correlated. FMRI results revealed significantly reduced brain activation during dual task compared to single task conditions. Functional ROI analysis showed that activation in the superior parietal lobe (SPL) decreased less from single to dual task condition than activation in primary motor cortex and in supplementary motor areas. Moreover, SPL activation was increased during dual tasks in subjects exhibiting lower stepping speed and lower executive control. We were able to simulate walking during functional imaging with valid results that reproduce those from the GAITRite © gait analysis. On the neural level, SPL seems to play a crucial role in cognitive-motor dual tasking and to be linked to divided attention processes, particularly when motor activity is involved.

  1. A framework for cognitive task analysis in systems design

    International Nuclear Information System (INIS)

    Rasmussen, J.

    1985-08-01

    The present rapid development of advanced information technology and its use for support of operators of complex technical systems are changing the content of task analysis towards the analysis of mental activities in decision making. Automation removes the humans from routine tasks, and operators are left with disturbance control and critical diagnostic tasks, for which computers are suitable for support, if it is possible to match the computer strategies and interface formats dynamically to the requirements of the current task by means of an analysis of the cognitive task. Such a cognitive task analysis will not aim at a description of the information processes suited for particular control situations. It will rather aim at an analysis in order to identify the requirements to be considered along various dimensions of the decision tasks, in order to give the user - i.e. a decision maker - the freedom to adapt his performance to system requirements in a way which matches his process resources and subjective preferences. To serve this purpose, a number of analyses at various levels are needed to relate the control requirements of the system to the information processes and to the processing resources offered by computers and humans. The paper discusses the cognitive task analysis in terms of the following domains: The problem domain, which is a representation of the functional properties of the system giving a consistent framework for identification of the control requirements of the system; the decision sequences required for typical situations; the mental strategies and heuristics which are effective and acceptable for the different decision functions; and the cognitive control mechanisms used, depending upon the level of skill which can/will be applied. Finally, the end-users' criteria for choice of mental strategies in the actual situation are considered, and the need for development of criteria for judging the ultimate user acceptance of computer support is

  2. Viral pathogenesis in diagrams

    National Research Council Canada - National Science Library

    Tremblay, Michel; Berthiaume, Laurent; Ackermann, Hans-Wolfgang

    2001-01-01

    .... The 268 diagrams in Viral Pathogenesis in Diagrams were selected from over 800 diagrams of English and French virological literature, including one derived from a famous drawing by Leonardo da Vinci...

  3. Analysis of color-magnitude diagrams from three large Magellanic Cloud clusters

    International Nuclear Information System (INIS)

    Jones, J.H.

    1985-01-01

    The color-magnitude diagrams of three LMC clusters and a field were derived from photographic and CCD data provided by Dr. P.J. Flower of Clemson University and Dr. R. Schommer of Rutgers University. The photographic data were scanned and converted to intensity images at KPNO. The stellar photometry program RICHFLD was used to measure the raw magnitudes from these images. Problems with the standard sequence on the plate kept the color terms for the photographic data from being well determined. A version of DAOPHOT was installed on the VAX 11/280s at Clemson and was used to measure the magnitudes from the CCD images of NGC 2249. These magnitudes were used to define another photoelectric sequence for the photographic data which were used to determine a well defined transformation into the standard BV system. The CMDs derived from both the photographic and CCD images of NGC 2249 showed a gap near the tip of the MS. This gap was taken to be the period of rapid evolution just after core hydrogen exhaustion. Using a true distance modulus of 18.3 for the LMC and a reddening taken from the literature, an age of 600 +/- 75 million years was found for NGC 2249. Comparing the CMD of SL 889 to that of NGC 2249 gives a similar age for this small LMC cluster. A subgiant branch was identified in the CMD of NGC 2241. Comparison to old metal poor galactic clusters gave an age near 4 billion years, favoring the short distance scale to the LMC

  4. "Cooperative collapse" of the denatured state revealed through Clausius-Clapeyron analysis of protein denaturation phase diagrams.

    Science.gov (United States)

    Tischer, Alexander; Machha, Venkata R; Rösgen, Jörg; Auton, Matthew

    2018-02-19

    Protein phase diagrams have a unique potential to identify the presence of additional thermodynamic states even when non-2-state character is not readily apparent from the experimental observables used to follow protein unfolding transitions. Two-state analysis of the von Willebrand factor A3 domain has previously revealed a discrepancy in the calorimetric enthalpy obtained from thermal unfolding transitions as compared with Gibbs-Helmholtz analysis of free energies obtained from the Linear Extrapolation Method (Tischer and Auton, Prot Sci 2013; 22(9):1147-60). We resolve this thermodynamic conundrum using a Clausius-Clapeyron analysis of the urea-temperature phase diagram that defines how ΔH and the urea m-value interconvert through the slope of c_m versus T, (∂c_m/∂T) = ΔH/(mT). This relationship permits the calculation of ΔH at low temperature from m-values obtained through iso-thermal urea denaturation and high temperature m-values from ΔH obtained through iso-urea thermal denaturation. Application of this equation uncovers sigmoid transitions in both cooperativity parameters as temperature is increased. Such residual thermal cooperativity of ΔH and the m-value confirms the presence of an additional state which is verified to result from a cooperative phase transition between urea-expanded and thermally-compact denatured states. Comparison of the equilibria between expanded and compact denatured ensembles of disulfide-intact and carboxyamidated A3 domains reveals that introducing a single disulfide crosslink does not affect the presence of the additional denatured state. It does, however, make a small thermodynamically favorable free energy (∼-13 ± 1 kJ/mol) contribution to the cooperative denatured state collapse transition as temperature is raised and urea concentration is lowered. The thermodynamics of this "cooperative collapse" of the denatured state retain significant compensations between the enthalpy and entropy contributions to the overall
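
    As a small numeric companion to the relation quoted above, (∂c_m/∂T) = ΔH/(mT), the snippet rearranges it to ΔH = m·T·(∂c_m/∂T); all input numbers are hypothetical and serve only to show the unit bookkeeping (the sign depends on the direction of the transition being followed).

```python
# Rearranging the quoted relation: dH = m * T * (d c_m / d T).
m = 5.0e3        # urea m-value, J mol^-1 M^-1 (hypothetical)
T = 298.15       # temperature, K
dcm_dT = -0.02   # slope of the midpoint urea concentration c_m, M K^-1 (hypothetical)

dH = m * T * dcm_dT
print(f"dH ~ {dH / 1000:.1f} kJ/mol")   # about -29.8 kJ/mol with these inputs
```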

  5. Family history assessment for colorectal cancer (CRC) risk analysis - comparison of diagram- and questionnaire-based web interfaces.

    Science.gov (United States)

    Schultz, Michael; Seo, Steven Bohwan; Holt, Alec; Regenbrecht, Holger

    2015-11-18

    Colorectal cancer (CRC) has a high incidence, especially in New Zealand. The reasons for this are unknown. While most cancers develop sporadically, a positive family history, determined by the number and age at diagnosis of affected first and second degree relatives with CRC, is one of the major factors which may increase an individual's lifetime risk. Before a patient can be enrolled in a surveillance program, a detailed assessment and documentation of the family history is important but time consuming and often inaccurate. The documentation is usually paper-based. Our aim was therefore to develop and validate the usability and efficacy of a web-based family history assessment tool for CRC suitable for the general population. The tool was also to calculate the risk and make a recommendation for surveillance. Two versions of an electronic assessment tool, diagram-based and questionnaire-based, were developed with the risk analysis and recommendations for surveillance based on the New Zealand Guidelines Group recommendations. Accuracy of our tool was tested prior to the study by comparing risk calculations based on family history by experienced gastroenterologists with the electronic assessment. The general public, visiting a local science fair, were asked to use and comment on the usability of the two interfaces. Ninety people assessed and commented on the two interfaces. Both interfaces were effective in assessing the risk of developing CRC from family history. However, the questionnaire-based interface performed with significantly better satisfaction (p = 0.001) than the diagram-based interface. There was no difference in efficacy though. We conclude that a web-based questionnaire tool can assist in the accurate documentation and analysis of the family history relevant to determine the individual risk of CRC based on local guidelines. The calculator is now implemented and accessible through the web page of a local charity for colorectal cancer

  6. Diagram Size vs. Layout Flaws: Understanding Quality Factors of UML Diagrams

    DEFF Research Database (Denmark)

    Störrle, Harald

    2016-01-01

    , though, is our third goal of extending our analysis aspects of diagram quality. Method: We improve our definition of diagram size and add a (provisional) definition of diagram quality as the number of topographic layout flaws. We apply these metrics on 60 diagrams of the five most commonly used types of UML diagram. We carefully analyze the structure of our diagram samples to ensure representativeness. We correlate diagram size and layout quality with modeler performance data obtained in previous experiments. The data set is the largest of its kind (n=156). Results: We replicate earlier findings, and extend them to two new diagram types. We provide an improved definition of diagram size, and provide a definition of topographic layout quality, which is one more step towards a comprehensive definition of diagram quality as such. Both metrics are shown to be objectively applicable. We quantify

  7. Memory systems, processes, and tasks: taxonomic clarification via factor analysis.

    Science.gov (United States)

    Bruss, Peter J; Mitchell, David B

    2009-01-01

    The nature of various memory systems was examined using factor analysis. We reanalyzed data from 11 memory tasks previously reported in Mitchell and Bruss (2003). Four well-defined factors emerged, closely resembling episodic and semantic memory and conceptual and perceptual implicit memory, in line with both memory systems and transfer-appropriate processing accounts. To explore taxonomic issues, we ran separate analyses on the implicit tasks. Using a cross-format manipulation (pictures vs. words), we identified 3 prototypical tasks. Word fragment completion and picture fragment identification tasks were "factor pure," tapping perceptual processes uniquely. Category exemplar generation revealed its conceptual nature, yielding both cross-format priming and a picture superiority effect. In contrast, word stem completion and picture naming were more complex, revealing attributes of both processes.
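
    As an illustration of the kind of reanalysis described above, the sketch below fits a four-factor model to a participants-by-tasks score matrix with scikit-learn. The data, the number of factors, and the task labels are placeholders; they are not the Mitchell and Bruss (2003) data, and scikit-learn is only one of several tools that could be used.

```python
# Illustrative factor analysis of a participants x tasks score matrix.
# Random data stands in for real memory-task scores.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n_participants, n_tasks = 120, 11
scores = rng.normal(size=(n_participants, n_tasks))  # placeholder data

fa = FactorAnalysis(n_components=4, rotation="varimax", random_state=0)
fa.fit(scores)

# Loadings: one row per factor, one column per task; large absolute values
# indicate which tasks load on ("define") each factor.
task_names = [f"task_{i}" for i in range(n_tasks)]
for i, loadings in enumerate(fa.components_):
    top = sorted(zip(task_names, loadings), key=lambda t: -abs(t[1]))[:3]
    print(f"factor {i + 1}:", [(name, round(w, 2)) for name, w in top])
```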

  8. Measurement Uncertainty of Liquid Chromatographic Analyses Visualized by Ishikawa Diagrams

    OpenAIRE

    Meyer, Veronika R.

    2017-01-01

    Ishikawa, or cause-and-effect diagrams, help to visualize the parameters that influence a chromatographic analysis. Therefore, they facilitate the set up of the uncertainty budget of the analysis, which can then be expressed in mathematical form. If the uncertainty is calculated as the Gaussian sum of all uncertainty parameters, it is necessary to quantitate them all, a task that is usually not practical. The other possible approach is to use the intermediate precision as a base for the uncer...

  9. CADDIS Volume 4. Data Analysis: Advanced Analyses - Controlling for Natural Variability: SSD Plot Diagrams

    Science.gov (United States)

    Methods for controlling natural variability: predicting environmental conditions from biological observations, biological trait data, species sensitivity distributions, propensity scores, and references for the Advanced Analyses section of Data Analysis.

  10. Stability analysis for tidal inlets of Thuan An and Tu Hien using Escoffier diagram

    NARCIS (Netherlands)

    Lam, N.T.; Verhagen, H.J.; Van der Wegen, M.

    2004-01-01

    Stability analysis of tidal inlets is very important in providing knowledge on the behaviour of tidal inlet and lagoon systems. The analysis results can help to plan and manage the system effectively as well as to provide information for stability design of the inlets. This paper presents a method

  11. Robotics/Automated Systems Task Analysis and Description of Required Job Competencies Report. Task Analysis and Description of Required Job Competencies of Robotics/Automated Systems Technicians.

    Science.gov (United States)

    Hull, Daniel M.; Lovett, James E.

    This task analysis report for the Robotics/Automated Systems Technician (RAST) curriculum project first provides a RAST job description. It then discusses the task analysis, including the identification of tasks, the grouping of tasks according to major areas of specialty, and the comparison of the competencies to existing or new courses to…

  12. Diagram-based Analysis of Causal Systems (DACS): elucidating inter-relationships between determinants of acute lower respiratory infections among children in sub-Saharan Africa.

    Science.gov (United States)

    Rehfuess, Eva A; Best, Nicky; Briggs, David J; Joffe, Mike

    2013-12-06

    Effective interventions require evidence on how individual causal pathways jointly determine disease. Based on the concept of systems epidemiology, this paper develops Diagram-based Analysis of Causal Systems (DACS) as an approach to analyze complex systems, and applies it by examining the contributions of proximal and distal determinants of childhood acute lower respiratory infections (ALRI) in sub-Saharan Africa. Diagram-based Analysis of Causal Systems combines the use of causal diagrams with multiple routinely available data sources, using a variety of statistical techniques. In a step-by-step process, the causal diagram evolves from conceptual, based on a priori knowledge and assumptions, through operational, informed by data availability, which then undergoes empirical testing, to integrated, which synthesizes information from multiple datasets. In our application, we apply different regression techniques to Demographic and Health Survey (DHS) datasets for Benin, Ethiopia, Kenya and Namibia and a pooled World Health Survey (WHS) dataset for sixteen African countries. Explicit strategies are employed to make decisions transparent about the inclusion/omission of arrows, the sign and strength of the relationships and homogeneity/heterogeneity across settings. Findings about the current state of evidence on the complex web of socio-economic, environmental, behavioral and healthcare factors influencing childhood ALRI, based on DHS and WHS data, are summarized in an integrated causal diagram. Notably, solid fuel use is structured by socio-economic factors and increases the risk of childhood ALRI mortality. Diagram-based Analysis of Causal Systems is a means of organizing the current state of knowledge about a specific area of research, and a framework for integrating statistical analyses across a whole system. This partly a priori approach is explicit about causal assumptions guiding the analysis and about researcher judgment, and wrong assumptions can be reversed

  13. An Analysis of Impact Factors for Positioning Performance in WLAN Fingerprinting Systems Using Ishikawa Diagrams and a Simulation Platform

    Directory of Open Access Journals (Sweden)

    Keqiang Liu

    2017-01-01

    Many factors influence the positioning performance of WLAN RSSI fingerprinting systems, and summarizing these factors is an important but challenging job. Moreover, impact analysis of non-algorithm factors is significant for system application and quality control, but little research has been conducted. This paper analyzes and summarizes the potential impact factors using an Ishikawa diagram covering radio signal transmitting, propagating, receiving, and processing. A simulation platform was developed to facilitate the analysis experiments, and the paper classifies the potential factors into controllable, uncontrollable, nuisance, and held-constant factors based on simulation feasibility. It takes five non-algorithm controllable factors into consideration, namely AP density, AP distribution, radio signal propagation attenuation factor, radio signal propagation noise, and RP density, and adopts the OFAT (one-factor-at-a-time) analysis method in the experiments. The positioning results were obtained using deterministic and probabilistic algorithms, and the error is presented as RMSE and CDF. The results indicate that high AP density, attenuation factor, and RP density, together with a low propagation noise level, are favorable to better performance, while AP distribution shows no particular impact pattern on the positioning error. Overall, this paper contributes to the quality control of WLAN fingerprinting solutions.
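
    To make the evaluation pipeline concrete, the sketch below implements a deterministic nearest-neighbor fingerprinting step and the RMSE metric mentioned in the abstract. The radio map, reference points (RPs), and test RSSI vectors are synthetic placeholders, and simple Euclidean matching in signal space is only one of many possible deterministic algorithms.

```python
# Deterministic (nearest-neighbour in signal space) fingerprinting with RMSE.
# The radio map and measurements are synthetic; a real system would build the
# map from RSSI surveys at known reference points (RPs).
import numpy as np

rng = np.random.default_rng(1)
n_rps, n_aps, n_tests = 50, 6, 20
rp_positions = rng.uniform(0, 30, size=(n_rps, 2))        # RP coordinates (m)
radio_map = rng.uniform(-90, -40, size=(n_rps, n_aps))    # RSSI per RP per AP (dBm)

true_idx = rng.integers(0, n_rps, size=n_tests)
measurements = radio_map[true_idx] + rng.normal(0, 3, size=(n_tests, n_aps))  # noisy RSSI
true_positions = rp_positions[true_idx]

# Match each measurement to the RP whose stored fingerprint is closest in signal space.
dists = np.linalg.norm(measurements[:, None, :] - radio_map[None, :, :], axis=2)
estimated_positions = rp_positions[np.argmin(dists, axis=1)]

errors = np.linalg.norm(estimated_positions - true_positions, axis=1)
rmse = np.sqrt(np.mean(errors ** 2))
print(f"RMSE: {rmse:.2f} m")
```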

  14. A Validated Task Analysis of the Single Pilot Operations Concept

    Science.gov (United States)

    Wolter, Cynthia A.; Gore, Brian F.

    2015-01-01

    The current-day flight deck operational environment consists of a two-person Captain/First Officer crew. A concept of operations (ConOps) to reduce the commercial cockpit to a single pilot from the current two-pilot crew is termed Single Pilot Operations (SPO). This concept has been under study by researchers in the Flight Deck Display Research Laboratory (FDDRL) at the National Aeronautics and Space Administration's (NASA) Ames Research Center (Johnson, Comerford, Lachter, Battiste, Feary, and Mogford, 2012) and by researchers at Langley Research Center (Schutte et al., 2007). Transitioning from a two-pilot crew to a single-pilot crew will undoubtedly require changes in operational procedures, crew coordination, use of automation, and in how the roles and responsibilities of the flight deck and ATC are conceptualized, in order to maintain the high levels of safety expected of the US National Airspace System. These modifications will affect the roles and the subsequent tasks that are required of the various operators in the NextGen environment. The current report outlines the process taken to identify and document the tasks required of the crew according to a number of operational scenarios studied by the FDDRL between the years 2012-2014. A baseline task decomposition has been refined to represent the tasks consistent with a new set of entities, tasks, roles, and responsibilities being explored by the FDDRL as the move is made towards SPO. Information from Subject Matter Expert interviews, participation in FDDRL experimental design meetings, and study observation was used to populate and refine task sets that were developed as part of the SPO task analyses. The task analysis is based upon the proposed ConOps for the third FDDRL SPO study. This experiment possessed nine different entities operating in six scenarios using a variety of SPO-related automation and procedural activities required to guide safe and efficient aircraft operations. The task analysis presents the roles and

  15. Cognitive Task Analysis of Prioritization in Air Traffic Control.

    Science.gov (United States)

    Redding, Richard E.; And Others

    A cognitive task analysis was performed to analyze the key cognitive components of the en route air traffic controllers' jobs. The goals were to ascertain expert mental models and decision-making strategies and to identify important differences in controller knowledge, skills, and mental models as a function of expertise. Four groups of…

  16. Data analysis & probability task sheets : grades pk-2

    CERN Document Server

    Cook, Tanya

    2009-01-01

    For grades PK-2, our Common Core State Standards-based resource meets the data analysis & probability concepts addressed by the NCTM standards and encourages your students to learn and review the concepts in unique ways. Each task sheet is organized around a central problem taken from real-life experiences of the students.

  17. Use of Job and Task Analysis in Training.

    Science.gov (United States)

    George Washington Univ., Alexandria, VA. Human Resources Research Office.

    At a briefing sponsored by the Office of the Deputy Chief of Staff for Individual Training, members of the Human Resources Research Office reported on four projects using job and task analysis in different training situations. Work Unit STOCK was a training program designed to develop training management procedures for heterogeneous ability…

  18. Partial Derivative Games in Thermodynamics: A Cognitive Task Analysis

    Science.gov (United States)

    Kustusch, Mary Bridget; Roundy, David; Dray, Tevian; Manogue, Corinne A.

    2014-01-01

    Several studies in recent years have demonstrated that upper-division students struggle with the mathematics of thermodynamics. This paper presents a task analysis based on several expert attempts to solve a challenging mathematics problem in thermodynamics. The purpose of this paper is twofold. First, we highlight the importance of cognitive task…

  19. Bayesian Networks and Influence Diagrams

    DEFF Research Database (Denmark)

    Kjærulff, Uffe Bro; Madsen, Anders Læsø

    Probabilistic networks, also known as Bayesian networks and influence diagrams, have become one of the most promising technologies in the area of applied artificial intelligence, offering intuitive, efficient, and reliable methods for diagnosis, prediction, decision making, classification, troubleshooting, and data mining under uncertainty. Bayesian Networks and Influence Diagrams: A Guide to Construction and Analysis provides a comprehensive guide for practitioners who wish to understand, construct, and analyze intelligent systems for decision support based on probabilistic networks. Intended...

  20. Application of ISO22000, failure mode, and effect analysis (FMEA) cause and effect diagrams and pareto in conjunction with HACCP and risk assessment for processing of pastry products.

    Science.gov (United States)

    Varzakas, Theodoros H

    2011-09-01

    The Failure Mode and Effect Analysis (FMEA) model has been applied for the risk assessment of pastry processing. A tentative approach of FMEA application to the pastry industry was attempted in conjunction with ISO22000. Preliminary Hazard Analysis was used to analyze and predict the occurring failure modes in a food chain system (pastry processing plant), based on the functions, characteristics, and/or interactions of the ingredients or the processes, upon which the system depends. Critical Control points have been identified and implemented in the cause and effect diagram (also known as Ishikawa, tree diagram, and fishbone diagram). In this work a comparison of ISO22000 analysis with HACCP is carried out over pastry processing and packaging. However, the main emphasis was put on the quantification of risk assessment by determining the Risk Priority Number (RPN) per identified processing hazard. Storage of raw materials and storage of final products at -18°C followed by freezing were the processes identified as the ones with the highest RPN (225, 225, and 144 respectively) and corrective actions were undertaken. Following the application of corrective actions, a second calculation of RPN values was carried out leading to considerably lower values (below the upper acceptable limit of 130). It is noteworthy that the application of Ishikawa (Cause and Effect or Tree diagram) led to converging results thus corroborating the validity of conclusions derived from risk assessment and FMEA. Therefore, the incorporation of FMEA analysis within the ISO22000 system of a pastry processing industry is considered imperative.
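
    The RPN figures quoted above follow the usual FMEA convention of multiplying severity, occurrence, and detection ratings and screening the product against an upper acceptable limit (here 130). The short sketch below reproduces that arithmetic; the example hazards and the individual S/O/D ratings are invented for illustration and are not the values assigned in the pastry study.

```python
# RPN = Severity x Occurrence x Detection, screened against an acceptance limit.
# Hazards and 1-10 ratings below are illustrative placeholders.
RPN_LIMIT = 130

hazards = {
    # name: (severity, occurrence, detection)
    "raw material storage temperature abuse": (9, 5, 5),
    "final product storage at -18 C": (9, 5, 5),
    "freezing step deviation": (8, 6, 3),
    "packaging seal failure": (6, 3, 4),
}

for name, (s, o, d) in hazards.items():
    rpn = s * o * d
    action = "corrective action required" if rpn > RPN_LIMIT else "acceptable"
    print(f"{name}: RPN = {rpn} -> {action}")
```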

  1. The CCT diagram of the austenite transformations of the 45 steel during isothermal cooling. Dilatometric and microscopic analysis

    International Nuclear Information System (INIS)

    Wierszyllowski, I.; Wieczorek, S.

    2003-01-01

    The CCT diagram of the austenite transformations of 45 steel during isochronal cooling makes it possible to develop equations that enable prediction of structure and properties after conventional heat treatment. A dilatometric method was applied in order to work out such a diagram for 45 steel austenitized at 1050°C. The structures that appeared at the applied cooling rates are presented. The shapes of the dilatometric curves describing the austenite transformations and the resulting microstructures are mutually related. During the austenite-to-ferrite transformation, separation into equiaxial and acicular ferrite was possible. The limiting cooling rate at which acicular ferrite starts to precipitate was determined, as well as the M_s and M_f temperatures. The shape of the CCT diagram developed with the use of isochronal cooling differs from the conventional one. The results obtained are discussed on the basis of the literature. (author)

  2. Inpo/industry job and task analysis efforts

    International Nuclear Information System (INIS)

    Wigley, W.W.

    1985-01-01

    One of the goals of INPO is to develop and coordinate industrywide programs to improve the education, training and qualification of nuclear utility personnel. To accomplish this goal, INPO's Training and Education Division: conducts periodic evaluations of industry training programs; provides assistance to the industry in developing training programs; manages the accreditation of utility training programs. These efforts are aimed at satisfying the need for training programs for nuclear utility personnel to be performance-based. Performance-based means that training programs provide an incumbent with the skills and knowledge required to safely perform the job. One of the ways that INPO has provided assistance to the industry is through the industrywide job and task analysis effort. I will discuss the job analysis and task analysis processes, the current status of JTA efforts, JTA products and JTA lessons learned

  3. Shock Revival in Core-collapse Supernovae: A Phase-diagram Analysis

    Science.gov (United States)

    Gabay, Daniel; Balberg, Shmuel; Keshet, Uri

    2015-12-01

    We examine the conditions for the revival of the stalled accretion shock in core-collapse supernovae, in the context of the neutrino heating mechanism. We combine one-dimensional simulations of the shock revival process with a derivation of a quasi-stationary approximation, which is both accurate and efficient in predicting the flow. In particular, this approach is used to explore how the evolution of the accretion shock depends on the shock radius, R_S, and velocity, V_S (in addition to other global properties of the system). We do so through a phase-space analysis of the shock acceleration, a_S, in the R_S–V_S plane, shown to provide quantitative insights into the initiation and nature of runaway expansion. In the particular case of an initially stationary (V_S = 0, a_S = 0) profile, the prospects for an explosion can be assessed by the initial signs of the partial derivatives of the shock acceleration, in analogy to a linear damped/anti-damped oscillator. If ∂a_S/∂R_S < 0 and ∂a_S/∂V_S > 0, runaway will likely occur after several oscillations, while if ∂a_S/∂R_S > 0, runaway expansion will commence in a non-oscillatory fashion. These two modes of runaway correspond to low and high mass accretion rates, respectively. We also use the quasi-stationary approximation to assess the advection-to-heating timescale ratio in the gain region, often used as an explosion proxy. Indeed, this ratio does tend to ∼1 in conjunction with runaway conditions, but neither this unit value nor the specific choice of the gain region as a point of reference appear to be unique in this regard.
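
    The sign criteria described above lend themselves to a compact decision rule. The sketch below encodes that classification for a shock that starts from a stationary profile; the numerical derivative values passed in are placeholders that would, in practice, come from the quasi-stationary approximation, and the non-runaway branch is inferred from the damped-oscillator analogy rather than stated explicitly in the abstract.

```python
# Classify the expected mode of shock runaway from the signs of the partial
# derivatives of the shock acceleration a_S with respect to R_S and V_S,
# following the phase-space criteria summarized in the abstract.

def runaway_mode(daS_dRS, daS_dVS):
    """Return a qualitative expectation for an initially stationary shock
    (V_S = 0, a_S = 0), given the two partial derivatives of a_S."""
    if daS_dRS > 0:
        return "non-oscillatory runaway expansion (high accretion rate regime)"
    if daS_dRS < 0 and daS_dVS > 0:
        return "runaway after several growing oscillations (low accretion rate regime)"
    if daS_dRS < 0 and daS_dVS < 0:
        return "damped oscillations; no runaway expected"
    return "marginal case; sign information insufficient"

# Placeholder derivative values for illustration only.
print(runaway_mode(daS_dRS=-0.2, daS_dVS=0.5))
print(runaway_mode(daS_dRS=0.3, daS_dVS=-0.1))
```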

  4. EXPERIMENTAL ANALYSIS AND ISHIKAWA DIAGRAM FOR BURN ON EFFECT ON MANGANESE SILICON ALLOY MEDIUM CARBON STEEL SHAFT

    Directory of Open Access Journals (Sweden)

    AsmamawTegegne

    2013-12-01

    Burn-on/metal penetration is one of the surface defects of metal castings in general and of steel castings in particular. Research was carried out on the effect of burn-on on a six-ton medium carbon steel shaft, intended as a roller for cold-rolled steel sheet, produced at a metals industry plant. The shaft was cast in sand by pouring through the riser/feeding head step by step (with time intervals between pours). As a foam casting method was required for better surface finish and dimensional accuracy of the cast, the pattern was prepared from polystyrene and embedded in silica sand. Physical observation, photographic analysis, visual inspection, measurement of the depth of penetration, and a fishbone diagram were used as methods of results analysis. The shaft produced was strongly affected by sand sintering (burn-on/metal penetration). Several causes may underlie these defects; however, the analysis showed that a poorly designed gating system led to turbulent flow, uncontrollably high temperature fused the silica sand, and liquid polystyrene penetrated the poorly reclaimed and rammed sand mold. As a result, eroded sand penetrated deep into the liquid metal and reacted with it. Consequently, after solidification and finishing, the required 240 mm diameter of the shaft was reduced unevenly to between 133 mm and 229 mm, resulting in rejection of the shaft since it fell below the standard required for the designed application. In addition, it was not possible to remove the adhered sand by grinding. The burn-on is therefore classified as mechanical burn-on.

  5. Introduction to Feynman diagrams

    CERN Document Server

    Bilenky, Samoil Mikhelevich

    1974-01-01

    Introduction to Feynman Diagrams presents Feynman diagram techniques and methods for calculating quantities measured experimentally. The book discusses Feynman diagram topics intended for experimental physicists. Topics presented include methods for calculating matrix elements (by perturbation theory) and the basic rules for constructing Feynman diagrams; techniques for calculating cross sections and polarizations; processes in which both leptons and hadrons take part; and the electromagnetic and weak form factors of nucleons. Experimental physicists and graduate students of physics will

  6. Single-Family Energy Auditor Job Task Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Head, Heather R [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Kurnik, Charles W [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2018-05-02

    The National Renewable Energy Laboratory (NREL) is contracted by the U.S. Department of Energy (DOE) Weatherization Assistance Program (WAP) to develop and maintain the resources under the Guidelines for Home Energy Professionals (GHEP) project. As part of the GHEP strategy to increase the quality of work conducted for single-family, residential energy-efficiency retrofits, the Home Energy Professionals Job Task Analyses are used as the foundation for quality training programs and trainers.

  7. Task analysis: How far are we from usable PRA input

    International Nuclear Information System (INIS)

    Gertman, D.I.; Blackman, H.S.; Hinton, M.F.

    1984-01-01

    This chapter reviews data collected at the Idaho National Engineering Laboratory for three DOE-owned reactors (the Advanced Test Reactor, the Power Burst Facility, and the Loss of Fluids Test Reactor) in order to identify usable Probabilistic Risk Assessment (PRA) input. Task analytic procedures involve the determination of manning and skill levels as a means of determining communication requirements, in assessing job performance aids, and in assessing the accuracy and completeness of emergency and maintenance procedures. The least understood aspect in PRA and plant reliability models is the human factor. A number of examples from the data base are discussed and offered as a means of providing more meaningful data than has been available to PRA analysts in the past. It is concluded that the plant hardware-procedures-personnel interfaces are essential to safe and efficient plant operations and that task analysis is a reasonably sound way of achieving a qualitative method for identifying those tasks most strongly associated with task difficulty, severity of consequence, and error probability

  8. Operations space diagram for ECRH and ECCD

    International Nuclear Information System (INIS)

    Bindslev, Henrik

    2004-01-01

    A Clemmov-Mullaly-Allis (CMA) type diagram, the ECW-CMA diagram, for representing the operational possibilities of electron cyclotron heating and current drive (ECRH/ECCD) systems for fusion plasmas is presented. In this diagram, with normalized density and normalized magnetic field coordinates, the parameter range in which it is possible to achieve a given task (e.g. O-mode current drive for stabilizing a neoclassical tearing mode) appears as a region. With also the Greenwald density limit shown, this diagram condenses the information on operational possibilities, facilitating the overview required at the design phase. At the operations phase it may also prove useful in setting up experimental scenarios by showing operational possibilities, avoiding the need for survey type ray-tracing at the initial planning stages. The diagram may also serve the purpose of communicating operational possibilities to non-experts. JET and ITER like plasmas are used, but the method is generic. (author)
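
    CMA-type diagrams conventionally use the squared ratio of plasma frequency to wave frequency, X = (ω_pe/ω)², as the normalized density axis and the ratio of electron cyclotron frequency to wave frequency, Y = ω_ce/ω, as the normalized magnetic field axis. The sketch below computes these quantities for an assumed set of plasma parameters; the specific numbers, and the assumption that the ECW-CMA diagram of this paper uses exactly these normalizations, are illustrative only.

```python
# Standard CMA-style normalized coordinates for an electron-cyclotron wave:
#   X = (omega_pe / omega)^2   (normalized density)
#   Y = omega_ce / omega       (normalized magnetic field)
# Plasma parameters below are illustrative, not tied to any specific device.
import math

E_CHARGE = 1.602176634e-19   # C
E_MASS = 9.1093837015e-31    # kg
EPS0 = 8.8541878128e-12      # F/m

def cma_coordinates(density_m3, b_field_T, wave_freq_Hz):
    omega = 2 * math.pi * wave_freq_Hz
    omega_pe = math.sqrt(density_m3 * E_CHARGE**2 / (EPS0 * E_MASS))
    omega_ce = E_CHARGE * b_field_T / E_MASS
    return (omega_pe / omega) ** 2, omega_ce / omega

# Example: n_e = 5e19 m^-3, B = 2.5 T, 110 GHz launch frequency.
X, Y = cma_coordinates(5e19, 2.5, 110e9)
print(f"X = {X:.3f}, Y = {Y:.3f}")
```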

  9. Operations space diagram for ECRH and ECCD

    DEFF Research Database (Denmark)

    Bindslev, H.

    2004-01-01

    A Clemmov-Mullaly-Allis (CMA) type diagram, the ECW-CMA diagram, for representing the operational possibilities of electron cyclotron heating and current drive (ECRH/ECCD) systems for fusion plasmas is presented. In this diagram, with normalized density and normalized magnetic field coordinates, the parameter range in which it is possible to achieve a given task (e.g. O-mode current drive for stabilizing a neoclassical tearing mode) appears as a region. With the Greenwald density limit also shown, this diagram condenses the information on operational possibilities, facilitating the overview required at the design phase. At the operations phase it may also prove useful in setting up experimental scenarios by showing operational possibilities, avoiding the need for survey type ray-tracing at the initial planning stages. The diagram may also serve the purpose of communicating operational possibilities to non-experts.

  10. Time-dependent structural transformation analysis to high-level Petri net model with active state transition diagram

    Directory of Open Access Journals (Sweden)

    Saito Ayumu

    2010-04-01

    Background: With an accumulation of in silico data obtained by simulating large-scale biological networks, a new research interest is emerging in elucidating how a living organism functions over time in cells. Investigating the dynamic features of current computational models promises a deeper understanding of complex cellular processes. This leads us to develop a method that utilizes structural properties of the model over all simulation time steps. Further, user-friendly overviews of dynamic behaviors can provide great help in understanding the variations of system mechanisms. Results: We propose a novel method for constructing and analyzing a so-called active state transition diagram (ASTD) by using time-course simulation data of a high-level Petri net. Our method includes two new algorithms. The first algorithm extracts a series of subnets (called temporal subnets) reflecting biological components contributing to the dynamics, while retaining positive mathematical qualities. The second one creates an ASTD composed of unique temporal subnets. ASTD provides users with concise information allowing them to grasp and trace how a key regulatory subnet and/or a network changes with time. The applicability of our method is demonstrated by the analysis of the underlying model for circadian rhythms in Drosophila. Conclusions: Building an ASTD is a useful means to convert a hybrid model dealing with discrete, continuous and more complicated events to finite time-dependent states. Based on the ASTD, various analytical approaches can be applied to obtain new insights into not only systematic mechanisms but also dynamics.

  11. Development of task analysis method for operator tasks in main control room of an advanced nuclear power plant

    International Nuclear Information System (INIS)

    Lin Chiuhsiangloe; Hsieh Tsungling

    2016-01-01

    Task analysis methods provide an insight for quantitative and qualitative predictions of how people will use a proposed system, though the different versions have different emphases. Most of the methods can attest to the coverage of the functionality of a system and all provide estimates of task performance time. However, most of the tasks that operators deal with in a digital work environment in the main control room of an advanced nuclear power plant require high mental activity. Such mental tasks overlap and must be dealt with at the same time; most of them can be assumed to be highly parallel in nature. Therefore, the primary aim to be addressed in this paper was to develop a method that adopts CPM-GOMS (cognitive perceptual motor-goals operators methods selection rules) as the basic pattern of mental task analysis for the advanced main control room. A within-subjects experiment design was used to examine the validity of the modified CPM-GOMS. Thirty participants participated in two task types, which included high- and low-compatibility types. The results indicated that the performance was significantly higher on the high-compatibility task type than on the low-compatibility task type; that is, the modified CPM-GOMS could distinguish the difference between high- and low-compatibility mental tasks. (author)

  12. Warped penguin diagrams

    International Nuclear Information System (INIS)

    Csaki, Csaba; Grossman, Yuval; Tanedo, Philip; Tsai, Yuhsin

    2011-01-01

    We present an analysis of the loop-induced magnetic dipole operator in the Randall-Sundrum model of a warped extra dimension with anarchic bulk fermions and an IR brane-localized Higgs. These operators are finite at one-loop order and we explicitly calculate the branching ratio for μ→eγ using the mixed position/momentum space formalism. The particular bound on the anarchic Yukawa and Kaluza-Klein (KK) scales can depend on the flavor structure of the anarchic matrices. It is possible for a generic model to either be ruled out or unaffected by these bounds without any fine-tuning. We quantify how these models realize this surprising behavior. We also review tree-level lepton flavor bounds in these models and show that these are on the verge of tension with the μ→eγ bounds from typical models with a 3 TeV Kaluza-Klein scale. Further, we illuminate the nature of the one-loop finiteness of these diagrams and show how to accurately determine the degree of divergence of a five-dimensional loop diagram using both the five-dimensional and KK formalism. This power counting can be obfuscated in the four-dimensional Kaluza-Klein formalism and we explicitly point out subtleties that ensure that the two formalisms agree. Finally, we remark on the existence of a perturbative regime in which these one-loop results give the dominant contribution.

  13. Physical and cognitive task analysis in interventional radiology

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, S [School of Psychology, University of Liverpool, Liverpool (United Kingdom); Healey, A [Royal Liverpool University Hospital, Liverpool (United Kingdom); Evans, J [Royal Liverpool University Hospital, Liverpool (United Kingdom); Murphy, M [Royal Liverpool University Hospital, Liverpool (United Kingdom); Crawshaw, M [Department of Psychology, University of Hull, Hull (United Kingdom); Gould, D [Royal Liverpool University Hospital, Liverpool (United Kingdom)

    2006-01-15

    AIM: To identify, describe and detail the cognitive thought processes, decision-making, and physical actions involved in the preparation and successful performance of core interventional radiology procedures. MATERIALS AND METHODS: Five commonly performed core interventional radiology procedures were selected for cognitive task analysis. Several examples of each procedure being performed by consultant interventional radiologists were videoed. The videos of those procedures, and the steps required for successful outcome, were analysed by a psychologist and an interventional radiologist. Once a skeleton algorithm of the procedures was defined, further refinement was achieved using individual interview techniques with consultant interventional radiologists. Additionally a critique of each iteration of the established algorithm was sought from non-participating independent consultant interventional radiologists. RESULTS: Detailed task descriptions and decision protocols were developed for five interventional radiology procedures (arterial puncture, nephrostomy, venous access, biopsy-using both ultrasound and computed tomography, and percutaneous transhepatic cholangiogram). Identical tasks performed within these procedures were identified and standardized within the protocols. CONCLUSIONS: Complex procedures were broken down and their constituent processes identified. This might be suitable for use as a training protocol to provide a universally acceptable safe practice at the most fundamental level. It is envisaged that data collected in this way can be used as an educational resource for trainees and could provide the basis for a training curriculum in interventional radiology. It will direct trainees towards safe practice of the highest standard. It will also provide performance objectives of a simulator model.

  14. Physical and cognitive task analysis in interventional radiology

    International Nuclear Information System (INIS)

    Johnson, S.; Healey, A.; Evans, J.; Murphy, M.; Crawshaw, M.; Gould, D.

    2006-01-01

    AIM: To identify, describe and detail the cognitive thought processes, decision-making, and physical actions involved in the preparation and successful performance of core interventional radiology procedures. MATERIALS AND METHODS: Five commonly performed core interventional radiology procedures were selected for cognitive task analysis. Several examples of each procedure being performed by consultant interventional radiologists were videoed. The videos of those procedures, and the steps required for successful outcome, were analysed by a psychologist and an interventional radiologist. Once a skeleton algorithm of the procedures was defined, further refinement was achieved using individual interview techniques with consultant interventional radiologists. Additionally a critique of each iteration of the established algorithm was sought from non-participating independent consultant interventional radiologists. RESULTS: Detailed task descriptions and decision protocols were developed for five interventional radiology procedures (arterial puncture, nephrostomy, venous access, biopsy-using both ultrasound and computed tomography, and percutaneous transhepatic cholangiogram). Identical tasks performed within these procedures were identified and standardized within the protocols. CONCLUSIONS: Complex procedures were broken down and their constituent processes identified. This might be suitable for use as a training protocol to provide a universally acceptable safe practice at the most fundamental level. It is envisaged that data collected in this way can be used as an educational resource for trainees and could provide the basis for a training curriculum in interventional radiology. It will direct trainees towards safe practice of the highest standard. It will also provide performance objectives of a simulator model

  15. Analysis and Modeling of Control Tasks in Dynamic Systems

    DEFF Research Database (Denmark)

    Ursem, Rasmus Kjær; Krink, Thiemo; Jensen, Mikkel Thomas

    2002-01-01

    Most applications of evolutionary algorithms deal with static optimization problems. However, in recent years, there has been a growing interest in time-varying (dynamic) problems, which are typically found in real-world scenarios. One major challenge in this field is the design of realistic test-case generators (TCGs), which requires a systematic analysis of dynamic optimization tasks. So far, only a few TCGs have been suggested. Our investigation leads to the conclusion that these TCGs are not capable of generating realistic dynamic benchmark tests. The result of our research is the design of a new TCG...

  16. Job task analysis: lessons learned from application in course development

    International Nuclear Information System (INIS)

    Meredith, J.B.

    1985-01-01

    Those at Public Service Electric and Gas Company are committed to a systematic approach to training known as Instructional System Design. Our performance-based training emphasizes the ISD process to have trainees do or perform the task whenever and wherever it is possible for the jobs for which they are being trained. Included is a brief description of our process for conducting and validating job analyses. The major thrust of this paper is primarily on the lessons that we have learned in the design and development of training programs based upon job analysis results

  17. A layout technique for class diagrams to be used in product configuration projects

    DEFF Research Database (Denmark)

    Haug, Anders; Hvam, Lars; Mortensen, Niels Henrik

    2010-01-01

    On the other hand, the requirements for the design language are more focused on having a formalised and rich language. For this task, class diagrams are often applied. To avoid the use of different modelling languages in the analysis and design phases, this paper proposes and tests a layout technique...

  18. Use of the Ishikawa diagram in a case-control analysis to assess the causes of a diffuse lamellar keratitis outbreak

    Directory of Open Access Journals (Sweden)

    Luis Henrique Lira

    Purpose: To identify the causes of a diffuse lamellar keratitis (DLK) outbreak using a systematic search tool in a case-control analysis. Methods: An Ishikawa diagram was used to guide physicians to determine the potential risk factors involved in this outbreak. Coherence between the occurrences and each possible cause listed in the diagram was verified, and the total number of eyes at risk was used to calculate the proportion of affected eyes. Multivariate analysis was performed using logistic regression to determine the independent effect of the risk factors, after controlling for confounders and test interactions. Results: All DLK cases were reported in 2007 between June 13 and December 21; during this period, 3,698 procedures were performed. Of the 1,682 flap-related procedures, 204 eyes of 141 individuals presented with DLK. No direct relationship was observed between the occurrence of DLK and the presence of any specific factors; however, flap-lifting enhancements, procedures performed during the morning shift, and non-use of therapeutic contact lenses after the surgery were significantly related to higher occurrence percentages of this condition. Conclusions: The Ishikawa diagram, like most quality tools, is a visualization and knowledge organization tool. This systematization allowed the investigators to thoroughly assess all the possible causes of the DLK outbreak. A clear view of the entire surgical logistics permitted even more rigid management of the main factors involved in the process and, as a result, highlighted factors that deserved attention. The case-control analysis of every factor raised by the Ishikawa diagram indicated that commonly suspected factors such as biofilm contamination of the water reservoir in autoclaves, the air-conditioning filter system, glove powder, microkeratome motor oil, and gentian violet markers were not related to the outbreak.

  19. Use of the Ishikawa diagram in a case-control analysis to assess the causes of a diffuse lamellar keratitis outbreak.

    Science.gov (United States)

    Lira, Luis Henrique; Hirai, Flávio E; Oliveira, Marivaldo; Portellinha, Waldir; Nakano, Eliane Mayumi

    2017-01-01

    To identify the causes of a diffuse lamellar keratitis (DLK) outbreak using a systematic search tool in a case-control analysis. An Ishikawa diagram was used to guide physicians to determine the potential risk factors involved in this outbreak. Coherence between the occurrences and each possible cause listed in the diagram was verified, and the total number of eyes at risk was used to calculate the proportion of affected eyes. Multivariate analysis was performed using logistic regression to determine the independent effect of the risk factors, after controlling for confounders and test interactions. All DLK cases were reported in 2007 between June 13 and December 21; during this period, 3,698 procedures were performed. Of the 1,682 flap-related procedures, 204 eyes of 141 individuals presented with DLK. No direct relationship was observed between the occurrence of DLK and the presence of any specific factors; however, flap-lifting enhancements, procedures performed during the morning shift, and non-use of therapeutic contact lenses after the surgery were significantly related to higher occurrence percentages of this condition. The Ishikawa diagram, like most quality tools, is a visualization and knowledge organization tool. This systematization allowed the investigators to thoroughly assess all the possible causes of DLK outbreak. A clear view of the entire surgical logistics permitted even more rigid management of the main factors involved in the process and, as a result, highlighted factors that deserved attention. The case-control analysis on every factor raised by the Ishikawa diagram indicated that the commonly suspected factors such as biofilm contamination of the water reservoir in autoclaves, the air-conditioning filter system, glove powder, microkeratome motor oil, and gentian violet markers were not related to the outbreak.
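
    The multivariate step described in this record is a standard logistic regression over candidate factors drawn from the Ishikawa diagram. The sketch below shows that kind of fit with statsmodels on fabricated data; the factor names, the data, and the resulting odds ratios are placeholders, not findings of the study.

```python
# Case-control style logistic regression: DLK occurrence vs. candidate factors.
# The data are randomly generated placeholders; only the workflow is illustrative.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 500
df = pd.DataFrame({
    "flap_lift_enhancement": rng.integers(0, 2, n),
    "morning_shift": rng.integers(0, 2, n),
    "therapeutic_contact_lens": rng.integers(0, 2, n),
})
# Fabricated outcome loosely dependent on the first factor.
logit_p = -2.0 + 1.0 * df["flap_lift_enhancement"]
df["dlk"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

exog = sm.add_constant(df[["flap_lift_enhancement",
                           "morning_shift",
                           "therapeutic_contact_lens"]])
model = sm.Logit(df["dlk"], exog).fit(disp=False)
print(np.exp(model.params))  # odds ratios per factor (placeholder data)
```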

  20. Using Goal Setting and Task Analysis to Enhance Task-Based Language Learning and Teaching

    Science.gov (United States)

    Rubin, Joan

    2015-01-01

    Task-Based Language Learning and Teaching has received sustained attention from teachers and researchers for over thirty years. It is a well-established pedagogy that includes the following characteristics: major focus on authentic and real-world tasks, choice of linguistic resources by learners, and a clearly defined non-linguistic outcome. This…

  1. Validity of the alcohol purchase task: a meta-analysis.

    Science.gov (United States)

    Kiselica, Andrew M; Webber, Troy A; Bornovalova, Marina A

    2016-05-01

    Behavioral economists assess alcohol consumption as a function of unit price. This method allows construction of demand curves and demand indices, which are thought to provide precise numerical estimates of risk for alcohol problems. One of the more commonly used behavioral economic measures is the Alcohol Purchase Task (APT). Although the APT has shown promise as a measure of risk for alcohol problems, the construct validity and incremental utility of the APT remain unclear. This paper presents a meta-analysis of the APT literature. Sixteen studies were included in the meta-analysis. Studies were gathered via searches of the PsycInfo, PubMed, Web of Science and EconLit research databases. Random-effects meta-analyses with inverse variance weighting were used to calculate summary effect sizes for each demand index-drinking outcome relationship. Moderation of these effects by drinking status (regular versus heavy drinkers) was examined. Additionally, tests of the incremental utility of the APT indices in predicting drinking problems above and beyond measuring alcohol consumption were performed. The APT indices were correlated in the expected directions with drinking outcomes, although many effects were small in size. These effects were typically not moderated by the drinking status of the samples. Additionally, the intensity metric demonstrated incremental utility in predicting alcohol use disorder symptoms beyond measuring drinking. The Alcohol Purchase Task appears to have good construct validity, but limited incremental utility in estimating risk for alcohol problems. © 2015 Society for the Study of Addiction.
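
    For readers unfamiliar with the APT, the commonly reported observed demand indices can be computed directly from the price-consumption schedule. The sketch below derives intensity, breakpoint, Omax, and Pmax from a made-up set of purchase-task responses; it does not implement the model-based elasticity fits used in some of the meta-analyzed studies.

```python
# Observed demand indices from a hypothetical Alcohol Purchase Task schedule.
# prices are $/drink; consumption is self-reported drinks purchased at each price.
prices = [0.00, 0.25, 0.50, 1.00, 1.50, 2.00, 3.00, 4.00, 6.00, 8.00]
consumption = [10, 10, 9, 8, 6, 5, 3, 2, 0, 0]

intensity = consumption[0]                                  # drinks at zero/lowest price
breakpoint = next((p for p, q in zip(prices, consumption) if q == 0), None)
expenditure = [p * q for p, q in zip(prices, consumption)]  # spending at each price
omax = max(expenditure)                                     # peak expenditure
pmax = prices[expenditure.index(omax)]                      # price at peak expenditure

print(f"intensity = {intensity} drinks")
print(f"breakpoint = ${breakpoint:.2f}")
print(f"Omax = ${omax:.2f} at Pmax = ${pmax:.2f}")
```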

  2. Bayesian Networks and Influence Diagrams

    DEFF Research Database (Denmark)

    Kjærulff, Uffe Bro; Madsen, Anders Læsø

    Bayesian Networks and Influence Diagrams: A Guide to Construction and Analysis, Second Edition, provides a comprehensive guide for practitioners who wish to understand, construct, and analyze intelligent systems for decision support based on probabilistic networks. This new edition contains six new...

  3. Quantitative analysis of task selection for brain-computer interfaces

    Science.gov (United States)

    Llera, Alberto; Gómez, Vicenç; Kappen, Hilbert J.

    2014-10-01

    Objective. To assess quantitatively the impact of task selection in the performance of brain-computer interfaces (BCI). Approach. We consider the task-pairs derived from multi-class BCI imagery movement tasks in three different datasets. We analyze for the first time the benefits of task selection on a large-scale basis (109 users) and evaluate the possibility of transferring task-pair information across days for a given subject. Main results. Selecting the subject-dependent optimal task-pair among three different imagery movement tasks results in approximately 20% potential increase in the number of users that can be expected to control a binary BCI. The improvement is observed with respect to the best task-pair fixed across subjects. The best task-pair selected for each subject individually during a first day of recordings is generally a good task-pair in subsequent days. In general, task learning from the user side has a positive influence in the generalization of the optimal task-pair, but special attention should be given to inexperienced subjects. Significance. These results add significant evidence to existing literature that advocates task selection as a necessary step towards usable BCIs. This contribution motivates further research focused on deriving adaptive methods for task selection on larger sets of mental tasks in practical online scenarios.

  4. Structured information analysis for human reliability analysis of emergency tasks in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Won Dea; Kim, Jae Whan; Park, Jin Kyun; Ha, Jae Joo [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2000-02-01

    More than twenty HRA (Human Reliability Analysis) methodologies have been developed and used for the safety analysis in nuclear field during the past two decades. However, no methodology appears to have universally been accepted, as various limitations have been raised for more widely used ones. One of the most important limitations of conventional HRA is insufficient analysis of the task structure and problem space. To resolve this problem, we suggest SIA (Structured Information Analysis) for HRA. The proposed SIA consists of three parts. The first part is the scenario analysis that investigates the contextual information related to the given task on the basis of selected scenarios. The second is the goals-means analysis to define the relations between the cognitive goal and task steps. The third is the cognitive function analysis module that identifies the cognitive patterns and information flows involved in the task. Through the three-part analysis, systematic investigation is made possible from the macroscopic information on the tasks to the microscopic information on the specific cognitive processes. It is expected that analysts can attain a structured set of information that helps to predict the types and possibility of human error in the given task. 48 refs., 12 figs., 11 tabs. (Author)

  5. Simulation analysis and ternary diagram of municipal solid waste pyrolysis and gasification based on the equilibrium model.

    Science.gov (United States)

    Deng, Na; Zhang, Awen; Zhang, Qiang; He, Guansong; Cui, Wenqian; Chen, Guanyi; Song, Chengcai

    2017-07-01

    A self-sustained municipal solid waste (MSW) pyrolysis-gasification process with self-produced syngas as heat source was proposed and an equilibrium model was established to predict the syngas reuse rate considering variable MSW components. Simulation results indicated that for constant moisture (ash) content, with the increase of ash (moisture) content, syngas reuse rate gradually increased, and reached the maximum 100% when ash (moisture) content was 73.9% (60.4%). Novel ternary diagrams with moisture, ash and combustible as axes were proposed to predict the adaptability of the self-sustained process and syngas reuse rate for waste. For wastes of given components, its position in the ternary diagram can be determined and the syngas reuse rate can be obtained, which will provide guidance for system design. Assuming that the MSW was composed of 100% combustible content, ternary diagram shows that there was a minimum limiting value of 43.8% for the syngas reuse rate in the process. Copyright © 2017. Published by Elsevier Ltd.
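
    A ternary diagram of the kind proposed places each waste composition by its moisture, ash, and combustible fractions, which must sum to 100%. The sketch below normalizes a hypothetical composition and converts it to Cartesian plotting coordinates using the standard equilateral-triangle transform; the predicted syngas reuse rate itself comes from the paper's equilibrium model and is not reproduced here.

```python
# Place a waste composition (moisture, ash, combustible, wt%) on a ternary diagram.
# The equilibrium-model prediction of syngas reuse rate is NOT reproduced here;
# this only shows how a composition maps to a point on the diagram.
import math

def ternary_xy(moisture, ash, combustible):
    total = moisture + ash + combustible
    m, a, c = moisture / total, ash / total, combustible / total  # normalize to 1
    # Standard equilateral-triangle transform with the three pure components
    # at the corners (moisture at the origin, ash at the right, combustible at the top).
    x = a + 0.5 * c
    y = (math.sqrt(3) / 2) * c
    return x, y

# Hypothetical MSW composition: 45% moisture, 20% ash, 35% combustible.
print(ternary_xy(45.0, 20.0, 35.0))
```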

  6. Risk analysis using AS/NZS 4360:2004, Bow-Tie diagram and ALARP on construction projects of Banyumanik Hospital

    Science.gov (United States)

    Sari, Diana Puspita; Pujotomo, Darminto; Wardani, Nadira Kusuma

    2017-11-01

    Risk concerns uncertain events that can have negative or positive impacts on project objectives. A project is defined as a series of activities and tasks that have a purpose, specifications, and cost limits. The Banyumanik Hospital development project is one of several construction projects in Semarang that have experienced problems. The first problem was a project delay related to the building stake work; the second was a delay in material supply. Finally, there was insufficient management attention to health and safety, as evidenced by the unavailability of PPE for the workers. These problems pose risks, making risk management by the contractors of the Banyumanik Hospital development project very important in order to reduce the impact of the risks borne by the construction services provider. This research aims at risk identification, risk assessment, and risk mitigation. Project risk management begins with the identification of risks based on the project life cycle. The risk assessment was carried out with AS/NZS 4360:2004 with respect to the impacts on cost, time, and quality. The AS/NZS 4360:2004 method yields the risks that require mitigation, namely those with significant and high risk levels. Four risks require mitigation with Bow-Tie diagrams: work accidents, contract delays, material delays, and design changes. The Bow-Tie diagram method identifies causes as well as preventive and recovery actions for a risk. The preventive and recovery actions obtained from the Bow-Tie diagrams are used as input to the ALARP method, which prioritizes the proposed strategies into the categories broadly acceptable, tolerable, and unacceptable.

  7. Measurement uncertainty of liquid chromatographic analyses visualized by Ishikawa diagrams.

    Science.gov (United States)

    Meyer, Veronika R

    2003-09-01

    Ishikawa, or cause-and-effect diagrams, help to visualize the parameters that influence a chromatographic analysis. Therefore, they facilitate the set up of the uncertainty budget of the analysis, which can then be expressed in mathematical form. If the uncertainty is calculated as the Gaussian sum of all uncertainty parameters, it is necessary to quantitate them all, a task that is usually not practical. The other possible approach is to use the intermediate precision as a base for the uncertainty calculation. In this case, it is at least necessary to consider the uncertainty of the purity of the reference material in addition to the precision data. The Ishikawa diagram is then very simple, and so is the uncertainty calculation. This advantage is given by the loss of information about the parameters that influence the measurement uncertainty.
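
    The simplified budget described above combines just two contributions: the intermediate precision of the method and the uncertainty of the reference material purity. A minimal sketch of that root-sum-of-squares combination, with an optional coverage factor for the expanded uncertainty, is given below; the numerical values are illustrative, not taken from the paper.

```python
# Combined standard uncertainty from intermediate precision and reference purity,
# as in the simplified Ishikawa-based budget described above. Values are illustrative.
import math

def combined_uncertainty(u_precision_rel, u_purity_rel, coverage_factor=2.0):
    """Relative standard uncertainties in; returns (combined, expanded) relative values."""
    u_c = math.sqrt(u_precision_rel**2 + u_purity_rel**2)
    return u_c, coverage_factor * u_c

# Example: 1.2% relative intermediate precision, 0.3% relative purity uncertainty.
u_c, U = combined_uncertainty(0.012, 0.003)
print(f"combined u_c = {u_c * 100:.2f}%, expanded U (k=2) = {U * 100:.2f}%")
```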

  8. Task analysis and computer aid development for human reliability analysis in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, W. C.; Kim, H.; Park, H. S.; Choi, H. H.; Moon, J. M.; Heo, J. Y.; Ham, D. H.; Lee, K. K.; Han, B. T. [Korea Advanced Institute of Science and Technology, Taejeon (Korea)

    2001-04-01

    The importance of human reliability analysis (HRA), which predicts the possibility of error occurrence in quantitative and qualitative terms, is gradually increasing owing to the effects of human errors on system safety. HRA requires task analysis as an essential step, but extant task analysis techniques have the problem that the collection of information about the situation in which the human error occurs depends entirely on the HRA analyst. This makes the results of the task analysis inconsistent and unreliable. To address this problem, KAERI developed structural information analysis (SIA), which helps to analyze task structures and situations systematically. In this study, the SIA method was evaluated by HRA experts, and a prototype computerized supporting system named CASIA (Computer Aid for SIA) was developed to support performing HRA using the SIA method. Additionally, by applying the SIA method to emergency operating procedures, we derived generic task types used in emergencies and accumulated the analysis results in the CASIA database. CASIA is expected to help HRA analysts perform the analysis more easily and consistently. As more analyses are performed and more data are accumulated in the CASIA database, HRA analysts can freely share their analysis experiences, thereby improving the quality of the HRA. 35 refs., 38 figs., 25 tabs. (Author)

  9. Cognitive Modeling and Task Analysis: Basic Processes and Individual Differences

    National Research Council Canada - National Science Library

    Ackerman, Phillip

    1999-01-01

    ... in a complex-skill environment. The subset of task conditions selected were those that involve basic processes of working memory, task monitoring, and differential loads on spatial reasoning and speed of perceiving...

  10. Optimizing UML Class Diagrams

    Directory of Open Access Journals (Sweden)

    Sergievskiy Maxim

    2018-01-01

    Most object-oriented development technologies rely on the use of the Unified Modeling Language (UML); class diagrams play a very important role in the design process, as they are used to build a model of the software system. Modern CASE tools, which are the basic tools for object-oriented development, cannot be used to optimize UML diagrams. In this manuscript we explain how, based on the use of design patterns and anti-patterns, class diagrams can be verified and optimized. Certain transformations can be carried out automatically; in other cases, potential inefficiencies are indicated and recommendations given. This study also discusses additional CASE tools for validating and optimizing UML class diagrams. For this purpose, a plugin has been developed that analyzes an XMI file containing a description of the class diagrams.

  11. Algorithmic phase diagrams

    Science.gov (United States)

    Hockney, Roger

    1987-01-01

    Algorithmic phase diagrams are a neat and compact representation of the results of comparing the execution times of several algorithms for the solution of the same problem. As an example, the recent results of Gannon and Van Rosendale on the solution of multiple tridiagonal systems of equations are shown in the form of such diagrams. The act of preparing these diagrams has revealed an unexpectedly complex relationship between the best algorithm and the number and size of the tridiagonal systems, which was not evident from the algebraic formulae in the original paper. Even so, for a particular computer, one diagram suffices to predict the best algorithm for all problems that are likely to be encountered, the prediction being read directly from the diagram without complex calculation.

  12. Cognitive task analysis and the design of computerized operator aids

    International Nuclear Information System (INIS)

    Andersson, H.

    1985-01-01

    The new technological possibilities have led to the initiation of many projects for the design and evaluation of computerized operator support systems to be implemented in nuclear power plant control rooms. A typical finding so far has been that operators often have a positive attitude towards such systems but still don't use them very much, mostly because they find almost the same information on the conventional control boards which they are accustomed to use. Still, however, there is a widely shared belief that conventional control rooms have short-comings that make the use of computerized aids necessary. One reason for the limited success so far is that the new systems often are poorly integrated with the existing conventional instrumentation and with the working procedures. The reluctance to use new computer based aids, despite their nice features, is therefore probably caused by an inadequate task analysis made prior to the design of these computerized operator support systems

  13. Uncertainty analysis in the task of individual monitoring data

    International Nuclear Information System (INIS)

    Molokanov, A.; Badjin, V.; Gasteva, G.; Antipin, E.

    2003-01-01

    Assessment of internal doses is an essential component of individual monitoring programmes for workers and consists of two stages: individual monitoring measurements and interpretation of the monitoring data in terms of annual intake and/or annual internal dose. The overall uncertainty in the assessed dose is a combination of the uncertainties in these stages. An algorithm and a computer code were developed for estimating the uncertainty in the assessment of internal dose in the task of individual monitoring data interpretation. Two main influencing factors are analysed in this paper: the unknown time of exposure and the variability of bioassay measurements. The aim of this analysis is to show that the algorithm is applicable in designing an individual monitoring programme for workers so as to guarantee that the individual dose calculated from individual monitoring measurements does not exceed a required limit with a certain confidence probability. (author)

  14. Use of task analysis in control room evaluations

    International Nuclear Information System (INIS)

    Ross, K.C.

    1981-01-01

    Responding to recently formulated regulatory requirements, the BWR Owners' Group, working in conjunction with General Electric, has formulated a method for performing human factors design reviews of nuclear power plant control rooms. This process incorporates task analyses to analyze operational aspects of panel layout and design. Correlation of operator functions defined by emergency procedures against required controls and displays has proven successful in identifying instrumentation required in the control room to adequately respond to transient conditions, and in evaluating the effectiveness of panel design and physical arrangement. Extensions of the analysis have provided information on operator response paths, frequency of use of instruments, and control room layout. The techniques used were based on a need to identify primary controls and indications required by the operator in performing each step of the applicable procedure. The relative locations of these instruments were then analyzed for information on the adequacy of the control room design for those conditions

  15. Inclusion of task dependence in human reliability analysis

    International Nuclear Information System (INIS)

    Su, Xiaoyan; Mahadevan, Sankaran; Xu, Peida; Deng, Yong

    2014-01-01

    Dependence assessment among human errors in human reliability analysis (HRA) is an important issue, which includes the evaluation of the dependence among human tasks and the effect of the dependence on the final human error probability (HEP). This paper presents a computational model to handle dependence in human reliability analysis. The aim of the study is to automatically provide conclusions on the overall degree of dependence and calculate the conditional human error probability (CHEP) once the judgments of the input factors are given. The dependence influencing factors are first identified by the experts and the priorities of these factors are also taken into consideration. Anchors and qualitative labels are provided as guidance for the HRA analyst's judgment of the input factors. The overall degree of dependence between human failure events is calculated based on the input values and the weights of the input factors. Finally, the CHEP is obtained according to a computing formula derived from the technique for human error rate prediction (THERP) method. The proposed method is able to quantify the subjective judgment from the experts and improve the transparency of the HEP evaluation process. Two examples are illustrated to show the effectiveness and the flexibility of the proposed method. - Highlights: • We propose a computational model to handle dependence in human reliability analysis. • The priorities of the dependence influencing factors are taken into consideration. • The overall dependence degree is determined by input judgments and the weights of factors. • The CHEP is obtained according to a computing formula derived from THERP
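
    The final step mentioned above, the THERP-derived adjustment of the HEP for dependence, can be written down directly; the sketch below reproduces the standard THERP equations for the five dependence levels, while the weighted aggregation of dependence-influencing factors proposed in the paper is not reproduced.

        # Sketch: conditional HEP under the five standard THERP dependence levels.
        # The weighted/fuzzy aggregation of influencing factors proposed in the paper
        # is not shown; only the final THERP adjustment formulas are illustrated.
        def conditional_hep(hep, level):
            formulas = {
                "zero":     lambda p: p,
                "low":      lambda p: (1 + 19 * p) / 20,
                "moderate": lambda p: (1 + 6 * p) / 7,
                "high":     lambda p: (1 + p) / 2,
                "complete": lambda p: 1.0,
            }
            return formulas[level](hep)

        print(conditional_hep(0.003, "moderate"))   # ~0.145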

  16. Design Diagrams for the Analysis of Active Pressure on Retaining Walls with the Effect of Line Surcharge

    Directory of Open Access Journals (Sweden)

    Mojtaba Ahmadabadi

    2017-01-01

    Full Text Available In this study, a formulation has been proposed to calculate the pressure on a wall and determine the angle of the failure wedge based on the limit equilibrium method. The formulation is capable of calculating the active pressure coefficient, the resultant of forces on the failure surface, and the pressure distribution on the wall under the effect of a line surcharge. In addition, based on the proposed method, a simple formula has been proposed to calculate the angle of the failure wedge under the effect of the surcharge. Moreover, the proposed approach has the advantage of taking into account the effect of the surcharge in an elastoplastic environment by considering the parameters of the soil and determining the extent to which the surcharge affects the pressure distribution on the wall. In most previous methods and specifications, by contrast, the resultant lateral pressure from the surcharge had been considered in an elastic environment. Finally, based on the obtained results, design diagrams for different soils and different surcharges have been proposed. Using these diagrams, the pressure on the wall, the pressure distribution on the wall, and the angle of the failure wedge are easily obtained. A computer program has also been written in the MATLAB environment; with it, the pressure on the wall under the effect of the surcharge, the angle of the failure wedge, and the pressure distribution on the wall can be determined.
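
    For orientation, the sketch below reproduces only the classical Rankine quantities (active pressure coefficient, triangular pressure distribution and resultant thrust) for a smooth vertical wall with horizontal cohesionless backfill; the line-surcharge term and the elastoplastic treatment that are the paper's actual contribution are not reproduced, so the numbers are illustrative only.

        # Sketch: classical Rankine active earth pressure, WITHOUT the line-surcharge
        # contribution developed in the paper. Values are illustrative only.
        import math

        def rankine_ka(phi_deg):
            phi = math.radians(phi_deg)
            return (1 - math.sin(phi)) / (1 + math.sin(phi))

        def active_pressure(z, gamma, phi_deg):
            """Lateral pressure (kPa) at depth z (m), unit weight gamma (kN/m3)."""
            return rankine_ka(phi_deg) * gamma * z

        def resultant_thrust(height, gamma, phi_deg):
            """Active thrust per metre of wall (kN/m), acting at height/3 above the base."""
            return 0.5 * rankine_ka(phi_deg) * gamma * height ** 2

        print(rankine_ka(30))                  # 1/3
        print(active_pressure(4.0, 18, 30))    # ~24 kPa at 4 m depth
        print(resultant_thrust(4.0, 18, 30))   # ~48 kN/m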

  17. Analysis of operators' diagnosis tasks based on cognitive process

    International Nuclear Information System (INIS)

    Zhou Yong; Zhang Li

    2012-01-01

    Diagnosis tasks in nuclear power plants, which are characterized by high dynamics and uncertainty, are complex reasoning tasks. Diagnosis errors are the main causes of errors of commission. Firstly, based on mental model theory and perception/action cycle theory, a cognitive model for analyzing operators' diagnosis tasks is proposed. Then, the model is used to investigate a trip event which occurred at the Crystal River nuclear power plant. The application demonstrates typical cognitive biases and mistakes which operators may make when performing diagnosis tasks. They mainly include a strong confirmation tendency, difficulty in producing complete hypothesis sets, group mindset, and non-systematic errors in hypothesis testing. (authors)

  18. Revisiting corpus creation and analysis tools for translation tasks

    Directory of Open Access Journals (Sweden)

    Claudio Fantinuoli

    2016-06-01

    Full Text Available Many translation scholars have proposed the use of corpora to allow professional translators to produce high-quality texts which read like originals. Yet the diffusion of this methodology has been modest, one reason being the fact that software for corpus analysis has been developed with the linguist in mind, which means that it is generally complex and cumbersome, offering many advanced features but lacking the level of usability and the specific features that meet translators' needs. To overcome this shortcoming, we have developed TranslatorBank, a free corpus creation and analysis tool designed for translation tasks. TranslatorBank supports the creation of specialized monolingual corpora from the web; it includes a concordancer with a query system similar to a search engine; it uses basic statistical measures to indicate the reliability of results; it accesses the original documents directly for more contextual information; it includes a statistical and linguistic terminology extraction utility to extract the relevant terminology of the domain and the typical collocations of a given term. Designed to be easy and intuitive to use, the tool may help translation students as well as professionals to increase their translation quality by adhering to the specific linguistic variety of the target text corpus.
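
    The concordancer at the heart of such a tool can be reduced to a keyword-in-context (KWIC) search: every occurrence of a query term is returned with a window of surrounding text. The corpus snippet and query below are hypothetical; TranslatorBank's actual query system is not reproduced.

        # Sketch: a minimal keyword-in-context (KWIC) concordance, the core operation
        # of any corpus query tool. Corpus text and query are hypothetical examples.
        import re

        def kwic(text, query, window=30):
            """Yield (left context, match, right context) for each hit."""
            for m in re.finditer(re.escape(query), text, flags=re.IGNORECASE):
                left = text[max(0, m.start() - window):m.start()]
                right = text[m.end():m.end() + window]
                yield left, m.group(), right

        corpus = ("The turbine bypass valve opens automatically. "
                  "Operators may also open the bypass valve manually.")
        for left, hit, right in kwic(corpus, "bypass valve"):
            print(f"...{left}[{hit}]{right}...")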

  19. Applying Hierarchical Task Analysis Method to Discovery Layer Evaluation

    Directory of Open Access Journals (Sweden)

    Marlen Promann

    2015-03-01

    Full Text Available Libraries are implementing discovery layers to offer better user experiences. While usability tests have been helpful in evaluating the success or failure of implementing discovery layers in the library context, the focus has remained on their relative interface benefits over traditional federated search. Informal, site- and context-specific usability tests have done little to test the rigor of discovery layers against the user goals, motivations and workflows they have been designed to support. This study proposes hierarchical task analysis (HTA) as an important complementary evaluation method to usability testing of discovery layers. Relevant literature is reviewed for discovery layers and the HTA method. As no previous application of HTA to the evaluation of discovery layers was found, this paper presents the application of HTA as an expert-based and workflow-centered (e.g., retrieving a relevant book or a journal article) method for evaluating discovery layers. Purdue University's Primo by Ex Libris was used to map eleven use cases as HTA charts. Nielsen's Goal Composition theory was used as an analytical framework to evaluate the charts from two perspectives: (a) users' physical interactions (i.e., clicks), and (b) users' cognitive steps (i.e., decision points for what to do next). A brief comparison of HTA and usability test findings is offered by way of conclusion.
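
    One way to make the two evaluation perspectives concrete is to encode an HTA chart as a nested task structure and tally the physical interactions (clicks) and decision points along a workflow, as in the sketch below; the task tree shown is a hypothetical fragment, not Purdue's actual Primo chart.

        # Sketch: an HTA chart fragment as a nested task structure, with clicks and
        # decision points tallied per workflow. The tree is a hypothetical example.
        TASK = {
            "name": "Retrieve a known journal article",
            "subtasks": [
                {"name": "Enter title in search box", "clicks": 1, "decision": False},
                {"name": "Scan result list",          "clicks": 0, "decision": True},
                {"name": "Choose full-text link",     "clicks": 1, "decision": True},
                {"name": "Authenticate if prompted",  "clicks": 2, "decision": True},
            ],
        }

        def tally(task):
            clicks = sum(t["clicks"] for t in task["subtasks"])
            decisions = sum(1 for t in task["subtasks"] if t["decision"])
            return clicks, decisions

        clicks, decisions = tally(TASK)
        print(f"{TASK['name']}: {clicks} clicks, {decisions} decision points")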

  20. Revisiting corpus creation and analysis tools for translation tasks

    Directory of Open Access Journals (Sweden)

    Claudio Fantinuoli

    2016-04-01

    Many translation scholars have proposed the use of corpora to allow professional translators to produce high-quality texts which read like originals. Yet the diffusion of this methodology has been modest, one reason being the fact that software for corpus analysis has been developed with the linguist in mind, which means that it is generally complex and cumbersome, offering many advanced features but lacking the level of usability and the specific features that meet translators' needs. To overcome this shortcoming, we have developed TranslatorBank, a free corpus creation and analysis tool designed for translation tasks. TranslatorBank supports the creation of specialized monolingual corpora from the web; it includes a concordancer with a query system similar to a search engine; it uses basic statistical measures to indicate the reliability of results; it accesses the original documents directly for more contextual information; it includes a statistical and linguistic terminology extraction utility to extract the relevant terminology of the domain and the typical collocations of a given term. Designed to be easy and intuitive to use, the tool may help translation students as well as professionals to increase their translation quality by adhering to the specific linguistic variety of the target text corpus.

  1. Causal diagrams in systems epidemiology

    Directory of Open Access Journals (Sweden)

    Joffe Michael

    2012-03-01

    Full Text Available Methods of diagrammatic modelling have been greatly developed in the past two decades. Outside the context of infectious diseases, systematic use of diagrams in epidemiology has been mainly confined to the analysis of a single link: that between a disease outcome and its proximal determinant(s). Transmitted causes ("causes of causes") tend not to be systematically analysed. The infectious disease epidemiology modelling tradition models the human population in its environment, typically with the exposure-health relationship and the determinants of exposure being considered at individual and group/ecological levels, respectively. Some properties of the resulting systems are quite general, and are seen in unrelated contexts such as biochemical pathways. Confining analysis to a single link misses the opportunity to discover such properties. The structure of a causal diagram is derived from knowledge about how the world works, as well as from statistical evidence. A single diagram can be used to characterise a whole research area, not just a single analysis - although this depends on the degree of consistency of the causal relationships between different populations - and can therefore be used to integrate multiple datasets. Additional advantages of system-wide models include: the use of instrumental variables - now emerging as an important technique in epidemiology in the context of mendelian randomisation, but under-used in the exploitation of "natural experiments"; the explicit use of change models, which have advantages with respect to inferring causation; and the detection and elucidation of feedback.

  2. Causal diagrams in systems epidemiology.

    Science.gov (United States)

    Joffe, Michael; Gambhir, Manoj; Chadeau-Hyam, Marc; Vineis, Paolo

    2012-03-19

    Methods of diagrammatic modelling have been greatly developed in the past two decades. Outside the context of infectious diseases, systematic use of diagrams in epidemiology has been mainly confined to the analysis of a single link: that between a disease outcome and its proximal determinant(s). Transmitted causes ("causes of causes") tend not to be systematically analysed. The infectious disease epidemiology modelling tradition models the human population in its environment, typically with the exposure-health relationship and the determinants of exposure being considered at individual and group/ecological levels, respectively. Some properties of the resulting systems are quite general, and are seen in unrelated contexts such as biochemical pathways. Confining analysis to a single link misses the opportunity to discover such properties. The structure of a causal diagram is derived from knowledge about how the world works, as well as from statistical evidence. A single diagram can be used to characterise a whole research area, not just a single analysis - although this depends on the degree of consistency of the causal relationships between different populations - and can therefore be used to integrate multiple datasets. Additional advantages of system-wide models include: the use of instrumental variables - now emerging as an important technique in epidemiology in the context of mendelian randomisation, but under-used in the exploitation of "natural experiments"; the explicit use of change models, which have advantages with respect to inferring causation; and the detection and elucidation of feedback.
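
    A system-wide causal diagram of the kind advocated here is naturally encoded as a directed acyclic graph; the sketch below uses networkx to hold a small hypothetical exposure pathway and enumerate the causal paths from a distal determinant ("cause of causes") to the outcome.

        # Sketch: a small causal diagram as a directed acyclic graph, enumerating the
        # causal paths from a distal determinant to the outcome. Node names are
        # hypothetical illustrations, not taken from the paper.
        import networkx as nx

        g = nx.DiGraph()
        g.add_edges_from([
            ("income", "diet"),
            ("income", "smoking"),
            ("diet", "obesity"),
            ("obesity", "disease"),
            ("smoking", "disease"),
        ])

        assert nx.is_directed_acyclic_graph(g)
        for path in nx.all_simple_paths(g, "income", "disease"):
            print(" -> ".join(path))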

  3. Heuristic Task Analysis on E-Learning Course Development: A Formative Research Study

    Science.gov (United States)

    Lee, Ji-Yeon; Reigeluth, Charles M.

    2009-01-01

    Utilizing heuristic task analysis (HTA), a method developed for eliciting, analyzing, and representing expertise in complex cognitive tasks, a formative research study was conducted on the task of e-learning course development to further improve the HTA process. Three instructional designers from three different post-secondary institutions in the…

  4. The Effects of Describing Antecedent Stimuli and Performance Criteria in Task Analysis Instruction for Graphing

    Science.gov (United States)

    Tyner, Bryan C.; Fienup, Daniel M.

    2016-01-01

    Task analyses are ubiquitous to applied behavior analysis interventions, yet little is known about the factors that make them effective. Numerous task analyses have been published in behavior analytic journals for constructing single-subject design graphs; however, learner outcomes using these task analyses may fall short of what could be…

  5. The Fishbone diagram to identify, systematize and analyze the sources of general purpose technologies

    OpenAIRE

    COCCIA, Mario

    2017-01-01

    This study suggests the fishbone diagram for technological analysis. The fishbone diagram (also called Ishikawa diagram or cause-and-effect diagram) is a graphical technique to show the several causes of a specific event or phenomenon. In particular, a fishbone diagram (the shape is similar to a fish skeleton) is a common tool used for a cause-and-effect analysis to identify a complex interplay of causes for a specific problem or event. The fishbone diagram can be a comprehensive theo...

  6. Darwinian algorithms and the Wason selection task: a factorial analysis of social contract selection task problems.

    Science.gov (United States)

    Platt, R D; Griggs, R A

    1993-08-01

    In four experiments with 760 subjects, the present study examined Cosmides' Darwinian algorithm theory of reasoning: specifically, its explanation of facilitation on the Wason selection task. The first experiment replicated Cosmides' finding of facilitation for social contract versions of the selection task, using both her multiple-problem format and a single-problem format. Experiment 2 examined performance on Cosmides' three main social contract problems while manipulating the perspective of the subject and the presence and absence of cost-benefit information. The presence of cost-benefit information improved performance in two of the three problems while the perspective manipulation had no effect. In Experiment 3, the cost-benefit effect was replicated; and performance on one of the three problems was enhanced by the presence of explicit negatives on the NOT-P and NOT-Q cards. Experiment 4 examined the role of the deontic term "must" in the facilitation observed for two of the social contract problems. The presence of "must" led to a significant improvement in performance. The results of these experiments are strongly supportive of social contract theory in that cost-benefit information is necessary for substantial facilitation to be observed in Cosmides' problems. These findings also suggest the presence of other cues that can help guide subjects to a deontic social contract interpretation when the social contract nature of the problem is not clear.

  7. Safety-barrier diagrams as a safety management tool

    DEFF Research Database (Denmark)

    Duijm, Nijs Jan

    2009-01-01

    Safety-barrier diagrams and “bow-tie” diagrams have become popular methods in risk analysis and safety management. This paper describes the syntax and principles for constructing consistent and valid safety-barrier diagrams. The latter's relation to other methods such as fault trees and Bayesian...

  8. Critical Analysis on the Defeat of Task Force Ranger

    National Research Council Canada - National Science Library

    Day, Clifford

    1997-01-01

    .... The final stage, UNOSOM II, involved a peace enforcement and nation building mission. On Sunday, 3 October 1993, the relative success of UNOSOM II suddenly turned violent when a US Task Force came under heavy fire from Somali gunmen...

  9. An Approach to Operational Analysis: Doctrinal Task Decomposition

    Science.gov (United States)

    2016-08-04

    Once the unit is selected, CATS will output all of the doctrinal collective tasks associated with the unit. Currently, CATS outputs this information... Army unit are controlled data items, but for explanation purposes consider this simple example using a restaurant as the unit of interest. Table 1... shows an example Task Model for a restaurant using language and format similar to what CATS provides. Only 3 levels are shown in the example, but

  10. A Hubble Diagram for Quasars

    Directory of Open Access Journals (Sweden)

    Susanna Bisogni

    2018-01-01

    Full Text Available The cosmological model is at present not tested between the redshift of the farthest observed supernovae (z ~ 1.4) and that of the Cosmic Microwave Background (z ~ 1,100). Here we introduce a new method to measure the cosmological parameters: we show that quasars can be used as “standard candles” by employing the non-linear relation between their intrinsic UV and X-ray emission as an absolute distance indicator. We built a sample of ~1,900 quasars with available UV and X-ray observations, and produced a Hubble Diagram up to z ~ 5. The analysis of the quasar Hubble Diagram, when used in combination with supernovae, provides robust constraints on the matter and energy content in the cosmos. The application of this method to forthcoming, larger quasar samples will also provide tight constraints on the dark energy equation of state and its possible evolution with time.
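
    The distance-indicator step can be sketched as follows: if L_X = C · L_UV^γ and both luminosities scale as 4πD²F, the observed fluxes determine the luminosity distance up to the calibration of the relation. The slope and calibration constant in the sketch are illustrative assumptions, not the values fitted in the paper.

        # Sketch: luminosity distance from a quasar's UV and X-ray fluxes, assuming
        # L_X = C * L_UV**gamma with gamma < 1. The slope and calibration constant
        # are illustrative assumptions, not the fitted values of the paper.
        import math

        def luminosity_distance_cm(f_uv, f_x, gamma=0.6, log_C=8.0):
            """Monochromatic fluxes in erg s^-1 cm^-2 Hz^-1; returns D_L in cm."""
            # From 4*pi*D^2*F_X = C * (4*pi*D^2*F_UV)**gamma:
            log_4pi_d2 = (log_C + gamma * math.log10(f_uv) - math.log10(f_x)) / (1 - gamma)
            return math.sqrt(10 ** log_4pi_d2 / (4 * math.pi))

        print(f"{luminosity_distance_cm(3e-28, 5e-32):.2e} cm")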

  11. Identification and Analysis of Multi-tasking Product Information Search Sessions with Query Logs

    Directory of Open Access Journals (Sweden)

    Xiang Zhou

    2016-09-01

    Full Text Available Purpose: This research aims to identify product search tasks in online shopping and analyze the characteristics of consumer multi-tasking search sessions. Design/methodology/approach: The experimental dataset contains 8,949 queries of 582 users from 3,483 search sessions. A sequential comparison of the Jaccard similarity coefficient between two adjacent search queries and hierarchical clustering of queries is used to identify search tasks. Findings: (1) Users issued a similar number of queries (1.43 to 1.47) with similar lengths (7.3-7.6 characters) per task in mono-tasking and multi-tasking sessions, and (2) users spent more time on average in sessions with more tasks, but spent less time on each task when the number of tasks increased in a session. Research limitations: The task identification method that relies only on query terms does not completely reflect the complex nature of consumer shopping behavior. Practical implications: These results provide an exploratory understanding of the relationships among multiple shopping tasks, and can be useful for product recommendation and shopping task prediction. Originality/value: The originality of this research is its use of query clustering with online shopping task identification and analysis, and the analysis of product search session characteristics.
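
    The task-identification step, a Jaccard comparison of the term sets of adjacent queries with a threshold that decides where one task ends and the next begins, can be sketched in a few lines; the query log and the threshold value below are hypothetical.

        # Sketch: split a search session into tasks by comparing adjacent queries with
        # the Jaccard similarity of their term sets. Queries and threshold are
        # hypothetical; the paper additionally applies hierarchical clustering.
        def jaccard(a, b):
            a, b = set(a.lower().split()), set(b.lower().split())
            return len(a & b) / len(a | b) if a | b else 0.0

        def segment_session(queries, threshold=0.2):
            tasks, current = [], [queries[0]]
            for prev, q in zip(queries, queries[1:]):
                if jaccard(prev, q) >= threshold:
                    current.append(q)        # enough shared terms: same task
                else:
                    tasks.append(current)    # low similarity: start a new task
                    current = [q]
            tasks.append(current)
            return tasks

        session = ["wireless mouse", "wireless mouse ergonomic",
                   "usb c hub", "usb c hub 4k hdmi"]
        print(segment_session(session))
        # [['wireless mouse', 'wireless mouse ergonomic'],
        #  ['usb c hub', 'usb c hub 4k hdmi']]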

  12. A survey on the task analysis methods and techniques for nuclear power plant operators

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Yong Heui; Chun, Se Woo; Suh, Sang Moon; Lee, Jung Woon [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1994-04-01

    We have surveyed techniques and methods of task analysis, from very traditional ones to recently developed ones that are being applied in various industrial fields. We compare them with each other and analyse their fundamental characteristics and methodological specifics in order to find one suitable for application to nuclear power plant operators' tasks. Generally, the fundamental process of task analysis is well understood, but its application in practice is not so simple due to the wide and varying range of applications in specific domains. Operators' tasks in NPPs are performed strictly according to operational procedures written as text and are well trained, so a method of task analysis for operators' tasks in NPPs can be established that has its own characteristics based on the operational procedures. 8 figs., 10 tabs., 18 refs. (Author).

  13. A survey on the task analysis methods and techniques for nuclear power plant operators

    International Nuclear Information System (INIS)

    Lee, Yong Heui; Chun, Se Woo; Suh, Sang Moon; Lee, Jung Woon

    1994-04-01

    We have surveyed techniques and methods of task analysis, from very traditional ones to recently developed ones that are being applied in various industrial fields. We compare them with each other and analyse their fundamental characteristics and methodological specifics in order to find one suitable for application to nuclear power plant operators' tasks. Generally, the fundamental process of task analysis is well understood, but its application in practice is not so simple due to the wide and varying range of applications in specific domains. Operators' tasks in NPPs are performed strictly according to operational procedures written as text and are well trained, so a method of task analysis for operators' tasks in NPPs can be established that has its own characteristics based on the operational procedures. 8 figs., 10 tabs., 18 refs. (Author)

  14. Combined analysis of job and task benzene air exposures among workers at four US refinery operations.

    Science.gov (United States)

    Burns, Amanda; Shin, Jennifer Mi; Unice, Ken M; Gaffney, Shannon H; Kreider, Marisa L; Gelatt, Richard H; Panko, Julie M

    2017-03-01

    Workplace air samples analyzed for benzene at four US refineries from 1976 to 2007 were pooled into a single dataset to characterize similarities and differences between job titles, tasks and refineries, and to provide a robust dataset for exposure reconstruction. Approximately 12,000 non-task (>180 min) personal samples associated with 50 job titles and 4000 task (job titles and task codes across all four refineries, and (5) our analysis of variance (ANOVA) of the distribution of benzene air concentrations for select jobs/tasks across all four refineries. The jobs and tasks most frequently sampled included those with the highest potential contact with refinery product streams containing benzene, which reflected the targeted sampling approach utilized by the facility industrial hygienists. Task and non-task data were analyzed to identify and account for significant differences within job-area, task-job, and task-area categories. This analysis demonstrated that, in general, areas with benzene-containing process streams were associated with greater benzene air concentrations than areas with process streams containing little to no benzene. For several job titles and tasks analyzed, there was a statistically significant decrease in benzene air concentration after 1990. This study provides a job- and task-focused analysis of occupational exposure to benzene during refinery operations, and it should be useful for reconstructing refinery workers' exposures to benzene over the past 30 years.
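
    The between-group comparisons described above can be illustrated with a one-way ANOVA on log-transformed air concentrations; the arrays in the sketch are randomly generated placeholders, not refinery measurements.

        # Sketch: one-way ANOVA comparing benzene air concentrations across areas.
        # The samples are synthetic, randomly generated placeholders, not study data.
        import numpy as np
        from scipy.stats import f_oneway

        rng = np.random.default_rng(0)
        area_a = rng.lognormal(mean=-1.0, sigma=0.8, size=40)   # e.g. benzene-rich streams
        area_b = rng.lognormal(mean=-2.0, sigma=0.8, size=40)   # e.g. low-benzene streams
        area_c = rng.lognormal(mean=-2.2, sigma=0.8, size=40)

        stat, p = f_oneway(np.log(area_a), np.log(area_b), np.log(area_c))
        print(f"F = {stat:.2f}, p = {p:.3g}")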

  15. Safety analysis of patient transfers and handling tasks.

    Science.gov (United States)

    Vieira, Er; Kumar, S

    2009-10-01

    Low-back disorders are related to biomechanical demands, and nurses are among the professionals with the highest rates. Quantification of risk factors is important for safety assessment and reduction of low-back disorders. This study aimed to quantify physical demands of frequent nursing tasks and provide evidence-based recommendations to increase low-back safety. Thirty-six volunteer female nurses participated in a cross-sectional study of nine nursing tasks. Lumbar range of motion (ROM) and motion during nursing tasks were measured. Compression and shear forces at L5/S1, ligament strain and percentage of population without sufficient torso strength to perform 14 phases of nine nursing tasks were estimated. Peak flexions during trolley-to-bed, bed-to-chair and chair-to-bed transfers reached the maximum flexion ROM of the nurses. Average lumbar flexion during trolley-to-bed transfers was >50% of flexion ROM, being higher than during all other tasks. Mean (SD) compression at L5/S1 (4754 N (437 N)) and population without sufficient torso strength (37% (9%)) were highest during the pushing phase of bed-to-trolley transfers. Shear force (487 N (40 N)) and ligament strain (14% (5%)) were highest during the pulling phase of trolley-to-bed transfers. Nursing tasks impose high biomechanical demands on the lumbar spine. Excessive lumbar flexion and forces are critical aspects of manual transfers requiring most of the nurses' capabilities. Evidence-based recommendations to improve low-back safety in common nursing tasks were provided. Fitness to work, job modifications and training programs can now be designed and assessed based on the results.

  16. A comparative critical analysis of modern task-parallel runtimes.

    Energy Technology Data Exchange (ETDEWEB)

    Wheeler, Kyle Bruce; Stark, Dylan; Murphy, Richard C.

    2012-12-01

    The rise in node-level parallelism has increased interest in task-based parallel runtimes for a wide array of application areas. Applications have a wide variety of task spawning patterns which frequently change during the course of application execution, based on the algorithm or solver kernel in use. Task scheduling and load balance regimes, however, are often highly optimized for specific patterns. This paper uses four basic task spawning patterns to quantify the impact of specific scheduling policy decisions on execution time. We compare the behavior of six publicly available tasking runtimes: Intel Cilk, Intel Threading Building Blocks (TBB), Intel OpenMP, GCC OpenMP, Qthreads, and High Performance ParalleX (HPX). With the exception of Qthreads, the runtimes prove to have schedulers that are highly sensitive to application structure. No runtime is able to provide the best performance in all cases, and those that do provide the best performance in some cases, unfortunately, provide extremely poor performance when the application structure does not match the scheduler's assumptions.
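
    The notion of a task spawning pattern can be illustrated with one of the simplest cases, a uniform tree fan-out; the sketch below uses Python's standard thread pool purely as a stand-in and is not one of the six runtimes compared in the paper.

        # Sketch: a tree fan-out spawning pattern. The driver enumerates the leaf
        # tasks of a uniform task tree and submits them to a pool; a Python thread
        # pool stands in for the task-parallel runtimes compared in the paper.
        from concurrent.futures import ThreadPoolExecutor

        def leaf_work(i):
            return i * i                      # placeholder leaf computation

        def expand(depth, fanout=4, path=0):
            """Enumerate leaf task identifiers of a uniform task tree."""
            if depth == 0:
                return [path]
            leaves = []
            for k in range(fanout):
                leaves.extend(expand(depth - 1, fanout, path * fanout + k))
            return leaves

        with ThreadPoolExecutor(max_workers=8) as ex:
            results = list(ex.map(leaf_work, expand(depth=3)))
        print(f"leaf tasks executed: {len(results)}")   # 4**3 = 64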

  17. Development of real-time multitask OSS based on cognitive task analysis

    International Nuclear Information System (INIS)

    Wang He; Cheng Shouyu

    2010-01-01

    A Real-time Multi-task Operator Support System (RMOSS) has been developed to support the operator's decision-making process in the control room of an NPP. VxWorks, an embedded real-time operating system, is used for RMOSS software development. According to the results of an SRK-based modeling analysis of the operator's decision-making process, RMOSS is divided into five system subtasks: the Data Collection and Validation Task (DCVT), Operation Monitor Task (OMT), Fault Diagnostic Task (FDT), Operation Guideline Task (OGT) and Human Machine Interface Task (HMIT). The task test of RMOSS has been carried out on a real-time full-scope simulator. The results showed that each task of RMOSS is capable of accomplishing its functions. (authors)

  18. Homotopy Diagrams of Algebras

    Czech Academy of Sciences Publication Activity Database

    Markl, Martin

    2002-01-01

    Roč. 69, - (2002), s. 161-180 ISSN 0009-725X. [Winter School "Geometry and Physics" /21./. Srní, 13.01.2001-20.01.2001] R&D Projects: GA ČR GA201/99/0675 Keywords: colored operad; cofibrant model; homotopy diagram Subject RIV: BA - General Mathematics

  19. Impulse-Momentum Diagrams

    Science.gov (United States)

    Rosengrant, David

    2011-01-01

    Multiple representations are a valuable tool to help students learn and understand physics concepts. Furthermore, representations help students learn how to think and act like real scientists. These representations include: pictures, free-body diagrams, energy bar charts, electrical circuits, and, more recently, computer simulations and…

  20. Equational binary decision diagrams

    NARCIS (Netherlands)

    J.F. Groote (Jan Friso); J.C. van de Pol (Jaco)

    2000-01-01

    We incorporate equations in binary decision diagrams (BDD). The resulting objects are called EQ-BDDs. A straightforward notion of ordered EQ-BDDs (EQ-OBDD) is defined, and it is proved that each EQ-BDD is logically equivalent to an EQ-OBDD. Moreover, on EQ-OBDDs satisfiability and

  1. Limits of Voronoi Diagrams

    NARCIS (Netherlands)

    Lindenbergh, R.C.

    2002-01-01

    The classic Voronoi diagram of a configuration of distinct points in the plane associates to each point that part of the plane that is closer to the point than to any other point in the configuration. In this thesis we no longer require all points to be distinct. After the introduction in

  2. Task Analysis: A Systematic Approach to Designing New Careers Programs.

    Science.gov (United States)

    Jackson, Vivian C.

    This guide presents the primary approaches, tools, and techniques utilized by the New Careers Training Laboratory (NCTL) staff to provide skills in training and to conduct agency task analyses. Much of the technical information has been taken from an earlier NCTL publication by Tita Beal, "A New Careers Guide for Career Development…

  3. Designing Preclinical Instruction for Psychomotor Skills (II)--Instructional Engineering: Task Analysis.

    Science.gov (United States)

    Knight, G. William; And Others

    1994-01-01

    The first step in engineering the instruction of dental psychomotor skills, task analysis, is explained. A chart details the procedural, cognitive, desired-criteria, and desired-performance analysis of a single task, occlusal preparation for amalgam restoration with carious lesion. (MSE)

  4. A σ-T diagram analysis regarding the γ' inhibition in β ↔ β' + γ' cycling in CuAlNi single crystals

    International Nuclear Information System (INIS)

    Gastien, R.; Corbellani, C.E.; Sade, M.; Lovey, F.C.

    2006-01-01

    An effect of inhibition of the γ' martensitic structure in thermal and pseudoelastic β ↔ β' + γ' cycling in CuAlNi single crystals was reported previously [Gastien R, Corbellani CE, Alvarez Villar HN, Sade M, Lovey FC. Mater Sci Eng A 2003;349:191], and an experiment to determine the new thermodynamic parameters to obtain the stress-induced γ' structure was performed [Gastien R, Corbellani CE, Sade M, Lovey FC. Acta Mater 2005;53:1685]. In this paper, a thermodynamic analysis of this effect using σ-T diagrams is proposed, in order to obtain a proper estimation of the energy involved in the inhibition process for pseudoelastic β ↔ β' + γ' cycling

  5. Thermal analysis and phase diagrams of the LiF BiF3 e NaF BiF3 systems

    International Nuclear Information System (INIS)

    Nakamura, Gerson Hiroshi de Godoy

    2013-01-01

    Investigations of the binary systems LiF-BiF3 and NaF-BiF3 were performed with the objective of clarifying the thermal behavior and phase equilibria of these systems and their intermediary phases, an important requisite for high-quality crystal growth. Several samples in the entire range of compositions (0 to 100 mol% BiF3) of both systems were subjected to experiments of differential thermal analysis (DTA) and thermogravimetry (TG), and also of differential scanning calorimetry (DSC). A few specific compositions were selected for X-ray diffraction to supplement the experimental data. Due to the high vulnerability of BiF3 to oxygen contamination, its volatility and propensity to destroy metal parts upon heating, it was necessary to determine the optimal conditions for thermal analysis before investigating the systems themselves. Phase relations in the system LiF-BiF3 were completely clarified and a phase diagram was calculated and evaluated via the commercial software FactSage. The diagram itself consists in a simple peritectic system in which the only intermediary compound, LiBiF4, decomposes into LiF and a liquid phase. The NaF-BiF3 system could not be completely elucidated and the phase relations in the NaF-poor side (> 50% BiF3) are still unknown. In the NaF-rich side, however, the possible peritectoid decomposition of the compound NaBiF4 was identified. In both systems X-ray diffraction yielded crystal structures discrepant with the literature for the intermediary phases, LiBiF4, NaBiF4 and a solid solution of NaF and BiF3 called I. The observed structures remain unknown and explanations for the discrepancies were proposed. (author)

  6. Nuclear power plant control room task analysis. Pilot study for pressurized water reactors

    International Nuclear Information System (INIS)

    Barks, D.B.; Kozinsky, E.J.; Eckel, S.

    1982-05-01

    The purposes of this nuclear plant task analysis pilot study were: to demonstrate the use of task analysis techniques on selected abnormal or emergency operation events in a nuclear power plant; to evaluate the use of simulator data obtained from an automated Performance Measurement System (PMS) to supplement and validate data obtained by traditional task analysis methods; and to demonstrate sample applications of task analysis data to address questions pertinent to nuclear power plant operational safety: control room layout, staffing and training requirements, operating procedures, interpersonal communications, and job performance aids. Five data sources were investigated to provide information for the task analysis. These sources were (1) written operating procedures (event-based); (2) interviews with subject matter experts (the control room operators); (3) videotapes of the control room operators (senior reactor operators and reactor operators) while responding to each event in a simulator; (4) walk-/talk-throughs conducted by control room operators for each event; and (5) simulator data from the PMS

  7. A Cognitive Analysis of Armor Procedural Task Training

    Science.gov (United States)

    1982-03-01

    Verbal Behavior, 8, 323-343. Craik, F. I. M., & Lockhart, R. S. (1972). Levels of processing: A framework for memory research. Journal of Verbal Learning... conceptual or meaningful) coding of the task to be learned (e.g., Bjork, 1975; Craik & Lockhart, 1972; Melton & Martin, 1972). In order to remember a... were several serious problems with applying this approach in the context of entry-level military training. In particular, the soldier did not always

  8. The application of diagrams in architectural design

    Directory of Open Access Journals (Sweden)

    Dulić Olivera

    2014-01-01

    Full Text Available Diagrams in architecture represent the visualization of the thinking process, or a selective abstraction of concepts or ideas translated into the form of drawings. In addition, they provide insight into the way of thinking about and in architecture, thus creating a balance between the visual and the conceptual. The subject of the research presented in this paper is diagrams as a specific kind of architectural representation, and the possibilities and importance of their application in the design process. Diagrams are almost as old as architecture itself, and they are an element of some of the most important studies of architecture throughout history - which results in a large number of different definitions of diagrams, but also very different conceptualizations of their features, functions and applications. Diagrams became part of contemporary architectural discourse during the eighties and nineties of the twentieth century, especially through the work of architects like Bernard Tschumi, Peter Eisenman, Rem Koolhaas, SANAA and others. The use of diagrams in the design process allows the unification of some of the essential aspects of the profession: architectural representation and the design process, as well as the question of the concept of architectural and urban design at a time of rapid change at all levels of contemporary society. The aim of the research is the analysis of the diagram as a specific medium for processing the large amounts of information that the architect should consider and incorporate into the architectural work. On that basis, it is assumed that an architectural diagram allows the creator to identify and analyse specific elements or ideas of physical form, thereby constantly maintaining the concept of the integrity of the architectural work.

  9. Inter-subject phase synchronization for exploratory analysis of task-fMRI.

    Science.gov (United States)

    Bolt, Taylor; Nomi, Jason S; Vij, Shruti G; Chang, Catie; Uddin, Lucina Q

    2018-08-01

    Analysis of task-based fMRI data is conventionally carried out using a hypothesis-driven approach, where blood-oxygen-level dependent (BOLD) time courses are correlated with a hypothesized temporal structure. In some experimental designs, this temporal structure can be difficult to define. In other cases, experimenters may wish to take a more exploratory, data-driven approach to detecting task-driven BOLD activity. In this study, we demonstrate the efficiency and power of an inter-subject synchronization approach for exploratory analysis of task-based fMRI data. Combining the tools of instantaneous phase synchronization and independent component analysis, we characterize whole-brain task-driven responses in terms of group-wise similarity in temporal signal dynamics of brain networks. We applied this framework to fMRI data collected during performance of a simple motor task and a social cognitive task. Analyses using an inter-subject phase synchronization approach revealed a large number of brain networks that dynamically synchronized to various features of the task, often not predicted by the hypothesized temporal structure of the task. We suggest that this methodological framework, along with readily available tools in the fMRI community, provides a powerful exploratory, data-driven approach for analysis of task-driven BOLD activity. Copyright © 2018 Elsevier Inc. All rights reserved.
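
    The instantaneous phase synchronization step can be sketched with the Hilbert transform: each subject's regional time course is converted to an analytic signal, and the consistency of its phase across subjects at each time point is summarized as a phase-locking value. The signals below are synthetic placeholders, not fMRI data.

        # Sketch: inter-subject phase synchronization of a band-limited time course.
        # Each row of `signals` stands for one subject's regional BOLD series
        # (synthetic here); the output is the phase-locking value across subjects.
        import numpy as np
        from scipy.signal import hilbert

        rng = np.random.default_rng(1)
        t = np.arange(200)
        common = np.sin(2 * np.pi * 0.05 * t)                        # shared task-driven component
        signals = common + 0.5 * rng.standard_normal((10, t.size))   # 10 "subjects"

        phases = np.angle(hilbert(signals, axis=1))       # instantaneous phase per subject
        plv = np.abs(np.exp(1j * phases).mean(axis=0))    # length of mean unit phasor
        print(plv.mean())   # closer to 1 = stronger synchronization across subjects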

  10. Peircean diagrams of time

    DEFF Research Database (Denmark)

    Øhrstrøm, Peter

    2011-01-01

    Some very good arguments can be given in favor of the Augustinean wisdom, according to which it is impossible to provide a satisfactory definition of the concept of time. However, even in the absence of a proper definition, it is possible to deal with conceptual problems regarding time. It can be done in terms of analogies and metaphors. In particular, it is attractive to make use of Peirce's diagrams, by means of which various kinds of conceptual experimentation can be carried out. This paper investigates how Peircean diagrams can be used within the study of time. In particular, we discuss 1) the topological properties of time, 2) the implicative structure in tense logic, 3) the notions of open future and branching time models, and finally 4) tense-logical alternatives to branching time models.

  11. Control wiring diagrams

    International Nuclear Information System (INIS)

    McCauley, T.M.; Eskinazi, M.; Henson, L.L.

    1989-01-01

    This paper discusses the changes in electrical document requirements that occur when construction is complete and a generating station starts commercial operation. The needs of operations and maintenance (O and M) personnel are analyzed and contrasted with those of construction to illustrate areas in which the construction documents (drawings, diagrams, and databases) are difficult to use for work at an operating station. The paper discusses the O and M electrical documents that the Arizona Nuclear Power Project (ANPP) believes are most beneficial for the three operating units at Palo Verde; these are control wiring diagrams and an associated document cross-reference list. The benefits offered by these new, station O and M-oriented documents are weighed against the cost of their creation and their impact on drawing maintenance

  12. TEP process flow diagram

    Energy Technology Data Exchange (ETDEWEB)

    Wilms, R Scott [Los Alamos National Laboratory; Carlson, Bryan [Los Alamos National Laboratory; Coons, James [Los Alamos National Laboratory; Kubic, William [Los Alamos National Laboratory

    2008-01-01

    This presentation describes the development of the proposed Process Flow Diagram (PFD) for the Tokamak Exhaust Processing System (TEP) of ITER. A brief review of the design efforts leading up to the PFD is followed by a description of the hydrogen-like, air-like, and water-like processes. Two new design values are described: the most-common and most-demanding design values. The proposed PFD is shown to meet specifications under both the most-common and most-demanding design values.

  13. Feynman diagram drawing made easy

    International Nuclear Information System (INIS)

    Baillargeon, M.

    1997-01-01

    We present a drawing package optimised for Feynman diagrams. These can be constructed interactively with a mouse-driven graphical interface or from a script file, which is more suitable for working with a diagram generator. The package provides most features encountered in Feynman diagrams and allows the user to modify every part of a diagram after its creation. Special attention has been paid to obtaining a high-quality printout as easily as possible. This package is written in Tcl/Tk and in C. (orig.)

  14. Using task analysis to improve the requirements elicitation in health information system.

    Science.gov (United States)

    Teixeira, Leonor; Ferreira, Carlos; Santos, Beatriz Sousa

    2007-01-01

    This paper describes the application of task analysis within the design process of a Web-based information system for managing clinical information in hemophilia care, in order to improve the requirements elicitation and, consequently, to validate the domain model obtained in a previous phase of the design process (system analysis). The use of task analysis in this case proved to be a practical and efficient way to improve the requirements engineering process by involving users in the design process.

  15. Ring diagrams and phase transitions

    International Nuclear Information System (INIS)

    Takahashi, K.

    1986-01-01

    Ring diagrams at finite temperature carry the most infrared-singular parts among Feynman diagrams. Their effect on effective potentials is in general so significant that one must incorporate them as well as 1-loop diagrams. The author illustrates these circumstances with some examples of supercooled phase transitions

  16. Automation of Feynman diagram evaluations

    International Nuclear Information System (INIS)

    Tentyukov, M.N.

    1998-01-01

    A C program, DIANA (DIagram ANAlyser), for the automation of Feynman diagram evaluations is presented. It consists of two parts: the analyzer of diagrams and the interpreter of a special text-manipulating language. This language can be used to create source code for analytical or numerical evaluations and to keep control of the process in general

  17. State diagram of Pr-Bi system

    International Nuclear Information System (INIS)

    Abulkhaev, V.L.; Ganiev, I.N.

    1994-01-01

    By means of differential thermal analysis, X-ray analysis and microstructural analysis, the state diagram of the Pr-Bi system was studied. The following intermetallic compounds were identified in the system: Pr2Bi, Pr5Bi3, Pr4Bi3, PrBi and PrBi2. Analysis of the data on Ln-Bi diagrams made it possible to determine the regularity of the change in properties of the intermetallic compounds along the series of rare earth elements of the cerium subgroup.

  18. Using Heuristic Task Analysis to Create Web-Based Instructional Design Theory

    Science.gov (United States)

    Fiester, Herbert R.

    2010-01-01

    The first purpose of this study was to identify procedural and heuristic knowledge used when creating web-based instruction. The second purpose of this study was to develop suggestions for improving the Heuristic Task Analysis process, a technique for eliciting, analyzing, and representing expertise in cognitively complex tasks. Three expert…

  19. Increasing Pizza Box Assembly Using Task Analysis and a Least-to-Most Prompting Hierarchy

    Science.gov (United States)

    Stabnow, Erin F.

    2015-01-01

    This study was designed to use a task analysis and a least-to-most prompting hierarchy to teach students with cognitive disabilities pizza box assembly skills. The purpose of this study was to determine whether a least-to-most prompting hierarchy was effective in teaching students with cognitive disabilities to increase the number of task-analyzed…

  20. THE ANALYSIS OF CAUSES AND EFFECTS OF A PHENOMENON BY MEANS OF THE “FISHBONE” DIAGRAM

    Directory of Open Access Journals (Sweden)

    ECOBICI MIHAELA LOREDANA

    2017-10-01

    Full Text Available The risk has been and will remain one of the main problems that any company is confronted with during its activity, regardless of its type. Whether we are talking about a financial activity, a production activity, a management activity, etc., the risk is a matter which should not be neglected. The motivation behind a detailed analysis of risks lies in their complexity and multiple effects, in the need for security, in the desire for the safe development of the company's activity, the development of safe and cost-effective projects, the implementation of high-performance and safe technologies, etc. Depending on the field of activity, the risk is dealt with in different ways. Whether we are talking about a transaction, a project, an organization, an asset or a monetary flow, the risk is composed of two elements: the probability of the occurrence of an event that could affect one of the units of analysis mentioned above and the effect that this has on the unit of analysis. Therefore, the manifestation of risk puts the organization in a difficult situation for which its management is supposed to find viable solutions. In the specialized literature there are many methods and techniques for managing the emergence and manifestation of risks, and one of these is the fishbone technique. Research in the field has led to the gathering and analysis of information regarding this analytical tool. The results of the theoretical and practical research have resulted in a multitude of studies and research papers on such analysis tools that help improve understanding and identify the underlying causes of certain problems. The purpose of this article is to achieve a representation of the relationships between the possible effects and the possible causes that can influence a process, a phenomenon or an action.

  1. Reliability Block Diagram (RBD) Analysis of NASA Dryden Flight Research Center (DFRC) Flight Termination System and Power Supply

    Science.gov (United States)

    Morehouse, Dennis V.

    2006-01-01

    In order to perform public risk analyses for vehicles containing Flight Termination Systems (FTS), it is necessary for the analyst to know the reliability of each of the components of the FTS. These systems are typically divided into two segments: a transmitter system and associated equipment, typically in a ground station or on a support aircraft, and a receiver system and associated equipment on the target vehicle. This analysis estimates the reliability of the NASA DFRC flight termination system ground transmitter segment for use in the larger risk analysis and compares the results against two established Department of Defense availability standards for such equipment.

  2. Task Analysis Questionnaire Booklet, Occupational Field 60/61.

    Science.gov (United States)

    1979-08-01

    112 " AND AbNVE MLY Th-r, I a neal fc I ’ (’ ’ 0 It , fo I [,w , n ’ ubjet:: (Thek Uioe app I I cit, I 1. ’,a 1 irvi nct lan. *,’:! Ielt .,’ ’Vn :’cv...the "TPMI S" >7" -. lumn ratin, the "TIME SPENT" in compa !-son t,, -ill other . you are required to do in your present job. ’:molelt the " TRAINING ...11h) ,o- .n. TURN TO I’T: NEXT PAGE IN T1t1’) D O ..’ AND START PART Ili, FU NOT I)NA!, TAiti7 II F. PART lII QUt’STIONNAIRE GENERAL TASKS i. Review

  3. Structural Design of HRA Database using generic task for Quantitative Analysis of Human Performance

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Seung Hwan; Kim, Yo Chan; Choi, Sun Yeong; Park, Jin Kyun; Jung Won Dea [KAERI, Daejeon (Korea, Republic of)

    2016-05-15

    This paper describes the design of a generic-task-based HRA database for the quantitative analysis of human performance, intended to estimate the number of task conductions. An estimation method that obtains the total number of task conductions by direct counting is not easy to realize, and its data collection framework is hard to maintain. To resolve this problem, this paper suggests an indirect method and a database structure using generic tasks that make it possible to estimate the total number of conductions based on the instructions of the operating procedures of nuclear power plants. In order to reduce human errors, all information on the human errors made by operators in the power plant should be systematically collected and examined in its management. The Korea Atomic Energy Research Institute (KAERI) is carrying out research to develop a data collection framework to establish a Human Reliability Analysis (HRA) database that could be employed as a technical basis to generate human error probabilities (HEPs) and performance shaping factors (PSFs). As a result of the study, the essential table schema was designed for the generic task database, which stores generic tasks, procedure lists and task tree structures, together with other supporting tables. Estimation of the number of task conductions from the operating procedures for HEP calculation was enabled through the generic task database and framework. To verify the applicability of the framework, a case study of the simulated experiments was performed and analyzed using graphical user interfaces developed in this study.

  4. Structural Design of HRA Database using generic task for Quantitative Analysis of Human Performance

    International Nuclear Information System (INIS)

    Kim, Seung Hwan; Kim, Yo Chan; Choi, Sun Yeong; Park, Jin Kyun; Jung Won Dea

    2016-01-01

    This paper describes the design of a generic-task-based HRA database for the quantitative analysis of human performance, intended to estimate the number of task conductions. An estimation method that obtains the total number of task conductions by direct counting is not easy to realize, and its data collection framework is hard to maintain. To resolve this problem, this paper suggests an indirect method and a database structure using generic tasks that make it possible to estimate the total number of conductions based on the instructions of the operating procedures of nuclear power plants. In order to reduce human errors, all information on the human errors made by operators in the power plant should be systematically collected and examined in its management. The Korea Atomic Energy Research Institute (KAERI) is carrying out research to develop a data collection framework to establish a Human Reliability Analysis (HRA) database that could be employed as a technical basis to generate human error probabilities (HEPs) and performance shaping factors (PSFs). As a result of the study, the essential table schema was designed for the generic task database, which stores generic tasks, procedure lists and task tree structures, together with other supporting tables. Estimation of the number of task conductions from the operating procedures for HEP calculation was enabled through the generic task database and framework. To verify the applicability of the framework, a case study of the simulated experiments was performed and analyzed using graphical user interfaces developed in this study.
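
    A minimal sketch of the kind of relational layout described, using SQLite: tables for generic tasks, procedures and procedure steps, from which the number of task conductions implied by the procedure instructions can be aggregated. All table and column names are hypothetical, not KAERI's actual schema.

        # Sketch: counting generic-task conductions from procedure steps.
        # Table and column names are hypothetical illustrations.
        import sqlite3

        con = sqlite3.connect(":memory:")
        con.executescript("""
        CREATE TABLE generic_task (id INTEGER PRIMARY KEY, name TEXT);
        CREATE TABLE procedure_doc (id INTEGER PRIMARY KEY, title TEXT);
        CREATE TABLE procedure_step (
            id INTEGER PRIMARY KEY,
            procedure_id INTEGER REFERENCES procedure_doc(id),
            step_no INTEGER,
            generic_task_id INTEGER REFERENCES generic_task(id)
        );
        """)
        con.executemany("INSERT INTO generic_task VALUES (?, ?)",
                        [(1, "Check indicator"), (2, "Manipulate valve")])
        con.execute("INSERT INTO procedure_doc VALUES (1, 'EOP-01')")
        con.executemany("INSERT INTO procedure_step VALUES (?, 1, ?, ?)",
                        [(1, 1, 1), (2, 2, 2), (3, 3, 1)])

        # Conductions per generic task implied by the procedure instructions.
        for name, n in con.execute("""
            SELECT g.name, COUNT(*) FROM procedure_step s
            JOIN generic_task g ON g.id = s.generic_task_id
            GROUP BY g.name"""):
            print(name, n)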

  5. Diagrams and Relational Maps: The Use of Graphic Elicitation Techniques with Interviewing for Data Collection, Analysis, and Display

    Directory of Open Access Journals (Sweden)

    Andrea J. Copeland PhD

    2012-12-01

    Full Text Available Graphic elicitation techniques, which ask research participants to provide visual data representing personal understandings of concepts, experiences, beliefs, or behaviors, can be especially useful in helping participants to express complex or abstract ideas or opinions. The benefits and drawbacks of using graphic elicitation techniques for data collection, data analysis, and data display in qualitative research studies are analyzed using examples from a research study that employed data matrices and relational maps in conjunction with semi-structured interviews. Results from this analysis demonstrate that the use of these combined techniques for data collection facilitates triangulation and helps to establish internal consistency of data, thereby increasing the trustworthiness of the interpretation of that data and lending support to validity and reliability claims. Findings support the notion that graphic elicitation techniques can be highly useful in qualitative research studies at the data collection, the data analysis, and the data reporting stages. For example, this study found that graphic elicitation techniques are especially useful for eliciting data related to emotions and emotional experiences.

  6. Systems analysis programs for hands-on integrated reliability evaluations (SAPHIRE) Version 5.0. Fault tree, event tree, and piping & instrumentation diagram (FEP) editors reference manual: Volume 7

    International Nuclear Information System (INIS)

    McKay, M.K.; Skinner, N.L.; Wood, S.T.

    1994-07-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) refers to a set of several microcomputer programs that were developed to create and analyze probabilistic risk assessments (PRAs), primarily for nuclear power plants. The Fault Tree, Event Tree, and Piping and Instrumentation Diagram (FEP) editors allow the user to graphically build and edit fault trees, event trees, and piping and instrumentation diagrams (P&IDs). The software is designed to enable the independent use of the graphics-based editors found in the Integrated Reliability and Risk Assessment System (IRRAS). FEP is comprised of three separate editors (Fault Tree, Event Tree, and Piping and Instrumentation Diagram) and a utility module. This reference manual provides a screen-by-screen guide to the entire FEP System

  7. Staff Performance Analysis: A Method for Identifying Brigade Staff Tasks

    National Research Council Canada - National Science Library

    Ford, Laura

    1997-01-01

    ... members of conventional mounted brigade staff. Initial analysis of performance requirements in existing documentation revealed that the performance specifications were not sufficiently detailed for brigade battle staffs...

  8. Quantum MHV Diagrams

    OpenAIRE

    Brandhuber, Andreas; Travaglini, Gabriele

    2006-01-01

    Over the past two years, the use of on-shell techniques has deepened our understanding of the S-matrix of gauge theories and led to the calculation of many new scattering amplitudes. In these notes we review a particular on-shell method developed recently, the quantum MHV diagrams, and discuss applications to one-loop amplitudes. Furthermore, we briefly discuss the application of D-dimensional generalised unitarity to the calculation of scattering amplitudes in non-supersymmetric Yang-Mills.

  9. Linkage Technologies Which Enhance the Utility of Task-Based Occupational Analysis

    National Research Council Canada - National Science Library

    Phalen, William

    1999-01-01

    .... It is alleged that traditional task-based occupational analysis is too labor intensive, too costly, too cumbersome, and too static to meet the emerging and rapidly changing needs of a business...

  10. Task versus relationship conflict, team performance and team member satisfaction: a meta-analysis

    NARCIS (Netherlands)

    de Dreu, C.K.W.; Weingart, L.R.

    2003-01-01

    This study provides a meta-analysis of research on the associations between relationship conflict, task conflict, team performance, and team member satisfaction. Consistent with past theorizing, results revealed strong and negative correlations between relationship conflict, team performance, and team member satisfaction.

  11. Cognitive and collaborative demands of freight conductor activities: results and implications of a cognitive task analysis

    Science.gov (United States)

    2012-07-31

    This report presents the results of a cognitive task analysis (CTA) that examined the cognitive and collaborative demands placed on conductors, as well as the knowledge and skills that experienced conductors have developed that enable them to operate...

  12. A Cognitive Task Analysis for an Emergency Management Serious Game.

    Science.gov (United States)

    Dass, Susan; Barnieu, Joanne; Cummings, Paul; Cid, Victor

    2016-01-01

    The Bethesda Hospitals' Emergency Preparedness Partnership identified a need to design training systems for hospital emergency management scenarios that included incident command situations. As part of this partnership, the National Library of Medicine (NLM) was challenged to develop an engaging, learner-centered simulation to specifically address hospital procedures for highly infectious diseases (HIDs) for multiple hospital roles. A serious game approach was selected for the simulation because collaborative (multiplayer) immersive, game-based simulations have been proven to generate realistic and engaging learning experiences and, when properly designed, can enhance training while minimizing cost compared to full-scale disaster exercises (Spain et al., 2013). Although substantial research effort has been put into the design and evaluation of serious games, less time has been spent on developing sound instructional design methodologies to support serious game development. So how does one collect the appropriate, relevant, contextualized content and then align it with serious game design elements? This paper describes how a cognitive task analysis approach supported by a live demonstration with a think-aloud protocol was used to collect the rich psychomotor, procedural, and cognitive data necessary for the design of a serious game for handling HIDs. Furthermore, the paper presents a process to translate the collected data into meaningful content to support rapid prototyping. Recommendations for data collection and translation for a serious game close the paper.

  13. A biomechanical analysis of common lunge tasks in badminton.

    Science.gov (United States)

    Kuntze, Gregor; Mansfield, Neil; Sellers, William

    2010-01-01

    The lunge is regularly used in badminton and is recognized for the high physical demands it places on the lower limbs. Despite its common occurrence, little information is available on the biomechanics of lunging in the singles game. A video-based pilot study confirmed the relatively high frequency of lunging, approximately 15% of all movements, in competitive singles games. The biomechanics and performance characteristics of three badminton-specific lunge tasks (kick, step-in, and hop lunge) were investigated in the laboratory with nine experienced male badminton players. Ground reaction forces and kinematic data were collected and lower limb joint kinetics calculated using an inverse dynamics approach. The step-in lunge was characterized by significantly lower mean horizontal reaction force at drive-off and lower mean peak hip joint power than the kick lunge. The hop lunge resulted in significantly larger mean reaction forces during loading and drive-off phases, as well as significantly larger mean peak ankle joint moments and knee and ankle joint powers than the kick or step-in lunges. These findings indicate that, within the setting of this investigation, the step-in lunge may be beneficial for reducing the muscular demands of lunge recovery and that the hop lunge allows for higher positive power output, thereby presenting an efficient lunging method.

  14. Dashboard Task Monitor for Managing ATLAS User Analysis on the Grid

    Science.gov (United States)

    Sargsyan, L.; Andreeva, J.; Jha, M.; Karavakis, E.; Kokoszkiewicz, L.; Saiz, P.; Schovancova, J.; Tuckett, D.; Atlas Collaboration

    2014-06-01

    The organization of the distributed user analysis on the Worldwide LHC Computing Grid (WLCG) infrastructure is one of the most challenging tasks among the computing activities at the Large Hadron Collider. The Experiment Dashboard offers a solution that not only monitors but also manages (kill, resubmit) user tasks and jobs via a web interface. The ATLAS Dashboard Task Monitor provides analysis users with a tool that is independent of the operating system and Grid environment. This contribution describes the functionality of the application and its implementation details, in particular authentication, authorization and audit of the management operations.

  15. Dashboard task monitor for managing ATLAS user analysis on the grid

    International Nuclear Information System (INIS)

    Sargsyan, L; Andreeva, J; Karavakis, E; Saiz, P; Tuckett, D; Jha, M; Kokoszkiewicz, L; Schovancova, J

    2014-01-01

    The organization of the distributed user analysis on the Worldwide LHC Computing Grid (WLCG) infrastructure is one of the most challenging tasks among the computing activities at the Large Hadron Collider. The Experiment Dashboard offers a solution that not only monitors but also manages (kill, resubmit) user tasks and jobs via a web interface. The ATLAS Dashboard Task Monitor provides analysis users with a tool that is independent of the operating system and Grid environment. This contribution describes the functionality of the application and its implementation details, in particular authentication, authorization and audit of the management operations.

  16. Pilot-model analysis and simulation study of effect of control task desired control response

    Science.gov (United States)

    Adams, J. J.; Gera, J.; Jaudon, J. B.

    1978-01-01

    A pilot model analysis was performed that relates pilot control compensation, pilot aircraft system response, and aircraft response characteristics for longitudinal control. The results show that a higher aircraft short period frequency is required to achieve superior pilot aircraft system response in an altitude control task than is required in an attitude control task. These results were confirmed by a simulation study of target tracking. It was concluded that the pilot model analysis provides a theoretical basis for determining the effect of control task on pilot opinions.

  17. Task 2 Report: Algorithm Development and Performance Analysis

    Science.gov (United States)

    1993-07-01

    [Only fragments of this report survive in the record: list-of-figures entries (Figures 7-16 and 7-18, example GC data for Schedule 3 phosphites illustrating different peak-integration and baseline-following analysis methods) and part of a sentence noting that correlated chromatography avoids much of the ambiguity that can arise in GC/MS with trace environmental samples.]

  18. Considerations for Task Analysis Methods and Rapid E-Learning Development Techniques

    Directory of Open Access Journals (Sweden)

    Dr. Ismail Ipek

    2014-02-01

    Full Text Available The purpose of this paper is to provide basic dimensions for rapid training development in e-learning courses in education and business. It starts by defining task analysis, how to select tasks for analysis, and task analysis methods for instructional design. To do this, learning and instructional technologies as visions of the future were discussed first. Second, the importance of task analysis methods in rapid e-learning was considered, together with learning technologies for asynchronous and synchronous e-learning development. Finally, rapid instructional design concepts and e-learning design strategies were defined and clarified with examples; that is, all steps for effective task analysis and rapid training development based on learning and instructional design approaches, such as m-learning and other delivery systems, were discussed. As a result, the concept of task analysis, rapid e-learning development strategies, and the essentials of online course design were discussed, alongside learner interface design features for learners and designers.

  19. Initiating an ergonomic analysis. A process for jobs with highly variable tasks.

    Science.gov (United States)

    Conrad, K M; Lavender, S A; Reichelt, P A; Meyer, F T

    2000-09-01

    Occupational health nurses play a vital role in addressing ergonomic problems in the workplace. Describing and documenting exposure to ergonomic risk factors is a relatively straightforward process in jobs in which the work is repetitive. In other types of work, the analysis becomes much more challenging because tasks may be repeated infrequently, or at irregular time intervals, or under different environmental and temporal conditions, thereby making it difficult to observe a "representative" sample of the work performed. This article describes a process used to identify highly variable job tasks for ergonomic analyses. The identification of tasks for ergonomic analysis was a two step process involving interviews and a survey of firefighters and paramedics from a consortium of 14 suburban fire departments. The interviews were used to generate a list of frequently performed, physically strenuous job tasks and to capture clear descriptions of those tasks and associated roles. The goals of the survey were to confirm the interview findings across the entire target population and to quantify the frequency and degree of strenuousness of each task. In turn, the quantitative results from the survey were used to prioritize job tasks for simulation. Although this process was used to study firefighters and paramedics, the approach is likely to be suitable for many other types of occupations in which the tasks are highly variable in content and irregular in frequency.

  20. A New Look at the Eclipse Timing Variation Diagram Analysis of Selected 3-body W UMa Systems

    Science.gov (United States)

    Christopoulou, P.-E.; Papageorgiou, A.

    2015-07-01

    The light travel effect produced by the presence of tertiary components can reveal much about the origin and evolution of over-contact binaries. Monitoring of W UMa systems over the last decade and/or the use of publicly available photometric surveys (NSVS, ASAS, etc.) has uncovered or suggested the presence of many unseen companions, which calls for an in-depth investigation of the parameters derived from cyclic period variations in order to confirm or reject the assumption of hidden companion(s). Progress in the analysis of eclipse timing variations is summarized here both from the empirical and the theoretical points of view, and a more extensive investigation of the proposed orbital parameters of third bodies is proposed. The code we have developed for this, implemented in Python, is set up to handle heuristic scanning with parameter perturbation in parameter space, and to establish realistic uncertainties from the least squares fitting. A computational example is given for TZ Boo, a W UMa system with a spectroscopically detected third component. Future options to be implemented include MCMC and bootstrapping.
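
    The record says the authors' Python code performs heuristic parameter scanning and least-squares fitting of the eclipse timing variations, but gives no implementation; the sketch below only illustrates the general idea of fitting a linear-ephemeris-plus-light-travel-time term to O-C residuals. The model form (circular third-body orbit), the parameter names, and the use of scipy.optimize.least_squares are assumptions, not the authors' code.

```python
import numpy as np
from scipy.optimize import least_squares

def oc_model(params, epoch):
    """Eclipse timing model: linear ephemeris correction plus a
    light-travel-time term for a circular third-body orbit.
    dT0, dP -- corrections to reference epoch and binary period (days)
    A       -- semi-amplitude of the light-travel effect (days)
    P3      -- third-body orbital period (in binary cycles)
    phi     -- orbital phase offset (radians)"""
    dT0, dP, A, P3, phi = params
    return dT0 + dP * epoch + A * np.sin(2 * np.pi * epoch / P3 + phi)

def fit_eclipse_timings(epoch, oc, sigma, p0):
    """Weighted least-squares fit of the O-C curve; the returned solution
    object carries the jacobian needed for approximate uncertainties."""
    resid = lambda p: (oc - oc_model(p, epoch)) / sigma
    return least_squares(resid, p0)

# Synthetic illustration with invented parameter values
epoch = np.arange(0, 8000, 50, dtype=float)
true = (0.001, 1e-7, 0.006, 5200.0, 0.3)
oc = oc_model(true, epoch) + np.random.normal(0, 0.002, epoch.size)
sol = fit_eclipse_timings(epoch, oc, 0.002, p0=(0.0, 0.0, 0.005, 5000.0, 0.0))
print(sol.x)
```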

  1. Task-related component analysis for functional neuroimaging and application to near-infrared spectroscopy data.

    Science.gov (United States)

    Tanaka, Hirokazu; Katura, Takusige; Sato, Hiroki

    2013-01-01

    Reproducibility of experimental results lies at the heart of scientific disciplines. Here we propose a signal processing method that extracts task-related components by maximizing the reproducibility during task periods from neuroimaging data. Unlike hypothesis-driven methods such as general linear models, no specific time courses are presumed, and unlike data-driven approaches such as independent component analysis, no arbitrary interpretation of components is needed. Task-related components are constructed by a linear, weighted sum of multiple time courses, and its weights are optimized so as to maximize inter-block correlations (CorrMax) or covariances (CovMax). Our analysis method is referred to as task-related component analysis (TRCA). The covariance maximization is formulated as a Rayleigh-Ritz eigenvalue problem, and corresponding eigenvectors give candidates of task-related components. In addition, a systematic statistical test based on eigenvalues is proposed, so task-related and -unrelated components are classified objectively and automatically. The proposed test of statistical significance is found to be independent of the degree of autocorrelation in data if the task duration is sufficiently longer than the temporal scale of autocorrelation, so TRCA can be applied to data with autocorrelation without any modification. We demonstrate that simple extensions of TRCA can provide most distinctive signals for two tasks and can integrate multiple modalities of information to remove task-unrelated artifacts. TRCA was successfully applied to synthetic data as well as near-infrared spectroscopy (NIRS) data of finger tapping. There were two statistically significant task-related components; one was a hemodynamic response, and another was a piece-wise linear time course. In summary, we conclude that TRCA has a wide range of applications in multi-channel biophysical and behavioral measurements. Copyright © 2012 Elsevier Inc. All rights reserved.
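
    A minimal sketch of the covariance-maximization (CovMax) variant described above, for data already segmented into task blocks: inter-block covariances of the weighted channel sum are maximized by solving a generalized (Rayleigh-Ritz) eigenvalue problem. Variable names, the normalization by the total covariance, and the synthetic example are assumptions rather than the authors' exact implementation.

```python
import numpy as np
from scipy.linalg import eigh

def trca_weights(blocks):
    """Task-related component analysis, covariance-maximization flavour.
    blocks: array of shape (n_blocks, n_channels, n_samples), one entry per
    task block. Returns channel weights sorted by decreasing eigenvalue."""
    n_blocks, n_channels, _ = blocks.shape
    X = np.concatenate(blocks, axis=1)       # all blocks side by side
    X = X - X.mean(axis=1, keepdims=True)
    Q = X @ X.T                              # total covariance (channels x channels)
    S = np.zeros((n_channels, n_channels))
    for i in range(n_blocks):                # sum of inter-block covariances
        for j in range(n_blocks):
            if i != j:
                xi = blocks[i] - blocks[i].mean(axis=1, keepdims=True)
                xj = blocks[j] - blocks[j].mean(axis=1, keepdims=True)
                S += xi @ xj.T
    # Rayleigh-Ritz: maximize w' S w subject to a fixed w' Q w.
    eigvals, eigvecs = eigh(S, Q)
    order = np.argsort(eigvals)[::-1]
    return eigvecs[:, order], eigvals[order]

# Synthetic illustration: 10 blocks, 8 channels, 200 samples per block
rng = np.random.default_rng(0)
signal = np.sin(np.linspace(0, 4 * np.pi, 200))  # reproducible task-related course
blocks = 0.5 * rng.standard_normal((10, 8, 200))
blocks[:, 0, :] += signal                        # channel 0 carries the task signal
W, lam = trca_weights(blocks)
```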

  2. Advancing the Certified in Public Health Examination: A Job Task Analysis.

    Science.gov (United States)

    Kurz, Richard S; Yager, Christopher; Yager, James D; Foster, Allison; Breidenbach, Daniel H; Irwin, Zachary

    In 2014, the National Board of Public Health Examiners performed a job task analysis (JTA) to revise the Certified in Public Health (CPH) examination. The objectives of this study were to describe the development, administration, and results of the JTA survey; to present an analysis of the survey results; and to review the implications of this first-ever public health JTA. An advisory committee of public health professionals developed a list of 200 public health job tasks categorized into 10 work domains. The list of tasks was incorporated into a web-based survey, and a snowball sample of public health professionals provided 4850 usable responses. Respondents rated job tasks as essential (4), very important (3), important (2), not very important (1), and never performed (0). The mean task importance ratings ranged from 2.61 to 3.01 (important to very important). The highest mean ratings were for tasks in the ethics domain (mean rating, 3.01). Respondents ranked 10 of the 200 tasks as the most important, with mean task rankings ranging from 2.98 to 3.39. We found subtle differences between male and female respondents and between master of public health and doctor of public health respondents in their rankings. The JTA established a set of job tasks in 10 public health work domains, and the results provided a foundation for refining the CPH examination. Additional steps are needed to further modify the content outline of the examination. An empirical assessment of public health job tasks, using methods such as principal components analysis, may provide additional insight.
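
    As a small worked illustration of the rating arithmetic described in the record (the 0-4 scale is taken from the abstract, but the task names and ratings below are invented), mean importance ratings can be computed per task and ranked:

```python
import numpy as np

# Rating scale from the record: essential=4, very important=3, important=2,
# not very important=1, never performed=0. Rows = respondents, columns = tasks.
ratings = np.array([
    [4, 3, 2],
    [3, 3, 1],
    [4, 2, 2],
    [3, 4, 0],
])
task_names = ["apply ethical principles", "interpret surveillance data",
              "draft agency budget"]  # invented example tasks

mean_importance = ratings.mean(axis=0)
for name, score in sorted(zip(task_names, mean_importance),
                          key=lambda t: t[1], reverse=True):
    print(f"{name}: {score:.2f}")
```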

  3. Development of a Support Tool to Detect Inconsistencies between Sequence Diagrams and Use Case Diagrams (Pembuatan Kakas Bantu untuk Mendeteksi Ketidaksesuaian Diagram Urutan dengan Diagram Kasus Penggunaan)

    Directory of Open Access Journals (Sweden)

    Andrias Meisyal Yuwantoko

    2017-03-01

    Full Text Available A sequence diagram is created based on the flow given in a use case description. That flow is represented as interactions between actors and the system. Sequence diagram designs need to be checked to detect mismatches between the order of the use case flow and the order of the messages sent by the objects in the sequence diagram. A consistent diagram design is key to the correctness of the software implementation. However, this consistency check is still performed manually, which becomes a problem when a software project has many diagram designs and human resources are insufficient: checking takes a long time and affects the software development schedule. This study proposes a support tool for detecting inconsistencies between sequence diagrams and use case diagrams. Inconsistency is assessed from the semantic similarity between the sentences of the flow in the use case description and triplets. The resulting tool is able to detect inconsistencies between sequence diagrams and use case diagrams, and it is expected not only to assist diagram design checking but also to speed up software development.
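
    The record describes the detection step only as a semantic-similarity comparison between use-case flow sentences and message triplets; the sketch below shows one way such a check could look. The bag-of-words cosine measure, the 0.5 threshold, and the sample sentences are assumptions, not the tool's actual method.

```python
from collections import Counter
import math

def cosine_similarity(a: str, b: str) -> float:
    """Bag-of-words cosine similarity between two short sentences."""
    va, vb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(va[t] * vb[t] for t in set(va) & set(vb))
    norm = math.sqrt(sum(v * v for v in va.values())) * \
           math.sqrt(sum(v * v for v in vb.values()))
    return dot / norm if norm else 0.0

def flag_mismatches(use_case_steps, message_triplets, threshold=0.5):
    """Pair each use-case step with the triplet (sender, message, receiver)
    at the same position and flag pairs whose similarity is too low.
    A real tool would normalize identifiers (e.g. split camelCase) first."""
    flags = []
    for step, (sender, message, receiver) in zip(use_case_steps, message_triplets):
        triplet_text = f"{sender} {message} {receiver}"
        if cosine_similarity(step, triplet_text) < threshold:
            flags.append((step, triplet_text))
    return flags

# Invented illustration: the second step does not match its triplet
steps = ["the user submits the login form", "the system validates the credentials"]
triplets = [("user", "submits login form", "system"),
            ("system", "deletes account", "database")]
print(flag_mismatches(steps, triplets))
```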

  4. Visual Cluster Analysis for Computing Tasks at Workflow Management System of the ATLAS Experiment

    CERN Document Server

    Grigoryeva, Maria; The ATLAS collaboration

    2018-01-01

    Hundreds of petabytes of experimental data in high energy and nuclear physics (HENP) have already been obtained by unique scientific facilities such as LHC, RHIC, and KEK. As the accelerators are modernized (energy and luminosity increased), data volumes are growing rapidly and have reached the exabyte scale, which also increases the number of analysis and data processing tasks competing continuously for computational resources. This growth in processing tasks is met by raising the capacity of the computing environment through the involvement of high-performance computing resources, forming a heterogeneous distributed computing environment (hundreds of distributed computing centers). In addition, errors occur while executing data analysis and processing tasks, caused by software and hardware failures. With a distributed model of data processing and analysis, the optimization of data management and workload systems becomes a fundamental task, and the ...

  5. Constituent phase diagrams of the Al-Cu-Fe-Mg-Ni-Si system and their application to the analysis of aluminium piston alloys

    Energy Technology Data Exchange (ETDEWEB)

    Belov, N.A. [Moscow Institute of Steel and Alloys, Leninsky prosp. 4, Moscow 119049 (Russian Federation); Eskin, D.G. [Netherlands Institute for Metals Research, Rotterdamseweg 137, 2628AL Delft (Netherlands)]. E-mail: deskin@nimr.nl; Avxentieva, N.N. [Moscow Institute of Steel and Alloys, Leninsky prosp. 4, Moscow 119049 (Russian Federation)

    2005-10-15

    The evaluation of phase equilibria in quinary systems that constitute the commercially important Al-Cu-Fe-Mg-Ni-Si alloying system is performed in the compositional range of casting alloys by means of metallography, electron probe microanalysis, X-ray diffractometry, differential scanning calorimetry, and by the analysis of phase equilibria in the constituent systems of lesser dimensionality. Suggested phase equilibria are illustrated by bi-, mono- and invariant solidification reactions, polythermal diagrams of solidification, distributions of phase fields in the solid state, and isothermal and polythermal sections. Phase composition of as-cast alloys is analyzed in terms of non-equilibrium solidification. It is shown that the increase in copper concentration in piston Al-Si alloys results in the decrease in the equilibrium solidus from 540 to 505 °C. Under non-equilibrium solidification conditions, piston alloys finish solidification at ~505 °C. Iron is bound in the quaternary Al₈FeMg₃Si₆ phase in low-iron alloys and in the ternary Al₉FeNi and Al₅FeSi phases in high-iron alloys.

  6. Nuclear power plant control room crew task analysis database: SEEK system. Users manual

    International Nuclear Information System (INIS)

    Burgy, D.; Schroeder, L.

    1984-05-01

    The Crew Task Analysis SEEK Users Manual was prepared for the Office of Nuclear Regulatory Research of the US Nuclear Regulatory Commission. It is designed for use with the existing computerized Control Room Crew Task Analysis Database. The SEEK system consists of a PR1ME computer with its associated peripherals and software augmented by General Physics Corporation SEEK database management software. The SEEK software programs provide the Crew Task Database user with rapid access to any number of records desired. The software uses English-like sentences to allow the user to construct logical sorts and outputs of the task data. Given the multiple-associative nature of the database, users can directly access the data at the plant, operating sequence, task or element level - or any combination of these levels. A complete description of the crew task data contained in the database is presented in NUREG/CR-3371, Task Analysis of Nuclear Power Plant Control Room Crews (Volumes 1 and 2)

  7. Task versus relationship conflict, team performance, and team member satisfaction: a meta-analysis.

    Science.gov (United States)

    De Dreu, Carsten K W; Weingart, Laurie R

    2003-08-01

    This study provides a meta-analysis of research on the associations between relationship conflict, task conflict, team performance, and team member satisfaction. Consistent with past theorizing, results revealed strong and negative correlations between relationship conflict, team performance, and team member satisfaction. In contrast to what has been suggested in both academic research and introductory textbooks, however, results also revealed strong and negative (instead of the predicted positive) correlations between task conflict, team performance, and team member satisfaction. As predicted, conflict had stronger negative relations with team performance in highly complex (decision making, project, mixed) than in less complex (production) tasks. Finally, task conflict was less negatively related to team performance when task conflict and relationship conflict were weakly, rather than strongly, correlated.

  8. Task 11 - systems analysis of environmental management technologies

    Energy Technology Data Exchange (ETDEWEB)

    Musich, M.A.

    1997-06-01

    A review was conducted of three systems analysis (SA) studies performed by Lockheed Idaho Technologies Company (LITCO) on integrated thermal treatment systems (ITTs) and integrated nonthermal treatment systems (INTSs) for the remediation of mixed low-level waste (MLLW) stored throughout the U.S. Department of Energy (DOE) weapons complex. The review was performed by an independent team led by the Energy & Environment Research Center (EERC), including Science Applications International Corporation (SAIC), the Waste Policy Institute (WPI), and Virginia Tech.

  9. Task 11 - systems analysis of environmental management technologies. Topical report

    International Nuclear Information System (INIS)

    Musich, M.A.

    1997-06-01

    A review was conducted of three systems analysis (SA) studies performed by Lockheed Idaho Technologies Company (LITCO) on integrated thermal treatment systems (ITTs) and integrated nonthermal treatment systems (INTSs) for the remediation of mixed low-level waste (MLLW) stored throughout the U.S. Department of Energy (DOE) weapons complex. The review was performed by an independent team led by the Energy & Environment Research Center (EERC), including Science Applications International Corporation (SAIC), the Waste Policy Institute (WPI), and Virginia Tech.

  10. Sintering diagrams of UO2

    International Nuclear Information System (INIS)

    Mohan, A.; Soni, N.C.; Moorthy, V.K.

    1979-01-01

    Ashby's method (see Acta Met., vol. 22, p. 275, 1974) of constructing sintering diagrams has been modified to obtain contribution diagrams directly from the computer. The interplay of sintering variables and mechanisms is studied, and the factors that affect the participation of mechanisms in UO₂ are determined. By studying the physical properties, it emerges that the inaccuracies are of small order in most cases and do not affect the diagrams. On the other hand, even a 10% error in activation energies, which is quite plausible, would make a significant difference to the diagram. The main criticism of Ashby's approach is that the numerous properties and equations used communicate their inaccuracies to the diagrams and make them unreliable. The present study has considerably reduced the number of factors that need to be refined to make the sintering diagrams more meaningful. (Auth.)

  11. Visual Task Demands and the Auditory Mismatch Negativity: An Empirical Study and a Meta-Analysis.

    Science.gov (United States)

    Wiens, Stefan; Szychowska, Malina; Nilsson, Mats E

    2016-01-01

    Because the auditory system is particularly useful in monitoring the environment, previous research has examined whether task-irrelevant, auditory distracters are processed even if subjects focus their attention on visual stimuli. This research suggests that attentionally demanding visual tasks decrease the auditory mismatch negativity (MMN) to simultaneously presented auditory distractors. Because a recent behavioral study found that high visual perceptual load decreased detection sensitivity of simultaneous tones, we used a similar task (n = 28) to determine if high visual perceptual load would reduce the auditory MMN. Results suggested that perceptual load did not decrease the MMN. At face value, these nonsignificant findings may suggest that effects of perceptual load on the MMN are smaller than those of other demanding visual tasks. If so, effect sizes should differ systematically between the present and previous studies. We conducted a selective meta-analysis of published studies in which the MMN was derived from the EEG, the visual task demands were continuous and varied between high and low within the same task, and the task-irrelevant tones were presented in a typical oddball paradigm simultaneously with the visual stimuli. Because the meta-analysis suggested that the present (null) findings did not differ systematically from previous findings, the available evidence was combined. Results of this meta-analysis confirmed that demanding visual tasks reduce the MMN to auditory distracters. However, because the meta-analysis was based on small studies and because of the risk for publication biases, future studies should be preregistered with large samples (n > 150) to provide confirmatory evidence for the results of the present meta-analysis. These future studies should also use control conditions that reduce confounding effects of neural adaptation, and use load manipulations that are defined independently from their effects on the MMN.

  12. Visual Task Demands and the Auditory Mismatch Negativity: An Empirical Study and a Meta-Analysis

    Science.gov (United States)

    Wiens, Stefan; Szychowska, Malina; Nilsson, Mats E.

    2016-01-01

    Because the auditory system is particularly useful in monitoring the environment, previous research has examined whether task-irrelevant, auditory distracters are processed even if subjects focus their attention on visual stimuli. This research suggests that attentionally demanding visual tasks decrease the auditory mismatch negativity (MMN) to simultaneously presented auditory distractors. Because a recent behavioral study found that high visual perceptual load decreased detection sensitivity of simultaneous tones, we used a similar task (n = 28) to determine if high visual perceptual load would reduce the auditory MMN. Results suggested that perceptual load did not decrease the MMN. At face value, these nonsignificant findings may suggest that effects of perceptual load on the MMN are smaller than those of other demanding visual tasks. If so, effect sizes should differ systematically between the present and previous studies. We conducted a selective meta-analysis of published studies in which the MMN was derived from the EEG, the visual task demands were continuous and varied between high and low within the same task, and the task-irrelevant tones were presented in a typical oddball paradigm simultaneously with the visual stimuli. Because the meta-analysis suggested that the present (null) findings did not differ systematically from previous findings, the available evidence was combined. Results of this meta-analysis confirmed that demanding visual tasks reduce the MMN to auditory distracters. However, because the meta-analysis was based on small studies and because of the risk for publication biases, future studies should be preregistered with large samples (n > 150) to provide confirmatory evidence for the results of the present meta-analysis. These future studies should also use control conditions that reduce confounding effects of neural adaptation, and use load manipulations that are defined independently from their effects on the MMN. PMID:26741815

  13. Fusion Diagrams in the - and - Systems

    Science.gov (United States)

    Asadov, M. M.; Akhmedova, N. A.

    2014-10-01

    A calculation model of the Gibbs energy of ternary oxide compounds from the binary components was used. Thermodynamic properties of -- ternary systems in the condensed state were calculated. Thermodynamic data of binary and ternary compounds were used to determine the stable sections. The probability of reactions between the corresponding components in the -- system was estimated. Fusibility diagrams of systems - and - were studied by physical-chemical analysis. The isothermal section of the phase diagram of -- at 298 K is built, as well as the projection of the liquid surface of --.

  14. Nuclear power plant personnel qualifications and training: TAPS: the task analysis profiling system. Volume 2

    International Nuclear Information System (INIS)

    Jorgensen, C.C.

    1985-06-01

    This report discusses an automated task analysis profiling system (TAPS) designed to provide a linking tool between the behaviors of nuclear power plant operators in performing their tasks and the measurement tools necessary to evaluate their in-plant performance. TAPS assists in the identification of the entry-level skill, knowledge, ability and attitude (SKAA) requirements for the various tasks and rapidly associates them with measurement tests and human factors principles. This report describes the development of TAPS and presents its first demonstration. It begins with characteristics of skilled human performance and proceeds to postulate a cognitive model to formally describe these characteristics. A method is derived for linking SKAA characteristics to measurement tests. The entire process is then automated in the form of a task analysis computer program. The development of the program is detailed and a user guide with annotated code listings and supporting test information is provided

  15. Planar quark diagrams and binary spin processes

    International Nuclear Information System (INIS)

    Grigoryan, A.A.; Ivanov, N.Ya.

    1986-01-01

    Contributions of planar diagrams to the binary scattering processes are analyzed. The analysis is based on the predictions of quark-gluon picture of strong interactions for the coupling of reggeons with quarks as well as on the SU(6)-classification of hadrons. The dependence of contributions of nonplanar corrections on spins and quark composition of interacting particles is discussed

  16. AGAPE-ET for human error analysis of emergency tasks and its application

    International Nuclear Information System (INIS)

    Kim, J. H.; Jeong, W. D.

    2002-01-01

    The paper presents a proceduralised human reliability analysis (HRA) methodology, AGAPE-ET (A Guidance And Procedure for Human Error Analysis for Emergency Tasks), covering both qualitative error analysis and quantification of the human error probability (HEP) of emergency tasks in nuclear power plants. The AGAPE-ET method is based on a simplified cognitive model. For each cognitive function, error causes or error-likely situations have been identified by considering the characteristics of that function's performance and the mechanisms through which the performance influencing factors (PIFs) affect it. Error analysis items have then been determined from the identified error causes or error-likely situations, and a human error analysis procedure based on these items is organised to cue and guide the analysts through the overall human error analysis. The basic scheme for quantifying the HEP is to multiply the basic HEP (BHEP) assigned to the error analysis item by the weight obtained from the influencing factors decision tree (IFDT) constructed for each cognitive function. The method is characterised by a structured identification of the weak points of the task to be performed and by an efficient analysis process in which the analysts need to address only the cognitive functions that are actually required. The paper also presents the application of AGAPE-ET to 31 nuclear emergency tasks and its results.
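
    Read literally, the quantification scheme above multiplies a basic HEP by a decision-tree weight; the toy function below only illustrates that reading, and the numbers used are invented rather than values from AGAPE-ET.

```python
def hep(bhep: float, ifdt_weight: float) -> float:
    """Point estimate of a human error probability: the basic HEP for the
    error analysis item scaled by the weight read off the influencing
    factors decision tree (IFDT), capped at 1.0 to stay a probability."""
    return min(bhep * ifdt_weight, 1.0)

# Invented illustration: BHEP of 3e-3 and an unfavourable-PIF weight of 5
print(hep(3e-3, 5.0))  # 0.015
```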

  17. Repowering analysis: Hanford Generating Project (HGP), Task Order Number 6

    International Nuclear Information System (INIS)

    1988-12-01

    The Hanford Generating Project (HGP), owned by the Washington Public Power Supply System, consists of two low pressure steam turbines, generators, and associated equipment located adjacent to the Department of Energy's (DOE) N-Reactor. HGP has been able to produce approximately 800 MWe with low pressure steam supplied by N-Reactor. DOE has placed N-Reactor in cold standby status for an undetermined length of time. This results in the idling of the HGP since no alternative source of steam is available. Bonneville Power Administration contracted with Fluor Daniel, Inc. to investigate the feasibility and cost of constructing a new source of steam for (repowering) one of the HGP turbines. The steam turbine is currently operated with 135 psia steam. The turbines can be rebuilt to operate with 500 psia steam pressure by adding additional stages, buckets, nozzles, and diaphragms. Because of the low pressure design, this turbine can never achieve the efficiencies possible in new high pressure turbines, but the presence of existing equipment reduces the capital cost of a new generating resource. Five repowering options were investigated in this study: three cases utilizing gas turbine combined cycle steam generation equipment, one case utilizing a gas-fired boiler, and one case utilizing a coal-fired boiler. This report presents Fluor Daniel's analysis of these repowering options.

  18. Analysis of Member State RED implementation. Final Report (Task 2)

    Energy Technology Data Exchange (ETDEWEB)

    Peters, D.; Alberici, S.; Toop, G. [Ecofys, Utrecht (Netherlands); Kretschmer, B. [Institute for European Environmental Policy IEEP, London (United Kingdom)

    2012-12-15

    This report describes the way EU Member States have transposed the sustainability and chain of custody requirements for biofuels as laid down in the Renewable Energy Directive (RED) and Fuel Quality Directive (FQD). In the assessment of Member States' implementation, the report mainly focuses on effectiveness and administrative burden. Have Member States transposed the Directives in such a way that compliance with the sustainability criteria can be ensured as effectively as possible? To what extent does the Member States' implementation lead to unnecessary administrative burden for economic operators in the (bio)fuel supply chain? The report focuses specifically on the transposition of the sustainability and chain of custody requirements, not on the target for renewables in transport. This means that, for example, the double counting provision is not included in the scope of this report. This report starts with an introduction covering the implementation of the Renewable Energy (and Fuel Quality) Directive into national legislation, the methodology by which Member States were assessed against effectiveness and administrative burden, and the categorisation of Member States' national systems for RED implementation (Chapter 1). The report continues with a high level description of each Member State system assessed (Chapter 2). Following this, the report includes analysis of the Member States on the effectiveness and administrative burden of a number of key ('major') measures (Chapter 3). The final chapter presents the conclusions and recommendations (Chapter 4).

  19. Degradation mode analysis: An approach to establish effective predictive maintenance tasks

    International Nuclear Information System (INIS)

    Sonnett, D.E.; Douglass, P.T.; Barnard, D.D.

    1991-01-01

    A significant number of nuclear generating stations have been employing Reliability Centered Maintenance methodology to arrive at applicable and effective maintenance tasks for their plant equipment. The endpoint of most programs has been an increased emphasis on predictive maintenance as the task of choice for monitoring and trending plant equipment condition to address the failure mechanisms identified by the analyses. Many of these plants spend several years conducting reliability centered analysis before they seriously begin implementing predictive program improvements. In this paper we present another methodology, entitled Degradation Mode Analysis, which provides a more direct method to quickly and economically achieve the major benefit of reliability centered analysis, namely predictive maintenance. (author)

  20. Underground Test Area Subproject Phase I Data Analysis Task. Volume VII - Tritium Transport Model Documentation Package

    Energy Technology Data Exchange (ETDEWEB)

    None

    1996-12-01

    Volume VII of the documentation for the Phase I Data Analysis Task performed in support of the current Regional Flow Model, Transport Model, and Risk Assessment for the Nevada Test Site Underground Test Area Subproject contains the tritium transport model documentation. Because of the size and complexity of the model area, a considerable quantity of data was collected and analyzed in support of the modeling efforts. The data analysis task was consequently broken into eight subtasks, and descriptions of each subtask's activities are contained in one of the eight volumes that comprise the Phase I Data Analysis Documentation.

  1. Underground Test Area Subproject Phase I Data Analysis Task. Volume VIII - Risk Assessment Documentation Package

    Energy Technology Data Exchange (ETDEWEB)

    None

    1996-12-01

    Volume VIII of the documentation for the Phase I Data Analysis Task performed in support of the current Regional Flow Model, Transport Model, and Risk Assessment for the Nevada Test Site Underground Test Area Subproject contains the risk assessment documentation. Because of the size and complexity of the model area, a considerable quantity of data was collected and analyzed in support of the modeling efforts. The data analysis task was consequently broken into eight subtasks, and descriptions of each subtask's activities are contained in one of the eight volumes that comprise the Phase I Data Analysis Documentation.

  2. Qualitative Task Analysis to Enhance Sports Characterization: A Surfing Case Study

    Directory of Open Access Journals (Sweden)

    Moreira Miguel

    2014-10-01

    Full Text Available The aim of this study was to develop a Matrix of Analysis for Sports Tasks (MAST), regardless of the sports activity, based on practice classification and task analysis. This being a qualitative research, our main question was: in assessing sports' structure, is it possible to characterize any discipline through context and individuals' behaviours? The sample was within the surf discipline in a competition setting, with 5 of the top 16 Portuguese surfers training together. Based on a qualitative method, the study of surf as the main activity was an interpretative case study. The MAST was applied in four phases: taxonomy; tasks and context description; task analysis; teaching and performance strategies. Its application allowed the characterization of the activities through observation, surfers' opinions and bibliographical support. Triangulation of the data was used as the information treatment. The elements were classified by the challenges proposed to the practitioners, and the taxonomy was constituted by the sport activities, group, modality and discipline. Surf is a discipline of surfing, which is a sliding sport modality and therefore a nature sport. In the context description, we had the wave's components and constraints and the surfboards' qualities. Through task analysis we obtained a taxonomy of surf manoeuvres. The structural and functional analysis allowed finding solutions for the learning of surf techniques with trampolines and skateboards, because these fit within sliding sports. MAST makes possible the development of strategies that benefit teaching and performance intervention.

  3. Qualitative Task Analysis to Enhance Sports Characterization: A Surfing Case Study

    Science.gov (United States)

    Moreira, Miguel; Peixoto, César

    2014-01-01

    The aim of this study was to develop a Matrix of Analysis for Sports Tasks (MAST), regardless of the sports activity, based on practice classification and task analysis. This being a qualitative research, our main question was: in assessing sports' structure, is it possible to characterize any discipline through context and individuals' behaviours? The sample was within the surf discipline in a competition setting, with 5 of the top 16 Portuguese surfers training together. Based on a qualitative method, the study of surf as the main activity was an interpretative case study. The MAST was applied in four phases: taxonomy; tasks and context description; task analysis; teaching and performance strategies. Its application allowed the characterization of the activities through observation, surfers' opinions and bibliographical support. Triangulation of the data was used as the information treatment. The elements were classified by the challenges proposed to the practitioners, and the taxonomy was constituted by the sport activities, group, modality and discipline. Surf is a discipline of surfing, which is a sliding sport modality and therefore a nature sport. In the context description, we had the wave's components and constraints and the surfboards' qualities. Through task analysis we obtained a taxonomy of surf manoeuvres. The structural and functional analysis allowed finding solutions for the learning of surf techniques with trampolines and skateboards, because these fit within sliding sports. MAST makes possible the development of strategies that benefit teaching and performance intervention. PMID:25414757

  4. Using task analysis to generate evidence for strengthening midwifery education, practice, and regulation in Ethiopia.

    Science.gov (United States)

    Yigzaw, Tegbar; Carr, Catherine; Stekelenburg, Jelle; van Roosmalen, Jos; Gibson, Hannah; Gelagay, Mintwab; Admassu, Azeb

    2016-01-01

    Realizing aspirations for meeting the global reproductive, maternal, newborn, and child health goals depends not only on increasing the numbers but also on improving the capability of the midwifery workforce. We conducted a task analysis study to identify the needs for strengthening the midwifery workforce in Ethiopia. We conducted a cross-sectional study of recently qualified midwives in Ethiopia. Purposively selected participants from representative geographic and practice settings completed a self-administered questionnaire, making judgments about the frequency of performance, criticality, competence, and location of training for a list of validated midwifery tasks. Using Statistical Package for the Social Sciences, Version 20, we computed the percentages and averages to describe participant and practice characteristics. We identified priority preservice education gaps by considering the tasks least frequently learned in preservice education, those most frequently mentioned as not trained, and those with the highest "not capable" responses. Identification of top priorities for in-service training considered tasks with the highest "not capable" and "never done" responses. We determined the licensing exam blueprint by weighing the composite mean scores for frequency and criticality variables and expert rating across practice categories. One hundred and thirty-eight midwives participated in the study. The majority of respondents recognized the importance of midwifery tasks (89%), felt they were capable (91.8%), reported doing them frequently (63.9%), and learned them during preservice education (56.3%). We identified competence gaps in tasks related to obstetric complications, gynecology, public health, professional duties, and prevention of mother to child transmission of HIV. Moreover, our study helped to determine the composition of the licensing exam for university graduates. The task analysis indicates that midwives provide critical reproductive, maternal, newborn, and child health care

  5. Structural analysis interpretation task for the magnet system for Mirror Fusion Test Facility (MFTF)

    International Nuclear Information System (INIS)

    Baldi, R.W.

    1979-11-01

    The primary objective of this study was to develop recommendations to improve and substantiate the structural integrity of the highly stresses small radius region of the MFTF magnet. The specific approach is outlined: (1) Extract detail stress/strain data from General Dynamics Convair Finite-Element Refinement Analysis. (2) Diagram local plate stress distribution and its relationship to the adjacent weldment. (3) Update the parametric fracture mechanics analysis using most recent MFTF related data developed by National Bureau of Standards. (4) Review sequence and assembly as modified by Chicago Bridge and Iron for adaptability to refinements. (5) Investigate the need for fillet radii weldments to reduce stress concentrations at critical corners. (6) Review quality assurance plan for adequacy to insure structural quality in the small radius region. (7) Review instrumentation plan for adequacy of structural diagnostics in small radius region. (8) Participate in planning a small-scale fatigue test program of a typical MFTF weldment

  6. Co-Constructional Task Analysis: Moving beyond Adult-Based Models to Assess Young Children's Task Performance

    Science.gov (United States)

    Lee, Scott Weng Fai

    2013-01-01

    The assessment of young children's thinking competence in task performances has typically followed the novice-to-expert regimen involving models of strategies that adults use when engaged in cognitive tasks such as problem-solving and decision-making. Socio-constructivists argue for a balanced pedagogical approach between the adult and child that…

  7. Feynman diagrams without Feynman parameters

    International Nuclear Information System (INIS)

    Mendels, E.

    1978-01-01

    Dimensionally regularized Feynman diagrams are represented by means of products of k-functions. The infinite part of these diagrams is found very easily, also if they are overlapping, and the separation of the several kinds of divergences comes out quite naturally. Ward identities are proven in a transparent way. Series expansions in terms of the external momenta and their inner products are possible

  8. Diagram Techniques in Group Theory

    Science.gov (United States)

    Stedman, Geoffrey E.

    2009-09-01

    Preface; 1. Elementary examples; 2. Angular momentum coupling diagram techniques; 3. Extension to compact simple phase groups; 4. Symmetric and unitary groups; 5. Lie groups and Lie algebras; 6. Polarisation dependence of multiphoton processes; 7. Quantum field theoretic diagram techniques for atomic systems; 8. Applications; Appendix; References; Indexes.

  9. Contingency diagrams as teaching tools

    OpenAIRE

    Mattaini, Mark A.

    1995-01-01

    Contingency diagrams are particularly effective teaching tools, because they provide a means for students to view the complexities of contingency networks present in natural and laboratory settings while displaying the elementary processes that constitute those networks. This paper sketches recent developments in this visualization technology and illustrates approaches for using contingency diagrams in teaching.

  10. Do tasks make a difference? Accounting for heterogeneity of performance of children with reading difficulties on tasks of executive function: findings from a meta-analysis.

    Science.gov (United States)

    Booth, Josephine N; Boyle, James M E; Kelly, Steve W

    2010-03-01

    Research studies have implicated executive functions in reading difficulties (RD). But while some studies have found children with RD to be impaired on tasks of executive function, other studies report unimpaired performance. A meta-analysis was carried out to determine whether these discrepant findings can be accounted for by differences in the tasks of executive function that are utilized. A total of 48 studies comparing the performance on tasks of executive function of children with RD with their typically developing peers were included in the meta-analysis, yielding 180 effect sizes. An overall effect size of 0.57 (SE .03) was obtained, indicating that children with RD have impairments on tasks of executive function. However, effect sizes varied considerably suggesting that the impairment is not uniform. Moderator analysis revealed that task modality and IQ-achievement discrepancy definitions of RD influenced the magnitude of effect; however, the age and gender of participants and the nature of the RD did not have an influence. While the children's RD were associated with executive function impairments, variation in effect size is a product of the assessment task employed, underlying task demands, and definitional criteria.

  11. Impact decision support diagrams

    Science.gov (United States)

    Boslough, Mark

    2014-10-01

    One way to frame the job of planetary defense is to “find the optimal approach for finding the optimal approach” to NEO mitigation. This requires a framework for defining in advance what should be done under various circumstances. The two-dimensional action matrix from the recent NRC report “Defending Planet Earth” can be generalized to a notional “Impact Decision Support Diagram” by extending it into a third dimension. The NRC action matrix incorporated two important axes: size and time-to-impact, but probability of impact is also critical (it is part of the definitions of both the Torino and Palermo scales). Uncertainty has been neglected, but is also crucial. It can be incorporated by subsuming it into the NEO size axis by redefining size to be three standard deviations greater than the best estimate, thereby providing a built-in conservative margin. The independent variable is time-to-impact, which is known with high precision. The other two axes are both quantitative assessments of uncertainty and are both time dependent. Thus, the diagram is entirely an expression of uncertainty. The true impact probability is either one or zero, and the true size does not change. The domain contains information about the current uncertainty, which changes with time (as opposed to reality, which does not change).

  12. Development of contextual task analysis for NPP control room operators' work

    International Nuclear Information System (INIS)

    Hukki, K.

    1998-01-01

    The paper introduces a contextual approach to task analysis concerning control room operators' tasks and task conditions in nuclear power plants. The approach is based on the ecological concept of the situational appropriateness of activity. The task demands are dependent on the ultimate task of the operators, which is to maintain the critical safety functions of the process. The context also sets boundary conditions on the fulfilment of these demands. The conceptualisation of the context affords possibilities to comprehend and make visible the core demands of the operators' work. Characteristic of the approach is that the conceptualisation is made both from the point of view of the operators, who are making interpretations of the situation, and from the point of view of the process to be controlled. The context is described as a world of operators' possibilities and constraints and, at the same time, in relation to the demands set by the nature of the process. The method is under development and has been applied in simulator training, in the evaluation of the control room information, and in the integrated development of reliability analysis. The method emphasizes the role of explicit conceptualisation of the task situations. Explicitness enhances its role as a conceptual tool and, therefore, promotes common awareness in these domains. (orig.)

  13. Closed-loop, pilot/vehicle analysis of the approach and landing task

    Science.gov (United States)

    Anderson, M. R.; Schmidt, D. K.

    1986-01-01

    In the case of approach and landing, it is universally accepted that the pilot uses more than one vehicle response, or output, to close his control loops. Therefore, to model this task, a multi-loop analysis technique is required. The analysis problem has been in obtaining reasonable analytic estimates of the describing functions representing the pilot's loop compensation. Once these pilot describing functions are obtained, appropriate performance and workload metrics must then be developed for the landing task. The optimal control approach provides a powerful technique for obtaining the necessary describing functions, once the appropriate task objective is defined in terms of a quadratic objective function. An approach is presented through the use of a simple, reasonable objective function and model-based metrics to evaluate loop performance and pilot workload. The results of an analysis of the LAHOS (Landing and Approach of Higher Order Systems) study performed by R.E. Smith are also presented.
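
    The record does not state the exact objective function used; a generic quadratic form of the kind referred to, written here as an assumption in the usual optimal-control pilot-model style with output, control, and control-rate weightings (Q, R, G), would be:

```latex
% Generic quadratic cost for an optimal-control pilot model (assumed form)
J = E\left\{ \lim_{T\to\infty} \frac{1}{T}\int_{0}^{T}
    \left( \mathbf{y}^{\mathsf{T}} Q\,\mathbf{y}
         + \mathbf{u}^{\mathsf{T}} R\,\mathbf{u}
         + \dot{\mathbf{u}}^{\mathsf{T}} G\,\dot{\mathbf{u}} \right) dt \right\}
```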

  14. Latent human error analysis and efficient improvement strategies by fuzzy TOPSIS in aviation maintenance tasks.

    Science.gov (United States)

    Chiu, Ming-Chuan; Hsieh, Min-Chih

    2016-05-01

    The purposes of this study were to develop a latent human error analysis process, to explore the factors of latent human error in aviation maintenance tasks, and to provide an efficient improvement strategy for addressing those errors. First, we used HFACS and RCA to define the error factors related to aviation maintenance tasks. Fuzzy TOPSIS with four criteria was applied to evaluate the error factors. Results show that 1) adverse physiological states, 2) physical/mental limitations, and 3) coordination, communication, and planning are the factors related to airline maintenance tasks that could be addressed easily and efficiently. This research establishes a new analytic process for investigating latent human error and provides a strategy for analyzing human error using fuzzy TOPSIS. Our analysis process complements shortages in existing methodologies by incorporating improvement efficiency, and it enhances the depth and broadness of human error analysis methodology. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.
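
    The record names fuzzy TOPSIS with four criteria but does not reproduce the computation; the sketch below shows only the crisp TOPSIS core (vector normalization, weighted distances to the ideal and anti-ideal solutions, closeness coefficient). The criterion weights and scores are invented, and the extension to triangular fuzzy numbers used in the study is left out.

```python
import numpy as np

def topsis(decision_matrix, weights, benefit):
    """Rank alternatives (rows) against criteria (columns).
    benefit[j] is True if larger values of criterion j are better."""
    X = np.asarray(decision_matrix, dtype=float)
    norm = X / np.linalg.norm(X, axis=0)        # vector normalization
    V = norm * weights                          # weighted normalized matrix
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_plus = np.linalg.norm(V - ideal, axis=1)
    d_minus = np.linalg.norm(V - anti, axis=1)
    return d_minus / (d_plus + d_minus)         # closeness coefficient

# Invented illustration: 3 latent-error factors scored on 4 criteria
scores = [[7, 6, 8, 5],
          [5, 8, 6, 7],
          [6, 5, 7, 8]]
weights = np.array([0.3, 0.3, 0.2, 0.2])
closeness = topsis(scores, weights, benefit=[True, True, True, True])
print(np.argsort(closeness)[::-1])  # factors ranked by priority
```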

  15. Genus Ranges of Chord Diagrams.

    Science.gov (United States)

    Burns, Jonathan; Jonoska, Nataša; Saito, Masahico

    2015-04-01

    A chord diagram consists of a circle, called the backbone, with line segments, called chords, whose endpoints are attached to distinct points on the circle. The genus of a chord diagram is the genus of the orientable surface obtained by thickening the backbone to an annulus and attaching bands to the inner boundary circle at the ends of each chord. Variations of this construction are considered here, where bands are possibly attached to the outer boundary circle of the annulus. The genus range of a chord diagram is the set of genus values over all such variations of surfaces obtained from a given chord diagram. Genus ranges of chord diagrams for a fixed number of chords are studied. Integer intervals that can be, and those that cannot be, realized as genus ranges are investigated. Computer calculations are presented, and play a key role in discovering and proving the properties of genus ranges.

  16. Using plant procedures as the basis for conducting a job and task analysis

    International Nuclear Information System (INIS)

    Haynes, F.H.; Ruth, B.W.

    1985-01-01

    Plant procedures were selected by Northeast Utilities (NU) as the basis for conducting Job and Task Analyses (JTA). The resultant JTA was used to design procedure-based simulator training programs for Millstone 1, 2, and Connecticut Yankee. The task listings were both plant-specific and exhibited excellent correlation with INPO's generic PWR and BWR task analyses. Using the procedures-based method enabled us to perform the JTA using plant and training staff, which proved cost-effective in terms of both time and money. Learning objectives developed from the JTA were easily justified and correlated directly to job performance within the context of the plant procedures. In addition, the analysis generated a comprehensive review of plant procedures and, conversely, the plant's normal procedure revision process generated an automatic trigger for updating the task data.

  17. Influence diagram in evaluating the subjective judgment

    International Nuclear Information System (INIS)

    Hong, Y.

    1997-01-01

    The author developed the idea of subjective influence diagrams to evaluate subjective judgment. The subjective judgment of a stakeholder is a primary decision-making proposition: it involves a basic decision process and the individual attitude of the stakeholder towards the decision purpose. Subjective judgment dominates some final decisions, and a complex decision process may include subjective judgment. An influence diagram framework is a simple tool for analyzing the subjective judgment process. Within this framework, the characteristics of influence diagrams support the description, analysis, and evaluation of the subjective judgment. The relationship between the information and the decision, such as independence between them, is the main issue. The utility function is then the calculation tool for the evaluation, so that the stakeholder can make an optimal decision. Through the analysis of the decision process and these relationships, the building process of the influence diagram identically describes the subjective judgment. Some examples are given to explain the properties of subjective judgment and the analysis process.

  18. Human factors evaluation of remote afterloading brachytherapy. Volume 2, Function and task analysis

    Energy Technology Data Exchange (ETDEWEB)

    Callan, J.R.; Gwynne, J.W. III; Kelly, T.T.; Muckler, F.A. [Pacific Science and Engineering Group, San Diego, CA (United States); Saunders, W.M.; Lepage, R.P.; Chin, E. [University of California San Diego Medical Center, CA (United States). Div. of Radiation Oncology; Schoenfeld, I.; Serig, D.I. [Nuclear Regulatory Commission, Washington, DC (United States). Div. of Systems Technology

    1995-05-01

    A human factors project on the use of nuclear by-product material to treat cancer using remotely operated afterloaders was undertaken by the Nuclear Regulatory Commission. The purpose of the project was to identify factors that contribute to human error in the system for remote afterloading brachytherapy (RAB). This report documents the findings from the first phase of the project, which involved an extensive function and task analysis of RAB. This analysis identified the functions and tasks in RAB, made preliminary estimates of the likelihood of human error in each task, and determined the skills needed to perform each RAB task. The findings of the function and task analysis served as the foundation for the remainder of the project, which evaluated four major aspects of the RAB system linked to human error: human-system interfaces; procedures and practices; training and qualifications of RAB staff; and organizational practices and policies. At its completion, the project identified and prioritized areas for recommended NRC and industry attention based on all of the evaluations and analyses.

  19. Human factors evaluation of remote afterloading brachytherapy. Volume 2, Function and task analysis

    International Nuclear Information System (INIS)

    Callan, J.R.; Gwynne, J.W. III; Kelly, T.T.; Muckler, F.A.; Saunders, W.M.; Lepage, R.P.; Chin, E.; Schoenfeld, I.; Serig, D.I.

    1995-05-01

    A human factors project on the use of nuclear by-product material to treat cancer using remotely operated afterloaders was undertaken by the Nuclear Regulatory Commission. The purpose of the project was to identify factors that contribute to human error in the system for remote afterloading brachytherapy (RAB). This report documents the findings from the first phase of the project, which involved an extensive function and task analysis of RAB. This analysis identified the functions and tasks in RAB, made preliminary estimates of the likelihood of human error in each task, and determined the skills needed to perform each RAB task. The findings of the function and task analysis served as the foundation for the remainder of the project, which evaluated four major aspects of the RAB system linked to human error: human-system interfaces; procedures and practices; training and qualifications of RAB staff; and organizational practices and policies. At its completion, the project identified and prioritized areas for recommended NRC and industry attention based on all of the evaluations and analyses

  20. Human factors evaluation of teletherapy: Function and task analysis. Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    Kaye, R.D.; Henriksen, K.; Jones, R. [Hughes Training, Inc., Falls Church, VA (United States); Morisseau, D.S.; Serig, D.I. [Nuclear Regulatory Commission, Washington, DC (United States). Div. of Systems Technology

    1995-07-01

    As a treatment methodology, teletherapy selectively destroys cancerous and other tissue by exposure to an external beam of ionizing radiation. Sources of radiation are either a radioactive isotope, typically Cobalt-60 (Co-60), or a linear accelerator. Records maintained by the NRC have identified instances of teletherapy misadministration where the delivered radiation dose has differed from the radiation prescription (e.g., instances where fractions were delivered to the wrong patient, to the wrong body part, or were too great or too little with respect to the defined treatment volume). Both human error and machine malfunction have led to misadministrations. Effective and safe treatment requires a concern for precision and consistency of human-human and human-machine interactions throughout the course of therapy. The present study is the first part of a series of human factors evaluations for identifying the root causes that lead to human error in the teletherapy environment. The human factors evaluations included: (1) a function and task analysis of teletherapy activities, (2) an evaluation of the human-system interfaces, (3) an evaluation of procedures used by teletherapy staff, (4) an evaluation of the training and qualifications of treatment staff (excluding the oncologists), (5) an evaluation of organizational practices and policies, and (6) an identification of problems and alternative approaches for NRC and industry attention. The present report addresses the function and task analysis of teletherapy activities and provides the foundation for the conduct of the subsequent evaluations. The report includes sections on background, methodology, a description of the function and task analysis, and use of the task analysis findings for the subsequent tasks. The function and task analysis data base also is included.

  1. Human factors evaluation of teletherapy: Function and task analysis. Volume 2

    International Nuclear Information System (INIS)

    Kaye, R.D.; Henriksen, K.; Jones, R.; Morisseau, D.S.; Serig, D.I.

    1995-07-01

    As a treatment methodology, teletherapy selectively destroys cancerous and other tissue by exposure to an external beam of ionizing radiation. Sources of radiation are either a radioactive isotope, typically Cobalt-60 (Co-60), or a linear accelerator. Records maintained by the NRC have identified instances of teletherapy misadministration where the delivered radiation dose has differed from the radiation prescription (e.g., instances where fractions were delivered to the wrong patient, to the wrong body part, or were too great or too little with respect to the defined treatment volume). Both human error and machine malfunction have led to misadministrations. Effective and safe treatment requires a concern for precision and consistency of human-human and human-machine interactions throughout the course of therapy. The present study is the first part of a series of human factors evaluations for identifying the root causes that lead to human error in the teletherapy environment. The human factors evaluations included: (1) a function and task analysis of teletherapy activities, (2) an evaluation of the human-system interfaces, (3) an evaluation of procedures used by teletherapy staff, (4) an evaluation of the training and qualifications of treatment staff (excluding the oncologists), (5) an evaluation of organizational practices and policies, and (6) an identification of problems and alternative approaches for NRC and industry attention. The present report addresses the function and task analysis of teletherapy activities and provides the foundation for the conduct of the subsequent evaluations. The report includes sections on background, methodology, a description of the function and task analysis, and use of the task analysis findings for the subsequent tasks. The function and task analysis data base also is included

  2. Student Task Analysis for the Development of E-Learning Lectural System in Basic Chemistry Courses in FKIP UMMY Solok

    Science.gov (United States)

    Afrahamiryano, A.; Ariani, D.

    2018-04-01

    The student task analysis is one part of the define stage in development research using the 4-D development model. This task analysis is useful for determining students' level of understanding of the lecture materials that have been given. The results of the task analysis serve as a measuring tool for determining the level of success of learning and as a basis for the development of the lecture system. The analysis was done through observation and a documentation study of the tasks undertaken by students. The results of the analysis are then described, and afterwards triangulation is done to draw conclusions. The results indicate that the students' level of understanding is high for theoretical material and low for material involving calculations. Based on the results of this task analysis, it can be concluded that the e-learning lecture system to be developed should be able to increase students' understanding of basic chemistry topics that involve calculations.

  3. A Critical Appraisal of the `Day' Diagram

    Science.gov (United States)

    Roberts, A. P.; Tauxe, L.; Heslop, D.

    2017-12-01

    The `Day' diagram [Day et al., 1977; doi:10.1016/0031-9201(77)90108-X] is used widely to infer the mean domain state of magnetic mineral assemblages. The Day plot coordinates are the ratios of the saturation remanent magnetization to saturation magnetization (Mrs/Ms) and the coercivity of remanence to coercivity (Bcr/Bc), as determined from a major hysteresis loop and a backfield demagnetization curve. Based on theoretical and empirical arguments, Day plots are typically demarcated into stable single domain (SD), `pseudosingle domain' (`PSD'), and multidomain (MD) zones. It is a simple task to determine Mrs/Ms and Bcr/Bc for a sample and to assign a mean domain state based on the boundaries defined by Day et al. [1977]. Many other parameters contribute to variability in a Day diagram, including surface oxidation, mineral stoichiometry, stress state, magnetostatic interactions, and mixtures of magnetic particles with different sizes and shapes. Bulk magnetic measurements usually lack detailed independent evidence to constrain each free parameter, which makes the Day diagram fundamentally ambiguous. This raises questions about its usefulness for diagnosing magnetic particle size variations. The Day diagram is also used to make inferences about binary mixing of magnetic particles, where, for example, mixtures of SD and MD particles give rise to a bulk `PSD' response even though the concentration of `PSD' grains could be zero. In our assessment of thousands of hysteresis measurements of geological samples, binary mixing occurs in a tiny number of cases. Ternary, quaternary, and higher order mixing are usually observed. Also, uniaxial SD and MD end-members are nearly always inappropriate for considering mixing because uniaxial SD particles are virtually non-existent in igneous rocks. Thus, use of mixing lines in Day diagrams routinely provides unsatisfactory representations of particle size variations. We critically appraise the Day diagram and argue that its many
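    Editor's note (illustrative, not part of the record above): the Day plot coordinates described here are simple ratios of hysteresis parameters. A minimal sketch is given below, assuming the commonly cited Day et al. (1977) demarcations (Mrs/Ms near 0.5 and 0.05, Bcr/Bc near 1.5 and 4); these thresholds are background assumptions, and real interpretations face the ambiguities the record discusses.

```python
# Illustrative sketch: compute Day plot coordinates from hysteresis parameters
# and assign a nominal domain-state zone. Boundary values are the commonly
# cited Day et al. (1977) demarcations and are assumptions, not results from
# the record above.

def day_plot_coordinates(Mrs, Ms, Bcr, Bc):
    """Return (Mrs/Ms, Bcr/Bc) from a major hysteresis loop and backfield curve."""
    return Mrs / Ms, Bcr / Bc

def nominal_domain_state(mrs_ms, bcr_bc):
    """Rough zone assignment on the Day diagram (SD / 'PSD' / MD)."""
    if mrs_ms >= 0.5 and bcr_bc <= 1.5:
        return "SD"
    if mrs_ms <= 0.05 and bcr_bc >= 4.0:
        return "MD"
    return "PSD"  # everything in between, with all the caveats noted above

if __name__ == "__main__":
    x, y = day_plot_coordinates(Mrs=0.12, Ms=1.0, Bcr=30e-3, Bc=12e-3)
    print(x, y, nominal_domain_state(x, y))
```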

  4. Performance monitoring and analysis of task-based OpenMP.

    Directory of Open Access Journals (Sweden)

    Yi Ding

    OpenMP, a typical shared memory programming paradigm, has been extensively applied in the high performance computing community due to the popularity of multicore architectures in recent years. The most significant feature of the OpenMP 3.0 specification is the introduction of the task constructs to express parallelism at a much finer level of detail. This feature, however, has posed new challenges for performance monitoring and analysis. In particular, task creation is separated from its execution, causing the traditional monitoring methods to be ineffective. This paper presents a mechanism to monitor task-based OpenMP programs with interposition and proposes two demonstration graphs for performance analysis as well. The results of two experiments are discussed to evaluate the overhead of the monitoring mechanism and to verify the effects of the demonstration graphs using the BOTS benchmarks.

  5. Para-equilibrium phase diagrams

    International Nuclear Information System (INIS)

    Pelton, Arthur D.; Koukkari, Pertti; Pajarre, Risto; Eriksson, Gunnar

    2014-01-01

    Highlights: • A rapidly cooled system may attain a state of para-equilibrium. • In this state rapidly diffusing elements reach equilibrium but others are immobile. • Application of the Phase Rule to para-equilibrium phase diagrams is discussed. • A general algorithm to calculate para-equilibrium phase diagrams is described. - Abstract: If an initially homogeneous system at high temperature is rapidly cooled, a temporary para-equilibrium state may result in which rapidly diffusing elements have reached equilibrium but more slowly diffusing elements have remained essentially immobile. The best known example occurs when homogeneous austenite is quenched. A para-equilibrium phase assemblage may be calculated thermodynamically by Gibbs free energy minimization under the constraint that the ratios of the slowly diffusing elements are the same in all phases. Several examples of calculated para-equilibrium phase diagram sections are presented and the application of the Phase Rule is discussed. Although the rules governing the geometry of these diagrams may appear at first to be somewhat different from those for full equilibrium phase diagrams, it is shown that in fact they obey exactly the same rules with the following provision. Since the molar ratios of non-diffusing elements are the same in all phases at para-equilibrium, these ratios act, as far as the geometry of the diagram is concerned, like “potential” variables (such as T, pressure or chemical potentials) rather than like “normal” composition variables which need not be the same in all phases. A general algorithm to calculate para-equilibrium phase diagrams is presented. In the limit, if a para-equilibrium calculation is performed under the constraint that no elements diffuse, then the resultant phase diagram shows the single phase with the minimum Gibbs free energy at any point on the diagram; such calculations are of interest in physical vapor deposition when deposition is so rapid that phase
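    Editor's note (schematic, not taken from the paper): the para-equilibrium constraint described in the record can be written as a constrained Gibbs energy minimization; the notation below is illustrative.

```latex
% Schematic statement of para-equilibrium (illustrative notation):
% minimize the total Gibbs energy over phase amounts n^phi and compositions x^phi,
\min \; G = \sum_{\phi} n^{\phi} \, G_m^{\phi}(T, P, x^{\phi}),
% subject to overall mass balance and, for every pair of slowly diffusing
% (immobile) elements i and j, equal molar ratios in all phases:
\frac{x_i^{\phi}}{x_j^{\phi}} = \frac{x_i^{\text{overall}}}{x_j^{\text{overall}}}
\quad \text{for all phases } \phi .
```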

  6. Task Analysis of Emergency Operating Procedures for Generating Quantitative HRA Data

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Yochan; Park, Jinkyun; Kim, Seunghwan; Choi, Sun Yeong; Jung, Wondea; Jang, Inseok [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-10-15

    In this paper, the analysis results for the emergency tasks in the emergency operating procedures (EOPs) that can be observed from the simulator data are introduced. The task type, component type, system type, and additional information related to the performance of the operators were described. In addition, a prospective application of the analyzed information to the HEP quantification process was discussed. In the probabilistic safety analysis (PSA) field, various human reliability analyses (HRAs) have been performed to produce estimates of human error probabilities (HEPs) for significant tasks in complex socio-technical systems. To this end, many HRA methods have provided basic or nominal HEPs for typical tasks and the quantitative relations describing how a certain performance context or performance shaping factors (PSFs) affect the HEPs. In the HRA community, however, the necessity of appropriate and sufficient human performance data has recently been indicated, because a wide range of quantitative estimates in the previous HRA methods are not supported by solid empirical bases. Hence, there have been attempts to collect HRA supporting data. For example, KAERI has started to collect information on both unsafe acts of operators and the relevant PSFs. A characteristic of the database being developed at KAERI is that human errors and related PSF surrogates that are objectively observable are collected from full-scope simulator experiences. In this environment, to produce concretely grounded bases for the HEPs, the traits or attributes of tasks where significant human errors can be observed should be clearly determined, and the determined traits should make it possible to compare the HEPs for those traits with the data in previous HRA methods or databases. In this study, task characteristics in a Westinghouse type of EOPs were analyzed by defining task, component, and system taxonomies.

  7. Cognitive Task Analysis for Instruction in Single-Injection Ultrasound Guided-Regional Anesthesia

    Science.gov (United States)

    Gucev, Gligor V.

    2012-01-01

    Cognitive task analysis (CTA) is methodology for eliciting knowledge from subject matter experts. CTA has been used to capture the cognitive processes, decision-making, and judgments that underlie expert behaviors. A review of the literature revealed that CTA has not yet been used to capture the knowledge required to perform ultrasound guided…

  8. Analysis of Operators Comments on the PSF Questionnaire of the Task Complexity Experiment 2003/2004

    Energy Technology Data Exchange (ETDEWEB)

    Torralba, B.; Martinez-Arias, R.

    2007-07-01

    Human Reliability Analysis (HRA) methods usually take into account the effect of Performance Shaping Factors (PSFs). Therefore, the adequate treatment of PSFs in the HRA of Probabilistic Safety Assessment (PSA) models is of crucial importance, and there is an important need for collecting PSF data based on simulator experiments. During the task complexity experiment 2003-2004, carried out in the BWR simulator of the Halden Man-Machine Laboratory (HAMMLAB), data on PSFs were collected by means of a PSF Questionnaire. Seven crews (each composed of a shift supervisor, a reactor operator and a turbine operator) from Swedish nuclear power plants participated in the experiment. The PSF Questionnaire collected data on the factors: procedures, training and experience, indications, controls, team management, team communication, individual work practice, available time for the tasks, number of tasks or information load, masking, and seriousness. The main statistically significant results are presented in 'Performance Shaping Factors data collection and analysis of the task complexity experiment 2003/2004' (HWR-810). This report describes the analysis of the comments about PSFs provided by operators on the PSF Questionnaire. The comments provided for each PSF in the scenarios have been summarised using a content analysis technique. (Author)

  9. Boundary error analysis and categorization in the TRECVID news story segmentation task

    NARCIS (Netherlands)

    Arlandis, J.; Over, P.; Kraaij, W.

    2005-01-01

    In this paper, an error analysis based on boundary error popularity (frequency), including semantic boundary categorization, is applied in the context of the news story segmentation task from TRECVID. Clusters of systems were defined based on the input resources they used, including video, audio and

  10. Formative Research on the Simplifying Conditions Method (SCM) for Task Analysis and Sequencing.

    Science.gov (United States)

    Kim, YoungHwan; Reigluth, Charles M.

    The Simplifying Conditions Method (SCM) is a set of guidelines for task analysis and sequencing of instructional content under the Elaboration Theory (ET). This article introduces the fundamentals of SCM and presents the findings from a formative research study on SCM. It was conducted in two distinct phases: design and instruction. In the first…

  11. Classroom-Based Functional Analysis and Intervention for Disruptive and Off-Task Behaviors

    Science.gov (United States)

    Shumate, Emily D.; Wills, Howard P.

    2010-01-01

    Although there is a growing body of literature on the use of functional analysis in schools, there is a need for more demonstrations of this technology being used during the course of typical instruction. In this study, we conducted functional analyses of disruptive and off-task behavior in a reading classroom setting for 3 participants of typical…

  12. Analysis of Operators Comments on the PSF Questionnaire of the Task Complexity Experiment 2003/2004

    International Nuclear Information System (INIS)

    Torralba, B.; Martinez-Arias, R.

    2007-01-01

    Human Reliability Analysis (HRA) methods usually take into account the effect of Performance Shaping Factors (PSFs). Therefore, the adequate treatment of PSFs in the HRA of Probabilistic Safety Assessment (PSA) models is of crucial importance, and there is an important need for collecting PSF data based on simulator experiments. During the task complexity experiment 2003-2004, carried out in the BWR simulator of the Halden Man-Machine Laboratory (HAMMLAB), data on PSFs were collected by means of a PSF Questionnaire. Seven crews (each composed of a shift supervisor, a reactor operator and a turbine operator) from Swedish nuclear power plants participated in the experiment. The PSF Questionnaire collected data on the factors: procedures, training and experience, indications, controls, team management, team communication, individual work practice, available time for the tasks, number of tasks or information load, masking, and seriousness. The main statistically significant results are presented in 'Performance Shaping Factors data collection and analysis of the task complexity experiment 2003/2004' (HWR-810). This report describes the analysis of the comments about PSFs provided by operators on the PSF Questionnaire. The comments provided for each PSF in the scenarios have been summarised using a content analysis technique. (Author)

  13. Use of Job Task Analysis (JTA) in the development of craft training programs

    International Nuclear Information System (INIS)

    Gonyeau, J.A.; Long, R.E.

    1985-01-01

    Northern States Power Company is making a major effort to develop performance-based training. It is finding the use of JTA data very helpful in the revision of its maintenance craft training programs. The technique being used involves a group of interns from the Training and Development Program of the University of Minnesota. These interns are largely graduate students, but with no nuclear and little mechanical/electrical experience. A Job Analysis for each discipline was used to guide the subsequent task analysis, determine program content, evaluate existing OJT check lists, and define the four crafts used for mechanical maintenance. From the Job Analysis, a Training Task List was developed and correlated to training materials. The analysis of the tasks on the Training Task List is proceeding. Taxonomies of systems or subjects are compared to existing lesson plans. These taxonomies are useful when writing new lesson plans and are an excellent start for the development of enabling objectives. A Nine-Step Plan is being followed in the application of JTA data to the development and refinement of performance-based training.

  14. Using task analysis to generate evidence for strengthening midwifery education, practice, and regulation in Ethiopia

    NARCIS (Netherlands)

    Yigzaw, Tegbar; Carr, Catherine; Stekelenburg, Jelle; van Roosmalen, Jos; Gibson, Hannah; Gelagay, Mintwab; Admassu, Azeb

    2016-01-01

    PURPOSE: Realizing aspirations for meeting the global reproductive, maternal, newborn, and child health goals depends not only on increasing the numbers but also on improving the capability of midwifery workforce. We conducted a task analysis study to identify the needs for strengthening the

  15. Using task analysis to generate evidence for strengthening midwifery education, practice, and regulation in Ethiopia

    Directory of Open Access Journals (Sweden)

    Yigzaw T

    2016-05-01

    Tegbar Yigzaw (1), Catherine Carr (2), Jelle Stekelenburg (3,4), Jos van Roosmalen (5), Hannah Gibson (1), Mintwab Gelagay (1), Azeb Admassu (6). Affiliations: (1) Jhpiego, Addis Ababa, Ethiopia; (2) Jhpiego, Washington DC, USA; (3) Department of Obstetrics and Gynecology, Leeuwarden Medical Centre, Leeuwarden; (4) Department of Health Sciences, Global Health, University Medical Centre Groningen, University of Groningen, Groningen; (5) Faculty of Earth and Life Sciences, Vrije Universiteit, Amsterdam, the Netherlands; (6) Federal Ministry of Health, Addis Ababa, Ethiopia. Purpose: Realizing aspirations for meeting the global reproductive, maternal, newborn, and child health goals depends not only on increasing the numbers but also on improving the capability of midwifery workforce. We conducted a task analysis study to identify the needs for strengthening the midwifery workforce in Ethiopia. Methods: We conducted a cross-sectional study of recently qualified midwives in Ethiopia. Purposively selected participants from representative geographic and practice settings completed a self-administered questionnaire, making judgments about the frequency of performance, criticality, competence, and location of training for a list of validated midwifery tasks. Using Statistical Package for the Social Sciences, Version 20, we computed the percentages and averages to describe participant and practice characteristics. We identified priority preservice education gaps by considering the tasks least frequently learned in preservice, most frequently mentioned for not being trained, and had the highest not capable response. Identification of top priorities for in-service training considered tasks with highest “not capable” and “never” done responses. We determined the licensing exam blueprint by weighing the composite mean scores for frequency and criticality variables and expert rating across practice categories. Results: One hundred and thirty-eight midwives participated in the study. The majority of

  16. Radiological emergency response for community agencies with cognitive task analysis, risk analysis, and decision support framework.

    Science.gov (United States)

    Meyer, Travis S; Muething, Joseph Z; Lima, Gustavo Amoras Souza; Torres, Breno Raemy Rangel; del Rosario, Trystyn Keia; Gomes, José Orlando; Lambert, James H

    2012-01-01

    Radiological and nuclear emergency responders must be able to coordinate evacuation and relief efforts following the release of radioactive material into populated areas. In order to respond quickly and effectively to a nuclear emergency, high-level coordination is needed between a number of large, independent organizations, including police, military, hazmat, and transportation authorities. Given the complexity, scale, time-pressure, and potential negative consequences inherent in radiological emergency responses, tracking and communicating information that will assist decision makers during a crisis is crucial. The emergency response team at the Angra dos Reis nuclear power facility, located outside of Rio de Janeiro, Brazil, presently conducts emergency response simulations once every two years to prepare organizational leaders for real-life emergency situations. However, current exercises are conducted without the aid of electronic or software tools, resulting in possible cognitive overload and delays in decision-making. This paper describes the development of a decision support system employing systems methodologies, including cognitive task analysis and human-machine interface design. The decision support system can aid the coordination team by automating cognitive functions and improving information sharing. A prototype of the design will be evaluated by plant officials in Brazil and incorporated into a future trial run of a response simulation.

  17. Factor analysis for imperfect maintenance planning at nuclear power plants by cognitive task analysis

    International Nuclear Information System (INIS)

    Takagawa, Kenichi; Iida, Hiroyasu

    2011-01-01

    Imperfect maintenance planning was frequently identified in domestic nuclear power plants. To prevent such events, we analyzed causal factors in maintenance planning stages and showed the directionality of countermeasures in this study. There is a pragmatic limit in finding the causal factors from the items based on report descriptions. Therefore, the idea of the systemic accident model, which is used to monitor the performance variability in normal circumstances, is taken as a new concept instead of investigating negative factors. As an actual method for analyzing usual activities, cognitive task analysis (CTA) was applied. Persons who experienced various maintenance activities at one electric power company were interviewed about sources related to decision making during maintenance planning, and then usual factors affecting planning were extracted as performance variability factors. The tendency of domestic events was analyzed using the classification items of those factors, and the directionality of countermeasures was shown. The following are critical for preventing imperfect maintenance planning: the persons in charge should fully understand the situation of the equipment for which they are responsible in the work planning and maintenance evaluation stages, and they should clearly understand, for example, the maintenance bases of that equipment. (author)

  18. Aligning Event Logs to Task-Time Matrix Clinical Pathways in BPMN for Variance Analysis.

    Science.gov (United States)

    Yan, Hui; Van Gorp, Pieter; Kaymak, Uzay; Lu, Xudong; Ji, Lei; Chiau, Choo Chiap; Korsten, Hendrikus H M; Duan, Huilong

    2018-03-01

    Clinical pathways (CPs) are popular healthcare management tools to standardize care and ensure quality. Analyzing CP compliance levels and variances is known to be useful for training and CP redesign purposes. The flexible semantics of the Business Process Model and Notation (BPMN) language has been shown to be useful for the modeling and analysis of complex protocols. However, in practical cases one may want to exploit the fact that CPs often have the form of task-time matrices. This paper presents a new method for parsing complex BPMN models and aligning traces to the models heuristically. A case study on variance analysis is undertaken, where a CP from practice and two large sets of patient data from an electronic medical record (EMR) database are used. The results demonstrate that automated variance analysis between BPMN task-time models and real-life EMR data is feasible, which was not the case for existing analysis techniques. We also provide meaningful insights for further improvement.

  19. Generalized balanced power diagrams for 3D representations of polycrystals

    DEFF Research Database (Denmark)

    Alpers, Andreas; Brieden, Andreas; Gritzmann, Peter

    2015-01-01

    Characterizing the grain structure of polycrystalline material is an important task in material science. The present paper introduces the concept of generalized balanced power diagrams as a concise alternative to voxelated mappings. Here, each grain is represented by (measured approximations of...

  20. Investigating the QCD phase diagram with hadron multiplicities at NICA

    Energy Technology Data Exchange (ETDEWEB)

    Becattini, F. [Universita di Firenze (Italy); INFN, Firenze (Italy); Stock, R. [Goethe University, Frankfurt am Main (Germany)

    2016-08-15

    We discuss the potential of the experimental programme at NICA to investigate the QCD phase diagram and particularly the position of the critical line at large baryon-chemical potential with accurate measurements of particle multiplicities. We briefly review the present status and we outline the tasks to be accomplished both theoretically and experimentally to make hadronic abundances a sensitive probe. (orig.)

  1. Causal Diagrams for Empirical Research

    OpenAIRE

    Pearl, Judea

    1994-01-01

    The primary aim of this paper is to show how graphical models can be used as a mathematical language for integrating statistical and subject-matter information. In particular, the paper develops a principled, nonparametric framework for causal inference, in which diagrams are queried to determine if the assumptions available are sufficient for identifying causal effects from non-experimental data. If so, the diagrams can be queried to produce mathematical expressions for causal effects in ter...
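    Editor's note: as a concrete and standard example of what querying such a diagram yields, the back-door adjustment formula expresses a causal effect in terms of observational quantities when a covariate set Z blocks all back-door paths. This is textbook background, not a result quoted from the record above.

```latex
% Back-door adjustment: if Z satisfies the back-door criterion relative to (X, Y),
% the causal effect of X on Y is identified from non-experimental data as
P(y \mid \mathrm{do}(x)) \;=\; \sum_{z} P(y \mid x, z)\, P(z).
```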

  2. Wind Diagrams in Medieval Iceland

    DEFF Research Database (Denmark)

    Kedwards, Dale

    2014-01-01

    This article presents a study of the sole wind diagram that survives from medieval Iceland, preserved in the encyclopaedic miscellany in Copenhagen's Arnamagnæan Institute with the shelf mark AM 732b 4to (c. 1300-25). It examines the wind diagram and its accompanying text, an excerpt on the winds...... from Isidore of Seville's Etymologies. It also examines the perimeter of winds on two medieval Icelandic world maps, and the visual traditions from which they draw....

  3. Phase diagrams of the elements

    International Nuclear Information System (INIS)

    Young, D.A.

    1975-01-01

    A summary of the pressure-temperature phase diagrams of the elements is presented, with graphs of the experimentally determined solid-solid phase boundaries and melting curves. Comments, including theoretical discussion, are provided for each diagram. The crystal structure of each solid phase is identified and discussed. This work is aimed at encouraging further experimental and theoretical research on phase transitions in the elements

  4. Reconstructing Data Flow Diagrams from Structure Charts Based on the Input and Output Relationship

    OpenAIRE

    YAMAMOTO, Shuichiro

    1995-01-01

    The traceability of data flow diagrams against structure charts is very important for large software development. Specifying whether there is a relationship between a data flow diagram and a structure chart is a time-consuming task. Existing CASE tools provide a way to maintain traceability. If we can extract the input-output relationship of a system from a structure chart, the corresponding data flow diagram can be automatically generated from that relationship. For example, Benedusi et al. propos...

  5. Directionality analysis on functional magnetic resonance imaging during motor task using Granger causality.

    Science.gov (United States)

    Anwar, A R; Muthalib, M; Perrey, S; Galka, A; Granert, O; Wolff, S; Deuschl, G; Raethjen, J; Heute, U; Muthuraman, M

    2012-01-01

    Directionality analysis of signals originating from different parts of the brain during motor tasks has gained a lot of interest. Since brain activity can be recorded over time, methods of time series analysis can be applied to medical time series as well. Granger causality is a method to find a causal relationship between time series. Such causality can be referred to as a directional connection and is not necessarily bidirectional. The aim of this study is to differentiate between different motor tasks on the basis of activation maps and also to understand the nature of connections present between different parts of the brain. In this paper, three different motor tasks (finger tapping, simple finger sequencing, and complex finger sequencing) are analyzed. Time series for each task were extracted from functional magnetic resonance imaging (fMRI) data, which have a very good spatial resolution and can look into the sub-cortical regions of the brain. Activation maps based on fMRI images show that, in the case of complex finger sequencing, most parts of the brain are active, unlike finger tapping, during which only limited regions show activity. Directionality analysis on time series extracted from the contralateral motor cortex (CMC), supplementary motor area (SMA), and cerebellum (CER) shows bidirectional connections between these parts of the brain. In the case of simple finger sequencing and complex finger sequencing, the strongest connections originate from the SMA and CMC, while connections originating from the CER in either direction are the weakest in magnitude during all paradigms.
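    Editor's note: for readers wanting to try a pairwise version of such a directionality test, a minimal sketch using statsmodels is shown below. The ROI names, synthetic data, and lag order are illustrative assumptions; the record's actual fMRI extraction and preprocessing pipeline is not reproduced.

```python
# Minimal sketch of pairwise Granger-causality testing between two ROI
# time series (e.g., SMA -> CMC). Data and lag order are invented.
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(0)
n = 200
sma = rng.standard_normal(n)                           # stand-in SMA time series
cmc = np.roll(sma, 2) + 0.5 * rng.standard_normal(n)   # CMC lags SMA by 2 samples

# Column order matters: the test asks whether the 2nd column Granger-causes the 1st.
data = np.column_stack([cmc, sma])
results = grangercausalitytests(data, maxlag=4, verbose=False)

for lag, (tests, _) in results.items():
    f_stat, p_value, _, _ = tests["ssr_ftest"]
    print(f"lag={lag}: F={f_stat:.2f}, p={p_value:.4f}")
```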

  6. Drawing Euler Diagrams with Circles: The Theory of Piercings.

    Science.gov (United States)

    Stapleton, Gem; Leishi Zhang; Howse, John; Rodgers, Peter

    2011-07-01

    Euler diagrams are effective tools for visualizing set intersections. They have a large number of application areas ranging from statistical data analysis to software engineering. However, the automated generation of Euler diagrams has never been easy: given an abstract description of a required Euler diagram, it is computationally expensive to generate the diagram. Moreover, the generated diagrams represent sets by polygons, sometimes with quite irregular shapes that make the diagrams less comprehensible. In this paper, we address these two issues by developing the theory of piercings, where we define single piercing curves and double piercing curves. We prove that if a diagram can be built inductively by successively adding piercing curves under certain constraints, then it can be drawn with circles, which are more esthetically pleasing than arbitrary polygons. The theory of piercings is developed at the abstract level. In addition, we present a Java implementation that, given an inductively pierced abstract description, generates an Euler diagram consisting only of circles within polynomial time.

  7. Does complexity matter? Meta-analysis of learner performance in artificial grammar tasks.

    Science.gov (United States)

    Schiff, Rachel; Katan, Pesia

    2014-01-01

    Complexity has been shown to affect performance on artificial grammar learning (AGL) tasks (categorization of test items as grammatical/ungrammatical according to the implicitly trained grammar rules). However, previously published AGL experiments did not utilize consistent measures to investigate the comprehensive effect of grammar complexity on task performance. The present study focused on computerizing Bollt and Jones's (2000) technique of calculating topological entropy (TE), a quantitative measure of AGL charts' complexity, with the aim of examining associations between grammar systems' TE and learners' AGL task performance. We surveyed the literature and identified 56 previous AGL experiments based on 10 different grammars that met the sampling criteria. Using the automated matrix-lift-action method, we assigned a TE value for each of these 10 previously used AGL systems and examined its correlation with learners' task performance. The meta-regression analysis showed a significant correlation, demonstrating that the complexity effect transcended the different settings and conditions in which the categorization task was performed. The results reinforced the importance of using this new automated tool to uniformly measure grammar systems' complexity when experimenting with and evaluating the findings of AGL studies.
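    Editor's note: one common way to quantify the topological entropy of a finite-state grammar chart is as the natural log of the spectral radius of its state-transition (adjacency) matrix. The sketch below uses that standard definition with a made-up transition matrix; it is not a reimplementation of the automated matrix-lift-action method the record describes.

```python
# Sketch: topological entropy (TE) of a finite-state artificial grammar,
# taken here as log of the spectral radius of its transition matrix.
# The 4-state matrix below is invented for illustration.
import numpy as np

def topological_entropy(adjacency):
    """Natural log of the largest-magnitude eigenvalue of the transition matrix."""
    eigenvalues = np.linalg.eigvals(np.asarray(adjacency, dtype=float))
    return float(np.log(np.max(np.abs(eigenvalues))))

# Hypothetical grammar: entry (i, j) = 1 if state i can transition to state j.
A = [[0, 1, 1, 0],
     [0, 0, 1, 1],
     [1, 0, 0, 1],
     [1, 0, 0, 0]]

print(f"TE = {topological_entropy(A):.3f}")
```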

  8. A Method for Functional Task Alignment Analysis of an Arthrocentesis Simulator.

    Science.gov (United States)

    Adams, Reid A; Gilbert, Gregory E; Buckley, Lisa A; Nino Fong, Rodolfo; Fuentealba, I Carmen; Little, Erika L

    2018-05-16

    During simulation-based education, simulators are subjected to procedures composed of a variety of tasks and processes. Simulators should functionally represent a patient in response to the physical actions of these tasks. The aim of this work was to describe a method for determining whether a simulator does or does not have sufficient functional task alignment (FTA) to be used in a simulation. Potential performance checklist items were gathered from published arthrocentesis guidelines and aggregated into a performance checklist using Lawshe's method. An expert panel used this performance checklist and an FTA analysis questionnaire to evaluate a simulator's ability to respond to the physical actions required by the performance checklist. Thirteen items, from a pool of 39, were included on the performance checklist. Experts had mixed reviews of the simulator's FTA and its suitability for use in simulation. Unexpectedly, some positive FTA was found for several tasks where the simulator lacked functionality. By developing a detailed list of specific tasks required to complete a clinical procedure, and surveying experts on the simulator's response to those actions, educators can gain insight into the simulator's clinical accuracy and suitability. The unexpected positive FTA ratings for functional deficits suggest that further revision of the survey method is required.
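    Editor's note: for context, Lawshe's method aggregates expert "essential" ratings per candidate checklist item into a content validity ratio (CVR). The standard formula is shown below as general background; it is not a detail reported in the record.

```latex
% Lawshe's content validity ratio for one checklist item:
% n_e = number of panelists rating the item "essential", N = panel size.
\mathrm{CVR} \;=\; \frac{n_e - N/2}{N/2},
\qquad -1 \le \mathrm{CVR} \le 1 .
% Items with CVR below the critical value for the given N are dropped.
```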

  9. Using task analysis to generate evidence for strengthening midwifery education, practice, and regulation in Ethiopia

    Science.gov (United States)

    Yigzaw, Tegbar; Carr, Catherine; Stekelenburg, Jelle; van Roosmalen, Jos; Gibson, Hannah; Gelagay, Mintwab; Admassu, Azeb

    2016-01-01

    Purpose Realizing aspirations for meeting the global reproductive, maternal, newborn, and child health goals depends not only on increasing the numbers but also on improving the capability of midwifery workforce. We conducted a task analysis study to identify the needs for strengthening the midwifery workforce in Ethiopia. Methods We conducted a cross-sectional study of recently qualified midwives in Ethiopia. Purposively selected participants from representative geographic and practice settings completed a self-administered questionnaire, making judgments about the frequency of performance, criticality, competence, and location of training for a list of validated midwifery tasks. Using Statistical Package for the Social Sciences, Version 20, we computed the percentages and averages to describe participant and practice characteristics. We identified priority preservice education gaps by considering the tasks least frequently learned in preservice, most frequently mentioned for not being trained, and had the highest not capable response. Identification of top priorities for in-service training considered tasks with highest “not capable” and “never” done responses. We determined the licensing exam blueprint by weighing the composite mean scores for frequency and criticality variables and expert rating across practice categories. Results One hundred and thirty-eight midwives participated in the study. The majority of respondents recognized the importance of midwifery tasks (89%), felt they were capable (91.8%), reported doing them frequently (63.9%), and learned them during preservice education (56.3%). We identified competence gaps in tasks related to obstetric complications, gynecology, public health, professional duties, and prevention of mother to child transmission of HIV. Moreover, our study helped to determine composition of the licensing exam for university graduates. Conclusion The task analysis indicates that midwives provide critical reproductive

  10. Human reliability analysis of performing tasks in plants based on fuzzy integral

    International Nuclear Information System (INIS)

    Washio, Takashi; Kitamura, Yutaka; Takahashi, Hideaki

    1991-01-01

    The effective improvement of the human working conditions in nuclear power plants might be a solution for the enhancement of the operation safety. The human reliability analysis (HRA) gives a methodological basis of the improvement based on the evaluation of human reliability under various working conditions. This study investigates some difficulties of the human reliability analysis using conventional linear models and recent fuzzy integral models, and provides some solutions to the difficulties. The following practical features of the provided methods are confirmed in comparison with the conventional methods: (1) Applicability to various types of tasks (2) Capability of evaluating complicated dependencies among working condition factors (3) A priori human reliability evaluation based on a systematic task analysis of human action processes (4) A conversion scheme to probability from indices representing human reliability. (author)
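    Editor's note: fuzzy-integral models of this kind aggregate working-condition factor scores against a fuzzy measure. One common form, the Sugeno integral, is sketched below as general background rather than as the exact model used in the record.

```latex
% Sugeno (fuzzy) integral of factor scores h(x_i) with respect to a fuzzy
% measure g, with scores reordered so that h(x_{(1)}) \ge \dots \ge h(x_{(n)})
% and A_{(i)} = \{x_{(1)}, \dots, x_{(i)}\}:
\int h \circ g \;=\; \max_{i=1,\dots,n} \, \min\!\bigl( h(x_{(i)}),\, g(A_{(i)}) \bigr).
```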

  11. Collaborative diagramming during problem based learning in medical education: Do computerized diagrams support basic science knowledge construction?

    Science.gov (United States)

    De Leng, Bas; Gijlers, Hannie

    2015-05-01

    To examine how collaborative diagramming affects discussion and knowledge construction when learning complex basic science topics in medical education, including its effectiveness in the reformulation phase of problem-based learning. Opinions and perceptions of students (n = 70) and tutors (n = 4) who used collaborative diagramming in tutorial groups were collected with a questionnaire and focus group discussions. A framework derived from the analysis of discourse in computer-supported collaborative learning was used to construct the questionnaire. Video observations were used during the focus group discussions. Both students and tutors felt that collaborative diagramming positively affected discussion and knowledge construction. Students particularly appreciated that diagrams helped them to structure knowledge, to develop an overview of topics, and stimulated them to find relationships between topics. Tutors emphasized that diagramming increased interaction and enhanced the focus and detail of the discussion. Favourable conditions were the following: working with a shared whiteboard, using a diagram format that facilitated distribution, and applying half filled-in diagrams for non-content-expert tutors and/or for heterogeneous groups with low-achieving students. The empirical findings in this study support the findings of earlier, more descriptive studies that diagramming in a collaborative setting is valuable for learning complex knowledge in medicine.

  12. Articulating training methods using Job Task Analysis (JTA) - determined proficiency levels

    International Nuclear Information System (INIS)

    McDonald, B.A.

    1985-01-01

    The INPO task analysis process, as well as that of many utilities, is based on the approach used by the US Navy. This is undoubtedly due to the Navy nuclear background of many of those involved in introducing the systems approach to training to the nuclear power industry. This report outlines an approach, used by a major North-Central utility, which includes a process developed by the Air Force. Air Force task analysis and instructional system development includes the use of a proficiency code. The code includes consideration of three types of learning - task performance, task knowledge, and subject knowledge - and four levels of competence for each. The use of this classification system facilitates the identification of desired competency levels at the completion of formal training in the classroom and lab, and of informal training on the job. By using the Air Force's proficiency code, the utility's program developers were able to develop generic training for its main training facility and site-specific training at its nuclear plants, using the most efficient and cost-effective training methods.

  13. The Use Of Computational Human Performance Modeling As Task Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Jacques Hugo; David Gertman

    2012-07-01

    During a review of the Advanced Test Reactor safety basis at the Idaho National Laboratory, human factors engineers identified ergonomic and human reliability risks involving the inadvertent exposure of a fuel element to the air during manual fuel movement and inspection in the canal. There were clear indications that these risks increased the probability of human error and possible severe physical outcomes to the operator. In response to this concern, a detailed study was conducted to determine the probability of the inadvertent exposure of a fuel element. Due to practical and safety constraints, the task network analysis technique was employed to study the work procedures at the canal. Discrete-event simulation software was used to model the entire procedure as well as the salient physical attributes of the task environment, such as distances walked, the effect of dropped tools, the effect of hazardous body postures, and physical exertion due to strenuous tool handling. The model also allowed analysis of the effect of cognitive processes such as visual perception demands, auditory information and verbal communication. The model made it possible to obtain reliable predictions of operator performance and workload estimates. It was also found that operator workload as well as the probability of human error in the fuel inspection and transfer task were influenced by the concurrent nature of certain phases of the task and the associated demand on cognitive and physical resources. More importantly, it was possible to determine with reasonable accuracy the stages as well as physical locations in the fuel handling task where operators would be most at risk of losing their balance and falling into the canal. The model also provided sufficient information for a human reliability analysis that indicated that the postulated fuel exposure accident was less than credible.
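    Editor's note: to illustrate the kind of task-network, discrete-event model described here, a toy sketch using the third-party SimPy package is given below. The step names, durations, and per-step error probabilities are invented for illustration and have no connection to the actual ATR canal procedure or its analysis.

```python
# Toy task-network / discrete-event sketch of a sequential manual procedure.
# Step names, durations (minutes), and error probabilities are invented.
import random
import simpy

STEPS = [            # (name, mean duration, per-step error probability)
    ("walk to canal",        2.0, 0.001),
    ("attach handling tool", 3.0, 0.010),
    ("move fuel element",    5.0, 0.005),
    ("visual inspection",    4.0, 0.008),
    ("return element",       5.0, 0.005),
]

def procedure(env, log):
    for name, mean, p_err in STEPS:
        yield env.timeout(random.expovariate(1.0 / mean))  # stochastic duration
        if random.random() < p_err:
            log.append((env.now, name))                    # record a slip

def run_trials(n_trials=10_000, seed=1):
    random.seed(seed)
    errors = 0
    for _ in range(n_trials):
        env, log = simpy.Environment(), []
        env.process(procedure(env, log))
        env.run()
        errors += bool(log)
    return errors / n_trials

if __name__ == "__main__":
    print(f"estimated per-procedure error probability = {run_trials():.4f}")
```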

  14. Control system of the inspection robots group applying auctions and multi-criteria analysis for task allocation

    Science.gov (United States)

    Panfil, Wawrzyniec; Moczulski, Wojciech

    2017-10-01

    This paper presents a control system for a group of mobile robots intended to carry out inspection missions. The main research problem was to define a control system that facilitates cooperation among the robots, resulting in completion of the committed inspection tasks. Many of the well-known control systems use auctions for task allocation, where the subject of an auction is a task to be allocated. It seems that, in the case of missions characterized by a much larger number of tasks than robots, it is better if robots (instead of tasks) are the subjects of auctions. The second identified problem concerns one-sided robot-to-task fitness evaluation: simultaneous assessment of robot-to-task fitness and of task attractiveness for a robot should positively affect the overall effectiveness of the multi-robot system. The elaborated system allows tasks to be assigned to robots using various methods for evaluating the fitness between robots and tasks, and using several task allocation methods. A method for multi-criteria analysis is proposed, composed of two assessments: the robot's competitive position for a task among other robots, and the task's attractiveness for a robot among other tasks. Furthermore, task allocation methods applying this multi-criteria analysis are proposed. Both the elaborated system and the proposed task allocation methods were verified with the help of simulated experiments. The object under test was a group of inspection mobile robots being a virtual counterpart of a real mobile-robot group.
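    Editor's note: a minimal, hypothetical sketch of the two-sided idea (robot-to-task fitness combined with task attractiveness, followed by a greedy allocation) is given below. The fitness matrix and scoring scheme are invented for illustration and are not the authors' method.

```python
# Hypothetical sketch of two-sided task allocation: combine each robot's
# competitive position for a task (column-wise share) with the task's
# relative attractiveness for that robot (row-wise share), then allocate greedily.
import numpy as np

fitness = np.array([        # rows = robots, columns = tasks (invented values)
    [0.9, 0.2, 0.4],
    [0.5, 0.8, 0.3],
    [0.4, 0.6, 0.7],
])

position       = fitness / fitness.sum(axis=0, keepdims=True)  # vs. other robots
attractiveness = fitness / fitness.sum(axis=1, keepdims=True)  # vs. other tasks
score = position * attractiveness

assignment = {}
remaining_tasks = set(range(fitness.shape[1]))
for robot in np.argsort(-score.max(axis=1)):        # most decisive robots first
    if not remaining_tasks:
        break
    task = max(remaining_tasks, key=lambda t: score[robot, t])
    assignment[int(robot)] = int(task)
    remaining_tasks.remove(task)

print(assignment)   # {robot_index: task_index}
```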

  15. The Systems Approach to Functional Job Analysis. Task Analysis of the Physician's Assistant: Volume I--Task Analysis Methodology and Techniques.

    Science.gov (United States)

    Wake Forest Univ., Winston Salem, NC. Bowman Gray School of Medicine.

    Utilizing a systematic sampling technique, the professional activities of small groups of pediatricians, family practitioners, surgeons, obstetricians, and internists were observed for 4 or 5 days by a medical student who checked a prearranged activity sheet every 30 seconds to: (1) identify those tasks and activities an assistant could be trained…

  16. Multi-task linear programming discriminant analysis for the identification of progressive MCI individuals.

    Directory of Open Access Journals (Sweden)

    Guan Yu

    Accurately identifying mild cognitive impairment (MCI) individuals who will progress to Alzheimer's disease (AD) is very important for making early interventions. Many classification methods focus on integrating multiple imaging modalities such as magnetic resonance imaging (MRI) and fluorodeoxyglucose positron emission tomography (FDG-PET). However, the main challenge for MCI classification using multiple imaging modalities is the existence of a lot of missing data in many subjects. For example, in the Alzheimer's Disease Neuroimaging Initiative (ADNI) study, almost half of the subjects do not have PET images. In this paper, we propose a new and flexible binary classification method, namely Multi-task Linear Programming Discriminant (MLPD) analysis, for the incomplete multi-source feature learning. Specifically, we decompose the classification problem into different classification tasks, i.e., one for each combination of available data sources. To solve all different classification tasks jointly, our proposed MLPD method links them together by constraining them to achieve the similar estimated mean difference between the two classes (under classification) for those shared features. Compared with the state-of-the-art incomplete Multi-Source Feature (iMSF) learning method, instead of constraining different classification tasks to choose a common feature subset for those shared features, MLPD can flexibly and adaptively choose different feature subsets for different classification tasks. Furthermore, our proposed MLPD method can be efficiently implemented by linear programming. To validate our MLPD method, we perform experiments on the ADNI baseline dataset with the incomplete MRI and PET images from 167 progressive MCI (pMCI) subjects and 226 stable MCI (sMCI) subjects. We further compared our method with the iMSF method (using incomplete MRI and PET images) and also the single-task classification method (using only MRI or only subjects with both MRI and

  17. Task-evoked brain functional magnetic susceptibility mapping by independent component analysis (χICA).

    Science.gov (United States)

    Chen, Zikuan; Calhoun, Vince D

    2016-03-01

    Conventionally, independent component analysis (ICA) is performed on an fMRI magnitude dataset to analyze brain functional mapping (AICA). By solving the inverse problem of fMRI, we can reconstruct the brain magnetic susceptibility (χ) functional states. Upon the reconstructed χ dataspace, we propose an ICA-based brain functional χ mapping method (χICA) to extract task-evoked brain functional map. A complex division algorithm is applied to a timeseries of fMRI phase images to extract temporal phase changes (relative to an OFF-state snapshot). A computed inverse MRI (CIMRI) model is used to reconstruct a 4D brain χ response dataset. χICA is implemented by applying a spatial InfoMax ICA algorithm to the reconstructed 4D χ dataspace. With finger-tapping experiments on a 7T system, the χICA-extracted χ-depicted functional map is similar to the SPM-inferred functional χ map by a spatial correlation of 0.67 ± 0.05. In comparison, the AICA-extracted magnitude-depicted map is correlated with the SPM magnitude map by 0.81 ± 0.05. The understanding of the inferiority of χICA to AICA for task-evoked functional map is an ongoing research topic. For task-evoked brain functional mapping, we compare the data-driven ICA method with the task-correlated SPM method. In particular, we compare χICA with AICA for extracting task-correlated timecourses and functional maps. χICA can extract a χ-depicted task-evoked brain functional map from a reconstructed χ dataspace without the knowledge about brain hemodynamic responses. The χICA-extracted brain functional χ map reveals a bidirectional BOLD response pattern that is unavailable (or different) from AICA. Copyright © 2016 Elsevier B.V. All rights reserved.
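
    A minimal sketch of the spatial-ICA step on a reconstructed 4D dataset, using scikit-learn's FastICA as a stand-in for the spatial InfoMax ICA named in the abstract; the array sizes, component count and random data below are placeholders, not the study's χ reconstruction.

```python
# Spatial ICA sketch: voxels are treated as samples so the estimated sources are
# spatially independent maps; the mixing matrix holds the associated time courses.
import numpy as np
from sklearn.decomposition import FastICA

chi = np.random.randn(16, 16, 8, 120)            # toy 4D dataspace: 16x16x8 voxels, 120 volumes
X = chi.reshape(-1, chi.shape[-1])               # (voxels, time)

ica = FastICA(n_components=10, random_state=0)
spatial_maps = ica.fit_transform(X)              # (voxels, components): independent spatial maps
timecourses = ica.mixing_                        # (time, components): component time courses

# A task-related component would then be selected by correlating each time course
# with the task paradigm (not shown here).
print(spatial_maps.shape, timecourses.shape)
```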

  18. Multi-task linear programming discriminant analysis for the identification of progressive MCI individuals.

    Science.gov (United States)

    Yu, Guan; Liu, Yufeng; Thung, Kim-Han; Shen, Dinggang

    2014-01-01

    Accurately identifying mild cognitive impairment (MCI) individuals who will progress to Alzheimer's disease (AD) is very important for making early interventions. Many classification methods focus on integrating multiple imaging modalities such as magnetic resonance imaging (MRI) and fluorodeoxyglucose positron emission tomography (FDG-PET). However, the main challenge for MCI classification using multiple imaging modalities is the existence of a lot of missing data in many subjects. For example, in the Alzheimer's Disease Neuroimaging Initiative (ADNI) study, almost half of the subjects do not have PET images. In this paper, we propose a new and flexible binary classification method, namely Multi-task Linear Programming Discriminant (MLPD) analysis, for the incomplete multi-source feature learning. Specifically, we decompose the classification problem into different classification tasks, i.e., one for each combination of available data sources. To solve all different classification tasks jointly, our proposed MLPD method links them together by constraining them to achieve the similar estimated mean difference between the two classes (under classification) for those shared features. Compared with the state-of-the-art incomplete Multi-Source Feature (iMSF) learning method, instead of constraining different classification tasks to choose a common feature subset for those shared features, MLPD can flexibly and adaptively choose different feature subsets for different classification tasks. Furthermore, our proposed MLPD method can be efficiently implemented by linear programming. To validate our MLPD method, we perform experiments on the ADNI baseline dataset with the incomplete MRI and PET images from 167 progressive MCI (pMCI) subjects and 226 stable MCI (sMCI) subjects. We further compared our method with the iMSF method (using incomplete MRI and PET images) and also the single-task classification method (using only MRI or only subjects with both MRI and PET images
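
    As a rough illustration of the linear-programming flavour of this approach, the sketch below solves a toy single-task linear-programming discriminant: minimise the L1 norm of the weight vector subject to a unit projected mean difference between the two classes. MLPD couples several such tasks through constraints on shared features; that coupling and the authors' exact objective are not reproduced here, and the data are synthetic.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(1)
X_pos = rng.normal(0.5, 1.0, (40, 6))        # e.g. pMCI subjects, 6 features
X_neg = rng.normal(0.0, 1.0, (60, 6))        # e.g. sMCI subjects
d = X_pos.mean(axis=0) - X_neg.mean(axis=0)  # estimated mean difference between the classes

p = d.size
# Variables: w = u - v with u, v >= 0; objective: minimise ||w||_1 = sum(u) + sum(v).
c = np.ones(2 * p)
# Constraint d @ w >= 1, rewritten for linprog's A_ub @ x <= b_ub form.
A_ub = np.concatenate([-d, d]).reshape(1, -1)
b_ub = np.array([-1.0])
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (2 * p), method="highs")

w = res.x[:p] - res.x[p:]                    # sparse discriminant direction
print("weights:", np.round(w, 3), " projected difference:", round(float(d @ w), 3))
```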

  19. BWR stability analysis: methodology of the stability analysis and results of PSI for the NEA/NCR benchmark task

    International Nuclear Information System (INIS)

    Hennig, D.; Nechvatal, L.

    1996-09-01

    The report describes the PSI stability analysis methodology and the validation of this methodology based on the international OECD/NEA BWR stability benchmark task. In the frame of this work, the stability properties of some operation points of the NPP Ringhals 1 have been analysed and compared with the experimental results. (author) figs., tabs., 45 refs

  20. Do Tasks Make a Difference? Accounting for Heterogeneity of Performance of Children with Reading Difficulties on Tasks of Executive Function: Findings from a Meta-Analysis

    Science.gov (United States)

    Booth, Josephine N.; Boyle, James M. E.; Kelly, Steve W.

    2010-01-01

    Research studies have implicated executive functions in reading difficulties (RD). But while some studies have found children with RD to be impaired on tasks of executive function other studies report unimpaired performance. A meta-analysis was carried out to determine whether these discrepant findings can be accounted for by differences in the…

  1. ANALYSIS OF THE GAZE BEHAVIOUR OF THE WORKER ON THE CARBURETOR ASSEMBLY TASK

    Directory of Open Access Journals (Sweden)

    Novie Susanto

    2015-06-01

    Full Text Available This study presents an analysis of the area of interest (AOI) and the gaze behavior of humans during an assembly task. The study aims at investigating human behavior in detail using an eye-tracking system during assembly tasks with LEGO bricks and with an actual manufactured product, a carburetor. An analysis using heat map data based on the videos recorded with the eye-tracking system is used to examine and investigate the gaze behavior. The results of this study show that the carburetor assembly requires more attention than the product made from LEGO bricks. About 50% of the participants felt the need to visually inspect the interim state of the work object during the simulation of the assembly sequence on the screen. They also showed a tendency to want to be more certain about part fitting in the actual work object.

  2. Geometry Helps to Compare Persistence Diagrams

    Energy Technology Data Exchange (ETDEWEB)

    Kerber, Michael; Morozov, Dmitriy; Nigmetov, Arnur

    2015-11-16

    Exploiting geometric structure to improve the asymptotic complexity of discrete assignment problems is a well-studied subject. In contrast, the practical advantages of using geometry for such problems have not been explored. We implement geometric variants of the Hopcroft-Karp algorithm for bottleneck matching (based on previous work by Efrat et al.), and of the auction algorithm by Bertsekas for Wasserstein distance computation. Both implementations use k-d trees to replace a linear scan with a geometric proximity query. Our interest in this problem stems from the desire to compute distances between persistence diagrams, a problem that comes up frequently in topological data analysis. We show that our geometric matching algorithms lead to a substantial performance gain, both in running time and in memory consumption, over their purely combinatorial counterparts. Moreover, our implementation significantly outperforms the only other implementation available for comparing persistence diagrams.
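
    A minimal sketch of the geometric ingredient described above: nearest-neighbour queries between the points of two persistence diagrams are answered with a k-d tree instead of a linear scan. The diagrams are random placeholders, and the full bottleneck/Wasserstein matching machinery (including projections onto the diagonal) is omitted.

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(2)
births_a = rng.random(1000)
births_b = rng.random(1000)
diag_a = np.c_[births_a, births_a + rng.random(1000)]   # (birth, death) points of diagram A
diag_b = np.c_[births_b, births_b + rng.random(1000)]   # (birth, death) points of diagram B

tree = cKDTree(diag_b)                                   # built once in O(n log n)
dist, nearest = tree.query(diag_a, p=np.inf)             # L-infinity nearest neighbours, no linear scan

print("largest nearest-neighbour distance:", float(dist.max()))
```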

  3. Change Best: Task 2.3. Analysis of policy mix and development of Energy Efficiency Services

    International Nuclear Information System (INIS)

    Boonekamp, P.; Vethman, P.

    2010-04-01

    The aim of the Change Best project is to promote the development of an energy efficiency service (EES) market and to give good practice examples of changes in energy service business, strategies, and supportive policies and measures in the course of the implementation of Directive 2006/32/EC on Energy End-Use Efficiency and Energy Services. This report addresses task 2.3: Analysis of policy mix and development of Energy Efficiency Services.

  4. Development of calibration training and procedures using job-task analysis

    Energy Technology Data Exchange (ETDEWEB)

    Smith, R.A.

    1993-12-01

    Efforts to handle an increased workload with dwindling manpower in the Physical and Electrical Standards Laboratory (Standards Lab) at the Oak Ridge Y-12 Plant are described. Empowerment of workers via Total Quality Management (TQM) is the basis for these efforts. A survey followed by team work was the chosen course of action. The resulting job-task analysis received honors from peers at the Y-12 Plant.

  5. Differential recruitment of theory of mind brain network across three tasks: An independent component analysis.

    Science.gov (United States)

    Thye, Melissa D; Ammons, Carla J; Murdaugh, Donna L; Kana, Rajesh K

    2018-07-16

    Social neuroscience research has focused on an identified network of brain regions primarily associated with processing Theory of Mind (ToM). However, ToM is a broad cognitive process, which encompasses several sub-processes, such as mental state detection and intentional attribution, and the connectivity of brain regions underlying the broader ToM network in response to paradigms assessing these sub-processes requires further characterization. Standard fMRI analyses which focus only on brain activity cannot capture information about ToM processing at a network level. An alternative method, independent component analysis (ICA), is a data-driven technique used to isolate intrinsic connectivity networks, and this approach provides insight into network-level regional recruitment. In this fMRI study, three complementary, but distinct ToM tasks assessing mental state detection (e.g. RMIE: Reading the Mind in the Eyes; RMIV: Reading the Mind in the Voice) and intentional attribution (Causality task) were each analyzed using ICA in order to separately characterize the recruitment and functional connectivity of core nodes in the ToM network in response to the sub-processes of ToM. Based on visual comparison of the derived networks for each task, the spatiotemporal network patterns were similar between the RMIE and RMIV tasks, which elicited mentalizing about the mental states of others, and these networks differed from the network derived for the Causality task, which elicited mentalizing about goal-directed actions. The medial prefrontal cortex, precuneus, and right inferior frontal gyrus were seen in the components with the highest correlation with the task condition for each of the tasks highlighting the role of these regions in general ToM processing. Using a data-driven approach, the current study captured the differences in task-related brain response to ToM in three distinct ToM paradigms. The findings of this study further elucidate the neural mechanisms associated

  6. Concurrent multidisciplinary mechanical design based on design task analysis and knowledge sharing; Sekkei task bunseki to joho kyoyu ni yoru mechatronics kyocho sekkei

    Energy Technology Data Exchange (ETDEWEB)

    Kondo, K.; Ozawa, M.; Mori, T. [Toshiba Corp., Tokyo (Japan)

    1999-09-01

    We have developed a systematic design task planning method based on a design structure matrix (DSM) and a lumped model-based framework for knowledge sharing in a concurrent design environment as key techniques for developing higher quality products in a shorter design time. The DSM facilitates systematic analysis of dependencies among design tasks and optimization of the design process. The framework based on a lumped model description of mechanical systems enables concurrent and cooperative work among multidisciplinary designers at an early stage of the design process. In this paper, we also discuss the relationships between these techniques and the product development flow from product definition to detailed design. (author)
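
    A minimal sketch of what such a DSM looks like in code, with invented task names and dependencies: tasks whose inputs are already available are sequenced, and whatever remains indicates a coupled (iterative) block that needs concurrent treatment.

```python
import numpy as np

tasks = ["define specs", "mechanical layout", "control design", "prototype", "test"]
# dsm[i][j] = 1 means task i needs an output of task j.
dsm = np.array([
    [0, 0, 0, 0, 0],
    [1, 0, 1, 0, 0],   # mechanical layout depends on specs and control design
    [1, 1, 0, 0, 0],   # control design depends on specs and mechanical layout (coupled)
    [0, 1, 1, 0, 0],
    [0, 0, 0, 1, 0],
])

remaining, order = set(range(len(tasks))), []
while remaining:
    ready = [i for i in remaining
             if not any(dsm[i][j] for j in remaining if j != i)]
    if not ready:                         # every remaining task still waits on another one
        break
    order += ready
    remaining -= set(ready)

print("sequenced:", [tasks[i] for i in order])
print("coupled or blocked:", [tasks[i] for i in remaining])
```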

  7. Using non-linear analogue of Nyquist diagrams for analysis of the equation describing the hemodynamics in blood vessels near pathologies

    Science.gov (United States)

    Cherevko, A. A.; Bord, E. E.; Khe, A. K.; Panarin, V. A.; Orlov, K. J.; Chupakhin, A. P.

    2016-06-01

    This article considers a method for describing the behaviour of hemodynamic parameters near vascular pathologies. We study the influence of arterial aneurysms and arteriovenous malformations on the vascular system. The proposed method uses a generalized Van der Pol-Duffing model to capture the characteristic behaviour of the blood flow parameters, namely the blood velocity and pressure in the vessel. The velocity and pressure are obtained during neurosurgical measurements. It is noted that substituting the velocity into the right-hand side of the equation gives a good approximation of the pressure; thus, the model reproduces the clinical data well enough. The right-hand side of the equation represents an external impact on the system. Harmonic functions with various frequencies and amplitudes are substituted into the right-hand side of the equation to investigate its properties; varying the right-hand-side parameters also provides additional information about the pressure. A non-linear analogue of Nyquist diagrams is used to find out how the properties of the solution depend on the parameter values. We have analysed 60 cases with aneurysms and 14 cases with arteriovenous malformations. It is shown that the diagrams fall into classes, and that the classes replace one another in a definite order as the amplitude of the right-hand side increases.
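
    A minimal sketch of a harmonically forced Van der Pol-Duffing oscillator of the kind referred to above; the coefficients and forcing parameters are placeholders, not values fitted to clinical velocity or pressure data.

```python
import numpy as np
from scipy.integrate import solve_ivp

mu, alpha, beta = 1.0, 1.0, 0.5        # assumed damping and stiffness coefficients
A, omega = 0.8, 2.0                    # amplitude and frequency of the harmonic forcing

def rhs(t, y):
    x, v = y
    return [v, mu * (1.0 - x**2) * v - alpha * x - beta * x**3 + A * np.sin(omega * t)]

sol = solve_ivp(rhs, (0.0, 100.0), [0.1, 0.0], max_step=0.01)
# Repeating the integration over a grid of (A, omega) values and plotting the response
# against the forcing gives the kind of non-linear Nyquist-analogue diagram discussed above.
print(sol.y.shape)
```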

  8. New detectors for powders diagrams

    International Nuclear Information System (INIS)

    Convert, P.

    1975-01-01

    During the last few years, all the classical neutron diffractometers for powders have used one or maybe a few counters. It therefore takes a long time to obtain a diagram, which causes many disadvantages: 1) very long experiments, one or two days (for a flux on the sample of about 10⁶ n/cm²/a); 2) the necessity of big samples, many cm³; 3) the necessity of having the whole diagram before changing anything in the experiment: magnetic field, temperature, quality of the sample; 4) the necessity of having collimators of a few tens of minutes to obtain correct statistics in the diagram. Because of these disadvantages, several attempts have been made to speed up the experimental procedure, such as using more counters, the detection of neutrons on a resistive wire, etc. In Grenoble, new position-sensitive detectors have been constructed using a digital technique

  9. Proposal of Constraints Analysis Method Based on Network Model for Task Planning

    Science.gov (United States)

    Tomiyama, Tomoe; Sato, Tatsuhiro; Morita, Toyohisa; Sasaki, Toshiro

    Deregulation has been accelerating several activities aimed at reengineering business processes, such as railway through-services and the modal shift in logistics. To make those activities successful, business entities have to define new business rules or know-how (we call them 'constraints'). According to the new constraints, they need to manage business resources such as instruments, materials, workers and so on. In this paper, we propose a constraint analysis method for defining the constraints used in task planning of the new business processes. To visualize each constraint's influence on planning, we propose a network model which represents allocation relations between tasks and resources; the network can also represent task ordering relations and resource grouping relations. The proposed method formalizes the way constraints are defined manually, as a process of repeatedly checking the network structure and finding conflicts between constraints. Application to crew scheduling problems shows that the method can adequately represent and define the constraints of task planning problems with the following fundamental features: (1) specifying a work pattern for some resources, (2) restricting the number of resources for some works, (3) requiring multiple resources for some works, (4) prior allocation of some resources to some works and (5) considering the workload balance between resources.
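
    A minimal sketch of such a task-resource network with two of the constraint types listed above checked in code; the tasks, resources and constraint values are invented placeholders.

```python
tasks = {"inspect_line": {"requires": 2},          # feature (3): some works need multiple resources
         "drive_shift":  {"requires": 1}}
resources = {"crew_A": {"pattern": "day"},         # feature (1): work-pattern restriction
             "crew_B": {"pattern": "night"},
             "crew_C": {"pattern": "day"}}
allocation = {"inspect_line": ["crew_A", "crew_C"],   # allocation relations (task -> resources)
              "drive_shift":  ["crew_B"]}
prior = {"drive_shift": "crew_B"}                  # feature (4): prior allocation of a resource

def conflicts():
    issues = []
    for task, allocated in allocation.items():
        need = tasks[task]["requires"]
        if len(allocated) != need:
            issues.append(f"{task}: needs {need} resource(s), has {len(allocated)}")
        if task in prior and prior[task] not in allocated:
            issues.append(f"{task}: prior allocation of {prior[task]} violated")
    return issues

print(conflicts() or "no conflicts found")
```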

  10. Analysis of Time-Frequency EEG Feature Extraction Methods for Mental Task Classification

    Directory of Open Access Journals (Sweden)

    Caglar Uyulan

    2017-01-01

    Full Text Available Many endogenous and external components may affect the physiological, mental and behavioral states in humans. Monitoring tools are required to evaluate biomarkers, identify biological events, and predict their outcomes. Being one of the valuable indicators, brain biomarkers derived from temporal or spectral electroencephalography (EEG) signal processing allow for the classification of mental disorders and mental tasks. An EEG signal has a nonstationary nature and individual frequency features, hence it can be concluded that each subject has peculiar timing and data from which to extract unique features. In order to classify data collected while performing four mental tasks (reciting the alphabet backwards, imagining the rotation of a cube, imagining right-hand movements (open/close), and performing mathematical operations), discriminative features were extracted using four competitive time-frequency techniques: Wavelet Packet Decomposition (WPD), Morlet Wavelet Transform (MWT), Short Time Fourier Transform (STFT) and Wavelet Filter Bank (WFB), respectively. The extracted features, using both time- and frequency-domain information, were then reduced using principal component analysis for subset reduction. Finally, the reduced subsets were fed into a multi-layer perceptron neural network (MLP-NN) trained with the back propagation (BP) algorithm to generate a predictive model. This study mainly focuses on comparing the relative performance of time-frequency feature extraction methods used to classify mental tasks. The real-time (RT) experimental results underlined that the WPD feature extraction method outperforms the three other aforementioned methods, with 92% classification accuracy for the four different mental tasks.
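
    A minimal sketch of the best-performing pipeline named above: wavelet packet decomposition features, PCA reduction, and an MLP classifier. The EEG epochs here are random placeholders standing in for recorded trials of the four mental tasks.

```python
import numpy as np
import pywt
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(3)
epochs = rng.standard_normal((200, 512))          # 200 single-channel epochs, 512 samples each
labels = rng.integers(0, 4, 200)                  # four mental tasks

def wpd_features(x, level=4):
    wp = pywt.WaveletPacket(data=x, wavelet="db4", maxlevel=level)
    # Log energy of each terminal sub-band as the feature vector.
    return np.array([np.log(np.sum(node.data ** 2) + 1e-12)
                     for node in wp.get_level(level, order="freq")])

X = np.array([wpd_features(e) for e in epochs])
X = PCA(n_components=8).fit_transform(X)          # subset reduction
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0).fit(X, labels)
print("training accuracy:", clf.score(X, labels))
```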

  11. Hawaii Energy Strategy Project 2: Fossil Energy Review. Task IV. Scenario development and analysis

    Energy Technology Data Exchange (ETDEWEB)

    Yamaguchi, N.D.; Breazeale, K. [ed.]

    1993-12-01

    The Hawaii Energy Strategy (HES) Program is a seven-project effort led by the State of Hawaii Department of Business, Economic Development & Tourism (DBEDT) to investigate a wide spectrum of Hawaii energy issues. The East-West Center's Program on Resources: Energy and Minerals, has been assigned HES Project 2, Fossil Energy Review, which focuses on fossil energy use in Hawaii and the greater regional and global markets. HES Project 2 has four parts: Task I (World and Regional Fossil Energy Dynamics) covers petroleum, natural gas, and coal in global and regional contexts, along with a discussion of energy and the environment. Task II (Fossil Energy in Hawaii) focuses more closely on fossil energy use in Hawaii: current utilization and trends, the structure of imports, possible future sources of supply, fuel substitutability, and energy security. Task III's emphasis is Greenfield Options; that is, fossil energy sources not yet used in Hawaii. This task is divided into two sections: first, an in-depth "Assessment of Coal Technology Options and Implications for the State of Hawaii," along with a spreadsheet analysis model, which was subcontracted to the Environmental Assessment and Information Sciences Division of Argonne National Laboratory; and second, a chapter on liquefied natural gas (LNG) in the Asia-Pacific market and the issues surrounding possible introduction of LNG into the Hawaii market.

  12. Multi-currency Influence Diagrams

    DEFF Research Database (Denmark)

    Nielsen, Søren Holbech; Nielsen, Thomas Dyhre; Jensen, Finn V.

    2007-01-01

    When using the influence diagrams framework for solving a decision problem with several different quantitative utilities, the traditional approach has been to convert the utilities into one common currency. This conversion is carried out using a tacit transformation, under the assumption that the converted problem is equivalent to the original one. In this paper we present an extension of the influence diagram framework. The extension allows for these decision problems to be modelled in their original form. We present an algorithm that, given a linear conversion function between the currencies...

  13. Diagrams for symmetric product orbifolds

    International Nuclear Information System (INIS)

    Pakman, Ari; Rastelli, Leonardo; Razamat, Shlomo S.

    2009-01-01

    We develop a diagrammatic language for symmetric product orbifolds of two-dimensional conformal field theories. Correlation functions of twist operators are written as sums of diagrams: each diagram corresponds to a branched covering map from a surface where the fields are single-valued to the base sphere where twist operators are inserted. This diagrammatic language facilitates the study of the large N limit and makes more transparent the analogy between symmetric product orbifolds and free non-abelian gauge theories. We give a general algorithm to calculate the leading large N contribution to four-point correlators of twist fields.

  14. Visualizing weighted networks: a performance comparison of adjacency matrices versus node-link diagrams

    Science.gov (United States)

    McIntire, John P.; Osesina, O. Isaac; Bartley, Cecilia; Tudoreanu, M. Eduard; Havig, Paul R.; Geiselman, Eric E.

    2012-06-01

    Ensuring the proper and effective ways to visualize network data is important for many areas of academia, applied sciences, the military, and the public. Fields such as social network analysis, genetics, biochemistry, intelligence, cybersecurity, neural network modeling, transit systems, communications, etc. often deal with large, complex network datasets that can be difficult to interact with, study, and use. There have been surprisingly few human factors performance studies on the relative effectiveness of different graph drawings or network diagram techniques to convey information to a viewer. This is particularly true for weighted networks which include the strength of connections between nodes, not just information about which nodes are linked to other nodes. We describe a human factors study in which participants performed four separate network analysis tasks (finding a direct link between given nodes, finding an interconnected node between given nodes, estimating link strengths, and estimating the most densely interconnected nodes) on two different network visualizations: an adjacency matrix with a heat-map versus a node-link diagram. The results should help shed light on effective methods of visualizing network data for some representative analysis tasks, with the ultimate goal of improving usability and performance for viewers of network data displays.

  15. Construction of mammographic examination process ontology using bottom-up hierarchical task analysis.

    Science.gov (United States)

    Yagahara, Ayako; Yokooka, Yuki; Jiang, Guoqian; Tsuji, Shintarou; Fukuda, Akihisa; Nishimoto, Naoki; Kurowarabi, Kunio; Ogasawara, Katsuhiko

    2018-03-01

    Describing complex mammography examination processes is important for improving the quality of mammograms. It is often difficult for experienced radiologic technologists to explain the process because their techniques depend on their experience and intuition. In our previous study, we analyzed the process using a new bottom-up hierarchical task analysis and identified key components of the process. Leveraging the results of the previous study, the purpose of this study was to construct a mammographic examination process ontology to formally describe the relationships between the process and image evaluation criteria to improve the quality of mammograms. First, we identified and created root classes: task, plan, and clinical image evaluation (CIE). Second, we described an "is-a" relation referring to the result of the previous study and the structure of the CIE. Third, the procedural steps in the ontology were described using the new properties: "isPerformedBefore," "isPerformedAfter," and "isPerformedAfterIfNecessary." Finally, the relationships between tasks and CIEs were described using the "isAffectedBy" property to represent the influence of the process on image quality. In total, there were 219 classes in the ontology. By introducing new properties related to the process flow, a sophisticated mammography examination process could be visualized. In relationships between tasks and CIEs, it became clear that the tasks affecting the evaluation criteria related to positioning were greater in number than those for image quality. We developed a mammographic examination process ontology that makes knowledge explicit for a comprehensive mammography process. Our research will support education and help promote knowledge sharing about mammography examination expertise.
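
    A minimal sketch of how a few of the classes and properties named above could be encoded, here with rdflib and an assumed namespace; the individual tasks and the criterion are made-up examples, not entries from the authors' 219-class ontology.

```python
from rdflib import Graph, Namespace, RDF, RDFS

MEX = Namespace("http://example.org/mammo#")      # assumed namespace, not the authors' IRI
g = Graph()

for cls in ("Task", "Plan", "ClinicalImageEvaluation"):
    g.add((MEX[cls], RDF.type, RDFS.Class))
for prop in ("isPerformedBefore", "isPerformedAfter",
             "isPerformedAfterIfNecessary", "isAffectedBy"):
    g.add((MEX[prop], RDF.type, RDF.Property))

# Illustrative instances: positioning is performed before compression, and a
# positioning-related evaluation criterion is affected by the positioning task.
g.add((MEX.Positioning, RDF.type, MEX.Task))
g.add((MEX.Compression, RDF.type, MEX.Task))
g.add((MEX.PositioningCriterion, RDF.type, MEX.ClinicalImageEvaluation))
g.add((MEX.Positioning, MEX.isPerformedBefore, MEX.Compression))
g.add((MEX.PositioningCriterion, MEX.isAffectedBy, MEX.Positioning))

print(g.serialize(format="turtle"))
```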

  16. [Comparison of film-screen combinations with contrast detail diagram and interactive image analysis. 2: Linear assessment of grey scale ranges with interactive image analysis].

    Science.gov (United States)

    Stamm, G; Eichbaum, G; Hagemann, G

    1997-09-01

    The following three screen-film combinations were compared: a) a combination of anticrossover film and UV-light emitting screens, b) a combination of blue-light emitting screens and film, and c) a conventional green fluorescing screen-film combination. Radiographs of a specially designed plexiglass phantom (0.2 x 0.2 x 0.12 m³) with bar patterns of lead and plaster and of air, respectively, were obtained using the following parameters: 12-pulse generator, 0.6 mm focus size, 4.7 mm aluminum pre-filter, a grid with 40 lines/cm (12:1) and a focus-detector distance of 1.15 m. Image analysis was performed using an IBAS system and a Zeiss Kontron computer. Display conditions were the following: display distance 0.12 m, a vario film objective 35/70 (Zeiss), a video camera tube with a PbO photocathode, 625 lines (Siemens Heimann), and an IBAS image matrix of 512 x 512 pixels with a resolution of 7 lines/mm; the projected matrix area was 5000 µm². Grey scale ranges were measured on a line perpendicular to the grouped bar patterns. The difference between the maximum and minimum density value served as the signal. The spatial resolution of the detector system was measured where the signal value was three times higher than the standard deviation of the means of multiple density measurements. The results showed considerable advantages of the two new screen-film combinations as compared to the conventional screen-film combination. This result contradicts the findings obtained with purely visual assessment of thresholds (part I), which had found no differences. The authors concluded that (automatic) interactive image analysis algorithms serve as an objective measure and are specifically advantageous when small differences in image quality are to be evaluated.

  17. Correlating behavioral responses to FMRI signals from human prefrontal cortex: examining cognitive processes using task analysis.

    Science.gov (United States)

    DeSouza, Joseph F X; Ovaysikia, Shima; Pynn, Laura

    2012-06-20

    The aim of this methods paper is to describe how to implement a neuroimaging technique to examine complementary brain processes engaged by two similar tasks. Participants' behavior during task performance in an fMRI scanner can then be correlated with the brain activity using the blood-oxygen-level-dependent signal. We measure behavior so that correct trials, where the subject performed the task correctly, can be sorted and the brain signals related to correct performance examined. Conversely, if subjects do not perform the task correctly and these trials are included in the same analysis as the correct trials, we would introduce trials that do not reflect correct performance; in many cases these error trials can themselves be correlated with brain activity. We describe two complementary tasks that are used in our lab to examine the brain during suppression of an automatic response: the Stroop(1) and anti-saccade tasks. The emotional Stroop paradigm instructs participants to either report the superimposed emotional 'word' across the affective faces or the facial 'expressions' of the face stimuli(1,2). When the word and the facial expression refer to different emotions, a conflict between what must be said and what is automatically read occurs. The participant has to resolve the conflict between two simultaneously competing processes of word reading and facial expression. Our urge to read out a word leads to strong 'stimulus-response (SR)' associations; hence inhibiting these strong SRs is difficult and participants are prone to making errors. Overcoming this conflict and directing attention away from the face or the word requires the subject to inhibit bottom-up processes which typically direct attention to the more salient stimulus. Similarly, in the anti-saccade task(3,4,5,6), an instruction cue is used to direct attention to a peripheral stimulus location, but the eye movement is then made to the mirror opposite position

  18. Thermal-Hydraulic Analysis Tasks for ANAV NPPs in Support of Plant Operation and Control

    Directory of Open Access Journals (Sweden)

    L. Batet

    2007-11-01

    Full Text Available Thermal-hydraulic analysis tasks aimed at supporting plant operation and control of nuclear power plants are an important issue for the Asociación Nuclear Ascó-Vandellòs (ANAV). ANAV is the consortium that runs the Ascó power plants (2 units) and the Vandellòs-II power plant. The reactors are Westinghouse-design, 3-loop PWRs with an approximate electrical power of 1000 MW. The Technical University of Catalonia (UPC) thermal-hydraulic analysis team has worked jointly with ANAV engineers at different levels in the analysis and improvement of these reactors. This article is an illustration of the usefulness of computational analysis for operational support. The contents presented were operational between 1985 and 2001 and subsequently changed slightly following various organizational adjustments. The paper has two parts. The first part describes the specific aspects of thermal-hydraulic analysis tasks related to operation and control, and the second part briefly presents the results of three examples of analyses that were performed. All the presented examples are related to actual situations in which the scenarios were studied by analysts using thermal-hydraulic codes and prepared nodalizations. The paper also includes a qualitative evaluation of the benefits obtained by ANAV through thermal-hydraulic analyses aimed at supporting operation and plant control.

  19. Fuzzy logic approach to SWOT analysis for economics tasks and example of its computer realization

    Directory of Open Access Journals (Sweden)

    Vladimir CHERNOV

    2016-07-01

    Full Text Available The article discusses the widely used classical method of analysis, forecasting and decision-making in various economic problems known as SWOT analysis. As is known, it is a qualitative, multi-criteria comparison of the degree of Strength, Weakness, Opportunity and Threat for different kinds of risks, for forecasting market development, and for assessing the status and prospects of enterprises, regions, economic sectors, territories, etc. It can also be successfully applied to the evaluation and analysis of different project management tasks: investment, innovation, marketing, development, design, bringing products to market, and so on. However, in practical competitive market and economic conditions there are various uncertainties, ambiguities and kinds of vagueness, which make the use of SWOT analysis in its classical sense insufficiently justified and ineffective. In this case, the authors propose to use a fuzzy logic approach and the theory of fuzzy sets for a more adequate representation and post-processing of assessments in the SWOT analysis. In particular, the mathematical formulation of the corresponding task and the main approaches to its solution are briefly presented. Examples of suitable computer calculations in the specialized software Fuzicalc for processing and operating with fuzzy input data are also given. Finally, considerations for the interpretation of the results are presented.
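
    A minimal sketch of the fuzzy idea: each SWOT factor is scored with a triangular fuzzy number instead of a crisp value, category scores are aggregated, and a simple centroid defuzzification yields a crisp figure. The factors and numbers are illustrative only and do not follow Fuzicalc's internals.

```python
import numpy as np

swot = {                                     # (low, mode, high) triangular assessments
    "Strength":    [(6, 8, 9), (5, 7, 8)],
    "Weakness":    [(3, 4, 6)],
    "Opportunity": [(4, 6, 8), (5, 6, 7)],
    "Threat":      [(2, 3, 5)],
}

def aggregate(factors):
    return np.array(factors, dtype=float).mean(axis=0)   # component-wise mean of the triangles

def defuzzify(tri):
    low, mode, high = tri
    return (low + mode + high) / 3.0                      # centroid of a triangular fuzzy number

for category, factors in swot.items():
    print(f"{category:12s} {defuzzify(aggregate(factors)):.2f}")
```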

  20. Project Management Plan for the INEL technology logic diagrams

    International Nuclear Information System (INIS)

    Rudin, M.J.

    1992-10-01

    This Project Management Plan (PjMP) describes the elements of project planning and control that apply to activities outlined in Technical Task Plan (TTP) ID-121117, "Technology Logic Diagrams For The INEL." The work on this project will be conducted by personnel in EG&G Idaho, Inc.'s Waste Technology Development Program. Technology logic diagrams represent a formal methodology to identify technology gaps or needs within Environmental Restoration/Waste Management Operations, which will focus on Office of Environmental Restoration and Waste Management (EM-50) research and development, demonstration, test, and evaluation efforts throughout the US Department of Energy complex. This PjMP describes the objectives, organization, roles and responsibilities, workscope and processes for implementing and managing the technology logic diagram for the Idaho National Engineering Laboratory project

  1. Implementation of Hierarchical Task Analysis for User Interface Design in Drawing Application for Early Childhood Education

    Directory of Open Access Journals (Sweden)

    Mira Kania Sabariah

    2016-05-01

    Full Text Available Drawing is an important lesson in early childhood education: it is full of stimulation for children's growth and development and helps train fine motor skills. Many applications, including interactive learning applications, can be used for such learning. The observations that were conducted showed that the experiences offered by existing applications are very diverse and have not yet been able to represent the learning model and characteristics of early childhood (4-6 years). Based on these results, the Hierarchical Task Analysis method generated a list of tasks that must be carried out in designing a user interface that represents the user experience in drawing-based learning. Evaluation with the Heuristic Evaluation method then showed that the usability of the model fulfilled a very good level of understanding and that it can be enhanced to produce a better model.

  2. HIV/AIDS case management tasks and activities: the results of a functional analysis study.

    Science.gov (United States)

    Grube, B; Chernesky, R H

    2001-01-01

    Functional analysis, a variation of the time study technique, was used to examine how HIV/AIDS case managers in the tri-county region of New York State spend their time: the actual tasks and activities they choose to perform relative to the total universe of activities and tasks subsumed in the general category of case management. The picture developed was of a system operating primarily in a crisis mode, spending relatively brief amounts of time completing a range of activities and providing an extensive scope of services for or on behalf of clients. The bulk of the work was client centered, not administrative, and involved providing disease management and essential services (e.g., family and mental health). The implications of these findings are discussed, with particular attention paid to the potential influence of client profiles and worker demographics.

  3. Algorithms and programs for consequence diagram and fault tree construction

    International Nuclear Information System (INIS)

    Hollo, E.; Taylor, J.R.

    1976-12-01

    Algorithms and programs are presented for consequence diagram and sequential fault tree construction, intended for reliability and disturbance analysis of large systems. The system to be analyzed must be given as a block diagram formed from the mini fault trees of the individual system components. The programs were written in the LISP programming language and run on a PDP-8 computer with 8k words of storage. A description is given of the methods used and of the construction and operation of the programs. (author)
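
    A toy illustration of the underlying idea (not the authors' LISP programs): component mini fault trees combined into a system-level tree of AND/OR gates and evaluated for a given set of basic-event states.

```python
def evaluate(node, events):
    if isinstance(node, str):                              # leaf: basic event name
        return events[node]
    gate, children = node
    results = [evaluate(child, events) for child in children]
    return all(results) if gate == "AND" else any(results)

train_a_fails = ("OR", ["pump_A_fault", "power_loss_A"])   # mini fault tree of component A
train_b_fails = ("OR", ["pump_B_fault", "power_loss_B"])   # mini fault tree of component B
top_event = ("AND", [train_a_fails, train_b_fails])        # both redundant trains must fail

events = {"pump_A_fault": True, "power_loss_A": False,
          "pump_B_fault": False, "power_loss_B": True}
print("top event occurs:", evaluate(top_event, events))
```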

  4. Diagrams in the polaron model

    International Nuclear Information System (INIS)

    Smondyrev, M.A.

    1985-01-01

    The perturbation theory for the polaron energy is systematically treated on the diagrammatic basis. Feynman diagrams being constructed allow to calculate the polaron energy up to the third order in powers of the coupling constant. Similar calculations are performed for the average number of virtual phonons

  5. Algorithmic approach to diagram techniques

    International Nuclear Information System (INIS)

    Ponticopoulos, L.

    1980-10-01

    An algorithmic approach to diagram techniques of elementary particles is proposed. The definition and axiomatics of the theory of algorithms are presented, followed by the list of instructions of an algorithm formalizing the construction of graphs and the assignment of mathematical objects to them. (T.A.)

  6. One Size Does Not Fit All: Human Failure Event Decomposition and Task Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ronald Laurids Boring, PhD

    2014-09-01

    In the probabilistic safety assessments (PSAs) used in the nuclear industry, human failure events (HFEs) are determined as a subset of hardware failures, namely those hardware failures that could be triggered or exacerbated by human action or inaction. This approach is top-down, starting with hardware faults and deducing human contributions to those faults. Elsewhere, more traditionally human factors driven approaches would tend to look at opportunities for human errors first in a task analysis and then identify which of those errors is risk significant. The intersection of top-down and bottom-up approaches to defining HFEs has not been carefully studied. Ideally, both approaches should arrive at the same set of HFEs. This question remains central as human reliability analysis (HRA) methods are generalized to new domains like oil and gas. The HFEs used in nuclear PSAs tend to be top-down—defined as a subset of the PSA—whereas the HFEs used in petroleum quantitative risk assessments (QRAs) are more likely to be bottom-up—derived from a task analysis conducted by human factors experts. The marriage of these approaches is necessary in order to ensure that HRA methods developed for top-down HFEs are also sufficient for bottom-up applications. In this paper, I first review top-down and bottom-up approaches for defining HFEs and then present a seven-step guideline to ensure a task analysis completed as part of human error identification decomposes to a level suitable for use as HFEs. This guideline illustrates an effective way to bridge the bottom-up approach with top-down requirements.

  7. Infrared thermography method for fast estimation of phase diagrams

    Energy Technology Data Exchange (ETDEWEB)

    Palomo Del Barrio, Elena [Université de Bordeaux, Institut de Mécanique et d’Ingénierie, Esplanade des Arts et Métiers, 33405 Talence (France); Cadoret, Régis [Centre National de la Recherche Scientifique, Institut de Mécanique et d’Ingénierie, Esplanade des Arts et Métiers, 33405 Talence (France); Daranlot, Julien [Solvay, Laboratoire du Futur, 178 Av du Dr Schweitzer, 33608 Pessac (France); Achchaq, Fouzia, E-mail: fouzia.achchaq@u-bordeaux.fr [Université de Bordeaux, Institut de Mécanique et d’Ingénierie, Esplanade des Arts et Métiers, 33405 Talence (France)

    2016-02-10

    Highlights: • Infrared thermography is proposed to determine phase diagrams in record time. • Phase boundaries are detected by means of emissivity changes during heating. • Transition lines are identified by using Singular Value Decomposition techniques. • Different binary systems have been used for validation purposes. - Abstract: Phase change materials (PCM) are widely used today in thermal energy storage applications. Pure PCMs are rarely used because of non-adapted melting points; mixtures are preferred instead. The search for suitable mixtures, preferably eutectics, is often a tedious and time-consuming task which requires the determination of phase diagrams. In order to accelerate this screening step, a new method for estimating phase diagrams in record time (1–3 h) has been established and validated. A sample composed of small droplets of mixtures with different compositions (as many as necessary to obtain good coverage of the phase diagram) deposited on a flat substrate is first prepared and cooled down to ambient temperature so that all droplets crystallize. The plate is then heated at a constant heating rate up to a temperature sufficiently high to melt all the small crystals. The heating process is imaged using an infrared camera. An appropriate method based on the singular value decomposition technique has been developed to analyze the recorded images and to determine the transition lines of the phase diagram. The method has been applied to determine several simple eutectic phase diagrams, and the results have been validated by comparison with the phase diagrams obtained by Differential Scanning Calorimeter measurements and by thermodynamic modelling.
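
    A minimal sketch of the image-analysis idea: the recorded frames are stacked, a singular value decomposition is taken, and the leading temporal component is inspected for an abrupt change marking a transition. The frames below are synthetic, with an artificial emissivity jump.

```python
import numpy as np

n_frames, h, w = 300, 64, 64
frames = np.random.rand(n_frames, h, w) * 0.05
frames[150:] += 0.5                                   # artificial emissivity jump at frame 150

X = frames.reshape(n_frames, -1)                      # (time, pixels)
U, s, Vt = np.linalg.svd(X - X.mean(axis=0), full_matrices=False)
leading = U[:, 0] * s[0]                              # leading temporal component

jump = int(np.argmax(np.abs(np.diff(leading)))) + 1   # frame index with the sharpest change
print("candidate transition frame:", jump)
```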

  8. Self-narrative reconstruction in emotion-focused therapy: A preliminary task analysis.

    Science.gov (United States)

    Cunha, Carla; Mendes, Inês; Ribeiro, António P; Angus, Lynne; Greenberg, Leslie S; Gonçalves, Miguel M

    2017-11-01

    This research explored the consolidation phase of emotion-focused therapy (EFT) for depression and studied, through a task-analysis method, how client-therapist dyads evolved from the exploration of the problem to self-narrative reconstruction. Innovative moments (IMs) were used to situate the process of self-narrative reconstruction within sessions, particularly through reconceptualization and performing-change IMs. We contrasted the observation of these occurrences with a previously built rational model of self-narrative reconstruction. This study presents the rational model and the revised rational-empirical model of the self-narrative reconstruction task in three EFT dyads, suggesting nine steps necessary for task resolution: (1) Explicit recognition of differences in the present and steps in the path of change; (2) Development of a meta-perspective contrast between present self and past self; (3) Amplification of contrast in the self; (4) A positive appreciation of changes is conveyed; (5) Occurrence of feelings of empowerment, competence, and mastery; (6) Reference to difficulties still present; (7) Emphasis on the loss of centrality of the problem; (8) Perception of change as a gradual, developing process; and (9) Reference to projects, experiences of change, or elaboration of new plans. Central aspects of the therapist's activity in facilitating the client's progression along these nine steps are also elaborated.

  9. Electroencephalogram complexity analysis in children with attention-deficit/hyperactivity disorder during a visual cognitive task.

    Science.gov (United States)

    Zarafshan, Hadi; Khaleghi, Ali; Mohammadi, Mohammad Reza; Moeini, Mahdi; Malmir, Nastaran

    2016-01-01

    The aim of this study was to investigate electroencephalogram (EEG) dynamics using complexity analysis in children with attention-deficit/hyperactivity disorder (ADHD) compared with healthy control children when performing a cognitive task. Thirty 7-12-year-old children meeting Diagnostic and Statistical Manual of Mental Disorders-Fifth Edition (DSM-5) criteria for ADHD and 30 healthy control children underwent an EEG evaluation during a cognitive task, and Lempel-Ziv complexity (LZC) values were computed. There were no significant differences between ADHD and control groups on age and gender. The mean LZC of the ADHD children was significantly larger than healthy children over the right anterior and right posterior regions during the cognitive performance. In the ADHD group, complexity of the right hemisphere was higher than that of the left hemisphere, but the complexity of the left hemisphere was higher than that of the right hemisphere in the normal group. Although fronto-striatal dysfunction is considered conclusive evidence for the pathophysiology of ADHD, our arithmetic mental task has provided evidence of structural and functional changes in the posterior regions and probably cerebellum in ADHD.
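
    A minimal sketch of a Lempel-Ziv complexity computation of the kind referred to above: the signal is binarized around its median and the number of new phrases found while scanning is counted. This is a simplified, unnormalized variant and not necessarily the exact implementation used in the study.

```python
import numpy as np

def lempel_ziv_complexity(binary):
    s = "".join(map(str, binary))
    phrases, i, k = set(), 0, 1
    while i + k <= len(s):
        phrase = s[i:i + k]
        if phrase in phrases:
            k += 1                      # extend the current phrase
        else:
            phrases.add(phrase)         # new phrase found: record it and restart
            i += k
            k = 1
    return len(phrases)

rng = np.random.default_rng(4)
eeg = np.cumsum(rng.standard_normal(2048))            # surrogate EEG segment
binary = (eeg > np.median(eeg)).astype(int)
print("LZC:", lempel_ziv_complexity(binary))
```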

  10. Analysis of Mexico wind tunnel measurements. Final report of IEA Task 29, Mexnext (Phase 1)

    Energy Technology Data Exchange (ETDEWEB)

    Schepers, J.G.; Boorsma, K. [Energy research Center of the Netherlands ECN, Petten (Netherlands); Cho, T. [Korea Aerospace Research Institute KARI, Daejeon (Korea, Republic of); Gomez-Iradi, S. [National Renewable Energy Center of Spain CENER, Sarriguren (Spain); Schaffarczyk, P. [A. Jeromin University of Applied Sciences, CEWind EG, Kiel (Germany); Shen, W.Z. [The Technical University of Denmark, Kongens Lyngby (Denmark); Lutz, T. [K. Meister University of Stuttgart, Stuttgart (Germany); Stoevesandt, B. [ForWind, Zentrum fuer Windenergieforschung, Oldenburg (Germany); Schreck, S. [National Renewable Energy Laboratory NREL, Golden, CO (United States); Micallef, D.; Pereira, R.; Sant, T. [Delft University of Technology TUD, Delft (Netherlands); Madsen, H.A.; Soerensen, N. [Risoe-DTU, Roskilde (Denmark)

    2012-02-15

    This report describes the work performed within the first phase of IEA Task 29 Mexnext. In this IEA Task 29 a total of 20 organisations from 11 different countries collaborated in analysing the measurements which have been performed in the EU project 'Mexico'. Within this Mexico project 9 European institutes carried out a wind tunnel experiment in the Large Low Speed Facility (LLF) of the German Dutch Wind Facilities DNW on a rotor with a diameter of 4.5 m. Pressure distributions were measured at five locations along the blade along with detailed flow field measurements around the rotor plane using stereo PIV. As a result of the international collaboration within this task a very thorough analysis of the data could be carried out and a large number of codes were validated not only in terms of loads but also in terms of underlying flow field. The detailed pressure measurements along the blade in combination with the detailed flow field measurements gave a unique opportunity to better understand the response of a wind turbine to the incoming flow field. Deficiencies in modelling have been established and directions for model improvement can be given.

  11. Utilizing job/task analysis to establish content validity in the design of training programs

    Energy Technology Data Exchange (ETDEWEB)

    Nay, W.E.

    1988-01-01

    The decade of the 1980's has been a turbulent time for the Department of Energy. With concern mounting about the terrorist threat, a wave of congressional inquiries and internal inspections crossed the nation and engulfed many of the nuclear laboratories and facilities operated by DOE contractors. A typical finding was the need to improve, and increase, the training of the protective force. The immediate reaction resulted in a wide variety of responses, with most contractors feeling safer with too much, rather than not enough training. As soon as the initial pressures to upgrade subsided, a task force was established to evaluate the overall training needs. Representatives from the contractor facilities worked together to conduct a job analysis of the protective force. A generic task inventory was established, and validated at the different sites. This list has been invaluable for determining the tasks, conditions, and standards needed to develop well stated learning objectives. The enhanced training programs are being refined to ensure job content validity based on the data collected.

  12. Reliability of steam-turbine rotors. Task 1. Lifetime prediction analysis system. Final report

    International Nuclear Information System (INIS)

    Nair, P.K.; Pennick, H.G.; Peters, J.E.; Wells, C.H.

    1982-12-01

    Task 1 of RP 502, Reliability of Steam Turbine Rotors, resulted in the development of a computerized lifetime prediction analysis system (STRAP) for the automatic evaluation of rotor integrity based upon the results of a boresonic examination of near-bore defects. Concurrently an advanced boresonic examination system (TREES), designed to acquire data automatically for lifetime analysis, was developed and delivered to the maintenance shop of a major utility. This system and a semi-automated, state-of-the-art system (BUCS) were evaluated on two retired rotors as part of the Task 2 effort. A modified nonproprietary version of STRAP, called SAFER, is now available for rotor lifetime prediction analysis. STRAP and SAFER share a common fracture analysis postprocessor for rapid evaluation of either conventional boresonic amplitude data or TREES cell data. The final version of this postprocessor contains general stress intensity correlations for elliptical cracks in a radial stress gradient and provision for elastic-plastic instability of the ligament between an imbedded crack and the bore surface. Both linear elastic and ligament rupture models were developed for rapid analysis of linkup within three-dimensional clusters of defects. Bore stress-rupture criteria are included, but a creep-fatigue crack growth data base is not available. Physical and mechanical properties of air-melt 1CrMoV forgings are built into the program; however, only bounding values of fracture toughness versus temperature are available. Owing to the lack of data regarding the probability of flaw detection for the boresonic systems and of quantitative verification of the flaw linkup analysis, automatic evaluation of boresonic results is not recommended, and the lifetime prediction system is currently restricted to conservative, deterministic analysis of specified flaw geometries

  13. Muscle Fatigue Analysis of the Deltoid during Three Head-Related Static Isometric Contraction Tasks

    Directory of Open Access Journals (Sweden)

    Wenxiang Cui

    2017-05-01

    Full Text Available This study aimed to investigate the fatiguing characteristics of muscle-tendon units (MTUs) within skeletal muscles during static isometric contraction tasks. The deltoid was selected as the target muscle and three head-related static isometric contraction tasks were designed to activate the three heads of the deltoid in different modes. Nine male subjects participated in this study. Surface electromyography (SEMG) signals were collected synchronously from the three heads of the deltoid. The performance of five SEMG parameters, including root mean square (RMS), mean power frequency (MPF), the first coefficient of an autoregressive model (ARC1), sample entropy (SE) and Higuchi's fractal dimension (HFD), in quantifying fatigue was first evaluated in terms of sensitivity-to-variability ratio (SVR) and consistency. The HFD parameter was then selected as the fatigue index for further muscle fatigue analysis. The experimental results demonstrated that the three deltoid heads presented different activation modes during the three head-related fatiguing contractions. The fatiguing characteristics of the three heads were found to be task-dependent, and the heads kept at a relatively high activation level were more prone to fatigue. In addition, the differences in fatiguing rate between heads increased with the increase in load. The findings of this study can be helpful in better understanding the underlying neuromuscular control strategies of the central nervous system (CNS). Based on the results of this study, the CNS was thought to control the contraction of the deltoid by taking the three heads as functional units, but a certain synergy among heads might also exist to accomplish a contraction task.
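
    A minimal sketch of two of the SEMG indicators named above, windowed root mean square (RMS) and mean power frequency (MPF) via Welch's method; the signal and sampling rate are placeholders for a recorded deltoid channel.

```python
import numpy as np
from scipy.signal import welch

fs = 1000                                             # assumed sampling rate, Hz
rng = np.random.default_rng(5)
semg = rng.standard_normal(fs * 30)                   # 30 s surrogate SEMG signal

win = fs * 2                                          # 2-second analysis windows
for start in range(0, len(semg) - win + 1, win * 5):
    segment = semg[start:start + win]
    rms = np.sqrt(np.mean(segment ** 2))
    f, pxx = welch(segment, fs=fs, nperseg=512)
    mpf = np.sum(f * pxx) / np.sum(pxx)               # spectral centroid = mean power frequency
    print(f"t={start / fs:5.1f} s  RMS={rms:.3f}  MPF={mpf:.1f} Hz")
```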

  14. Reuse-centric Requirements Analysis with Task Models, Scenarios, and Critical Parameters

    Directory of Open Access Journals (Sweden)

    Cyril Montabert

    2007-02-01

    Full Text Available This paper outlines a requirements-analysis process that unites task models, scenarios, and critical parameters to exploit and generate reusable knowledge at the requirements phase. Through the deployment of a critical-parameter-based approach to task modeling, the process yields the establishment of an integrative and formalized model issued from scenarios that can be used for requirements characterization. Furthermore, not only can this entity serve as interface to a knowledge repository relying on a critical-parameter-based taxonomy to support reuse but its characterization in terms of critical parameters also allows the model to constitute a broader reuse solution. We discuss our vision for a user-centric and reuse-centric approach to requirements analysis, present previous efforts implicated with this line of work, and state the revisions brought to extend the reuse potential and effectiveness of a previous iteration of a requirements tool implementing such process. Finally, the paper describes the sequence and nature of the activities involved with the conduct of our proposed requirements-analysis technique, concluding by previewing ongoing work in the field that will explore the feasibility for designers to use our approach.

  15. Introducing People – Genre Analysis and Oral Comprehension and Oral Production Tasks

    Directory of Open Access Journals (Sweden)

    Keila Rocha Reis de Carvalho

    2012-02-01

    Full Text Available This paper aims at presenting an analysis of the genre introducing people and at suggesting listening comprehension and oral production tasks. This work was developed according to the characterization of the rhetorical organization of situations taken from seventeen films that contain the genre under analysis. Although several studies in the ESP area carried out recently (Andrade, 2003; Cardoso, 2003; Shergue, 2003; Belmonte, 2003; Serafini, 2003) have identified listening comprehension and oral production as the abilities that should be prioritized in an English course, much needs to be done, especially concerning the oral genres that take into account the language the learners of English as a second language need in their target situation. This work is based on Hutchinson & Waters' (1987) theoretical background on ESP, Swales' (1990) genre analysis, Ramos' (2004) pedagogical proposal, and also on Ellis' (2003) task concept. The familiarization of learners of English as a second language with this genre will provide them with the opportunity to better understand and use the English language in their academic and professional life.

  16. Wristbands as aids to reduce misidentification: an ethnographically guided task analysis.

    Science.gov (United States)

    Smith, Andrew F; Casey, Kate; Wilson, James; Fischbacher-Smith, Denis

    2011-10-01

    Wristbands are recommended in the UK as a means of verifying patient identity but have been little studied. We aimed to document how wristbands are used in practice. A task analysis of wristband application and use was carried out, drawing on qualitative analysis of workplace observation of, and interviews with, clinical and non-clinical staff at two acute district general hospitals in northern England. Our findings indicate high levels of awareness amongst clinical staff of local and national policies on wristband use, but some ambiguity about the details therein. In contrast, non-clinical staff such as ward clerks and porters were less aware of policy, although their actions also expose patients to risks resulting from misidentification. Of seven subtasks identified by the task analysis of wristband application and use, three appeared to offer particular opportunity for error. Making the decision to apply, especially in emergency patients, is important because delay in application can delay correct identification. Advance preparation of wristbands for elective admission without the patient being present can risk erroneous data or misapplication. Lastly, utilization of wristbands to verify patient identity was greater in some clinical circumstances (blood transfusion and medication administration) than in others (before transferring patients around the hospital and during handovers of care). Wristbands for patient identification are not being used to their full potential. Attention to detail in application and use, especially during handover and transfer, and an appreciation of the role played by 'non-clinical' staff, may offer further gains in patient safety.

  17. Task analysis and structure scheme for center manager station in large container inspection system

    International Nuclear Information System (INIS)

    Li Zheng; Gao Wenhuan; Wang Jingjin; Kang Kejun; Chen Zhiqiang

    1997-01-01

    LCIS works as follows: the accelerator generates beam pulses which are formed into a fan shape; the scanning system drags a lorry with a container through the beam at constant speed; the detector array detects the beam penetrating the lorry; and the projection data acquisition system reads the projections and assembles an inspection image of the lorry. All of these functions are controlled and synchronized by the center manager station. The author describes the process of projection data acquisition in scanning mode and the methods of real-time projection data processing. The task analysis and the structure scheme of the center manager station are presented.

  18. Analysis of brain activity and response to colour stimuli during learning tasks: an EEG study

    Science.gov (United States)

    Folgieri, Raffaella; Lucchiari, Claudio; Marini, Daniele

    2013-02-01

    The research project intends to demonstrate how EEG detection through a BCI device can improve the analysis and interpretation of colour-driven cognitive processes through the combined approach of cognitive science and information technology methods. To this end, an experiment was designed to compare the results of the traditional (qualitative and quantitative) cognitive analysis approach with EEG analysis of the evoked potentials. In our case, the sensory stimulus is represented by colours, while the cognitive task consists of remembering the words appearing on the screen, presented in different combinations of foreground (word) and background colours. In this work we analysed data collected from a sample of students involved in a learning process during which they received visual stimuli based on colour variation. The stimuli concerned both the background of the text to learn and the colour of the characters. The experiment indicated some interesting results concerning the use of primary (RGB) and complementary (CMY) colours.

  19. A Content Analysis of General Chemistry Laboratory Manuals for Evidence of Higher-Order Cognitive Tasks

    Science.gov (United States)

    Domin, Daniel S.

    1999-01-01

    The science laboratory instructional environment is ideal for fostering the development of problem-solving, manipulative, and higher-order thinking skills: the skills needed by today's learner to compete in an increasingly technology-based society. This paper reports the results of a content analysis of ten general chemistry laboratory manuals. Three experiments from each manual were examined for evidence of higher-order cognitive activities. Analysis was based upon the six major cognitive categories of Bloom's Taxonomy of Educational Objectives: knowledge, comprehension, application, analysis, synthesis, and evaluation. The results of this study show that the overwhelming majority of general chemistry laboratory manuals provide tasks that require the use of only the lower-order cognitive skills: knowledge, comprehension, and application. Two of the laboratory manuals were disparate in having activities that utilized higher-order cognition. I describe the instructional strategies used within these manuals to foster higher-order cognitive development.

  20. Rapid analysis of hay attributes using NIRS. Final report, Task II alfalfa supply system

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-10-24

    This final report provides technical information on the development of a near infrared reflectance spectroscopy (NIRS) system for the analysis of alfalfa hay. The purpose of the system is to provide consistent quality for processing alfalfa stems for fuel and alfalfa leaf meal products for livestock feed. Project tasks were to: (1) develop an NIRS driven analytical system for analysis of alfalfa hay and processed alfalfa products; (2) assist in hiring a qualified NIRS technician and recommend changes in testing equipment necessary to provide accurate analysis; (3) calibrate the NIRS instrument for accurate analyses; and (4) develop prototype equipment and sampling procedures as a first step towards development of a totally automated sampling system that would rapidly sample and record incoming feedstock and outbound product. An accurate hay testing program was developed, along with calibration equations for analyzing alfalfa hay and sun-cured alfalfa pellets. A preliminary leaf steam calibration protocol was also developed. 7 refs., 11 figs., 10 tabs.
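
    The report does not spell out the regression technique behind the calibration equations; partial least squares (PLS) regression is the usual chemometric choice for NIRS calibrations, so the sketch below illustrates that step under this assumption. The sample spectra, the protein reference values and the number of latent variables are invented placeholders.

    ```python
    # Hypothetical sketch of an NIRS calibration step using partial least squares
    # (PLS) regression, a common chemometric choice; the report does not specify
    # the regression method, so treat this as illustrative only.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)

    # Placeholder data: 120 hay samples x 700 absorbance channels, plus a
    # laboratory reference value (e.g., crude protein, % of dry matter).
    absorbance = rng.normal(size=(120, 700))
    lab_protein = rng.normal(loc=18.0, scale=2.0, size=120)

    # Fit a PLS calibration with a modest number of latent variables and check
    # it by cross-validation (R^2) before using it on routine samples.
    calibration = PLSRegression(n_components=8)
    r2_cv = cross_val_score(calibration, absorbance, lab_protein, cv=5, scoring="r2")
    calibration.fit(absorbance, lab_protein)

    print("cross-validated R^2:", r2_cv.mean())
    print("predicted protein for first sample:", calibration.predict(absorbance[:1])[0, 0])
    ```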

  1. Construction of UML class diagram with Model-Driven Development

    Directory of Open Access Journals (Sweden)

    Tomasz Górski

    2016-03-01

    Full Text Available Model transformations play a key role in software development projects based on Model-Driven Development (MDD) principles. Transformations allow for automation of repetitive and well-defined steps, thus shortening design time and reducing the number of errors. In the object-oriented approach, the key elements are use cases. They are described, modelled and later designed until executable application code is obtained. The aim of the paper is to present a model-to-model transformation, Communication-2-Class, which automates the construction of a Unified Modelling Language (UML) class diagram in the context of the analysis/design model. A UML class diagram is created based on the UML communication diagram within a use case realization. As a result, the class diagram shows all of the classes involved in the use case realization and the relationships among them. The plug-in which implements the Communication-2-Class transformation was implemented in IBM Rational Software Architect. The article presents test results for the developed plug-in, showing its capability to shorten the design time of use case realizations. Keywords: Model-Driven Development, transformations, Unified Modelling Language, analysis/design model, UML class diagram, UML communication diagram
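
    The plug-in itself is implemented inside IBM Rational Software Architect; the sketch below is only a language-neutral illustration of the general idea rather than the authors' implementation. It derives a class-diagram skeleton (classes, operations, associations) from hypothetical communication-diagram messages.

    ```python
    # Illustrative sketch (not the authors' RSA plug-in): derive a class-diagram
    # skeleton from the messages of a UML communication diagram. Each lifeline
    # becomes a class, each message becomes an operation on the receiver, and
    # each sender->receiver pair becomes an association.
    from collections import defaultdict

    # Hypothetical communication-diagram messages for one use case realization:
    # (sender lifeline, receiver lifeline, message name)
    messages = [
        ("OrderController", "OrderService", "placeOrder"),
        ("OrderService", "InventoryRepository", "reserveItems"),
        ("OrderService", "PaymentGateway", "charge"),
    ]

    operations = defaultdict(set)   # class name -> operations
    associations = set()            # (from class, to class)

    for sender, receiver, message in messages:
        operations[sender]               # ensure the sender class exists
        operations[receiver].add(message)
        associations.add((sender, receiver))

    for cls in sorted(operations):
        print(f"class {cls}: operations={sorted(operations[cls])}")
    for src, dst in sorted(associations):
        print(f"association: {src} -> {dst}")
    ```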

  2. GRASP/Ada: Graphical Representations of Algorithms, Structures, and Processes for Ada. The development of a program analysis environment for Ada: Reverse engineering tools for Ada, task 2, phase 3

    Science.gov (United States)

    Cross, James H., II

    1991-01-01

    The main objective is the investigation, formulation, and generation of graphical representations of algorithms, structures, and processes for Ada (GRASP/Ada). The presented task, in which various graphical representations that can be extracted or generated from source code are described and categorized, is focused on reverse engineering. The following subject areas are covered: the system model; control structure diagram generator; object oriented design diagram generator; user interface; and the GRASP library.

  3. The Butterfly Diagram Internal Structure

    International Nuclear Information System (INIS)

    Ternullo, Maurizio

    2013-01-01

    A time-latitude diagram, where the spotgroup area is taken into account, is presented for cycles 12 through 23. The results show that the spotted area is concentrated in a few small portions (knots) of the Butterfly Diagram (BD). The BD may be described as a cluster of knots. Knots are distributed in the butterfly wings in a seemingly random way. A knot may appear at either lower or higher latitudes than previous ones, in spite of the prevalent tendency to appear at lower and lower latitudes. Accordingly, the spotted area centroid, far from continuously drifting equatorward, drifts poleward or remains stationary in either hemisphere for significant fractions (≈ 1/3) of the cycle's total duration. In a relevant number of semicycles, knots seem to form two roughly parallel, oblique chains, separated by an underspotted band. This picture suggests that two (or more) ''activity streams'' approach the equator at a rate higher than the spot zone as a whole.

  4. Risk assessment of underpass infrastructure project based on ISO 31000 and ISO 21500 using fishbone diagram and RFMEA (project risk failure mode and effects analysis) method

    Science.gov (United States)

    Purwanggono, Bambang; Margarette, Anastasia

    2017-12-01

    Completion time of highway construction is very important for smooth transportation, especially as the number of motor vehicles is expected to increase each year. Therefore, this study was conducted to analyze the constraints encountered in an infrastructure development project. The research was conducted on the Jatingaleh Underpass Project, Semarang, and was carried out while the project was running; at the time of implementation, the project was experiencing delays. Its aim is to find out which constraints occur in the execution of a road infrastructure project, in particular those that cause delays. The method used to find root causes is the fishbone diagram, from which possible means of mitigation are obtained. This is coupled with the RFMEA method, used to determine the critical risks that must be addressed immediately in the road infrastructure project. The tabulated data in this study indicate that the most feasible mitigation measure is to prepare Standard Operating Procedure (SOP) recommendations for dealing with utilities that interfere with project implementation. The risk assessment process was carried out systematically based on ISO 31000:2009 on risk management, and the requirements of the process groups in ISO 21500:2013 on project management were used to determine the delay variables.
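
    As a rough illustration of the RFMEA ranking step, the sketch below uses one common formulation (risk score = probability x impact; RPN additionally weighted by detection difficulty) with invented risks, ratings and thresholds; it is not the Jatingaleh project data.

    ```python
    # Hedged sketch of the RFMEA ranking step (one common formulation: risk
    # score = probability x impact, RPN = probability x impact x detection).
    # The risks and ratings below are hypothetical, not the Jatingaleh data.
    risks = [
        # (risk, probability, impact, detection difficulty), each rated 1-10
        ("Utility lines found in the excavation path", 8, 9, 7),
        ("Late delivery of precast segments",          5, 7, 4),
        ("Heavy rain floods the work pit",             6, 6, 3),
    ]

    scored = []
    for name, p, i, d in risks:
        score = p * i          # used to shortlist high-exposure risks
        rpn = p * i * d        # adds how hard the risk is to detect in time
        scored.append((rpn, score, name))

    for rpn, score, name in sorted(scored, reverse=True):
        flag = "CRITICAL" if score >= 48 and rpn >= 300 else ""
        print(f"{name}: score={score}, RPN={rpn} {flag}")
    ```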

  5. European Extremely Large Telescope (E-ELT) availability stochastic model: integrating failure mode and effect analysis (FMEA), influence diagram, and Bayesian network together

    Science.gov (United States)

    Verzichelli, Gianluca

    2016-08-01

    An Availability Stochastic Model for the E-ELT has been developed in GeNIE. The latter is a Graphical User Interface (GUI) for the Structural Modeling, Inference, and Learning Engine (SMILE), originally distributed by the Decision Systems Laboratory of the University of Pittsburgh and now a product of Bayes Fusion, LLC. The E-ELT will be the largest optical/near-infrared telescope in the world. Its design comprises an Alt-Azimuth mount reflecting telescope with a 39-metre-diameter segmented primary mirror, a 4-metre-diameter secondary mirror, a 3.75-metre-diameter tertiary mirror, adaptive optics and multiple instruments. This paper highlights how the model has been developed for an early assessment of telescope availability. It also describes the modular structure and the underlying assumptions adopted in developing the model, and demonstrates the integration of FMEA, Influence Diagram and Bayesian Network elements. These have been considered for a better characterization of the model inputs and outputs and for taking into account Degraded-based Reliability (DBR). Lastly, it provides an overview of how the information and knowledge captured in the model may be used for an early definition of the Failure, Detection, Isolation and Recovery (FDIR) control strategy and the Telescope Minimum Master Equipment List (T-MMEL).

  6. Communications data delivery system analysis task 2 report : high-level options for secure communications data delivery systems.

    Science.gov (United States)

    2012-05-16

    This Communications Data Delivery System Analysis Task 2 report describes and analyzes options for Vehicle to Vehicle (V2V) and Vehicle to Infrastructure (V2I) communications data delivery systems using various communication media (Dedicated Short Ra...

  7. Scheil-Gulliver Constituent Diagrams

    Science.gov (United States)

    Pelton, Arthur D.; Eriksson, Gunnar; Bale, Christopher W.

    2017-06-01

    During solidification of alloys, conditions often approach those of Scheil-Gulliver cooling, in which it is assumed that solid phases, once precipitated, remain unchanged; that is, they no longer react with the liquid or with each other. In the case of equilibrium solidification, equilibrium phase diagrams provide a valuable means of visualizing the effects of composition changes upon the final microstructure. In the present study, we propose for the first time the concept of Scheil-Gulliver constituent diagrams, which play the same role for Scheil-Gulliver cooling that equilibrium phase diagrams play for equilibrium solidification. It is shown how these diagrams can be calculated and plotted by currently available thermodynamic database computing systems that combine Gibbs energy minimization software with large databases of optimized thermodynamic properties of solutions and compounds. Examples calculated using the FactSage system are presented for the Al-Li and Al-Mg-Zn systems, and for the Au-Bi-Sb-Pb system and its binary and ternary subsystems.
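
    For orientation, the classical Scheil-Gulliver solute-redistribution relation for a binary alloy with constant partition coefficient k is reproduced below from the standard literature; the paper itself works with full Gibbs-energy minimization over multicomponent databases rather than this closed form.

    ```latex
    % Classical Scheil-Gulliver solute redistribution (binary alloy, constant
    % partition coefficient k, no diffusion in the solid, fully mixed liquid),
    % with C_0 the nominal composition and f_s the solid fraction:
    C_L = C_0 \,(1 - f_s)^{\,k-1},
    \qquad
    C_s^{*} = k\,C_L = k\,C_0\,(1 - f_s)^{\,k-1}.
    ```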

  8. The human factors and job task analysis in nuclear power plant operation

    International Nuclear Information System (INIS)

    Stefanescu, Petre; Mihailescu, Nicolae; Dragusin, Octavian

    1999-01-01

    After a long period in the development of NPP technology during which plant hardware was considered the main factor in safe, reliable and economic operation, the industry is now moving towards an adequate division of responsibility between plant hardware and operation. Since the human factor has not been treated methodically so far, there is still a lack of improved classification systems for human errors, as well as a lack of methods for a systematic approach to designing the operator's working system, for instance by using job task analysis (J.T.A.). The J.T.A. appears to be an adequate method for studying the human factor in nuclear power plant operation, enabling an easy conversion into operational improvements. While the results of the analysis of human errors tell 'what' is to be improved, the J.T.A. shows 'how' to improve, increasing the quality of the work and the safety of the operator's working system. The paper analyses the issue of setting the task and presents four criteria used to select aspects of NPP operation which require special consideration, such as personnel training, design of the control room, content and layout of the procedure manual, or organization of the operating personnel. The results are given in three tables: 1 - Evaluation of Deficiencies in the Working System; 2 - Evaluation of the Deficiencies of the Operator's Disposition; 3 - Evaluation of the Mental Structure of Operation

  9. Adapting Cognitive Task Analysis to Investigate Clinical Decision Making and Medication Safety Incidents.

    Science.gov (United States)

    Russ, Alissa L; Militello, Laura G; Glassman, Peter A; Arthur, Karen J; Zillich, Alan J; Weiner, Michael

    2017-05-03

    Cognitive task analysis (CTA) can yield valuable insights into healthcare professionals' cognition and inform system design to promote safe, quality care. Our objective was to adapt CTA-the critical decision method, specifically-to investigate patient safety incidents, overcome barriers to implementing this method, and facilitate more widespread use of cognitive task analysis in healthcare. We adapted CTA to facilitate recruitment of healthcare professionals and developed a data collection tool to capture incidents as they occurred. We also leveraged the electronic health record (EHR) to expand data capture and used EHR-stimulated recall to aid reconstruction of safety incidents. We investigated 3 categories of medication-related incidents: adverse drug reactions, drug-drug interactions, and drug-disease interactions. Healthcare professionals submitted incidents, and a subset of incidents was selected for CTA. We analyzed several outcomes to characterize incident capture and completed CTA interviews. We captured 101 incidents. Eighty incidents (79%) met eligibility criteria. We completed 60 CTA interviews, 20 for each incident category. Capturing incidents before interviews allowed us to shorten the interview duration and reduced reliance on healthcare professionals' recall. Incorporating the EHR into CTA enriched data collection. The adapted CTA technique was successful in capturing specific categories of safety incidents. Our approach may be especially useful for investigating safety incidents that healthcare professionals "fix and forget." Our innovations to CTA are expected to expand the application of this method in healthcare and inform a wide range of studies on clinical decision making and patient safety.

  10. ALE Meta-Analysis of Schizophrenics Performing the N-Back Task

    Science.gov (United States)

    Harrell, Zachary

    2010-10-01

    MRI/fMRI has already proven itself as a valuable tool in the diagnosis and treatment of many illnesses of the brain, including cognitive problems. By exploiting the differences in magnetic susceptibility between oxygenated and deoxygenated hemoglobin, fMRI can measure blood flow in various regions of interest within the brain. This can determine the level of brain activity in relation to motor or cognitive functions and provide a metric for tissue damage or illness symptoms. Structural imaging techniques have shown lesions or deficiencies in tissue volumes in schizophrenics corresponding to areas primarily in the frontal and temporal lobes. These areas are currently known to be involved in working memory and attention, which many schizophrenics have trouble with. The ALE (Activation Likelihood Estimation) Meta-Analysis is able to statistically determine the significance of brain area activations based on the post-hoc combination of multiple studies. This process is useful for giving a general model of brain function in relation to a particular task designed to engage the affected areas (such as working memory for the n-back task). The advantages of the ALE Meta-Analysis include elimination of single subject anomalies, elimination of false/extremely weak activations, and verification of function/location hypotheses.
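
    The ALE combination rule, as commonly described in the literature, smooths each study's foci with a 3D Gaussian into a modelled-activation map and takes the voxel-wise union of probabilities across studies. The toy sketch below illustrates only that rule on a coarse grid with made-up foci; real analyses use dedicated ALE software and proper null distributions.

    ```python
    # Toy sketch of the ALE combination rule: each study's foci are smoothed
    # with a 3D Gaussian into a modelled-activation (MA) map, and per-voxel ALE
    # values are the probabilistic union across studies. Grid size, kernel width
    # and foci below are hypothetical; real analyses use dedicated ALE software.
    import numpy as np
    from scipy.ndimage import gaussian_filter

    shape = (40, 48, 40)                      # coarse voxel grid (placeholder)
    studies_foci = [
        [(20, 30, 22), (12, 18, 20)],         # study 1 peak coordinates (voxels)
        [(21, 29, 23)],                       # study 2
        [(20, 31, 21), (30, 10, 15)],         # study 3
    ]

    ma_maps = []
    for foci in studies_foci:
        m = np.zeros(shape)
        for x, y, z in foci:
            m[x, y, z] = 1.0
        m = gaussian_filter(m, sigma=2.0)     # sigma stands in for the FWHM kernel
        ma_maps.append(np.clip(m / m.max(), 0.0, 1.0))

    # Union of modelled-activation probabilities: ALE = 1 - prod(1 - MA_i)
    ale = 1.0 - np.prod([1.0 - m for m in ma_maps], axis=0)
    print("peak ALE value:", float(ale.max()))
    ```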

  11. ITER safety task NID-5a: ITER tritium environmental source terms - safety analysis basis

    International Nuclear Information System (INIS)

    Natalizio, A.; Kalyanam, K.M.

    1994-09-01

    The Canadian Fusion Fuels Technology Project (CFFTP) contribution forms part of ITER task NID-5a, Initial Tritium Source Term. This safety analysis basis constitutes the first part of the work on establishing tritium source terms and is intended to solicit comments and obtain agreement. The analysis objective is to provide an early estimate of tritium environmental source terms for the events to be analyzed. Events that would result in the loss of tritium are: a Loss of Coolant Accident (LOCA), a vacuum vessel boundary breach, a torus exhaust line failure, a fuelling machine process boundary failure, a fuel processing system process boundary failure, a water detritiation system process boundary failure and an isotope separation system process boundary failure. 9 figs

  12. Comparative evaluation of three cognitive error analysis methods through an application to accident management tasks in NPPs

    International Nuclear Information System (INIS)

    Jung, Won Dea; Kim, Jae Whan; Ha, Jae Joo; Yoon, Wan C.

    1999-01-01

    This study was performed to comparatively evaluate selected Human Reliability Analysis (HRA) methods which mainly focus on cognitive error analysis, and to derive the requirements for a new human error analysis (HEA) framework for Accident Management (AM) in nuclear power plants (NPPs). In order to achieve this goal, we carried out a case study of human error analysis on an AM task in NPPs. In the study we evaluated three cognitive HEA methods, HRMS, CREAM and PHECA, which were selected through a review of the seven currently available cognitive HEA methods. The task of reactor cavity flooding was chosen for the application study as one of the typical AM tasks in NPPs. From the study, we derived seven requirement items for a new HEA method for AM in NPPs. We were also able to evaluate the applicability of the three cognitive HEA methods to AM tasks. CREAM is considered to be more appropriate than the others for the analysis of AM tasks. PHECA, however, is regarded as less appropriate as a predictive HEA technique as well as for the analysis of AM tasks. In addition, the advantages and disadvantages of each method are described. (author)

  13. Using Affinity Diagrams to Evaluate Interactive Prototypes

    DEFF Research Database (Denmark)

    Lucero, Andrés

    2015-01-01

    We describe our particular use of affinity diagramming in prototype evaluations. We reflect on a decade’s experience using affinity diagramming across a number of projects, both in industry and academia. Our affinity diagramming process in interaction design has been tailored and consists of four stages: creating

  15. Automation of block assignment planning using a diagram-based scenario modeling method

    Directory of Open Access Journals (Sweden)

    In Hyuck Hwang

    2014-03-01

    Full Text Available Most shipbuilding scheduling research so far has focused on the load level on the dock plan. This is because the dock is the least extendable resource in shipyards, and its overloading is difficult to resolve. However, once dock scheduling is completed, making a plan that makes the best use of the rest of the resources in the shipyard to minimize any additional cost is also important. Block assignment planning is one of the midterm planning tasks; it assigns a block to the facility (factory/shop or surface plate) that will actually manufacture the block according to the block characteristics and current situation of the facility. It is one of the most heavily loaded midterm planning tasks and is carried out manually by experienced workers. In this study, a method of representing the block assignment rules using a diagram was suggested through analysis of the existing manual process. A block allocation program was developed which automated the block assignment process according to the rules represented by the diagram. The planning scenario was validated through a case study that compared the manual assignment and two automated block assignment results.

  16. Abnormal Brain Activation During Theory of Mind Tasks in Schizophrenia: A Meta-Analysis.

    Science.gov (United States)

    Kronbichler, Lisa; Tschernegg, Melanie; Martin, Anna Isabel; Schurz, Matthias; Kronbichler, Martin

    2017-10-21

    Social cognition abilities are severely impaired in schizophrenia (SZ). The current meta-analysis used foci of 21 individual studies on functional abnormalities in the schizophrenic brain in order to identify regions that reveal convergent under- or over-activation during theory of mind (TOM) tasks. Studies were included in the analyses when contrasting tasks that require the processing of mental states with tasks which did not. Only studies that investigated patients with an ICD or DSM diagnosis were included. Quantitative voxel-based meta-analyses were done using Seed-based d Mapping software. Common TOM regions like medial-prefrontal cortex and temporo-parietal junction revealed abnormal activation in schizophrenic patients: Under-activation was identified in the medial prefrontal cortex, left orbito-frontal cortex, and in a small section of the left posterior temporo-parietal junction. Remarkably, robust over-activation was identified in a more dorsal, bilateral section of the temporo-parietal junction. Further abnormal activation was identified in medial occipito-parietal cortex, right premotor areas, left cingulate gyrus, and lingual gyrus. The findings of this study suggest that SZ patients simultaneously show over- and under-activation in TOM-related regions. Especially interesting, temporo-parietal junction reveals diverging activation patterns with an under-activating left posterior and an over-activating bilateral dorsal section. In conclusion, SZ patients show less specialized brain activation in regions linked to TOM and increased activation in attention-related networks suggesting compensatory effects. © The Author 2017. Published by Oxford University Press on behalf of the Maryland Psychiatric Research Center.

  17. Job/task analysis for I&C [Instrumentation and Controls] instrument technicians at the High Flux Isotope Reactor

    International Nuclear Information System (INIS)

    Duke, L.L.

    1989-09-01

    To comply with Department of Energy Order 5480.XX (Draft), a job/task analysis was initiated by the Maintenance Management Department at Oak Ridge National Laboratory (ORNL). The analysis was applicable to instrument technicians working at the ORNL High Flux Isotope Reactor (HFIR). This document presents the procedures and results of that analysis. 2 refs., 2 figs

  18. Diagram, a Learning Environment for Initiation to Object-Oriented Modeling with UML Class Diagrams

    Science.gov (United States)

    Py, Dominique; Auxepaules, Ludovic; Alonso, Mathilde

    2013-01-01

    This paper presents Diagram, a learning environment for object-oriented modelling (OOM) with UML class diagrams. Diagram is an open environment, in which the teacher can add new exercises without constraints on the vocabulary or the size of the diagram. The interface includes methodological help, encourages self-correcting and self-monitoring, and…

  19. STATE-OF-THE-ART TASKS AND ACHIEVEMENTS OF PARALINGUISTIC SPEECH ANALYSIS SYSTEMS

    Directory of Open Access Journals (Sweden)

    A. A. Karpov

    2016-07-01

    Full Text Available We present an analytical survey of state-of-the-art tasks in the area of computational paralinguistics, as well as the recent achievements of automatic systems for paralinguistic analysis of conversational speech. Paralinguistics studies non-verbal aspects of human communication and speech such as natural emotions, accents, psycho-physiological states, pronunciation features, speaker's voice parameters, etc. We describe the architecture of a baseline computer system for acoustical paralinguistic analysis, its main components and useful speech processing methods. We present some information on an international contest called the Computational Paralinguistics Challenge (ComParE), which has been held each year since 2009 in the framework of the International conference INTERSPEECH organized by the International Speech Communication Association. We present the sub-challenges (tasks) that were proposed at the ComParE Challenges in 2009-2016, and analyze the winning computer systems for each sub-challenge and the results obtained. The last completed challenge, ComParE-2015, was organized in September 2015 in Germany and proposed three sub-challenges: (1) the Degree of Nativeness (DN) sub-challenge, determination of the degree of nativeness of speakers based on acoustics; (2) the Parkinson's Condition (PC) sub-challenge, recognition of the degree of Parkinson's condition based on speech analysis; and (3) the Eating Condition (EC) sub-challenge, determination of the eating condition during speaking or a dialogue, and classification of the consumed food type (one of seven classes of food) by the speaker. In the last sub-challenge (EC), the winner was a joint Turkish-Russian team consisting of the authors of the given paper. We have developed the most efficient computer-based system for detection and classification of the corresponding (EC) acoustical paralinguistic events. The paper deals with the architecture of this system, its main modules and methods, as well as the description of the used training and evaluation

  20. Space Station data system analysis/architecture study. Task 1: Functional requirements definition, DR-5

    Science.gov (United States)

    1985-01-01

    The initial task in the Space Station Data System (SSDS) Analysis/Architecture Study is the definition of the functional and key performance requirements for the SSDS. The SSDS is the set of hardware and software, both on the ground and in space, that provides the basic data management services for Space Station customers and systems. The primary purpose of the requirements development activity was to provide a coordinated, documented requirements set as a basis for the system definition of the SSDS and for other subsequent study activities. These requirements should also prove useful to other Space Station activities in that they provide an indication of the scope of the information services and systems that will be needed in the Space Station program. The major results of the requirements development task are as follows: (1) identification of a conceptual topology and architecture for the end-to-end Space Station Information Systems (SSIS); (2) development of a complete set of functional requirements and design drivers for the SSIS; (3) development of functional requirements and key performance requirements for the Space Station Data System (SSDS); and (4) definition of an operating concept for the SSIS. The operating concept was developed both from a Space Station payload customer and operator perspective in order to allow a requirements practicality assessment.

  1. Voronoi Diagrams Without Bounding Boxes

    Science.gov (United States)

    Sang, E. T. K.

    2015-10-01

    We present a technique for presenting geographic data in Voronoi diagrams without having to specify a bounding box. The method restricts Voronoi cells to points within a user-defined distance of the data points. The mathematical foundation of the approach is presented as well. The cell clipping method is particularly useful for presenting geographic data that is spread in an irregular way over a map, as for example the Dutch dialect data displayed in Figure 2. The automatic generation of reasonable cell boundaries also makes redundant a frequently used solution to this problem that requires data owners to specify region boundaries, as in Goebl (2010) and Nerbonne et al (2011).
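
    The paper's own implementation is not reproduced here; the sketch below only illustrates the core idea of clipping each Voronoi cell to a disc of user-defined radius around its generating point, using shapely as an assumed dependency and made-up coordinates.

    ```python
    # Sketch of the bounding-box-free idea: clip each Voronoi cell to a disc of
    # user-defined radius around its generating point. Uses shapely (assumed
    # dependency); the points and radius are made-up placeholders.
    from shapely.geometry import MultiPoint, Point
    from shapely.ops import voronoi_diagram

    points = [Point(0, 0), Point(3, 1), Point(1, 4), Point(5, 4), Point(4, -2)]
    max_dist = 2.5                      # user-defined distance limit

    cells = voronoi_diagram(MultiPoint(points))   # raw (envelope-bounded) cells

    clipped = []
    for pt in points:
        # Find the raw cell containing this point, then intersect it with the
        # disc of radius max_dist around the point.
        cell = next(c for c in cells.geoms if c.contains(pt))
        clipped.append(cell.intersection(pt.buffer(max_dist)))

    for pt, cell in zip(points, clipped):
        print(pt, "-> clipped cell area:", round(cell.area, 2))
    ```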

  2. Multi-currency Influence Diagrams

    DEFF Research Database (Denmark)

    Nielsen, Søren Holbech; Nielsen, Thomas Dyhre; Jensen, Finn Verner

    2004-01-01

    Solution of decision problems which involve utilities in several currencies has traditionally required the problems to be converted into decision problems involving utilities in only one currency. This conversion is carried out using a tacit transformation, under the assumption that the converted problem is equivalent to the original one. In this paper we present an extension of the Influence Diagram framework which allows these decision problems to be modelled in their original form. We present an algorithm that, given a conversion function between the currencies, discovers

  3. Phase diagrams for surface alloys

    DEFF Research Database (Denmark)

    Christensen, Asbjørn; Ruban, Andrei; Stoltze, Per

    1997-01-01

    We discuss surface alloy phases and their stability based on surface phase diagrams constructed from the surface energy as a function of the surface composition. We show that in the simplest cases of pseudomorphic overlayers there are four generic classes of systems, characterized by the sign...... is based on density-functional calculations using the coherent-potential approximation and on effective-medium theory. We give self-consistent density-functional results for the segregation energy and surface mixing energy for all combinations of the transition and noble metals. Finally we discuss...

  4. Good research practices for comparative effectiveness research: approaches to mitigate bias and confounding in the design of nonrandomized studies of treatment effects using secondary data sources: the International Society for Pharmacoeconomics and Outcomes Research Good Research Practices for Retrospective Database Analysis Task Force Report--Part II.

    Science.gov (United States)

    Cox, Emily; Martin, Bradley C; Van Staa, Tjeerd; Garbe, Edeltraut; Siebert, Uwe; Johnson, Michael L

    2009-01-01

    The goal of comparative effectiveness analysis is to examine the relationship between two variables: treatment or exposure, and effectiveness or outcome. Unlike with data obtained through randomized controlled trials, researchers face greater challenges with causal inference in observational studies. Recognizing these challenges, a task force was formed to develop a guidance document on methodological approaches to address these biases. The task force was commissioned and a Chair was selected by the International Society for Pharmacoeconomics and Outcomes Research Board of Directors in October 2007. This report, the second of three reported in this issue of the Journal, discusses the inherent biases when using secondary data sources for comparative effectiveness analysis and provides methodological recommendations to help mitigate these biases. The task force report provides recommendations and tools for researchers to mitigate threats to validity from bias and confounding in the measurement of exposure and outcome. Recommendations on study design included: the need for a data analysis plan with causal diagrams; detailed attention to classification bias in the definition of exposure and clinical outcome; careful and appropriate use of restriction; and extreme care to identify and control for confounding factors, including time-dependent confounding. The design of nonrandomized studies of comparative effectiveness faces several daunting issues, including measurement of exposure and outcome challenged by misclassification and confounding. Use of causal diagrams and restriction are two techniques that can improve the theoretical basis for analyzing treatment effects in study populations of greater homogeneity, with reduced loss of generalizability.

  5. Performance-based training: from job and task analysis to training materials

    International Nuclear Information System (INIS)

    Davis, L.T.; Spinney, R.W.

    1983-01-01

    Historically, the smoke filled room approach has been used to revise training programs: instructors would sit down and design a program based on existing training materials and any federal requirements that applied. This failure to reflect a systematic definition of required job functions, responsibilities and performance standards in training programs has resulted in generic program deficiencies: they do not provide complete training of required skills and knowledge. Recognition of this need for change, coupled with a decrease in experienced industry personnel inputs and long training pipelines, has heightened the need for efficient performance-based training programs which are derived from and referenced to job performance criteria. This paper presents the process for developing performance-based training materials based on job and task analysis products

  6. Human factors assessment in PRA using Task Analysis Linked Evaluation Technique (TALENT)

    International Nuclear Information System (INIS)

    Wells, J.E.; Banks, W.W.

    1991-01-01

    Thirty years ago the US military and US aviation industry, and more recently, in response to the US Three Mile Island and USSR Chernobyl accidents, the US commercial nuclear power industry, acknowledged that human error, as an immediate precursor, and as a latent or indirect influence in the form of training, maintainability, inservice test, and surveillance programs, is a primary contributor to unreliability and risk in complex high-reliability systems. A 1985 Nuclear Regulatory Commission (NRC) study of Licensee Event Reports (LERs) suggests that upwards of 65% of commercial nuclear system failures involve human error. Despite the magnitude and nature of human error cited in that study, there has been limited attention to personnel-centered issues, especially person-to-person issues involving group processes, management and organizational environment. The paper discusses NRC integration and applications research with respect to the Task Analysis Linked Evaluation Technique (TALENT) in risk assessment applications.

  7. Identifying optimum performance trade-offs using a cognitively bounded rational analysis model of discretionary task interleaving.

    Science.gov (United States)

    Janssen, Christian P; Brumby, Duncan P; Dowell, John; Chater, Nick; Howes, Andrew

    2011-01-01

    We report the results of a dual-task study in which participants performed a tracking and typing task under various experimental conditions. An objective payoff function was used to provide explicit feedback on how participants should trade off performance between the tasks. Results show that participants' dual-task interleaving strategy was sensitive to changes in the difficulty of the tracking task and resulted in differences in overall task performance. To test the hypothesis that people select strategies that maximize payoff, a Cognitively Bounded Rational Analysis model was developed. This analysis evaluated a variety of dual-task interleaving strategies to identify the optimal strategy for maximizing payoff in each condition. The model predicts that the region of optimum performance is different between experimental conditions. The correspondence between human data and the prediction of the optimal strategy is found to be remarkably high across a number of performance measures. This suggests that participants were honing their behavior to maximize payoff. Limitations are discussed. Copyright © 2011 Cognitive Science Society, Inc.
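
    A schematic of the optimal-strategy analysis described above might look like the sketch below: enumerate candidate interleaving strategies, score each with an explicit payoff function, and predict the maximizing one. The payoff constants are invented placeholders rather than the study's actual objective function.

    ```python
    # Schematic of the optimal-strategy analysis: enumerate interleaving
    # strategies (here, "type k letters, then check the tracking task"), score
    # each with an explicit payoff function, and predict the maximizing one.
    # The payoff constants below are invented placeholders, not the study's.

    def expected_payoff(k, tracking_difficulty):
        """Toy payoff: typing reward minus an expected tracking penalty that
        grows with time away from the cursor and with task difficulty."""
        typing_time = 0.3 * k                     # seconds spent typing per visit
        typing_reward = 1.0 * k                   # points for letters typed
        drift_penalty = tracking_difficulty * typing_time ** 2
        switch_cost = 0.5                         # cost of each task switch
        return typing_reward - drift_penalty - switch_cost

    for difficulty in (0.5, 2.0):                 # easy vs. hard tracking
        best_k = max(range(1, 21), key=lambda k: expected_payoff(k, difficulty))
        print(f"difficulty={difficulty}: predicted strategy = type {best_k} "
              f"letters per visit (payoff {expected_payoff(best_k, difficulty):.1f})")
    ```

    With these toy constants the predicted visit length shrinks as tracking difficulty grows, which mirrors the qualitative finding that interleaving strategies shift with tracking difficulty.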

  8. Graph theoretical analysis of EEG effective connectivity in vascular dementia patients during a visual oddball task.

    Science.gov (United States)

    Wang, Chao; Xu, Jin; Zhao, Songzhen; Lou, Wutao

    2016-01-01

    The study was dedicated to investigating the change in information processing in brain networks of vascular dementia (VaD) patients during the process of decision making. EEG was recorded from 18 VaD patients and 19 healthy controls when subjects were performing a visual oddball task. The whole task was divided into several stages by using global field power analysis. In the stage related to the decision-making process, graph theoretical analysis was applied to the binary directed network derived from EEG signals at nine electrodes in the frontal, central, and parietal regions in δ (0.5-3.5Hz), θ (4-7Hz), α1 (8-10Hz), α2 (11-13Hz), and β (14-30Hz) frequency bands based on directed transfer function. A weakened outgoing information flow, a decrease in out-degree, and an increase in in-degree were found in the parietal region in VaD patients, compared to healthy controls. In VaD patients, the parietal region may also lose its hub status in brain networks. In addition, the clustering coefficient was significantly lower in VaD patients. Impairment might be present in the parietal region or its connections with other regions, and it may serve as one of the causes for cognitive decline in VaD patients. The brain networks of VaD patients were significantly altered toward random networks. The present study extended our understanding of VaD from the perspective of brain functional networks, and it provided possible interpretations for cognitive deficits in VaD patients. Copyright © 2015 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
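
    A minimal sketch of the network-construction step, under the assumption that the DTF matrix is thresholded into a binary directed graph over the nine electrodes; the DTF values below are random placeholders rather than estimates from EEG.

    ```python
    # Minimal sketch of the network construction: threshold a directed transfer
    # function (DTF) matrix over nine electrodes into a binary directed graph,
    # then read off in/out-degree and clustering. DTF values here are random
    # placeholders, not estimated from EEG.
    import numpy as np
    import networkx as nx

    electrodes = ["F3", "Fz", "F4", "C3", "Cz", "C4", "P3", "Pz", "P4"]
    rng = np.random.default_rng(1)
    dtf = rng.uniform(size=(9, 9))        # dtf[i, j] ~ information flow j -> i
    np.fill_diagonal(dtf, 0.0)

    threshold = 0.7
    G = nx.DiGraph()
    G.add_nodes_from(electrodes)
    for i, target in enumerate(electrodes):
        for j, source in enumerate(electrodes):
            if i != j and dtf[i, j] >= threshold:
                G.add_edge(source, target)   # edge direction follows the flow

    out_deg = dict(G.out_degree())
    in_deg = dict(G.in_degree())
    clust = nx.clustering(G)                 # clustering coefficient (directed)
    print("Pz:", "out =", out_deg["Pz"], "in =", in_deg["Pz"],
          "clustering =", round(clust["Pz"], 3))
    ```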

  9. An analysis of the application of AI to the development of intelligent aids for flight crew tasks

    Science.gov (United States)

    Baron, S.; Feehrer, C.

    1985-01-01

    This report presents the results of a study aimed at developing a basis for applying artificial intelligence to the flight deck environment of commercial transport aircraft. In particular, the study was comprised of four tasks: (1) analysis of flight crew tasks, (2) survey of the state-of-the-art of relevant artificial intelligence areas, (3) identification of human factors issues relevant to intelligent cockpit aids, and (4) identification of artificial intelligence areas requiring further research.

  10. A human error analysis methodology, AGAPE-ET, for emergency tasks in nuclear power plants and its application

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jae Whan; Jung, Won Dea [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2002-03-01

    This report presents a procedurised human reliability analysis (HRA) methodology, AGAPE-ET (A Guidance And Procedure for Human Error Analysis for Emergency Tasks), for both qualitative error analysis and quantification of the human error probability (HEP) of emergency tasks in nuclear power plants. AGAPE-ET is based on a simplified cognitive model. For each cognitive function, error causes or error-likely situations have been identified considering the characteristics of the performance of each cognitive function and the influencing mechanism of PIFs on the cognitive function. Error analysis items have then been determined from the identified error causes or error-likely situations to cue or guide the analysts in the overall human error analysis. A human error analysis procedure based on the error analysis items is organised. The basic scheme for the quantification of HEP consists of multiplying the BHEP assigned to the error analysis item by the weight from the influencing factors decision tree (IFDT) constructed for each cognitive function. The method is characterised by structured identification of the weak points of the task to be performed and by an efficient analysis process in which the analysts need only consider the necessary cognitive functions. The report also presents the application of AGAPE-ET to 31 nuclear emergency tasks and its results. 42 refs., 7 figs., 36 tabs. (Author)
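
    The quantification scheme described above (HEP = BHEP of the error analysis item multiplied by a weight from the IFDT) can be illustrated in a few lines; the item names, basic HEPs and weights below are invented placeholders, not values from the report.

    ```python
    # Illustration of the AGAPE-ET quantification scheme as described above:
    # HEP = BHEP (basic HEP of the error analysis item) x weight taken from the
    # influencing factors decision tree (IFDT) of the cognitive function.
    # All numbers below are illustrative placeholders, not values from the report.
    bhep = {
        "misread_indicator": 3e-3,        # error analysis item -> basic HEP
        "skip_procedure_step": 1e-3,
    }

    def ifdt_weight(time_pressure, interface_quality):
        """Toy decision tree: poor conditions multiply the basic HEP."""
        if time_pressure == "high" and interface_quality == "poor":
            return 10.0
        if time_pressure == "high" or interface_quality == "poor":
            return 5.0
        return 1.0

    hep = bhep["misread_indicator"] * ifdt_weight("high", "adequate")
    print(f"HEP estimate: {hep:.1e}")     # 3e-3 * 5 = 1.5e-2
    ```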

  11. The limit shape problem for ensembles of Young diagrams

    CERN Document Server

    Hora, Akihito

    2016-01-01

    This book treats ensembles of Young diagrams originating from group-theoretical contexts and investigates what statistical properties are observed there in a large-scale limit. The focus is mainly on analyzing the interesting phenomenon that specific curves appear in the appropriate scaling limit for the profiles of Young diagrams. This problem is regarded as an important origin of recent vital studies on harmonic analysis of huge symmetry structures. Mathematically, an asymptotic theory of representations of the symmetric groups of degree n is developed as n goes to infinity. The framework of rigorous limit theorems (especially the law of large numbers) in probability theory is employed, as well as combinatorial analysis of group characters of symmetric groups and applications of Voiculescu's free probability. The central destination here is a clear description of the asymptotic behavior of rescaled profiles of Young diagrams in the Plancherel ensemble from both static and dynamic points of view.
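
    For orientation, the best-known instance of the limit-shape phenomenon is quoted below from the standard literature (not from the book itself): under the Plancherel measure, rescaled diagram profiles concentrate around the Vershik-Kerov/Logan-Shepp curve.

    ```latex
    % Vershik-Kerov / Logan-Shepp limit shape of Plancherel-random Young diagrams
    % (profile in "Russian" coordinates, after rescaling by \sqrt{n}):
    \Omega(u) =
    \begin{cases}
      \dfrac{2}{\pi}\left( u \arcsin\dfrac{u}{2} + \sqrt{4 - u^{2}} \right), & |u| \le 2,\\[2mm]
      |u|, & |u| > 2.
    \end{cases}
    ```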

  12. Cognitive task analysis of nuclear power plant operators for man-machine interface design

    International Nuclear Information System (INIS)

    Itoh, J.I.; Yoshimura, S.; Ohtsuka, T.

    1990-01-01

    This paper aims to ascertain and further develop design guidelines for a man-machine interface compatible with plant operators' problem solving strategies. As the framework for this study, operator's information processing activities were modeled, based on J. Rasmussen's framework for cognitive task analysis. Two experiments were carried out. One was an experiment aimed at gaining an understanding of internal mechanisms involved in mistakes and slips which occurred in operators' responses to incidents and accidents. As a result of fifteen cases of operator performance analysis, sixty one human errors were identified. Further analysis of the errors showed that frequently occurring error mechanisms were absent-mindedness, lack of recognition of patterns in diagnosis and failed procedure formulation due to memory lapses. The other kind of experiment was carried out to identify the envelope of trajectories for the operator's search in the problem space consisting of the two dimensions of means-ends and whole-part relations while dealing with transients. Two cases of experimental sessions were conducted with the thinking-aloud method. From analyses based on verbal protocols, trajectories of operator's search were derived, covering from the whole plant level through the component level in the whole-part dimension and covering from the functional purpose level through the physical form level in the means-ends dimension. The findings obtained from these analyses serve as a basis for developing design guidelines for man-machine interfaces in control rooms of nuclear power plants

  13. Diagrams benefit symbolic problem-solving.

    Science.gov (United States)

    Chu, Junyi; Rittle-Johnson, Bethany; Fyfe, Emily R

    2017-06-01

    The format of a mathematics problem often influences students' problem-solving performance. For example, providing diagrams in conjunction with story problems can benefit students' understanding, choice of strategy, and accuracy on story problems. However, it remains unclear whether providing diagrams in conjunction with symbolic equations can benefit problem-solving performance as well. We tested the impact of diagram presence on students' performance on algebra equation problems to determine whether diagrams increase problem-solving success. We also examined the influence of item- and student-level factors to test the robustness of the diagram effect. We worked with 61 seventh-grade students who had received 2 months of pre-algebra instruction. Students participated in an experimenter-led classroom session. Using a within-subjects design, students solved algebra problems in two matched formats (equation and equation-with-diagram). The presence of diagrams increased equation-solving accuracy and the use of informal strategies. This diagram benefit was independent of student ability and item complexity. The benefits of diagrams found previously for story problems generalized to symbolic problems. The findings are consistent with cognitive models of problem-solving and suggest that diagrams may be a useful additional representation of symbolic problems. © 2017 The British Psychological Society.

  14. COMMIT at SemEval-2017 Task 5: Ontology-based Method for Sentiment Analysis of Financial Headlines

    NARCIS (Netherlands)

    Schouten, Kim; Frasincar, Flavius; de Jong, F.M.G.

    2017-01-01

    This paper describes our submission to Task 5 of SemEval 2017, Fine-Grained Sentiment Analysis on Financial Microblogs and News, where we limit ourselves to performing sentiment analysis on news headlines only (track 2). The approach presented in this paper uses a Support Vector Machine to do the

  15. Components of Task-Based Needs Analysis of the ESP Learners with the Specialization of Business and Tourism

    Science.gov (United States)

    Poghosyan, Naira

    2016-01-01

    In the following paper we shall thoroughly analyze the target learning needs of the learners within an ESP (English for Specific Purposes) context. The main concerns of ESP have always been and remain with the needs analysis, text analysis and preparing learners to communicate effectively in the tasks prescribed by their study or work situation.…

  16. Towards Diagram Understanding: A Pilot Study Measuring Cognitive Workload Through Eye-Tracking

    DEFF Research Database (Denmark)

    Maier, Anja; Baltsen, Nick; Christoffersen, Henrik

    2014-01-01

    We investigate model understanding, in particular how the quality of the UML diagram layout impacts cognitive load. We hypothesize that this will have a significant impact on the structure and effectiveness of engineers’ communication. In previous work, we have studied task performance measurements and subjective assessments; here, we also investigate behavioral indicators such as fixation and pupillary dilation. We use such indicators to explore diagram understanding and reading strategies and how such strategies are impacted, e.g. by diagram type and expertise level. In the pilot eye

  17. Analysis of Skeletal Muscle Metrics as Predictors of Functional Task Performance

    Science.gov (United States)

    Ryder, Jeffrey W.; Buxton, Roxanne E.; Redd, Elizabeth; Scott-Pandorf, Melissa; Hackney, Kyle J.; Fiedler, James; Ploutz-Snyder, Robert J.; Bloomberg, Jacob J.; Ploutz-Snyder, Lori L.

    2010-01-01

    PURPOSE: The ability to predict task performance using physiological performance metrics is vital to ensure that astronauts can execute their jobs safely and effectively. This investigation used a weighted suit to evaluate task performance at various ratios of strength, power, and endurance to body weight. METHODS: Twenty subjects completed muscle performance tests and functional tasks representative of those that would be required of astronauts during planetary exploration (see table for specific tests/tasks). Subjects performed functional tasks while wearing a weighted suit with additional loads ranging from 0-120% of initial body weight. Performance metrics were time to completion for all tasks except hatch opening, which consisted of total work. Task performance metrics were plotted against muscle metrics normalized to "body weight" (subject weight + external load; BW) for each trial. Fractional polynomial regression was used to model the relationship between muscle and task performance. CONCLUSION: LPMIF/BW is the best predictor of performance for predominantly lower-body tasks that are ambulatory and of short duration. LPMIF/BW is a very practical predictor of occupational task performance as it is quick and relatively safe to perform. Accordingly, bench press work best predicts hatch-opening work performance.

  18. APPLICATION OF FISHBONE DIAGRAM TO DETERMINE THE RISK OF AN EVENT WITH MULTIPLE CAUSES

    OpenAIRE

    Gheorghe ILIE; Carmen Nadia CIOCOIU

    2010-01-01

    The fishbone diagram (also known as the Ishikawa diagram) was created with the goal of identifying and grouping the causes which generate a quality problem. Gradually, the method has also been used to group into categories the causes of other types of problems which an organization confronts. This has made the fishbone diagram a very useful instrument in the risk identification stage. The article proposes to extend the applicability of the method by including in the analysis the probabilities and the i...

  19. Matrix model approximations of fuzzy scalar field theories and their phase diagrams

    Energy Technology Data Exchange (ETDEWEB)

    Tekel, Juraj [Department of Theoretical Physics, Faculty of Mathematics, Physics and Informatics, Comenius University, Mlynska Dolina, Bratislava, 842 48 (Slovakia)

    2015-12-29

    We present an analysis of two different approximations to the scalar field theory on the fuzzy sphere, a nonperturbative and a perturbative one, which are both multitrace matrix models. We show that the former reproduces a phase diagram with correct features in a qualitative agreement with the previous numerical studies and that the latter gives a phase diagram with features not expected in the phase diagram of the field theory.

  20. Effect of deformation on the continuous cooling transformation (CCT) diagram of steel 32CRB4

    OpenAIRE

    Kawulok, R.; Schindler, I.; Kawulok, P.; Rusz, S.; Opěla, P.; Solowski, Z.; Čmiel, K. M.

    2015-01-01

    CCT and DCCT steel diagrams of the steel 32CrB4 were determined by the universal plastometer GLEEBLE 3800 on the basis of dilatometric tests. Dilatometric analysis showed that compared to the diagram provided by the software QTSteel the noses of individual curves are in fact shifted towards shorter times. Preceding deformation significantly affected the decay diagram of the investigated steel. Shorter times, which were available for recovery of the deformed structure during more...

  1. Disconnected Diagrams in Lattice QCD

    Science.gov (United States)

    Gambhir, Arjun Singh

    In this work, we present state-of-the-art numerical methods and their applications for computing a particular class of observables using lattice quantum chromodynamics (Lattice QCD), a discretized version of the fundamental theory of quarks and gluons. These observables require calculating so called "disconnected diagrams" and are important for understanding many aspects of hadron structure, such as the strange content of the proton. We begin by introducing the reader to the key concepts of Lattice QCD and rigorously define the meaning of disconnected diagrams through an example of the Wick contractions of the nucleon. Subsequently, the calculation of observables requiring disconnected diagrams is posed as the computationally challenging problem of finding the trace of the inverse of an incredibly large, sparse matrix. This is followed by a brief primer of numerical sparse matrix techniques that overviews broadly used methods in Lattice QCD and builds the background for the novel algorithm presented in this work. We then introduce singular value deflation as a method to improve convergence of trace estimation and analyze its effects on matrices from a variety of fields, including chemical transport modeling, magnetohydrodynamics, and QCD. Finally, we apply this method to compute observables such as the strange axial charge of the proton and strange sigma terms in light nuclei. The work in this thesis is innovative for four reasons. First, we analyze the effects of deflation with a model that makes qualitative predictions about its effectiveness, taking only the singular value spectrum as input, and compare deflated variance with different types of trace estimator noise. Second, the synergy between probing methods and deflation is investigated both experimentally and theoretically. Third, we use the synergistic combination of deflation and a graph coloring algorithm known as hierarchical probing to conduct a lattice calculation of light disconnected matrix elements
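
    A stripped-down numerical illustration of the central computational problem and of the deflation idea: estimate tr(A^{-1}) with a Hutchinson-style stochastic estimator, treating the lowest eigenmodes exactly and estimating only the deflated remainder. A small dense symmetric matrix stands in for the huge, sparse lattice Dirac operator; this is not the thesis code.

    ```python
    # Stripped-down illustration of stochastic trace estimation with deflation:
    # tr(A^{-1}) is split into an exact contribution from the lowest eigenmodes
    # plus a Hutchinson estimate of the deflated remainder. A small dense SPD
    # matrix stands in for the (huge, sparse) lattice Dirac operator.
    import numpy as np

    rng = np.random.default_rng(7)
    n, k, n_noise = 200, 10, 50

    # Stand-in operator: symmetric positive definite with a few small eigenvalues.
    B = rng.normal(size=(n, n))
    A = B @ B.T + 0.1 * np.eye(n)

    eigval, eigvec = np.linalg.eigh(A)
    V = eigvec[:, :k]                          # k lowest modes to deflate
    exact_part = np.sum(1.0 / eigval[:k])      # their exact contribution

    A_inv = np.linalg.inv(A)                   # proxy for "solve A x = z"
    estimates = []
    for _ in range(n_noise):
        z = rng.choice([-1.0, 1.0], size=n)    # Rademacher noise vector
        z_defl = z - V @ (V.T @ z)             # project out the deflated modes
        estimates.append(z_defl @ (A_inv @ z_defl))

    trace_est = exact_part + np.mean(estimates)
    print("deflated estimate:", round(trace_est, 3),
          " exact trace:", round(np.trace(A_inv), 3))
    ```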

  3. Job task and functional analysis of the Division of Reactor Projects, office of Nuclear Reactor Regulation. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Morzinski, J.A.; Gilmore, W.; Hahn, H.A.

    1998-07-10

    A job task and functional analysis was recently completed for the positions that make up the regional Divisions of Reactor Projects. Among the conclusions of that analysis was a recommendation to clarify roles and responsibilities among site, regional, and headquarters personnel. As that analysis did not cover headquarters personnel, a similar analysis was undertaken of three headquarters positions within the Division of Reactor Projects: Licensing Assistants, Project Managers, and Project Directors. The goals of this analysis were to systematically evaluate the tasks performed by these headquarters personnel to determine job training requirements, to account for variations due to division/regional assignment or differences in several experience categories, and to determine how, and by which positions, certain functions are best performed. The results of this analysis include recommendations for training and for job design. Data to support this analysis was collected by a survey instrument and through several sets of focus group meetings with representatives from each position.

  4. Human factors assessment in PRA using task analysis linked evaluation technique (TALENT)

    International Nuclear Information System (INIS)

    Wells, J.E.; Banks, W.W.

    1990-01-01

    Human error is a primary contributor to risk in complex high-reliability systems. A 1985 U.S. Nuclear Regulatory Commission (USNRC) study of licensee event reports (LERs) suggests that upwards of 65% of commercial nuclear system failures involve human error. Since then, the USNRC has initiated research to fully and properly integrate human errors into the probabilistic risk assessment (PRA) process. The resulting implementation procedure is known as the Task Analysis Linked Evaluation Technique (TALENT). As indicated, TALENT is a broad-based method for integrating human factors expertise into the PRA process. This process achieves results which: (1) provide more realistic estimates of the impact of human performance on nuclear power safety, (2) can be fully audited, (3) provide a firm technical base for equipment-centered and personnel-centered retrofit/redesign of plants enabling them to meet internally and externally imposed safety standards, and (4) yield human and hardware data capable of supporting inquiries into human performance issues that transcend the individual plant. The TALENT procedure is being field-tested to verify its effectiveness and utility. The objectives of the field-test are to examine (1) the operability of the process, (2) its acceptability to the users, and (3) its usefulness for achieving measurable improvements in the credibility of the analysis. The field-test will provide the information needed to enhance the TALENT process

  5. Bayesian change-point analysis reveals developmental change in a classic theory of mind task.

    Science.gov (United States)

    Baker, Sara T; Leslie, Alan M; Gallistel, C R; Hood, Bruce M

    2016-12-01

    Although learning and development reflect changes situated in an individual brain, most discussions of behavioral change are based on the evidence of group averages. Our reliance on group-averaged data creates a dilemma. On the one hand, we need to use traditional inferential statistics. On the other hand, group averages are highly ambiguous when we need to understand change in the individual; the average pattern of change may characterize all, some, or none of the individuals in the group. Here we present a new method for statistically characterizing developmental change in each individual child we study. Using false-belief tasks, fifty-two children in two cohorts were repeatedly tested for varying lengths of time between 3 and 5 years of age. Using a novel Bayesian change point analysis, we determined both the presence and-just as importantly-the absence of change in individual longitudinal cumulative records. Whenever the analysis supports a change conclusion, it identifies in that child's record the most likely point at which change occurred. Results show striking variability in patterns of change and stability across individual children. We then group the individuals by their various patterns of change or no change. The resulting patterns provide scarce support for sudden changes in competence and shed new light on the concepts of "passing" and "failing" in developmental studies. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
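
    A minimal sketch can make the single-subject logic concrete, under the assumption of a plain Beta-Bernoulli model: compare the marginal likelihood of a no-change model against models with a change after each possible trial, which yields both a Bayes factor for "change versus no change" and a posterior over change points. The record below is hypothetical, and the published analysis treats cumulative records and evidence thresholds more carefully.

      # Toy Bayesian change-point check on one (hypothetical) pass/fail record.
      import numpy as np
      from scipy.special import betaln

      def log_marginal(x, a=1.0, b=1.0):
          # Beta-Bernoulli log marginal likelihood of a binary segment.
          s = int(np.sum(x))
          f = len(x) - s
          return betaln(a + s, b + f) - betaln(a, b)

      def change_point_analysis(record):
          x = np.asarray(record)
          log_m0 = log_marginal(x)                                     # no-change model
          logs = np.array([log_marginal(x[:t]) + log_marginal(x[t:])   # change after trial t
                           for t in range(1, len(x))])
          log_m1 = np.logaddexp.reduce(logs) - np.log(len(logs))       # uniform prior over t
          posterior = np.exp(logs - np.logaddexp.reduce(logs))
          return log_m1 - log_m0, posterior                            # log Bayes factor, P(t | data)

      record = [0, 0, 1, 0, 0, 0, 1, 1, 1, 1, 1, 1]                    # hypothetical outcomes
      log_bf, posterior = change_point_analysis(record)
      print(f"log Bayes factor (change vs. no change): {log_bf:.2f}")
      print("most likely change after trial", int(posterior.argmax()) + 1)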

  6. Scalability of Several Asynchronous Many-Task Models for In Situ Statistical Analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Pebay, Philippe Pierre [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Bennett, Janine Camille [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Kolla, Hemanth [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Borghesi, Giulio [Sandia National Lab. (SNL-CA), Livermore, CA (United States)

    2017-05-01

    This report is a sequel to [PB16], in which we provided a first progress report on research and development towards a scalable, asynchronous many-task, in situ statistical analysis engine using the Legion runtime system. This earlier work included a prototype implementation of a proposed solution, using a proxy mini-application as a surrogate for a full-scale scientific simulation code. The first scalability studies were conducted with the above on modestly-sized experimental clusters. In contrast, in the current work we have integrated our in situ analysis engines with a full-size scientific application (S3D, using the Legion-SPMD model), and have conducted numerical tests on the largest computational platform currently available for DOE science applications. We also provide details regarding the design and development of a light-weight asynchronous collectives library. We describe how this library is utilized within our SPMD-Legion S3D workflow, and compare the data aggregation technique deployed herein to the approach taken within our previous work.

  7. Comparison of causality analysis on simultaneously measured fMRI and NIRS signals during motor tasks.

    Science.gov (United States)

    Anwar, Abdul Rauf; Muthalib, Makii; Perrey, Stephane; Galka, Andreas; Granert, Oliver; Wolff, Stephan; Deuschl, Guenther; Raethjen, Jan; Heute, Ulrich; Muthuraman, Muthuraman

    2013-01-01

    Brain activity can be measured using different modalities. Since most of the modalities tend to complement each other, it seems promising to measure them simultaneously. In the research presented here, data recorded simultaneously from Functional Magnetic Resonance Imaging (fMRI) and Near Infrared Spectroscopy (NIRS) are subjected to causality analysis using time-resolved partial directed coherence (tPDC). Time-resolved partial directed coherence uses the principle of state space modelling to estimate Multivariate Autoregressive (MVAR) coefficients. This method is useful to visualize both frequency and time dynamics of causality between the time series. Afterwards, causality results from different modalities are compared by estimating the Spearman correlation. In the present study, we used directionality vectors to analyze correlation, rather than actual signal vectors. Results show that causality analysis of the fMRI correlates more closely to causality results of oxy-NIRS as compared to deoxy-NIRS in the case of a finger sequencing task. However, in the case of simple finger tapping, no clear difference between oxy-fMRI and deoxy-fMRI correlation is identified.
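
    The final comparison step admits a compact, hedged sketch: once each modality has been reduced to a vector of directionality (causality) strengths over the same set of channel pairs, a Spearman rank correlation quantifies their agreement. The vectors below are placeholders, not data from the study.

      # Hedged sketch: Spearman correlation between causality (directionality) vectors
      # from two modalities; the numbers are placeholders for illustration only.
      import numpy as np
      from scipy.stats import spearmanr

      fmri_directionality = np.array([0.42, 0.10, 0.77, 0.35, 0.58, 0.22])    # per channel pair
      oxy_nirs_directionality = np.array([0.38, 0.15, 0.70, 0.30, 0.61, 0.25])

      rho, p_value = spearmanr(fmri_directionality, oxy_nirs_directionality)
      print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")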

  8. Multifamily Building Operator Job/Task Analysis and Report: September 2013

    Energy Technology Data Exchange (ETDEWEB)

    Owens, C. M.

    2013-09-01

    The development of job/task analyses (JTAs) is one of three components of the Guidelines for Home Energy Professionals project and will allow industry to develop training resources, quality assurance protocols, accredited training programs, and professional certifications. The Multifamily Building Operator JTA identifies and catalogs all of the tasks performed by multifamily building operators, as well as the knowledge, skills, and abilities (KSAs) needed to perform the identified tasks.

  9. Multifamily Retrofit Project Manager Job/Task Analysis and Report: September 2013

    Energy Technology Data Exchange (ETDEWEB)

    Owens, C. M.

    2013-09-01

    The development of job/task analyses (JTAs) is one of three components of the Guidelines for Home Energy Professionals project and will allow industry to develop training resources, quality assurance protocols, accredited training programs, and professional certifications. The Multifamily Retrofit Project Manager JTA identifies and catalogs all of the tasks performed by multifamily retrofit project managers, as well as the knowledge, skills, and abilities (KSAs) needed to perform the identified tasks.

  10. Multifamily Energy Auditor Job/Task Analysis and Report: September 2013

    Energy Technology Data Exchange (ETDEWEB)

    Owens, C. M.

    2013-09-01

    The development of job/task analyses (JTAs) is one of three components of the Guidelines for Home Energy Professionals project and will allow industry to develop training resources, quality assurance protocols, accredited training programs, and professional certifications. The Multifamily Energy Auditor JTA identifies and catalogs all of the tasks performed by multifamily energy auditors, as well as the knowledge, skills, and abilities (KSAs) needed to perform the identified tasks.

  11. Multifamily Quality Control Inspector Job/Task Analysis and Report: September 2013

    Energy Technology Data Exchange (ETDEWEB)

    Owens, C. M.

    2013-09-01

    The development of job/task analyses (JTAs) is one of three components of the Guidelines for Home Energy Professionals project and will allow industry to develop training resources, quality assurance protocols, accredited training programs, and professional certifications. The Multifamily Quality Control Inspector JTA identifies and catalogs all of the tasks performed by multifamily quality control inspectors, as well as the knowledge, skills, and abilities (KSAs) needed to perform the identified tasks.

  12. The use of a cognitive task analysis-based multimedia program to teach surgical decision making in flexor tendon repair.

    Science.gov (United States)

    Luker, Kali R; Sullivan, Maura E; Peyre, Sarah E; Sherman, Randy; Grunwald, Tiffany

    2008-01-01

    The aim of this study was to compare the surgical knowledge of residents before and after receiving a cognitive task analysis-based multimedia teaching module. Ten plastic surgery residents were evaluated performing flexor tendon repair on 3 occasions. Traditional learning occurred between the first and second trial and served as the control. A teaching module was introduced as an intervention between the second and third trial using cognitive task analysis to illustrate decision-making skills. All residents showed improvement in their decision-making ability when performing flexor tendon repair after each surgical procedure. The group improved through traditional methods as well as through exposure to our talk-aloud protocol (P > .01). After being trained with the cognitive task analysis curriculum, the group displayed a statistically significant expansion of knowledge. Residents who received the multimedia surgical curriculum instruction achieved greater command of problem solving and are better equipped to make correct decisions in flexor tendon repair.

  13. Gravity on-shell diagrams

    Energy Technology Data Exchange (ETDEWEB)

    Herrmann, Enrico [Walter Burke Institute for Theoretical Physics, California Institute of Technology,Pasadena, CA 91125 (United States); Trnka, Jaroslav [Center for Quantum Mathematics and Physics (QMAP),Department of Physics, University of California,Davis, CA 95616 (United States)

    2016-11-22

    We study on-shell diagrams for gravity theories with any number of supersymmetries and find a compact Grassmannian formula in terms of edge variables of the graphs. Unlike in gauge theory where the analogous form involves only dlog-factors, in gravity there is a non-trivial numerator as well as higher degree poles in the edge variables. Based on the structure of the Grassmannian formula for N=8 supergravity we conjecture that gravity loop amplitudes also possess similar properties. In particular, we find that there are only logarithmic singularities on cuts with finite loop momentum and that poles at infinity are present, in complete agreement with the conjecture presented in http://dx.doi.org/10.1007/JHEP06(2015)202.

  14. Phase diagram of ammonium nitrate

    International Nuclear Information System (INIS)

    Dunuwille, Mihindra; Yoo, Choong-Shik

    2013-01-01

    Ammonium Nitrate (AN) is a fertilizer, yet becomes an explosive upon a small addition of chemical impurities. The origin of enhanced chemical sensitivity in impure AN (or AN mixtures) is not well understood, posing significant safety issues in using AN even today. To remedy the situation, we have carried out an extensive study to investigate the phase stability of AN and its mixtures with hexane (ANFO–AN mixed with fuel oil) and Aluminum (Ammonal) at high pressures and temperatures, using diamond anvil cells (DAC) and micro-Raman spectroscopy. The results indicate that pure AN decomposes to N2, N2O, and H2O at the onset of the melt, whereas the mixtures, ANFO and Ammonal, decompose at substantially lower temperatures. The present results also confirm the recently proposed phase IV-IV′ transition above 17 GPa and provide new constraints for the melting and phase diagram of AN to 40 GPa and 400°C.

  15. VORONOI DIAGRAMS WITHOUT BOUNDING BOXES

    Directory of Open Access Journals (Sweden)

    E. T. K. Sang

    2015-10-01

    We present a technique for presenting geographic data in Voronoi diagrams without having to specify a bounding box. The method restricts Voronoi cells to points within a user-defined distance of the data points. The mathematical foundation of the approach is presented as well. The cell clipping method is particularly useful for presenting geographic data that is spread in an irregular way over a map, as for example the Dutch dialect data displayed in Figure 2. The automatic generation of reasonable cell boundaries also makes redundant a frequently used solution to this problem that requires data owners to specify region boundaries, as in Goebl (2010) and Nerbonne et al. (2011).
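
    The clipping idea can be illustrated with a deliberately simple raster sketch, under the assumption that a pixel-level approximation is acceptable: every grid pixel within the user-defined distance of some data point is coloured by its nearest point, and pixels farther away stay empty, so no bounding box is ever introduced. The published method clips the vector cells themselves; the sketch only reproduces the visual effect.

      # Raster approximation of distance-clipped Voronoi cells (illustrative points and radius).
      import numpy as np
      from scipy.spatial import cKDTree
      import matplotlib.pyplot as plt

      rng = np.random.default_rng(1)
      points = rng.uniform(0, 10, size=(25, 2))      # irregularly spread data points
      radius = 1.5                                   # user-defined clipping distance

      xs, ys = np.meshgrid(np.linspace(-1, 11, 600), np.linspace(-1, 11, 600))
      grid = np.column_stack([xs.ravel(), ys.ravel()])

      dist, owner = cKDTree(points).query(grid)      # nearest data point for every pixel
      cells = np.where(dist <= radius, owner, -1).reshape(xs.shape)

      plt.imshow(np.ma.masked_equal(cells, -1), origin='lower',
                 extent=(-1, 11, -1, 11), cmap='tab20', interpolation='nearest')
      plt.scatter(points[:, 0], points[:, 1], c='k', s=8)
      plt.title('Voronoi cells clipped to a fixed distance (raster approximation)')
      plt.show()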

  16. Anatomy of geodesic Witten diagrams

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Heng-Yu; Kuo, En-Jui [Department of Physics and Center for Theoretical Sciences, National Taiwan University,Taipei 10617, Taiwan (China); Kyono, Hideki [Department of Physics, Kyoto University,Kitashirakawa Oiwake-cho, Kyoto 606-8502 (Japan)

    2017-05-12

    We revisit the so-called “Geodesic Witten Diagrams” (GWDs) https://www.doi.org/10.1007/JHEP01(2016)146, proposed to be the holographic dual configuration of scalar conformal partial waves, from the perspectives of CFT operator product expansions. To this end, we explicitly consider three point GWDs which are natural building blocks of all possible four point GWDs, discuss their gluing procedure through integration over spectral parameter, and this leads us to a direct identification with the integral representation of CFT conformal partial waves. As a main application of this general construction, we consider the holographic dual of the conformal partial waves for external primary operators with spins. Moreover, we consider the closely related “split representation” for the bulk to bulk spinning propagator, to demonstrate how ordinary scalar Witten diagram with arbitrary spin exchange, can be systematically decomposed into scalar GWDs. We also discuss how to generalize to spinning cases.

  17. Study on the utilization of the cognitive architecture EPIC to the task analysis of a nuclear power plant operator

    International Nuclear Information System (INIS)

    Soares, Herculano Vieira

    2003-02-01

    This work presents a study of the use of the integrative cognitive architecture EPIC (Executive-Process Interactive-Control), designed to evaluate the performance of a person performing tasks in parallel at a man-machine interface, as a methodology for Cognitive Task Analysis of a nuclear power plant operator. The results obtained by simulation with EPIC are compared with those obtained by applying the MHP model to the tasks performed by a shift operator during execution of procedure PO-E-3 - Steam Generator Tube Rupture - of the Angra 1 Nuclear Power Plant. To support that comparison, an experiment was performed at the Angra 2 Nuclear Power Plant Full Scope Simulator in which three operator tasks were executed, their completion times measured and compared with the results of MHP and EPIC modeling. (author)

  18. Set-based Tasks within the Singularity-robust Multiple Task-priority Inverse Kinematics Framework: General Formulation, Stability Analysis and Experimental Results

    Directory of Open Access Journals (Sweden)

    Signe eMoe

    2016-04-01

    Inverse kinematics algorithms are commonly used in robotic systems to transform tasks to joint references, and several methods exist to ensure the achievement of several tasks simultaneously. The multiple task-priority inverse kinematics framework allows tasks to be considered in a prioritized order by projecting task velocities through the nullspaces of higher-priority tasks. This paper extends this framework to handle set-based tasks, i.e. tasks with a range of valid values, in addition to equality tasks, which have a specific desired value. Examples of set-based tasks are joint limit and obstacle avoidance. The proposed method is proven to ensure asymptotic convergence of the equality task errors and the satisfaction of all high-priority set-based tasks. The practical implementation of the proposed algorithm is discussed, and experimental results are presented where a number of both set-based and equality tasks have been implemented on a 6-degree-of-freedom UR5, an industrial robotic arm from Universal Robots. The experiments validate the theoretical results and confirm the effectiveness of the proposed approach.
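
    The core projection step can be sketched under simple assumptions (illustrative Jacobians and a crude joint-limit activation rule, not the paper's controller): lower-priority task velocities are filtered through the null space of all higher-priority tasks, and a set-based joint-limit task is promoted to the top of the priority list only when the joint approaches the boundary of its valid set.

      # Hedged sketch of singularity-robust, prioritized inverse kinematics with one
      # set-based (joint-limit) task; all matrices and thresholds are illustrative.
      import numpy as np

      def damped_pinv(J, damping=1e-3):
          # Damped pseudo-inverse, a common singularity-robust choice.
          return J.T @ np.linalg.inv(J @ J.T + damping * np.eye(J.shape[0]))

      def prioritized_qdot(tasks):
          # tasks: list of (Jacobian, desired task velocity) in descending priority.
          n = tasks[0][0].shape[1]
          qdot = np.zeros(n)
          N = np.eye(n)                            # null-space projector of higher-priority tasks
          for J, v in tasks:
              JN = J @ N
              JN_pinv = damped_pinv(JN)
              qdot = qdot + JN_pinv @ (v - J @ qdot)
              N = N - JN_pinv @ JN
          return qdot

      # Equality tasks for a 3-joint arm: end-effector velocity, then a posture preference.
      tasks = [(np.array([[1.0, 0.5, 0.2], [0.0, 0.8, 0.4]]), np.array([0.10, -0.05])),
               (np.array([[0.0, 1.0, 0.0]]), np.array([0.02]))]

      q = np.array([0.1, 1.55, -0.3])
      q_max = np.array([2.0, 1.6, 2.0])            # upper joint limits (the set-based task)
      if np.any(q >= q_max - 0.1):                 # a joint is close to its bound:
          i = int(np.argmax(q >= q_max - 0.1))     # activate the limit task at top priority,
          tasks.insert(0, (np.eye(3)[i:i + 1], np.array([0.0])))   # freezing that joint
      print(prioritized_qdot(tasks))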

  19. Inferring biological tasks using Pareto analysis of high-dimensional data.

    Science.gov (United States)

    Hart, Yuval; Sheftel, Hila; Hausser, Jean; Szekely, Pablo; Ben-Moshe, Noa Bossel; Korem, Yael; Tendler, Avichai; Mayo, Avraham E; Alon, Uri

    2015-03-01

    We present the Pareto task inference method (ParTI; http://www.weizmann.ac.il/mcb/UriAlon/download/ParTI) for inferring biological tasks from high-dimensional biological data. Data are described as a polytope, and features maximally enriched closest to the vertices (or archetypes) allow identification of the tasks the vertices represent. We demonstrate that human breast tumors and mouse tissues are well described by tetrahedrons in gene expression space, with specific tumor types and biological functions enriched at each of the vertices, suggesting four key tasks.
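
    The geometric idea can be illustrated with a deliberately crude sketch (not the ParTI algorithm itself, which relies on dedicated archetype fitting and statistical enrichment tests): synthetic points generated as mixtures of three archetypes fill a noisy triangle, the convex hull of the data approximates its boundary, and a feature can be checked for enrichment among the points closest to one vertex.

      # Crude illustration of the Pareto/archetype picture on synthetic 2-D data.
      import numpy as np
      from scipy.spatial import ConvexHull

      rng = np.random.default_rng(2)
      archetypes = np.array([[0.0, 0.0], [4.0, 0.0], [2.0, 3.0]])    # ground-truth vertices
      w = rng.dirichlet(np.ones(3), size=400)                        # task weights per sample
      X = w @ archetypes + 0.05 * rng.normal(size=(400, 2))          # points inside the triangle

      hull = ConvexHull(X)
      print("hull vertices (crude archetype candidates):", len(hull.vertices))

      # Enrichment check: a synthetic feature tied to the first task is highest near vertex 1.
      feature = w[:, 0] + 0.1 * rng.normal(size=400)
      d0 = np.linalg.norm(X - archetypes[0], axis=1)
      near, far = feature[d0 < 1.5], feature[d0 >= 1.5]
      print(f"mean feature near vertex: {near.mean():.2f}, elsewhere: {far.mean():.2f}")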

  20. Applications of phase diagrams in metallurgy and ceramics

    International Nuclear Information System (INIS)

    Carter, G.C.

    1978-03-01

    The workshop represents an effort to coordinate and reinforce the current efforts on compilation of phase diagrams of alloys and ceramics. Many research groups and individual scientists throughout the world are concerned with phase equilibrium data. Specialized expertise exists in small institutions as well as large laboratories. If this talent can be effectively utilized through a cooperative effort, the needs for such data can be met. The Office of Standard Reference Data, which serves as the program management office for the National Standard Reference Data System, is eager to work with all groups concerned with this problem. Through a cooperative international effort we can carry out a task which has become too large for an individual. Volume 2 presents computational techniques for phase diagram construction.

  1. Brain-computer interface analysis of a dynamic visuo-motor task.

    Science.gov (United States)

    Logar, Vito; Belič, Aleš

    2011-01-01

    The area of brain-computer interfaces (BCIs) represents one of the more interesting fields in neurophysiological research, since it investigates the development of the machines that perform different transformations of the brain's "thoughts" to certain pre-defined actions. Experimental studies have reported some successful implementations of BCIs; however, much of the field still remains unexplored. According to some recent reports the phase coding of informational content is an important mechanism in the brain's function and cognition, and has the potential to explain various mechanisms of the brain's data transfer, but it has yet to be scrutinized in the context of brain-computer interface. Therefore, if the mechanism of phase coding is plausible, one should be able to extract the phase-coded content, carried by brain signals, using appropriate signal-processing methods. In our previous studies we have shown that by using a phase-demodulation-based signal-processing approach it is possible to decode some relevant information on the current motor action in the brain from electroencephalographic (EEG) data. In this paper the authors would like to present a continuation of their previous work on the brain-information-decoding analysis of visuo-motor (VM) tasks. The present study shows that EEG data measured during more complex, dynamic visuo-motor (dVM) tasks carries enough information about the currently performed motor action to be successfully extracted by using the appropriate signal-processing and identification methods. The aim of this paper is therefore to present a mathematical model, which by means of the EEG measurements as its inputs predicts the course of the wrist movements as applied by each subject during the task in simulated or real time (BCI analysis). However, several modifications to the existing methodology are needed to achieve optimal decoding results and a real-time, data-processing ability. The information extracted from the EEG could
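
    One standard building block of such phase-based decoding can be sketched as follows, with the caveat that the sampling rate, frequency band, and synthetic signal are illustrative and that the authors' actual pipeline (state-space identification and prediction of wrist movement in simulated or real time) is considerably more involved: band-pass an EEG-like channel, form the analytic signal with the Hilbert transform, and unwrap its instantaneous phase to expose the phase modulation.

      # Hedged sketch: instantaneous-phase extraction from a band-limited EEG-like signal.
      import numpy as np
      from scipy.signal import butter, filtfilt, hilbert

      fs = 256.0                                          # sampling rate [Hz] (assumed)
      t = np.arange(0, 10, 1 / fs)
      rng = np.random.default_rng(3)
      eeg = np.sin(2 * np.pi * 10 * t + 0.8 * np.sin(2 * np.pi * 0.5 * t)) \
            + 0.3 * rng.normal(size=t.size)               # phase-modulated alpha rhythm + noise

      b, a = butter(4, [8 / (fs / 2), 12 / (fs / 2)], btype='band')   # alpha band
      alpha = filtfilt(b, a, eeg)

      phase = np.unwrap(np.angle(hilbert(alpha)))         # instantaneous phase
      modulation = phase - 2 * np.pi * 10 * t             # carrier removed: the phase-coded part
      print(modulation[:5])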

  2. Stereo 3D spatial phase diagrams

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Jinwu, E-mail: kangjw@tsinghua.edu.cn; Liu, Baicheng, E-mail: liubc@tsinghua.edu.cn

    2016-07-15

    Phase diagrams serve as fundamental guidance in materials science and engineering. Binary P-T-X (pressure–temperature–composition) and multi-component phase diagrams have complex spatial geometry, which makes them difficult to understand. The authors constructed 3D stereo binary P-T-X, typical ternary, and some quaternary phase diagrams. A phase diagram construction algorithm based on the phase reaction data calculated in PandaT was developed, and the 3D stereo phase diagram of the Al-Cu-Mg ternary system is presented. These phase diagrams can be illustrated by wireframe, surface, solid, or their mixture, and isotherms and isopleths can be generated. All of these can be displayed in the three typical ways: electronic shutter, polarization, and anaglyph (for example, red-cyan glasses). In particular, they can be printed on paper with a 3D stereo effect and viewed with anaglyph glasses, which makes a 3D stereo book of phase diagrams a reality. Compared with the traditional illustration, under 3D stereo display the front of a phase diagram protrudes from the screen and the back stretches far behind it, so the spatial structure can be clearly and immediately perceived. These 3D stereo phase diagrams are useful in teaching and research. - Highlights: • A stereo 3D phase diagram database was constructed, including binary P-T-X, ternary, some quaternary, and real ternary systems. • The phase diagrams can be viewed with active shutter, polarized, or anaglyph glasses. • The printed phase diagrams retain the 3D stereo effect, which can be achieved with the aid of anaglyph glasses.

  3. Selected topics on the nonrelativistic diagram technique

    International Nuclear Information System (INIS)

    Blokhintsev, L.D.; Narodetskij, I.M.

    1983-01-01

    The construction of the diagrams describing various processes in the four-particle systems is considered. It is shown that these diagrams, in particular the diagrams corresponding to the simple mechanisms often used in nuclear and atomic reaction theory, are readily obtained from the Faddeev-Yakubovsky equations. The covariant four-dimensional formalism of nonrelativistic Feynman graphs and its connection to the three-dimensional graph technique are briefly discussed

  4. Electroweak penguin diagrams and two-body B decays

    International Nuclear Information System (INIS)

    Gronau, M.; Hernandez, O.F.; London, D.; Rosner, J.L.

    1995-01-01

    We discuss the role of electroweak penguin diagrams in B decays to two light pseudoscalar mesons. We confirm that the extraction of the weak phase α through the isospin analysis involving B→ππ decays is largely unaffected by such operators. However, the methods proposed to obtain weak and strong phases by relating B→ππ, B→πK, and B→K bar K decays through flavor SU(3) will be invalidated if electroweak penguin diagrams are large. We show that, although the introduction of electroweak penguin contributions introduces no new amplitudes of flavor SU(3), there are a number of ways to experimentally measure the size of such effects. Finally, using SU(3) amplitude relations we present a new way of measuring the weak angle γ which holds even in the presence of electroweak penguin diagrams.

  5. Sedimentation stacking diagram of binary colloidal mixtures and bulk phases in the plane of chemical potentials

    International Nuclear Information System (INIS)

    Heras, Daniel de las; Schmidt, Matthias

    2015-01-01

    We give a full account of a recently proposed theory that explicitly relates the bulk phase diagram of a binary colloidal mixture to its phase stacking phenomenology under gravity (de las Heras and Schmidt 2013 Soft Matter 9 8636). As we demonstrate, the full set of possible phase stacking sequences in sedimentation-diffusion equilibrium originates from straight lines (sedimentation paths) in the chemical potential representation of the bulk phase diagram. From the analysis of various standard topologies of bulk phase diagrams, we conclude that the corresponding sedimentation stacking diagrams can be very rich, even more so when finite sample height is taken into account. We apply the theory to obtain the stacking diagram of a mixture of nonadsorbing polymers and colloids. We also present a catalog of generic phase diagrams in the plane of chemical potentials in order to facilitate the practical application of our concept, which also generalizes to multi-component mixtures. (paper)
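
    The notion of a sedimentation path admits a short, concrete sketch: in sedimentation-diffusion equilibrium the local chemical potentials vary linearly with height, mu_i(z) = mu_i(0) - m_i g z with m_i the buoyant mass of species i, so eliminating z yields a straight segment in the (mu_1, mu_2) plane whose slope is m_2/m_1 and whose length is fixed by the sample height. The numerical values below are purely illustrative.

      # Hedged sketch: a straight sedimentation path in the plane of chemical potentials.
      import numpy as np

      g = 9.81                                       # gravitational acceleration [m/s^2]
      m1, m2 = 2.0e-18, 5.0e-18                      # buoyant masses of the species [kg] (illustrative)
      mu_bottom = np.array([-1.0e-20, -2.0e-20])     # chemical potentials at z = 0 [J] (illustrative)
      height = 0.01                                  # sample height [m]

      z = np.linspace(0.0, height, 100)
      path = mu_bottom - g * np.outer(z, [m1, m2])   # straight line in the (mu1, mu2) plane

      print(f"path slope d(mu2)/d(mu1) = {m2 / m1:.2f}")
      print("chemical potentials at the top of the sample:", path[-1])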

  6. An automatic system for elaboration of chip breaking diagrams

    DEFF Research Database (Denmark)

    Andreasen, Jan Lasson; De Chiffre, Leonardo

    1998-01-01

    A laboratory system for fully automatic elaboration of chip breaking diagrams has been developed and tested. The system is based on automatic chip breaking detection by frequency analysis of cutting forces in connection with programming of a CNC-lathe to scan different feeds, speeds and cutting...
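
    The detection principle admits a compact sketch: regular chip breaking imprints a periodic component on the cutting force, so a pronounced peak in the force spectrum within a plausible chip-breaking band signals broken chips. The simulated force signal, band limits, and peak-prominence threshold below are illustrative assumptions, not parameters of the laboratory system.

      # Hedged sketch: chip-breaking detection from the spectrum of a cutting-force signal.
      import numpy as np

      fs = 10_000.0                                  # force sampling rate [Hz] (assumed)
      t = np.arange(0, 1.0, 1 / fs)
      rng = np.random.default_rng(4)
      force = 200 + 15 * np.sin(2 * np.pi * 180 * t) ** 2 + 5 * rng.normal(size=t.size)

      spectrum = np.abs(np.fft.rfft(force - force.mean())) ** 2
      freqs = np.fft.rfftfreq(t.size, 1 / fs)

      band = (freqs > 50) & (freqs < 1000)           # plausible chip-breaking frequency band
      ratio = spectrum[band].max() / np.median(spectrum[band])
      print(f"peak/median ratio = {ratio:.1f}, chip breaking detected: {ratio > 20}")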

  7. Unnecessary work tasks and mental health: a prospective analysis of Danish human service workers.

    Science.gov (United States)

    Madsen, Ida E H; Tripathi, Manisha; Borritz, Marianne; Rugulies, Reiner

    2014-11-01

    According to the "stress-as-offense-to-self" perspective, work tasks that are considered unnecessary or unreasonable - so-called "illegitimate work tasks" - are likely to elicit stress-reactions. Previous studies, mostly cross-sectional, have shown that illegitimate tasks are associated with increased self-reported stress, cortisol, and counterproductive work behavior. In this article, we examine the prospective association between unnecessary work tasks, one type of illegitimate work tasks, and mental health among Danish human service workers. Further, we explore whether this association is modified by sex, age, occupational position, and baseline mental health status. The data were obtained from self-administered questionnaires from 1351 Danish human service workers in three waves of data-collection during 1999-2005. We measured unnecessary work tasks by a single item, and assessed mental health using the 5-item mental health inventory from the Short form 36 questionnaire. We analyzed data using multi-level modeling, adjusting for potential confounding by sex, age, cohabitation, occupational position, and baseline mental health. Unnecessary work tasks were prospectively associated with a decreased level of mental health. This association was stronger for employees with poor baseline mental health and tended to be more pronounced among older employees. Among participants with poor baseline mental health, the association was explained by neither psychological demands nor decision latitude. Our findings suggest that the prevention of unnecessary work tasks may benefit employee mental health, particularly among employees with pre-existing mental health problems.
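
    The modelling strategy translates into a short, hedged sketch: a linear mixed model with a random intercept per participant, regressing follow-up mental health on unnecessary work tasks and the listed covariates. The file name and column names are hypothetical and only indicate the long-format panel structure such an analysis expects.

      # Hedged sketch of a multi-level (mixed) model; data file and columns are hypothetical.
      import pandas as pd
      import statsmodels.formula.api as smf

      df = pd.read_csv("human_service_waves.csv")    # hypothetical long-format panel data

      model = smf.mixedlm(
          "mental_health ~ unnecessary_tasks + sex + age + cohabitation"
          " + occupational_position + baseline_mental_health",
          data=df,
          groups=df["participant_id"],               # waves nested within participants
      )
      print(model.fit().summary())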

  8. A Task-Based Analysis of Information Requirements of Tactical Maps

    Science.gov (United States)

    1979-08-01

    The work began with the precept that military maps are primarily intended to serve users in the performance of functional tasks. By capitalizing on the task...

  9. Analysis of the Latin Square Task with Linear Logistic Test Models

    Science.gov (United States)

    Zeuch, Nina; Holling, Heinz; Kuhn, Jorg-Tobias

    2011-01-01

    The Latin Square Task (LST) was developed by Birney, Halford, and Andrews [Birney, D. P., Halford, G. S., & Andrews, G. (2006). Measuring the influence of cognitive complexity on relational reasoning: The development of the Latin Square Task. Educational and Psychological Measurement, 66, 146-171.] and represents a non-domain specific,…

  10. Physical activity interventions differentially affect exercise task and barrier self-efficacy: a meta-analysis.

    Science.gov (United States)

    Higgins, Torrance J; Middleton, Kathryn R; Winner, Larry; Janelle, Christopher M

    2014-08-01

    Researchers have yet to establish how interventions to increase physical activity influence specific self-efficacy beliefs. The current study sought to quantify the effect of interventions to increase physical activity among healthy adults on exercise task (EXSE) and barrier self-efficacy (BSE) via meta-analysis. Intervention characteristics associated with self-efficacy and physical activity changes were also identified. A systematic database search and manual searches through reference lists of related publications were conducted for articles on randomized, controlled physical activity interventions. Published intervention studies reporting changes in physical activity behavior and either EXSE or BSE in healthy adults were eligible for inclusion. Of the 1,080 studies identified, 20 were included in the meta-analyses. Interventions had a significant effect of g = 0.208, 95% confidence interval (CI) [0.027, 0.388], on physical activity. Moderator analyses indicated that shorter interventions that did not include structured exercise sessions effectively increased EXSE and physical activity, whereas long interventions improved BSE. Interventions that did not provide support increased BSE and physical activity levels. Further, interventions that did not require the use of daily exercise logs improved EXSE and physical activity behavior. Interventions designed to increase physical activity differentially influenced EXSE and BSE. EXSE appeared to play a more significant role during exercise adoption, whereas BSE was involved in the maintenance of exercise behavior. Recommendations are offered for the design of future interventions.

  11. Cognitive Task Analysis of Business Jet Pilots' Weather Flying Behaviors: Preliminary Results

    Science.gov (United States)

    Latorella, Kara; Pliske, Rebecca; Hutton, Robert; Chrenka, Jason

    2001-01-01

    This report presents preliminary findings from a cognitive task analysis (CTA) of business aviation piloting. Results describe challenging weather-related aviation decisions and the information and cues used to support these decisions. Further, these results demonstrate the role of expertise in business aviation decision-making in weather flying, and how weather information is acquired and assessed for reliability. The challenging weather scenarios and novice errors identified in the results provide the basis for experimental scenarios and dependent measures to be used in future flight simulation evaluations of candidate aviation weather information systems. Finally, we analyzed these preliminary results to recommend design and training interventions to improve business aviation decision-making with weather information. The primary objective of this report is to present these preliminary findings and to document the extended CTA methodology used to elicit and represent expert business aviator decision-making with weather information. These preliminary findings will be augmented with results from additional subjects using this methodology. A summary of the complete results, absent the detailed treatment of methodology provided in this report, will be documented in a separate publication.

  12. The emotional Stroop task and posttraumatic stress disorder: a meta-analysis.

    Science.gov (United States)

    Cisler, Josh M; Wolitzky-Taylor, Kate B; Adams, Thomas G; Babson, Kimberly A; Badour, Christal L; Willems, Jeffrey L

    2011-07-01

    Posttraumatic stress disorder (PTSD) is associated with significant impairment and lowered quality of life. The emotional Stroop task (EST) has been one means of elucidating some of the core deficits in PTSD, but this literature has remained inconsistent. We conducted a meta-analysis of EST studies in PTSD populations in order to synthesize this body of research. Twenty-six studies were included with 538 PTSD participants, 254 non-trauma exposed control participants (NTC), and 276 trauma exposed control participants (TC). PTSD-relevant words impaired EST performance more among PTSD groups and TC groups compared to NTC groups. PTSD groups and TC groups did not differ. When examining within-subject effect sizes, PTSD-relevant words and generally threatening words impaired EST performance relative to neutral words among PTSD groups, and only PTSD-relevant words impaired performance among the TC groups. These patterns were not found among the NTC groups. Moderator analyses suggested that these effects were significantly greater in blocked designs compared to randomized designs, toward unmasked compared to masked stimuli, and among samples exposed to assaultive traumas compared to samples exposed to non-assaultive traumas. Theoretical and clinical implications are discussed. Copyright © 2011 Elsevier Ltd. All rights reserved.

  13. Physical activity interventions differentially affect exercise task and barrier self-efficacy: A meta-analysis

    Science.gov (United States)

    Higgins, Torrance J.; Middleton, Kathryn R.; Winner, Larry; Janelle, Christopher M.; Middleton, Kathryn R.

    2014-01-01

    Objective Researchers have yet to establish how interventions to increase physical activity influence specific self-efficacy beliefs. The current study sought to quantify the effect of interventions to increase physical activity among healthy adults on exercise task (EXSE) and barrier self-efficacy (BSE) via meta-analysis. Intervention characteristics associated with self-efficacy and physical activity changes were also identified. Methods A systematic database search and manual searches through reference lists of related publications were conducted for articles on randomized, controlled physical activity interventions. Published intervention studies reporting changes in physical activity behavior and either EXSE or BSE in healthy adults were eligible for inclusion. Results Of the 1,080 studies identified, 20 were included in the meta-analyses. Interventions had a significant effect of g = 0.208, 95% confidence interval (CI) [0.027, 0.388], on physical activity. Moderator analyses indicated shorter interventions that did not include structured exercise sessions effectively increased EXSE and physical activity, whereas long interventions improved BSE. Interventions that did not provide support increased BSE and physical activity levels. Further, interventions that did not require the use of daily exercise logs improved EXSE and physical activity behavior. Conclusion Interventions designed to increase physical activity differentially influenced EXSE and BSE. EXSE appeared to play a more significant role during exercise adoption, whereas BSE was involved in the maintenance of exercise behavior. Recommendations are offered for the design of future interventions. PMID:23957904

  14. An Asynchronous Many-Task Implementation of In-Situ Statistical Analysis using Legion.

    Energy Technology Data Exchange (ETDEWEB)

    Pebay, Philippe Pierre [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Bennett, Janine Camille [Sandia National Lab. (SNL-CA), Livermore, CA (United States)

    2015-11-01

    In this report, we propose a framework for the design and implementation of in-situ analyses using an asynchronous many-task (AMT) model, using the Legion programming model together with the MiniAero mini-application as a surrogate for full-scale parallel scientific computing applications. The bulk of this work consists of converting the Learn/Derive/Assess model, which we had initially developed for parallel statistical analysis using MPI [PTBM11], from an SPMD to an AMT model. To this end, we propose an original use of the concept of Legion logical regions as a replacement for the parallel communication schemes used for the only operation of the statistics engines that requires explicit communication. We then evaluate this proposed scheme in a shared memory environment, using the Legion port of MiniAero as a proxy for a full-scale scientific application, as a means to provide input data sets of variable size for the in-situ statistical analyses in an AMT context. We demonstrate in particular that the approach has merit, and warrants further investigation, in collaboration with ongoing efforts to improve the overall parallel performance of the Legion system.

  15. Stage line diagram: An age-conditional reference diagram for tracking development

    NARCIS (Netherlands)

    Buuren, S. van; Ooms, J.C.L.

    2009-01-01

    This paper presents a method for calculating stage line diagrams, a novel type of reference diagram useful for tracking developmental processes over time. Potential fields of applications include: dentistry (tooth eruption), oncology (tumor grading, cancer staging), virology (HIV infection and

  16. Stage line diagram: an age-conditional reference diagram for tracking development.

    NARCIS (Netherlands)

    Van Buuren, S.; Ooms, J.C.L.

    2009-01-01

    This paper presents a method for calculating stage line diagrams, a novel type of reference diagram useful for tracking developmental processes over time. Potential fields of applications include: dentistry (tooth eruption), oncology (tumor grading, cancer staging), virology (HIV infection and

  17. Dependency of human target detection performance on clutter and quality of supporting image analysis algorithms in a video surveillance task

    Science.gov (United States)

    Huber, Samuel; Dunau, Patrick; Wellig, Peter; Stein, Karin

    2017-10-01

    Background: In target detection, the success rates depend strongly on human observer performances. Two prior studies tested the contributions of target detection algorithms and prior training sessions. The aim of this Swiss-German cooperation study was to evaluate the dependency of human observer performance on the quality of supporting image analysis algorithms. Methods: The participants were presented with 15 different video sequences. Their task was to detect all targets in the shortest possible time. Each video sequence showed a heavily cluttered simulated public area from a different viewing angle. In each video sequence, the number of avatars in the area was altered to 100, 150 and 200 subjects. The number of targets appearing was kept at 10%. The number of marked targets varied from 0, 5, 10, 20 up to 40 marked subjects while keeping the positive predictive value of the detection algorithm at 20%. During the task, workload level was assessed by applying an acoustic secondary task. Detection rates and detection times for the targets were analyzed using inferential statistics. Results: The study found Target Detection Time to increase and Target Detection Rates to decrease with increasing numbers of avatars. The same is true for Secondary Task Reaction Time, while there was no effect on Secondary Task Hit Rate. Furthermore, we found a trend toward a u-shaped relationship between the number of markings and Secondary Task Reaction Time, indicating increased workload. Conclusion: The trial results may indicate useful criteria for the design of training and support of observers in observational tasks.

  18. Resting-state brain activity in the motor cortex reflects task-induced activity: A multi-voxel pattern analysis.

    Science.gov (United States)

    Kusano, Toshiki; Kurashige, Hiroki; Nambu, Isao; Moriguchi, Yoshiya; Hanakawa, Takashi; Wada, Yasuhiro; Osu, Rieko

    2015-08-01

    It has been suggested that resting-state brain activity reflects task-induced brain activity patterns. In this study, we examined whether neural representations of specific movements can be observed in the resting-state brain activity patterns of motor areas. First, we defined two regions of interest (ROIs) to examine brain activity associated with two different behavioral tasks. Using multi-voxel pattern analysis with regularized logistic regression, we designed a decoder to detect voxel-level neural representations corresponding to the tasks in each ROI. Next, we applied the decoder to resting-state brain activity. We found that the decoder discriminated resting-state neural activity with accuracy comparable to that associated with task-induced neural activity. The distribution of learned weighted parameters for each ROI was similar for resting-state and task-induced activities. Large weighted parameters were mainly located on conjunctive areas. Moreover, the accuracy of detection was higher than that for a decoder whose weights were randomly shuffled, indicating that the resting-state brain activity includes multi-voxel patterns similar to the neural representation for the tasks. Therefore, these results suggest that the neural representation of resting-state brain activity is more finely organized and more complex than conventionally considered.
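
    The decoding step can be sketched under simple assumptions (placeholder arrays, two task classes, one ROI): train an L2-regularised logistic regression on task-evoked multi-voxel patterns, check its cross-validated accuracy, and then apply the same weights to resting-state patterns to ask which task each spontaneous pattern most resembles.

      # Hedged sketch: a regularised logistic-regression decoder trained on task patterns
      # and applied to resting-state patterns; all arrays are placeholders.
      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(5)
      n_trials, n_voxels = 80, 300
      task_patterns = rng.normal(size=(n_trials, n_voxels))    # task-evoked patterns (placeholder)
      task_labels = rng.integers(0, 2, size=n_trials)          # two behavioural tasks

      decoder = LogisticRegression(penalty="l2", C=1.0, max_iter=1000)
      acc = cross_val_score(decoder, task_patterns, task_labels, cv=5).mean()
      print(f"cross-validated task-decoding accuracy: {acc:.2f}")

      decoder.fit(task_patterns, task_labels)
      rest_patterns = rng.normal(size=(200, n_voxels))         # resting-state patterns (placeholder)
      rest_labels = decoder.predict(rest_patterns)             # task each rest pattern resembles
      print(f"fraction of rest patterns labelled as task 1: {rest_labels.mean():.2f}")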

  19. Gait disorders in the elderly and dual task gait analysis: a new approach for identifying motor phenotypes.

    Science.gov (United States)

    Auvinet, Bernard; Touzard, Claude; Montestruc, François; Delafond, Arnaud; Goeb, Vincent

    2017-01-31

    Gait disorders and gait analysis under single and dual-task conditions are topics of great interest, but very few studies have looked for the relevance of gait analysis under dual-task conditions in elderly people on the basis of a clinical approach. An observational study including 103 patients (mean age 76.3 ± 7.2, women 56%) suffering from gait disorders or memory impairment was conducted. Gait analysis under dual-task conditions was carried out for all patients. Brain MRI was performed in the absence of contra-indications. Three main gait variables were measured: walking speed, stride frequency, and stride regularity. For each gait variable, the dual task cost was computed and a quartile analysis was obtained. Nonparametric tests were used for all the comparisons (Wilcoxon, Kruskal-Wallis, Fisher or Chi 2 tests). Four clinical subgroups were identified: gait instability (45%), recurrent falls (29%), memory impairment (18%), and cautious gait (8%). The biomechanical severity of these subgroups was ordered according to walking speed and stride regularity under both conditions, from least to most serious as follows: memory impairment, gait instability, recurrent falls, cautious gait (p < 0.01 for walking speed, p = 0.05 for stride regularity). According to the established diagnoses of gait disorders, 5 main pathological subgroups were identified (musculoskeletal diseases (n = 11), vestibular diseases (n = 6), mild cognitive impairment (n = 24), central nervous system pathologies, (n = 51), and without diagnosis (n = 8)). The dual task cost for walking speed, stride frequency and stride regularity were different among these subgroups (p < 0.01). The subgroups mild cognitive impairment and central nervous system pathologies both showed together a higher dual task cost for each variable compared to the other subgroups combined (p = 0.01). The quartile analysis of dual task cost for stride frequency and stride regularity
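
    For reference, the dual task cost used in such comparisons is, under one common convention, the relative change of a gait variable from single- to dual-task walking; the short sketch below uses illustrative values rather than data from the study.

      # Hedged sketch of the dual-task cost (DTC) computation; values are illustrative.
      def dual_task_cost(single_task_value, dual_task_value):
          # DTC in percent; positive values indicate worse performance under dual task.
          return 100.0 * (single_task_value - dual_task_value) / single_task_value

      walking_speed_single = 1.10    # m/s (illustrative)
      walking_speed_dual = 0.92      # m/s (illustrative)
      print(f"DTC for walking speed: {dual_task_cost(walking_speed_single, walking_speed_dual):.1f}%")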

  20. Multiple-task real-time PDP-15 operating system for data acquisition and analysis

    International Nuclear Information System (INIS)

    Myers, W.R.

    1974-01-01

    The RAMOS operating system is capable of handling up to 72 simultaneous tasks in an interrupt-driven environment. The minimum viable hardware configuration includes a Digital Equipment Corporation PDP-15 computer with 16384 words of memory, extended arithmetic element, automatic priority interrupt, a 256K-word RS09 DECdisk, two DECtape transports, and an alphanumeric keyboard/typer. The monitor executes major tasks by loading disk-resident modules to memory for execution; modules are written in a format that allows page-relocation by the monitor, and can be loaded into any available page. All requests for monitor service by tasks, including input/output, floating point arithmetic, request for additional memory, task initiation, etc., are implemented by privileged monitor calls (CAL). All IO device handlers are capable of queuing requests for service, allowing several tasks ''simultaneous'' use of all resources. All alphanumeric IO (including the PC05) is completely buffered and handled by a single multiplexing routine. The floating point arithmetic software is re-entrant to all operating modules and includes matrix arithmetic functions. One of the system tasks can be a ''batch'' job, controlled by simulating an alphanumeric command terminal through cooperative functions of the disk handler and alphanumeric device software. An alphanumeric control sequence may be executed, automatically accessing disk-resident tasks in any prescribed order; a library of control sequences is maintained on bulk storage for access by the monitor. (auth)

  1. Analysis of human-in-the-loop tele-operated maintenance inspection tasks using VR

    International Nuclear Information System (INIS)

    Boessenkool, H.; Abbink, D.A.; Heemskerk, C.J.M.; Steinbuch, M.; Baar, M.R. de; Wildenbeest, J.G.W.; Ronden, D.; Koning, J.F.

    2013-01-01

    Highlights: ► Execution of tele-operated inspection tasks for ITER maintenance was analyzed. ► Human factors experiments using Virtual Reality showed to be a valuable approach. ► A large variation in time performance and number of collisions was found. ► Results indicate significant room for improvement for teleoperated free space tasks. ► A promising solution is haptic shared control: assist operator with guiding forces. -- Abstract: One of the challenges in future fusion plants such as ITER is the remote maintenance of the plant. Foreseen human-in-the-loop tele-operation is characterized by limited visual and haptic feedback from the environment, which results in degraded task performance and increased operator workload. For improved tele-operated task performance it is required to get insight in the expected tasks and problems during maintenance at ITER. By means of an exploratory human factor experiment, this paper analyses problems and bottlenecks during the execution of foreseen tele-operated maintenance at ITER, identifying most promising areas of improvement. The focus of this paper is on free space (sub)tasks where contact with the environment needs to be avoided. A group of 5 subjects was asked to carry-out an ITER related free space task (visual inspection), using a six degree of freedom master device connected to a simulated hot cell environment. The results show large variation in time performance between subjects and an increasing number of collisions for more difficult tasks, indicating room for improvement for free space (sub)tasks. The results will be used in future research on the haptic guidance strategies in the ITER Remote Handling framework

  2. Analysis of human-in-the-loop tele-operated maintenance inspection tasks using VR

    Energy Technology Data Exchange (ETDEWEB)

    Boessenkool, H., E-mail: h.boessenkool@differ.nl [FOM Institute DIFFER (Dutch Institute for Fundamental Energy Research), Association EURATOM-FOM, Partner in the Trilateral Euregio Cluster, PO Box 1207, 3430 BE Nieuwegein (Netherlands); Eindhoven University of Technology, Department of Mechanical Engineering, Dynamics and Control Group, PO Box 513, 5600 MB Eindhoven (Netherlands); Abbink, D.A. [Delft University of Technology, Faculty of 3mE, BioMechanical Engineering Department, Mekelweg 2, 2628 CD Delft (Netherlands); Heemskerk, C.J.M. [Heemskerk Innovative Technology B.V., Jonckerweg 12, 2201 DZ Noordwijk (Netherlands); Steinbuch, M. [Eindhoven University of Technology, Department of Mechanical Engineering, Dynamics and Control Group, PO Box 513, 5600 MB Eindhoven (Netherlands); Baar, M.R. de [FOM Institute DIFFER (Dutch Institute for Fundamental Energy Research), Association EURATOM-FOM, Partner in the Trilateral Euregio Cluster, PO Box 1207, 3430 BE Nieuwegein (Netherlands); Eindhoven University of Technology, Department of Mechanical Engineering, Dynamics and Control Group, PO Box 513, 5600 MB Eindhoven (Netherlands); Wildenbeest, J.G.W. [Delft University of Technology, Faculty of 3mE, BioMechanical Engineering Department, Mekelweg 2, 2628 CD Delft (Netherlands); Heemskerk Innovative Technology B.V., Jonckerweg 12, 2201 DZ Noordwijk (Netherlands); Ronden, D. [FOM Institute DIFFER (Dutch Institute for Fundamental Energy Research), Association EURATOM-FOM, Partner in the Trilateral Euregio Cluster, PO Box 1207, 3430 BE Nieuwegein (Netherlands); Koning, J.F. [Heemskerk Innovative Technology B.V., Jonckerweg 12, 2201 DZ Noordwijk (Netherlands)

    2013-10-15

    Highlights: ► Execution of tele-operated inspection tasks for ITER maintenance was analyzed. ► Human factors experiments using Virtual Reality showed to be a valuable approach. ► A large variation in time performance and number of collisions was found. ► Results indicate significant room for improvement for teleoperated free space tasks. ► A promising solution is haptic shared control: assist operator with guiding forces. -- Abstract: One of the challenges in future fusion plants such as ITER is the remote maintenance of the plant. Foreseen human-in-the-loop tele-operation is characterized by limited visual and haptic feedback from the environment, which results in degraded task performance and increased operator workload. For improved tele-operated task performance it is required to get insight in the expected tasks and problems during maintenance at ITER. By means of an exploratory human factor experiment, this paper analyses problems and bottlenecks during the execution of foreseen tele-operated maintenance at ITER, identifying most promising areas of improvement. The focus of this paper is on free space (sub)tasks where contact with the environment needs to be avoided. A group of 5 subjects was asked to carry-out an ITER related free space task (visual inspection), using a six degree of freedom master device connected to a simulated hot cell environment. The results show large variation in time performance between subjects and an increasing number of collisions for more difficult tasks, indicating room for improvement for free space (sub)tasks. The results will be used in future research on the haptic guidance strategies in the ITER Remote Handling framework.

  3. CERPHASE: Computer-generated phase diagrams

    International Nuclear Information System (INIS)

    Ruys, A.J.; Sorrell, C.C.; Scott, F.H.

    1990-01-01

    CERPHASE is a collection of computer programs written in the programming language BASIC and developed for the purpose of teaching the principles of phase diagram generation from the ideal solution model of thermodynamics. Two approaches are used in the generation of the phase diagrams: freezing point depression and minimization of the free energy of mixing. Binary and ternary phase diagrams can be generated, as can diagrams containing the ideal solution parameters used to generate the actual phase diagrams. Since the diagrams generated utilize the ideal solution model, data input required from the operator is minimal: only the heat of fusion and melting point of each component. CERPHASE is menu-driven and user-friendly, containing simple instructions in the form of screen prompts as well as a HELP file to guide the operator. A second purpose of CERPHASE is the prediction of phase diagrams in systems for which no experimentally determined phase diagrams are available, enabling the estimation of suitable firing or sintering temperatures for otherwise unknown systems. Since CERPHASE utilizes ideal solution theory, there are certain limitations imposed on the types of systems that can be predicted reliably. 6 refs., 13 refs
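
    The freezing-point-depression route that CERPHASE takes can be sketched for an ideal binary eutectic with negligible solid solubility: each liquidus follows ln x_i = -(dH_f,i/R)(1/T - 1/Tm_i), and the eutectic lies where the two liquidus branches intersect. The component data below are illustrative, but, as in CERPHASE, only the melting point and heat of fusion of each component are needed.

      # Hedged sketch: ideal-solution liquidus curves and eutectic from freezing-point depression.
      import numpy as np

      R = 8.314                                     # gas constant [J/(mol K)]
      Tm = {"A": 1200.0, "B": 900.0}                # melting points [K] (illustrative)
      dH = {"A": 25e3, "B": 18e3}                   # heats of fusion [J/mol] (illustrative)

      def liquidus_x(component, T):
          # Mole fraction of `component` on its ideal-solution liquidus at temperature T.
          return np.exp(-dH[component] / R * (1.0 / T - 1.0 / Tm[component]))

      T = np.linspace(600.0, 1200.0, 2000)
      xA_from_A_branch = liquidus_x("A", T)          # x_A along the A-rich liquidus
      xA_from_B_branch = 1.0 - liquidus_x("B", T)    # x_A along the B-rich liquidus

      i = np.argmin(np.abs(xA_from_A_branch - xA_from_B_branch))   # crossing = eutectic
      print(f"estimated eutectic: T = {T[i]:.0f} K, x_A = {xA_from_A_branch[i]:.2f}")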

  4. How design guides learning from matrix diagrams

    NARCIS (Netherlands)

    van der Meij, Jan; Amelsvoort, Marije; Anjewierden, Anjo

    2017-01-01

    Compared to text, diagrams are superior in their ability to structure and summarize information and to show relations between concepts and ideas. Perceptual cues, like arrows, are expected to improve the retention of diagrams by guiding the learner towards important elements or showing a preferred

  5. Diagram of state of stiff amphiphilic macromolecules

    NARCIS (Netherlands)

    Markov, Vladimir A.; Vasilevskaya, Valentina V.; Khalatur, Pavel G.; ten Brinke, Gerrit; Khokhlov, Alexei R.

    2007-01-01

    We studied coil-globule transitions in stiff-chain amphiphilic macromolecules via computer modeling and constructed phase diagrams for such molecules in terms of solvent quality and persistence length. We showed that the shape of the phase diagram essentially depends on the macromolecule degree of

  6. Compact flow diagrams for state sequences

    NARCIS (Netherlands)

    Buchin, Kevin; Buchin, Maike; Gudmundsson, Joachim; Horton, Michael; Sijben, Stef

    2017-01-01

    We introduce the concept of using a flow diagram to compactly represent the segmentation of a large number of state sequences according to a set of criteria. We argue that this flow diagram representation gives an intuitive summary that allows the user to detect patterns within the segmentations. In

  7. Compact flow diagrams for state sequences

    NARCIS (Netherlands)

    Buchin, K.A.; Buchin, M.E.; Gudmundsson, J.; Horton, M.J.; Sijben, S.

    2016-01-01

    We introduce the concept of compactly representing a large number of state sequences, e.g., sequences of activities, as a flow diagram. We argue that the flow diagram representation gives an intuitive summary that allows the user to detect patterns among large sets of state sequences. Simplified,

  8. How Design Guides Learning from Matrix Diagrams

    Science.gov (United States)

    van der Meij, Jan; van Amelsvoort, Marije; Anjewierden, Anjo

    2017-01-01

    Compared to text, diagrams are superior in their ability to structure and summarize information and to show relations between concepts and ideas. Perceptual cues, like arrows, are expected to improve the retention of diagrams by guiding the learner towards important elements or showing a preferred reading sequence. In our experiment, we analyzed…

  9. Social Network Analysis as an Analytic Tool for Task Group Research: A Case Study of an Interdisciplinary Community of Practice

    Science.gov (United States)

    Lockhart, Naorah C.

    2017-01-01

    Group counselors commonly collaborate in interdisciplinary settings in health care, substance abuse, and juvenile justice. Social network analysis is a methodology rarely used in counseling research yet has potential to examine task group dynamics in new ways. This case study explores the scholarly relationships among 36 members of an…

  10. Obtaining Content Weights for Test Specifications from Job Analysis Task Surveys: An Application of the Many-Facets Rasch Model

    Science.gov (United States)

    Wang, Ning; Stahl, John

    2012-01-01

    This article discusses the use of the Many-Facets Rasch Model, via the FACETS computer program (Linacre, 2006a), to scale job/practice analysis survey data as well as to combine multiple rating scales into single composite weights representing the tasks' relative importance. Results from the Many-Facets Rasch Model are compared with those…

  11. Leak before break piping evaluation diagram

    International Nuclear Information System (INIS)

    Fabi, R.J.; Peck, D.A.

    1994-01-01

    Traditionally Leak Before Break (LBB) has been applied to the evaluation of piping in existing nuclear plants. This paper presents a simple method for evaluating piping systems for LBB during the design process. This method produces a piping evaluation diagram (PED) which defines the LBB requirements to the piping designer for use during the design process. Several sets of LBB analyses are performed for each different pipe size and material considered in the LBB application. The results of this method are independent of the actual pipe routing. Two complete LBB evaluations are performed to determine the maximum allowable stability load, one evaluation for a low normal operating load, and the other evaluation for a high normal operating load. These normal operating loads span the typical loads for the particular system being evaluated. In developing the allowable loads, the appropriate LBB margins are included in the PED preparation. The resulting LBB solutions are plotted as a set of allowable curves for the maximum design basis load, such as the seismic load, versus the normal operating load. Since the required margins are already accounted for in the LBB PED, the piping designer can use the diagram directly with the results of the piping analysis and determine immediately if the current piping arrangement passes LBB. Since the LBB PED is independent of pipe routing, changes to the piping system can be evaluated using the existing PED. For a particular application, all that remains is to confirm that the actual materials and pipe sizes assumed in creating the particular design are built into the plant
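
    The PED described above is, in essence, an allowable curve of maximum design-basis load versus normal operating load with the LBB margins already built in. A minimal sketch of how a piping designer might screen an analysis result against such a curve follows; the curve points, units and example loads are invented for illustration and are not taken from the paper.

    # Screening a piping analysis result against a leak-before-break piping
    # evaluation diagram (PED). Curve values below are illustrative placeholders.
    import bisect

    # PED: allowable maximum design-basis (e.g., seismic) load as a function of
    # the normal operating load, margins already included.
    ped_normal = [10.0, 20.0, 30.0, 40.0, 50.0]       # normal operating load (assumed units)
    ped_allowable = [120.0, 105.0, 90.0, 70.0, 55.0]  # allowable design-basis load

    def allowable_load(normal_load):
        """Linearly interpolate the PED curve at the given normal operating load."""
        if not ped_normal[0] <= normal_load <= ped_normal[-1]:
            raise ValueError("normal operating load outside the PED's analysed range")
        i = bisect.bisect_left(ped_normal, normal_load)
        if ped_normal[i] == normal_load:
            return ped_allowable[i]
        x0, x1 = ped_normal[i - 1], ped_normal[i]
        y0, y1 = ped_allowable[i - 1], ped_allowable[i]
        return y0 + (y1 - y0) * (normal_load - x0) / (x1 - x0)

    def passes_lbb(normal_load, design_basis_load):
        """A location satisfies LBB if its design-basis load falls below the curve."""
        return design_basis_load <= allowable_load(normal_load)

    print(passes_lbb(25.0, 80.0))   # True  -> arrangement passes LBB at this point
    print(passes_lbb(45.0, 75.0))   # False -> routing or supports need rework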

  12. The Effect of Social Network Diagrams on a Virtual Network of Practice: A Korean Case

    Science.gov (United States)

    Jo, Il-Hyun

    2009-01-01

    This study investigates the effect of the presentation of social network diagrams on virtual team members' interaction behavior via e-mail. E-mail transaction data from 22 software developers in a Korean IT company was analyzed and depicted as diagrams by social network analysis (SNA), and presented to the members as an intervention. Results…

  13. Na-Si binary phase diagram and solution growth of silicon crystals

    International Nuclear Information System (INIS)

    Morito, H.; Yamada, T.; Ikeda, T.; Yamane, H.

    2009-01-01

    In the present study, a Na-Si binary phase diagram was first presented from the results of differential thermal analysis and X-ray diffraction. Based on the phase diagram, we performed low-temperature formation of single crystals, film and porous bulk of Si by vaporizing Na from a Na-Si melt at 800 or 900 °C.

  14. The Traders' Cross: Identifying Traders' Surpluses in the Traditional Edgeworth Exchange Diagram

    Science.gov (United States)

    Beaulier, Scott A.; Prychitko, David L.

    2010-01-01

    The Edgeworth exchange diagram is a traditional tool of undergraduate microeconomic theory that depicts the mutually beneficial gains from voluntary trade. The authors take the analysis one step further. They identify the buyer's and seller's surpluses that accrue to both trading parties in the Edgeworth diagram. This is a straightforward exercise…

  15. Phase diagram of ammonium nitrate

    Energy Technology Data Exchange (ETDEWEB)

    Dunuwille, Mihindra; Yoo, Choong-Shik, E-mail: csyoo@wsu.edu [Department of Chemistry and Institute for Shock Physics, Washington State University, Pullman, Washington 99164 (United States)

    2013-12-07

    Ammonium Nitrate (AN) is a fertilizer, yet becomes an explosive upon a small addition of chemical impurities. The origin of enhanced chemical sensitivity in impure AN (or AN mixtures) is not well understood, posing significant safety issues in using AN even today. To remedy the situation, we have carried out an extensive study to investigate the phase stability of AN and its mixtures with hexane (ANFO–AN mixed with fuel oil) and Aluminum (Ammonal) at high pressures and temperatures, using diamond anvil cells (DAC) and micro-Raman spectroscopy. The results indicate that pure AN decomposes to N₂, N₂O, and H₂O at the onset of the melt, whereas the mixtures, ANFO and Ammonal, decompose at substantially lower temperatures. The present results also confirm the recently proposed phase IV-IV′ transition above 17 GPa and provide new constraints for the melting and phase diagram of AN to 40 GPa and 400°C.

  16. Towards the QCD phase diagram

    CERN Document Server

    De Forcrand, Philippe; Forcrand, Philippe de; Philipsen, Owe

    2006-01-01

    We summarize our recent results on the phase diagram of QCD with N_f = 2+1 quark flavors, as a function of temperature T and quark chemical potential μ. Using staggered fermions, lattices with temporal extent N_t = 4, and the exact RHMC algorithm, we first determine the critical line in the quark mass plane (m_{u,d}, m_s) where the finite temperature transition at μ = 0 is second order. We confirm that the physical point lies on the crossover side of this line. Our data are consistent with a tricritical point at (m_{u,d}, m_s) = (0, ~500) MeV. Then, using an imaginary chemical potential, we determine in which direction this second-order line moves as the chemical potential is turned on. Contrary to standard expectations, we find that the region of first-order transitions shrinks in the presence of a chemical potential, which is inconsistent with the presence of a QCD critical point at small chemical potential. The emphasis is put on clarifying the translation of our results from lattice to physical units, and ...

  17. Performance Measure Analysis of Command and Control Organizational and Task Structures

    National Research Council Canada - National Science Library

    Smith, Neil

    1996-01-01

    .... The purpose of the initial A2C2 experiment was to examine the relationships between organizational structures and task structures involving competition for scarce assets, to serve as an integration...

  18. Analysis of internal and external validity criteria for a computerized visual search task: A pilot study.

    Science.gov (United States)

    Richard's, María M; Introzzi, Isabel; Zamora, Eliana; Vernucci, Santiago

    2017-01-01

    Inhibition is one of the main executive functions because of its fundamental role in cognitive and social development. Given the importance of reliable, computerized measures for assessing inhibitory performance, this research analyzes the internal and external validity criteria of a computerized conjunction search task designed to evaluate the role of perceptual inhibition. A sample of 41 children (21 females and 20 males) aged between 6 and 11 years (M = 8.49, SD = 1.47), intentionally selected from a private school in Mar del Plata (Argentina) of middle socio-economic level, was assessed. The Conjunction Search Task from the TAC Battery and the Coding and Symbol Search tasks from the Wechsler Intelligence Scale for Children were used. Overall, the results confirm that the perceptual inhibition task from the TAC presents solid indices of internal and external validity, making it a valid measurement instrument for this process.

  19. Impact of Dual Task on Parkinson's Disease, Stroke and Ataxia Patients' Gait: A Comparative Analysis

    Directory of Open Access Journals (Sweden)

    Michelly Arjona Maciel

    2014-01-01

    Full Text Available Introduction: Performing a dual task is complex for neurological patients and can be influenced by the localization of the neurological lesion. Objective: To compare the impact of dual tasking on gait in patients with Parkinson's disease, stroke and ataxia. Method: Subjects with early-stage Parkinson's disease (PD), stroke and ataxia, all with independent gait, were evaluated during simple gait and gait with cognitive, motor and cognitive-motor demands, assessing average speed and number of steps. Results: Compared with PD patients, ataxia and stroke patients showed an increase in the number of steps and a decrease in average speed during gait with cognitive demand. Subjects with PD performed better on the tasks than the other groups. Conclusion: In this study the impact of the dual task was lower in Parkinson's disease patients.

  20. The Effect of Corrective Feedback on Performance in Basic Cognitive Tasks: An Analysis of RT Components

    Directory of Open Access Journals (Sweden)

    Carmen Moret-Tatay

    2016-12-01

    Full Text Available The current work examines the effect of trial-by-trial feedback about correct and error responding on performance in two basic cognitive tasks: a classic Stroop task (n = 40) and a color-word matching task (n = 30). Standard measures of both RT and accuracy were examined in addition to measures obtained from fitting the ex-Gaussian distributional model to the correct RTs. For both tasks, RTs were faster in blocks of trials with feedback than in blocks without feedback, but this difference was not significant. On the other hand, with respect to the distributional analyses, providing feedback served to significantly reduce the size of the tails of the RT distributions. Such results suggest that, for conditions in which accuracy is fairly high, the effect of corrective feedback might either be to reduce the tendency to double-check before responding or to decrease the amount of attentional lapsing.
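
    The distributional analysis mentioned above rests on the ex-Gaussian model, whose exponential tail parameter tau is what corrective feedback appears to reduce. A minimal sketch of such a fit, assuming SciPy's exponentially modified normal (exponnorm) as the ex-Gaussian and simulated reaction times in place of the study's data:

    # Fit an ex-Gaussian to (simulated) reaction times with SciPy's exponnorm,
    # which parameterises the distribution as (K, loc, scale) with
    # mu = loc, sigma = scale, tau = K * scale.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)

    # Simulate 500 "reaction times": Gaussian component plus an exponential tail.
    mu, sigma, tau = 550.0, 60.0, 150.0   # milliseconds, illustrative values
    rts = rng.normal(mu, sigma, 500) + rng.exponential(tau, 500)

    K, loc, scale = stats.exponnorm.fit(rts)
    print(f"mu    ~ {loc:6.1f} ms")
    print(f"sigma ~ {scale:6.1f} ms")
    print(f"tau   ~ {K * scale:6.1f} ms   # tail size; the component feedback reduced")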

  1. Re: Madsen et al. "Unnecessary work tasks and mental health: a prospective analysis of Danish human service workers".

    Science.gov (United States)

    Durand-Moreau, Quentin; Loddé, Brice; Dewitte, Jean-Dominique

    2015-03-01

    Madsen et al (1) recently published a secondary analysis on data provided by the Project on Burnout, Motivation and Job Satisfaction (PUMA). The aim of their study, published in the Scandinavian Journal of Work, Environment & Health was to examine the associations between unnecessary work tasks and a decreased level of mental health. Though the topic was quite novel, reading this work proved disturbing and raised issues. Based on the results of this study, the authors stated that there is an association between unnecessary work tasks (assessed by a single question) and a decreased level of mental health, idem [assessed by the Mental Health Inventory (MHI-5)], in the specific population included in this PUMA survey. The authors point out a limitation of the study, namely that unnecessary work tasks were evaluated using one single question: "Do you sometimes have to do things in your job which appear to be unnecessary?". Semmer defines unnecessary work task as "tasks that should not be carried out at all because they do not make sense or because they could have been avoided, or could be carried out with less effort if things were organized more efficiently" (2). De facto, qualifying what an unnecessary task is requires stating or explaining whether the task makes sense. Making sense or not is not an objective notion. It is very difficult for either a manager or an employee to say if a task is necessary or not. Most important is that it makes sense from the worker's point of view. Making sense and being necessary are not synonyms. Some tasks do not make sense but are economically necessary (eg, when, as physicians, we are reporting our activity using ICD-10 on computers instead of being at patients' bedsides or reading this journal). Thus, there is a wide gap between Semmer's definition and the question used by the authors to evaluate his concept. A secondary analysis based on a single question is not adequate to evaluate unnecessary tasks. Nowadays, the general trend

  2. Analysis of Time-Dependent Brain Network on Active and MI Tasks for Chronic Stroke Patients.

    Directory of Open Access Journals (Sweden)

    Da-Hye Kim

    Full Text Available Several researchers have analyzed brain activity by investigating brain networks. However, there is a lack of research on the temporal characteristics of brain networks after stroke measured by EEG, and few comparative studies of motor execution and motor imagery, which are known to share similar motor functions and pathways. In this study, we examined the temporal characteristics of the brain networks of stroke patients. We analyzed the temporal properties of the brain networks of nine chronic stroke patients during active and motor imagery (MI) tasks using EEG. The high beta band plays a specific role in the brain network during motor tasks. In the high beta band, for the active task, there were significant characteristics of centrality and small-worldness on the bilateral primary motor cortices at the initial motor execution. Degree centrality significantly increased on the contralateral primary motor cortex, and local efficiency increased on the ipsilateral primary motor cortex. These results indicate that the ipsilateral primary motor cortex constructed a powerful subnetwork by influencing the linked channels as a compensatory effect, whereas the contralateral primary motor cortex organized an inefficient network over the connected channels due to lesions. For the MI task, degree centrality and local efficiency significantly decreased on the somatosensory area at the initial motor imagery. There were also significant correlations between the properties of the brain networks and motor function on the contralateral primary motor cortex and somatosensory area for each motor execution/imagery task. Our results indicate that the active and MI tasks involve different mechanisms of motor action. Based on these results, we point to the possibility of customized rehabilitation according to different motor tasks. We expect these results to help in the construction of a customized rehabilitation system depending on motor tasks by understanding temporal
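
    The graph measures reported above (degree centrality, local efficiency, small-worldness) can be computed with standard network tools once a channel-by-channel connectivity matrix is available. The sketch below uses NetworkX on a random matrix standing in for high-beta EEG coupling; the montage, threshold and matrix values are assumptions for illustration only.

    # Graph measures of a thresholded "connectivity" matrix using NetworkX.
    import numpy as np
    import networkx as nx

    rng = np.random.default_rng(1)
    channels = ["C3", "C4", "Cz", "F3", "F4", "P3", "P4", "T7", "T8"]  # assumed montage
    n = len(channels)

    # Symmetric random matrix in [0, 1] standing in for channel coupling.
    w = rng.random((n, n))
    w = (w + w.T) / 2
    np.fill_diagonal(w, 0.0)

    # Keep only the strongest links, then build an undirected graph.
    thr = np.percentile(w[w > 0], 60)
    adj = (w >= thr).astype(int)
    G = nx.from_numpy_array(adj)
    G = nx.relabel_nodes(G, dict(enumerate(channels)))

    print("degree centrality:", {k: round(v, 2) for k, v in nx.degree_centrality(G).items()})
    print("local efficiency :", round(nx.local_efficiency(G), 3))
    if nx.is_connected(G):
        # Small-world coefficient sigma > 1 suggests small-world organisation.
        print("small-worldness  :", round(nx.sigma(G, niter=5, nrand=5, seed=1), 2))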

  3. Analysis of a physics teacher's pedagogical `micro-actions' that support 17-year-olds' learning of free body diagrams via a modelling approach

    Science.gov (United States)

    Tay, Su Lynn; Yeo, Jennifer

    2018-01-01

    Great teaching is characterised by the specific actions a teacher takes in the classroom to bring about learning. In the context of model-based teaching (MBT), teachers' difficulty in working with students' models that are not scientifically consistent is troubling. To address this problem, the aim of this study is to identify the pedagogical micro-actions to support the development of scientific models and modelling skills during the evaluation and modification stages of MBT. Taking the perspective of pedagogical content knowing (PCKg), it identifies these micro-actions as an in-situ, dynamic transformation of knowledges of content, pedagogy, student and environment context. Through a case study approach, a lesson conducted by an experienced high-school physics teacher was examined. Audio and video recordings of the lesson contributed to the data sources. Taking a grounded approach in the analysis, eight pedagogical micro-actions enacted by the teacher were identified, namely 'clarification', 'evaluation', 'explanation', 'modification', 'exploration', 'referencing conventions', 'focusing' and 'meta-representing'. These micro-actions support students' learning related to the conceptual, cognitive, discursive and epistemological aspects of modelling. From the micro-actions, we identify the aspects of knowledges of PCKg that teachers need in order to competently select and enact these micro-actions. The in-situ and dynamic transformation of these knowledges implies that professional development should also be situated in the context in which these micro-actions are meaningful.

  4. eulerAPE: drawing area-proportional 3-Venn diagrams using ellipses.

    Science.gov (United States)

    Micallef, Luana; Rodgers, Peter

    2014-01-01

    Venn diagrams with three curves are used extensively in various medical and scientific disciplines to visualize relationships between data sets and facilitate data analysis. The area of the regions formed by the overlapping curves is often directly proportional to the cardinality of the depicted set relation or any other related quantitative data. Drawing these diagrams manually is difficult and current automatic drawing methods do not always produce appropriate diagrams. Most methods depict the data sets as circles, as they perceptually pop out as complete distinct objects due to their smoothness and regularity. However, circles cannot draw accurate diagrams for most 3-set data and so the generated diagrams often have misleading region areas. Other methods use polygons to draw accurate diagrams. However, polygons are non-smooth and non-symmetric, so the curves are not easily distinguishable and the diagrams are difficult to comprehend. Ellipses are more flexible than circles and are similarly smooth, but none of the current automatic drawing methods use ellipses. We present eulerAPE as the first method and software that uses ellipses for automatically drawing accurate area-proportional Venn diagrams for 3-set data. We describe the drawing method adopted by eulerAPE and we discuss our evaluation of the effectiveness of eulerAPE and ellipses for drawing random 3-set data. We compare eulerAPE and various other methods that are currently available and we discuss differences between their generated diagrams in terms of accuracy and ease of understanding for real world data.
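
    Whatever curves are used, an area-proportional 3-set diagram must encode the sizes of the seven overlap regions. A small sketch of deriving those region sizes from three example sets, the quantities that the ellipse areas in a tool like eulerAPE would be made proportional to; the element names are placeholders:

    # Region sizes of a 3-set relation, the input to an area-proportional drawing.
    A = {"g1", "g2", "g3", "g4", "g5"}
    B = {"g3", "g4", "g6", "g7"}
    C = {"g4", "g5", "g7", "g8", "g9"}

    regions = {
        "A only":   A - B - C,
        "B only":   B - A - C,
        "C only":   C - A - B,
        "A∩B only": (A & B) - C,
        "A∩C only": (A & C) - B,
        "B∩C only": (B & C) - A,
        "A∩B∩C":    A & B & C,
    }
    for name, members in regions.items():
        print(f"{name:9s} -> {len(members)}")  # region area should be proportional to this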

  5. Risk assessment of maintenance operations: the analysis of performing task and accident mechanism.

    Science.gov (United States)

    Carrillo-Castrillo, Jesús A; Rubio-Romero, Juan Carlos; Guadix, Jose; Onieva, Luis

    2015-01-01

    Maintenance operations cover a great number of occupations. Most small and medium-sized enterprises lack the appropriate information to conduct risk assessments of maintenance operations. The objective of this research is to provide a method based on the concepts of task and accident mechanisms for an initial risk assessment by taking into consideration the prevalence and severity of the maintenance accidents reported. Data were gathered from 11,190 reported accidents in maintenance operations in the manufacturing sector of Andalusia from 2003 to 2012. By using a semi-quantitative methodology, likelihood and severity were evaluated based on the actual distribution of accident mechanisms in each of the tasks. Accident mechanisms and tasks were identified by using those variables included in the European Statistics of Accidents at Work methodology. As main results, the estimated risk of the most frequent accident mechanisms identified for each of the analysed tasks is low and the only accident mechanisms with medium risk are accidents when lifting or pushing with physical stress on the musculoskeletal system in tasks involving carrying, and impacts against objects after slipping or stumbling for tasks involving movements. The prioritisation of public preventive actions for the accident mechanisms with a higher estimated risk is highly recommended.

  6. IGDS/TRAP Interface Program (ITIP). Software User Manual (SUM). [network flow diagrams for coal gasification studies

    Science.gov (United States)

    Jefferys, S.; Johnson, W.; Lewis, R.; Rich, R.

    1981-01-01

    This specification establishes the requirements, concepts, and preliminary design for a set of software known as the IGDS/TRAP Interface Program (ITIP). This software provides the capability to develop at an Interactive Graphics Design System (IGDS) design station process flow diagrams for use by the NASA Coal Gasification Task Team. In addition, ITIP will use the Data Management and Retrieval System (DMRS) to maintain a data base from which a properly formatted input file to the Time-Line and Resources Analysis Program (TRAP) can be extracted. This set of software will reside on the PDP-11/70 and will become the primary interface between the Coal Gasification Task Team and IGDS, DMRS, and TRAP. The user manual for the computer program is presented.

  7. Knee Arthroscopy Simulation: A Randomized Controlled Trial Evaluating the Effectiveness of the Imperial Knee Arthroscopy Cognitive Task Analysis (IKACTA) Tool.

    Science.gov (United States)

    Bhattacharyya, Rahul; Davidson, Donald J; Sugand, Kapil; Bartlett, Matthew J; Bhattacharya, Rajarshi; Gupte, Chinmay M

    2017-10-04

    Virtual-reality and cadaveric simulations are expensive and not readily accessible. Innovative and accessible training adjuncts are required to help to meet training needs. Cognitive task analysis has been used extensively to train pilots and in other surgical specialties. However, the use of cognitive task analyses within orthopaedics is in its infancy. The purpose of this study was to evaluate the effectiveness of a novel cognitive task analysis tool to train novice surgeons in diagnostic knee arthroscopy in high-fidelity, phantom-limb simulation. Three expert knee surgeons were interviewed independently to generate a list of technical steps, decision points, and errors for diagnostic knee arthroscopy. A modified Delphi technique was used to generate the final cognitive task analysis. A video and a voiceover were recorded for each phase of this procedure. These were combined to produce the Imperial Knee Arthroscopy Cognitive Task Analysis (IKACTA) tool that utilizes written and audiovisual stimuli to describe each phase of a diagnostic knee arthroscopy. In this double-blinded, randomized controlled trial, a power calculation was performed prior to recruitment. Sixteen novice orthopaedic trainees who performed ≤10 diagnostic knee arthroscopies were randomized into 2 equal groups. The intervention group (IKACTA group) was given the IKACTA tool and the control group had no additional learning material. They were assessed objectively (validated Arthroscopic Surgical Skill Evaluation Tool [ASSET] global rating scale) on a high-fidelity, phantom-knee simulator. All participants, using the Likert rating scale, subjectively rated the tool. The mean ASSET score (and standard deviation) was 19.5 ± 3.7 points in the IKACTA group and 10.6 ± 2.3 points in the control group, resulting in an improvement of 8.9 points (95% confidence interval, 7.6 to 10.1 points; p = 0.002); the score was determined as 51.3% (19.5 of 38) for the IKACTA group, 27.9% (10.6 of 38) for the

  8. An analysis of confidence limit calculations used in AAPM Task Group No. 119

    International Nuclear Information System (INIS)

    Knill, Cory; Snyder, Michael

    2011-01-01

    Purpose: The report issued by AAPM Task Group No. 119 outlined a procedure for evaluating the effectiveness of IMRT commissioning. The procedure involves measuring gamma pass-rate indices for IMRT plans of standard phantoms and determining if the results fall within a confidence limit set by assuming normally distributed data. As stated in the TG report, the assumption of normally distributed gamma pass rates is a convenient approximation for commissioning purposes, but may not accurately describe the data. Here the authors attempt to better describe gamma pass-rate data by fitting it to different distributions. The authors then calculate updated confidence limits using those distributions and compare them to those derived using TG No. 119 method. Methods: Gamma pass-rate data from 111 head and neck patients are fitted using the TG No. 119 normal distribution, a truncated normal distribution, and a Weibull distribution. Confidence limits to 95% are calculated for each and compared. A more general analysis of the expected differences between the TG No. 119 method of determining confidence limits and a more time-consuming curve fitting method is performed. Results: The TG No. 119 standard normal distribution does not fit the measured data. However, due to the small range of measured data points, the inaccuracy of the fit has only a small effect on the final value of the confidence limits. The confidence limits for the 111 patient plans are within 0.1% of each other for all distributions. The maximum expected difference in confidence limits, calculated using TG No. 119's approximation and a truncated distribution, is 1.2%. Conclusions: A three-parameter Weibull probability distribution more accurately fits the clinical gamma index pass-rate data than the normal distribution adopted by TG No. 119. However, the sensitivity of the confidence limit on distribution fit is low outside of exceptional circumstances.
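
    The comparison at stake can be sketched numerically: a 95% lower limit on gamma pass rates obtained under the report's normal-distribution convention (mean minus 1.96 standard deviations) versus a percentile taken from a fitted Weibull. In the sketch below the pass-rate sample is simulated, and the Weibull is fitted to the failure rate (100 minus the pass rate) as one simple way to respect the 100% ceiling; this is an illustrative variant, not necessarily the paper's exact parameterisation.

    # Normal-assumption versus distribution-fit action levels for gamma pass rates.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(7)
    pass_rates = 100.0 - rng.weibull(1.5, 111) * 3.0   # simulated per-patient pass rates (%)
    pass_rates = np.clip(pass_rates, 0.0, 100.0)

    # Normal-distribution convention: lower 95% limit at mean - 1.96 * sd.
    mean, sd = pass_rates.mean(), pass_rates.std(ddof=1)
    normal_lower = mean - 1.96 * sd

    # Distribution-fit alternative: Weibull on the failure rate, 95th percentile.
    c, loc, scale = stats.weibull_min.fit(100.0 - pass_rates, floc=0.0)
    weibull_lower = 100.0 - stats.weibull_min.ppf(0.95, c, loc=loc, scale=scale)

    print(f"normal-assumption lower limit : {normal_lower:5.1f}%")
    print(f"Weibull-fit lower limit       : {weibull_lower:5.1f}%")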

  9. Upper Extremity Motor Learning among Individuals with Parkinson's Disease: A Meta-Analysis Evaluating Movement Time in Simple Tasks

    Directory of Open Access Journals (Sweden)

    K. Felix

    2012-01-01

    Full Text Available Motor learning has been found to occur in the rehabilitation of individuals with Parkinson's disease (PD. Through repetitive structured practice of motor tasks, individuals show improved performance, confirming that motor learning has probably taken place. Although a number of studies have been completed evaluating motor learning in people with PD, the sample sizes were small and the improvements were variable. The purpose of this meta-analysis was to determine the ability of people with PD to learn motor tasks. Studies which measured movement time in upper extremity reaching tasks and met the inclusion criteria were included in the analysis. Results of the meta-analysis indicated that people with PD and neurologically healthy controls both demonstrated motor learning, characterized by a decrease in movement time during upper extremity movements. Movement time improvements were greater in the control group than in individuals with PD. These results support the findings that the practice of upper extremity reaching tasks is beneficial in reducing movement time in persons with PD and has important implications for rehabilitation.

  10. The effect analysis of mediation variable of task productivity on the self-efficacy and employees’ performance

    Directory of Open Access Journals (Sweden)

    Annissa Chairum Soebandono

    2016-11-01

    Full Text Available This research analyzes the effect of a mediating variable, individual task proactivity (a form of proactive behavior), on the relationship between self-efficacy and employee performance in a logistics company shipping goods. Convenience sampling, a non-probability sampling method, was used to obtain a sample of 52 employees drawn from two divisions, namely infrastructure and quality assurance. The questionnaire consisted of two parts: some items were self-assessed and others were assessed by the supervisor. The data were analyzed using path analysis with the Preacher–Hayes simple mediation model developed by Hayes. It was found that employees have relatively high self-efficacy, individual task proactivity, and performance, and that individual task proactivity can act as a mediating variable in the effect of self-efficacy on performance.
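
    The Preacher–Hayes simple mediation model referred to above boils down to two regressions and a bootstrapped indirect effect. Below is a minimal sketch on simulated stand-ins for self-efficacy (X), individual task proactivity (M) and performance (Y); the coefficients and sample are illustrative, not the study's data.

    # Simple mediation: indirect effect a*b with a percentile bootstrap.
    import numpy as np

    rng = np.random.default_rng(42)
    n = 52
    x = rng.normal(size=n)                                   # self-efficacy
    m = 0.5 * x + rng.normal(scale=0.8, size=n)              # task proactivity
    y = 0.4 * m + 0.2 * x + rng.normal(scale=0.8, size=n)    # performance

    def slope(x, y):
        """OLS slope of y on x (with intercept)."""
        X = np.column_stack([np.ones_like(x), x])
        return np.linalg.lstsq(X, y, rcond=None)[0][1]

    def slopes_two(x1, x2, y):
        """OLS slopes of y on x1 and x2 jointly (with intercept)."""
        X = np.column_stack([np.ones_like(x1), x1, x2])
        beta = np.linalg.lstsq(X, y, rcond=None)[0]
        return beta[1], beta[2]

    a = slope(x, m)                     # X -> M
    b, c_prime = slopes_two(m, x, y)    # M -> Y controlling for X, plus direct effect
    indirect = a * b

    boot = []
    for _ in range(2000):
        idx = rng.integers(0, n, n)
        a_i = slope(x[idx], m[idx])
        b_i, _ = slopes_two(m[idx], x[idx], y[idx])
        boot.append(a_i * b_i)
    lo, hi = np.percentile(boot, [2.5, 97.5])

    print(f"indirect effect a*b = {indirect:.3f}, 95% bootstrap CI [{lo:.3f}, {hi:.3f}]")
    print(f"direct effect c'    = {c_prime:.3f}")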

  11. Lessons from a pilot project in cognitive task analysis: the potential role of intermediates in preclinical teaching in dental education.

    Science.gov (United States)

    Walker, Judith; von Bergmann, HsingChi

    2015-03-01

    The purpose of this study was to explore the use of cognitive task analysis to inform the teaching of psychomotor skills and cognitive strategies in clinical tasks in dental education. Methods used were observing and videotaping an expert at one dental school thinking aloud while performing a specific preclinical task (in a simulated environment), interviewing the expert to probe deeper into his thinking processes, and applying the same procedures to analyze the performance of three second-year dental students who had recently learned the analyzed task and who represented a spectrum of their cohort's ability to undertake the procedure. The investigators sought to understand how experts (clinical educators) and intermediates (trained students) overlapped and differed at points in the procedure that represented the highest cognitive load, known as "critical incidents." Findings from this study and previous research identified possible limitations of current clinical teaching as a result of expert blind spots. These findings coupled with the growing evidence of the effectiveness of peer teaching suggest the potential role of intermediates in helping novices learn preclinical dentistry tasks.

  12. Task analysis of human-in-the-loop tele-operated maintenance: What can be learned from JET?

    International Nuclear Information System (INIS)

    Boessenkool, H.; Thomas, J.; Heemskerk, C.J.M.; Baar, M.R. de; Steinbuch, M.; Abbink, D.A.

    2014-01-01

    Highlights: •Maintenance task execution at JET was analyzed to guide improvements for ITER. •A large variation in task duration was found for various operator experience levels. •Results indicate significant room for improvement for tele-operated performance. •Improvement of visual feedback and artificial guiding forces was considered promising. -- Abstract: Remote maintenance will determine the available uptime of future fusion plants such as ITER. Experience at predecessor JET showed that a human-in-the-loop tele-operated approach is crucial, although this approach entails drawbacks such as the unavoidable extensive operator training and relatively long execution times. These drawbacks are common knowledge, but little quantitative research is available to guide improvements (such as improved training methods, or active operator support systems). The aim of this paper is to identify the key areas for further improvement of tele-operated maintenance. This is achieved by a detailed task analysis based on recent maintenance at JET, using task logbooks and video data as well as interviews with experienced master–slave operators. The resulting task analysis shows the (sub)tasks that were most time-consuming and shows a large variance in time performance within operators, but also substantial differences between qualified operators with different levels of experience. The operator interviews indicate that intuitive (virtual) visual feedback and artificial (guiding) forces are promising directions for improvement. The results found in this study will be used for future research and development activities focusing on haptic guiding strategies, with the aim to further design and optimize RH maintenance systems for ITER and beyond

  13. A task analysis-linked approach for integrating the human factor in reliability assessments of nuclear power plants

    International Nuclear Information System (INIS)

    Ryan, T.G.

    1988-01-01

    This paper describes an emerging Task Analysis-Linked Evaluation Technique (TALENT) for assessing the contributions of human error to nuclear power plant systems unreliability and risk. Techniques such as TALENT are emerging from the recognition that human error is a primary contributor to plant safety risk; however, it has been treated as peripheral to the data used in plant reliability evaluations. TALENT also recognizes that the involvement of persons with behavioral science expertise is required to support plant reliability and risk analyses. A number of state-of-knowledge human reliability analysis tools which support the TALENT process are also discussed. The core of TALENT comprises task, timeline and interface analysis data which provide the technology base for event and fault tree development, serve as criteria for selecting and evaluating performance shaping factors, and provide a basis for auditing TALENT results. Finally, programs and case studies used to refine the TALENT process are described along with future research needs in the area. (author)

  14. Incorporating Language Structure in a Communicative Task: An Analysis of the Language Component of a Communicative Task in the LINC Home Study Program

    Science.gov (United States)

    Lenchuk, Iryna

    2014-01-01

    The purpose of this article is to analyze a task included in the LINC Home Study (LHS) program. LHS is a federally funded distance education program offered to newcomers to Canada who are unable to attend regular LINC classes. A task, in which a language structure (a gerund) is chosen and analyzed, was selected from one instructional module of LHS…

  15. The Scaling of Water Governance Tasks: A Comparative Federal Analysis of the European Union and Australia

    Science.gov (United States)

    Benson, David; Jordan, Andrew

    2010-07-01

    Conflicts over how to “scale” policy-making tasks have characterized environmental governance since time immemorial. They are particularly evident in the area of water policy and raise important questions over the democratic legitimacy, economic efficiency and effectiveness of allocating (or “scaling”) tasks to some administrative levels as opposed to others. This article adopts a comparative federalism perspective to assess the “optimality” of scaling—either upward or downward—in one issue area, namely coastal recreational water quality. It does so by comparing the scaling of recreational water quality tasks in the European Union (EU) and Australia. It reveals that the two systems have adopted rather different approaches to scaling and that this difference can partly be accounted for in federal theoretical terms. However, a much greater awareness of the inescapably political nature of scaling processes is nonetheless required. Finally, some words of caution are offered with regard to transferring policy lessons between these two jurisdictions.

  16. Automated acoustic analysis of task dependency in adductor spasmodic dysphonia versus muscle tension dysphonia.

    Science.gov (United States)

    Roy, Nelson; Mazin, Alqhazo; Awan, Shaheen N

    2014-03-01

    Distinguishing muscle tension dysphonia (MTD) from adductor spasmodic dysphonia (ADSD) can be difficult. Unlike MTD, ADSD is described as "task-dependent," implying that dysphonia severity varies depending upon the demands of the vocal task, with connected speech thought to be more symptomatic than sustained vowels. This study used an acoustic index of dysphonia severity (i.e., the Cepstral Spectral Index of Dysphonia [CSID]) to: 1) assess the value of "task dependency" to distinguish ADSD from MTD, and to 2) examine associations between the CSID and listener ratings. Case-Control Study. CSID estimates of dysphonia severity for connected speech and sustained vowels of patients with ADSD (n = 36) and MTD (n = 45) were compared. The diagnostic precision of task dependency (as evidenced by differences in CSID-estimated dysphonia severity between connected speech and sustained vowels) was examined. In ADSD, CSID-estimated severity for connected speech (M = 39.2, SD = 22.0) was significantly worse than for sustained vowels (M = 29.3, SD = 21.9), [P = .020]. Whereas in MTD, no significant difference in CSID-estimated severity was observed between connected speech (M = 55.1, SD = 23.8) and sustained vowels (M = 50.0, SD = 27.4), [P = .177]. CSID evidence of task dependency correctly identified 66.7% of ADSD cases (sensitivity) and 64.4% of MTD cases (specificity). CSID and listener ratings were significantly correlated. Task dependency in ADSD, as revealed by differences in acoustically-derived estimates of dysphonia severity between connected speech and sustained vowel production, is a potentially valuable diagnostic marker. © 2013 The American Laryngological, Rhinological and Otological Society, Inc.
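
    The diagnostic use of task dependency described above can be sketched as a simple rule: flag a patient as ADSD when the connected-speech CSID exceeds the sustained-vowel CSID by more than some threshold, then count sensitivity and specificity. The per-patient scores below are simulated from the reported group means and standard deviations, and the threshold is an illustrative choice rather than the study's.

    # Task dependency (speech CSID minus vowel CSID) as a diagnostic marker.
    import numpy as np

    rng = np.random.default_rng(3)

    # CSID severity per group, simulated from the reported means/SDs.
    adsd_speech = rng.normal(39.2, 22.0, 36)
    adsd_vowel = rng.normal(29.3, 21.9, 36)
    mtd_speech = rng.normal(55.1, 23.8, 45)
    mtd_vowel = rng.normal(50.0, 27.4, 45)

    threshold = 5.0   # flag ADSD if speech is >5 CSID points worse than vowel (assumed)

    adsd_flagged = (adsd_speech - adsd_vowel) > threshold
    mtd_flagged = (mtd_speech - mtd_vowel) > threshold

    sensitivity = adsd_flagged.mean()          # ADSD correctly identified
    specificity = 1.0 - mtd_flagged.mean()     # MTD correctly not flagged
    print(f"sensitivity ~ {sensitivity:.2f}, specificity ~ {specificity:.2f}")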

  17. Near threshold expansion of Feynman diagrams

    International Nuclear Information System (INIS)

    Mendels, E.

    2005-01-01

    The near threshold expansion of Feynman diagrams is derived from their configuration space representation, by performing all x integrations. The general scalar Feynman diagram is considered, with an arbitrary number of external momenta, an arbitrary number of internal lines and an arbitrary number of loops, in n dimensions and all masses may be different. The expansions are considered both below and above threshold. Rules, giving real and imaginary part, are derived. Unitarity of a sunset diagram with I internal lines is checked in a direct way by showing that its imaginary part is equal to the phase space integral of I particles

  18. Between Analogue and Digital Diagrams

    Directory of Open Access Journals (Sweden)

    Zoltan Bun

    2012-10-01

    Full Text Available This essay is about the interstitial: about how the diagram, as a method of design, has led from the analogue deconstruction of the eighties to the digital processes of the turn of the millennium. Specifically, the main topic of the text is the interpretation and the critique of folding (as a diagram) in the beginning of the nineties. It is necessary then to unfold its relationship with immediately preceding and following architectural trends, that is to say we have to look both backwards and forwards by about a decade. The question is the context of folding, the exchange of the analogue world for the digital. To understand the process it is easier to investigate from the fields of art and culture, rather than from the intentionally perplicated thoughts of Gilles Deleuze. Both fields are relevant here because they can similarly be used as the yardstick against which the era itself is measured. The cultural scene of the eighties and nineties, including performing arts, movies, literature and philosophy, is a wide milieu of architecture. Architecture responds in parallel to its era; it reacts to it, and changes with it and within it. Architecture is a medium, it has always been a medium, yet the relations are transformed. That's not to say that technical progress, for example using CAD software and CNCs, has led to the digital thinking of certain movements of architecture (it is at most an indirect tool). But the 'up-to-dateness' of the discipline, however, a kind of non-servile reading of an 'applied culture' or 'used philosophy', could be the key. (We might recall here, parenthetically, the fortunes of the artistic in contemporary mass society. The proliferation of museums, the magnification of the figure of the artist, the existence of a massive consumption of printed and televised artistic images, the widespread appetite for information about the arts, all reflect, of course, an increasingly leisured society, but also relate precisely to the fact

  19. Involvement of the anterior cingulate cortex in time-based prospective memory task monitoring: An EEG analysis of brain sources using Independent Component and Measure Projection Analysis.

    Directory of Open Access Journals (Sweden)

    Gabriela Cruz

    Full Text Available Time-based prospective memory (PM), remembering to do something at a particular moment in the future, is considered to depend upon self-initiated strategic monitoring, involving a retrieval mode (sustained maintenance of the intention) plus target checking (intermittent time checks). The present experiment was designed to explore what brain regions and brain activity are associated with these components of strategic monitoring in time-based PM tasks. 24 participants were asked to reset a clock every four minutes, while performing a foreground ongoing word categorisation task. EEG activity was recorded and data were decomposed into source-resolved activity using Independent Component Analysis. Common brain regions across participants, associated with retrieval mode and target checking, were found using Measure Projection Analysis. Participants decreased their performance on the ongoing task when it was performed concurrently with the time-based PM task, reflecting an active retrieval mode that relied on withdrawal of limited resources from the ongoing task. Brain activity, with its source in or near the anterior cingulate cortex (ACC), showed changes associated with an active retrieval mode, including greater negative ERP deflections, decreased theta synchronization, and increased alpha suppression for events locked to the ongoing task while maintaining a time-based intention. Activity in the ACC was also associated with time checks and was found consistently across participants; however, we did not find an association with time perception processing per se. The involvement of the ACC in both aspects of time-based PM monitoring may be related to different functions that have been attributed to it: strategic control of attention during the retrieval mode (distributing attentional resources between the ongoing task and the time-based task) and anticipatory/decision-making processing associated with clock checks.
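
    The first processing step described above, decomposing multichannel EEG into source-resolved activity with Independent Component Analysis, can be sketched with a generic ICA implementation. The example below uses scikit-learn's FastICA on simulated channel mixtures; the study's own ICA pipeline and Measure Projection Analysis are specialised tools and are not reproduced here.

    # Unmix simulated multichannel "EEG" into independent components with FastICA.
    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(0)
    t = np.linspace(0, 4, 2048)

    # Three "source" signals: an alpha-like rhythm, a theta-like rhythm, and noise.
    sources = np.c_[np.sin(2 * np.pi * 10 * t),
                    np.sin(2 * np.pi * 5 * t + 1.0),
                    rng.laplace(size=t.size)]

    # Mix them into 8 simulated EEG channels.
    mixing = rng.normal(size=(3, 8))
    eeg = sources @ mixing + 0.05 * rng.normal(size=(t.size, 8))

    ica = FastICA(n_components=3, random_state=0)
    components = ica.fit_transform(eeg)        # time courses of independent components
    print("unmixed component array:", components.shape)   # (2048, 3)
    print("estimated mixing matrix:", ica.mixing_.shape)  # (8, 3) channel weights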

  20. A task specific uncertainty analysis method for least-squares-based form characterization of ultra-precision freeform surfaces

    International Nuclear Information System (INIS)

    Ren, M J; Cheung, C F; Kong, L B

    2012-01-01

    In the measurement of ultra-precision freeform surfaces, least-squares-based form characterization methods are widely used to evaluate the form error of the measured surfaces. Although many methodologies have been proposed in recent years to improve the efficiency of the characterization process, relatively little research has been conducted on the analysis of the associated uncertainty in the characterization results that may arise from the characterization methods being used. As a result, this paper presents a task specific uncertainty analysis method with application in the least-squares-based form characterization of ultra-precision freeform surfaces. That is, the associated uncertainty in the form characterization results is estimated when the measured data are extracted from a specific surface with a specific sampling strategy. Three factors are considered in this study: measurement error, surface form error and sample size. The task specific uncertainty analysis method has been evaluated through a series of experiments. The results show that the task specific uncertainty analysis method can effectively estimate the uncertainty of the form characterization results for a specific freeform surface measurement
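
    The idea of a task-specific uncertainty estimate, propagating a stated measurement noise and sampling strategy through the least-squares characterisation itself, can be sketched with a Monte Carlo loop. Below, a sphere stands in for the freeform surface and a linear least-squares fit for the characterisation method; the radius, noise level, sample size and number of repetitions are assumptions for illustration.

    # Monte Carlo estimate of the uncertainty of a least-squares form-error result.
    import numpy as np

    rng = np.random.default_rng(0)
    R_NOMINAL, NOISE_SD, N_POINTS = 50.0, 0.0002, 400   # mm; assumed task parameters

    def sample_surface(n, noise_sd):
        """Sample n points on a nominal sphere cap with Gaussian measurement noise."""
        theta = rng.uniform(0, 0.5, n)
        phi = rng.uniform(0, 2 * np.pi, n)
        pts = R_NOMINAL * np.c_[np.sin(theta) * np.cos(phi),
                                np.sin(theta) * np.sin(phi),
                                np.cos(theta)]
        return pts + rng.normal(scale=noise_sd, size=pts.shape)

    def sphere_fit_form_error(pts):
        """Linear least-squares sphere fit; return peak-to-valley residual (form error)."""
        A = np.c_[2 * pts, np.ones(len(pts))]
        b = (pts ** 2).sum(axis=1)
        sol, *_ = np.linalg.lstsq(A, b, rcond=None)
        centre, r = sol[:3], np.sqrt(sol[3] + (sol[:3] ** 2).sum())
        residuals = np.linalg.norm(pts - centre, axis=1) - r
        return residuals.max() - residuals.min()

    pv = [sphere_fit_form_error(sample_surface(N_POINTS, NOISE_SD)) for _ in range(200)]
    print(f"form error PV = {np.mean(pv)*1e3:.2f} um +/- {np.std(pv)*1e3:.2f} um (1 sigma)")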

  1. Development of the complex of nuclear-physical methods of analysis for geology and technology tasks in Kazakhstan

    International Nuclear Information System (INIS)

    Solodukhin, V.; Silachyov, I.; Poznyak, V.; Gorlachev, I.

    2016-01-01

    The paper describes the development of nuclear-physical methods of analysis and their applications in Kazakhstan for geological tasks and technology. The basic methods of this complex include instrumental neutron-activation analysis, x-ray fluorescent analysis and instrumental γ-spectrometry. The following aspects are discussed: applications of developed and adopted analytical techniques for assessment and calculations of rare-earth metal reserves at various deposits in Kazakhstan, for technology development of mining and extraction from uranium-phosphorous ore and wastes, for radioactive coal gasification technology, for studies of rare metal contents in chromite, bauxites, black shales and their processing products. (author)

  2. A diffusion decision model analysis of evidence variability in the lexical decision task

    NARCIS (Netherlands)

    Tillman, Gabriel; Osth, Adam F.; van Ravenzwaaij, Don; Heathcote, Andrew

    2017-01-01

    The lexical-decision task is among the most commonly used paradigms in psycholinguistics. In both the signal-detection theory and Diffusion Decision Model (DDM; Ratcliff, Gomez, & McKoon, Psychological Review, 111, 159–182, 2004) frameworks, lexical-decisions are based on a continuous source of

  3. How Can Writing Tasks Be Characterized in a Way Serving Pedagogical Goals and Automatic Analysis Needs?

    Science.gov (United States)

    Quixal, Martí; Meurers, Detmar

    2016-01-01

    The paper tackles a central question in the field of Intelligent Computer-Assisted Language Learning (ICALL): How can language learning tasks be conceptualized and made explicit in a way that supports the pedagogical goals of current Foreign Language Teaching and Learning and at the same time provides an explicit characterization of the Natural…

  4. Task and person-focused leadership behaviors and team performance : A meta-analysis

    NARCIS (Netherlands)

    Ceri-Booms, Meltem; Curseu, P.L.; Oerlemans, L.A.G.

    2017-01-01

    This paper reports the results of a meta-analytic review of the relationship between person and task oriented leader behaviors, on the one hand, and team performance, on the other hand. The results, based on 89 independent samples, show a moderate positive (ρ=.33) association between both types of

  5. Path Analysis Examining Self-Efficacy and Decision-Making Performance on a Simulated Baseball Task

    Science.gov (United States)

    Hepler, Teri J.; Feltz, Deborah L.

    2012-01-01

    The purpose of this study was to examine the relationship between decision-making self-efficacy and decision-making performance in sport. Undergraduate students (N = 78) performed 10 trials of a decision-making task in baseball. Self-efficacy was measured before performing each trial. Decision-making performance was assessed by decision speed and…

  6. Gender Perspectives on Spatial Tasks in a National Assessment: A Secondary Data Analysis

    Science.gov (United States)

    Logan, Tracy; Lowrie, Tom

    2017-01-01

    Most large-scale summative assessments present results in terms of cumulative scores. Although such descriptions can provide insights into general trends over time, they do not provide detail of how students solved the tasks. Less restrictive access to raw data from these summative assessments has occurred in recent years, resulting in…

  7. A Longitudinal Analysis of Adolescent Decision-Making with the Iowa Gambling Task

    Science.gov (United States)

    Almy, Brandon; Kuskowski, Michael; Malone, Stephen M.; Myers, Evan; Luciana, Monica

    2018-01-01

    Many researchers have used the standard Iowa Gambling Task (IGT) to assess decision-making in adolescence given increased risk-taking during this developmental period. Most studies are cross-sectional and do not observe behavioral trajectories over time, limiting interpretation. This longitudinal study investigated healthy adolescents' and young…

  8. Iowa Gambling Task in patients with early-onset Parkinson’s disease: strategy analysis

    Czech Academy of Sciences Publication Activity Database

    Gescheidt, T.; Czekóová, Kristína; Urbánek, Tomáš; Mareček, R.; Mikl, M.; Kubíková, R.; Telecká, S.; Andrlová, H.; Husárová, I.; Bareš, M.

    2012-01-01

    Vol. 33, No. 6 (2012), pp. 1329-1335 ISSN 1590-1874 R&D Projects: GA ČR(CZ) GAP407/12/2432 Institutional support: RVO:68081740 Keywords: Parkinson’s disease * decision making * Iowa gambling task * executive function Subject RIV: FL - Psychiatry, Sexuology Impact factor: 1.412, year: 2012

  9. A meta-analysis of the impact of situationally induced achievement goals on task performance

    NARCIS (Netherlands)

    Van Yperen, Nico W.; Blaga, Monica; Postmes, Thomas

    2015-01-01

    The purpose of this research was to meta-analyze studies which experimentally induced an achievement goal state to examine its causal effect on the individual’s performance at the task at hand, and to investigate the moderator effects of feedback anticipation and time pressure. The data set

  10. Quantitative Analysis of Language Production in Parkinson's Disease Using a Cued Sentence Generation Task

    Science.gov (United States)

    Vanhoutte, Sarah; De Letter, Miet; Corthals, Paul; Van Borsel, John; Santens, Patrick

    2012-01-01

    The present study examined language production skills in Parkinson's disease (PD) patients. A unique cued sentence generation task was created in order to reduce demands on memory and attention. Differences in sentence production abilities according to disease severity and cognitive impairments were assessed. Language samples were obtained from 20…

  11. Voronoi diagram and microstructure of weldment

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Jung Ho [Chungbuk National University, Cheongju (Korea, Republic of)

    2015-01-15

    The Voronoi diagram, one of the well-known space decomposition algorithms, has been applied to represent the microstructure of a weldment for the first time, based on the superficial analogy between a Voronoi cell and a metal grain. The area of the Voronoi cells can be controlled by the location and number of the seed points, which can be correlated with the grain size in the microstructure and the number of nuclei formed. The feasibility of representing coarse and fine grain structures was tested through Voronoi diagrams, and the approach was applied to expressing the cross-sectional bead shape of a typical laser weld. As a result, it successfully described the coarsened grains of the heat-affected zone and the columnar crystals in the fusion zone. Although the Voronoi diagram showed potential as a microstructure prediction tool in this feasibility trial, directly correlating the control variables of the Voronoi diagram with solidification process parameters remains future work.
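
    The analogy described above, seed points as nuclei and Voronoi cells as grains, is easy to reproduce with standard computational-geometry tools: fewer seeds give a coarser simulated "microstructure". The sketch below uses SciPy's Voronoi tessellation on random seeds in a unit square; the seed counts and domain are illustrative only.

    # Voronoi cells as a stand-in for grains: fewer nuclei -> larger mean cell area.
    import numpy as np
    from scipy.spatial import Voronoi

    rng = np.random.default_rng(5)

    def mean_cell_area(n_seeds):
        """Mean area of the bounded Voronoi cells for n_seeds random nuclei in a unit square."""
        seeds = rng.random((n_seeds, 2))
        vor = Voronoi(seeds)
        areas = []
        for region_index in vor.point_region:
            region = vor.regions[region_index]
            if -1 in region or len(region) == 0:
                continue                      # skip unbounded cells on the boundary
            poly = vor.vertices[region]
            x, y = poly[:, 0], poly[:, 1]
            # Shoelace formula for the polygon area.
            areas.append(0.5 * abs(np.dot(x, np.roll(y, 1)) - np.dot(y, np.roll(x, 1))))
        return np.mean(areas)

    print("coarse structure,  50 nuclei -> mean cell area", round(mean_cell_area(50), 4))
    print("fine structure,   500 nuclei -> mean cell area", round(mean_cell_area(500), 5))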

  12. Covariant diagrams for one-loop matching

    International Nuclear Information System (INIS)

    Zhang, Zhengkang

    2016-10-01

    We present a diagrammatic formulation of recently-revived covariant functional approaches to one-loop matching from an ultraviolet (UV) theory to a low-energy effective field theory. Various terms following from a covariant derivative expansion (CDE) are represented by diagrams which, unlike conventional Feynman diagrams, involve gauge-covariant quantities and are thus dubbed “covariant diagrams.” The use of covariant diagrams helps organize and simplify one-loop matching calculations, which we illustrate with examples. Of particular interest is the derivation of UV model-independent universal results, which reduce matching calculations of specific UV models to applications of master formulas. We show how such derivation can be done in a more concise manner than the previous literature, and discuss how additional structures that are not directly captured by existing universal results, including mixed heavy-light loops, open covariant derivatives, and mixed statistics, can be easily accounted for.

  13. A novel decision diagrams extension method

    International Nuclear Information System (INIS)

    Li, Shumin; Si, Shubin; Dui, Hongyan; Cai, Zhiqiang; Sun, Shudong

    2014-01-01

    A binary decision diagram (BDD) is a graph-based representation of Boolean functions: a directed acyclic graph (DAG) based on Shannon's decomposition. The multi-state multi-valued decision diagram (MMDD) is a natural extension of the BDD for the symbolic representation and manipulation of multi-valued logic functions. This paper proposes a decision diagram extension method that builds on an original BDD/MMDD when the scale of a reliability system is extended. Following a discussion of the decomposition and physical meaning of BDD and MMDD, the modeling method for constructing a BDD/MMDD from the original BDD/MMDD is introduced. Three case studies are implemented to demonstrate the presented methods. Compared with traditional BDD and MMDD generation methods, the decision diagram extension method is more computationally efficient, as shown by the running time
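
    The Shannon decomposition that underlies the BDD can be shown in a few lines: expand the function on each variable in a fixed order and share identical sub-diagrams. The sketch below builds a small reduced BDD for a 2-out-of-3 structure function; it illustrates the underlying decomposition only and does not reproduce the paper's BDD/MMDD extension method.

    # Tiny reduced BDD built by Shannon decomposition over an ordered variable list.
    VARS = ["x1", "x2", "x3"]                  # variable ordering (illustrative)

    def f(x1, x2, x3):
        """Example structure function: a 2-out-of-3 system."""
        return (x1 and x2) or (x1 and x3) or (x2 and x3)

    unique = {}                                # node table: gives node sharing (reduction)

    def mk(var, low, high):
        """Return a shared node, collapsing redundant tests where low == high."""
        if low == high:
            return low
        return unique.setdefault((var, low, high), (var, low, high))

    def build(level=0, assignment=()):
        """Shannon decomposition f = x'.f|x=0 + x.f|x=1 applied down the variable order."""
        if level == len(VARS):
            return 1 if f(*assignment) else 0  # terminal node
        low = build(level + 1, assignment + (0,))
        high = build(level + 1, assignment + (1,))
        return mk(VARS[level], low, high)

    root = build()
    print("root:", root)
    print("internal BDD nodes:", len(unique))  # 4 for the 2-out-of-3 function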

  14. Covariant diagrams for one-loop matching

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Zhengkang [Michigan Center for Theoretical Physics (MCTP), University of Michigan,450 Church Street, Ann Arbor, MI 48109 (United States); Deutsches Elektronen-Synchrotron (DESY),Notkestraße 85, 22607 Hamburg (Germany)

    2017-05-30

    We present a diagrammatic formulation of recently-revived covariant functional approaches to one-loop matching from an ultraviolet (UV) theory to a low-energy effective field theory. Various terms following from a covariant derivative expansion (CDE) are represented by diagrams which, unlike conventional Feynman diagrams, involve gauge-covariant quantities and are thus dubbed “covariant diagrams.” The use of covariant diagrams helps organize and simplify one-loop matching calculations, which we illustrate with examples. Of particular interest is the derivation of UV model-independent universal results, which reduce matching calculations of specific UV models to applications of master formulas. We show how such derivation can be done in a more concise manner than the previous literature, and discuss how additional structures that are not directly captured by existing universal results, including mixed heavy-light loops, open covariant derivatives, and mixed statistics, can be easily accounted for.

  15. Covariant diagrams for one-loop matching

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Zhengkang [Michigan Univ., Ann Arbor, MI (United States). Michigan Center for Theoretical Physics; Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany)

    2016-10-15

    We present a diagrammatic formulation of recently-revived covariant functional approaches to one-loop matching from an ultraviolet (UV) theory to a low-energy effective field theory. Various terms following from a covariant derivative expansion (CDE) are represented by diagrams which, unlike conventional Feynman diagrams, involve gauge-covariant quantities and are thus dubbed “covariant diagrams.” The use of covariant diagrams helps organize and simplify one-loop matching calculations, which we illustrate with examples. Of particular interest is the derivation of UV model-independent universal results, which reduce matching calculations of specific UV models to applications of master formulas. We show how such derivation can be done in a more concise manner than the previous literature, and discuss how additional structures that are not directly captured by existing universal results, including mixed heavy-light loops, open covariant derivatives, and mixed statistics, can be easily accounted for.

  16. Covariant diagrams for one-loop matching

    International Nuclear Information System (INIS)

    Zhang, Zhengkang

    2017-01-01

    We present a diagrammatic formulation of recently-revived covariant functional approaches to one-loop matching from an ultraviolet (UV) theory to a low-energy effective field theory. Various terms following from a covariant derivative expansion (CDE) are represented by diagrams which, unlike conventional Feynman diagrams, involve gauge-covariant quantities and are thus dubbed “covariant diagrams.” The use of covariant diagrams helps organize and simplify one-loop matching calculations, which we illustrate with examples. Of particular interest is the derivation of UV model-independent universal results, which reduce matching calculations of specific UV models to applications of master formulas. We show how such derivation can be done in a more concise manner than the previous literature, and discuss how additional structures that are not directly captured by existing universal results, including mixed heavy-light loops, open covariant derivatives, and mixed statistics, can be easily accounted for.

  17. Compatible growth models and stand density diagrams

    International Nuclear Information System (INIS)

    Smith, N.J.; Brand, D.G.

    1988-01-01

    This paper discusses a stand average growth model based on the self-thinning rule developed and used to generate stand density diagrams. Procedures involved in testing are described and results are included

  18. Modern summation methods and the computation of 2- and 3-loop Feynman diagrams

    International Nuclear Information System (INIS)

    Ablinger, Jakob; Schneider, Carsten; Bluemlein, Johannes; Klein, Sebastian

    2010-06-01

    By symbolic summation methods based on difference fields we present a general strategy that transforms definite multi-sums, e.g., in terms of hypergeometric terms and harmonic sums, to indefinite nested sums and products. We succeeded in this task with all our concrete calculations of 2-loop and 3-loop massive single scale Feynman diagrams with local operator insertion. (orig.)

  19. Modern Summation Methods and the Computation of 2- and 3-loop Feynman Diagrams

    International Nuclear Information System (INIS)

    Ablinger, Jakob; Bluemlein, Johannes; Klein, Sebastian; Schneider, Carsten

    2010-01-01

    By symbolic summation methods based on difference fields we present a general strategy that transforms definite multi-sums, e.g., in terms of hypergeometric terms and harmonic sums, to indefinite nested sums and products. We succeeded in this task with all our concrete calculations of 2-loop and 3-loop massive single scale Feynman diagrams with local operator insertion.

  20. Modern summation methods and the computation of 2- and 3-loop Feynman diagrams

    Energy Technology Data Exchange (ETDEWEB)

    Ablinger, Jakob; Schneider, Carsten [Linz Univ. (AT). Research Inst. for Symbolic Computation (RISC); Bluemlein, Johannes [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany); Klein, Sebastian [RWTH Aachen (Germany). Inst. fuer Theoretische Teilchenphysik und Kosmologie

    2010-06-15

    By symbolic summation methods based on difference fields we present a general strategy that transforms definite multi-sums, e.g., in terms of hypergeometric terms and harmonic sums, to indefinite nested sums and products. We succeeded in this task with all our concrete calculations of 2-loop and 3-loop massive single scale Feynman diagrams with local operator insertion. (orig.)
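
    As a minimal illustration of the kind of transformation these records describe (a definite sum rewritten in terms of indefinite nested harmonic sums; the elementary identity below is for orientation only and is not taken from the papers):

    \[
    S_{1}(n) \;\equiv\; \sum_{i=1}^{n}\frac{1}{i}, \qquad
    \sum_{i=1}^{n} S_{1}(i) \;=\; (n+1)\,S_{1}(n) \;-\; n,
    \]

    so a definite double sum collapses to a single nested (harmonic) sum in the outer variable, which is the normal form targeted by the difference-field algorithms mentioned above.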

  1. Finding and Accessing Diagrams in Biomedical Publications

    OpenAIRE

    Kuhn, Tobias; Luong, ThaiBinh; Krauthammer, Michael

    2012-01-01

    Complex relationships in biomedical publications are often communicated by diagrams such as bar and line charts, which are a very effective way of summarizing and communicating multi-faceted data sets. Given the ever-increasing amount of published data, we argue that the precise retrieval of such diagrams is of great value for answering specific and otherwise hard-to-meet information needs. To this end, we demonstrate the use of advanced image processing and classification for identifying bar...

  2. Ferroelectric Phase Diagram of PVDF:PMMA

    OpenAIRE

    Li, Mengyuan; Stingelin, Natalie; Michels, Jasper J.; Spijkman, Mark-Jan; Asadi, Kamal; Feldman, Kirill; Blom, Paul W. M.; de Leeuw, Dago M.

    2012-01-01

    We have investigated the ferroelectric phase diagram of poly(vinylidene fluoride) (PVDF) and poly(methyl methacrylate) (PMMA). The binary nonequilibrium temperature composition diagram was determined and melting of alpha- and beta-phase PVDF was identified. Ferroelectric beta-PVDF:PMMA blend films were made by melting, ice quenching, and subsequent annealing above the glass transition temperature of PMMA, close to the melting temperature of PVDF. Addition of PMMA suppresses the crystallizatio...

  3. Beyond Feynman Diagrams (1/3)

    CERN Multimedia

    CERN. Geneva

    2013-01-01

    For decades the central theoretical tool for computing scattering amplitudes has been the Feynman diagram. However, Feynman diagrams are just too slow, even on fast computers, to be able to go beyond the leading order in QCD, for complicated events with many jets of hadrons in the final state. Such events are produced copiously at the LHC, and constitute formidable backgrounds to many searches for new physics. Over the past few years, alternative methods that go beyond ...

  4. Atomic energy levels and Grotrian diagrams

    CERN Document Server

    Bashkin, Stanley

    1975-01-01

    Atomic Energy Levels and Grotrian Diagrams, Volume I: Hydrogen I - Phosphorus XV presents diagrams of various elements that show their energy levels and electronic transitions. The book covers the first 15 elements according to their atomic number. The text will be of great use to researchers and practitioners in fields, such as astrophysics, that require pictorial representations of the energy levels and electronic transitions of elements.

  5. An Introduction to Binary Decision Diagrams

    DEFF Research Database (Denmark)

    Andersen, Henrik Reif

    1996-01-01

    This note is a short introduction to Binary Decision Diagrams (BDDs). It provides some background knowledge and describes the core algorithms. It is used in the course "C4340 Advanced Algorithms" at the Technical University of Denmark, autumn 1996.
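
    Since the note itself is not reproduced here, the following is a minimal Python sketch (not the note's code) of the two ideas at the heart of reduced, ordered BDDs: a unique table enforcing the reduction rules, and a memoized apply-style operation built on Shannon expansion.

```python
# Minimal sketch of a reduced, ordered BDD: a unique table enforcing the
# reduction rules, plus a memoized AND via Shannon expansion.
# Terminal nodes are the ids 0 and 1.

class BDD:
    def __init__(self):
        self.unique = {}   # (var, low, high) -> node id
        self.info = {}     # node id -> (var, low, high)
        self.next_id = 2   # ids 0 and 1 are reserved for the terminals

    def mk(self, var, low, high):
        """Return the node for (var, low, high), applying the reduction rules."""
        if low == high:                    # redundant test: skip the node
            return low
        key = (var, low, high)
        if key not in self.unique:         # share isomorphic subgraphs
            self.unique[key] = self.next_id
            self.info[self.next_id] = key
            self.next_id += 1
        return self.unique[key]

    def var(self, i):
        """BDD node for the single variable x_i."""
        return self.mk(i, 0, 1)

    def apply_and(self, u, v, memo=None):
        """Conjunction of two BDDs via Shannon expansion, memoized on node pairs."""
        memo = {} if memo is None else memo
        if (u, v) in memo:
            return memo[(u, v)]
        if u == 0 or v == 0:
            r = 0
        elif u == 1:
            r = v
        elif v == 1:
            r = u
        else:
            uv, ul, uh = self.info[u]
            vv, vl, vh = self.info[v]
            top = min(uv, vv)              # split on the earliest variable
            u0, u1 = (ul, uh) if uv == top else (u, u)
            v0, v1 = (vl, vh) if vv == top else (v, v)
            r = self.mk(top, self.apply_and(u0, v0, memo),
                             self.apply_and(u1, v1, memo))
        memo[(u, v)] = r
        return r

bdd = BDD()
x0, x1 = bdd.var(0), bdd.var(1)
print(bdd.apply_and(x0, x1))   # id of the node representing x0 AND x1
```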

  6. Gluing Ladder Feynman Diagrams into Fishnets

    International Nuclear Information System (INIS)

    Basso, Benjamin; Dixon, Lance J.; Stanford University, CA; University of California, Santa Barbara, CA

    2017-01-01

    We use integrability at weak coupling to compute fishnet diagrams for four-point correlation functions in planar Φ^4 theory. Our results are always multilinear combinations of ladder integrals, which are in turn built out of classical polylogarithms. The Steinmann relations provide a powerful constraint on such linear combinations, which leads to a natural conjecture for any fishnet diagram as the determinant of a matrix of ladder integrals.

  7. Random Young diagrams in a Rectangular Box

    DEFF Research Database (Denmark)

    Beltoft, Dan; Boutillier, Cédric; Enriquez, Nathanaël

    We exhibit the limit shape of random Young diagrams having a distribution proportional to the exponential of their area, and confined in a rectangular box. The Ornstein-Uhlenbeck bridge arises from the fluctuations around the limit shape.

  8. The influence of emotional interference on cognitive control: A meta-analysis of neuroimaging studies using the emotional Stroop task

    OpenAIRE

    Song, Sensen; Zilverstand, Anna; Song, Hongwen; d?Oleire Uquillas, Federico; Wang, Yongming; Xie, Chao; Cheng, Li; Zou, Zhiling

    2017-01-01

    The neural correlates underlying the influence of emotional interference on cognitive control remain a topic of discussion. Here, we assessed 16 neuroimaging studies that used an emotional Stroop task and that reported a significant interaction effect between emotion (stimulus type) and cognitive conflict. There were a total of 330 participants, equaling 132 foci for an activation likelihood estimation (ALE) analysis. Results revealed consistent brain activation patterns related to emotionall...

  9. Bifurcation diagram features of a dc-dc converter under current-mode control

    International Nuclear Information System (INIS)

    Ruzbehani, Mohsen; Zhou Luowei; Wang Mingyu

    2006-01-01

    A common tool for analysing a system's dynamics when it exhibits chaotic behaviour is the bifurcation diagram. In this paper, the bifurcation diagram of an ideal model of a dc-dc converter under current-mode control is analysed. Algebraic relations that give the locations of the critical points and describe the pattern of the bifurcation diagram are derived. It is shown that these simple algebraic and geometrical relations are responsible for the complex pattern of the bifurcation diagrams in such circuits. Previously observed properties are explained in more detail and some new ones are introduced. In addition, a new three-dimensional bifurcation diagram that gives a better picture of the role of the parameters is introduced
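
    The converter map analysed in this record is not reproduced here, but the numerical recipe behind any such bifurcation diagram is short; the sketch below uses the logistic map purely as a stand-in one-dimensional map to show the procedure (sweep a parameter, discard the transient, plot the sampled steady-state orbit).

```python
# Generic recipe for a bifurcation diagram of a one-dimensional map.
# The logistic map is a stand-in; it is not the converter map of the paper.
import numpy as np
import matplotlib.pyplot as plt

def bifurcation(f, params, x0=0.5, n_transient=500, n_keep=200):
    pts_p, pts_x = [], []
    for p in params:
        x = x0
        for _ in range(n_transient):   # discard the transient
            x = f(x, p)
        for _ in range(n_keep):        # record the steady-state orbit
            x = f(x, p)
            pts_p.append(p)
            pts_x.append(x)
    return np.array(pts_p), np.array(pts_x)

logistic = lambda x, r: r * x * (1.0 - x)
p, x = bifurcation(logistic, np.linspace(2.5, 4.0, 800))
plt.plot(p, x, ',k')
plt.xlabel('bifurcation parameter')
plt.ylabel('sampled state (e.g. inductor current at clock instants)')
plt.show()
```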

  10. Estimated D2--DT--T2 phase diagram in the three-phase region

    International Nuclear Information System (INIS)

    Souers, P.C.; Hickman, R.G.; Tsugawa, R.T.

    1976-01-01

    A composite of experimental eH2-D2 phase-diagram data at the three-phase line is assembled from the literature. The phase diagram is a smooth cigar shape without a eutectic point, indicating complete miscibility of liquid and solid phases. Additional data is used to estimate the D2-T2, D2-DT, and DT-T2 binary phase diagrams. These are assembled into the ternary D2-DT-T2 phase diagram. A surface representing the chemical equilibrium of the three species is added to the phase diagram. At chemical equilibrium, it is estimated that 50-50 liquid D-T at 19.7 K is in equilibrium with 42 mole percent T vapor and 54 mole percent T solid. Infrared spectroscopy is suggested as a means of component analysis of liquid and solid mixtures

  11. Reading fitness landscape diagrams through HSAB concepts

    Energy Technology Data Exchange (ETDEWEB)

    Vigneresse, Jean-Louis, E-mail: jean-louis.vigneresse@univ-lorraine.fr

    2014-10-31

    Highlights: • Qualitative information from HSAB descriptors. • 2D–3D diagrams using chemical descriptors (χ, η, ω, α) and principles (MHP, mEP, mPP). • Estimate of the energy exchange during reaction paths. • Examples from complex systems (geochemistry). - Abstract: Fitness landscapes are conceived as ranges of mountains, with local peaks and valleys. In terms of potential, such topographic variations indicate places of local instability or stability. The chemical potential, or electronegativity (its value with the sign changed), carries similar information. In addition to chemical descriptors defined through hard-soft acid-base (HSAB) concepts and computed through density functional theory (DFT), the principles that rule chemical reactions allow the design of such landscape diagrams. The simplest diagram uses electrophilicity and hardness as coordinates. It allows examining the influence of the maximum hardness or minimum electrophilicity principles. A third dimension is introduced within such a diagram by mapping the topography of electronegativity, polarizability or charge exchange. Introducing charge exchange during chemical reactions, or mapping a third parameter (for instance, polarizability), reinforces the information carried by a simple binary diagram. Examples of such diagrams are provided, using data from Earth Sciences, simple oxides or ligands.
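
    The descriptors listed in the highlights are related by standard conceptual-DFT definitions, given here for orientation (one common convention; the paper should be consulted for its exact working definitions):

    \[
    \chi \;=\; -\mu \;\simeq\; \frac{I+A}{2}, \qquad
    \eta \;\simeq\; \frac{I-A}{2}, \qquad
    \omega \;=\; \frac{\chi^{2}}{2\eta},
    \]

    where \(I\) and \(A\) are the ionization energy and electron affinity; the charge transferred between two reacting species A and B is then often estimated as \(\Delta N = (\chi_{A}-\chi_{B})/\bigl[2(\eta_{A}+\eta_{B})\bigr]\).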

  12. The amplituhedron from momentum twistor diagrams

    International Nuclear Information System (INIS)

    Bai, Yuntao; He, Song

    2015-01-01

    We propose a new diagrammatic formulation of the all-loop scattering amplitudes/Wilson loops in planar N=4 SYM, dubbed the “momentum-twistor diagrams”. These are on-shell diagrams obtained by gluing trivalent black and white vertices in momentum twistor space, which, in the reduced diagram case, are known to be related to diagrams in the original twistor space. The new diagrams are manifestly Yangian invariant, and they naturally represent factorization and forward-limit contributions in the all-loop BCFW recursion relations in momentum twistor space, in a fashion that is completely different from those in momentum space. We show how to construct and evaluate momentum-twistor diagrams, and how to use them to obtain tree-level amplitudes and loop-level integrands; in particular the latter involve isolated bubble-structures for loop variables arising from forward limits, or the entangled removal of particles. From each diagram, the generalized “boundary measurement” directly gives the C, D matrices, thus a cell in the amplituhedron associated with the amplitude, and we expect that our diagrammatic representations of the amplitude provide triangulations of the amplituhedron. To demonstrate the computational power of the formalism, we give explicit results for general two-loop integrands, and the cells of the amplituhedron for two-loop MHV amplitudes.

  13. Asymptotic laws for random knot diagrams

    Science.gov (United States)

    Chapman, Harrison

    2017-06-01

    We study random knotting by considering knot and link diagrams as decorated, (rooted) topological maps on spheres and pulling them uniformly from among sets of a given number of vertices n, as first established in recent work with Cantarella and Mastin. The knot diagram model is an exciting new model which captures both the random geometry of space curve models of knotting as well as the ease of computing invariants from diagrams. We prove that unknot diagrams are asymptotically exponentially rare, an analogue of Sumners and Whittington’s landmark result for self-avoiding polygons. Our proof uses the same key idea: we first show that knot diagrams obey a pattern theorem, which describes their fractal structure. We examine how quickly this behavior occurs in practice. As a consequence, almost all diagrams are asymmetric, simplifying sampling from this model. We conclude with experimental data on knotting in this model. This model of random knotting is similar to those studied by Diao et al, and Dunfield et al.

  14. Single-particle potential from resummed ladder diagrams

    International Nuclear Information System (INIS)

    Kaiser, N.

    2013-01-01

    A recent work on the resummation of fermionic in-medium ladder diagrams to all orders is extended by calculating the complex single-particle potential U(p, k_f) + i W(p, k_f) for p > k_f. The on-shell single-particle potential is constructed by means of a complex-valued in-medium loop that includes corrections from a test particle of momentum p added to the filled Fermi sea. The single-particle potential U(k_f, k_f) at the Fermi surface as obtained from the resummation of the combined particle and hole ladder diagrams is shown to satisfy the Hugenholtz-Van-Hove theorem. The perturbative contributions at various orders a^n in the scattering length are deduced and checked against the known analytical results at order a^1 and a^2. The limit a → ∞ is studied as a special case and a strong momentum dependence of the real (and imaginary) single-particle potential is found. This feature indicates an instability against a phase transition to a state with an empty shell inside the Fermi sphere such that the density gets reduced by about 5%. The imaginary single-particle potential vanishes linearly at the Fermi surface. For comparison, the same analysis is performed for the resummed particle-particle ladder diagrams alone. In this truncation an instability for hole excitations near the Fermi surface is found at strong coupling. For the set of particle-hole ring diagrams the single-particle potential is calculated as well. Furthermore, the resummation of in-medium ladder diagrams to all orders is studied for a two-dimensional Fermi gas with a short-range two-body contact interaction. (orig.)

  15. Molecular modeling and structural analysis of two-pore domain potassium channels TASK1 interactions with the blocker A1899

    Directory of Open Access Journals (Sweden)

    David Mauricio Ramirez

    2015-03-01

    Full Text Available A1899 is a potent and highly selective blocker of the two-pore domain potassium (K2P) channel TASK-1. It acts as an antagonist, blocking the K+ flux, binds to TASK-1 in the inner cavity and shows activity in the nanomolar range. The drug travels through the central cavity and finally binds at the bottom of the selectivity filter, where threonines and water molecules form an H-bond network together with several hydrophobic interactions. Using alanine mutagenesis screens, the binding site was identified as involving residues in the P1 and P2 pore loops, the M2 and M4 transmembrane segments, and the halothane response element; mutations were introduced into the human TASK-1 (KCNK3, NM_002246) expressed in oocytes from anesthetized Xenopus laevis frogs. Based on molecular modeling and structural analysis, such as molecular docking and binding free energy calculations, a pose was suggested using TASK-1 homology models. Recently, various K2P crystal structures have been obtained. We wanted to redefine, from a structural point of view, the binding mode of A1899 in TASK-1 homology models using the K2P crystal structures as templates. By computational structural analysis we describe the molecular basis of the A1899 binding mode and how A1899 travels to its binding site, and we suggest an interacting pose (Figure 1). After 100 ns of molecular dynamics simulation (MDs) we found an intramolecular H-bond (80% of the total MDs), an H-bond with Thr93 (42% of the total MDs), a pi-pi stacking interaction between a ring and Phe125 (88% of the total MDs) and several water bridges. Our experimental and computational results allow a molecular understanding of the structural binding mechanism of the selective blocker A1899 to TASK-1 channels. We identified the common and divergent structural features of the TASK-1 channel through our theoretical and experimental studies of A1899 drug action.
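
    Occupancy figures like those quoted above (an H-bond present in 42% of the simulation frames) come from simple per-frame geometric tests; the sketch below assumes the donor-acceptor distance and donor-H...acceptor angle have already been exported per frame, and the file name and cut-offs are illustrative, not taken from the study.

```python
# Hypothetical post-processing sketch: fraction of MD frames in which a given
# H-bond (e.g. blocker--Thr93) satisfies simple geometric criteria.
import numpy as np

# Assumed input: one row per frame, columns = donor-acceptor distance (Angstrom)
# and donor-H...acceptor angle (degrees), exported from the MD package.
data = np.loadtxt("hbond_thr93.dat")        # illustrative file name
dist, angle = data[:, 0], data[:, 1]

DIST_CUT = 3.5     # Angstrom, a common geometric H-bond cut-off
ANGLE_CUT = 120.0  # degrees

present = (dist <= DIST_CUT) & (angle >= ANGLE_CUT)
occupancy = 100.0 * present.mean()
print(f"H-bond occupancy: {occupancy:.1f}% of {len(present)} frames")
```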

  16. Planning performance in schizophrenia patients: a meta-analysis of the influence of task difficulty and clinical and sociodemographic variables.

    Science.gov (United States)

    Knapp, F; Viechtbauer, W; Leonhart, R; Nitschke, K; Kaller, C P

    2017-08-01

    Despite a large body of research on planning performance in adult schizophrenia patients, results of individual studies are equivocal, suggesting either no, moderate or severe planning deficits. This meta-analysis therefore aimed to quantify planning deficits in schizophrenia and to examine potential sources of the heterogeneity seen in the literature. The meta-analysis comprised outcomes of planning accuracy of 1377 schizophrenia patients and 1477 healthy controls from 31 different studies which assessed planning performance using tower tasks such as the Tower of London, the Tower of Hanoi and the Stockings of Cambridge. A meta-regression analysis was applied to assess the influence of potential moderator variables (i.e. sociodemographic and clinical variables as well as task difficulty). The findings indeed demonstrated a planning deficit in schizophrenia patients (mean effect size: ; 95% confidence interval 0.56-0.78) that was moderated by task difficulty in terms of the minimum number of moves required for a solution. The results did not reveal any significant relationship between the extent of planning deficits and sociodemographic or clinical variables. The current results provide the first meta-analytic evidence for the commonly assumed impairments of planning performance in schizophrenia. Deficits are more likely to become manifest in problem items with higher demands on planning ahead, which may at least partly explain the heterogeneity of previous findings. As only a small fraction of studies reported coherent information on sample characteristics, future meta-analyses would benefit from more systematic reporting of those variables.
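
    The pooled effect size and confidence interval reported above are the typical output of a random-effects meta-analysis; the sketch below shows DerSimonian-Laird pooling on made-up per-study effect sizes and variances, and does not reproduce the paper's data or its meta-regression.

```python
# Illustrative random-effects meta-analysis (DerSimonian-Laird pooling) with
# hypothetical per-study effect sizes and variances -- not the paper's data.
import numpy as np

g = np.array([0.55, 0.80, 0.40, 0.95, 0.62])    # Hedges' g per study (made up)
v = np.array([0.04, 0.06, 0.03, 0.08, 0.05])    # within-study variances (made up)

w_fixed = 1.0 / v
mu_fixed = np.sum(w_fixed * g) / np.sum(w_fixed)
Q = np.sum(w_fixed * (g - mu_fixed) ** 2)        # Cochran's heterogeneity statistic
df = len(g) - 1
c = np.sum(w_fixed) - np.sum(w_fixed ** 2) / np.sum(w_fixed)
tau2 = max(0.0, (Q - df) / c)                    # between-study variance estimate

w = 1.0 / (v + tau2)                             # random-effects weights
mu = np.sum(w * g) / np.sum(w)
se = np.sqrt(1.0 / np.sum(w))
print(f"pooled g = {mu:.2f}, 95% CI [{mu - 1.96*se:.2f}, {mu + 1.96*se:.2f}]")
```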

  17. The application of subjective job task analysis techniques in physically demanding occupations: evidence for the presence of self-serving bias.

    Science.gov (United States)

    Lee-Bates, Benjamin; Billing, Daniel C; Caputi, Peter; Carstairs, Greg L; Linnane, Denise; Middleton, Kane

    2017-09-01

    The aim of this study was to determine if perceptions of physically demanding job tasks are biased by employee demographics and employment profile characteristics, including age, sex, experience, length of tenure, rank, and whether the employee completed or supervised a task. Surveys were administered to 427 Royal Australian Navy personnel who characterised 33 tasks in terms of physical effort, importance, frequency, duration and vertical/horizontal distance travelled. Results showed no evidence of bias resulting from participant characteristics; however, participants who were actively involved in both task participation and supervision rated these tasks as more important than those involved only in the supervision of that task. This may indicate self-serving bias, in which participants who are more actively involved in a task have an inflated perception of that task's importance. These results have important implications for the conduct of job task analyses, especially the use of subjective methodologies in the development of scientifically defensible physical employment standards. Practitioner Summary: To examine the presence of systematic bias in subjective job task analysis methodologies, a survey was conducted on a sample of Royal Australian Navy personnel. The relationship between job task descriptions and participants' demographic and job profile characteristics revealed the presence of self-serving bias affecting perceptions of task importance.

  18. The analysis of probability task completion; Taxonomy of probabilistic thinking-based across gender in elementary school students

    Science.gov (United States)

    Sari, Dwi Ivayana; Budayasa, I. Ketut; Juniati, Dwi

    2017-08-01

    Formulation of mathematical learning goals is now oriented not only towards cognitive products but also towards cognitive processes, such as probabilistic thinking. Probabilistic thinking is needed by students to make decisions, and elementary school students are required to develop it as a foundation for learning probability at a higher level. A framework of students' probabilistic thinking had been developed using the SOLO taxonomy, which consists of prestructural, unistructural, multistructural and relational probabilistic thinking. This study aimed to analyze probability task completion based on this taxonomy of probabilistic thinking. The subjects were two fifth-grade students, a boy and a girl, selected on the basis of a test of mathematical ability as having high ability. The subjects were given probability tasks covering sample space, probability of an event and probability comparison. The data analysis consisted of categorization, reduction, interpretation and conclusion; the credibility of the data was established through time triangulation. The results showed that the boy's probabilistic thinking in completing the probability tasks was at the multistructural level, while the girl's was at the unistructural level, indicating that the boy's level of probabilistic thinking was higher than the girl's. The results could help curriculum developers in formulating probability learning goals for elementary school students. Indeed, teachers could teach probability with regard to gender differences.

  19. Logical and Geometrical Distance in Polyhedral Aristotelian Diagrams in Knowledge Representation

    Directory of Open Access Journals (Sweden)

    Lorenz Demey

    2017-09-01

    Full Text Available Aristotelian diagrams visualize the logical relations among a finite set of objects. These diagrams originated in philosophy, but recently, they have also been used extensively in artificial intelligence, in order to study (connections between) various knowledge representation formalisms. In this paper, we develop the idea that Aristotelian diagrams can be fruitfully studied as geometrical entities. In particular, we focus on four polyhedral Aristotelian diagrams for the Boolean algebra B4, viz. the rhombic dodecahedron, the tetrakis hexahedron, the tetraicosahedron and the nested tetrahedron. After an in-depth investigation of the geometrical properties and interrelationships of these polyhedral diagrams, we analyze the correlation (or lack thereof) between logical (Hamming) and geometrical (Euclidean) distance in each of these diagrams. The outcome of this analysis is that the Aristotelian rhombic dodecahedron and tetrakis hexahedron exhibit the strongest degree of correlation between logical and geometrical distance; the tetraicosahedron performs worse; and the nested tetrahedron has the lowest degree of correlation. Finally, these results are used to shed new light on the relative strengths and weaknesses of these polyhedral Aristotelian diagrams, by appealing to the congruence principle from cognitive research on diagram design.
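
    The correlation analysis described here can be sketched in a few lines: compute the Hamming distance between the 4-bit labels of the elements of B4 and the Euclidean distance between the corresponding vertex coordinates, then correlate the two over all pairs. The embedding below (the vertices of a 4-cube) is purely illustrative and is not one of the paper's polyhedra.

```python
# Sketch of the logical-vs-geometrical distance comparison: Hamming distance on
# 4-bit labels versus Euclidean distance for a hypothetical vertex embedding.
from itertools import combinations
import numpy as np

labels = [f"{i:04b}" for i in range(16)]                 # elements of B4
coords = {lab: np.array([int(b) for b in lab], float)    # hypothetical embedding:
          for lab in labels}                             # the 4-cube's vertices

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

h, e = [], []
for a, b in combinations(labels, 2):
    h.append(hamming(a, b))
    e.append(np.linalg.norm(coords[a] - coords[b]))

r = np.corrcoef(h, e)[0, 1]
print(f"Pearson correlation between Hamming and Euclidean distance: {r:.3f}")
```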

  20. EEG Analysis during complex diagnostic tasks in Nuclear Power Plants - Simulator-based Experimental Study

    International Nuclear Information System (INIS)

    Ha, Jun Su; Seong, Poong Hyun

    2005-01-01

    In the literature, there are many studies based on EEG signals recorded during cognitive activities of human beings, but most of them deal with simple cognitive activities such as transforming letters into Morse code, subtraction, reading, semantic memory search, visual search, memorizing a set of words and so on. In this work, EEG signals were analyzed during complex diagnostic tasks in an NPP simulator-based environment. The theta, alpha, beta, and gamma band EEG powers during the diagnostic tasks are investigated. The experimental design and procedure are presented in section 2 and the results are shown in section 3. Finally, some considerations are discussed and the direction for further work is proposed in section 4.
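
    As a concrete illustration of the band-power analysis described (the sampling rate, band edges and signal below are assumptions, not the study's settings), theta/alpha/beta/gamma powers can be obtained from a channel's Welch power spectral density:

```python
# Sketch: theta/alpha/beta/gamma band powers from one EEG channel via Welch's PSD.
# The signal here is synthetic; sampling rate and band edges are assumed values.
import numpy as np
from scipy.signal import welch

fs = 256.0                                    # assumed sampling rate (Hz)
t = np.arange(0, 60, 1.0 / fs)
eeg = np.sin(2*np.pi*10*t) + 0.5*np.random.randn(t.size)   # synthetic channel

freqs, psd = welch(eeg, fs=fs, nperseg=int(4 * fs))

bands = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}
for name, (lo, hi) in bands.items():
    mask = (freqs >= lo) & (freqs < hi)
    power = np.trapz(psd[mask], freqs[mask])  # integrate the PSD over the band
    print(f"{name:6s}: {power:.4f} (arbitrary units for this synthetic signal)")
```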